New upstream version 1.42.0+dfsg1

This commit is contained in:
Ximin Luo 2020-04-04 01:11:41 +01:00
parent 60c5eb7d04
commit dfeec24772
4387 changed files with 548372 additions and 355513 deletions


@@ -367,7 +367,7 @@ labels to triage issues:
to fix the issue.
* The dark blue **final-comment-period** label marks bugs that are using the
RFC signoff functionality of [rfcbot] and are currently in the final
comment period.
* Red, **I**-prefixed labels indicate the **importance** of the issue. The
@@ -385,7 +385,7 @@ labels to triage issues:
label.
* The gray **proposed-final-comment-period** label marks bugs that are using
the RFC signoff functionality of [rfcbot] and are currently awaiting
signoff of all team members in order to enter the final comment period.
* Pink, **regression**-prefixed labels track regressions from stable to the

Cargo.lock (generated)

File diff suppressed because it is too large.


@@ -23,6 +23,7 @@ members = [
"src/tools/rustfmt",
"src/tools/miri",
"src/tools/rustdoc-themes",
"src/tools/unicode-table-generator",
]
exclude = [
"build",


@@ -174,28 +174,3 @@ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [yyyy] [name of copyright owner]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.


@@ -1,3 +1,108 @@
Version 1.42.0 (2020-03-12)
==========================
Language
--------
- [You can now use the slice pattern syntax with subslices.][67712] e.g.
```rust
fn foo(words: &[&str]) {
match words {
["Hello", "World", "!", ..] => println!("Hello World!"),
["Foo", "Bar", ..] => println!("Baz"),
rest => println!("{:?}", rest),
}
}
```
- [You can now use `#[repr(transparent)]` on univariant `enum`s.][68122] Meaning
that you can create an enum that has the exact layout and ABI of the type
it contains.
- [There are some *syntax-only* changes:][67131]
- `default` is syntactically allowed before items in `trait` definitions.
- Items in `impl`s (i.e. `const`s, `type`s, and `fn`s) may syntactically
leave out their bodies in favor of `;`.
- Bounds on associated types in `impl`s are now syntactically allowed
(e.g. `type Foo: Ord;`).
- `...` (the C-variadic type) may occur syntactically directly as the type of
any function parameter.
These are still rejected *semantically*, so you will likely receive an error,
but these changes can be seen and parsed by procedural macros and
conditional compilation.
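The `#[repr(transparent)]` change above can be sketched as follows; `WrapperId` and the sample value are hypothetical, not from the release notes:

```rust
use std::mem::{align_of, size_of};

// A univariant enum: exactly one variant with a single field. With
// `#[repr(transparent)]`, it is guaranteed the exact layout and ABI of `u32`.
#[repr(transparent)]
enum WrapperId {
    Id(u32),
}

fn main() {
    // The wrapper adds no size or alignment over the wrapped type.
    assert_eq!(size_of::<WrapperId>(), size_of::<u32>());
    assert_eq!(align_of::<WrapperId>(), align_of::<u32>());
    match WrapperId::Id(7) {
        WrapperId::Id(n) => println!("{}", n),
    }
}
```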
Compiler
--------
- [Added tier 2\* support for `armv7a-none-eabi`.][68253]
- [Added tier 2 support for `riscv64gc-unknown-linux-gnu`.][68339]
- [`Option::{expect,unwrap}` and
`Result::{expect, expect_err, unwrap, unwrap_err}` now produce panic messages
pointing to the location where they were called, rather than
`core`'s internals.][67887]
\* Refer to Rust's [platform support page][forge-platform-support] for more
information on Rust's tiered platform support.
Libraries
---------
- [`iter::Empty<T>` now implements `Send` and `Sync` for any `T`.][68348]
- [`Pin::{map_unchecked, map_unchecked_mut}` no longer require the return type
to implement `Sized`.][67935]
- [`io::Cursor` now derives `PartialEq` and `Eq`.][67233]
- [`Layout::new` is now `const`.][66254]
- [Added Standard Library support for `riscv64gc-unknown-linux-gnu`.][66899]
Stabilized APIs
---------------
- [`CondVar::wait_while`]
- [`CondVar::wait_timeout_while`]
- [`DebugMap::key`]
- [`DebugMap::value`]
- [`ManuallyDrop::take`]
- [`matches!`]
- [`ptr::slice_from_raw_parts_mut`]
- [`ptr::slice_from_raw_parts`]
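Two of the newly stabilized APIs can be sketched together; the sample values below are illustrative only:

```rust
use std::ptr;

fn main() {
    // `matches!` evaluates to whether an expression fits a pattern,
    // without writing out a full `match`.
    let ch = 'f';
    assert!(matches!(ch, 'a'..='z'));
    assert!(!matches!(ch, '0'..='9'));

    // `ptr::slice_from_raw_parts` forms a `*const [T]` from a pointer and a
    // length without going through an intermediate reference.
    let data = [1u32, 2, 3];
    let raw: *const [u32] = ptr::slice_from_raw_parts(data.as_ptr(), data.len());
    // Reading through the raw pointer is unsafe; here it points at live data.
    let sum: u32 = unsafe { (*raw).iter().sum() };
    assert_eq!(sum, 6);
    println!("sum = {}", sum);
}
```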
Cargo
-----
- [You no longer need to include `extern crate proc_macro;` to be able to
`use proc_macro;` in the `2018` edition.][cargo/7700]
Compatibility Notes
-------------------
- [`Error::description` has been deprecated, and its use will now produce a
warning.][66919] It's recommended to use `Display`/`to_string` instead.
- [`use $crate;` inside macros is now a hard error.][37390] The compiler
emitted forward compatibility warnings since Rust 1.14.0.
- [As previously announced, this release reduces the level of support for
32-bit Apple targets to tier 3.][apple-32bit-drop] This means that the
source code is still available to build, but the targets are no longer tested
and no release binary is distributed by the Rust project. Please refer to the
linked blog post for more information.
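The `Error::description` deprecation can be sketched with a made-up error type (`ParseError` is hypothetical, not from the release):

```rust
use std::error::Error;
use std::fmt;

#[derive(Debug)]
struct ParseError;

impl fmt::Display for ParseError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "invalid input")
    }
}

// `description` has a default implementation, so nothing deprecated
// needs to be overridden here.
impl Error for ParseError {}

fn main() {
    let err = ParseError;
    // Prefer `Display`/`to_string` over the deprecated `Error::description`.
    assert_eq!(err.to_string(), "invalid input");
    println!("{}", err);
}
```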
[37390]: https://github.com/rust-lang/rust/issues/37390/
[68253]: https://github.com/rust-lang/rust/pull/68253/
[68348]: https://github.com/rust-lang/rust/pull/68348/
[67935]: https://github.com/rust-lang/rust/pull/67935/
[68339]: https://github.com/rust-lang/rust/pull/68339/
[68122]: https://github.com/rust-lang/rust/pull/68122/
[67712]: https://github.com/rust-lang/rust/pull/67712/
[67887]: https://github.com/rust-lang/rust/pull/67887/
[67131]: https://github.com/rust-lang/rust/pull/67131/
[67233]: https://github.com/rust-lang/rust/pull/67233/
[66899]: https://github.com/rust-lang/rust/pull/66899/
[66919]: https://github.com/rust-lang/rust/pull/66919/
[66254]: https://github.com/rust-lang/rust/pull/66254/
[cargo/7700]: https://github.com/rust-lang/cargo/pull/7700
[`DebugMap::key`]: https://doc.rust-lang.org/stable/std/fmt/struct.DebugMap.html#method.key
[`DebugMap::value`]: https://doc.rust-lang.org/stable/std/fmt/struct.DebugMap.html#method.value
[`ManuallyDrop::take`]: https://doc.rust-lang.org/stable/std/mem/struct.ManuallyDrop.html#method.take
[`matches!`]: https://doc.rust-lang.org/stable/std/macro.matches.html
[`ptr::slice_from_raw_parts_mut`]: https://doc.rust-lang.org/stable/std/ptr/fn.slice_from_raw_parts_mut.html
[`ptr::slice_from_raw_parts`]: https://doc.rust-lang.org/stable/std/ptr/fn.slice_from_raw_parts.html
[`CondVar::wait_while`]: https://doc.rust-lang.org/stable/std/sync/struct.Condvar.html#method.wait_while
[`CondVar::wait_timeout_while`]: https://doc.rust-lang.org/stable/std/sync/struct.Condvar.html#method.wait_timeout_while
Version 1.41.1 (2020-02-27)
===========================
@@ -8,6 +113,7 @@ Version 1.41.1 (2020-02-27)
[69225]: https://github.com/rust-lang/rust/issues/69225
[69145]: https://github.com/rust-lang/rust/pull/69145
Version 1.41.0 (2020-01-30)
===========================
@@ -218,7 +324,7 @@ Compatibility Notes
- [Using `#[inline]` on function prototypes and consts now emits a warning under
  `unused_attribute` lint.][65294] Using `#[inline]` anywhere else inside traits
  or `extern` blocks now correctly emits a hard error.
[65294]: https://github.com/rust-lang/rust/pull/65294/
[66103]: https://github.com/rust-lang/rust/pull/66103/
[65843]: https://github.com/rust-lang/rust/pull/65843/
@@ -5076,10 +5182,10 @@ Stabilized APIs
---------------
* [`std::panic`]
* [`std::panic::catch_unwind`] (renamed from `recover`)
* [`std::panic::resume_unwind`] (renamed from `propagate`)
* [`std::panic::AssertUnwindSafe`] (renamed from `AssertRecoverSafe`)
* [`std::panic::UnwindSafe`] (renamed from `RecoverSafe`)
* [`str::is_char_boundary`]
* [`<*const T>::as_ref`]
* [`<*mut T>::as_ref`]
@@ -5359,18 +5465,18 @@ Libraries
---------
* Stabilized APIs:
  * [`str::encode_utf16`] (renamed from `utf16_units`)
  * [`str::EncodeUtf16`] (renamed from `Utf16Units`)
  * [`Ref::map`]
  * [`RefMut::map`]
  * [`ptr::drop_in_place`]
  * [`time::Instant`]
  * [`time::SystemTime`]
  * [`Instant::now`]
  * [`Instant::duration_since`] (renamed from `duration_from_earlier`)
  * [`Instant::elapsed`]
  * [`SystemTime::now`]
  * [`SystemTime::duration_since`] (renamed from `duration_from_earlier`)
  * [`SystemTime::elapsed`]
* Various `Add`/`Sub` impls for `Time` and `SystemTime`
* [`SystemTimeError`]
@@ -5557,8 +5663,8 @@ Libraries
* Stabilized APIs
  * `Path`
    * [`Path::strip_prefix`] (renamed from relative_from)
    * [`path::StripPrefixError`] (new error type returned from strip_prefix)
  * `Ipv4Addr`
    * [`Ipv4Addr::is_loopback`]
    * [`Ipv4Addr::is_private`]
@@ -5771,7 +5877,7 @@ Libraries
* Stabilized APIs:
  [`Read::read_exact`],
  [`ErrorKind::UnexpectedEof`] (renamed from `UnexpectedEOF`),
  [`fs::DirBuilder`], [`fs::DirBuilder::new`],
  [`fs::DirBuilder::recursive`], [`fs::DirBuilder::create`],
  [`os::unix::fs::DirBuilderExt`],
@@ -5784,11 +5890,11 @@ Libraries
  [`collections::hash_set::HashSet::drain`],
  [`collections::binary_heap::Drain`],
  [`collections::binary_heap::BinaryHeap::drain`],
  [`Vec::extend_from_slice`] (renamed from `push_all`),
  [`Mutex::get_mut`], [`Mutex::into_inner`], [`RwLock::get_mut`],
  [`RwLock::into_inner`],
  [`Iterator::min_by_key`] (renamed from `min_by`),
  [`Iterator::max_by_key`] (renamed from `max_by`).
* The [core library][1.6co] is stable, as are most of its APIs.
* [The `assert_eq!` macro supports arguments that don't implement
  `Sized`][1.6ae], such as arrays. In this way it behaves more like


@@ -14,6 +14,12 @@
# =============================================================================
[llvm]
# Indicates whether LLVM rebuild should be skipped when running bootstrap. If
# this is `false` then the compiler's LLVM will be rebuilt whenever the built
# version doesn't have the correct hash. If it is `true` then LLVM will never
# be rebuilt. The default value is `false`.
#skip-rebuild = false
# Indicates whether the LLVM build is a Release or Debug build
#optimize = true
@@ -132,6 +138,10 @@
# specified, use this rustc binary instead as the stage0 snapshot compiler.
#rustc = "/path/to/bin/rustc"
# Instead of downloading the src/stage0.txt version of rustfmt specified,
# use this rustfmt binary instead as the stage0 snapshot rustfmt.
#rustfmt = "/path/to/bin/rustfmt"
# Flag to specify whether any documentation is built. If false, rustdoc and
# friends will still be compiled but they will not be used to generate any
# documentation.
@@ -171,21 +181,23 @@
# Indicate whether the vendored sources are used for Rust dependencies or not
#vendor = false
# Typically the build system will build the Rust compiler twice. The second
# compiler, however, will simply use its own libraries to link against. If you
# would rather perform a full bootstrap, compiling the compiler three times,
# then you can set this option to true. You shouldn't ever need to set this
# option to true.
#full-bootstrap = false
# Enable a build of the extended Rust tool set, which is not only the compiler
# but also tools such as Cargo. This will also produce "combined installers"
# which are used to install Rust and Cargo together. This is disabled by
# default. The `tools` option (immediately below) specifies which tools should
# be built if `extended = true`.
#extended = false
# Installs the chosen set of extended tools if `extended = true`. By default builds all.
# If a chosen tool fails to build, the installation fails. If `extended = false`, this
# option is ignored.
#tools = ["cargo", "rls", "clippy", "rustfmt", "analysis", "src"]
# Verbosity level: 0 == not verbose, 1 == verbose, 2 == very verbose
@@ -400,6 +412,13 @@
# Whether to verify generated LLVM IR
#verify-llvm-ir = false
# Compile the compiler with a non-default ThinLTO import limit. This import
# limit controls the maximum size of functions imported by ThinLTO. Decreasing
# it will make code compile faster at the expense of lower runtime performance.
# If `incremental` is set to true above, the import limit will default to 10
# instead of LLVM's default of 100.
#thin-lto-import-instr-limit = 100
# Map all debuginfo paths for libstd and crates to `/rust/$sha/$crate/...`,
# generally only set for releases
#remap-debuginfo = false


@@ -1 +1 @@
-f3e1a954d2ead4e2fc197c7da7d71e6c61bad196
+b8cedc00407a4c56a3bda1ed605c6fc166655447


@@ -47,7 +47,11 @@ serde_json = "1.0.2"
toml = "0.5"
lazy_static = "1.3.0"
time = "0.1"
ignore = "0.4.10"
[target.'cfg(windows)'.dependencies.winapi]
version = "0.3"
features = ["fileapi", "ioapiset", "jobapi2", "handleapi", "winioctl"]
[dev-dependencies]
pretty_assertions = "0.5"


@@ -2,8 +2,8 @@
// `src/bootstrap/native.rs` for why this is needed when compiling LLD.
use std::env;
use std::io::{self, Write};
use std::process::{self, Command, Stdio};
fn main() {
let real_llvm_config = env::var_os("LLVM_CONFIG_REAL").unwrap();


@@ -7,7 +7,7 @@
use std::env;
use bootstrap::{Build, Config};
fn main() {
let args = env::args().skip(1).collect::<Vec<_>>();


@@ -27,9 +27,7 @@ fn main() {
// Detect whether or not we're a build script depending on whether --target
// is passed (a bit janky...)
let target = args.windows(2).find(|w| &*w[0] == "--target").and_then(|w| w[1].to_str());
let version = args.iter().find(|w| &**w == "-vV");
let verbose = match env::var("RUSTC_VERBOSE") {
@@ -57,19 +55,16 @@ fn main() {
dylib_path.insert(0, PathBuf::from(&libdir));
let mut cmd = Command::new(rustc);
cmd.args(&args).env(bootstrap::util::dylib_path_var(), env::join_paths(&dylib_path).unwrap());
// Get the name of the crate we're compiling, if any.
let crate_name =
    args.windows(2).find(|args| args[0] == "--crate-name").and_then(|args| args[1].to_str());
if let Some(crate_name) = crate_name {
if let Some(target) = env::var_os("RUSTC_TIME") {
if target == "all"
    || target.into_string().unwrap().split(",").any(|c| c.trim() == crate_name)
{
cmd.arg("-Ztime");
}
@@ -101,15 +96,22 @@ fn main() {
// `compiler_builtins` are unconditionally compiled with panic=abort to
// workaround undefined references to `rust_eh_unwind_resume` generated
// otherwise, see issue https://github.com/rust-lang/rust/issues/43095.
if crate_name == Some("panic_abort")
    || crate_name == Some("compiler_builtins") && stage != "0"
{
cmd.arg("-C").arg("panic=abort");
}
// Set various options from config.toml to configure how we're building
// code.
let debug_assertions = match env::var("RUSTC_DEBUG_ASSERTIONS") {
Ok(s) => {
if s == "true" {
"y"
} else {
"n"
}
}
Err(..) => "n",
};
@@ -178,17 +180,17 @@ fn main() {
if env::var_os("RUSTC_PRINT_STEP_TIMINGS").is_some() {
if let Some(crate_name) = crate_name {
let start = Instant::now();
let status = cmd.status().unwrap_or_else(|_| panic!("\n\n failed to run {:?}", cmd));
let dur = start.elapsed();
let is_test = args.iter().any(|a| a == "--test");
eprintln!(
    "[RUSTC-TIMING] {} test:{} {}.{:03}",
    crate_name,
    is_test,
    dur.as_secs(),
dur.subsec_nanos() / 1_000_000
);
match status.code() {
Some(i) => std::process::exit(i),


@@ -3,9 +3,9 @@
//! See comments in `src/bootstrap/rustc.rs` for more information.
use std::env;
use std::ffi::OsString;
use std::path::PathBuf;
use std::process::Command;
fn main() {
let args = env::args_os().skip(1).collect::<Vec<_>>();
@@ -35,8 +35,7 @@ fn main() {
.arg("dox")
.arg("--sysroot")
.arg(&sysroot)
.env(bootstrap::util::dylib_path_var(), env::join_paths(&dylib_path).unwrap());
// Force all crates compiled by this compiler to (a) be unstable and (b)
// allow the `rustc_private` feature to link to other unstable crates
@@ -55,8 +54,7 @@ fn main() {
if let Some(version) = env::var_os("RUSTDOC_CRATE_VERSION") {
// This "unstable-options" can be removed when `--crate-version` is stabilized
if !has_unstable {
cmd.arg("-Z").arg("unstable-options");
}
cmd.arg("--crate-version").arg(version);
has_unstable = true;
@@ -66,8 +64,7 @@ fn main() {
if let Some(_) = env::var_os("RUSTDOC_GENERATE_REDIRECT_PAGES") {
// This "unstable-options" can be removed when `--generate-redirect-pages` is stabilized
if !has_unstable {
cmd.arg("-Z").arg("unstable-options");
}
cmd.arg("--generate-redirect-pages");
has_unstable = true;
@@ -77,8 +74,7 @@ fn main() {
if let Some(ref x) = env::var_os("RUSTDOC_RESOURCE_SUFFIX") {
// This "unstable-options" can be removed when `--resource-suffix` is stabilized
if !has_unstable {
cmd.arg("-Z").arg("unstable-options");
}
cmd.arg("--resource-suffix").arg(x);
}


@@ -8,12 +8,12 @@ fn main() {
env::set_var("CXX", env::var_os("SCCACHE_CXX").unwrap());
let mut cfg = cc::Build::new();
cfg.cargo_metadata(false)
    .out_dir("/")
    .target(&target)
    .host(&target)
    .opt_level(0)
    .warnings(false)
    .debug(false);
let compiler = cfg.get_compiler();
// Invoke sccache with said compiler


@@ -322,6 +322,7 @@ class RustBuild(object):
self.date = ''
self._download_url = ''
self.rustc_channel = ''
self.rustfmt_channel = ''
self.build = ''
self.build_dir = os.path.join(os.getcwd(), "build")
self.clean = False
@@ -344,6 +345,7 @@ class RustBuild(object):
"""
rustc_channel = self.rustc_channel
cargo_channel = self.cargo_channel
rustfmt_channel = self.rustfmt_channel
def support_xz():
try:
@@ -393,13 +395,29 @@ class RustBuild(object):
with output(self.cargo_stamp()) as cargo_stamp:
cargo_stamp.write(self.date)
if self.rustfmt() and self.rustfmt().startswith(self.bin_root()) and (
not os.path.exists(self.rustfmt())
or self.program_out_of_date(self.rustfmt_stamp())
):
if rustfmt_channel:
tarball_suffix = '.tar.xz' if support_xz() else '.tar.gz'
[channel, date] = rustfmt_channel.split('-', 1)
filename = "rustfmt-{}-{}{}".format(channel, self.build, tarball_suffix)
self._download_stage0_helper(filename, "rustfmt-preview", tarball_suffix, date)
self.fix_executable("{}/bin/rustfmt".format(self.bin_root()))
self.fix_executable("{}/bin/cargo-fmt".format(self.bin_root()))
with output(self.rustfmt_stamp()) as rustfmt_stamp:
rustfmt_stamp.write(self.date)
def _download_stage0_helper(self, filename, pattern, tarball_suffix, date=None):
if date is None:
date = self.date
cache_dst = os.path.join(self.build_dir, "cache")
rustc_cache = os.path.join(cache_dst, date)
if not os.path.exists(rustc_cache):
os.makedirs(rustc_cache)
url = "{}/dist/{}".format(self._download_url, date)
tarball = os.path.join(rustc_cache, filename)
if not os.path.exists(tarball):
get("{}/{}".format(url, filename), tarball, verbose=self.verbose)
@@ -493,6 +511,16 @@ class RustBuild(object):
"""
return os.path.join(self.bin_root(), '.cargo-stamp')
def rustfmt_stamp(self):
"""Return the path for .rustfmt-stamp
>>> rb = RustBuild()
>>> rb.build_dir = "build"
>>> rb.rustfmt_stamp() == os.path.join("build", "stage0", ".rustfmt-stamp")
True
"""
return os.path.join(self.bin_root(), '.rustfmt-stamp')
def program_out_of_date(self, stamp_path):
"""Check if the given program stamp is out of date"""
if not os.path.exists(stamp_path) or self.clean:
@@ -565,6 +593,12 @@ class RustBuild(object):
"""Return config path for rustc"""
return self.program_config('rustc')
def rustfmt(self):
"""Return config path for rustfmt"""
if not self.rustfmt_channel:
return None
return self.program_config('rustfmt')
def program_config(self, program):
"""Return config path for the given program
@@ -868,6 +902,9 @@ def bootstrap(help_triggered):
build.rustc_channel = data['rustc']
build.cargo_channel = data['cargo']
if "rustfmt" in data:
build.rustfmt_channel = data['rustfmt']
if 'dev' in data:
build.set_dev_environment()
else:
@@ -895,6 +932,8 @@ def bootstrap(help_triggered):
env["RUSTC_BOOTSTRAP"] = '1'
env["CARGO"] = build.cargo()
env["RUSTC"] = build.rustc()
if build.rustfmt():
env["RUSTFMT"] = build.rustfmt()
run(args, env=env, verbose=build.verbose)


@@ -20,14 +20,14 @@ class Stage0DataTestCase(unittest.TestCase):
os.mkdir(os.path.join(self.rust_root, "src"))
with open(os.path.join(self.rust_root, "src",
          "stage0.txt"), "w") as stage0:
stage0.write("#ignore\n\ndate: 2017-06-15\nrustc: beta\ncargo: beta\nrustfmt: beta")
def tearDown(self):
rmtree(self.rust_root)
def test_stage0_data(self):
"""Extract data from stage0.txt"""
expected = {"date": "2017-06-15", "rustc": "beta", "cargo": "beta", "rustfmt": "beta"}
data = bootstrap.stage0_data(self.rust_root)
self.assertDictEqual(data, expected)


@@ -1,7 +1,6 @@
use std::any::Any;
use std::cell::{Cell, RefCell};
use std::collections::BTreeSet;
use std::env;
use std::ffi::OsStr;
use std::fmt::Debug;
@@ -25,13 +24,10 @@ use crate::native;
use crate::test;
use crate::tool;
use crate::util::{self, add_lib_path, exe, libdir};
use crate::{Build, DocTests, GitRepo, Mode};
pub use crate::Compiler;
pub struct Builder<'a> {
pub build: &'a Build,
pub top_stage: u32,
@@ -40,9 +36,6 @@ pub struct Builder<'a> {
stack: RefCell<Vec<Box<dyn Any>>>,
time_spent_on_dependencies: Cell<Duration>,
pub paths: Vec<PathBuf>,
}
impl<'a> Deref for Builder<'a> { impl<'a> Deref for Builder<'a> {
@ -129,11 +122,7 @@ impl PathSet {
fn path(&self, builder: &Builder<'_>) -> PathBuf { fn path(&self, builder: &Builder<'_>) -> PathBuf {
match self { match self {
PathSet::Set(set) => set PathSet::Set(set) => set.iter().next().unwrap_or(&builder.build.src).to_path_buf(),
.iter()
.next()
.unwrap_or(&builder.build.src)
.to_path_buf(),
PathSet::Suite(path) => PathBuf::from(path), PathSet::Suite(path) => PathBuf::from(path),
} }
} }
@ -187,10 +176,8 @@ impl StepDescription {
} }
fn run(v: &[StepDescription], builder: &Builder<'_>, paths: &[PathBuf]) { fn run(v: &[StepDescription], builder: &Builder<'_>, paths: &[PathBuf]) {
let should_runs = v let should_runs =
.iter() v.iter().map(|desc| (desc.should_run)(ShouldRun::new(builder))).collect::<Vec<_>>();
.map(|desc| (desc.should_run)(ShouldRun::new(builder)))
.collect::<Vec<_>>();
// sanity checks on rules // sanity checks on rules
for (desc, should_run) in v.iter().zip(&should_runs) { for (desc, should_run) in v.iter().zip(&should_runs) {
@ -287,8 +274,7 @@ impl<'a> ShouldRun<'a> {
// multiple aliases for the same job // multiple aliases for the same job
pub fn paths(mut self, paths: &[&str]) -> Self { pub fn paths(mut self, paths: &[&str]) -> Self {
self.paths self.paths.insert(PathSet::Set(paths.iter().map(PathBuf::from).collect()));
.insert(PathSet::Set(paths.iter().map(PathBuf::from).collect()));
self self
} }
@ -321,6 +307,7 @@ pub enum Kind {
Check, Check,
Clippy, Clippy,
Fix, Fix,
Format,
Test, Test,
Bench, Bench,
Dist, Dist,
@ -356,15 +343,14 @@ impl<'a> Builder<'a> {
tool::Rustdoc, tool::Rustdoc,
tool::Clippy, tool::Clippy,
native::Llvm, native::Llvm,
native::Sanitizers,
tool::Rustfmt, tool::Rustfmt,
tool::Miri, tool::Miri,
native::Lld native::Lld
), ),
Kind::Check | Kind::Clippy | Kind::Fix => describe!( Kind::Check | Kind::Clippy | Kind::Fix | Kind::Format => {
check::Std, describe!(check::Std, check::Rustc, check::Rustdoc)
check::Rustc, }
check::Rustdoc
),
Kind::Test => describe!( Kind::Test => describe!(
crate::toolstate::ToolStateCheck, crate::toolstate::ToolStateCheck,
test::Tidy, test::Tidy,
@ -490,9 +476,6 @@ impl<'a> Builder<'a> {
stack: RefCell::new(Vec::new()), stack: RefCell::new(Vec::new()),
time_spent_on_dependencies: Cell::new(Duration::new(0, 0)), time_spent_on_dependencies: Cell::new(Duration::new(0, 0)),
paths: vec![], paths: vec![],
graph_nodes: RefCell::new(HashMap::new()),
graph: RefCell::new(Graph::new()),
parent: Cell::new(None),
}; };
let builder = &builder; let builder = &builder;
@ -524,7 +507,7 @@ impl<'a> Builder<'a> {
Subcommand::Bench { ref paths, .. } => (Kind::Bench, &paths[..]), Subcommand::Bench { ref paths, .. } => (Kind::Bench, &paths[..]),
Subcommand::Dist { ref paths } => (Kind::Dist, &paths[..]), Subcommand::Dist { ref paths } => (Kind::Dist, &paths[..]),
Subcommand::Install { ref paths } => (Kind::Install, &paths[..]), Subcommand::Install { ref paths } => (Kind::Install, &paths[..]),
Subcommand::Clean { .. } => panic!(), Subcommand::Format { .. } | Subcommand::Clean { .. } => panic!(),
}; };
let builder = Builder { let builder = Builder {
@ -535,17 +518,13 @@ impl<'a> Builder<'a> {
stack: RefCell::new(Vec::new()), stack: RefCell::new(Vec::new()),
time_spent_on_dependencies: Cell::new(Duration::new(0, 0)), time_spent_on_dependencies: Cell::new(Duration::new(0, 0)),
paths: paths.to_owned(), paths: paths.to_owned(),
graph_nodes: RefCell::new(HashMap::new()),
graph: RefCell::new(Graph::new()),
parent: Cell::new(None),
}; };
builder builder
} }
pub fn execute_cli(&self) -> Graph<String, bool> { pub fn execute_cli(&self) {
self.run_step_descriptions(&Builder::get_step_descriptions(self.kind), &self.paths); self.run_step_descriptions(&Builder::get_step_descriptions(self.kind), &self.paths);
self.graph.borrow().clone()
} }
pub fn default_doc(&self, paths: Option<&[PathBuf]>) { pub fn default_doc(&self, paths: Option<&[PathBuf]>) {
@ -562,9 +541,7 @@ impl<'a> Builder<'a> {
/// obtained through this function, since it ensures that they are valid /// obtained through this function, since it ensures that they are valid
/// (i.e., built and assembled). /// (i.e., built and assembled).
pub fn compiler(&self, stage: u32, host: Interned<String>) -> Compiler { pub fn compiler(&self, stage: u32, host: Interned<String>) -> Compiler {
self.ensure(compile::Assemble { self.ensure(compile::Assemble { target_compiler: Compiler { stage, host } })
target_compiler: Compiler { stage, host },
})
} }
/// Similar to `compiler`, except handles the full-bootstrap option to /// Similar to `compiler`, except handles the full-bootstrap option to
@ -640,9 +617,10 @@ impl<'a> Builder<'a> {
self.rustc_snapshot_libdir() self.rustc_snapshot_libdir()
} else { } else {
match self.config.libdir_relative() { match self.config.libdir_relative() {
Some(relative_libdir) if compiler.stage >= 1 Some(relative_libdir) if compiler.stage >= 1 => {
=> self.sysroot(compiler).join(relative_libdir), self.sysroot(compiler).join(relative_libdir)
_ => self.sysroot(compiler).join(libdir(&compiler.host)) }
_ => self.sysroot(compiler).join(libdir(&compiler.host)),
} }
} }
} }
@ -657,9 +635,8 @@ impl<'a> Builder<'a> {
libdir(&self.config.build).as_ref() libdir(&self.config.build).as_ref()
} else { } else {
match self.config.libdir_relative() { match self.config.libdir_relative() {
Some(relative_libdir) if compiler.stage >= 1 Some(relative_libdir) if compiler.stage >= 1 => relative_libdir,
=> relative_libdir, _ => libdir(&compiler.host).as_ref(),
_ => libdir(&compiler.host).as_ref()
} }
} }
} }
@ -670,9 +647,8 @@ impl<'a> Builder<'a> {
/// For example this returns `lib` on Unix and Windows. /// For example this returns `lib` on Unix and Windows.
pub fn sysroot_libdir_relative(&self, compiler: Compiler) -> &Path { pub fn sysroot_libdir_relative(&self, compiler: Compiler) -> &Path {
match self.config.libdir_relative() { match self.config.libdir_relative() {
Some(relative_libdir) if compiler.stage >= 1 Some(relative_libdir) if compiler.stage >= 1 => relative_libdir,
=> relative_libdir, _ => Path::new("lib"),
_ => Path::new("lib")
} }
} }
@ -694,9 +670,7 @@ impl<'a> Builder<'a> {
if compiler.is_snapshot(self) { if compiler.is_snapshot(self) {
self.initial_rustc.clone() self.initial_rustc.clone()
} else { } else {
self.sysroot(compiler) self.sysroot(compiler).join("bin").join(exe("rustc", &compiler.host))
.join("bin")
.join(exe("rustc", &compiler.host))
} }
} }
@ -753,17 +727,10 @@ impl<'a> Builder<'a> {
self.clear_if_dirty(&my_out, &rustdoc); self.clear_if_dirty(&my_out, &rustdoc);
} }
cargo cargo.env("CARGO_TARGET_DIR", &out_dir).arg(cmd).arg("-Zconfig-profile");
.env("CARGO_TARGET_DIR", out_dir)
.arg(cmd)
.arg("-Zconfig-profile");
let profile_var = |name: &str| { let profile_var = |name: &str| {
let profile = if self.config.rust_optimize { let profile = if self.config.rust_optimize { "RELEASE" } else { "DEV" };
"RELEASE"
} else {
"DEV"
};
format!("CARGO_PROFILE_{}_{}", profile, name) format!("CARGO_PROFILE_{}_{}", profile, name)
}; };
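The `profile_var` closure above selects between Cargo's `RELEASE` and `DEV` profile override variables based on `rust_optimize`. A standalone sketch of the same mapping (a free function instead of a closure, for illustration):

```rust
/// Build a Cargo profile override variable name, as the closure above does:
/// optimized builds configure the RELEASE profile, all others the DEV profile.
fn profile_var(rust_optimize: bool, name: &str) -> String {
    let profile = if rust_optimize { "RELEASE" } else { "DEV" };
    format!("CARGO_PROFILE_{}_{}", profile, name)
}

fn main() {
    assert_eq!(profile_var(true, "DEBUG"), "CARGO_PROFILE_RELEASE_DEBUG");
    assert_eq!(profile_var(false, "CODEGEN_UNITS"), "CARGO_PROFILE_DEV_CODEGEN_UNITS");
}
```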
@ -775,8 +742,7 @@ impl<'a> Builder<'a> {
} }
if cmd != "install" { if cmd != "install" {
cargo.arg("--target") cargo.arg("--target").arg(target);
.arg(target);
} else { } else {
assert_eq!(target, compiler.host); assert_eq!(target, compiler.host);
} }
@ -814,14 +780,14 @@ impl<'a> Builder<'a> {
} }
match mode { match mode {
Mode::Std | Mode::ToolBootstrap | Mode::ToolStd => {}, Mode::Std | Mode::ToolBootstrap | Mode::ToolStd => {}
Mode::Rustc | Mode::Codegen | Mode::ToolRustc => { Mode::Rustc | Mode::Codegen | Mode::ToolRustc => {
// Build proc macros both for the host and the target // Build proc macros both for the host and the target
if target != compiler.host && cmd != "check" { if target != compiler.host && cmd != "check" {
cargo.arg("-Zdual-proc-macros"); cargo.arg("-Zdual-proc-macros");
rustflags.arg("-Zdual-proc-macros"); rustflags.arg("-Zdual-proc-macros");
} }
}, }
} }
// This tells Cargo (and in turn, rustc) to output more complete // This tells Cargo (and in turn, rustc) to output more complete
@ -897,13 +863,21 @@ impl<'a> Builder<'a> {
assert!(!use_snapshot || stage == 0 || self.local_rebuild); assert!(!use_snapshot || stage == 0 || self.local_rebuild);
let maybe_sysroot = self.sysroot(compiler); let maybe_sysroot = self.sysroot(compiler);
let sysroot = if use_snapshot { let sysroot = if use_snapshot { self.rustc_snapshot_sysroot() } else { &maybe_sysroot };
self.rustc_snapshot_sysroot()
} else {
&maybe_sysroot
};
let libdir = self.rustc_libdir(compiler); let libdir = self.rustc_libdir(compiler);
// Clear the output directory if the real rustc we're using has changed;
// Cargo cannot detect this as it thinks rustc is bootstrap/debug/rustc.
//
// Avoid doing this during dry run as that usually means the relevant
// compiler is not yet linked/copied properly.
//
// Only clear out the directory if we're compiling std; otherwise, we
// should let Cargo take care of things for us (via dep-info)
if !self.config.dry_run && mode == Mode::Std && cmd == "build" {
self.clear_if_dirty(&out_dir, &self.rustc(compiler));
}
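The new comment explains why the std output directory must be cleared when the real compiler changes: Cargo only sees the `bootstrap/debug/rustc` shim, so it cannot notice that the underlying rustc binary is newer. A minimal sketch of the staleness rule `clear_if_dirty` relies on (function name and signature are illustrative, not the actual bootstrap API):

```rust
use std::time::{Duration, SystemTime};

/// Decide whether an output dir is stale: it is dirty when there is no
/// stamp yet, or when the dependency (e.g. the real rustc binary) was
/// modified after the stamp was written.
fn needs_clear(stamp_mtime: Option<SystemTime>, dep_mtime: SystemTime) -> bool {
    match stamp_mtime {
        None => true,
        Some(stamp) => dep_mtime > stamp,
    }
}

fn main() {
    let t0 = SystemTime::UNIX_EPOCH;
    let t1 = t0 + Duration::from_secs(60);
    assert!(needs_clear(None, t0));       // no stamp: always rebuild
    assert!(needs_clear(Some(t0), t1));   // rustc rebuilt after the stamp
    assert!(!needs_clear(Some(t1), t0));  // stamp newer than rustc
}
```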
// Customize the compiler we're running. Specify the compiler to cargo // Customize the compiler we're running. Specify the compiler to cargo
// as our shim and then pass it some various options used to configure // as our shim and then pass it some various options used to configure
// how the actual compiler itself is called. // how the actual compiler itself is called.
@ -915,10 +889,7 @@ impl<'a> Builder<'a> {
.env("RUSTC", self.out.join("bootstrap/debug/rustc")) .env("RUSTC", self.out.join("bootstrap/debug/rustc"))
.env("RUSTC_REAL", self.rustc(compiler)) .env("RUSTC_REAL", self.rustc(compiler))
.env("RUSTC_STAGE", stage.to_string()) .env("RUSTC_STAGE", stage.to_string())
.env( .env("RUSTC_DEBUG_ASSERTIONS", self.config.rust_debug_assertions.to_string())
"RUSTC_DEBUG_ASSERTIONS",
self.config.rust_debug_assertions.to_string(),
)
.env("RUSTC_SYSROOT", &sysroot) .env("RUSTC_SYSROOT", &sysroot)
.env("RUSTC_LIBDIR", &libdir) .env("RUSTC_LIBDIR", &libdir)
.env("RUSTDOC", self.out.join("bootstrap/debug/rustdoc")) .env("RUSTDOC", self.out.join("bootstrap/debug/rustdoc"))
@ -961,7 +932,6 @@ impl<'a> Builder<'a> {
// to change a flag in a binary? // to change a flag in a binary?
if self.config.rust_rpath && util::use_host_linker(&target) { if self.config.rust_rpath && util::use_host_linker(&target) {
let rpath = if target.contains("apple") { let rpath = if target.contains("apple") {
// Note that we need to take one extra step on macOS to also pass // Note that we need to take one extra step on macOS to also pass
// `-Wl,-instal_name,@rpath/...` to get things to work right. To // `-Wl,-instal_name,@rpath/...` to get things to work right. To
// do that we pass a weird flag to the compiler to get it to do // do that we pass a weird flag to the compiler to get it to do
@ -993,8 +963,9 @@ impl<'a> Builder<'a> {
let debuginfo_level = match mode { let debuginfo_level = match mode {
Mode::Rustc | Mode::Codegen => self.config.rust_debuginfo_level_rustc, Mode::Rustc | Mode::Codegen => self.config.rust_debuginfo_level_rustc,
Mode::Std => self.config.rust_debuginfo_level_std, Mode::Std => self.config.rust_debuginfo_level_std,
Mode::ToolBootstrap | Mode::ToolStd | Mode::ToolBootstrap | Mode::ToolStd | Mode::ToolRustc => {
Mode::ToolRustc => self.config.rust_debuginfo_level_tools, self.config.rust_debuginfo_level_tools
}
}; };
cargo.env(profile_var("DEBUG"), debuginfo_level.to_string()); cargo.env(profile_var("DEBUG"), debuginfo_level.to_string());
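The `debuginfo_level` match above routes each build mode to its own config knob (`rust_debuginfo_level_rustc`, `_std`, `_tools`). A reduced sketch of that selection, with the config fields passed as plain integers for illustration:

```rust
#[allow(dead_code)]
enum Mode { Std, Rustc, Codegen, ToolBootstrap, ToolStd, ToolRustc }

/// Pick the configured debuginfo level for a build mode, following the
/// match above: compiler crates, std, and tools each have their own knob.
fn debuginfo_level(mode: Mode, rustc: u32, std: u32, tools: u32) -> u32 {
    match mode {
        Mode::Rustc | Mode::Codegen => rustc,
        Mode::Std => std,
        Mode::ToolBootstrap | Mode::ToolStd | Mode::ToolRustc => tools,
    }
}

fn main() {
    assert_eq!(debuginfo_level(Mode::Std, 1, 2, 0), 2);
    assert_eq!(debuginfo_level(Mode::ToolRustc, 1, 2, 0), 0);
}
```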
@ -1115,14 +1086,11 @@ impl<'a> Builder<'a> {
cargo.env(format!("CC_{}", target), &cc); cargo.env(format!("CC_{}", target), &cc);
let cflags = self.cflags(target, GitRepo::Rustc).join(" "); let cflags = self.cflags(target, GitRepo::Rustc).join(" ");
cargo cargo.env(format!("CFLAGS_{}", target), cflags.clone());
.env(format!("CFLAGS_{}", target), cflags.clone());
if let Some(ar) = self.ar(target) { if let Some(ar) = self.ar(target) {
let ranlib = format!("{} s", ar.display()); let ranlib = format!("{} s", ar.display());
cargo cargo.env(format!("AR_{}", target), ar).env(format!("RANLIB_{}", target), ranlib);
.env(format!("AR_{}", target), ar)
.env(format!("RANLIB_{}", target), ranlib);
} }
if let Ok(cxx) = self.cxx(target) { if let Ok(cxx) = self.cxx(target) {
@ -1133,15 +1101,14 @@ impl<'a> Builder<'a> {
} }
} }
if mode == Mode::Std if mode == Mode::Std && self.config.extended && compiler.is_final_stage(self) {
&& self.config.extended
&& compiler.is_final_stage(self)
{
rustflags.arg("-Zsave-analysis"); rustflags.arg("-Zsave-analysis");
cargo.env("RUST_SAVE_ANALYSIS_CONFIG", cargo.env(
"{\"output_file\": null,\"full_docs\": false,\ "RUST_SAVE_ANALYSIS_CONFIG",
"{\"output_file\": null,\"full_docs\": false,\
\"pub_only\": true,\"reachable_only\": false,\ \"pub_only\": true,\"reachable_only\": false,\
\"distro_crate\": true,\"signatures\": false,\"borrow_data\": false}"); \"distro_crate\": true,\"signatures\": false,\"borrow_data\": false}",
);
} }
// For `cargo doc` invocations, make rustdoc print the Rust version into the docs // For `cargo doc` invocations, make rustdoc print the Rust version into the docs
@ -1195,8 +1162,7 @@ impl<'a> Builder<'a> {
} }
match (mode, self.config.rust_codegen_units_std, self.config.rust_codegen_units) { match (mode, self.config.rust_codegen_units_std, self.config.rust_codegen_units) {
(Mode::Std, Some(n), _) | (Mode::Std, Some(n), _) | (_, _, Some(n)) => {
(_, _, Some(n)) => {
cargo.env(profile_var("CODEGEN_UNITS"), n.to_string()); cargo.env(profile_var("CODEGEN_UNITS"), n.to_string());
} }
_ => { _ => {
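The codegen-units match above gives the std-specific override precedence for `Mode::Std`, then falls back to the global setting, and otherwise leaves Cargo's default untouched. A hedged sketch of that precedence (using a reduced two-variant `Mode` for illustration):

```rust
enum Mode { Std, Rustc }

/// Codegen-unit selection as in the match above: a std-specific override
/// wins for Mode::Std, otherwise the global setting applies, and with
/// neither set the profile default is left alone.
fn codegen_units(mode: Mode, std_units: Option<u32>, global: Option<u32>) -> Option<u32> {
    match (mode, std_units, global) {
        (Mode::Std, Some(n), _) | (_, _, Some(n)) => Some(n),
        _ => None,
    }
}

fn main() {
    assert_eq!(codegen_units(Mode::Std, Some(1), Some(16)), Some(1));
    assert_eq!(codegen_units(Mode::Rustc, Some(1), Some(16)), Some(16));
    assert_eq!(codegen_units(Mode::Rustc, Some(1), None), None);
}
```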
@ -1230,10 +1196,22 @@ impl<'a> Builder<'a> {
rustflags.arg("-Cprefer-dynamic"); rustflags.arg("-Cprefer-dynamic");
} }
Cargo { // When building incrementally we default to a lower ThinLTO import limit
command: cargo, // (unless explicitly specified otherwise). This will produce somewhat
rustflags, // slower code but give much better compile times.
{
let limit = match self.config.rust_thin_lto_import_instr_limit {
Some(limit) => Some(limit),
None if self.config.incremental => Some(10),
_ => None,
};
if let Some(limit) = limit {
rustflags.arg(&format!("-Cllvm-args=-import-instr-limit={}", limit));
}
} }
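The new ThinLTO block picks an import-instruction limit: an explicit `rust_thin_lto_import_instr_limit` config value always wins, and otherwise incremental builds default to 10. A small sketch of that selection as a free function (name and signature are illustrative):

```rust
/// ThinLTO import-instr-limit selection as introduced above: an explicit
/// config value always wins; otherwise incremental builds default to a
/// low limit (10), trading a little runtime speed for faster compiles.
fn thin_lto_import_limit(explicit: Option<u32>, incremental: bool) -> Option<u32> {
    match explicit {
        Some(limit) => Some(limit),
        None if incremental => Some(10),
        _ => None,
    }
}

fn main() {
    assert_eq!(thin_lto_import_limit(Some(5), true), Some(5));
    assert_eq!(thin_lto_import_limit(None, true), Some(10));
    assert_eq!(thin_lto_import_limit(None, false), None);
    if let Some(limit) = thin_lto_import_limit(None, true) {
        // The chosen limit is forwarded to LLVM via RUSTFLAGS.
        assert_eq!(
            format!("-Cllvm-args=-import-instr-limit={}", limit),
            "-Cllvm-args=-import-instr-limit=10"
        );
    }
}
```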
Cargo { command: cargo, rustflags }
} }
/// Ensure that a given step is built, returning its output. This will /// Ensure that a given step is built, returning its output. This will
@ -1244,10 +1222,7 @@ impl<'a> Builder<'a> {
let mut stack = self.stack.borrow_mut(); let mut stack = self.stack.borrow_mut();
for stack_step in stack.iter() { for stack_step in stack.iter() {
// should skip // should skip
if stack_step if stack_step.downcast_ref::<S>().map_or(true, |stack_step| *stack_step != step) {
.downcast_ref::<S>()
.map_or(true, |stack_step| *stack_step != step)
{
continue; continue;
} }
let mut out = String::new(); let mut out = String::new();
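The `ensure` hunk above detects dependency cycles by scanning the stack of in-flight steps, which are stored type-erased as `Box<dyn Any>`, and comparing via `downcast_ref`. A minimal sketch of that check (the `Step` type here is a stand-in for real build steps):

```rust
use std::any::Any;

#[derive(PartialEq)]
struct Step(&'static str);

/// Cycle check as in `ensure` above: walk the stack of in-flight steps
/// and see whether this exact step is already being executed.
fn already_on_stack<S: PartialEq + 'static>(stack: &[Box<dyn Any>], step: &S) -> bool {
    stack.iter().any(|entry| {
        // downcast_ref returns None for entries of a different step type.
        entry.downcast_ref::<S>().map_or(false, |running| running == step)
    })
}

fn main() {
    let stack: Vec<Box<dyn Any>> = vec![Box::new(Step("std")), Box::new(Step("rustc"))];
    assert!(already_on_stack(&stack, &Step("rustc"))); // would be a cycle
    assert!(!already_on_stack(&stack, &Step("dist")));
}
```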
@ -1260,41 +1235,12 @@ impl<'a> Builder<'a> {
if let Some(out) = self.cache.get(&step) { if let Some(out) = self.cache.get(&step) {
self.verbose(&format!("{}c {:?}", " ".repeat(stack.len()), step)); self.verbose(&format!("{}c {:?}", " ".repeat(stack.len()), step));
{
let mut graph = self.graph.borrow_mut();
let parent = self.parent.get();
let us = *self
.graph_nodes
.borrow_mut()
.entry(format!("{:?}", step))
.or_insert_with(|| graph.add_node(format!("{:?}", step)));
if let Some(parent) = parent {
graph.add_edge(parent, us, false);
}
}
return out; return out;
} }
self.verbose(&format!("{}> {:?}", " ".repeat(stack.len()), step)); self.verbose(&format!("{}> {:?}", " ".repeat(stack.len()), step));
stack.push(Box::new(step.clone())); stack.push(Box::new(step.clone()));
} }
let prev_parent = self.parent.get();
{
let mut graph = self.graph.borrow_mut();
let parent = self.parent.get();
let us = *self
.graph_nodes
.borrow_mut()
.entry(format!("{:?}", step))
.or_insert_with(|| graph.add_node(format!("{:?}", step)));
self.parent.set(Some(us));
if let Some(parent) = parent {
graph.add_edge(parent, us, true);
}
}
let (out, dur) = { let (out, dur) = {
let start = Instant::now(); let start = Instant::now();
let zero = Duration::new(0, 0); let zero = Duration::new(0, 0);
@ -1305,8 +1251,6 @@ impl<'a> Builder<'a> {
(out, dur - deps) (out, dur - deps)
}; };
self.parent.set(prev_parent);
if self.config.print_step_timings && dur > Duration::from_millis(100) { if self.config.print_step_timings && dur > Duration::from_millis(100) {
println!( println!(
"[TIMING] {:?} -- {}.{:03}", "[TIMING] {:?} -- {}.{:03}",
@ -1321,11 +1265,7 @@ impl<'a> Builder<'a> {
let cur_step = stack.pop().expect("step stack empty"); let cur_step = stack.pop().expect("step stack empty");
assert_eq!(cur_step.downcast_ref(), Some(&step)); assert_eq!(cur_step.downcast_ref(), Some(&step));
} }
self.verbose(&format!( self.verbose(&format!("{}< {:?}", " ".repeat(self.stack.borrow().len()), step));
"{}< {:?}",
" ".repeat(self.stack.borrow().len()),
step
));
self.cache.put(step, out.clone()); self.cache.put(step, out.clone());
out out
} }
@ -1388,7 +1328,9 @@ impl Cargo {
} }
pub fn args<I, S>(&mut self, args: I) -> &mut Cargo pub fn args<I, S>(&mut self, args: I) -> &mut Cargo
where I: IntoIterator<Item=S>, S: AsRef<OsStr> where
I: IntoIterator<Item = S>,
S: AsRef<OsStr>,
{ {
for arg in args { for arg in args {
self.arg(arg.as_ref()); self.arg(arg.as_ref());
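`Cargo::args` above is the standard builder pattern: a generic bulk method that forwards each item to the single-argument `arg`. A self-contained sketch of the same shape (the `Cmd` type is illustrative, not bootstrap's actual wrapper):

```rust
use std::ffi::{OsStr, OsString};

/// A builder mirroring `Cargo::args` above: accept any iterable of
/// OsStr-convertible items and push each through the single-arg path.
#[derive(Default)]
struct Cmd {
    args: Vec<OsString>,
}

impl Cmd {
    fn arg(&mut self, arg: &OsStr) -> &mut Cmd {
        self.args.push(arg.to_os_string());
        self
    }

    fn args<I, S>(&mut self, args: I) -> &mut Cmd
    where
        I: IntoIterator<Item = S>,
        S: AsRef<OsStr>,
    {
        for arg in args {
            self.arg(arg.as_ref());
        }
        self
    }
}

fn main() {
    let mut cmd = Cmd::default();
    cmd.args(["build", "--release"]);
    assert_eq!(cmd.args.len(), 2);
    assert_eq!(cmd.args[0], "build");
}
```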


@ -11,12 +11,10 @@ fn configure(host: &[&str], target: &[&str]) -> Config {
config.skip_only_host_steps = false; config.skip_only_host_steps = false;
config.dry_run = true; config.dry_run = true;
// try to avoid spurious failures in dist where we create/delete each other's files // try to avoid spurious failures in dist where we create/delete each other's files
let dir = config.out.join("tmp-rustbuild-tests").join( let dir = config
&thread::current() .out
.name() .join("tmp-rustbuild-tests")
.unwrap_or("unknown") .join(&thread::current().name().unwrap_or("unknown").replace(":", "-"));
.replace(":", "-"),
);
t!(fs::create_dir_all(&dir)); t!(fs::create_dir_all(&dir));
config.out = dir; config.out = dir;
config.build = INTERNER.intern_str("A"); config.build = INTERNER.intern_str("A");
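The test harness above derives a per-test output directory from the current thread's name, replacing `:` (which test-thread names can contain, and which is not portable in paths) with `-`, and falling back to `"unknown"` for unnamed threads. A sketch of that sanitizing step:

```rust
use std::thread;

/// Build a per-test directory name from the current thread's name, as the
/// test setup above does: unnamed threads become "unknown", and ':' is
/// replaced since it is not safe in paths on all platforms.
fn test_dir_name() -> String {
    thread::current().name().unwrap_or("unknown").replace(":", "-")
}

fn main() {
    // When run as a plain binary, the main thread is named "main".
    assert_eq!(test_dir_name(), "main");
}
```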
@ -46,26 +44,15 @@ fn dist_baseline() {
let a = INTERNER.intern_str("A"); let a = INTERNER.intern_str("A");
assert_eq!( assert_eq!(first(builder.cache.all::<dist::Docs>()), &[dist::Docs { host: a },]);
first(builder.cache.all::<dist::Docs>()), assert_eq!(first(builder.cache.all::<dist::Mingw>()), &[dist::Mingw { host: a },]);
&[dist::Docs { host: a },]
);
assert_eq!(
first(builder.cache.all::<dist::Mingw>()),
&[dist::Mingw { host: a },]
);
assert_eq!( assert_eq!(
first(builder.cache.all::<dist::Rustc>()), first(builder.cache.all::<dist::Rustc>()),
&[dist::Rustc { &[dist::Rustc { compiler: Compiler { host: a, stage: 2 } },]
compiler: Compiler { host: a, stage: 2 }
},]
); );
assert_eq!( assert_eq!(
first(builder.cache.all::<dist::Std>()), first(builder.cache.all::<dist::Std>()),
&[dist::Std { &[dist::Std { compiler: Compiler { host: a, stage: 1 }, target: a },]
compiler: Compiler { host: a, stage: 1 },
target: a,
},]
); );
assert_eq!(first(builder.cache.all::<dist::Src>()), &[dist::Src]); assert_eq!(first(builder.cache.all::<dist::Src>()), &[dist::Src]);
} }
@ -81,10 +68,7 @@ fn dist_with_targets() {
assert_eq!( assert_eq!(
first(builder.cache.all::<dist::Docs>()), first(builder.cache.all::<dist::Docs>()),
&[ &[dist::Docs { host: a }, dist::Docs { host: b },]
dist::Docs { host: a },
dist::Docs { host: b },
]
); );
assert_eq!( assert_eq!(
first(builder.cache.all::<dist::Mingw>()), first(builder.cache.all::<dist::Mingw>()),
@ -92,21 +76,13 @@ fn dist_with_targets() {
); );
assert_eq!( assert_eq!(
first(builder.cache.all::<dist::Rustc>()), first(builder.cache.all::<dist::Rustc>()),
&[dist::Rustc { &[dist::Rustc { compiler: Compiler { host: a, stage: 2 } },]
compiler: Compiler { host: a, stage: 2 }
},]
); );
assert_eq!( assert_eq!(
first(builder.cache.all::<dist::Std>()), first(builder.cache.all::<dist::Std>()),
&[ &[
dist::Std { dist::Std { compiler: Compiler { host: a, stage: 1 }, target: a },
compiler: Compiler { host: a, stage: 1 }, dist::Std { compiler: Compiler { host: a, stage: 2 }, target: b },
target: a,
},
dist::Std {
compiler: Compiler { host: a, stage: 2 },
target: b,
},
] ]
); );
assert_eq!(first(builder.cache.all::<dist::Src>()), &[dist::Src]); assert_eq!(first(builder.cache.all::<dist::Src>()), &[dist::Src]);
@ -123,10 +99,7 @@ fn dist_with_hosts() {
assert_eq!( assert_eq!(
first(builder.cache.all::<dist::Docs>()), first(builder.cache.all::<dist::Docs>()),
&[ &[dist::Docs { host: a }, dist::Docs { host: b },]
dist::Docs { host: a },
dist::Docs { host: b },
]
); );
assert_eq!( assert_eq!(
first(builder.cache.all::<dist::Mingw>()), first(builder.cache.all::<dist::Mingw>()),
@ -135,25 +108,15 @@ fn dist_with_hosts() {
assert_eq!( assert_eq!(
first(builder.cache.all::<dist::Rustc>()), first(builder.cache.all::<dist::Rustc>()),
&[ &[
dist::Rustc { dist::Rustc { compiler: Compiler { host: a, stage: 2 } },
compiler: Compiler { host: a, stage: 2 } dist::Rustc { compiler: Compiler { host: b, stage: 2 } },
},
dist::Rustc {
compiler: Compiler { host: b, stage: 2 }
},
] ]
); );
assert_eq!( assert_eq!(
first(builder.cache.all::<dist::Std>()), first(builder.cache.all::<dist::Std>()),
&[ &[
dist::Std { dist::Std { compiler: Compiler { host: a, stage: 1 }, target: a },
compiler: Compiler { host: a, stage: 1 }, dist::Std { compiler: Compiler { host: a, stage: 1 }, target: b },
target: a,
},
dist::Std {
compiler: Compiler { host: a, stage: 1 },
target: b,
},
] ]
); );
assert_eq!(first(builder.cache.all::<dist::Src>()), &[dist::Src]); assert_eq!(first(builder.cache.all::<dist::Src>()), &[dist::Src]);
@ -172,23 +135,13 @@ fn dist_only_cross_host() {
assert_eq!( assert_eq!(
first(builder.cache.all::<dist::Rustc>()), first(builder.cache.all::<dist::Rustc>()),
&[ &[dist::Rustc { compiler: Compiler { host: b, stage: 2 } },]
dist::Rustc {
compiler: Compiler { host: b, stage: 2 }
},
]
); );
assert_eq!( assert_eq!(
first(builder.cache.all::<compile::Rustc>()), first(builder.cache.all::<compile::Rustc>()),
&[ &[
compile::Rustc { compile::Rustc { compiler: Compiler { host: a, stage: 0 }, target: a },
compiler: Compiler { host: a, stage: 0 }, compile::Rustc { compiler: Compiler { host: a, stage: 1 }, target: b },
target: a,
},
compile::Rustc {
compiler: Compiler { host: a, stage: 1 },
target: b,
},
] ]
); );
} }
@ -205,46 +158,25 @@ fn dist_with_targets_and_hosts() {
assert_eq!( assert_eq!(
first(builder.cache.all::<dist::Docs>()), first(builder.cache.all::<dist::Docs>()),
&[ &[dist::Docs { host: a }, dist::Docs { host: b }, dist::Docs { host: c },]
dist::Docs { host: a },
dist::Docs { host: b },
dist::Docs { host: c },
]
); );
assert_eq!( assert_eq!(
first(builder.cache.all::<dist::Mingw>()), first(builder.cache.all::<dist::Mingw>()),
&[ &[dist::Mingw { host: a }, dist::Mingw { host: b }, dist::Mingw { host: c },]
dist::Mingw { host: a },
dist::Mingw { host: b },
dist::Mingw { host: c },
]
); );
assert_eq!( assert_eq!(
first(builder.cache.all::<dist::Rustc>()), first(builder.cache.all::<dist::Rustc>()),
&[ &[
dist::Rustc { dist::Rustc { compiler: Compiler { host: a, stage: 2 } },
compiler: Compiler { host: a, stage: 2 } dist::Rustc { compiler: Compiler { host: b, stage: 2 } },
},
dist::Rustc {
compiler: Compiler { host: b, stage: 2 }
},
] ]
); );
assert_eq!( assert_eq!(
first(builder.cache.all::<dist::Std>()), first(builder.cache.all::<dist::Std>()),
&[ &[
dist::Std { dist::Std { compiler: Compiler { host: a, stage: 1 }, target: a },
compiler: Compiler { host: a, stage: 1 }, dist::Std { compiler: Compiler { host: a, stage: 1 }, target: b },
target: a, dist::Std { compiler: Compiler { host: a, stage: 2 }, target: c },
},
dist::Std {
compiler: Compiler { host: a, stage: 1 },
target: b,
},
dist::Std {
compiler: Compiler { host: a, stage: 2 },
target: c,
},
] ]
); );
assert_eq!(first(builder.cache.all::<dist::Src>()), &[dist::Src]); assert_eq!(first(builder.cache.all::<dist::Src>()), &[dist::Src]);
@ -264,36 +196,19 @@ fn dist_with_target_flag() {
assert_eq!( assert_eq!(
first(builder.cache.all::<dist::Docs>()), first(builder.cache.all::<dist::Docs>()),
&[ &[dist::Docs { host: a }, dist::Docs { host: b }, dist::Docs { host: c },]
dist::Docs { host: a },
dist::Docs { host: b },
dist::Docs { host: c },
]
); );
assert_eq!( assert_eq!(
first(builder.cache.all::<dist::Mingw>()), first(builder.cache.all::<dist::Mingw>()),
&[ &[dist::Mingw { host: a }, dist::Mingw { host: b }, dist::Mingw { host: c },]
dist::Mingw { host: a },
dist::Mingw { host: b },
dist::Mingw { host: c },
]
); );
assert_eq!(first(builder.cache.all::<dist::Rustc>()), &[]); assert_eq!(first(builder.cache.all::<dist::Rustc>()), &[]);
assert_eq!( assert_eq!(
first(builder.cache.all::<dist::Std>()), first(builder.cache.all::<dist::Std>()),
&[ &[
dist::Std { dist::Std { compiler: Compiler { host: a, stage: 1 }, target: a },
compiler: Compiler { host: a, stage: 1 }, dist::Std { compiler: Compiler { host: a, stage: 1 }, target: b },
target: a, dist::Std { compiler: Compiler { host: a, stage: 2 }, target: c },
},
dist::Std {
compiler: Compiler { host: a, stage: 1 },
target: b,
},
dist::Std {
compiler: Compiler { host: a, stage: 2 },
target: c,
},
] ]
); );
assert_eq!(first(builder.cache.all::<dist::Src>()), &[]); assert_eq!(first(builder.cache.all::<dist::Src>()), &[]);
@ -310,10 +225,7 @@ fn dist_with_same_targets_and_hosts() {
assert_eq!( assert_eq!(
first(builder.cache.all::<dist::Docs>()), first(builder.cache.all::<dist::Docs>()),
&[ &[dist::Docs { host: a }, dist::Docs { host: b },]
dist::Docs { host: a },
dist::Docs { host: b },
]
); );
assert_eq!( assert_eq!(
first(builder.cache.all::<dist::Mingw>()), first(builder.cache.all::<dist::Mingw>()),
@ -322,68 +234,35 @@ fn dist_with_same_targets_and_hosts() {
assert_eq!( assert_eq!(
first(builder.cache.all::<dist::Rustc>()), first(builder.cache.all::<dist::Rustc>()),
&[ &[
dist::Rustc { dist::Rustc { compiler: Compiler { host: a, stage: 2 } },
compiler: Compiler { host: a, stage: 2 } dist::Rustc { compiler: Compiler { host: b, stage: 2 } },
},
dist::Rustc {
compiler: Compiler { host: b, stage: 2 }
},
] ]
); );
assert_eq!( assert_eq!(
first(builder.cache.all::<dist::Std>()), first(builder.cache.all::<dist::Std>()),
&[ &[
dist::Std { dist::Std { compiler: Compiler { host: a, stage: 1 }, target: a },
compiler: Compiler { host: a, stage: 1 }, dist::Std { compiler: Compiler { host: a, stage: 1 }, target: b },
target: a,
},
dist::Std {
compiler: Compiler { host: a, stage: 1 },
target: b,
},
] ]
); );
assert_eq!(first(builder.cache.all::<dist::Src>()), &[dist::Src]); assert_eq!(first(builder.cache.all::<dist::Src>()), &[dist::Src]);
assert_eq!( assert_eq!(
first(builder.cache.all::<compile::Std>()), first(builder.cache.all::<compile::Std>()),
&[ &[
compile::Std { compile::Std { compiler: Compiler { host: a, stage: 0 }, target: a },
compiler: Compiler { host: a, stage: 0 }, compile::Std { compiler: Compiler { host: a, stage: 1 }, target: a },
target: a, compile::Std { compiler: Compiler { host: a, stage: 2 }, target: a },
}, compile::Std { compiler: Compiler { host: a, stage: 1 }, target: b },
compile::Std { compile::Std { compiler: Compiler { host: a, stage: 2 }, target: b },
compiler: Compiler { host: a, stage: 1 },
target: a,
},
compile::Std {
compiler: Compiler { host: a, stage: 2 },
target: a,
},
compile::Std {
compiler: Compiler { host: a, stage: 1 },
target: b,
},
compile::Std {
compiler: Compiler { host: a, stage: 2 },
target: b,
},
] ]
); );
assert_eq!( assert_eq!(
first(builder.cache.all::<compile::Assemble>()), first(builder.cache.all::<compile::Assemble>()),
&[ &[
compile::Assemble { compile::Assemble { target_compiler: Compiler { host: a, stage: 0 } },
target_compiler: Compiler { host: a, stage: 0 }, compile::Assemble { target_compiler: Compiler { host: a, stage: 1 } },
}, compile::Assemble { target_compiler: Compiler { host: a, stage: 2 } },
compile::Assemble { compile::Assemble { target_compiler: Compiler { host: b, stage: 2 } },
target_compiler: Compiler { host: a, stage: 1 },
},
compile::Assemble {
target_compiler: Compiler { host: a, stage: 2 },
},
compile::Assemble {
target_compiler: Compiler { host: b, stage: 2 },
},
] ]
); );
} }
@ -401,76 +280,28 @@ fn build_default() {
assert_eq!( assert_eq!(
first(builder.cache.all::<compile::Std>()), first(builder.cache.all::<compile::Std>()),
&[ &[
compile::Std { compile::Std { compiler: Compiler { host: a, stage: 0 }, target: a },
compiler: Compiler { host: a, stage: 0 }, compile::Std { compiler: Compiler { host: a, stage: 1 }, target: a },
target: a, compile::Std { compiler: Compiler { host: a, stage: 2 }, target: a },
}, compile::Std { compiler: Compiler { host: b, stage: 2 }, target: a },
compile::Std { compile::Std { compiler: Compiler { host: a, stage: 1 }, target: b },
compiler: Compiler { host: a, stage: 1 }, compile::Std { compiler: Compiler { host: a, stage: 2 }, target: b },
target: a, compile::Std { compiler: Compiler { host: b, stage: 2 }, target: b },
}, compile::Std { compiler: Compiler { host: a, stage: 2 }, target: c },
compile::Std { compile::Std { compiler: Compiler { host: b, stage: 2 }, target: c },
compiler: Compiler { host: a, stage: 2 },
target: a,
},
compile::Std {
compiler: Compiler { host: b, stage: 2 },
target: a,
},
compile::Std {
compiler: Compiler { host: a, stage: 1 },
target: b,
},
compile::Std {
compiler: Compiler { host: a, stage: 2 },
target: b,
},
compile::Std {
compiler: Compiler { host: b, stage: 2 },
target: b,
},
compile::Std {
compiler: Compiler { host: a, stage: 2 },
target: c,
},
compile::Std {
compiler: Compiler { host: b, stage: 2 },
target: c,
},
] ]
); );
     assert!(!builder.cache.all::<compile::Assemble>().is_empty());
     assert_eq!(
         first(builder.cache.all::<compile::Rustc>()),
         &[
-            compile::Rustc {
-                compiler: Compiler { host: a, stage: 0 },
-                target: a,
-            },
-            compile::Rustc {
-                compiler: Compiler { host: a, stage: 1 },
-                target: a,
-            },
-            compile::Rustc {
-                compiler: Compiler { host: a, stage: 2 },
-                target: a,
-            },
-            compile::Rustc {
-                compiler: Compiler { host: b, stage: 2 },
-                target: a,
-            },
-            compile::Rustc {
-                compiler: Compiler { host: a, stage: 1 },
-                target: b,
-            },
-            compile::Rustc {
-                compiler: Compiler { host: a, stage: 2 },
-                target: b,
-            },
-            compile::Rustc {
-                compiler: Compiler { host: b, stage: 2 },
-                target: b,
-            },
+            compile::Rustc { compiler: Compiler { host: a, stage: 0 }, target: a },
+            compile::Rustc { compiler: Compiler { host: a, stage: 1 }, target: a },
+            compile::Rustc { compiler: Compiler { host: a, stage: 2 }, target: a },
+            compile::Rustc { compiler: Compiler { host: b, stage: 2 }, target: a },
+            compile::Rustc { compiler: Compiler { host: a, stage: 1 }, target: b },
+            compile::Rustc { compiler: Compiler { host: a, stage: 2 }, target: b },
+            compile::Rustc { compiler: Compiler { host: b, stage: 2 }, target: b },
         ]
     );
 }
@@ -490,76 +321,32 @@ fn build_with_target_flag() {
     assert_eq!(
         first(builder.cache.all::<compile::Std>()),
         &[
-            compile::Std {
-                compiler: Compiler { host: a, stage: 0 },
-                target: a,
-            },
-            compile::Std {
-                compiler: Compiler { host: a, stage: 1 },
-                target: a,
-            },
-            compile::Std {
-                compiler: Compiler { host: a, stage: 2 },
-                target: a,
-            },
-            compile::Std {
-                compiler: Compiler { host: b, stage: 2 },
-                target: a,
-            },
-            compile::Std {
-                compiler: Compiler { host: a, stage: 1 },
-                target: b,
-            },
-            compile::Std {
-                compiler: Compiler { host: a, stage: 2 },
-                target: b,
-            },
-            compile::Std {
-                compiler: Compiler { host: b, stage: 2 },
-                target: b,
-            },
-            compile::Std {
-                compiler: Compiler { host: a, stage: 2 },
-                target: c,
-            },
-            compile::Std {
-                compiler: Compiler { host: b, stage: 2 },
-                target: c,
-            },
+            compile::Std { compiler: Compiler { host: a, stage: 0 }, target: a },
+            compile::Std { compiler: Compiler { host: a, stage: 1 }, target: a },
+            compile::Std { compiler: Compiler { host: a, stage: 2 }, target: a },
+            compile::Std { compiler: Compiler { host: b, stage: 2 }, target: a },
+            compile::Std { compiler: Compiler { host: a, stage: 1 }, target: b },
+            compile::Std { compiler: Compiler { host: a, stage: 2 }, target: b },
+            compile::Std { compiler: Compiler { host: b, stage: 2 }, target: b },
+            compile::Std { compiler: Compiler { host: a, stage: 2 }, target: c },
+            compile::Std { compiler: Compiler { host: b, stage: 2 }, target: c },
         ]
     );
     assert_eq!(
         first(builder.cache.all::<compile::Assemble>()),
         &[
-            compile::Assemble {
-                target_compiler: Compiler { host: a, stage: 0 },
-            },
-            compile::Assemble {
-                target_compiler: Compiler { host: a, stage: 1 },
-            },
-            compile::Assemble {
-                target_compiler: Compiler { host: a, stage: 2 },
-            },
-            compile::Assemble {
-                target_compiler: Compiler { host: b, stage: 2 },
-            },
+            compile::Assemble { target_compiler: Compiler { host: a, stage: 0 } },
+            compile::Assemble { target_compiler: Compiler { host: a, stage: 1 } },
+            compile::Assemble { target_compiler: Compiler { host: a, stage: 2 } },
+            compile::Assemble { target_compiler: Compiler { host: b, stage: 2 } },
         ]
     );
     assert_eq!(
         first(builder.cache.all::<compile::Rustc>()),
         &[
-            compile::Rustc {
-                compiler: Compiler { host: a, stage: 0 },
-                target: a,
-            },
-            compile::Rustc {
-                compiler: Compiler { host: a, stage: 1 },
-                target: a,
-            },
-            compile::Rustc {
-                compiler: Compiler { host: a, stage: 1 },
-                target: b,
-            },
+            compile::Rustc { compiler: Compiler { host: a, stage: 0 }, target: a },
+            compile::Rustc { compiler: Compiler { host: a, stage: 1 }, target: a },
+            compile::Rustc { compiler: Compiler { host: a, stage: 1 }, target: b },
         ]
     );
 }
@@ -585,10 +372,8 @@ fn test_with_no_doc_stage0() {
     let host = INTERNER.intern_str("A");
 
-    builder.run_step_descriptions(
-        &[StepDescription::from::<test::Crate>()],
-        &["src/libstd".into()],
-    );
+    builder
+        .run_step_descriptions(&[StepDescription::from::<test::Crate>()], &["src/libstd".into()]);
 
     // Ensure we don't build any compiler artifacts.
     assert!(!builder.cache.contains::<compile::Rustc>());
@@ -607,9 +392,7 @@ fn test_with_no_doc_stage0() {
 #[test]
 fn test_exclude() {
     let mut config = configure(&[], &[]);
-    config.exclude = vec![
-        "src/tools/tidy".into(),
-    ];
+    config.exclude = vec!["src/tools/tidy".into()];
     config.cmd = Subcommand::Test {
         paths: Vec::new(),
         test_args: Vec::new(),

src/bootstrap/cache.rs

View File

@@ -1,6 +1,7 @@
 use std::any::{Any, TypeId};
 use std::borrow::Borrow;
 use std::cell::RefCell;
+use std::cmp::{Ord, Ordering, PartialOrd};
 use std::collections::HashMap;
 use std::convert::AsRef;
 use std::ffi::OsStr;
@@ -11,7 +12,6 @@ use std::mem;
 use std::ops::Deref;
 use std::path::{Path, PathBuf};
 use std::sync::Mutex;
-use std::cmp::{PartialOrd, Ord, Ordering};
 
 use lazy_static::lazy_static;
@@ -47,7 +47,7 @@ impl<T> Eq for Interned<T> {}
 impl PartialEq<str> for Interned<String> {
     fn eq(&self, other: &str) -> bool {
         *self == other
     }
 }
 impl<'a> PartialEq<&'a str> for Interned<String> {
@@ -168,24 +168,21 @@ struct TyIntern<T: Clone + Eq> {
 impl<T: Hash + Clone + Eq> Default for TyIntern<T> {
     fn default() -> Self {
-        TyIntern {
-            items: Vec::new(),
-            set: Default::default(),
-        }
+        TyIntern { items: Vec::new(), set: Default::default() }
     }
 }
 
 impl<T: Hash + Clone + Eq> TyIntern<T> {
     fn intern_borrow<B>(&mut self, item: &B) -> Interned<T>
     where
-        B: Eq + Hash + ToOwned<Owned=T> + ?Sized,
+        B: Eq + Hash + ToOwned<Owned = T> + ?Sized,
         T: Borrow<B>,
     {
         if let Some(i) = self.set.get(&item) {
             return *i;
         }
         let item = item.to_owned();
         let interned = Interned(self.items.len(), PhantomData::<*const T>);
         self.set.insert(item.clone(), interned);
         self.items.push(item);
         interned
@@ -195,7 +192,7 @@ impl<T: Hash + Clone + Eq> TyIntern<T> {
         if let Some(i) = self.set.get(&item) {
             return *i;
         }
         let interned = Interned(self.items.len(), PhantomData::<*const T>);
         self.set.insert(item.clone(), interned);
         self.items.push(item);
         interned
@@ -235,10 +232,12 @@ lazy_static! {
 /// `get()` method.
 #[derive(Debug)]
 pub struct Cache(
-    RefCell<HashMap<
-        TypeId,
-        Box<dyn Any>, // actually a HashMap<Step, Interned<Step::Output>>
-    >>
+    RefCell<
+        HashMap<
+            TypeId,
+            Box<dyn Any>, // actually a HashMap<Step, Interned<Step::Output>>
+        >,
+    >,
 );
 
 impl Cache {
@@ -249,10 +248,11 @@ impl Cache {
     pub fn put<S: Step>(&self, step: S, value: S::Output) {
         let mut cache = self.0.borrow_mut();
         let type_id = TypeId::of::<S>();
-        let stepcache = cache.entry(type_id)
-            .or_insert_with(|| Box::new(HashMap::<S, S::Output>::new()))
-            .downcast_mut::<HashMap<S, S::Output>>()
-            .expect("invalid type mapped");
+        let stepcache = cache
+            .entry(type_id)
+            .or_insert_with(|| Box::new(HashMap::<S, S::Output>::new()))
+            .downcast_mut::<HashMap<S, S::Output>>()
+            .expect("invalid type mapped");
         assert!(!stepcache.contains_key(&step), "processing {:?} a second time", step);
         stepcache.insert(step, value);
     }
@@ -260,10 +260,11 @@ impl Cache {
     pub fn get<S: Step>(&self, step: &S) -> Option<S::Output> {
         let mut cache = self.0.borrow_mut();
         let type_id = TypeId::of::<S>();
-        let stepcache = cache.entry(type_id)
-            .or_insert_with(|| Box::new(HashMap::<S, S::Output>::new()))
-            .downcast_mut::<HashMap<S, S::Output>>()
-            .expect("invalid type mapped");
+        let stepcache = cache
+            .entry(type_id)
+            .or_insert_with(|| Box::new(HashMap::<S, S::Output>::new()))
+            .downcast_mut::<HashMap<S, S::Output>>()
+            .expect("invalid type mapped");
         stepcache.get(step).cloned()
     }
 }
@@ -273,7 +274,8 @@ impl Cache {
     pub fn all<S: Ord + Copy + Step>(&mut self) -> Vec<(S, S::Output)> {
         let cache = self.0.get_mut();
         let type_id = TypeId::of::<S>();
-        let mut v = cache.remove(&type_id)
+        let mut v = cache
+            .remove(&type_id)
             .map(|b| b.downcast::<HashMap<S, S::Output>>().expect("correct type"))
             .map(|m| m.into_iter().collect::<Vec<_>>())
             .unwrap_or_default();
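The `Cache` hunk above stores one `HashMap<S, S::Output>` per step type behind a `Box<dyn Any>` keyed by `TypeId`. A minimal standalone sketch of that type-erased-cache pattern (names like `TypedCache` and `demo` are illustrative, not bootstrap's API; bootstrap keys by the `Step` type, which fixes the value type, so the sketch assumes one value type per key type):

```rust
use std::any::{Any, TypeId};
use std::collections::HashMap;
use std::hash::Hash;

/// One `HashMap<K, V>` per key type, type-erased as `Box<dyn Any>`.
#[derive(Default)]
struct TypedCache {
    maps: HashMap<TypeId, Box<dyn Any>>,
}

impl TypedCache {
    fn put<K: Eq + Hash + 'static, V: Clone + 'static>(&mut self, key: K, value: V) {
        self.maps
            .entry(TypeId::of::<K>())
            // First use of this key type: allocate its concrete map.
            .or_insert_with(|| Box::new(HashMap::<K, V>::new()))
            // Recover the concrete map from `dyn Any`.
            .downcast_mut::<HashMap<K, V>>()
            .expect("invalid type mapped")
            .insert(key, value);
    }

    fn get<K: Eq + Hash + 'static, V: Clone + 'static>(&self, key: &K) -> Option<V> {
        self.maps
            .get(&TypeId::of::<K>())?
            .downcast_ref::<HashMap<K, V>>()?
            .get(key)
            .cloned()
    }
}

fn demo() -> Option<String> {
    let mut cache = TypedCache::default();
    cache.put(1u32, "std".to_string());
    cache.put("stage", 2i64); // a second key type gets its own map
    cache.get::<u32, String>(&1)
}

fn main() {
    assert_eq!(demo(), Some("std".to_string()));
    println!("ok");
}
```

The `downcast_mut`/`downcast_ref` calls are what `Cache::put`/`Cache::get` in the hunk rely on; the `TypeId` key guarantees each boxed map is only ever downcast to the type it was created with.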

src/bootstrap/cc_detect.rs

View File

@@ -22,15 +22,15 @@
 //! everything.
 
 use std::collections::HashSet;
-use std::{env, iter};
 use std::path::{Path, PathBuf};
 use std::process::Command;
+use std::{env, iter};
 
 use build_helper::output;
 
-use crate::{Build, GitRepo};
-use crate::config::Target;
 use crate::cache::Interned;
+use crate::config::Target;
+use crate::{Build, GitRepo};
 
 // The `cc` crate doesn't provide a way to obtain a path to the detected archiver,
 // so use some simplified logic here. First we respect the environment variable `AR`, then
@@ -64,14 +64,25 @@ fn cc2ar(cc: &Path, target: &str) -> Option<PathBuf> {
 pub fn find(build: &mut Build) {
     // For all targets we're going to need a C compiler for building some shims
     // and such as well as for being a linker for Rust code.
-    let targets = build.targets.iter().chain(&build.hosts).cloned().chain(iter::once(build.build))
-        .collect::<HashSet<_>>();
+    let targets = build
+        .targets
+        .iter()
+        .chain(&build.hosts)
+        .cloned()
+        .chain(iter::once(build.build))
+        .collect::<HashSet<_>>();
     for target in targets.into_iter() {
         let mut cfg = cc::Build::new();
-        cfg.cargo_metadata(false).opt_level(2).warnings(false).debug(false)
-            .target(&target).host(&build.build);
+        cfg.cargo_metadata(false)
+            .opt_level(2)
+            .warnings(false)
+            .debug(false)
+            .target(&target)
+            .host(&build.build);
         match build.crt_static(target) {
-            Some(a) => { cfg.static_crt(a); }
+            Some(a) => {
+                cfg.static_crt(a);
+            }
             None => {
                 if target.contains("msvc") {
                     cfg.static_crt(true);
@@ -102,8 +113,13 @@ pub fn find(build: &mut Build) {
         // If we use llvm-libunwind, we will need a C++ compiler as well for all targets
         // We'll need one anyways if the target triple is also a host triple
         let mut cfg = cc::Build::new();
-        cfg.cargo_metadata(false).opt_level(2).warnings(false).debug(false).cpp(true)
-            .target(&target).host(&build.build);
+        cfg.cargo_metadata(false)
+            .opt_level(2)
+            .warnings(false)
+            .debug(false)
+            .cpp(true)
+            .target(&target)
+            .host(&build.build);
 
         let cxx_configured = if let Some(cxx) = config.and_then(|c| c.cxx.as_ref()) {
             cfg.compiler(cxx);
@@ -133,21 +149,24 @@ pub fn find(build: &mut Build) {
     }
 }
 
-fn set_compiler(cfg: &mut cc::Build,
-                compiler: Language,
-                target: Interned<String>,
-                config: Option<&Target>,
-                build: &Build) {
+fn set_compiler(
+    cfg: &mut cc::Build,
+    compiler: Language,
+    target: Interned<String>,
+    config: Option<&Target>,
+    build: &Build,
+) {
     match &*target {
         // When compiling for android we may have the NDK configured in the
        // config.toml in which case we look there. Otherwise the default
         // compiler already takes into account the triple in question.
         t if t.contains("android") => {
             if let Some(ndk) = config.and_then(|c| c.ndk.as_ref()) {
-                let target = target.replace("armv7neon", "arm")
-                    .replace("armv7", "arm")
-                    .replace("thumbv7neon", "arm")
-                    .replace("thumbv7", "arm");
+                let target = target
+                    .replace("armv7neon", "arm")
+                    .replace("armv7", "arm")
+                    .replace("thumbv7neon", "arm")
+                    .replace("thumbv7", "arm");
                 let compiler = format!("{}-{}", target, compiler.clang());
                 cfg.compiler(ndk.join("bin").join(compiler));
             }
@@ -159,7 +178,7 @@ fn set_compiler(cfg: &mut cc::Build,
             let c = cfg.get_compiler();
             let gnu_compiler = compiler.gcc();
             if !c.path().ends_with(gnu_compiler) {
-                return
+                return;
             }
 
             let output = output(c.to_command().arg("--version"));
@@ -168,7 +187,7 @@ fn set_compiler(cfg: &mut cc::Build,
                 None => return,
             };
             match output[i + 3..].chars().next().unwrap() {
-                '0' ..= '6' => {}
+                '0'..='6' => {}
                 _ => return,
             }
 
             let alternative = format!("e{}", gnu_compiler);
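The Android arm of `set_compiler` above derives the NDK toolchain binary name by collapsing the ARM triple variants to a plain `arm` prefix and appending the compiler name. That string transformation can be sketched on its own (`ndk_compiler` is an illustrative helper, and the plain `"clang"` string stands in for `Language::clang()`):

```rust
/// Collapse ARM triple flavours to the `arm` prefix used by NDK toolchain
/// binaries, then append the compiler name, as in the hunk above.
fn ndk_compiler(target: &str, compiler: &str) -> String {
    let target = target
        .replace("armv7neon", "arm")
        .replace("armv7", "arm")
        .replace("thumbv7neon", "arm")
        .replace("thumbv7", "arm");
    format!("{}-{}", target, compiler)
}

fn main() {
    // armv7 and thumbv7neon both normalize to the same arm- toolchain name.
    assert_eq!(ndk_compiler("armv7-linux-androideabi", "clang"), "arm-linux-androideabi-clang");
    assert_eq!(
        ndk_compiler("thumbv7neon-linux-androideabi", "clang"),
        "arm-linux-androideabi-clang"
    );
    println!("ok");
}
```

The replacement order matters: the longer `armv7neon`/`thumbv7neon` patterns must be handled before their `armv7`/`thumbv7` prefixes would otherwise match.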

src/bootstrap/channel.rs

View File

@@ -13,7 +13,7 @@ use build_helper::output;
 use crate::Build;
 
 // The version number
-pub const CFG_RELEASE_NUM: &str = "1.41.1";
+pub const CFG_RELEASE_NUM: &str = "1.42.0";
 
 pub struct GitInfo {
     inner: Option<Info>,

src/bootstrap/check.rs

View File

@@ -1,10 +1,10 @@
 //! Implementation of compiling the compiler and standard library, in "check"-based modes.
 
-use crate::compile::{run_cargo, std_cargo, rustc_cargo, add_to_sysroot};
-use crate::builder::{RunConfig, Builder, Kind, ShouldRun, Step};
+use crate::builder::{Builder, Kind, RunConfig, ShouldRun, Step};
+use crate::cache::Interned;
+use crate::compile::{add_to_sysroot, run_cargo, rustc_cargo, std_cargo};
 use crate::tool::{prepare_tool_cargo, SourceType};
 use crate::{Compiler, Mode};
-use crate::cache::Interned;
 use std::path::PathBuf;
 
 #[derive(Debug, Copy, Clone, PartialEq, Eq, Hash)]
@@ -15,7 +15,7 @@ pub struct Std {
 fn args(kind: Kind) -> Vec<String> {
     match kind {
         Kind::Clippy => vec!["--".to_owned(), "--cap-lints".to_owned(), "warn".to_owned()],
-        _ => Vec::new()
+        _ => Vec::new(),
     }
 }
 
@@ -24,7 +24,7 @@ fn cargo_subcommand(kind: Kind) -> &'static str {
         Kind::Check => "check",
         Kind::Clippy => "clippy",
         Kind::Fix => "fix",
-        _ => unreachable!()
+        _ => unreachable!(),
     }
 }
 
@@ -37,9 +37,7 @@ impl Step for Std {
     }
 
     fn make_run(run: RunConfig<'_>) {
-        run.builder.ensure(Std {
-            target: run.target,
-        });
+        run.builder.ensure(Std { target: run.target });
     }
 
     fn run(self, builder: &Builder<'_>) {
@@ -47,15 +45,17 @@ impl Step for Std {
         let compiler = builder.compiler(0, builder.config.build);
 
         let mut cargo = builder.cargo(compiler, Mode::Std, target, cargo_subcommand(builder.kind));
-        std_cargo(builder, &compiler, target, &mut cargo);
+        std_cargo(builder, target, &mut cargo);
 
         builder.info(&format!("Checking std artifacts ({} -> {})", &compiler.host, target));
-        run_cargo(builder,
-                  cargo,
-                  args(builder.kind),
-                  &libstd_stamp(builder, compiler, target),
-                  vec![],
-                  true);
+        run_cargo(
+            builder,
+            cargo,
+            args(builder.kind),
+            &libstd_stamp(builder, compiler, target),
+            vec![],
+            true,
+        );
 
         let libdir = builder.sysroot_libdir(compiler, target);
         let hostdir = builder.sysroot_libdir(compiler, compiler.host);
@@ -78,9 +78,7 @@ impl Step for Rustc {
     }
 
     fn make_run(run: RunConfig<'_>) {
-        run.builder.ensure(Rustc {
-            target: run.target,
-        });
+        run.builder.ensure(Rustc { target: run.target });
     }
 
     /// Builds the compiler.
@@ -94,17 +92,19 @@ impl Step for Rustc {
         builder.ensure(Std { target });
 
-        let mut cargo = builder.cargo(compiler, Mode::Rustc, target,
-            cargo_subcommand(builder.kind));
+        let mut cargo =
+            builder.cargo(compiler, Mode::Rustc, target, cargo_subcommand(builder.kind));
         rustc_cargo(builder, &mut cargo, target);
 
         builder.info(&format!("Checking compiler artifacts ({} -> {})", &compiler.host, target));
-        run_cargo(builder,
-                  cargo,
-                  args(builder.kind),
-                  &librustc_stamp(builder, compiler, target),
-                  vec![],
-                  true);
+        run_cargo(
+            builder,
+            cargo,
+            args(builder.kind),
+            &librustc_stamp(builder, compiler, target),
+            vec![],
+            true,
+        );
 
         let libdir = builder.sysroot_libdir(compiler, target);
         let hostdir = builder.sysroot_libdir(compiler, compiler.host);
@@ -127,9 +127,7 @@ impl Step for Rustdoc {
     }
 
     fn make_run(run: RunConfig<'_>) {
-        run.builder.ensure(Rustdoc {
-            target: run.target,
-        });
+        run.builder.ensure(Rustdoc { target: run.target });
     }
 
     fn run(self, builder: &Builder<'_>) {
@@ -138,22 +136,26 @@ impl Step for Rustdoc {
         builder.ensure(Rustc { target });
 
-        let cargo = prepare_tool_cargo(builder,
-                                       compiler,
-                                       Mode::ToolRustc,
-                                       target,
-                                       cargo_subcommand(builder.kind),
-                                       "src/tools/rustdoc",
-                                       SourceType::InTree,
-                                       &[]);
+        let cargo = prepare_tool_cargo(
+            builder,
+            compiler,
+            Mode::ToolRustc,
+            target,
+            cargo_subcommand(builder.kind),
+            "src/tools/rustdoc",
+            SourceType::InTree,
+            &[],
+        );
 
         println!("Checking rustdoc artifacts ({} -> {})", &compiler.host, target);
-        run_cargo(builder,
-                  cargo,
-                  args(builder.kind),
-                  &rustdoc_stamp(builder, compiler, target),
-                  vec![],
-                  true);
+        run_cargo(
+            builder,
+            cargo,
+            args(builder.kind),
+            &rustdoc_stamp(builder, compiler, target),
+            vec![],
+            true,
+        );
 
         let libdir = builder.sysroot_libdir(compiler, target);
         let hostdir = builder.sysroot_libdir(compiler, compiler.host);
@@ -188,6 +190,5 @@ pub fn rustdoc_stamp(
     compiler: Compiler,
     target: Interned<String>,
 ) -> PathBuf {
-    builder.cargo_out(compiler, Mode::ToolRustc, target)
-        .join(".rustdoc-check.stamp")
+    builder.cargo_out(compiler, Mode::ToolRustc, target).join(".rustdoc-check.stamp")
 }

src/bootstrap/clean.rs

View File

@@ -31,7 +31,7 @@ pub fn clean(build: &Build, all: bool) {
         for entry in entries {
             let entry = t!(entry);
             if entry.file_name().to_str() == Some("llvm") {
-                continue
+                continue;
             }
             let path = t!(entry.path().canonicalize());
             rm_rf(&path);
@@ -47,7 +47,7 @@ fn rm_rf(path: &Path) {
                 return;
             }
             panic!("failed to get metadata for file {}: {}", path.display(), e);
-        },
+        }
         Ok(metadata) => {
             if metadata.file_type().is_file() || metadata.file_type().is_symlink() {
                 do_op(path, "remove file", |p| fs::remove_file(p));
@@ -58,20 +58,20 @@ fn rm_rf(path: &Path) {
                 rm_rf(&t!(file).path());
             }
             do_op(path, "remove dir", |p| fs::remove_dir(p));
-        },
+        }
     };
 }
 
 fn do_op<F>(path: &Path, desc: &str, mut f: F)
-    where F: FnMut(&Path) -> io::Result<()>
+where
+    F: FnMut(&Path) -> io::Result<()>,
 {
     match f(path) {
         Ok(()) => {}
         // On windows we can't remove a readonly file, and git will often clone files as readonly.
         // As a result, we have some special logic to remove readonly files on windows.
         // This is also the reason that we can't use things like fs::remove_dir_all().
-        Err(ref e) if cfg!(windows) &&
-                      e.kind() == ErrorKind::PermissionDenied => {
+        Err(ref e) if cfg!(windows) && e.kind() == ErrorKind::PermissionDenied => {
            let mut p = t!(path.symlink_metadata()).permissions();
             p.set_readonly(false);
             t!(fs::set_permissions(path, p));
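The `do_op` hunk above encodes the retry logic the comments describe: on Windows a readonly file cannot be deleted, so a `PermissionDenied` error is answered by clearing the readonly bit and retrying. A self-contained sketch of that pattern (simplified: `remove_file_force` is an illustrative name, the `t!` macro is replaced by `?`, and the `cfg!(windows)` guard is dropped so it runs anywhere):

```rust
use std::fs;
use std::io::{self, ErrorKind};
use std::path::Path;

/// Remove a file; on PermissionDenied, clear the readonly bit and retry,
/// mirroring the retry branch of `do_op` in the hunk above.
fn remove_file_force(path: &Path) -> io::Result<()> {
    match fs::remove_file(path) {
        Ok(()) => Ok(()),
        Err(ref e) if e.kind() == ErrorKind::PermissionDenied => {
            // Git often checks files out readonly on Windows; make the file
            // writable, then retry the removal once.
            let mut perms = path.symlink_metadata()?.permissions();
            perms.set_readonly(false);
            fs::set_permissions(path, perms)?;
            fs::remove_file(path)
        }
        Err(e) => Err(e),
    }
}

fn main() -> io::Result<()> {
    // Demonstrate on a throwaway readonly file in the temp directory.
    let path = std::env::temp_dir().join("rm_rf_demo.txt");
    fs::write(&path, b"demo")?;
    let mut perms = fs::metadata(&path)?.permissions();
    perms.set_readonly(true);
    fs::set_permissions(&path, perms)?;

    remove_file_force(&path)?;
    assert!(!path.exists());
    println!("ok");
    Ok(())
}
```

This is also why `rm_rf` walks directories itself instead of calling `fs::remove_dir_all`, which offers no hook to adjust permissions between attempts.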

View File

@ -9,10 +9,10 @@
use std::borrow::Cow; use std::borrow::Cow;
use std::env; use std::env;
use std::fs; use std::fs;
use std::io::BufReader;
use std::io::prelude::*; use std::io::prelude::*;
use std::io::BufReader;
use std::path::{Path, PathBuf}; use std::path::{Path, PathBuf};
use std::process::{Command, Stdio, exit}; use std::process::{exit, Command, Stdio};
use std::str; use std::str;
use build_helper::{output, t, up_to_date}; use build_helper::{output, t, up_to_date};
@ -20,14 +20,14 @@ use filetime::FileTime;
use serde::Deserialize; use serde::Deserialize;
use serde_json; use serde_json;
use crate::dist;
use crate::builder::Cargo; use crate::builder::Cargo;
use crate::util::{exe, is_dylib}; use crate::dist;
use crate::{Compiler, Mode, GitRepo};
use crate::native; use crate::native;
use crate::util::{exe, is_dylib};
use crate::{Compiler, GitRepo, Mode};
use crate::cache::{INTERNER, Interned}; use crate::builder::{Builder, Kind, RunConfig, ShouldRun, Step};
use crate::builder::{Step, RunConfig, ShouldRun, Builder, Kind}; use crate::cache::{Interned, INTERNER};
#[derive(Debug, PartialOrd, Ord, Copy, Clone, PartialEq, Eq, Hash)] #[derive(Debug, PartialOrd, Ord, Copy, Clone, PartialEq, Eq, Hash)]
pub struct Std { pub struct Std {
@ -61,11 +61,7 @@ impl Step for Std {
if builder.config.keep_stage.contains(&compiler.stage) { if builder.config.keep_stage.contains(&compiler.stage) {
builder.info("Warning: Using a potentially old libstd. This may not behave well."); builder.info("Warning: Using a potentially old libstd. This may not behave well.");
builder.ensure(StdLink { builder.ensure(StdLink { compiler, target_compiler: compiler, target });
compiler,
target_compiler: compiler,
target,
});
return; return;
} }
@ -73,10 +69,7 @@ impl Step for Std {
let compiler_to_use = builder.compiler_for(compiler.stage, compiler.host, target); let compiler_to_use = builder.compiler_for(compiler.stage, compiler.host, target);
if compiler_to_use != compiler { if compiler_to_use != compiler {
builder.ensure(Std { builder.ensure(Std { compiler: compiler_to_use, target });
compiler: compiler_to_use,
target,
});
builder.info(&format!("Uplifting stage1 std ({} -> {})", compiler_to_use.host, target)); builder.info(&format!("Uplifting stage1 std ({} -> {})", compiler_to_use.host, target));
// Even if we're not building std this stage, the new sysroot must // Even if we're not building std this stage, the new sysroot must
@ -94,16 +87,20 @@ impl Step for Std {
target_deps.extend(copy_third_party_objects(builder, &compiler, target).into_iter()); target_deps.extend(copy_third_party_objects(builder, &compiler, target).into_iter());
let mut cargo = builder.cargo(compiler, Mode::Std, target, "build"); let mut cargo = builder.cargo(compiler, Mode::Std, target, "build");
std_cargo(builder, &compiler, target, &mut cargo); std_cargo(builder, target, &mut cargo);
builder.info(&format!("Building stage{} std artifacts ({} -> {})", compiler.stage, builder.info(&format!(
&compiler.host, target)); "Building stage{} std artifacts ({} -> {})",
run_cargo(builder, compiler.stage, &compiler.host, target
cargo, ));
vec![], run_cargo(
&libstd_stamp(builder, compiler, target), builder,
target_deps, cargo,
false); vec![],
&libstd_stamp(builder, compiler, target),
target_deps,
false,
);
builder.ensure(StdLink { builder.ensure(StdLink {
compiler: builder.compiler(compiler.stage, builder.config.build), compiler: builder.compiler(compiler.stage, builder.config.build),
@ -114,19 +111,18 @@ impl Step for Std {
} }
/// Copies third party objects needed by various targets. /// Copies third party objects needed by various targets.
fn copy_third_party_objects(builder: &Builder<'_>, compiler: &Compiler, target: Interned<String>) fn copy_third_party_objects(
-> Vec<PathBuf> builder: &Builder<'_>,
{ compiler: &Compiler,
target: Interned<String>,
) -> Vec<PathBuf> {
let libdir = builder.sysroot_libdir(*compiler, target); let libdir = builder.sysroot_libdir(*compiler, target);
let mut target_deps = vec![]; let mut target_deps = vec![];
let mut copy_and_stamp = |sourcedir: &Path, name: &str| { let mut copy_and_stamp = |sourcedir: &Path, name: &str| {
let target = libdir.join(name); let target = libdir.join(name);
builder.copy( builder.copy(&sourcedir.join(name), &target);
&sourcedir.join(name),
&target,
);
target_deps.push(target); target_deps.push(target);
}; };
@ -157,15 +153,18 @@ fn copy_third_party_objects(builder: &Builder<'_>, compiler: &Compiler, target:
copy_and_stamp(Path::new(&src), "libunwind.a"); copy_and_stamp(Path::new(&src), "libunwind.a");
} }
if builder.config.sanitizers && compiler.stage != 0 {
// The sanitizers are only copied in stage1 or above,
// to avoid creating dependency on LLVM.
target_deps.extend(copy_sanitizers(builder, &compiler, target));
}
target_deps target_deps
} }
/// Configure cargo to compile the standard library, adding appropriate env vars /// Configure cargo to compile the standard library, adding appropriate env vars
/// and such. /// and such.
pub fn std_cargo(builder: &Builder<'_>, pub fn std_cargo(builder: &Builder<'_>, target: Interned<String>, cargo: &mut Cargo) {
compiler: &Compiler,
target: Interned<String>,
cargo: &mut Cargo) {
if let Some(target) = env::var_os("MACOSX_STD_DEPLOYMENT_TARGET") { if let Some(target) = env::var_os("MACOSX_STD_DEPLOYMENT_TARGET") {
cargo.env("MACOSX_DEPLOYMENT_TARGET", target); cargo.env("MACOSX_DEPLOYMENT_TARGET", target);
} }
@ -208,22 +207,9 @@ pub fn std_cargo(builder: &Builder<'_>,
let mut features = builder.std_features(); let mut features = builder.std_features();
features.push_str(&compiler_builtins_c_feature); features.push_str(&compiler_builtins_c_feature);
if compiler.stage != 0 && builder.config.sanitizers { cargo
// This variable is used by the sanitizer runtime crates, e.g. .arg("--features")
// rustc_lsan, to build the sanitizer runtime from C code .arg(features)
// When this variable is missing, those crates won't compile the C code,
// so we don't set this variable during stage0 where llvm-config is
// missing
// We also only build the runtimes when --enable-sanitizers (or its
// config.toml equivalent) is used
let llvm_config = builder.ensure(native::Llvm {
target: builder.config.build,
});
cargo.env("LLVM_CONFIG", llvm_config);
cargo.env("RUSTC_BUILD_SANITIZERS", "1");
}
cargo.arg("--features").arg(features)
.arg("--manifest-path") .arg("--manifest-path")
.arg(builder.src.join("src/libtest/Cargo.toml")); .arg(builder.src.join("src/libtest/Cargo.toml"));
@ -271,40 +257,50 @@ impl Step for StdLink {
let compiler = self.compiler; let compiler = self.compiler;
let target_compiler = self.target_compiler; let target_compiler = self.target_compiler;
let target = self.target; let target = self.target;
builder.info(&format!("Copying stage{} std from stage{} ({} -> {} / {})", builder.info(&format!(
target_compiler.stage, "Copying stage{} std from stage{} ({} -> {} / {})",
compiler.stage, target_compiler.stage, compiler.stage, &compiler.host, target_compiler.host, target
&compiler.host, ));
target_compiler.host,
target));
let libdir = builder.sysroot_libdir(target_compiler, target); let libdir = builder.sysroot_libdir(target_compiler, target);
let hostdir = builder.sysroot_libdir(target_compiler, compiler.host); let hostdir = builder.sysroot_libdir(target_compiler, compiler.host);
add_to_sysroot(builder, &libdir, &hostdir, &libstd_stamp(builder, compiler, target)); add_to_sysroot(builder, &libdir, &hostdir, &libstd_stamp(builder, compiler, target));
if builder.config.sanitizers && compiler.stage != 0 && target == "x86_64-apple-darwin" {
// The sanitizers are only built in stage1 or above, so the dylibs will
// be missing in stage0 and causes panic. See the `std()` function above
// for reason why the sanitizers are not built in stage0.
copy_apple_sanitizer_dylibs(builder, &builder.native_dir(target), "osx", &libdir);
}
} }
} }
fn copy_apple_sanitizer_dylibs( /// Copies sanitizer runtime libraries into target libdir.
fn copy_sanitizers(
builder: &Builder<'_>, builder: &Builder<'_>,
native_dir: &Path, compiler: &Compiler,
platform: &str, target: Interned<String>,
into: &Path, ) -> Vec<PathBuf> {
) { let runtimes: Vec<native::SanitizerRuntime> = builder.ensure(native::Sanitizers { target });
for &sanitizer in &["asan", "tsan"] {
let filename = format!("lib__rustc__clang_rt.{}_{}_dynamic.dylib", sanitizer, platform); if builder.config.dry_run {
let mut src_path = native_dir.join(sanitizer); return Vec::new();
src_path.push("build");
src_path.push("lib");
src_path.push("darwin");
src_path.push(&filename);
builder.copy(&src_path, &into.join(filename));
} }
let mut target_deps = Vec::new();
let libdir = builder.sysroot_libdir(*compiler, target);
for runtime in &runtimes {
let dst = libdir.join(&runtime.name);
builder.copy(&runtime.path, &dst);
if target == "x86_64-apple-darwin" {
// Update the library install name reflect the fact it has been renamed.
let status = Command::new("install_name_tool")
.arg("-id")
.arg(format!("@rpath/{}", runtime.name))
.arg(&dst)
.status()
.expect("failed to execute `install_name_tool`");
assert!(status.success());
}
target_deps.push(dst);
}
target_deps
} }
#[derive(Debug, Copy, Clone, PartialEq, Eq, Hash)] #[derive(Debug, Copy, Clone, PartialEq, Eq, Hash)]
@ -337,7 +333,7 @@ impl Step for StartupObjects {
let for_compiler = self.compiler; let for_compiler = self.compiler;
let target = self.target; let target = self.target;
if !target.contains("windows-gnu") { if !target.contains("windows-gnu") {
return vec![] return vec![];
} }
let mut target_deps = vec![]; let mut target_deps = vec![];
@ -352,12 +348,17 @@ impl Step for StartupObjects {
let dst_file = &dst_dir.join(file.to_string() + ".o"); let dst_file = &dst_dir.join(file.to_string() + ".o");
if !up_to_date(src_file, dst_file) { if !up_to_date(src_file, dst_file) {
let mut cmd = Command::new(&builder.initial_rustc); let mut cmd = Command::new(&builder.initial_rustc);
builder.run(cmd.env("RUSTC_BOOTSTRAP", "1") builder.run(
.arg("--cfg").arg("bootstrap") cmd.env("RUSTC_BOOTSTRAP", "1")
.arg("--target").arg(target) .arg("--cfg")
.arg("--emit=obj") .arg("bootstrap")
.arg("-o").arg(dst_file) .arg("--target")
.arg(src_file)); .arg(target)
.arg("--emit=obj")
.arg("-o")
.arg(dst_file)
.arg(src_file),
);
} }
let target = sysroot_dir.join(file.to_string() + ".o"); let target = sysroot_dir.join(file.to_string() + ".o");
@@ -366,10 +367,7 @@ impl Step for StartupObjects {
         }

         for obj in ["crt2.o", "dllcrt2.o"].iter() {
-            let src = compiler_file(builder,
-                                    builder.cc(target),
-                                    target,
-                                    obj);
+            let src = compiler_file(builder, builder.cc(target), target, obj);
             let target = sysroot_dir.join(obj);
             builder.copy(&src, &target);
             target_deps.push(target);
@@ -414,22 +412,15 @@ impl Step for Rustc {
         if builder.config.keep_stage.contains(&compiler.stage) {
             builder.info("Warning: Using a potentially old librustc. This may not behave well.");
-            builder.ensure(RustcLink {
-                compiler,
-                target_compiler: compiler,
-                target,
-            });
+            builder.ensure(RustcLink { compiler, target_compiler: compiler, target });
             return;
         }

         let compiler_to_use = builder.compiler_for(compiler.stage, compiler.host, target);
         if compiler_to_use != compiler {
-            builder.ensure(Rustc {
-                compiler: compiler_to_use,
-                target,
-            });
-            builder.info(&format!("Uplifting stage1 rustc ({} -> {})",
-                builder.config.build, target));
+            builder.ensure(Rustc { compiler: compiler_to_use, target });
+            builder
+                .info(&format!("Uplifting stage1 rustc ({} -> {})", builder.config.build, target));
             builder.ensure(RustcLink {
                 compiler: compiler_to_use,
                 target_compiler: compiler,
@@ -447,14 +438,18 @@ impl Step for Rustc {
         let mut cargo = builder.cargo(compiler, Mode::Rustc, target, "build");
         rustc_cargo(builder, &mut cargo, target);

-        builder.info(&format!("Building stage{} compiler artifacts ({} -> {})",
-                 compiler.stage, &compiler.host, target));
-        run_cargo(builder,
-                  cargo,
-                  vec![],
-                  &librustc_stamp(builder, compiler, target),
-                  vec![],
-                  false);
+        builder.info(&format!(
+            "Building stage{} compiler artifacts ({} -> {})",
+            compiler.stage, &compiler.host, target
+        ));
+        run_cargo(
+            builder,
+            cargo,
+            vec![],
+            &librustc_stamp(builder, compiler, target),
+            vec![],
+            false,
+        );

         // We used to build librustc_codegen_llvm as a separate step,
         // which produced a dylib that the compiler would dlopen() at runtime.
@@ -503,19 +498,22 @@ impl Step for Rustc {
 }

 pub fn rustc_cargo(builder: &Builder<'_>, cargo: &mut Cargo, target: Interned<String>) {
-    cargo.arg("--features").arg(builder.rustc_features())
-         .arg("--manifest-path")
-         .arg(builder.src.join("src/rustc/Cargo.toml"));
+    cargo
+        .arg("--features")
+        .arg(builder.rustc_features())
+        .arg("--manifest-path")
+        .arg(builder.src.join("src/rustc/Cargo.toml"));
     rustc_cargo_env(builder, cargo, target);
 }

 pub fn rustc_cargo_env(builder: &Builder<'_>, cargo: &mut Cargo, target: Interned<String>) {
     // Set some configuration variables picked up by build scripts and
     // the compiler alike
-    cargo.env("CFG_RELEASE", builder.rust_release())
-         .env("CFG_RELEASE_CHANNEL", &builder.config.channel)
-         .env("CFG_VERSION", builder.rust_version())
-         .env("CFG_PREFIX", builder.config.prefix.clone().unwrap_or_default());
+    cargo
+        .env("CFG_RELEASE", builder.rust_release())
+        .env("CFG_RELEASE_CHANNEL", &builder.config.channel)
+        .env("CFG_VERSION", builder.rust_version())
+        .env("CFG_PREFIX", builder.config.prefix.clone().unwrap_or_default());

     let libdir_relative = builder.config.libdir_relative().unwrap_or(Path::new("lib"));
     cargo.env("CFG_LIBDIR_RELATIVE", libdir_relative);
@@ -561,14 +559,12 @@ pub fn rustc_cargo_env(builder: &Builder<'_>, cargo: &mut Cargo, target: Interne
     }

     // Building with a static libstdc++ is only supported on linux right now,
     // not for MSVC or macOS
-    if builder.config.llvm_static_stdcpp &&
-       !target.contains("freebsd") &&
-       !target.contains("msvc") &&
-       !target.contains("apple") {
-        let file = compiler_file(builder,
-                                 builder.cxx(target).unwrap(),
-                                 target,
-                                 "libstdc++.a");
+    if builder.config.llvm_static_stdcpp
+        && !target.contains("freebsd")
+        && !target.contains("msvc")
+        && !target.contains("apple")
+    {
+        let file = compiler_file(builder, builder.cxx(target).unwrap(), target, "libstdc++.a");
         cargo.env("LLVM_STATIC_STDCPP", file);
     }
     if builder.config.llvm_link_shared || builder.config.llvm_thin_lto {
@@ -602,17 +598,15 @@ impl Step for RustcLink {
         let compiler = self.compiler;
         let target_compiler = self.target_compiler;
         let target = self.target;
-        builder.info(&format!("Copying stage{} rustc from stage{} ({} -> {} / {})",
-                 target_compiler.stage,
-                 compiler.stage,
-                 &compiler.host,
-                 target_compiler.host,
-                 target));
+        builder.info(&format!(
+            "Copying stage{} rustc from stage{} ({} -> {} / {})",
+            target_compiler.stage, compiler.stage, &compiler.host, target_compiler.host, target
+        ));
         add_to_sysroot(
             builder,
             &builder.sysroot_libdir(target_compiler, target),
             &builder.sysroot_libdir(target_compiler, compiler.host),
-            &librustc_stamp(builder, compiler, target)
+            &librustc_stamp(builder, compiler, target),
         );
     }
 }
@@ -706,8 +700,10 @@ impl Step for Assemble {
         let target_compiler = self.target_compiler;

         if target_compiler.stage == 0 {
-            assert_eq!(builder.config.build, target_compiler.host,
-                       "Cannot obtain compiler for non-native build triple at stage 0");
+            assert_eq!(
+                builder.config.build, target_compiler.host,
+                "Cannot obtain compiler for non-native build triple at stage 0"
+            );
             // The stage 0 compiler for the build triple is always pre-built.
             return target_compiler;
         }
@@ -728,23 +724,17 @@ impl Step for Assemble {
         //
         // FIXME: It may be faster if we build just a stage 1 compiler and then
         // use that to bootstrap this compiler forward.
-        let build_compiler =
-            builder.compiler(target_compiler.stage - 1, builder.config.build);
+        let build_compiler = builder.compiler(target_compiler.stage - 1, builder.config.build);

         // Build the libraries for this compiler to link to (i.e., the libraries
         // it uses at runtime). NOTE: Crates the target compiler compiles don't
         // link to these. (FIXME: Is that correct? It seems to be correct most
         // of the time but I think we do link to these for stage2/bin compilers
         // when not performing a full bootstrap).
-        builder.ensure(Rustc {
-            compiler: build_compiler,
-            target: target_compiler.host,
-        });
+        builder.ensure(Rustc { compiler: build_compiler, target: target_compiler.host });

         let lld_install = if builder.config.lld_enabled {
-            Some(builder.ensure(native::Lld {
-                target: target_compiler.host,
-            }))
+            Some(builder.ensure(native::Lld { target: target_compiler.host }))
         } else {
             None
         };
@@ -786,7 +776,6 @@ impl Step for Assemble {
         let bindir = sysroot.join("bin");
         t!(fs::create_dir_all(&bindir));
         let compiler = builder.rustc(target_compiler);
-        let _ = fs::remove_file(&compiler);
         builder.copy(&rustc, &compiler);

         target_compiler
@@ -801,7 +790,7 @@ pub fn add_to_sysroot(
     builder: &Builder<'_>,
     sysroot_dst: &Path,
     sysroot_host_dst: &Path,
-    stamp: &Path
+    stamp: &Path,
 ) {
     t!(fs::create_dir_all(&sysroot_dst));
     t!(fs::create_dir_all(&sysroot_host_dst));
@@ -814,14 +803,14 @@ pub fn add_to_sysroot(
     }
 }

-pub fn run_cargo(builder: &Builder<'_>,
-                 cargo: Cargo,
-                 tail_args: Vec<String>,
-                 stamp: &Path,
-                 additional_target_deps: Vec<PathBuf>,
-                 is_check: bool)
-    -> Vec<PathBuf>
-{
+pub fn run_cargo(
+    builder: &Builder<'_>,
+    cargo: Cargo,
+    tail_args: Vec<String>,
+    stamp: &Path,
+    additional_target_deps: Vec<PathBuf>,
+    is_check: bool,
+) -> Vec<PathBuf> {
     if builder.config.dry_run {
         return Vec::new();
     }
@@ -831,9 +820,12 @@ pub fn run_cargo(builder: &Builder<'_>,
     // `target_deps_dir` looks like $dir/$target/release/deps
     let target_deps_dir = target_root_dir.join("deps");
     // `host_root_dir` looks like $dir/release
-    let host_root_dir = target_root_dir.parent().unwrap() // chop off `release`
-                                       .parent().unwrap() // chop off `$target`
-                                       .join(target_root_dir.file_name().unwrap());
+    let host_root_dir = target_root_dir
+        .parent()
+        .unwrap() // chop off `release`
+        .parent()
+        .unwrap() // chop off `$target`
+        .join(target_root_dir.file_name().unwrap());

     // Spawn Cargo slurping up its JSON output. We'll start building up the
     // `deps` array of all files it generated along with a `toplevel` array of
@@ -844,20 +836,19 @@ pub fn run_cargo(builder: &Builder<'_>,
         let (filenames, crate_types) = match msg {
             CargoMessage::CompilerArtifact {
                 filenames,
-                target: CargoTarget {
-                    crate_types,
-                },
+                target: CargoTarget { crate_types },
                 ..
             } => (filenames, crate_types),
             _ => return,
         };
         for filename in filenames {
             // Skip files like executables
-            if !filename.ends_with(".rlib") &&
-               !filename.ends_with(".lib") &&
-               !filename.ends_with(".a") &&
-               !is_dylib(&filename) &&
-               !(is_check && filename.ends_with(".rmeta")) {
+            if !filename.ends_with(".rlib")
+                && !filename.ends_with(".lib")
+                && !filename.ends_with(".a")
+                && !is_dylib(&filename)
+                && !(is_check && filename.ends_with(".rmeta"))
+            {
                 continue;
             }
@@ -913,14 +904,13 @@ pub fn run_cargo(builder: &Builder<'_>,
         .collect::<Vec<_>>();
     for (prefix, extension, expected_len) in toplevel {
         let candidates = contents.iter().filter(|&&(_, ref filename, ref meta)| {
-            filename.starts_with(&prefix[..]) &&
-                filename[prefix.len()..].starts_with("-") &&
-                filename.ends_with(&extension[..]) &&
-                meta.len() == expected_len
-        });
-        let max = candidates.max_by_key(|&&(_, _, ref metadata)| {
-            FileTime::from_last_modification_time(metadata)
+            filename.starts_with(&prefix[..])
+                && filename[prefix.len()..].starts_with("-")
+                && filename.ends_with(&extension[..])
+                && meta.len() == expected_len
         });
+        let max = candidates
+            .max_by_key(|&&(_, _, ref metadata)| FileTime::from_last_modification_time(metadata));
         let path_to_add = match max {
             Some(triple) => triple.0.to_str().unwrap(),
             None => panic!("no output generated for {:?} {:?}", prefix, extension),
@@ -960,7 +950,7 @@ pub fn stream_cargo(
     // Instruct Cargo to give us json messages on stdout, critically leaving
     // stderr as piped so we can get those pretty colors.
     let mut message_format = String::from("json-render-diagnostics");
     if let Some(s) = &builder.config.rustc_error_format {
         message_format.push_str(",json-diagnostic-");
         message_format.push_str(s);
     }
@@ -985,17 +975,18 @@ pub fn stream_cargo(
         match serde_json::from_str::<CargoMessage<'_>>(&line) {
             Ok(msg) => cb(msg),
             // If this was informational, just print it out and continue
-            Err(_) => println!("{}", line)
+            Err(_) => println!("{}", line),
         }
     }

     // Make sure Cargo actually succeeded after we read all of its stdout.
     let status = t!(child.wait());
     if !status.success() {
-        eprintln!("command did not execute successfully: {:?}\n\
+        eprintln!(
+            "command did not execute successfully: {:?}\n\
                   expected success, got: {}",
-                 cargo,
-                 status);
+            cargo, status
+        );
     }
     status.success()
 }


@@ -3,19 +3,20 @@
 //! This module implements parsing `config.toml` configuration files to tweak
 //! how the build runs.

+use std::cmp;
 use std::collections::{HashMap, HashSet};
 use std::env;
+use std::ffi::OsString;
 use std::fs;
 use std::path::{Path, PathBuf};
 use std::process;
-use std::cmp;

-use build_helper::t;
-use toml;
-use serde::Deserialize;
-use crate::cache::{INTERNER, Interned};
+use crate::cache::{Interned, INTERNER};
 use crate::flags::Flags;
 pub use crate::flags::Subcommand;
+use build_helper::t;
+use serde::Deserialize;
+use toml;

 /// Global configuration for the entire build and/or bootstrap.
 ///
@@ -66,6 +67,7 @@ pub struct Config {
     pub backtrace_on_ice: bool,

     // llvm codegen options
+    pub llvm_skip_rebuild: bool,
    pub llvm_assertions: bool,
    pub llvm_optimize: bool,
    pub llvm_thin_lto: bool,
@@ -106,6 +108,7 @@ pub struct Config {
     pub rust_dist_src: bool,
     pub rust_codegen_backends: Vec<Interned<String>>,
     pub rust_verify_llvm_ir: bool,
+    pub rust_thin_lto_import_instr_limit: Option<u32>,
     pub rust_remap_debuginfo: bool,

     pub build: Interned<String>,
@@ -149,6 +152,7 @@ pub struct Config {
     // These are either the stage0 downloaded binaries or the locally installed ones.
     pub initial_cargo: PathBuf,
     pub initial_rustc: PathBuf,
+    pub initial_rustfmt: Option<PathBuf>,
     pub out: PathBuf,
 }
@@ -199,6 +203,7 @@ struct Build {
     target: Vec<String>,
     cargo: Option<String>,
     rustc: Option<String>,
+    rustfmt: Option<String>, /* allow bootstrap.py to use rustfmt key */
     docs: Option<bool>,
     compiler_docs: Option<bool>,
     submodules: Option<bool>,
@@ -242,6 +247,7 @@ struct Install {
 #[derive(Deserialize, Default)]
 #[serde(deny_unknown_fields, rename_all = "kebab-case")]
 struct Llvm {
+    skip_rebuild: Option<bool>,
     optimize: Option<bool>,
     thin_lto: Option<bool>,
     release_debuginfo: Option<bool>,
@@ -321,6 +327,7 @@ struct Rust {
     deny_warnings: Option<bool>,
     backtrace_on_ice: Option<bool>,
     verify_llvm_ir: Option<bool>,
+    thin_lto_import_instr_limit: Option<u32>,
     remap_debuginfo: Option<bool>,
     jemalloc: Option<bool>,
     test_compare_mode: Option<bool>,
@@ -348,12 +355,16 @@ struct TomlTarget {
 impl Config {
     fn path_from_python(var_key: &str) -> PathBuf {
         match env::var_os(var_key) {
-            // Do not trust paths from Python and normalize them slightly (#49785).
-            Some(var_val) => Path::new(&var_val).components().collect(),
+            Some(var_val) => Self::normalize_python_path(var_val),
             _ => panic!("expected '{}' to be set", var_key),
         }
     }

+    /// Normalizes paths from Python slightly. We don't trust paths from Python (#49785).
+    fn normalize_python_path(path: OsString) -> PathBuf {
+        Path::new(&path).components().collect()
+    }
+
     pub fn default_opts() -> Config {
         let mut config = Config::default();
         config.llvm_optimize = true;
@@ -380,6 +391,7 @@ impl Config {
         config.initial_rustc = Config::path_from_python("RUSTC");
         config.initial_cargo = Config::path_from_python("CARGO");
+        config.initial_rustfmt = env::var_os("RUSTFMT").map(Config::normalize_python_path);

         config
     }
@@ -413,17 +425,22 @@ impl Config {
         let has_targets = !flags.target.is_empty();
         config.skip_only_host_steps = !has_hosts && has_targets;

-        let toml = file.map(|file| {
-            let contents = t!(fs::read_to_string(&file));
-            match toml::from_str(&contents) {
-                Ok(table) => table,
-                Err(err) => {
-                    println!("failed to parse TOML configuration '{}': {}",
-                        file.display(), err);
-                    process::exit(2);
+        let toml = file
+            .map(|file| {
+                let contents = t!(fs::read_to_string(&file));
+                match toml::from_str(&contents) {
+                    Ok(table) => table,
+                    Err(err) => {
+                        println!(
+                            "failed to parse TOML configuration '{}': {}",
+                            file.display(),
+                            err
+                        );
+                        process::exit(2);
+                    }
                 }
-            }
-        }).unwrap_or_else(|| TomlConfig::default());
+            })
+            .unwrap_or_else(|| TomlConfig::default());

         let build = toml.build.clone().unwrap_or_default();
         // set by bootstrap.py
@@ -434,24 +451,15 @@ impl Config {
                 config.hosts.push(host);
             }
         }
-        for target in config.hosts.iter().cloned()
-            .chain(build.target.iter().map(|s| INTERNER.intern_str(s)))
+        for target in
+            config.hosts.iter().cloned().chain(build.target.iter().map(|s| INTERNER.intern_str(s)))
         {
             if !config.targets.contains(&target) {
                 config.targets.push(target);
             }
         }
-        config.hosts = if !flags.host.is_empty() {
-            flags.host
-        } else {
-            config.hosts
-        };
-        config.targets = if !flags.target.is_empty() {
-            flags.target
-        } else {
-            config.targets
-        };
+        config.hosts = if !flags.host.is_empty() { flags.host } else { config.hosts };
+        config.targets = if !flags.target.is_empty() { flags.target } else { config.targets };

         config.nodejs = build.nodejs.map(PathBuf::from);
         config.gdb = build.gdb.map(PathBuf::from);
@@ -485,6 +493,11 @@ impl Config {
             config.mandir = install.mandir.clone().map(PathBuf::from);
         }

+        // We want the llvm-skip-rebuild flag to take precedence over the
+        // skip-rebuild config.toml option so we store it separately
+        // so that we can infer the right value
+        let mut llvm_skip_rebuild = flags.llvm_skip_rebuild;
+
         // Store off these values as options because if they're not provided
         // we'll infer default values for them later
         let mut llvm_assertions = None;
@@ -500,9 +513,7 @@ impl Config {

         if let Some(ref llvm) = toml.llvm {
             match llvm.ccache {
-                Some(StringOrBool::String(ref s)) => {
-                    config.ccache = Some(s.to_string())
-                }
+                Some(StringOrBool::String(ref s)) => config.ccache = Some(s.to_string()),
                 Some(StringOrBool::Bool(true)) => {
                     config.ccache = Some("ccache".to_string());
                 }
@@ -510,6 +521,7 @@ impl Config {
             }
             set(&mut config.ninja, llvm.ninja);
             llvm_assertions = llvm.assertions;
+            llvm_skip_rebuild = llvm_skip_rebuild.or(llvm.skip_rebuild);
             set(&mut config.llvm_optimize, llvm.optimize);
             set(&mut config.llvm_thin_lto, llvm.thin_lto);
             set(&mut config.llvm_release_debuginfo, llvm.release_debuginfo);
@@ -564,12 +576,12 @@ impl Config {
             set(&mut config.deny_warnings, flags.deny_warnings.or(rust.deny_warnings));
             set(&mut config.backtrace_on_ice, rust.backtrace_on_ice);
             set(&mut config.rust_verify_llvm_ir, rust.verify_llvm_ir);
+            config.rust_thin_lto_import_instr_limit = rust.thin_lto_import_instr_limit;
             set(&mut config.rust_remap_debuginfo, rust.remap_debuginfo);

             if let Some(ref backends) = rust.codegen_backends {
-                config.rust_codegen_backends = backends.iter()
-                    .map(|s| INTERNER.intern_str(s))
-                    .collect();
+                config.rust_codegen_backends =
+                    backends.iter().map(|s| INTERNER.intern_str(s)).collect();
             }

             config.rust_codegen_units = rust.codegen_units.map(threads_from_config);
@@ -617,6 +629,8 @@ impl Config {
         set(&mut config.initial_rustc, build.rustc.map(PathBuf::from));
         set(&mut config.initial_cargo, build.cargo.map(PathBuf::from));

+        config.llvm_skip_rebuild = llvm_skip_rebuild.unwrap_or(false);
+
         let default = false;
         config.llvm_assertions = llvm_assertions.unwrap_or(default);
@@ -627,9 +641,11 @@ impl Config {
         config.rust_debug_assertions = debug_assertions.unwrap_or(default);

         let with_defaults = |debuginfo_level_specific: Option<u32>| {
-            debuginfo_level_specific
-                .or(debuginfo_level)
-                .unwrap_or(if debug == Some(true) { 2 } else { 0 })
+            debuginfo_level_specific.or(debuginfo_level).unwrap_or(if debug == Some(true) {
+                2
+            } else {
+                0
+            })
         };
         config.rust_debuginfo_level_rustc = with_defaults(debuginfo_level_rustc);
         config.rust_debuginfo_level_std = with_defaults(debuginfo_level_std);


@@ -59,13 +59,13 @@ o("full-tools", None, "enable all tools")
 o("lld", "rust.lld", "build lld")
 o("lldb", "rust.lldb", "build lldb")
 o("missing-tools", "dist.missing-tools", "allow failures when building tools")
-o("use-libcxx", "llvm.use_libcxx", "build LLVM with libc++")
+o("use-libcxx", "llvm.use-libcxx", "build LLVM with libc++")
 o("cflags", "llvm.cflags", "build LLVM with these extra compiler flags")
 o("cxxflags", "llvm.cxxflags", "build LLVM with these extra compiler flags")
 o("ldflags", "llvm.ldflags", "build LLVM with these extra linker flags")
-o("llvm-libunwind", "rust.llvm_libunwind", "use LLVM libunwind")
+o("llvm-libunwind", "rust.llvm-libunwind", "use LLVM libunwind")

 # Optimization and debugging options. These may be overridden by the release
 # channel, etc.

File diff suppressed because it is too large


@@ -10,17 +10,17 @@
 use std::collections::HashSet;
 use std::fs;
 use std::io;
-use std::path::{PathBuf, Path};
+use std::path::{Path, PathBuf};

 use crate::Mode;
 use build_helper::{t, up_to_date};

-use crate::util::symlink_dir;
 use crate::builder::{Builder, Compiler, RunConfig, ShouldRun, Step};
-use crate::tool::{self, prepare_tool_cargo, Tool, SourceType};
+use crate::cache::{Interned, INTERNER};
 use crate::compile;
-use crate::cache::{INTERNER, Interned};
 use crate::config::Config;
+use crate::tool::{self, prepare_tool_cargo, SourceType, Tool};
+use crate::util::symlink_dir;

 macro_rules! book {
     ($($name:ident, $path:expr, $book_name:expr;)+) => {
@@ -49,7 +49,7 @@ macro_rules! book {
                 builder.ensure(RustbookSrc {
                     target: self.target,
                     name: INTERNER.intern_str($book_name),
-                    src: doc_src(builder),
+                    src: INTERNER.intern_path(builder.src.join($path)),
                 })
             }
         }
@@ -60,6 +60,7 @@ macro_rules! book {
 // NOTE: When adding a book here, make sure to ALSO build the book by
 // adding a build step in `src/bootstrap/builder.rs`!
 book!(
+    CargoBook, "src/tools/cargo/src/doc", "cargo";
     EditionGuide, "src/doc/edition-guide", "edition-guide";
     EmbeddedBook, "src/doc/embedded-book", "embedded-book";
     Nomicon, "src/doc/nomicon", "nomicon";
@@ -69,10 +70,6 @@ book!(
     RustdocBook, "src/doc/rustdoc", "rustdoc";
 );

-fn doc_src(builder: &Builder<'_>) -> Interned<PathBuf> {
-    INTERNER.intern_path(builder.src.join("src/doc"))
-}
-
 #[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
 pub struct UnstableBook {
     target: Interned<String>,
@@ -88,67 +85,19 @@ impl Step for UnstableBook {
     }

     fn make_run(run: RunConfig<'_>) {
-        run.builder.ensure(UnstableBook {
-            target: run.target,
-        });
+        run.builder.ensure(UnstableBook { target: run.target });
     }

     fn run(self, builder: &Builder<'_>) {
-        builder.ensure(UnstableBookGen {
-            target: self.target,
-        });
+        builder.ensure(UnstableBookGen { target: self.target });
         builder.ensure(RustbookSrc {
             target: self.target,
             name: INTERNER.intern_str("unstable-book"),
-            src: builder.md_doc_out(self.target),
+            src: INTERNER.intern_path(builder.md_doc_out(self.target).join("unstable-book")),
         })
     }
 }

-#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
-pub struct CargoBook {
-    target: Interned<String>,
-    name: Interned<String>,
-}
-
-impl Step for CargoBook {
-    type Output = ();
-    const DEFAULT: bool = true;
-
-    fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
-        let builder = run.builder;
-        run.path("src/tools/cargo/src/doc/book").default_condition(builder.config.docs)
-    }
-
-    fn make_run(run: RunConfig<'_>) {
-        run.builder.ensure(CargoBook {
-            target: run.target,
-            name: INTERNER.intern_str("cargo"),
-        });
-    }
-
-    fn run(self, builder: &Builder<'_>) {
-        let target = self.target;
-        let name = self.name;
-        let src = builder.src.join("src/tools/cargo/src/doc");
-
-        let out = builder.doc_out(target);
-        t!(fs::create_dir_all(&out));
-
-        let out = out.join(name);
-
-        builder.info(&format!("Cargo Book ({}) - {}", target, name));
-
-        let _ = fs::remove_dir_all(&out);
-
-        builder.run(builder.tool_cmd(Tool::Rustbook)
-            .arg("build")
-            .arg(&src)
-            .arg("-d")
-            .arg(out));
-    }
-}
-
 #[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
 struct RustbookSrc {
     target: Interned<String>,
@@ -175,21 +124,16 @@ impl Step for RustbookSrc {
         t!(fs::create_dir_all(&out));

         let out = out.join(name);
-        let src = src.join(name);
         let index = out.join("index.html");
         let rustbook = builder.tool_exe(Tool::Rustbook);
         let mut rustbook_cmd = builder.tool_cmd(Tool::Rustbook);
         if up_to_date(&src, &index) && up_to_date(&rustbook, &index) {
-            return
+            return;
         }
         builder.info(&format!("Rustbook ({}) - {}", target, name));
         let _ = fs::remove_dir_all(&out);
-        builder.run(rustbook_cmd
-            .arg("build")
-            .arg(&src)
-            .arg("-d")
-            .arg(out));
+        builder.run(rustbook_cmd.arg("build").arg(&src).arg("-d").arg(out));
     }
 }
@@ -197,7 +141,6 @@ impl Step for RustbookSrc {
 pub struct TheBook {
     compiler: Compiler,
     target: Interned<String>,
-    name: &'static str,
 }

 impl Step for TheBook {
@@ -213,7 +156,6 @@ impl Step for TheBook {
         run.builder.ensure(TheBook {
             compiler: run.builder.compiler(run.builder.top_stage, run.builder.config.build),
             target: run.target,
-            name: "book",
         });
     }
@@ -221,51 +163,33 @@ impl Step for TheBook {
     ///
     /// We need to build:
     ///
-    /// * Book (first edition)
-    /// * Book (second edition)
+    /// * Book
+    /// * Older edition redirects
     /// * Version info and CSS
     /// * Index page
     /// * Redirect pages
     fn run(self, builder: &Builder<'_>) {
         let compiler = self.compiler;
         let target = self.target;
-        let name = self.name;
-
         // build book
         builder.ensure(RustbookSrc {
             target,
-            name: INTERNER.intern_string(name.to_string()),
-            src: doc_src(builder),
+            name: INTERNER.intern_str("book"),
+            src: INTERNER.intern_path(builder.src.join("src/doc/book")),
         });

         // building older edition redirects
-
-        let source_name = format!("{}/first-edition", name);
-        builder.ensure(RustbookSrc {
-            target,
-            name: INTERNER.intern_string(source_name),
-            src: doc_src(builder),
-        });
-
-        let source_name = format!("{}/second-edition", name);
-        builder.ensure(RustbookSrc {
-            target,
-            name: INTERNER.intern_string(source_name),
-            src: doc_src(builder),
-        });
-
-        let source_name = format!("{}/2018-edition", name);
-        builder.ensure(RustbookSrc {
-            target,
-            name: INTERNER.intern_string(source_name),
-            src: doc_src(builder),
-        });
+        for edition in &["first-edition", "second-edition", "2018-edition"] {
+            builder.ensure(RustbookSrc {
+                target,
+                name: INTERNER.intern_string(format!("book/{}", edition)),
+                src: INTERNER.intern_path(builder.src.join("src/doc/book").join(edition)),
+            });
+        }

         // build the version info page and CSS
-        builder.ensure(Standalone {
-            compiler,
-            target,
-        });
+        builder.ensure(Standalone { compiler, target });

         // build the redirect pages
         builder.info(&format!("Documenting book redirect pages ({})", target));
@ -297,13 +221,20 @@ fn invoke_rustdoc(
let out = out.join("book"); let out = out.join("book");
cmd.arg("--html-after-content").arg(&footer) cmd.arg("--html-after-content")
.arg("--html-before-content").arg(&version_info) .arg(&footer)
.arg("--html-in-header").arg(&header) .arg("--html-before-content")
.arg(&version_info)
.arg("--html-in-header")
.arg(&header)
.arg("--markdown-no-toc") .arg("--markdown-no-toc")
.arg("--markdown-playground-url").arg("https://play.rust-lang.org/") .arg("--markdown-playground-url")
.arg("-o").arg(&out).arg(&path) .arg("https://play.rust-lang.org/")
.arg("--markdown-css").arg("../rust.css"); .arg("-o")
.arg(&out)
.arg(&path)
.arg("--markdown-css")
.arg("../rust.css");
builder.run(&mut cmd); builder.run(&mut cmd);
} }
@ -366,33 +297,39 @@ impl Step for Standalone {
let path = file.path(); let path = file.path();
let filename = path.file_name().unwrap().to_str().unwrap(); let filename = path.file_name().unwrap().to_str().unwrap();
if !filename.ends_with(".md") || filename == "README.md" { if !filename.ends_with(".md") || filename == "README.md" {
continue continue;
} }
let html = out.join(filename).with_extension("html"); let html = out.join(filename).with_extension("html");
let rustdoc = builder.rustdoc(compiler); let rustdoc = builder.rustdoc(compiler);
if up_to_date(&path, &html) && if up_to_date(&path, &html)
up_to_date(&footer, &html) && && up_to_date(&footer, &html)
up_to_date(&favicon, &html) && && up_to_date(&favicon, &html)
up_to_date(&full_toc, &html) && && up_to_date(&full_toc, &html)
(builder.config.dry_run || up_to_date(&version_info, &html)) && && (builder.config.dry_run || up_to_date(&version_info, &html))
(builder.config.dry_run || up_to_date(&rustdoc, &html)) { && (builder.config.dry_run || up_to_date(&rustdoc, &html))
continue {
continue;
} }
let mut cmd = builder.rustdoc_cmd(compiler); let mut cmd = builder.rustdoc_cmd(compiler);
cmd.arg("--html-after-content").arg(&footer) cmd.arg("--html-after-content")
.arg("--html-before-content").arg(&version_info) .arg(&footer)
.arg("--html-in-header").arg(&favicon) .arg("--html-before-content")
.arg("--markdown-no-toc") .arg(&version_info)
.arg("--index-page").arg(&builder.src.join("src/doc/index.md")) .arg("--html-in-header")
.arg("--markdown-playground-url").arg("https://play.rust-lang.org/") .arg(&favicon)
.arg("-o").arg(&out) .arg("--markdown-no-toc")
.arg(&path); .arg("--index-page")
.arg(&builder.src.join("src/doc/index.md"))
.arg("--markdown-playground-url")
.arg("https://play.rust-lang.org/")
.arg("-o")
.arg(&out)
.arg(&path);
if filename == "not_found.md" { if filename == "not_found.md" {
cmd.arg("--markdown-css") cmd.arg("--markdown-css").arg("https://doc.rust-lang.org/rust.css");
.arg("https://doc.rust-lang.org/rust.css");
} else { } else {
cmd.arg("--markdown-css").arg("rust.css"); cmd.arg("--markdown-css").arg("rust.css");
} }
@ -417,10 +354,7 @@ impl Step for Std {
} }
fn make_run(run: RunConfig<'_>) { fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Std { run.builder.ensure(Std { stage: run.builder.top_stage, target: run.target });
stage: run.builder.top_stage,
target: run.target
});
} }
/// Compile all standard library documentation. /// Compile all standard library documentation.
@ -436,8 +370,7 @@ impl Step for Std {
let compiler = builder.compiler(stage, builder.config.build); let compiler = builder.compiler(stage, builder.config.build);
builder.ensure(compile::Std { compiler, target }); builder.ensure(compile::Std { compiler, target });
let out_dir = builder.stage_out(compiler, Mode::Std) let out_dir = builder.stage_out(compiler, Mode::Std).join(target).join("doc");
.join(target).join("doc");
// Here what we're doing is creating a *symlink* (directory junction on // Here what we're doing is creating a *symlink* (directory junction on
// Windows) to the final output location. This is not done as an // Windows) to the final output location. This is not done as an
@ -458,22 +391,25 @@ impl Step for Std {
let run_cargo_rustdoc_for = |package: &str| { let run_cargo_rustdoc_for = |package: &str| {
let mut cargo = builder.cargo(compiler, Mode::Std, target, "rustdoc"); let mut cargo = builder.cargo(compiler, Mode::Std, target, "rustdoc");
compile::std_cargo(builder, &compiler, target, &mut cargo); compile::std_cargo(builder, target, &mut cargo);
// Keep a whitelist so we do not build internal stdlib crates, these will be // Keep a whitelist so we do not build internal stdlib crates, these will be
            // built by the rustc step later if enabled.                                   // built by the rustc step later if enabled.
cargo.arg("-Z").arg("unstable-options") cargo.arg("-Z").arg("unstable-options").arg("-p").arg(package);
.arg("-p").arg(package);
// Create all crate output directories first to make sure rustdoc uses // Create all crate output directories first to make sure rustdoc uses
// relative links. // relative links.
// FIXME: Cargo should probably do this itself. // FIXME: Cargo should probably do this itself.
t!(fs::create_dir_all(out_dir.join(package))); t!(fs::create_dir_all(out_dir.join(package)));
cargo.arg("--") cargo
.arg("--markdown-css").arg("rust.css") .arg("--")
.arg("--markdown-no-toc") .arg("--markdown-css")
.arg("--generate-redirect-pages") .arg("rust.css")
.arg("--resource-suffix").arg(crate::channel::CFG_RELEASE_NUM) .arg("--markdown-no-toc")
.arg("--index-page").arg(&builder.src.join("src/doc/index.md")); .arg("--generate-redirect-pages")
.arg("--resource-suffix")
.arg(crate::channel::CFG_RELEASE_NUM)
.arg("--index-page")
.arg(&builder.src.join("src/doc/index.md"));
builder.run(&mut cargo.into()); builder.run(&mut cargo.into());
}; };
@ -501,10 +437,7 @@ impl Step for Rustc {
} }
fn make_run(run: RunConfig<'_>) { fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Rustc { run.builder.ensure(Rustc { stage: run.builder.top_stage, target: run.target });
stage: run.builder.top_stage,
target: run.target,
});
} }
/// Generates compiler documentation. /// Generates compiler documentation.
@ -540,7 +473,7 @@ impl Step for Rustc {
// Build cargo command. // Build cargo command.
let mut cargo = builder.cargo(compiler, Mode::Rustc, target, "doc"); let mut cargo = builder.cargo(compiler, Mode::Rustc, target, "doc");
cargo.env("RUSTDOCFLAGS", "--document-private-items --passes strip-hidden"); cargo.env("RUSTDOCFLAGS", "--document-private-items");
compile::rustc_cargo(builder, &mut cargo, target); compile::rustc_cargo(builder, &mut cargo, target);
// Only include compiler crates, no dependencies of those, such as `libc`. // Only include compiler crates, no dependencies of those, such as `libc`.
@ -568,7 +501,7 @@ impl Step for Rustc {
fn find_compiler_crates( fn find_compiler_crates(
builder: &Builder<'_>, builder: &Builder<'_>,
name: &Interned<String>, name: &Interned<String>,
crates: &mut HashSet<Interned<String>> crates: &mut HashSet<Interned<String>>,
) { ) {
// Add current crate. // Add current crate.
crates.insert(*name); crates.insert(*name);
@ -597,10 +530,7 @@ impl Step for Rustdoc {
} }
fn make_run(run: RunConfig<'_>) { fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Rustdoc { run.builder.ensure(Rustdoc { stage: run.builder.top_stage, target: run.target });
stage: run.builder.top_stage,
target: run.target,
});
} }
/// Generates compiler documentation. /// Generates compiler documentation.
@ -633,9 +563,7 @@ impl Step for Rustdoc {
builder.ensure(tool::Rustdoc { compiler: compiler }); builder.ensure(tool::Rustdoc { compiler: compiler });
// Symlink compiler docs to the output directory of rustdoc documentation. // Symlink compiler docs to the output directory of rustdoc documentation.
let out_dir = builder.stage_out(compiler, Mode::ToolRustc) let out_dir = builder.stage_out(compiler, Mode::ToolRustc).join(target).join("doc");
.join(target)
.join("doc");
t!(fs::create_dir_all(&out_dir)); t!(fs::create_dir_all(&out_dir));
t!(symlink_dir_force(&builder.config, &out, &out_dir)); t!(symlink_dir_force(&builder.config, &out, &out_dir));
@ -648,7 +576,7 @@ impl Step for Rustdoc {
"doc", "doc",
"src/tools/rustdoc", "src/tools/rustdoc",
SourceType::InTree, SourceType::InTree,
&[] &[],
); );
// Only include compiler crates, no dependencies of those, such as `libc`. // Only include compiler crates, no dependencies of those, such as `libc`.
@ -676,9 +604,7 @@ impl Step for ErrorIndex {
} }
fn make_run(run: RunConfig<'_>) { fn make_run(run: RunConfig<'_>) {
run.builder.ensure(ErrorIndex { run.builder.ensure(ErrorIndex { target: run.target });
target: run.target,
});
} }
/// Generates the HTML rendered error-index by running the /// Generates the HTML rendered error-index by running the
@ -690,10 +616,7 @@ impl Step for ErrorIndex {
let out = builder.doc_out(target); let out = builder.doc_out(target);
t!(fs::create_dir_all(&out)); t!(fs::create_dir_all(&out));
let compiler = builder.compiler(2, builder.config.build); let compiler = builder.compiler(2, builder.config.build);
let mut index = tool::ErrorIndex::command( let mut index = tool::ErrorIndex::command(builder, compiler);
builder,
compiler,
);
index.arg("html"); index.arg("html");
index.arg(out.join("error-index.html")); index.arg(out.join("error-index.html"));
index.arg(crate::channel::CFG_RELEASE_NUM); index.arg(crate::channel::CFG_RELEASE_NUM);
@ -721,9 +644,7 @@ impl Step for UnstableBookGen {
} }
fn make_run(run: RunConfig<'_>) { fn make_run(run: RunConfig<'_>) {
run.builder.ensure(UnstableBookGen { run.builder.ensure(UnstableBookGen { target: run.target });
target: run.target,
});
} }
fn run(self, builder: &Builder<'_>) { fn run(self, builder: &Builder<'_>) {
@ -751,9 +672,7 @@ fn symlink_dir_force(config: &Config, src: &Path, dst: &Path) -> io::Result<()>
} else { } else {
// handle directory junctions on windows by falling back to // handle directory junctions on windows by falling back to
// `remove_dir`. // `remove_dir`.
fs::remove_file(dst).or_else(|_| { fs::remove_file(dst).or_else(|_| fs::remove_dir(dst))?;
fs::remove_dir(dst)
})?;
} }
} }

src/bootstrap/flags.rs

@ -38,6 +38,8 @@ pub struct Flags {
// //
// true => deny, false => warn // true => deny, false => warn
pub deny_warnings: Option<bool>, pub deny_warnings: Option<bool>,
pub llvm_skip_rebuild: Option<bool>,
} }
pub enum Subcommand { pub enum Subcommand {
@ -53,6 +55,9 @@ pub enum Subcommand {
Fix { Fix {
paths: Vec<PathBuf>, paths: Vec<PathBuf>,
}, },
Format {
check: bool,
},
Doc { Doc {
paths: Vec<PathBuf>, paths: Vec<PathBuf>,
}, },
@ -85,23 +90,23 @@ pub enum Subcommand {
impl Default for Subcommand { impl Default for Subcommand {
fn default() -> Subcommand { fn default() -> Subcommand {
Subcommand::Build { Subcommand::Build { paths: vec![PathBuf::from("nowhere")] }
paths: vec![PathBuf::from("nowhere")],
}
} }
} }
impl Flags { impl Flags {
pub fn parse(args: &[String]) -> Flags { pub fn parse(args: &[String]) -> Flags {
let mut extra_help = String::new(); let mut extra_help = String::new();
let mut subcommand_help = String::from("\ let mut subcommand_help = String::from(
"\
Usage: x.py <subcommand> [options] [<paths>...] Usage: x.py <subcommand> [options] [<paths>...]
Subcommands: Subcommands:
build Compile either the compiler or libraries build Compile either the compiler or libraries
check Compile either the compiler or libraries, using cargo check check Compile either the compiler or libraries, using cargo check
clippy Run clippy clippy Run clippy (uses rustup/cargo-installed clippy binary)
fix Run cargo fix fix Run cargo fix
fmt Run rustfmt
test Build and run some test suites test Build and run some test suites
bench Build and run some benchmarks bench Build and run some benchmarks
doc Build documentation doc Build documentation
@ -109,7 +114,7 @@ Subcommands:
dist Build distribution artifacts dist Build distribution artifacts
install Install distribution artifacts install Install distribution artifacts
To learn more about a subcommand, run `./x.py <subcommand> -h`" To learn more about a subcommand, run `./x.py <subcommand> -h`",
); );
let mut opts = Options::new(); let mut opts = Options::new();
@ -123,12 +128,20 @@ To learn more about a subcommand, run `./x.py <subcommand> -h`"
opts.optmulti("", "exclude", "build paths to exclude", "PATH"); opts.optmulti("", "exclude", "build paths to exclude", "PATH");
opts.optopt("", "on-fail", "command to run on failure", "CMD"); opts.optopt("", "on-fail", "command to run on failure", "CMD");
opts.optflag("", "dry-run", "dry run; don't build anything"); opts.optflag("", "dry-run", "dry run; don't build anything");
opts.optopt("", "stage", opts.optopt(
"",
"stage",
"stage to build (indicates compiler to use/test, e.g., stage 0 uses the \ "stage to build (indicates compiler to use/test, e.g., stage 0 uses the \
bootstrap compiler, stage 1 the stage 0 rustc artifacts, etc.)", bootstrap compiler, stage 1 the stage 0 rustc artifacts, etc.)",
"N"); "N",
opts.optmulti("", "keep-stage", "stage(s) to keep without recompiling \ );
(pass multiple times to keep e.g., both stages 0 and 1)", "N"); opts.optmulti(
"",
"keep-stage",
"stage(s) to keep without recompiling \
(pass multiple times to keep e.g., both stages 0 and 1)",
"N",
);
opts.optopt("", "src", "path to the root of the rust checkout", "DIR"); opts.optopt("", "src", "path to the root of the rust checkout", "DIR");
opts.optopt("j", "jobs", "number of jobs to run in parallel", "JOBS"); opts.optopt("j", "jobs", "number of jobs to run in parallel", "JOBS");
opts.optflag("h", "help", "print this help message"); opts.optflag("h", "help", "print this help message");
@ -139,6 +152,14 @@ To learn more about a subcommand, run `./x.py <subcommand> -h`"
"VALUE", "VALUE",
); );
opts.optopt("", "error-format", "rustc error format", "FORMAT"); opts.optopt("", "error-format", "rustc error format", "FORMAT");
opts.optopt(
"",
"llvm-skip-rebuild",
"whether rebuilding llvm should be skipped \
a VALUE of TRUE indicates that llvm will not be rebuilt \
VALUE overrides the skip-rebuild option in config.toml.",
"VALUE",
);
// fn usage() // fn usage()
let usage = let usage =
@ -160,6 +181,7 @@ To learn more about a subcommand, run `./x.py <subcommand> -h`"
|| (s == "check") || (s == "check")
|| (s == "clippy") || (s == "clippy")
|| (s == "fix") || (s == "fix")
|| (s == "fmt")
|| (s == "test") || (s == "test")
|| (s == "bench") || (s == "bench")
|| (s == "doc") || (s == "doc")
@ -192,11 +214,7 @@ To learn more about a subcommand, run `./x.py <subcommand> -h`"
); );
opts.optflag("", "no-doc", "do not run doc tests"); opts.optflag("", "no-doc", "do not run doc tests");
opts.optflag("", "doc", "only run doc tests"); opts.optflag("", "doc", "only run doc tests");
opts.optflag( opts.optflag("", "bless", "update all stderr/stdout files of failing ui tests");
"",
"bless",
"update all stderr/stdout files of failing ui tests",
);
opts.optopt( opts.optopt(
"", "",
"compare-mode", "compare-mode",
@ -207,7 +225,7 @@ To learn more about a subcommand, run `./x.py <subcommand> -h`"
"", "",
"pass", "pass",
"force {check,build,run}-pass tests to this mode.", "force {check,build,run}-pass tests to this mode.",
"check | build | run" "check | build | run",
); );
opts.optflag( opts.optflag(
"", "",
@ -222,6 +240,9 @@ To learn more about a subcommand, run `./x.py <subcommand> -h`"
"clean" => { "clean" => {
opts.optflag("", "all", "clean all build artifacts"); opts.optflag("", "all", "clean all build artifacts");
} }
"fmt" => {
opts.optflag("", "check", "check formatting instead of applying.");
}
_ => {} _ => {}
}; };
@ -323,6 +344,17 @@ Arguments:
./x.py fix src/libcore src/libproc_macro", ./x.py fix src/libcore src/libproc_macro",
); );
} }
"fmt" => {
subcommand_help.push_str(
"\n
Arguments:
This subcommand optionally accepts a `--check` flag which succeeds if formatting is correct and
fails if it is not. For example:
./x.py fmt
./x.py fmt --check",
);
}
"test" => { "test" => {
subcommand_help.push_str( subcommand_help.push_str(
"\n "\n
@ -367,10 +399,7 @@ Arguments:
_ => {} _ => {}
}; };
// Get any optional paths which occur after the subcommand // Get any optional paths which occur after the subcommand
let paths = matches.free[1..] let paths = matches.free[1..].iter().map(|p| p.into()).collect::<Vec<PathBuf>>();
.iter()
.map(|p| p.into())
.collect::<Vec<PathBuf>>();
let cfg_file = matches.opt_str("config").map(PathBuf::from).or_else(|| { let cfg_file = matches.opt_str("config").map(PathBuf::from).or_else(|| {
if fs::metadata("config.toml").is_ok() { if fs::metadata("config.toml").is_ok() {
@ -388,12 +417,10 @@ Arguments:
let maybe_rules_help = Builder::get_help(&build, subcommand.as_str()); let maybe_rules_help = Builder::get_help(&build, subcommand.as_str());
extra_help.push_str(maybe_rules_help.unwrap_or_default().as_str()); extra_help.push_str(maybe_rules_help.unwrap_or_default().as_str());
} else if subcommand.as_str() != "clean" { } else if !(subcommand.as_str() == "clean" || subcommand.as_str() == "fmt") {
extra_help.push_str( extra_help.push_str(
format!( format!("Run `./x.py {} -h -v` to see a list of available paths.", subcommand)
"Run `./x.py {} -h -v` to see a list of available paths.", .as_str(),
subcommand
).as_str(),
); );
} }
@ -424,10 +451,7 @@ Arguments:
DocTests::Yes DocTests::Yes
}, },
}, },
"bench" => Subcommand::Bench { "bench" => Subcommand::Bench { paths, test_args: matches.opt_strs("test-args") },
paths,
test_args: matches.opt_strs("test-args"),
},
"doc" => Subcommand::Doc { paths }, "doc" => Subcommand::Doc { paths },
"clean" => { "clean" => {
if !paths.is_empty() { if !paths.is_empty() {
@ -435,10 +459,9 @@ Arguments:
usage(1, &opts, &subcommand_help, &extra_help); usage(1, &opts, &subcommand_help, &extra_help);
} }
Subcommand::Clean { Subcommand::Clean { all: matches.opt_present("all") }
all: matches.opt_present("all"),
}
} }
"fmt" => Subcommand::Format { check: matches.opt_present("check") },
"dist" => Subcommand::Dist { paths }, "dist" => Subcommand::Dist { paths },
"install" => Subcommand::Install { paths }, "install" => Subcommand::Install { paths },
_ => { _ => {
@ -452,8 +475,10 @@ Arguments:
dry_run: matches.opt_present("dry-run"), dry_run: matches.opt_present("dry-run"),
on_fail: matches.opt_str("on-fail"), on_fail: matches.opt_str("on-fail"),
rustc_error_format: matches.opt_str("error-format"), rustc_error_format: matches.opt_str("error-format"),
keep_stage: matches.opt_strs("keep-stage") keep_stage: matches
.into_iter().map(|j| j.parse().expect("`keep-stage` should be a number")) .opt_strs("keep-stage")
.into_iter()
.map(|j| j.parse().expect("`keep-stage` should be a number"))
.collect(), .collect(),
host: split(&matches.opt_strs("host")) host: split(&matches.opt_strs("host"))
.into_iter() .into_iter()
@ -472,6 +497,9 @@ Arguments:
.map(|p| p.into()) .map(|p| p.into())
.collect::<Vec<_>>(), .collect::<Vec<_>>(),
deny_warnings: parse_deny_warnings(&matches), deny_warnings: parse_deny_warnings(&matches),
llvm_skip_rebuild: matches.opt_str("llvm-skip-rebuild").map(|s| s.to_lowercase()).map(
|s| s.parse::<bool>().expect("`llvm-skip-rebuild` should be either true or false"),
),
} }
} }
} }
@ -480,10 +508,7 @@ impl Subcommand {
pub fn test_args(&self) -> Vec<&str> { pub fn test_args(&self) -> Vec<&str> {
match *self { match *self {
Subcommand::Test { ref test_args, .. } | Subcommand::Bench { ref test_args, .. } => { Subcommand::Test { ref test_args, .. } | Subcommand::Bench { ref test_args, .. } => {
test_args test_args.iter().flat_map(|s| s.split_whitespace()).collect()
.iter()
.flat_map(|s| s.split_whitespace())
.collect()
} }
_ => Vec::new(), _ => Vec::new(),
} }
@ -491,10 +516,9 @@ impl Subcommand {
pub fn rustc_args(&self) -> Vec<&str> { pub fn rustc_args(&self) -> Vec<&str> {
match *self { match *self {
Subcommand::Test { ref rustc_args, .. } => rustc_args Subcommand::Test { ref rustc_args, .. } => {
.iter() rustc_args.iter().flat_map(|s| s.split_whitespace()).collect()
.flat_map(|s| s.split_whitespace()) }
.collect(),
_ => Vec::new(), _ => Vec::new(),
} }
} }
@ -529,28 +553,21 @@ impl Subcommand {
pub fn compare_mode(&self) -> Option<&str> { pub fn compare_mode(&self) -> Option<&str> {
match *self { match *self {
Subcommand::Test { Subcommand::Test { ref compare_mode, .. } => compare_mode.as_ref().map(|s| &s[..]),
ref compare_mode, ..
} => compare_mode.as_ref().map(|s| &s[..]),
_ => None, _ => None,
} }
} }
pub fn pass(&self) -> Option<&str> { pub fn pass(&self) -> Option<&str> {
match *self { match *self {
Subcommand::Test { Subcommand::Test { ref pass, .. } => pass.as_ref().map(|s| &s[..]),
ref pass, ..
} => pass.as_ref().map(|s| &s[..]),
_ => None, _ => None,
} }
} }
} }
fn split(s: &[String]) -> Vec<String> { fn split(s: &[String]) -> Vec<String> {
s.iter() s.iter().flat_map(|s| s.split(',')).map(|s| s.to_string()).collect()
.flat_map(|s| s.split(','))
.map(|s| s.to_string())
.collect()
} }
fn parse_deny_warnings(matches: &getopts::Matches) -> Option<bool> { fn parse_deny_warnings(matches: &getopts::Matches) -> Option<bool> {
@ -558,12 +575,9 @@ fn parse_deny_warnings(matches: &getopts::Matches) -> Option<bool> {
Some("deny") => Some(true), Some("deny") => Some(true),
Some("warn") => Some(false), Some("warn") => Some(false),
Some(value) => { Some(value) => {
eprintln!( eprintln!(r#"invalid value for --warnings: {:?}, expected "warn" or "deny""#, value,);
r#"invalid value for --warnings: {:?}, expected "warn" or "deny""#,
value,
);
process::exit(1); process::exit(1);
}, }
None => None, None => None,
} }
} }

src/bootstrap/format.rs (new file, 75 lines)

@ -0,0 +1,75 @@
//! Runs rustfmt on the repository.
use crate::Build;
use build_helper::t;
use ignore::WalkBuilder;
use std::path::Path;
use std::process::Command;
fn rustfmt(src: &Path, rustfmt: &Path, path: &Path, check: bool) {
let mut cmd = Command::new(&rustfmt);
// avoid the submodule config paths from coming into play,
// we only allow a single global config for the workspace for now
cmd.arg("--config-path").arg(&src.canonicalize().unwrap());
cmd.arg("--edition").arg("2018");
cmd.arg("--unstable-features");
cmd.arg("--skip-children");
if check {
cmd.arg("--check");
}
cmd.arg(&path);
let cmd_debug = format!("{:?}", cmd);
let status = cmd.status().expect("executing rustfmt");
if !status.success() {
eprintln!(
"Running `{}` failed.\nIf you're running `tidy`, \
            try again with the `--bless` flag. Or, if you just want to format \
code, run `./x.py fmt` instead.",
cmd_debug,
);
std::process::exit(1);
}
}
#[derive(serde::Deserialize)]
struct RustfmtConfig {
ignore: Vec<String>,
}
pub fn format(build: &Build, check: bool) {
let mut builder = ignore::types::TypesBuilder::new();
builder.add_defaults();
builder.select("rust");
let matcher = builder.build().unwrap();
let rustfmt_config = build.src.join("rustfmt.toml");
if !rustfmt_config.exists() {
eprintln!("Not running formatting checks; rustfmt.toml does not exist.");
eprintln!("This may happen in distributed tarballs.");
return;
}
let rustfmt_config = t!(std::fs::read_to_string(&rustfmt_config));
let rustfmt_config: RustfmtConfig = t!(toml::from_str(&rustfmt_config));
let mut ignore_fmt = ignore::overrides::OverrideBuilder::new(&build.src);
for ignore in rustfmt_config.ignore {
ignore_fmt.add(&format!("!{}", ignore)).expect(&ignore);
}
let ignore_fmt = ignore_fmt.build().unwrap();
let rustfmt_path = build.config.initial_rustfmt.as_ref().unwrap_or_else(|| {
eprintln!("./x.py fmt is not supported on this channel");
std::process::exit(1);
});
let src = build.src.clone();
let walker = WalkBuilder::new(&build.src).types(matcher).overrides(ignore_fmt).build_parallel();
walker.run(|| {
let src = src.clone();
let rustfmt_path = rustfmt_path.clone();
Box::new(move |entry| {
let entry = t!(entry);
if entry.file_type().map_or(false, |t| t.is_file()) {
rustfmt(&src, &rustfmt_path, &entry.path(), check);
}
ignore::WalkState::Continue
})
});
}

src/bootstrap/job.rs

@ -29,88 +29,22 @@
#![allow(nonstandard_style, dead_code)] #![allow(nonstandard_style, dead_code)]
use crate::Build;
use std::env; use std::env;
use std::io; use std::io;
use std::mem; use std::mem;
use std::ptr; use std::ptr;
use crate::Build;
type HANDLE = *mut u8; use winapi::shared::minwindef::{DWORD, FALSE, LPVOID};
type BOOL = i32; use winapi::um::errhandlingapi::SetErrorMode;
type DWORD = u32; use winapi::um::handleapi::{CloseHandle, DuplicateHandle};
type LPHANDLE = *mut HANDLE; use winapi::um::jobapi2::{AssignProcessToJobObject, CreateJobObjectW, SetInformationJobObject};
type LPVOID = *mut u8; use winapi::um::processthreadsapi::{GetCurrentProcess, OpenProcess};
type JOBOBJECTINFOCLASS = i32; use winapi::um::winbase::{BELOW_NORMAL_PRIORITY_CLASS, SEM_NOGPFAULTERRORBOX};
type SIZE_T = usize; use winapi::um::winnt::{
type LARGE_INTEGER = i64; JobObjectExtendedLimitInformation, DUPLICATE_SAME_ACCESS, JOBOBJECT_EXTENDED_LIMIT_INFORMATION,
type UINT = u32; JOB_OBJECT_LIMIT_KILL_ON_JOB_CLOSE, JOB_OBJECT_LIMIT_PRIORITY_CLASS, PROCESS_DUP_HANDLE,
type ULONG_PTR = usize; };
type ULONGLONG = u64;
const FALSE: BOOL = 0;
const DUPLICATE_SAME_ACCESS: DWORD = 0x2;
const PROCESS_DUP_HANDLE: DWORD = 0x40;
const JobObjectExtendedLimitInformation: JOBOBJECTINFOCLASS = 9;
const JOB_OBJECT_LIMIT_KILL_ON_JOB_CLOSE: DWORD = 0x2000;
const JOB_OBJECT_LIMIT_PRIORITY_CLASS: DWORD = 0x00000020;
const SEM_FAILCRITICALERRORS: UINT = 0x0001;
const SEM_NOGPFAULTERRORBOX: UINT = 0x0002;
const BELOW_NORMAL_PRIORITY_CLASS: DWORD = 0x00004000;
extern "system" {
fn CreateJobObjectW(lpJobAttributes: *mut u8, lpName: *const u8) -> HANDLE;
fn CloseHandle(hObject: HANDLE) -> BOOL;
fn GetCurrentProcess() -> HANDLE;
fn OpenProcess(dwDesiredAccess: DWORD,
bInheritHandle: BOOL,
dwProcessId: DWORD) -> HANDLE;
fn DuplicateHandle(hSourceProcessHandle: HANDLE,
hSourceHandle: HANDLE,
hTargetProcessHandle: HANDLE,
lpTargetHandle: LPHANDLE,
dwDesiredAccess: DWORD,
bInheritHandle: BOOL,
dwOptions: DWORD) -> BOOL;
fn AssignProcessToJobObject(hJob: HANDLE, hProcess: HANDLE) -> BOOL;
fn SetInformationJobObject(hJob: HANDLE,
JobObjectInformationClass: JOBOBJECTINFOCLASS,
lpJobObjectInformation: LPVOID,
cbJobObjectInformationLength: DWORD) -> BOOL;
fn SetErrorMode(mode: UINT) -> UINT;
}
#[repr(C)]
struct JOBOBJECT_EXTENDED_LIMIT_INFORMATION {
BasicLimitInformation: JOBOBJECT_BASIC_LIMIT_INFORMATION,
IoInfo: IO_COUNTERS,
ProcessMemoryLimit: SIZE_T,
JobMemoryLimit: SIZE_T,
PeakProcessMemoryUsed: SIZE_T,
PeakJobMemoryUsed: SIZE_T,
}
#[repr(C)]
struct IO_COUNTERS {
ReadOperationCount: ULONGLONG,
WriteOperationCount: ULONGLONG,
OtherOperationCount: ULONGLONG,
ReadTransferCount: ULONGLONG,
WriteTransferCount: ULONGLONG,
OtherTransferCount: ULONGLONG,
}
#[repr(C)]
struct JOBOBJECT_BASIC_LIMIT_INFORMATION {
PerProcessUserTimeLimit: LARGE_INTEGER,
PerJobUserTimeLimit: LARGE_INTEGER,
LimitFlags: DWORD,
MinimumWorkingsetSize: SIZE_T,
MaximumWorkingsetSize: SIZE_T,
ActiveProcessLimit: DWORD,
Affinity: ULONG_PTR,
PriorityClass: DWORD,
SchedulingClass: DWORD,
}
pub unsafe fn setup(build: &mut Build) { pub unsafe fn setup(build: &mut Build) {
// Enable the Windows Error Reporting dialog which msys disables, // Enable the Windows Error Reporting dialog which msys disables,
@ -132,10 +66,12 @@ pub unsafe fn setup(build: &mut Build) {
info.BasicLimitInformation.LimitFlags |= JOB_OBJECT_LIMIT_PRIORITY_CLASS; info.BasicLimitInformation.LimitFlags |= JOB_OBJECT_LIMIT_PRIORITY_CLASS;
info.BasicLimitInformation.PriorityClass = BELOW_NORMAL_PRIORITY_CLASS; info.BasicLimitInformation.PriorityClass = BELOW_NORMAL_PRIORITY_CLASS;
} }
let r = SetInformationJobObject(job, let r = SetInformationJobObject(
JobObjectExtendedLimitInformation, job,
&mut info as *mut _ as LPVOID, JobObjectExtendedLimitInformation,
mem::size_of_val(&info) as DWORD); &mut info as *mut _ as LPVOID,
mem::size_of_val(&info) as DWORD,
);
assert!(r != 0, "{}", io::Error::last_os_error()); assert!(r != 0, "{}", io::Error::last_os_error());
// Assign our process to this job object. Note that if this fails, one very // Assign our process to this job object. Note that if this fails, one very
@ -150,7 +86,7 @@ pub unsafe fn setup(build: &mut Build) {
let r = AssignProcessToJobObject(job, GetCurrentProcess()); let r = AssignProcessToJobObject(job, GetCurrentProcess());
if r == 0 { if r == 0 {
CloseHandle(job); CloseHandle(job);
return return;
} }
// If we've got a parent process (e.g., the python script that called us) // If we've got a parent process (e.g., the python script that called us)
@ -169,9 +105,15 @@ pub unsafe fn setup(build: &mut Build) {
let parent = OpenProcess(PROCESS_DUP_HANDLE, FALSE, pid.parse().unwrap()); let parent = OpenProcess(PROCESS_DUP_HANDLE, FALSE, pid.parse().unwrap());
assert!(!parent.is_null(), "{}", io::Error::last_os_error()); assert!(!parent.is_null(), "{}", io::Error::last_os_error());
let mut parent_handle = ptr::null_mut(); let mut parent_handle = ptr::null_mut();
let r = DuplicateHandle(GetCurrentProcess(), job, let r = DuplicateHandle(
parent, &mut parent_handle, GetCurrentProcess(),
0, FALSE, DUPLICATE_SAME_ACCESS); job,
parent,
&mut parent_handle,
0,
FALSE,
DUPLICATE_SAME_ACCESS,
);
// If this failed, well at least we tried! An example of DuplicateHandle // If this failed, well at least we tried! An example of DuplicateHandle
// failing in the past has been when the wrong python2 package spawned this // failing in the past has been when the wrong python2 package spawned this

src/bootstrap/lib.rs

@ -106,12 +106,12 @@
#![feature(core_intrinsics)] #![feature(core_intrinsics)]
#![feature(drain_filter)] #![feature(drain_filter)]
use std::cell::{RefCell, Cell}; use std::cell::{Cell, RefCell};
use std::collections::{HashSet, HashMap}; use std::collections::{HashMap, HashSet};
use std::env; use std::env;
use std::fs::{self, OpenOptions, File}; use std::fs::{self, File, OpenOptions};
use std::io::{Seek, SeekFrom, Write, Read}; use std::io::{Read, Seek, SeekFrom, Write};
use std::path::{PathBuf, Path}; use std::path::{Path, PathBuf};
use std::process::{self, Command}; use std::process::{self, Command};
use std::slice; use std::slice;
use std::str; use std::str;
@@ -121,32 +121,31 @@ use std::os::unix::fs::symlink as symlink_file;
 #[cfg(windows)]
 use std::os::windows::fs::symlink_file;
-use build_helper::{
-    mtime, output, run, run_suppressed, t, try_run, try_run_suppressed,
-};
+use build_helper::{mtime, output, run, run_suppressed, t, try_run, try_run_suppressed};
 use filetime::FileTime;
 use crate::util::{exe, libdir, CiEnv};
+mod builder;
+mod cache;
 mod cc_detect;
 mod channel;
 mod check;
-mod test;
 mod clean;
 mod compile;
-mod metadata;
 mod config;
 mod dist;
 mod doc;
 mod flags;
+mod format;
 mod install;
+mod metadata;
 mod native;
 mod sanity;
-pub mod util;
-mod builder;
-mod cache;
+mod test;
 mod tool;
 mod toolstate;
+pub mod util;
 #[cfg(windows)]
 mod job;
@@ -162,13 +161,12 @@ mod job {
 #[cfg(any(target_os = "haiku", target_os = "hermit", not(any(unix, windows))))]
 mod job {
-    pub unsafe fn setup(_build: &mut crate::Build) {
-    }
+    pub unsafe fn setup(_build: &mut crate::Build) {}
 }
+use crate::cache::{Interned, INTERNER};
 pub use crate::config::Config;
 use crate::flags::Subcommand;
-use crate::cache::{Interned, INTERNER};
 const LLVM_TOOLS: &[&str] = &[
     "llvm-nm", // used to inspect binaries; it shows symbol names, their sizes and visibility
@@ -178,7 +176,7 @@ const LLVM_TOOLS: &[&str] = &[
     "llvm-readobj", // used to get information from ELFs/objects that the other tools don't provide
     "llvm-size", // used to prints the size of the linker sections of a program
     "llvm-strip", // used to discard symbols from binary files to reduce their size
-    "llvm-ar" // used for creating and modifying archive files
+    "llvm-ar", // used for creating and modifying archive files
 ];
 /// A structure representing a Rust compiler.
@@ -257,10 +255,8 @@ pub struct Build {
     ci_env: CiEnv,
     delayed_failures: RefCell<Vec<String>>,
     prerelease_version: Cell<Option<u32>>,
-    tool_artifacts: RefCell<HashMap<
-        Interned<String>,
-        HashMap<String, (&'static str, PathBuf, Vec<String>)>
-    >>,
+    tool_artifacts:
+        RefCell<HashMap<Interned<String>, HashMap<String, (&'static str, PathBuf, Vec<String>)>>>,
 }
 #[derive(Debug)]
@@ -273,8 +269,7 @@ struct Crate {
 impl Crate {
     fn is_local(&self, build: &Build) -> bool {
-        self.path.starts_with(&build.config.src) &&
-            !self.path.to_string_lossy().ends_with("_shim")
+        self.path.starts_with(&build.config.src) && !self.path.to_string_lossy().ends_with("_shim")
     }
     fn local_path(&self, build: &Build) -> PathBuf {
@@ -315,7 +310,7 @@ impl Mode {
     pub fn is_tool(&self) -> bool {
         match self {
             Mode::ToolBootstrap | Mode::ToolRustc | Mode::ToolStd => true,
-            _ => false
+            _ => false,
         }
     }
 }
@@ -330,12 +325,10 @@ impl Build {
         let out = config.out.clone();
         let is_sudo = match env::var_os("SUDO_USER") {
-            Some(sudo_user) => {
-                match env::var_os("USER") {
-                    Some(user) => user != sudo_user,
-                    None => false,
-                }
-            }
+            Some(sudo_user) => match env::var_os("USER") {
+                Some(user) => user != sudo_user,
+                None => false,
+            },
             None => false,
         };
@@ -392,11 +385,15 @@
         // If local-rust is the same major.minor as the current version, then force a
         // local-rebuild
-        let local_version_verbose = output(
-            Command::new(&build.initial_rustc).arg("--version").arg("--verbose"));
+        let local_version_verbose =
+            output(Command::new(&build.initial_rustc).arg("--version").arg("--verbose"));
         let local_release = local_version_verbose
-            .lines().filter(|x| x.starts_with("release:"))
-            .next().unwrap().trim_start_matches("release:").trim();
+            .lines()
+            .filter(|x| x.starts_with("release:"))
+            .next()
+            .unwrap()
+            .trim_start_matches("release:")
+            .trim();
         let my_version = channel::CFG_RELEASE_NUM;
         if local_release.split('.').take(2).eq(my_version.split('.').take(2)) {
             build.verbose(&format!("auto-detected local-rebuild {}", local_release));
@@ -410,9 +407,7 @@
     }
     pub fn build_triple(&self) -> &[Interned<String>] {
-        unsafe {
-            slice::from_raw_parts(&self.build, 1)
-        }
+        unsafe { slice::from_raw_parts(&self.build, 1) }
     }
     /// Executes the entire build, as configured by the flags and configuration.
@@ -421,6 +416,10 @@
             job::setup(self);
         }
+        if let Subcommand::Format { check } = self.config.cmd {
+            return format::format(self, check);
+        }
         if let Subcommand::Clean { all } = self.config.cmd {
             return clean::clean(self, all);
         }
@@ -509,7 +508,7 @@
     /// Component directory that Cargo will produce output into (e.g.
     /// release/debug)
     fn cargo_dir(&self) -> &'static str {
-        if self.config.rust_optimize {"release"} else {"debug"}
+        if self.config.rust_optimize { "release" } else { "debug" }
     }
     fn tools_dir(&self, compiler: Compiler) -> PathBuf {
@@ -530,17 +529,13 @@
             Mode::ToolBootstrap => "-bootstrap-tools",
             Mode::ToolStd | Mode::ToolRustc => "-tools",
         };
-        self.out.join(&*compiler.host)
-            .join(format!("stage{}{}", compiler.stage, suffix))
+        self.out.join(&*compiler.host).join(format!("stage{}{}", compiler.stage, suffix))
     }
     /// Returns the root output directory for all Cargo output in a given stage,
     /// running a particular compiler, whether or not we're building the
     /// standard library, and targeting the specified architecture.
-    fn cargo_out(&self,
-                 compiler: Compiler,
-                 mode: Mode,
-                 target: Interned<String>) -> PathBuf {
+    fn cargo_out(&self, compiler: Compiler, mode: Mode, target: Interned<String>) -> PathBuf {
         self.stage_out(compiler, mode).join(&*target).join(self.cargo_dir())
     }
@@ -584,7 +579,7 @@
     fn is_rust_llvm(&self, target: Interned<String>) -> bool {
         match self.config.target_config.get(&target) {
             Some(ref c) => c.llvm_config.is_none(),
-            None => true
+            None => true,
         }
     }
@@ -602,8 +597,8 @@
             // On Fedora the system LLVM installs FileCheck in the
             // llvm subdirectory of the libdir.
             let llvm_libdir = output(Command::new(s).arg("--libdir"));
-            let lib_filecheck = Path::new(llvm_libdir.trim())
-                .join("llvm").join(exe("FileCheck", &*target));
+            let lib_filecheck =
+                Path::new(llvm_libdir.trim()).join("llvm").join(exe("FileCheck", &*target));
             if lib_filecheck.exists() {
                 lib_filecheck
             } else {
@@ -662,14 +657,18 @@
     /// Runs a command, printing out nice contextual information if it fails.
     fn run(&self, cmd: &mut Command) {
-        if self.config.dry_run { return; }
+        if self.config.dry_run {
+            return;
+        }
         self.verbose(&format!("running: {:?}", cmd));
         run(cmd)
     }
     /// Runs a command, printing out nice contextual information if it fails.
     fn run_quiet(&self, cmd: &mut Command) {
-        if self.config.dry_run { return; }
+        if self.config.dry_run {
+            return;
+        }
         self.verbose(&format!("running: {:?}", cmd));
         run_suppressed(cmd)
     }
@@ -678,7 +677,9 @@
     /// Exits if the command failed to execute at all, otherwise returns its
     /// `status.success()`.
     fn try_run(&self, cmd: &mut Command) -> bool {
-        if self.config.dry_run { return true; }
+        if self.config.dry_run {
+            return true;
+        }
         self.verbose(&format!("running: {:?}", cmd));
         try_run(cmd)
     }
@@ -687,7 +688,9 @@
     /// Exits if the command failed to execute at all, otherwise returns its
     /// `status.success()`.
     fn try_run_quiet(&self, cmd: &mut Command) -> bool {
-        if self.config.dry_run { return true; }
+        if self.config.dry_run {
+            return true;
+        }
         self.verbose(&format!("running: {:?}", cmd));
         try_run_suppressed(cmd)
     }
@@ -715,7 +718,9 @@
     }
     fn info(&self, msg: &str) {
-        if self.config.dry_run { return; }
+        if self.config.dry_run {
+            return;
+        }
         println!("{}", msg);
     }
@@ -727,7 +732,7 @@
     fn debuginfo_map(&self, which: GitRepo) -> Option<String> {
         if !self.config.rust_remap_debuginfo {
-            return None
+            return None;
         }
         let path = match which {
@@ -750,10 +755,12 @@
     fn cflags(&self, target: Interned<String>, which: GitRepo) -> Vec<String> {
         // Filter out -O and /O (the optimization flags) that we picked up from
         // cc-rs because the build scripts will determine that for themselves.
-        let mut base = self.cc[&target].args().iter()
-            .map(|s| s.to_string_lossy().into_owned())
-            .filter(|s| !s.starts_with("-O") && !s.starts_with("/O"))
-            .collect::<Vec<String>>();
+        let mut base = self.cc[&target]
+            .args()
+            .iter()
+            .map(|s| s.to_string_lossy().into_owned())
+            .filter(|s| !s.starts_with("-O") && !s.starts_with("/O"))
+            .collect::<Vec<String>>();
         // If we're compiling on macOS then we add a few unconditional flags
         // indicating that we want libc++ (more filled out than libstdc++) and
@@ -771,7 +778,7 @@
         }
         if let Some(map) = self.debuginfo_map(which) {
             let cc = self.cc(target);
             if cc.ends_with("clang") || cc.ends_with("gcc") {
                 base.push(format!("-fdebug-prefix-map={}", map));
             } else if cc.ends_with("clang-cl.exe") {
@@ -796,20 +803,21 @@
     fn cxx(&self, target: Interned<String>) -> Result<&Path, String> {
         match self.cxx.get(&target) {
             Some(p) => Ok(p.path()),
-            None => Err(format!(
-                "target `{}` is not configured as a host, only as a target",
-                target))
+            None => {
+                Err(format!("target `{}` is not configured as a host, only as a target", target))
+            }
         }
     }
     /// Returns the path to the linker for the given target if it needs to be overridden.
     fn linker(&self, target: Interned<String>) -> Option<&Path> {
-        if let Some(linker) = self.config.target_config.get(&target)
-            .and_then(|c| c.linker.as_ref()) {
+        if let Some(linker) = self.config.target_config.get(&target).and_then(|c| c.linker.as_ref())
+        {
             Some(linker)
-        } else if target != self.config.build &&
-                  util::use_host_linker(&target) &&
-                  !target.contains("msvc") {
+        } else if target != self.config.build
+            && util::use_host_linker(&target)
+            && !target.contains("msvc")
+        {
             Some(self.cc(target))
         } else {
             None
@@ -821,14 +829,15 @@
         if target.contains("pc-windows-msvc") {
             Some(true)
         } else {
-            self.config.target_config.get(&target)
-                .and_then(|t| t.crt_static)
+            self.config.target_config.get(&target).and_then(|t| t.crt_static)
         }
     }
     /// Returns the "musl root" for this `target`, if defined
     fn musl_root(&self, target: Interned<String>) -> Option<&Path> {
-        self.config.target_config.get(&target)
+        self.config
+            .target_config
+            .get(&target)
             .and_then(|t| t.musl_root.as_ref())
             .or(self.config.musl_root.as_ref())
             .map(|p| &**p)
     }
@@ -836,22 +845,20 @@
     /// Returns the sysroot for the wasi target, if defined
     fn wasi_root(&self, target: Interned<String>) -> Option<&Path> {
-        self.config.target_config.get(&target)
-            .and_then(|t| t.wasi_root.as_ref())
-            .map(|p| &**p)
+        self.config.target_config.get(&target).and_then(|t| t.wasi_root.as_ref()).map(|p| &**p)
     }
     /// Returns `true` if this is a no-std `target`, if defined
     fn no_std(&self, target: Interned<String>) -> Option<bool> {
-        self.config.target_config.get(&target)
-            .map(|t| t.no_std)
+        self.config.target_config.get(&target).map(|t| t.no_std)
     }
     /// Returns `true` if the target will be tested using the `remote-test-client`
     /// and `remote-test-server` binaries.
     fn remote_tested(&self, target: Interned<String>) -> bool {
-        self.qemu_rootfs(target).is_some() || target.contains("android") ||
-        env::var_os("TEST_DEVICE_ADDR").is_some()
+        self.qemu_rootfs(target).is_some()
+            || target.contains("android")
+            || env::var_os("TEST_DEVICE_ADDR").is_some()
     }
     /// Returns the root of the "rootfs" image that this target will be using,
@@ -860,9 +867,7 @@
     /// If `Some` is returned then that means that tests for this target are
     /// emulated with QEMU and binaries will need to be shipped to the emulator.
     fn qemu_rootfs(&self, target: Interned<String>) -> Option<&Path> {
-        self.config.target_config.get(&target)
-            .and_then(|t| t.qemu_rootfs.as_ref())
-            .map(|p| &**p)
+        self.config.target_config.get(&target).and_then(|t| t.qemu_rootfs.as_ref()).map(|p| &**p)
     }
     /// Path to the python interpreter to use
@@ -894,9 +899,9 @@
     /// When all of these conditions are met the build will lift artifacts from
     /// the previous stage forward.
     fn force_use_stage1(&self, compiler: Compiler, target: Interned<String>) -> bool {
-        !self.config.full_bootstrap &&
-            compiler.stage >= 2 &&
-            (self.hosts.iter().any(|h| *h == target) || target == self.build)
+        !self.config.full_bootstrap
+            && compiler.stage >= 2
+            && (self.hosts.iter().any(|h| *h == target) || target == self.build)
     }
     /// Given `num` in the form "a.b.c" return a "release string" which
@@ -907,11 +912,13 @@
     fn release(&self, num: &str) -> String {
         match &self.config.channel[..] {
             "stable" => num.to_string(),
-            "beta" => if self.rust_info.is_git() {
-                format!("{}-beta.{}", num, self.beta_prerelease_version())
-            } else {
-                format!("{}-beta", num)
-            },
+            "beta" => {
+                if self.rust_info.is_git() {
+                    format!("{}-beta.{}", num, self.beta_prerelease_version())
+                } else {
+                    format!("{}-beta", num)
+                }
+            }
             "nightly" => format!("{}-nightly", num),
             _ => format!("{}-dev", num),
         }
@@ -919,33 +926,21 @@
     fn beta_prerelease_version(&self) -> u32 {
         if let Some(s) = self.prerelease_version.get() {
-            return s
+            return s;
         }
         let beta = output(
-            Command::new("git")
-                .arg("ls-remote")
-                .arg("origin")
-                .arg("beta")
-                .current_dir(&self.src)
+            Command::new("git").arg("ls-remote").arg("origin").arg("beta").current_dir(&self.src),
         );
         let beta = beta.trim().split_whitespace().next().unwrap();
         let master = output(
-            Command::new("git")
-                .arg("ls-remote")
-                .arg("origin")
-                .arg("master")
-                .current_dir(&self.src)
+            Command::new("git").arg("ls-remote").arg("origin").arg("master").current_dir(&self.src),
        );
         let master = master.trim().split_whitespace().next().unwrap();
         // Figure out where the current beta branch started.
         let base = output(
-            Command::new("git")
-                .arg("merge-base")
-                .arg(beta)
-                .arg(master)
-                .current_dir(&self.src),
+            Command::new("git").arg("merge-base").arg(beta).arg(master).current_dir(&self.src),
         );
         let base = base.trim();
@@ -1056,7 +1051,7 @@
         let prefix = "version = \"";
         let suffix = "\"";
         if line.starts_with(prefix) && line.ends_with(suffix) {
-            return line[prefix.len()..line.len() - suffix.len()].to_string()
+            return line[prefix.len()..line.len() - suffix.len()].to_string();
         }
     }
@@ -1101,7 +1096,7 @@
         // run_cargo for more information (in compile.rs).
         for part in contents.split(|b| *b == 0) {
             if part.is_empty() {
-                continue
+                continue;
             }
             let host = part[0] as char == 'h';
             let path = PathBuf::from(t!(str::from_utf8(&part[1..])));
@@ -1112,9 +1107,13 @@
     /// Copies a file from `src` to `dst`
     pub fn copy(&self, src: &Path, dst: &Path) {
-        if self.config.dry_run { return; }
+        if self.config.dry_run {
+            return;
+        }
         self.verbose_than(1, &format!("Copy {:?} to {:?}", src, dst));
-        if src == dst { return; }
+        if src == dst {
+            return;
+        }
         let _ = fs::remove_file(&dst);
         let metadata = t!(src.symlink_metadata());
         if metadata.file_type().is_symlink() {
@@ -1126,8 +1125,7 @@
             // just fall back to a slow `copy` operation.
         } else {
             if let Err(e) = fs::copy(src, dst) {
-                panic!("failed to copy `{}` to `{}`: {}", src.display(),
-                       dst.display(), e)
+                panic!("failed to copy `{}` to `{}`: {}", src.display(), dst.display(), e)
             }
             t!(fs::set_permissions(dst, metadata.permissions()));
             let atime = FileTime::from_last_access_time(&metadata);
@@ -1139,7 +1137,9 @@
     /// Search-and-replaces within a file. (Not maximally efficiently: allocates a
     /// new string for each replacement.)
     pub fn replace_in_file(&self, path: &Path, replacements: &[(&str, &str)]) {
-        if self.config.dry_run { return; }
+        if self.config.dry_run {
+            return;
+        }
         let mut contents = String::new();
         let mut file = t!(OpenOptions::new().read(true).write(true).open(path));
         t!(file.read_to_string(&mut contents));
@@ -1154,7 +1154,9 @@
     /// Copies the `src` directory recursively to `dst`. Both are assumed to exist
     /// when this function is called.
     pub fn cp_r(&self, src: &Path, dst: &Path) {
-        if self.config.dry_run { return; }
+        if self.config.dry_run {
+            return;
+        }
         for f in self.read_dir(src) {
             let path = f.path();
             let name = path.file_name().unwrap();
@@ -1205,7 +1207,9 @@
     }
     fn install(&self, src: &Path, dstdir: &Path, perms: u32) {
-        if self.config.dry_run { return; }
+        if self.config.dry_run {
+            return;
+        }
         let dst = dstdir.join(src.file_name().unwrap());
         self.verbose_than(1, &format!("Install {:?} to {:?}", src, dst));
         t!(fs::create_dir_all(dstdir));
@@ -1216,8 +1220,7 @@
         }
         let metadata = t!(src.symlink_metadata());
         if let Err(e) = fs::copy(&src, &dst) {
-            panic!("failed to copy `{}` to `{}`: {}", src.display(),
-                   dst.display(), e)
+            panic!("failed to copy `{}` to `{}`: {}", src.display(), dst.display(), e)
         }
         t!(fs::set_permissions(&dst, metadata.permissions()));
         let atime = FileTime::from_last_access_time(&metadata);
@@ -1228,26 +1231,34 @@
     }
     fn create(&self, path: &Path, s: &str) {
-        if self.config.dry_run { return; }
+        if self.config.dry_run {
+            return;
+        }
         t!(fs::write(path, s));
     }
     fn read(&self, path: &Path) -> String {
-        if self.config.dry_run { return String::new(); }
+        if self.config.dry_run {
+            return String::new();
+        }
         t!(fs::read_to_string(path))
     }
     fn create_dir(&self, dir: &Path) {
-        if self.config.dry_run { return; }
+        if self.config.dry_run {
+            return;
+        }
         t!(fs::create_dir_all(dir))
     }
     fn remove_dir(&self, dir: &Path) {
-        if self.config.dry_run { return; }
+        if self.config.dry_run {
+            return;
+        }
         t!(fs::remove_dir_all(dir))
     }
-    fn read_dir(&self, dir: &Path) -> impl Iterator<Item=fs::DirEntry> {
+    fn read_dir(&self, dir: &Path) -> impl Iterator<Item = fs::DirEntry> {
         let iter = match fs::read_dir(dir) {
             Ok(v) => v,
             Err(_) if self.config.dry_run => return vec![].into_iter(),
@@ -1257,7 +1268,9 @@
     }
     fn remove(&self, f: &Path) {
-        if self.config.dry_run { return; }
+        if self.config.dry_run {
+            return;
+        }
         fs::remove_file(f).unwrap_or_else(|_| panic!("failed to remove {:?}", f));
     }
 }
@@ -1270,7 +1283,6 @@ fn chmod(path: &Path, perms: u32) {
 #[cfg(windows)]
 fn chmod(_path: &Path, _perms: u32) {}
-
 impl Compiler {
     pub fn with_stage(mut self, stage: u32) -> Compiler {
         self.stage = stage;


@@ -1,14 +1,14 @@
 use std::collections::HashMap;
-use std::process::Command;
-use std::path::PathBuf;
 use std::collections::HashSet;
+use std::path::PathBuf;
+use std::process::Command;
 use build_helper::output;
 use serde::Deserialize;
 use serde_json;
-use crate::{Build, Crate};
 use crate::cache::INTERNER;
+use crate::{Build, Crate};
 #[derive(Deserialize)]
 struct Output {
@@ -71,10 +71,14 @@ fn build_krate(features: &str, build: &mut Build, resolves: &mut Vec<ResolveNode
     // to know what crates to test. Here we run `cargo metadata` to learn about
     // the dependency graph and what `-p` arguments there are.
     let mut cargo = Command::new(&build.initial_cargo);
-    cargo.arg("metadata")
-        .arg("--format-version").arg("1")
-        .arg("--features").arg(features)
-        .arg("--manifest-path").arg(build.src.join(krate).join("Cargo.toml"));
+    cargo
+        .arg("metadata")
+        .arg("--format-version")
+        .arg("1")
+        .arg("--features")
+        .arg(features)
+        .arg("--manifest-path")
+        .arg(build.src.join(krate).join("Cargo.toml"));
     let output = output(&mut cargo);
     let output: Output = serde_json::from_str(&output).unwrap();
     for package in output.packages {
@@ -82,12 +86,7 @@ fn build_krate(features: &str, build: &mut Build, resolves: &mut Vec<ResolveNode
         let name = INTERNER.intern_string(package.name);
         let mut path = PathBuf::from(package.manifest_path);
         path.pop();
-        build.crates.insert(name, Crate {
-            name,
-            id: package.id,
-            deps: HashSet::new(),
-            path,
-        });
+        build.crates.insert(name, Crate { name, id: package.id, deps: HashSet::new(), path });
         }
     }
     resolves.extend(output.resolve.nodes);


@@ -15,15 +15,15 @@ use std::path::{Path, PathBuf};
 use std::process::Command;
 use build_helper::{output, t};
-use cmake;
 use cc;
+use cmake;
-use crate::channel;
-use crate::util::{self, exe};
-use build_helper::up_to_date;
 use crate::builder::{Builder, RunConfig, ShouldRun, Step};
 use crate::cache::Interned;
+use crate::channel;
+use crate::util::{self, exe};
 use crate::GitRepo;
+use build_helper::up_to_date;
 #[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
 pub struct Llvm {
@@ -36,15 +36,11 @@ impl Step for Llvm {
     const ONLY_HOSTS: bool = true;
     fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
-        run.path("src/llvm-project")
-            .path("src/llvm-project/llvm")
-            .path("src/llvm")
+        run.path("src/llvm-project").path("src/llvm-project/llvm").path("src/llvm")
     }
     fn make_run(run: RunConfig<'_>) {
-        run.builder.ensure(Llvm {
-            target: run.target,
-        });
+        run.builder.ensure(Llvm { target: run.target });
     }
     /// Compile LLVM for `target`.
@@ -56,7 +52,7 @@
         if let Some(config) = builder.config.target_config.get(&target) {
             if let Some(ref s) = config.llvm_config {
                 check_llvm_version(builder, s);
-                return s.to_path_buf()
+                return s.to_path_buf();
             }
         }
@@ -69,11 +65,20 @@
         }
         llvm_config_ret_dir.push("bin");
-        let build_llvm_config = llvm_config_ret_dir
-            .join(exe("llvm-config", &*builder.config.build));
+        let build_llvm_config =
+            llvm_config_ret_dir.join(exe("llvm-config", &*builder.config.build));
         let done_stamp = out_dir.join("llvm-finished-building");
         if done_stamp.exists() {
+            if builder.config.llvm_skip_rebuild {
+                builder.info(
+                    "Warning: \
+                    Using a potentially stale build of LLVM; \
+                    This may not behave well.",
+                );
+                return build_llvm_config;
+            }
             if let Some(llvm_commit) = llvm_info.sha() {
                 let done_contents = t!(fs::read(&done_stamp));
@@ -112,8 +117,10 @@
         // defaults!
         let llvm_targets = match &builder.config.llvm_targets {
             Some(s) => s,
-            None => "AArch64;ARM;Hexagon;MSP430;Mips;NVPTX;PowerPC;RISCV;\
-                     Sparc;SystemZ;WebAssembly;X86",
+            None => {
+                "AArch64;ARM;Hexagon;MSP430;Mips;NVPTX;PowerPC;RISCV;\
+                 Sparc;SystemZ;WebAssembly;X86"
+            }
         };
         let llvm_exp_targets = match builder.config.llvm_experimental_targets {
@ -121,31 +128,31 @@ impl Step for Llvm {
None => "", None => "",
}; };
let assertions = if builder.config.llvm_assertions {"ON"} else {"OFF"}; let assertions = if builder.config.llvm_assertions { "ON" } else { "OFF" };
cfg.out_dir(&out_dir) cfg.out_dir(&out_dir)
.profile(profile) .profile(profile)
.define("LLVM_ENABLE_ASSERTIONS", assertions) .define("LLVM_ENABLE_ASSERTIONS", assertions)
.define("LLVM_TARGETS_TO_BUILD", llvm_targets) .define("LLVM_TARGETS_TO_BUILD", llvm_targets)
.define("LLVM_EXPERIMENTAL_TARGETS_TO_BUILD", llvm_exp_targets) .define("LLVM_EXPERIMENTAL_TARGETS_TO_BUILD", llvm_exp_targets)
.define("LLVM_INCLUDE_EXAMPLES", "OFF") .define("LLVM_INCLUDE_EXAMPLES", "OFF")
.define("LLVM_INCLUDE_TESTS", "OFF") .define("LLVM_INCLUDE_TESTS", "OFF")
.define("LLVM_INCLUDE_DOCS", "OFF") .define("LLVM_INCLUDE_DOCS", "OFF")
.define("LLVM_INCLUDE_BENCHMARKS", "OFF") .define("LLVM_INCLUDE_BENCHMARKS", "OFF")
.define("LLVM_ENABLE_ZLIB", "OFF") .define("LLVM_ENABLE_ZLIB", "OFF")
.define("WITH_POLLY", "OFF") .define("WITH_POLLY", "OFF")
.define("LLVM_ENABLE_TERMINFO", "OFF") .define("LLVM_ENABLE_TERMINFO", "OFF")
.define("LLVM_ENABLE_LIBEDIT", "OFF") .define("LLVM_ENABLE_LIBEDIT", "OFF")
.define("LLVM_ENABLE_BINDINGS", "OFF") .define("LLVM_ENABLE_BINDINGS", "OFF")
.define("LLVM_ENABLE_Z3_SOLVER", "OFF") .define("LLVM_ENABLE_Z3_SOLVER", "OFF")
.define("LLVM_PARALLEL_COMPILE_JOBS", builder.jobs().to_string()) .define("LLVM_PARALLEL_COMPILE_JOBS", builder.jobs().to_string())
.define("LLVM_TARGET_ARCH", target.split('-').next().unwrap()) .define("LLVM_TARGET_ARCH", target.split('-').next().unwrap())
.define("LLVM_DEFAULT_TARGET_TRIPLE", target); .define("LLVM_DEFAULT_TARGET_TRIPLE", target);
if builder.config.llvm_thin_lto { if builder.config.llvm_thin_lto {
cfg.define("LLVM_ENABLE_LTO", "Thin"); cfg.define("LLVM_ENABLE_LTO", "Thin");
if !target.contains("apple") { if !target.contains("apple") {
cfg.define("LLVM_ENABLE_LLD", "ON"); cfg.define("LLVM_ENABLE_LLD", "ON");
} }
} }
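The `llvm-finished-building` stamp check above is a simple rebuild-avoidance pattern: an empty marker file written after a successful build short-circuits the next run. A minimal standalone sketch of that pattern (the directory name and `build` closure here are hypothetical, not part of rustbuild):

```rust
use std::fs;
use std::path::Path;

// Skip an expensive step when a "finished" marker from a prior run exists.
// Returns true if the build actually ran.
fn build_if_needed(out_dir: &Path, build: impl FnOnce()) -> bool {
    let done_stamp = out_dir.join("llvm-finished-building");
    if done_stamp.exists() {
        return false; // cached: nothing rebuilt
    }
    build();
    fs::write(&done_stamp, b"").expect("failed to write stamp");
    true
}

fn main() {
    let out_dir = std::env::temp_dir().join("stamp-demo");
    fs::create_dir_all(&out_dir).unwrap();
    let _ = fs::remove_file(out_dir.join("llvm-finished-building"));
    assert!(build_if_needed(&out_dir, || ())); // first run builds
    assert!(!build_if_needed(&out_dir, || ())); // second run is skipped
}
```

Note that rustbuild additionally validates the stamp's contents against the LLVM commit hash, so a submodule update invalidates the cache even though the stamp file exists.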
@@ -212,20 +219,19 @@ impl Step for Llvm {
        // http://llvm.org/docs/HowToCrossCompileLLVM.html
        if target != builder.config.build {
            builder.ensure(Llvm { target: builder.config.build });
            // FIXME: if the llvm root for the build triple is overridden then we
            // should use llvm-tblgen from there, also should verify that it
            // actually exists most of the time in normal installs of LLVM.
            let host = builder.llvm_out(builder.config.build).join("bin/llvm-tblgen");
            cfg.define("CMAKE_CROSSCOMPILING", "True").define("LLVM_TABLEGEN", &host);

            if target.contains("netbsd") {
                cfg.define("CMAKE_SYSTEM_NAME", "NetBSD");
            } else if target.contains("freebsd") {
                cfg.define("CMAKE_SYSTEM_NAME", "FreeBSD");
            } else if target.contains("windows") {
                cfg.define("CMAKE_SYSTEM_NAME", "Windows");
            }

            cfg.define("LLVM_NATIVE_BUILD", builder.llvm_out(builder.config.build).join("build"));
@@ -237,11 +243,8 @@ impl Step for Llvm {
                cfg.define("LLVM_VERSION_SUFFIX", suffix);
            }
        } else {
            let mut default_suffix =
                format!("-rust-{}-{}", channel::CFG_RELEASE_NUM, builder.config.channel,);
            if let Some(sha) = llvm_info.sha_short() {
                default_suffix.push_str("-");
                default_suffix.push_str(sha);
@@ -261,7 +264,7 @@ impl Step for Llvm {
            cfg.define("PYTHON_EXECUTABLE", python);
        }

        configure_cmake(builder, target, &mut cfg, true);

        // FIXME: we don't actually need to build all LLVM tools and all LLVM
        // libraries here, e.g., we just want a few components and a few
@@ -282,7 +285,7 @@ impl Step for Llvm {
fn check_llvm_version(builder: &Builder<'_>, llvm_config: &Path) {
    if !builder.config.llvm_version_check {
        return;
    }

    if builder.config.dry_run {
@@ -291,19 +294,21 @@ fn check_llvm_version(builder: &Builder<'_>, llvm_config: &Path) {
    let mut cmd = Command::new(llvm_config);
    let version = output(cmd.arg("--version"));
    let mut parts = version.split('.').take(2).filter_map(|s| s.parse::<u32>().ok());
    if let (Some(major), Some(_minor)) = (parts.next(), parts.next()) {
        if major >= 7 {
            return;
        }
    }
    panic!("\n\nbad LLVM version: {}, need >=7.0\n\n", version)
}
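The version gate in `check_llvm_version` parses only the leading `major.minor` of the `llvm-config --version` output via `split`/`take`/`filter_map`. A standalone sketch of that parsing pattern (the helper name is ours, not rustbuild's):

```rust
// Extract the leading "major.minor" pair of a dotted version string,
// mirroring the iterator chain used by check_llvm_version above.
fn major_minor(version: &str) -> Option<(u32, u32)> {
    let mut parts = version.split('.').take(2).filter_map(|s| s.parse::<u32>().ok());
    Some((parts.next()?, parts.next()?))
}

fn main() {
    assert_eq!(major_minor("9.0.1"), Some((9, 0)));
    assert_eq!(major_minor("7.0"), Some((7, 0)));
    // Non-numeric components are filtered out, so parsing fails cleanly.
    assert_eq!(major_minor("not-a-version"), None);
}
```

Because `filter_map` silently drops unparsable components, a malformed version string falls through to the `panic!` rather than crashing the parse itself.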
fn configure_cmake(
    builder: &Builder<'_>,
    target: Interned<String>,
    cfg: &mut cmake::Config,
    use_compiler_launcher: bool,
) {
    // Do not print installation messages for up-to-date files.
    // LLVM and LLD builds can produce a lot of those and hit CI limits on log size.
    cfg.define("CMAKE_INSTALL_MESSAGE", "LAZY");
@@ -311,8 +316,7 @@ fn configure_cmake(builder: &Builder<'_>,
    if builder.config.ninja {
        cfg.generator("Ninja");
    }
    cfg.target(&target).host(&builder.config.build);

    let sanitize_cc = |cc: &Path| {
        if target.contains("msvc") {
@@ -326,7 +330,7 @@ fn configure_cmake(builder: &Builder<'_>,
    // vars that we'd otherwise configure. In that case we just skip this
    // entirely.
    if target.contains("msvc") && !builder.config.ninja {
        return;
    }

    let (cc, cxx) = match builder.config.llvm_clang_cl {
@@ -335,56 +339,54 @@ fn configure_cmake(builder: &Builder<'_>,
    };

    // Handle msvc + ninja + ccache specially (this is what the bots use)
    if target.contains("msvc") && builder.config.ninja && builder.config.ccache.is_some() {
        let mut wrap_cc = env::current_exe().expect("failed to get cwd");
        wrap_cc.set_file_name("sccache-plus-cl.exe");
        cfg.define("CMAKE_C_COMPILER", sanitize_cc(&wrap_cc))
            .define("CMAKE_CXX_COMPILER", sanitize_cc(&wrap_cc));
        cfg.env("SCCACHE_PATH", builder.config.ccache.as_ref().unwrap())
            .env("SCCACHE_TARGET", target)
            .env("SCCACHE_CC", &cc)
            .env("SCCACHE_CXX", &cxx);

        // Building LLVM on MSVC can be a little ludicrous at times. We're so far
        // off the beaten path here that I'm not really sure this is even half
        // supported any more. Here we're trying to:
        //
        // * Build LLVM on MSVC
        // * Build LLVM with `clang-cl` instead of `cl.exe`
        // * Build a project with `sccache`
        // * Build for 32-bit as well
        // * Build with Ninja
        //
        // For `cl.exe` there are different binaries to compile 32/64 bit which
        // we use but for `clang-cl` there's only one which internally
        // multiplexes via flags. As a result it appears that CMake's detection
        // of a compiler's architecture and such on MSVC **doesn't** pass any
        // custom flags we pass in CMAKE_CXX_FLAGS below. This means that if we
        // use `clang-cl.exe` it's always diagnosed as a 64-bit compiler which
        // definitely causes problems since all the env vars are pointing to
        // 32-bit libraries.
        //
        // To hack around this... again... we pass an argument that's
        // unconditionally passed in the sccache shim. This'll get CMake to
        // correctly diagnose it's doing a 32-bit compilation and LLVM will
        // internally configure itself appropriately.
        if builder.config.llvm_clang_cl.is_some() && target.contains("i686") {
            cfg.env("SCCACHE_EXTRA_ARGS", "-m32");
        }
    } else {
        // If ccache is configured we inform the build a little differently how
        // to invoke ccache while also invoking our compilers.
        if use_compiler_launcher {
            if let Some(ref ccache) = builder.config.ccache {
                cfg.define("CMAKE_C_COMPILER_LAUNCHER", ccache)
                    .define("CMAKE_CXX_COMPILER_LAUNCHER", ccache);
            }
        }
        cfg.define("CMAKE_C_COMPILER", sanitize_cc(cc))
            .define("CMAKE_CXX_COMPILER", sanitize_cc(cxx));
    }

    cfg.build_arg("-j").build_arg(builder.jobs().to_string());
@@ -394,10 +396,7 @@ fn configure_cmake(builder: &Builder<'_>,
    }
    cfg.define("CMAKE_C_FLAGS", cflags);
    let mut cxxflags = builder.cflags(target, GitRepo::Llvm).join(" ");
    if builder.config.llvm_static_stdcpp && !target.contains("msvc") && !target.contains("netbsd") {
        cxxflags.push_str(" -static-libstdc++");
    }
    if let Some(ref s) = builder.config.llvm_cxxflags {
@@ -455,14 +454,12 @@ impl Step for Lld {
        }
        let target = self.target;

        let llvm_config = builder.ensure(Llvm { target: self.target });

        let out_dir = builder.lld_out(target);
        let done_stamp = out_dir.join("lld-finished-building");
        if done_stamp.exists() {
            return out_dir;
        }

        builder.info(&format!("Building LLD for {}", target));
@@ -470,7 +467,7 @@ impl Step for Lld {
        t!(fs::create_dir_all(&out_dir));

        let mut cfg = cmake::Config::new(builder.src.join("src/llvm-project/lld"));
        configure_cmake(builder, target, &mut cfg, true);

        // This is an awful, awful hack. Discovered when we migrated to using
        // clang-cl to compile LLVM/LLD it turns out that LLD, when built out of
@@ -486,14 +483,12 @@ impl Step for Lld {
        // ensure we don't hit the same bugs with escaping. It means that you
        // can't build on a system where your paths require `\` on Windows, but
        // there's probably a lot of reasons you can't do that other than this.
        let llvm_config_shim = env::current_exe().unwrap().with_file_name("llvm-config-wrapper");
        cfg.out_dir(&out_dir)
            .profile("Release")
            .env("LLVM_CONFIG_REAL", llvm_config)
            .define("LLVM_CONFIG_PATH", llvm_config_shim)
            .define("LLVM_INCLUDE_TESTS", "OFF");
        cfg.build();
@@ -528,7 +523,7 @@ impl Step for TestHelpers {
        let dst = builder.test_helpers_out(target);
        let src = builder.src.join("src/test/auxiliary/rust_test_helpers.c");
        if up_to_date(&src, &dst.join("librust_test_helpers.a")) {
            return;
        }

        builder.info("Building test helpers");
@@ -550,13 +545,149 @@ impl Step for TestHelpers {
        }
        cfg.cargo_metadata(false)
            .out_dir(&dst)
            .target(&target)
            .host(&builder.config.build)
            .opt_level(0)
            .warnings(false)
            .debug(false)
            .file(builder.src.join("src/test/auxiliary/rust_test_helpers.c"))
            .compile("rust_test_helpers");
    }
}
#[derive(Debug, Copy, Clone, PartialEq, Eq, Hash)]
pub struct Sanitizers {
pub target: Interned<String>,
}
impl Step for Sanitizers {
type Output = Vec<SanitizerRuntime>;
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("src/llvm-project/compiler-rt").path("src/sanitizers")
}
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Sanitizers { target: run.target });
}
/// Builds sanitizer runtime libraries.
fn run(self, builder: &Builder<'_>) -> Self::Output {
let compiler_rt_dir = builder.src.join("src/llvm-project/compiler-rt");
if !compiler_rt_dir.exists() {
return Vec::new();
}
let out_dir = builder.native_dir(self.target).join("sanitizers");
let runtimes = supported_sanitizers(&out_dir, self.target);
if runtimes.is_empty() {
return runtimes;
}
let llvm_config = builder.ensure(Llvm { target: builder.config.build });
if builder.config.dry_run {
return runtimes;
}
let done_stamp = out_dir.join("sanitizers-finished-building");
if done_stamp.exists() {
builder.info(&format!(
"Assuming that sanitizers rebuild is not necessary. \
To force a rebuild, remove the file `{}`",
done_stamp.display()
));
return runtimes;
}
builder.info(&format!("Building sanitizers for {}", self.target));
let _time = util::timeit(&builder);
let mut cfg = cmake::Config::new(&compiler_rt_dir);
cfg.profile("Release");
cfg.define("CMAKE_C_COMPILER_TARGET", self.target);
cfg.define("COMPILER_RT_BUILD_BUILTINS", "OFF");
cfg.define("COMPILER_RT_BUILD_CRT", "OFF");
cfg.define("COMPILER_RT_BUILD_LIBFUZZER", "OFF");
cfg.define("COMPILER_RT_BUILD_PROFILE", "OFF");
cfg.define("COMPILER_RT_BUILD_SANITIZERS", "ON");
cfg.define("COMPILER_RT_BUILD_XRAY", "OFF");
cfg.define("COMPILER_RT_DEFAULT_TARGET_ONLY", "ON");
cfg.define("COMPILER_RT_USE_LIBCXX", "OFF");
cfg.define("LLVM_CONFIG_PATH", &llvm_config);
        // On Darwin targets the sanitizer runtimes are built as universal binaries.
        // Unfortunately sccache currently lacks support for building them successfully.
        // Disable the compiler launcher on Darwin targets to avoid potential issues.
let use_compiler_launcher = !self.target.contains("apple-darwin");
configure_cmake(builder, self.target, &mut cfg, use_compiler_launcher);
t!(fs::create_dir_all(&out_dir));
cfg.out_dir(out_dir);
for runtime in &runtimes {
cfg.build_target(&runtime.cmake_target);
cfg.build();
}
t!(fs::write(&done_stamp, b""));
runtimes
}
}
#[derive(Clone, Debug)]
pub struct SanitizerRuntime {
/// CMake target used to build the runtime.
pub cmake_target: String,
/// Path to the built runtime library.
pub path: PathBuf,
    /// Library filename that will be used by rustc.
pub name: String,
}
/// Returns sanitizers available on a given target.
fn supported_sanitizers(out_dir: &Path, target: Interned<String>) -> Vec<SanitizerRuntime> {
let mut result = Vec::new();
match &*target {
"x86_64-apple-darwin" => {
for s in &["asan", "lsan", "tsan"] {
result.push(SanitizerRuntime {
cmake_target: format!("clang_rt.{}_osx_dynamic", s),
path: out_dir
.join(&format!("build/lib/darwin/libclang_rt.{}_osx_dynamic.dylib", s)),
name: format!("librustc_rt.{}.dylib", s),
});
}
}
"x86_64-unknown-linux-gnu" => {
for s in &["asan", "lsan", "msan", "tsan"] {
result.push(SanitizerRuntime {
cmake_target: format!("clang_rt.{}-x86_64", s),
path: out_dir.join(&format!("build/lib/linux/libclang_rt.{}-x86_64.a", s)),
name: format!("librustc_rt.{}.a", s),
});
}
}
"x86_64-fuchsia" => {
for s in &["asan"] {
result.push(SanitizerRuntime {
cmake_target: format!("clang_rt.{}-x86_64", s),
path: out_dir.join(&format!("build/lib/fuchsia/libclang_rt.{}-x86_64.a", s)),
name: format!("librustc_rt.{}.a", s),
});
}
}
"aarch64-fuchsia" => {
for s in &["asan"] {
result.push(SanitizerRuntime {
cmake_target: format!("clang_rt.{}-aarch64", s),
path: out_dir.join(&format!("build/lib/fuchsia/libclang_rt.{}-aarch64.a", s)),
name: format!("librustc_rt.{}.a", s),
});
}
}
_ => {}
}
result
}
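`supported_sanitizers` maps each (target, sanitizer) pair to both the compiler-rt CMake target name and the `librustc_rt.*` filename rustc will link. A standalone sketch of just the filename mapping (the helper name is ours, for illustration):

```rust
// Mirror the per-platform naming scheme used by supported_sanitizers above:
// Darwin runtimes are dynamic libraries, everything else is a static archive.
fn runtime_name(target: &str, sanitizer: &str) -> String {
    if target.contains("apple-darwin") {
        format!("librustc_rt.{}.dylib", sanitizer)
    } else {
        format!("librustc_rt.{}.a", sanitizer)
    }
}

fn main() {
    assert_eq!(runtime_name("x86_64-apple-darwin", "asan"), "librustc_rt.asan.dylib");
    assert_eq!(runtime_name("x86_64-unknown-linux-gnu", "msan"), "librustc_rt.msan.a");
    assert_eq!(runtime_name("aarch64-fuchsia", "asan"), "librustc_rt.asan.a");
}
```

The CMake target side of the mapping additionally encodes the architecture (`clang_rt.asan-x86_64`) or, on Darwin, the `_osx_dynamic` suffix, which is why the real function matches on the full target triple.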


@@ -10,7 +10,7 @@
use std::collections::HashMap;
use std::env;
use std::ffi::{OsStr, OsString};
use std::fs;
use std::path::PathBuf;
use std::process::Command;
@@ -26,30 +26,31 @@ struct Finder {
impl Finder {
    fn new() -> Self {
        Self { cache: HashMap::new(), path: env::var_os("PATH").unwrap_or_default() }
    }

    fn maybe_have<S: AsRef<OsStr>>(&mut self, cmd: S) -> Option<PathBuf> {
        let cmd: OsString = cmd.as_ref().into();
        let path = &self.path;
        self.cache
            .entry(cmd.clone())
            .or_insert_with(|| {
                for path in env::split_paths(path) {
                    let target = path.join(&cmd);
                    let mut cmd_exe = cmd.clone();
                    cmd_exe.push(".exe");
                    if target.is_file() // some/path/git
                        || path.join(&cmd_exe).exists() // some/path/git.exe
                        || target.join(&cmd_exe).exists()
                    // some/path/git/git.exe
                    {
                        return Some(target);
                    }
                }
                None
            })
            .clone()
    }
    fn must_have<S: AsRef<OsStr>>(&mut self, cmd: S) -> PathBuf {
@@ -77,11 +78,17 @@ pub fn check(build: &mut Build) {
    }

    // We need cmake, but only if we're actually building LLVM or sanitizers.
    let building_llvm = build
        .hosts
        .iter()
        .map(|host| {
            build
                .config
                .target_config
                .get(host)
                .map(|config| config.llvm_config.is_none())
                .unwrap_or(true)
        })
        .any(|build_llvm_ourselves| build_llvm_ourselves);
    if building_llvm || build.config.sanitizers {
        cmd_finder.must_have("cmake");
@@ -119,17 +126,29 @@ pub fn check(build: &mut Build) {
        }
    }

    build.config.python = build
        .config
        .python
        .take()
        .map(|p| cmd_finder.must_have(p))
        .or_else(|| cmd_finder.maybe_have("python2.7"))
        .or_else(|| cmd_finder.maybe_have("python2"))
        .or_else(|| env::var_os("BOOTSTRAP_PYTHON").map(PathBuf::from)) // set by bootstrap.py
        .or_else(|| Some(cmd_finder.must_have("python")));

    build.config.nodejs = build
        .config
        .nodejs
        .take()
        .map(|p| cmd_finder.must_have(p))
        .or_else(|| cmd_finder.maybe_have("node"))
        .or_else(|| cmd_finder.maybe_have("nodejs"));

    build.config.gdb = build
        .config
        .gdb
        .take()
        .map(|p| cmd_finder.must_have(p))
        .or_else(|| cmd_finder.maybe_have("gdb"));

    // We're gonna build some custom C code here and there, host triples
@@ -169,15 +188,13 @@ pub fn check(build: &mut Build) {
    for target in &build.targets {
        // Can't compile for iOS unless we're on macOS
        if target.contains("apple-ios") && !build.build.contains("apple-darwin") {
            panic!("the iOS target is only supported on macOS");
        }

        if target.contains("-none-") || target.contains("nvptx") {
            if build.no_std(*target).is_none() {
                let target = build.config.target_config.entry(target.clone()).or_default();
                target.no_std = true;
            }
@@ -192,22 +209,20 @@ pub fn check(build: &mut Build) {
            // If this is a native target (host is also musl) and no musl-root is given,
            // fall back to the system toolchain in /usr before giving up
            if build.musl_root(*target).is_none() && build.config.build == *target {
                let target = build.config.target_config.entry(target.clone()).or_default();
                target.musl_root = Some("/usr".into());
            }
            match build.musl_root(*target) {
                Some(root) => {
                    if fs::metadata(root.join("lib/libc.a")).is_err() {
                        panic!("couldn't find libc.a in musl dir: {}", root.join("lib").display());
                    }
                }
                None => panic!(
                    "when targeting MUSL either the rust.musl-root \
                     option or the target.$TARGET.musl-root option must \
                     be specified in config.toml"
                ),
            }
        }
@@ -217,7 +232,8 @@ pub fn check(build: &mut Build) {
            // Studio, so detect that here and error.
            let out = output(Command::new("cmake").arg("--help"));
            if !out.contains("Visual Studio") {
                panic!(
                    "
cmake does not support Visual Studio generators.

This is likely due to it being an msys/cygwin build of cmake,
@@ -228,7 +244,8 @@ If you are building under msys2 try installing the mingw-w64-x86_64-cmake
package instead of cmake:

$ pacman -R cmake && pacman -S mingw-w64-x86_64-cmake
"
                );
            }
        }
    }
@@ -240,8 +257,10 @@ $ pacman -R cmake && pacman -S mingw-w64-x86_64-cmake
    if build.config.channel == "stable" {
        let stage0 = t!(fs::read_to_string(build.src.join("src/stage0.txt")));
        if stage0.contains("\ndev:") {
            panic!(
                "bootstrapping from a dev compiler in a stable release, but \
                 should only be bootstrapping from a released compiler!"
            );
        }
    }
}


@@ -19,11 +19,11 @@ use crate::compile;
use crate::dist;
use crate::flags::Subcommand;
use crate::native;
use crate::tool::{self, SourceType, Tool};
use crate::toolstate::ToolState;
use crate::util::{self, dylib_path, dylib_path_var};
use crate::Crate as CargoCrate;
use crate::{envify, DocTests, GitRepo, Mode};

const ADB_TEST_DIR: &str = "/data/tmp/work";
@@ -115,16 +115,13 @@ impl Step for Linkcheck {
        let _time = util::timeit(&builder);
        try_run(
            builder,
            builder.tool_cmd(Tool::Linkchecker).arg(builder.out.join(host).join("doc")),
        );
    }

    fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
        let builder = run.builder;
        run.path("src/tools/linkchecker").default_condition(builder.config.docs)
    }

    fn make_run(run: RunConfig<'_>) {
@@ -147,10 +144,7 @@ impl Step for Cargotest {
    }

    fn make_run(run: RunConfig<'_>) {
        run.builder.ensure(Cargotest { stage: run.builder.top_stage, host: run.target });
    }

    /// Runs the `cargotest` tool as compiled in `stage` by the `host` compiler.
@@ -159,10 +153,7 @@ impl Step for Cargotest {
    /// test` to ensure that we don't regress the test suites there.
    fn run(self, builder: &Builder<'_>) {
        let compiler = builder.compiler(self.stage, self.host);
        builder.ensure(compile::Rustc { compiler, target: compiler.host });

        // Note that this is a short, cryptic, and not scoped directory name. This
        // is currently to minimize the length of path on Windows where we otherwise
@@ -197,28 +188,24 @@ impl Step for Cargo {
    }

    fn make_run(run: RunConfig<'_>) {
        run.builder.ensure(Cargo { stage: run.builder.top_stage, host: run.target });
    }

    /// Runs `cargo test` for `cargo` packaged with Rust.
    fn run(self, builder: &Builder<'_>) {
        let compiler = builder.compiler(self.stage, self.host);

        builder.ensure(tool::Cargo { compiler, target: self.host });
        let mut cargo = tool::prepare_tool_cargo(
            builder,
            compiler,
            Mode::ToolRustc,
            self.host,
            "test",
            "src/tools/cargo",
            SourceType::Submodule,
            &[],
        );

        if !builder.fail_fast {
            cargo.arg("--no-fail-fast");
@ -254,10 +241,7 @@ impl Step for Rls {
} }
fn make_run(run: RunConfig<'_>) { fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Rls { run.builder.ensure(Rls { stage: run.builder.top_stage, host: run.target });
-            stage: run.builder.top_stage,
-            host: run.target,
-        });
     }

     /// Runs `cargo test` for the rls.
@@ -266,28 +250,26 @@ impl Step for Rls
         let host = self.host;
         let compiler = builder.compiler(stage, host);

-        let build_result = builder.ensure(tool::Rls {
-            compiler,
-            target: self.host,
-            extra_features: Vec::new(),
-        });
+        let build_result =
+            builder.ensure(tool::Rls { compiler, target: self.host, extra_features: Vec::new() });
         if build_result.is_none() {
             eprintln!("failed to test rls: could not build");
             return;
         }

-        let mut cargo = tool::prepare_tool_cargo(builder,
-                                                 compiler,
-                                                 Mode::ToolRustc,
-                                                 host,
-                                                 "test",
-                                                 "src/tools/rls",
-                                                 SourceType::Submodule,
-                                                 &[]);
+        let mut cargo = tool::prepare_tool_cargo(
+            builder,
+            compiler,
+            Mode::ToolRustc,
+            host,
+            "test",
+            "src/tools/rls",
+            SourceType::Submodule,
+            &[],
+        );

         builder.add_rustc_lib_path(compiler, &mut cargo);
-        cargo.arg("--")
-            .args(builder.config.cmd.test_args());
+        cargo.arg("--").args(builder.config.cmd.test_args());

         if try_run(builder, &mut cargo.into()) {
             builder.save_toolstate("rls", ToolState::TestPass);
@@ -310,10 +292,7 @@ impl Step for Rustfmt
     }

     fn make_run(run: RunConfig<'_>) {
-        run.builder.ensure(Rustfmt {
-            stage: run.builder.top_stage,
-            host: run.target,
-        });
+        run.builder.ensure(Rustfmt { stage: run.builder.top_stage, host: run.target });
     }

     /// Runs `cargo test` for rustfmt.
@@ -332,14 +311,16 @@ impl Step for Rustfmt
             return;
         }

-        let mut cargo = tool::prepare_tool_cargo(builder,
-                                                 compiler,
-                                                 Mode::ToolRustc,
-                                                 host,
-                                                 "test",
-                                                 "src/tools/rustfmt",
-                                                 SourceType::Submodule,
-                                                 &[]);
+        let mut cargo = tool::prepare_tool_cargo(
+            builder,
+            compiler,
+            Mode::ToolRustc,
+            host,
+            "test",
+            "src/tools/rustfmt",
+            SourceType::Submodule,
+            &[],
+        );

         let dir = testdir(builder, compiler.host);
         t!(fs::create_dir_all(&dir));
@@ -368,10 +349,7 @@ impl Step for Miri
     }

     fn make_run(run: RunConfig<'_>) {
-        run.builder.ensure(Miri {
-            stage: run.builder.top_stage,
-            host: run.target,
-        });
+        run.builder.ensure(Miri { stage: run.builder.top_stage, host: run.target });
     }

     /// Runs `cargo test` for miri.
@@ -380,11 +358,8 @@ impl Step for Miri
         let host = self.host;
         let compiler = builder.compiler(stage, host);

-        let miri = builder.ensure(tool::Miri {
-            compiler,
-            target: self.host,
-            extra_features: Vec::new(),
-        });
+        let miri =
+            builder.ensure(tool::Miri { compiler, target: self.host, extra_features: Vec::new() });
         if let Some(miri) = miri {
             let mut cargo = builder.cargo(compiler, Mode::ToolRustc, host, "install");
             cargo.arg("xargo");
@@ -407,16 +382,8 @@ impl Step for Miri
                 SourceType::Submodule,
                 &[],
             );
-            cargo
-                .arg("--bin")
-                .arg("cargo-miri")
-                .arg("--")
-                .arg("miri")
-                .arg("setup");
-
-            // Tell `cargo miri` not to worry about the sysroot mismatch (we built with
-            // stage1 but run with stage2).
-            cargo.env("MIRI_SKIP_SYSROOT_CHECK", "1");
+            cargo.arg("--bin").arg("cargo-miri").arg("--").arg("miri").arg("setup");

             // Tell `cargo miri setup` where to find the sources.
             cargo.env("XARGO_RUST_SRC", builder.src.join("src"));
             // Debug things.
@@ -441,7 +408,8 @@ impl Step for Miri
                 String::new()
             } else {
                 builder.verbose(&format!("running: {:?}", cargo));
-                let out = cargo.output()
+                let out = cargo
+                    .output()
                     .expect("We already ran `cargo miri setup` before and that worked");
                 assert!(out.status.success(), "`cargo miri setup` returned with non-0 exit code");
                 // Output is "<sysroot>\n".
@@ -497,9 +465,7 @@ impl Step for CompiletestTest
     }

     fn make_run(run: RunConfig<'_>) {
-        run.builder.ensure(CompiletestTest {
-            host: run.target,
-        });
+        run.builder.ensure(CompiletestTest { host: run.target });
     }

     /// Runs `cargo test` for compiletest.
@@ -507,14 +473,16 @@ impl Step for CompiletestTest
         let host = self.host;
         let compiler = builder.compiler(0, host);

-        let cargo = tool::prepare_tool_cargo(builder,
-                                             compiler,
-                                             Mode::ToolBootstrap,
-                                             host,
-                                             "test",
-                                             "src/tools/compiletest",
-                                             SourceType::InTree,
-                                             &[]);
+        let cargo = tool::prepare_tool_cargo(
+            builder,
+            compiler,
+            Mode::ToolBootstrap,
+            host,
+            "test",
+            "src/tools/compiletest",
+            SourceType::InTree,
+            &[],
+        );

         try_run(builder, &mut cargo.into());
     }
@@ -536,10 +504,7 @@ impl Step for Clippy
     }

     fn make_run(run: RunConfig<'_>) {
-        run.builder.ensure(Clippy {
-            stage: run.builder.top_stage,
-            host: run.target,
-        });
+        run.builder.ensure(Clippy { stage: run.builder.top_stage, host: run.target });
     }

     /// Runs `cargo test` for clippy.
@@ -554,22 +519,22 @@ impl Step for Clippy
             extra_features: Vec::new(),
         });
         if let Some(clippy) = clippy {
-            let mut cargo = tool::prepare_tool_cargo(builder,
-                                                     compiler,
-                                                     Mode::ToolRustc,
-                                                     host,
-                                                     "test",
-                                                     "src/tools/clippy",
-                                                     SourceType::Submodule,
-                                                     &[]);
+            let mut cargo = tool::prepare_tool_cargo(
+                builder,
+                compiler,
+                Mode::ToolRustc,
+                host,
+                "test",
+                "src/tools/clippy",
+                SourceType::Submodule,
+                &[],
+            );

             // clippy tests need to know about the stage sysroot
             cargo.env("SYSROOT", builder.sysroot(compiler));
             cargo.env("RUSTC_TEST_SUITE", builder.rustc(compiler));
             cargo.env("RUSTC_LIB_PATH", builder.rustc_libdir(compiler));
-            let host_libs = builder
-                .stage_out(compiler, Mode::ToolRustc)
-                .join(builder.cargo_dir());
+            let host_libs = builder.stage_out(compiler, Mode::ToolRustc).join(builder.cargo_dir());
             let target_libs = builder
                 .stage_out(compiler, Mode::ToolRustc)
                 .join(&self.host)
@@ -623,19 +588,10 @@ impl Step for RustdocTheme
         let rustdoc = builder.out.join("bootstrap/debug/rustdoc");
         let mut cmd = builder.tool_cmd(Tool::RustdocTheme);
         cmd.arg(rustdoc.to_str().unwrap())
-            .arg(
-                builder
-                    .src
-                    .join("src/librustdoc/html/static/themes")
-                    .to_str()
-                    .unwrap(),
-            )
+            .arg(builder.src.join("src/librustdoc/html/static/themes").to_str().unwrap())
             .env("RUSTC_STAGE", self.compiler.stage.to_string())
             .env("RUSTC_SYSROOT", builder.sysroot(self.compiler))
-            .env(
-                "RUSTDOC_LIBDIR",
-                builder.sysroot_libdir(self.compiler, self.compiler.host),
-            )
+            .env("RUSTDOC_LIBDIR", builder.sysroot_libdir(self.compiler, self.compiler.host))
             .env("CFG_RELEASE_CHANNEL", &builder.config.channel)
             .env("RUSTDOC_REAL", builder.rustdoc(self.compiler))
             .env("RUSTDOC_CRATE_VERSION", builder.rust_version())
@@ -663,25 +619,17 @@ impl Step for RustdocJSStd
     }

     fn make_run(run: RunConfig<'_>) {
-        run.builder.ensure(RustdocJSStd {
-            host: run.host,
-            target: run.target,
-        });
+        run.builder.ensure(RustdocJSStd { host: run.host, target: run.target });
     }

     fn run(self, builder: &Builder<'_>) {
         if let Some(ref nodejs) = builder.config.nodejs {
             let mut command = Command::new(nodejs);
             command.args(&["src/tools/rustdoc-js-std/tester.js", &*self.host]);
-            builder.ensure(crate::doc::Std {
-                target: self.target,
-                stage: builder.top_stage,
-            });
+            builder.ensure(crate::doc::Std { target: self.target, stage: builder.top_stage });
             builder.run(&mut command);
         } else {
-            builder.info(
-                "No nodejs found, skipping \"src/test/rustdoc-js-std\" tests"
-            );
+            builder.info("No nodejs found, skipping \"src/test/rustdoc-js-std\" tests");
         }
     }
 }
@@ -704,11 +652,7 @@ impl Step for RustdocJSNotStd
     fn make_run(run: RunConfig<'_>) {
         let compiler = run.builder.compiler(run.builder.top_stage, run.host);
-        run.builder.ensure(RustdocJSNotStd {
-            host: run.host,
-            target: run.target,
-            compiler,
-        });
+        run.builder.ensure(RustdocJSNotStd { host: run.host, target: run.target, compiler });
     }

     fn run(self, builder: &Builder<'_>) {
@@ -722,9 +666,7 @@ impl Step for RustdocJSNotStd
                 compare_mode: None,
             });
         } else {
-            builder.info(
-                "No nodejs found, skipping \"src/test/rustdoc-js\" tests"
-            );
+            builder.info("No nodejs found, skipping \"src/test/rustdoc-js\" tests");
         }
     }
 }
@@ -747,11 +689,7 @@ impl Step for RustdocUi
     fn make_run(run: RunConfig<'_>) {
         let compiler = run.builder.compiler(run.builder.top_stage, run.host);
-        run.builder.ensure(RustdocUi {
-            host: run.host,
-            target: run.target,
-            compiler,
-        });
+        run.builder.ensure(RustdocUi { host: run.host, target: run.target, compiler });
     }

     fn run(self, builder: &Builder<'_>) {
@@ -779,6 +717,9 @@ impl Step for Tidy
     /// This tool in `src/tools` checks up on various bits and pieces of style and
     /// otherwise just implements a few lint-like checks that are specific to the
     /// compiler itself.
+    ///
+    /// Once tidy passes, this step also runs `fmt --check` if tests are being run
+    /// for the `dev` or `nightly` channels.
     fn run(self, builder: &Builder<'_>) {
         let mut cmd = builder.tool_cmd(Tool::Tidy);
         cmd.arg(builder.src.join("src"));
@@ -792,6 +733,11 @@ impl Step for Tidy
         builder.info("tidy check");
         try_run(builder, &mut cmd);
+
+        if builder.config.channel == "dev" || builder.config.channel == "nightly" {
+            builder.info("fmt check");
+            crate::format::format(&builder.build, !builder.config.cmd.bless());
+        }
     }

     fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
@@ -810,37 +756,55 @@ fn testdir(builder: &Builder<'_>, host: Interned<String>) -> PathBuf
 macro_rules! default_test {
     ($name:ident { path: $path:expr, mode: $mode:expr, suite: $suite:expr }) => {
         test!($name { path: $path, mode: $mode, suite: $suite, default: true, host: false });
-    }
+    };
 }

 macro_rules! default_test_with_compare_mode {
     ($name:ident { path: $path:expr, mode: $mode:expr, suite: $suite:expr,
                    compare_mode: $compare_mode:expr }) => {
-        test_with_compare_mode!($name { path: $path, mode: $mode, suite: $suite, default: true,
-                                        host: false, compare_mode: $compare_mode });
-    }
+        test_with_compare_mode!($name {
+            path: $path,
+            mode: $mode,
+            suite: $suite,
+            default: true,
+            host: false,
+            compare_mode: $compare_mode
+        });
+    };
 }

 macro_rules! host_test {
     ($name:ident { path: $path:expr, mode: $mode:expr, suite: $suite:expr }) => {
         test!($name { path: $path, mode: $mode, suite: $suite, default: true, host: true });
-    }
+    };
 }

 macro_rules! test {
     ($name:ident { path: $path:expr, mode: $mode:expr, suite: $suite:expr, default: $default:expr,
                    host: $host:expr }) => {
-        test_definitions!($name { path: $path, mode: $mode, suite: $suite, default: $default,
-                                  host: $host, compare_mode: None });
-    }
+        test_definitions!($name {
+            path: $path,
+            mode: $mode,
+            suite: $suite,
+            default: $default,
+            host: $host,
+            compare_mode: None
+        });
+    };
 }

 macro_rules! test_with_compare_mode {
     ($name:ident { path: $path:expr, mode: $mode:expr, suite: $suite:expr, default: $default:expr,
                    host: $host:expr, compare_mode: $compare_mode:expr }) => {
-        test_definitions!($name { path: $path, mode: $mode, suite: $suite, default: $default,
-                                  host: $host, compare_mode: Some($compare_mode) });
-    }
+        test_definitions!($name {
+            path: $path,
+            mode: $mode,
+            suite: $suite,
+            default: $default,
+            host: $host,
+            compare_mode: Some($compare_mode)
+        });
+    };
 }

 macro_rules! test_definitions {
@@ -870,10 +834,7 @@ macro_rules! test_definitions
             fn make_run(run: RunConfig<'_>) {
                 let compiler = run.builder.compiler(run.builder.top_stage, run.host);
-                run.builder.ensure($name {
-                    compiler,
-                    target: run.target,
-                });
+                run.builder.ensure($name { compiler, target: run.target });
             }

             fn run(self, builder: &Builder<'_>) {
@@ -887,7 +848,7 @@ macro_rules! test_definitions
                 })
             }
         }
-    }
+    };
 }

 default_test_with_compare_mode!(Ui {
@@ -903,11 +864,7 @@ default_test!(CompileFail
     suite: "compile-fail"
 });

-default_test!(RunFail {
-    path: "src/test/run-fail",
-    mode: "run-fail",
-    suite: "run-fail"
-});
+default_test!(RunFail { path: "src/test/run-fail", mode: "run-fail", suite: "run-fail" });

 default_test!(RunPassValgrind {
     path: "src/test/run-pass-valgrind",
@@ -915,17 +872,9 @@ default_test!(RunPassValgrind
     suite: "run-pass-valgrind"
 });

-default_test!(MirOpt {
-    path: "src/test/mir-opt",
-    mode: "mir-opt",
-    suite: "mir-opt"
-});
+default_test!(MirOpt { path: "src/test/mir-opt", mode: "mir-opt", suite: "mir-opt" });

-default_test!(Codegen {
-    path: "src/test/codegen",
-    mode: "codegen",
-    suite: "codegen"
-});
+default_test!(Codegen { path: "src/test/codegen", mode: "codegen", suite: "codegen" });

 default_test!(CodegenUnits {
     path: "src/test/codegen-units",
@@ -939,29 +888,13 @@ default_test!(Incremental
     suite: "incremental"
 });

-default_test!(Debuginfo {
-    path: "src/test/debuginfo",
-    mode: "debuginfo",
-    suite: "debuginfo"
-});
+default_test!(Debuginfo { path: "src/test/debuginfo", mode: "debuginfo", suite: "debuginfo" });

-host_test!(UiFullDeps {
-    path: "src/test/ui-fulldeps",
-    mode: "ui",
-    suite: "ui-fulldeps"
-});
+host_test!(UiFullDeps { path: "src/test/ui-fulldeps", mode: "ui", suite: "ui-fulldeps" });

-host_test!(Rustdoc {
-    path: "src/test/rustdoc",
-    mode: "rustdoc",
-    suite: "rustdoc"
-});
+host_test!(Rustdoc { path: "src/test/rustdoc", mode: "rustdoc", suite: "rustdoc" });

-host_test!(Pretty {
-    path: "src/test/pretty",
-    mode: "pretty",
-    suite: "pretty"
-});
+host_test!(Pretty { path: "src/test/pretty", mode: "pretty", suite: "pretty" });

 test!(RunFailPretty {
     path: "src/test/run-fail/pretty",
     mode: "pretty",
@@ -977,11 +910,7 @@ test!(RunPassValgrindPretty
     host: true
 });

-default_test!(RunMake {
-    path: "src/test/run-make",
-    mode: "run-make",
-    suite: "run-make"
-});
+default_test!(RunMake { path: "src/test/run-make", mode: "run-make", suite: "run-make" });

 host_test!(RunMakeFullDeps {
     path: "src/test/run-make-fulldeps",
@@ -989,11 +918,7 @@ host_test!(RunMakeFullDeps
     suite: "run-make-fulldeps"
 });

-default_test!(Assembly {
-    path: "src/test/assembly",
-    mode: "assembly",
-    suite: "assembly"
-});
+default_test!(Assembly { path: "src/test/assembly", mode: "assembly", suite: "assembly" });

 #[derive(Debug, Copy, Clone, PartialEq, Eq, Hash)]
 struct Compiletest {
@@ -1032,18 +957,8 @@ impl Step for Compiletest
         }

         if suite == "debuginfo" {
-            let msvc = builder.config.build.contains("msvc");
-            if mode == "debuginfo" {
-                return builder.ensure(Compiletest {
-                    mode: if msvc { "debuginfo-cdb" } else { "debuginfo-gdb+lldb" },
-                    ..self
-                });
-            }
-
-            builder.ensure(dist::DebuggerScripts {
-                sysroot: builder.sysroot(compiler),
-                host: target,
-            });
+            builder
+                .ensure(dist::DebuggerScripts { sysroot: builder.sysroot(compiler), host: target });
         }

         if suite.ends_with("fulldeps") {
@@ -1069,10 +984,8 @@ impl Step for Compiletest
         // compiletest currently has... a lot of arguments, so let's just pass all
         // of them!

-        cmd.arg("--compile-lib-path")
-            .arg(builder.rustc_libdir(compiler));
-        cmd.arg("--run-lib-path")
-            .arg(builder.sysroot_libdir(compiler, target));
+        cmd.arg("--compile-lib-path").arg(builder.rustc_libdir(compiler));
+        cmd.arg("--run-lib-path").arg(builder.sysroot_libdir(compiler, target));
         cmd.arg("--rustc-path").arg(builder.rustc(compiler));

         let is_rustdoc = suite.ends_with("rustdoc-ui") || suite.ends_with("rustdoc-js");
@@ -1083,33 +996,25 @@ impl Step for Compiletest
             || (mode == "ui" && is_rustdoc)
             || mode == "js-doc-test"
         {
-            cmd.arg("--rustdoc-path")
-                .arg(builder.rustdoc(compiler));
+            cmd.arg("--rustdoc-path").arg(builder.rustdoc(compiler));
         }

-        cmd.arg("--src-base")
-            .arg(builder.src.join("src/test").join(suite));
-        cmd.arg("--build-base")
-            .arg(testdir(builder, compiler.host).join(suite));
-        cmd.arg("--stage-id")
-            .arg(format!("stage{}-{}", compiler.stage, target));
+        cmd.arg("--src-base").arg(builder.src.join("src/test").join(suite));
+        cmd.arg("--build-base").arg(testdir(builder, compiler.host).join(suite));
+        cmd.arg("--stage-id").arg(format!("stage{}-{}", compiler.stage, target));
         cmd.arg("--mode").arg(mode);
         cmd.arg("--target").arg(target);
         cmd.arg("--host").arg(&*compiler.host);
-        cmd.arg("--llvm-filecheck")
-            .arg(builder.llvm_filecheck(builder.config.build));
+        cmd.arg("--llvm-filecheck").arg(builder.llvm_filecheck(builder.config.build));

         if builder.config.cmd.bless() {
             cmd.arg("--bless");
         }

-        let compare_mode = builder.config.cmd.compare_mode().or_else(|| {
-            if builder.config.test_compare_mode {
-                self.compare_mode
-            } else {
-                None
-            }
-        });
+        let compare_mode =
+            builder.config.cmd.compare_mode().or_else(|| {
+                if builder.config.test_compare_mode { self.compare_mode } else { None }
+            });

         if let Some(ref pass) = builder.config.cmd.pass() {
             cmd.arg("--pass");
@@ -1120,11 +1025,7 @@ impl Step for Compiletest
             cmd.arg("--nodejs").arg(nodejs);
         }

-        let mut flags = if is_rustdoc {
-            Vec::new()
-        } else {
-            vec!["-Crpath".to_string()]
-        };
+        let mut flags = if is_rustdoc { Vec::new() } else { vec!["-Crpath".to_string()] };
         if !is_rustdoc {
             if builder.config.rust_optimize_tests {
                 flags.push("-O".to_string());
@@ -1139,26 +1040,20 @@ impl Step for Compiletest
         }

         let mut hostflags = flags.clone();
-        hostflags.push(format!(
-            "-Lnative={}",
-            builder.test_helpers_out(compiler.host).display()
-        ));
+        hostflags.push(format!("-Lnative={}", builder.test_helpers_out(compiler.host).display()));
         cmd.arg("--host-rustcflags").arg(hostflags.join(" "));

         let mut targetflags = flags;
-        targetflags.push(format!(
-            "-Lnative={}",
-            builder.test_helpers_out(target).display()
-        ));
+        targetflags.push(format!("-Lnative={}", builder.test_helpers_out(target).display()));
         cmd.arg("--target-rustcflags").arg(targetflags.join(" "));

         cmd.arg("--docck-python").arg(builder.python());

         if builder.config.build.ends_with("apple-darwin") {
-            // Force /usr/bin/python on macOS for LLDB tests because we're loading the
+            // Force /usr/bin/python3 on macOS for LLDB tests because we're loading the
             // LLDB plugin's compiled module which only works with the system python
             // (namely not Homebrew-installed python)
-            cmd.arg("--lldb-python").arg("/usr/bin/python");
+            cmd.arg("--lldb-python").arg("/usr/bin/python3");
         } else {
             cmd.arg("--lldb-python").arg(builder.python());
         }
@@ -1170,9 +1065,10 @@ impl Step for Compiletest
         let run = |cmd: &mut Command| {
             cmd.output().map(|output| {
                 String::from_utf8_lossy(&output.stdout)
-                    .lines().next().unwrap_or_else(|| {
-                        panic!("{:?} failed {:?}", cmd, output)
-                    }).to_string()
+                    .lines()
+                    .next()
+                    .unwrap_or_else(|| panic!("{:?} failed {:?}", cmd, output))
+                    .to_string()
             })
         };
         let lldb_exe = if builder.config.lldb_enabled {
@@ -1184,7 +1080,7 @@ impl Step for Compiletest
         let lldb_version = Command::new(&lldb_exe)
             .arg("--version")
             .output()
-            .map(|output| { String::from_utf8_lossy(&output.stdout).to_string() })
+            .map(|output| String::from_utf8_lossy(&output.stdout).to_string())
             .ok();
         if let Some(ref vers) = lldb_version {
             cmd.arg("--lldb-version").arg(vers);
@@ -1208,11 +1104,9 @@ impl Step for Compiletest
             // Get test-args by striping suite path
             let mut test_args: Vec<&str> = paths
                 .iter()
-                .map(|p| {
-                    match p.strip_prefix(".") {
-                        Ok(path) => path,
-                        Err(_) => p,
-                    }
+                .map(|p| match p.strip_prefix(".") {
+                    Ok(path) => path,
+                    Err(_) => p,
                 })
                 .filter(|p| p.starts_with(suite_path) && (p.is_dir() || p.is_file()))
                 .filter_map(|p| {
@@ -1242,9 +1136,7 @@ impl Step for Compiletest
         }

         if builder.config.llvm_enabled() {
-            let llvm_config = builder.ensure(native::Llvm {
-                target: builder.config.build,
-            });
+            let llvm_config = builder.ensure(native::Llvm { target: builder.config.build });
             if !builder.config.dry_run {
                 let llvm_version = output(Command::new(&llvm_config).arg("--version"));
                 cmd.arg("--llvm-version").arg(llvm_version);
@@ -1274,23 +1166,24 @@ impl Step for Compiletest
                 // The llvm/bin directory contains many useful cross-platform
                 // tools. Pass the path to run-make tests so they can use them.
-                let llvm_bin_path = llvm_config.parent()
+                let llvm_bin_path = llvm_config
+                    .parent()
                     .expect("Expected llvm-config to be contained in directory");
                 assert!(llvm_bin_path.is_dir());
                 cmd.arg("--llvm-bin-dir").arg(llvm_bin_path);

                 // If LLD is available, add it to the PATH
                 if builder.config.lld_enabled {
-                    let lld_install_root = builder.ensure(native::Lld {
-                        target: builder.config.build,
-                    });
+                    let lld_install_root =
+                        builder.ensure(native::Lld { target: builder.config.build });

                     let lld_bin_path = lld_install_root.join("bin");

                     let old_path = env::var_os("PATH").unwrap_or_default();
-                    let new_path = env::join_paths(std::iter::once(lld_bin_path)
-                        .chain(env::split_paths(&old_path)))
-                        .expect("Could not add LLD bin path to PATH");
+                    let new_path = env::join_paths(
+                        std::iter::once(lld_bin_path).chain(env::split_paths(&old_path)),
+                    )
+                    .expect("Could not add LLD bin path to PATH");
                     cmd.env("PATH", new_path);
                 }
             }
@@ -1310,8 +1203,7 @@ impl Step for Compiletest
         }

         if builder.remote_tested(target) {
-            cmd.arg("--remote-test-client")
-                .arg(builder.tool_exe(Tool::RemoteTestClient));
+            cmd.arg("--remote-test-client").arg(builder.tool_exe(Tool::RemoteTestClient));
         }

         // Running a C compiler on MSVC requires a few env vars to be set, to be
@@ -1341,7 +1233,6 @@ impl Step for Compiletest
         std::fs::create_dir_all(&tmp).unwrap();
         cmd.env("RUST_TEST_TMPDIR", tmp);
-
         cmd.arg("--adb-path").arg("adb");
         cmd.arg("--adb-test-dir").arg(ADB_TEST_DIR);
         if target.contains("android") {
@@ -1401,10 +1292,7 @@ impl Step for DocTest
     fn run(self, builder: &Builder<'_>) {
         let compiler = self.compiler;

-        builder.ensure(compile::Std {
-            compiler,
-            target: compiler.host,
-        });
+        builder.ensure(compile::Std { compiler, target: compiler.host });

         // Do a breadth-first traversal of the `src/doc` directory and just run
         // tests for all files that end in `*.md`
@@ -1508,9 +1396,8 @@ impl Step for ErrorIndex
     }

     fn make_run(run: RunConfig<'_>) {
-        run.builder.ensure(ErrorIndex {
-            compiler: run.builder.compiler(run.builder.top_stage, run.host),
-        });
+        run.builder
+            .ensure(ErrorIndex { compiler: run.builder.compiler(run.builder.top_stage, run.host) });
     }

     /// Runs the error index generator tool to execute the tests located in the error
@@ -1522,10 +1409,7 @@ impl Step for ErrorIndex
     fn run(self, builder: &Builder<'_>) {
         let compiler = self.compiler;

-        builder.ensure(compile::Std {
-            compiler,
-            target: compiler.host,
-        });
+        builder.ensure(compile::Std { compiler, target: compiler.host });

         let dir = testdir(builder, compiler.host);
         t!(fs::create_dir_all(&dir));
@@ -1535,9 +1419,7 @@ impl Step for ErrorIndex
             builder,
             builder.compiler(compiler.stage, builder.config.build),
         );
-        tool.arg("markdown")
-            .arg(&output)
-            .env("CFG_BUILD", &builder.config.build);
+        tool.arg("markdown").arg(&output).env("CFG_BUILD", &builder.config.build);

         builder.info(&format!("Testing error-index stage{}", compiler.stage));
         let _time = util::timeit(&builder);
@@ -1769,7 +1651,7 @@ impl Step for Crate
         let mut cargo = builder.cargo(compiler, mode, target, test_kind.subcommand());
         match mode {
             Mode::Std => {
-                compile::std_cargo(builder, &compiler, target, &mut cargo);
+                compile::std_cargo(builder, target, &mut cargo);
             }
             Mode::Rustc => {
                 builder.ensure(compile::Rustc { compiler, target });
@@ -1817,23 +1699,12 @@ impl Step for Crate
         if target.contains("emscripten") {
             cargo.env(
                 format!("CARGO_TARGET_{}_RUNNER", envify(&target)),
-                builder
-                    .config
-                    .nodejs
-                    .as_ref()
-                    .expect("nodejs not configured"),
+                builder.config.nodejs.as_ref().expect("nodejs not configured"),
             );
         } else if target.starts_with("wasm32") {
-            let node = builder
-                .config
-                .nodejs
-                .as_ref()
-                .expect("nodejs not configured");
-            let runner = format!(
-                "{} {}/src/etc/wasm32-shim.js",
-                node.display(),
-                builder.src.display()
-            );
+            let node = builder.config.nodejs.as_ref().expect("nodejs not configured");
+            let runner =
+                format!("{} {}/src/etc/wasm32-shim.js", node.display(), builder.src.display());
             cargo.env(format!("CARGO_TARGET_{}_RUNNER", envify(&target)), &runner);
         } else if builder.remote_tested(target) {
             cargo.env(
@@ -1871,10 +1742,7 @@ impl Step for CrateRustdoc
         let test_kind = builder.kind.into();

-        builder.ensure(CrateRustdoc {
-            host: run.host,
-            test_kind,
-        });
+        builder.ensure(CrateRustdoc { host: run.host, test_kind });
     }

     fn run(self, builder: &Builder<'_>) {
@@ -1884,14 +1752,16 @@ impl Step for CrateRustdoc
         let target = compiler.host;
         builder.ensure(compile::Rustc { compiler, target });

-        let mut cargo = tool::prepare_tool_cargo(builder,
-                                                 compiler,
-                                                 Mode::ToolRustc,
-                                                 target,
-                                                 test_kind.subcommand(),
-                                                 "src/tools/rustdoc",
-                                                 SourceType::InTree,
-                                                 &[]);
+        let mut cargo = tool::prepare_tool_cargo(
+            builder,
+            compiler,
+            Mode::ToolRustc,
+            target,
+            test_kind.subcommand(),
+            "src/tools/rustdoc",
+            SourceType::InTree,
+            &[],
+        );
         if test_kind.subcommand() == "test" && !builder.fail_fast {
             cargo.arg("--no-fail-fast");
         }
@@ -1953,18 +1823,13 @@ impl Step for RemoteCopyLibs
         builder.info(&format!("REMOTE copy libs to emulator ({})", target));
         t!(fs::create_dir_all(builder.out.join("tmp")));

-        let server = builder.ensure(tool::RemoteTestServer {
-            compiler: compiler.with_stage(0),
-            target,
-        });
+        let server =
+            builder.ensure(tool::RemoteTestServer { compiler: compiler.with_stage(0), target });

         // Spawn the emulator and wait for it to come online
         let tool = builder.tool_exe(Tool::RemoteTestClient);
         let mut cmd = Command::new(&tool);
-        cmd.arg("spawn-emulator")
-            .arg(target)
-            .arg(&server)
-            .arg(builder.out.join("tmp"));
+        cmd.arg("spawn-emulator").arg(target).arg(&server).arg(builder.out.join("tmp"));
         if let Some(rootfs) = builder.qemu_rootfs(target) {
             cmd.arg(rootfs);
         }
@@ -2019,9 +1884,7 @@ impl Step for Distcheck
                 .current_dir(&dir),
         );
         builder.run(
-            Command::new(build_helper::make(&builder.config.build))
-                .arg("check")
-                .current_dir(&dir),
+            Command::new(build_helper::make(&builder.config.build)).arg("check").current_dir(&dir),
         );

         // Now make sure that rust-src has all of libstd's dependencies


@@ -1,20 +1,20 @@
-use std::fs;
-use std::env;
-use std::path::PathBuf;
-use std::process::{Command, exit};
 use std::collections::HashSet;
+use std::env;
+use std::fs;
+use std::path::PathBuf;
+use std::process::{exit, Command};

 use build_helper::t;

-use crate::Mode;
-use crate::Compiler;
-use crate::builder::{Step, RunConfig, ShouldRun, Builder, Cargo as CargoCommand};
-use crate::util::{exe, add_lib_path, CiEnv};
-use crate::compile;
-use crate::channel::GitInfo;
-use crate::channel;
+use crate::builder::{Builder, Cargo as CargoCommand, RunConfig, ShouldRun, Step};
 use crate::cache::Interned;
+use crate::channel;
+use crate::channel::GitInfo;
+use crate::compile;
 use crate::toolstate::ToolState;
+use crate::util::{add_lib_path, exe, CiEnv};
+use crate::Compiler;
+use crate::Mode;

 #[derive(Debug, Clone, Hash, PartialEq, Eq)]
 pub enum SourceType {
@ -53,14 +53,10 @@ impl Step for ToolBuild {
let is_optional_tool = self.is_optional_tool; let is_optional_tool = self.is_optional_tool;
match self.mode { match self.mode {
Mode::ToolRustc => { Mode::ToolRustc => builder.ensure(compile::Rustc { compiler, target }),
builder.ensure(compile::Rustc { compiler, target }) Mode::ToolStd => builder.ensure(compile::Std { compiler, target }),
}
Mode::ToolStd => {
builder.ensure(compile::Std { compiler, target })
}
Mode::ToolBootstrap => {} // uses downloaded stage0 compiler libs Mode::ToolBootstrap => {} // uses downloaded stage0 compiler libs
_ => panic!("unexpected Mode for tool build") _ => panic!("unexpected Mode for tool build"),
} }
let cargo = prepare_tool_cargo( let cargo = prepare_tool_cargo(
@ -79,12 +75,7 @@ impl Step for ToolBuild {
let is_expected = compile::stream_cargo(builder, cargo, vec![], &mut |msg| { let is_expected = compile::stream_cargo(builder, cargo, vec![], &mut |msg| {
// Only care about big things like the RLS/Cargo for now // Only care about big things like the RLS/Cargo for now
match tool { match tool {
| "rls" "rls" | "cargo" | "clippy-driver" | "miri" | "rustfmt" => {}
| "cargo"
| "clippy-driver"
| "miri"
| "rustfmt"
=> {}
_ => return, _ => return,
} }
@ -94,9 +85,7 @@ impl Step for ToolBuild {
features, features,
filenames, filenames,
target: _, target: _,
} => { } => (package_id, features, filenames),
(package_id, features, filenames)
}
_ => return, _ => return,
}; };
let features = features.iter().map(|s| s.to_string()).collect::<Vec<_>>(); let features = features.iter().map(|s| s.to_string()).collect::<Vec<_>>();
@ -105,7 +94,7 @@ impl Step for ToolBuild {
let val = (tool, PathBuf::from(&*path), features.clone()); let val = (tool, PathBuf::from(&*path), features.clone());
// we're only interested in deduplicating rlibs for now // we're only interested in deduplicating rlibs for now
if val.1.extension().and_then(|s| s.to_str()) != Some("rlib") { if val.1.extension().and_then(|s| s.to_str()) != Some("rlib") {
continue continue;
} }
// Don't worry about compiles that turn out to be host // Don't worry about compiles that turn out to be host
@ -132,9 +121,7 @@ impl Step for ToolBuild {
// already listed then we need to see if we reused the same // already listed then we need to see if we reused the same
// artifact or produced a duplicate. // artifact or produced a duplicate.
let mut artifacts = builder.tool_artifacts.borrow_mut(); let mut artifacts = builder.tool_artifacts.borrow_mut();
let prev_artifacts = artifacts let prev_artifacts = artifacts.entry(target).or_default();
.entry(target)
.or_default();
let prev = match prev_artifacts.get(&*id) { let prev = match prev_artifacts.get(&*id) {
Some(prev) => prev, Some(prev) => prev,
None => { None => {
@ -160,21 +147,21 @@ impl Step for ToolBuild {
// ... and otherwise this looks like we duplicated some sort of // ... and otherwise this looks like we duplicated some sort of
// compilation, so record it to generate an error later. // compilation, so record it to generate an error later.
duplicates.push(( duplicates.push((id.to_string(), val, prev.clone()));
id.to_string(),
val,
prev.clone(),
));
} }
}); });
if is_expected && !duplicates.is_empty() { if is_expected && !duplicates.is_empty() {
println!("duplicate artifacts found when compiling a tool, this \ println!(
"duplicate artifacts found when compiling a tool, this \
typically means that something was recompiled because \ typically means that something was recompiled because \
a transitive dependency has different features activated \ a transitive dependency has different features activated \
than in a previous build:\n"); than in a previous build:\n"
println!("the following dependencies are duplicated although they \ );
have the same features enabled:"); println!(
"the following dependencies are duplicated although they \
have the same features enabled:"
);
for (id, cur, prev) in duplicates.drain_filter(|(_, cur, prev)| cur.2 == prev.2) { for (id, cur, prev) in duplicates.drain_filter(|(_, cur, prev)| cur.2 == prev.2) {
println!(" {}", id); println!(" {}", id);
// same features // same features
@ -185,24 +172,33 @@ impl Step for ToolBuild {
println!(" {}", id); println!(" {}", id);
let cur_features: HashSet<_> = cur.2.into_iter().collect(); let cur_features: HashSet<_> = cur.2.into_iter().collect();
let prev_features: HashSet<_> = prev.2.into_iter().collect(); let prev_features: HashSet<_> = prev.2.into_iter().collect();
println!(" `{}` additionally enabled features {:?} at {:?}", println!(
cur.0, &cur_features - &prev_features, cur.1); " `{}` additionally enabled features {:?} at {:?}",
println!(" `{}` additionally enabled features {:?} at {:?}", cur.0,
prev.0, &prev_features - &cur_features, prev.1); &cur_features - &prev_features,
cur.1
);
println!(
" `{}` additionally enabled features {:?} at {:?}",
prev.0,
&prev_features - &cur_features,
prev.1
);
} }
println!(); println!();
println!("to fix this you will probably want to edit the local \ println!(
"to fix this you will probably want to edit the local \
src/tools/rustc-workspace-hack/Cargo.toml crate, as \ src/tools/rustc-workspace-hack/Cargo.toml crate, as \
that will update the dependency graph to ensure that \ that will update the dependency graph to ensure that \
these crates all share the same feature set"); these crates all share the same feature set"
);
panic!("tools should not compile multiple copies of the same crate"); panic!("tools should not compile multiple copies of the same crate");
} }
builder.save_toolstate(tool, if is_expected { builder.save_toolstate(
ToolState::TestFail tool,
} else { if is_expected { ToolState::TestFail } else { ToolState::BuildFail },
ToolState::BuildFail );
});
if !is_expected { if !is_expected {
if !is_optional_tool { if !is_optional_tool {
@ -211,8 +207,8 @@ impl Step for ToolBuild {
None None
} }
} else { } else {
let cargo_out = builder.cargo_out(compiler, self.mode, target) let cargo_out =
.join(exe(tool, &compiler.host)); builder.cargo_out(compiler, self.mode, target).join(exe(tool, &compiler.host));
let bin = builder.tools_dir(compiler).join(exe(tool, &compiler.host)); let bin = builder.tools_dir(compiler).join(exe(tool, &compiler.host));
builder.copy(&cargo_out, &bin); builder.copy(&cargo_out, &bin);
Some(bin) Some(bin)
@ -240,12 +236,12 @@ pub fn prepare_tool_cargo(
let mut features = extra_features.iter().cloned().collect::<Vec<_>>(); let mut features = extra_features.iter().cloned().collect::<Vec<_>>();
if builder.build.config.cargo_native_static { if builder.build.config.cargo_native_static {
if path.ends_with("cargo") || if path.ends_with("cargo")
path.ends_with("rls") || || path.ends_with("rls")
path.ends_with("clippy") || || path.ends_with("clippy")
path.ends_with("miri") || || path.ends_with("miri")
path.ends_with("rustbook") || || path.ends_with("rustbook")
path.ends_with("rustfmt") || path.ends_with("rustfmt")
{ {
cargo.env("LIBZ_SYS_STATIC", "1"); cargo.env("LIBZ_SYS_STATIC", "1");
features.push("rustc-workspace-hack/all-static".to_string()); features.push("rustc-workspace-hack/all-static".to_string());
@ -293,8 +289,8 @@ fn rustbook_features() -> Vec<String> {
macro_rules! bootstrap_tool { macro_rules! bootstrap_tool {
($( ($(
$name:ident, $path:expr, $tool_name:expr $name:ident, $path:expr, $tool_name:expr
$(,llvm_tools = $llvm:expr)*
$(,is_external_tool = $external:expr)* $(,is_external_tool = $external:expr)*
$(,is_unstable_tool = $unstable:expr)*
$(,features = $features:expr)* $(,features = $features:expr)*
; ;
)+) => { )+) => {
@ -305,15 +301,6 @@ macro_rules! bootstrap_tool {
)+ )+
} }
impl Tool {
/// Whether this tool requires LLVM to run
pub fn uses_llvm_tools(&self) -> bool {
match self {
$(Tool::$name => false $(|| $llvm)*,)+
}
}
}
impl<'a> Builder<'a> { impl<'a> Builder<'a> {
pub fn tool_exe(&self, tool: Tool) -> PathBuf { pub fn tool_exe(&self, tool: Tool) -> PathBuf {
match tool { match tool {
@ -354,7 +341,12 @@ macro_rules! bootstrap_tool {
compiler: self.compiler, compiler: self.compiler,
target: self.target, target: self.target,
tool: $tool_name, tool: $tool_name,
mode: Mode::ToolBootstrap, mode: if false $(|| $unstable)* {
// use in-tree libraries for unstable features
Mode::ToolStd
} else {
Mode::ToolBootstrap
},
path: $path, path: $path,
is_optional_tool: false, is_optional_tool: false,
source_type: if false $(|| $external)* { source_type: if false $(|| $external)* {
@ -381,7 +373,7 @@ bootstrap_tool!(
Tidy, "src/tools/tidy", "tidy"; Tidy, "src/tools/tidy", "tidy";
Linkchecker, "src/tools/linkchecker", "linkchecker"; Linkchecker, "src/tools/linkchecker", "linkchecker";
CargoTest, "src/tools/cargotest", "cargotest"; CargoTest, "src/tools/cargotest", "cargotest";
Compiletest, "src/tools/compiletest", "compiletest", llvm_tools = true; Compiletest, "src/tools/compiletest", "compiletest", is_unstable_tool = true;
BuildManifest, "src/tools/build-manifest", "build-manifest"; BuildManifest, "src/tools/build-manifest", "build-manifest";
RemoteTestClient, "src/tools/remote-test-client", "remote-test-client"; RemoteTestClient, "src/tools/remote-test-client", "remote-test-client";
RustInstaller, "src/tools/rust-installer", "fabricate", is_external_tool = true; RustInstaller, "src/tools/rust-installer", "fabricate", is_external_tool = true;
@ -395,9 +387,7 @@ pub struct ErrorIndex {
impl ErrorIndex { impl ErrorIndex {
pub fn command(builder: &Builder<'_>, compiler: Compiler) -> Command { pub fn command(builder: &Builder<'_>, compiler: Compiler) -> Command {
let mut cmd = Command::new(builder.ensure(ErrorIndex { let mut cmd = Command::new(builder.ensure(ErrorIndex { compiler }));
compiler
}));
add_lib_path( add_lib_path(
vec![PathBuf::from(&builder.sysroot_libdir(compiler, compiler.host))], vec![PathBuf::from(&builder.sysroot_libdir(compiler, compiler.host))],
&mut cmd, &mut cmd,
@ -417,22 +407,23 @@ impl Step for ErrorIndex {
// Compile the error-index in the same stage as rustdoc to avoid // Compile the error-index in the same stage as rustdoc to avoid
// recompiling rustdoc twice if we can. // recompiling rustdoc twice if we can.
let stage = if run.builder.top_stage >= 2 { run.builder.top_stage } else { 0 }; let stage = if run.builder.top_stage >= 2 { run.builder.top_stage } else { 0 };
run.builder.ensure(ErrorIndex { run.builder
compiler: run.builder.compiler(stage, run.builder.config.build), .ensure(ErrorIndex { compiler: run.builder.compiler(stage, run.builder.config.build) });
});
} }
fn run(self, builder: &Builder<'_>) -> PathBuf { fn run(self, builder: &Builder<'_>) -> PathBuf {
builder.ensure(ToolBuild { builder
compiler: self.compiler, .ensure(ToolBuild {
target: self.compiler.host, compiler: self.compiler,
tool: "error_index_generator", target: self.compiler.host,
mode: Mode::ToolRustc, tool: "error_index_generator",
path: "src/tools/error_index_generator", mode: Mode::ToolRustc,
is_optional_tool: false, path: "src/tools/error_index_generator",
source_type: SourceType::InTree, is_optional_tool: false,
extra_features: Vec::new(), source_type: SourceType::InTree,
}).expect("expected to build -- essential tool") extra_features: Vec::new(),
})
.expect("expected to build -- essential tool")
} }
} }
@ -457,16 +448,18 @@ impl Step for RemoteTestServer {
} }
fn run(self, builder: &Builder<'_>) -> PathBuf { fn run(self, builder: &Builder<'_>) -> PathBuf {
builder.ensure(ToolBuild { builder
compiler: self.compiler, .ensure(ToolBuild {
target: self.target, compiler: self.compiler,
tool: "remote-test-server", target: self.target,
mode: Mode::ToolStd, tool: "remote-test-server",
path: "src/tools/remote-test-server", mode: Mode::ToolStd,
is_optional_tool: false, path: "src/tools/remote-test-server",
source_type: SourceType::InTree, is_optional_tool: false,
extra_features: Vec::new(), source_type: SourceType::InTree,
}).expect("expected to build -- essential tool") extra_features: Vec::new(),
})
.expect("expected to build -- essential tool")
} }
} }
@ -487,9 +480,8 @@ impl Step for Rustdoc {
} }
fn make_run(run: RunConfig<'_>) { fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Rustdoc { run.builder
compiler: run.builder.compiler(run.builder.top_stage, run.host), .ensure(Rustdoc { compiler: run.builder.compiler(run.builder.top_stage, run.host) });
});
} }
fn run(self, builder: &Builder<'_>) -> PathBuf { fn run(self, builder: &Builder<'_>) -> PathBuf {
@ -525,14 +517,17 @@ impl Step for Rustdoc {
&[], &[],
); );
builder.info(&format!("Building rustdoc for stage{} ({})", builder.info(&format!(
target_compiler.stage, target_compiler.host)); "Building rustdoc for stage{} ({})",
target_compiler.stage, target_compiler.host
));
builder.run(&mut cargo.into()); builder.run(&mut cargo.into());
// Cargo adds a number of paths to the dylib search path on windows, which results in // Cargo adds a number of paths to the dylib search path on windows, which results in
// the wrong rustdoc being executed. To avoid the conflicting rustdocs, we name the "tool" // the wrong rustdoc being executed. To avoid the conflicting rustdocs, we name the "tool"
// rustdoc a different name. // rustdoc a different name.
let tool_rustdoc = builder.cargo_out(build_compiler, Mode::ToolRustc, target) let tool_rustdoc = builder
.cargo_out(build_compiler, Mode::ToolRustc, target)
.join(exe("rustdoc_tool_binary", &target_compiler.host)); .join(exe("rustdoc_tool_binary", &target_compiler.host));
// don't create a stage0-sysroot/bin directory. // don't create a stage0-sysroot/bin directory.
@ -574,16 +569,18 @@ impl Step for Cargo {
} }
fn run(self, builder: &Builder<'_>) -> PathBuf { fn run(self, builder: &Builder<'_>) -> PathBuf {
builder.ensure(ToolBuild { builder
compiler: self.compiler, .ensure(ToolBuild {
target: self.target, compiler: self.compiler,
tool: "cargo", target: self.target,
mode: Mode::ToolRustc, tool: "cargo",
path: "src/tools/cargo", mode: Mode::ToolRustc,
is_optional_tool: false, path: "src/tools/cargo",
source_type: SourceType::Submodule, is_optional_tool: false,
extra_features: Vec::new(), source_type: SourceType::Submodule,
}).expect("expected to build -- essential tool") extra_features: Vec::new(),
})
.expect("expected to build -- essential tool")
} }
} }
@ -682,7 +679,7 @@ impl<'a> Builder<'a> {
let curpaths = env::split_paths(&curpaths).collect::<Vec<_>>(); let curpaths = env::split_paths(&curpaths).collect::<Vec<_>>();
for &(ref k, ref v) in self.cc[&compiler.host].env() { for &(ref k, ref v) in self.cc[&compiler.host].env() {
if k != "PATH" { if k != "PATH" {
continue continue;
} }
for path in env::split_paths(v) { for path in env::split_paths(v) {
if !curpaths.contains(&path) { if !curpaths.contains(&path) {


@@ -1,25 +1,25 @@
-use serde::{Deserialize, Serialize};
+use crate::builder::{Builder, RunConfig, ShouldRun, Step};
 use build_helper::t;
-use std::time;
+use serde::{Deserialize, Serialize};
+use std::collections::HashMap;
+use std::env;
+use std::fmt;
 use std::fs;
 use std::io::{Seek, SeekFrom};
-use std::collections::HashMap;
-use crate::builder::{Builder, RunConfig, ShouldRun, Step};
-use std::fmt;
-use std::process::Command;
 use std::path::PathBuf;
-use std::env;
+use std::process::Command;
+use std::time;
 // Each cycle is 42 days long (6 weeks); the last week is 35..=42 then.
 const BETA_WEEK_START: u64 = 35;
-#[cfg(linux)]
+#[cfg(target_os = "linux")]
 const OS: Option<&str> = Some("linux");
 #[cfg(windows)]
 const OS: Option<&str> = Some("windows");
-#[cfg(all(not(linux), not(windows)))]
+#[cfg(all(not(target_os = "linux"), not(windows)))]
 const OS: Option<&str> = None;
 type ToolstateData = HashMap<Box<str>, ToolState>;
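The `#[cfg(linux)]` change in the hunk above is a real bug fix, not a style edit: `linux` is not a built-in cfg predicate, so the old attribute only matched a custom `--cfg linux` flag that was never passed, and the code always fell through to `OS = None`. A standalone sketch of the corrected detection (mirroring the diff's constants in a runnable program):

```rust
// Sketch of the corrected platform detection from toolstate.rs.
// `target_os = "linux"` is the built-in predicate that actually fires on
// Linux; the removed `#[cfg(linux)]` never matched anything.
#[cfg(target_os = "linux")]
const OS: Option<&str> = Some("linux");
#[cfg(windows)]
const OS: Option<&str> = Some("windows");
#[cfg(all(not(target_os = "linux"), not(windows)))]
const OS: Option<&str> = None;

fn main() {
    // Exactly one of the three cfg branches applies on any target, and the
    // compile-time choice agrees with the runtime cfg! macro.
    assert_eq!(OS == Some("linux"), cfg!(target_os = "linux"));
    assert_eq!(OS == Some("windows"), cfg!(windows));
    assert_eq!(OS.is_none(), !cfg!(target_os = "linux") && !cfg!(windows));
    println!("OS = {:?}", OS);
}
```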
@@ -38,11 +38,15 @@ pub enum ToolState {
 impl fmt::Display for ToolState {
     fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
-        write!(f, "{}", match self {
-            ToolState::TestFail => "test-fail",
-            ToolState::TestPass => "test-pass",
-            ToolState::BuildFail => "build-fail",
-        })
+        write!(
+            f,
+            "{}",
+            match self {
+                ToolState::TestFail => "test-fail",
+                ToolState::TestPass => "test-pass",
+                ToolState::BuildFail => "build-fail",
+            }
+        )
     }
 }
@@ -120,9 +124,7 @@ fn check_changed_files(toolstates: &HashMap<Box<str>, ToolState>) {
     let output = t!(String::from_utf8(output.stdout));
     for (tool, submodule) in STABLE_TOOLS.iter().chain(NIGHTLY_TOOLS.iter()) {
-        let changed = output.lines().any(|l| {
-            l.starts_with("M") && l.ends_with(submodule)
-        });
+        let changed = output.lines().any(|l| l.starts_with("M") && l.ends_with(submodule));
         eprintln!("Verifying status of {}...", tool);
         if !changed {
             continue;
@@ -179,8 +181,10 @@ impl Step for ToolStateCheck {
                     eprintln!("error: Tool `{}` should be test-pass but is {}", tool, state);
                 } else if in_beta_week {
                     did_error = true;
-                    eprintln!("error: Tool `{}` should be test-pass but is {} during beta week.",
-                        tool, state);
+                    eprintln!(
+                        "error: Tool `{}` should be test-pass but is {} during beta week.",
+                        tool, state
+                    );
                 }
             }
         }
@@ -210,11 +214,8 @@ impl Builder<'_> {
                 // Ensure the parent directory always exists
                 t!(std::fs::create_dir_all(parent));
             }
-            let mut file = t!(fs::OpenOptions::new()
-                .create(true)
-                .write(true)
-                .read(true)
-                .open(path));
+            let mut file =
+                t!(fs::OpenOptions::new().create(true).write(true).read(true).open(path));
             serde_json::from_reader(&mut file).unwrap_or_default()
         } else {
@@ -233,11 +234,8 @@ impl Builder<'_> {
                 // Ensure the parent directory always exists
                 t!(std::fs::create_dir_all(parent));
             }
-            let mut file = t!(fs::OpenOptions::new()
-                .create(true)
-                .read(true)
-                .write(true)
-                .open(path));
+            let mut file =
+                t!(fs::OpenOptions::new().create(true).read(true).write(true).open(path));
             let mut current_toolstates: HashMap<Box<str>, ToolState> =
                 serde_json::from_reader(&mut file).unwrap_or_default();
@@ -275,10 +273,7 @@ impl Builder<'_> {
 ///
 /// * See <https://help.github.com/articles/about-commit-email-addresses/>
 /// if a private email by GitHub is wanted.
-fn commit_toolstate_change(
-    current_toolstate: &ToolstateData,
-    in_beta_week: bool,
-) {
+fn commit_toolstate_change(current_toolstate: &ToolstateData, in_beta_week: bool) {
     fn git_config(key: &str, value: &str) {
         let status = Command::new("git").arg("config").arg("--global").arg(key).arg(value).status();
         let success = match status {
@@ -303,7 +298,8 @@ fn commit_toolstate_change(
     let git_credential_path = PathBuf::from(t!(env::var("HOME"))).join(".git-credentials");
     t!(fs::write(&git_credential_path, credential));
-    let status = Command::new("git").arg("clone")
+    let status = Command::new("git")
+        .arg("clone")
         .arg("--depth=1")
         .arg(t!(env::var("TOOLSTATE_REPO")))
         .status();
@@ -379,7 +375,7 @@ fn change_toolstate(
     let mut regressed = false;
     for repo_state in old_toolstate {
         let tool = &repo_state.tool;
-        let state = if cfg!(linux) {
+        let state = if cfg!(target_os = "linux") {
             &repo_state.linux
         } else if cfg!(windows) {
             &repo_state.windows
@@ -402,10 +398,7 @@ fn change_toolstate(
         std::process::exit(1);
     }
-    let commit = t!(std::process::Command::new("git")
-        .arg("rev-parse")
-        .arg("HEAD")
-        .output());
+    let commit = t!(std::process::Command::new("git").arg("rev-parse").arg("HEAD").output());
     let commit = t!(String::from_utf8(commit.stdout));
     let toolstate_serialized = t!(serde_json::to_string(&current_toolstate));
@@ -413,7 +406,7 @@ fn change_toolstate(
     let history_path = format!("rust-toolstate/history/{}.tsv", OS.expect("linux/windows only"));
     let mut file = t!(fs::read_to_string(&history_path));
     let end_of_first_line = file.find('\n').unwrap();
-    file.insert_str(end_of_first_line, &format!("{}\t{}\n", commit, toolstate_serialized));
+    file.insert_str(end_of_first_line, &format!("\n{}\t{}", commit.trim(), toolstate_serialized));
     t!(fs::write(&history_path, file));
 }


@@ -4,36 +4,28 @@
 //! not a lot of interesting happenings here unfortunately.
 use std::env;
-use std::str;
 use std::fs;
 use std::io;
 use std::path::{Path, PathBuf};
 use std::process::Command;
+use std::str;
 use std::time::Instant;
 use build_helper::t;
-use crate::config::Config;
 use crate::builder::Builder;
 use crate::cache::Interned;
+use crate::config::Config;
 /// Returns the `name` as the filename of a static library for `target`.
 pub fn staticlib(name: &str, target: &str) -> String {
-    if target.contains("windows") {
-        format!("{}.lib", name)
-    } else {
-        format!("lib{}.a", name)
-    }
+    if target.contains("windows") { format!("{}.lib", name) } else { format!("lib{}.a", name) }
 }
 /// Given an executable called `name`, return the filename for the
 /// executable for a particular target.
 pub fn exe(name: &str, target: &str) -> String {
-    if target.contains("windows") {
-        format!("{}.exe", name)
-    } else {
-        name.to_string()
-    }
+    if target.contains("windows") { format!("{}.exe", name) } else { name.to_string() }
 }
 /// Returns `true` if the file name given looks like a dynamic library.
@@ -44,7 +36,7 @@ pub fn is_dylib(name: &str) -> bool {
 /// Returns the corresponding relative library directory that the compiler's
 /// dylibs will be found in.
 pub fn libdir(target: &str) -> &'static str {
-    if target.contains("windows") {"bin"} else {"lib"}
+    if target.contains("windows") { "bin" } else { "lib" }
 }
 /// Adds a list of lookup paths to `cmd`'s dynamic library lookup path.
@@ -106,9 +98,7 @@ impl Drop for TimeIt {
     fn drop(&mut self) {
         let time = self.1.elapsed();
         if !self.0 {
-            println!("\tfinished in {}.{:03}",
-                     time.as_secs(),
-                     time.subsec_nanos() / 1_000_000);
+            println!("\tfinished in {}.{:03}", time.as_secs(), time.subsec_nanos() / 1_000_000);
         }
     }
 }
@@ -116,7 +106,9 @@ impl Drop for TimeIt {
 /// Symlinks two directories, using junctions on Windows and normal symlinks on
 /// Unix.
 pub fn symlink_dir(config: &Config, src: &Path, dest: &Path) -> io::Result<()> {
-    if config.dry_run { return Ok(()); }
+    if config.dry_run {
+        return Ok(());
+    }
     let _ = fs::remove_dir(dest);
     return symlink_dir_inner(src, dest);
@@ -131,37 +123,24 @@ pub fn symlink_dir(config: &Config, src: &Path, dest: &Path) -> io::Result<()> {
 // what can be found here:
 //
 // http://www.flexhex.com/docs/articles/hard-links.phtml
+//
+// Copied from std
 #[cfg(windows)]
+#[allow(nonstandard_style)]
 fn symlink_dir_inner(target: &Path, junction: &Path) -> io::Result<()> {
-    use std::ptr;
     use std::ffi::OsStr;
     use std::os::windows::ffi::OsStrExt;
+    use std::ptr;
-    const MAXIMUM_REPARSE_DATA_BUFFER_SIZE: usize = 16 * 1024;
-    const GENERIC_WRITE: DWORD = 0x40000000;
-    const OPEN_EXISTING: DWORD = 3;
-    const FILE_FLAG_OPEN_REPARSE_POINT: DWORD = 0x00200000;
-    const FILE_FLAG_BACKUP_SEMANTICS: DWORD = 0x02000000;
-    const FSCTL_SET_REPARSE_POINT: DWORD = 0x900a4;
-    const IO_REPARSE_TAG_MOUNT_POINT: DWORD = 0xa0000003;
-    const FILE_SHARE_DELETE: DWORD = 0x4;
-    const FILE_SHARE_READ: DWORD = 0x1;
-    const FILE_SHARE_WRITE: DWORD = 0x2;
-    type BOOL = i32;
-    type DWORD = u32;
-    type HANDLE = *mut u8;
-    type LPCWSTR = *const u16;
-    type LPDWORD = *mut DWORD;
-    type LPOVERLAPPED = *mut u8;
-    type LPSECURITY_ATTRIBUTES = *mut u8;
-    type LPVOID = *mut u8;
-    type WCHAR = u16;
-    type WORD = u16;
+    use winapi::shared::minwindef::{DWORD, WORD};
+    use winapi::um::fileapi::{CreateFileW, OPEN_EXISTING};
+    use winapi::um::handleapi::CloseHandle;
+    use winapi::um::ioapiset::DeviceIoControl;
+    use winapi::um::winbase::{FILE_FLAG_BACKUP_SEMANTICS, FILE_FLAG_OPEN_REPARSE_POINT};
+    use winapi::um::winioctl::FSCTL_SET_REPARSE_POINT;
+    use winapi::um::winnt::{
+        FILE_SHARE_DELETE, FILE_SHARE_READ, FILE_SHARE_WRITE, GENERIC_WRITE,
+        IO_REPARSE_TAG_MOUNT_POINT, MAXIMUM_REPARSE_DATA_BUFFER_SIZE, WCHAR,
+    };
-    #[allow(non_snake_case)]
     #[repr(C)]
     struct REPARSE_MOUNTPOINT_DATA_BUFFER {
         ReparseTag: DWORD,
@@ -173,26 +152,6 @@ pub fn symlink_dir(config: &Config, src: &Path, dest: &Path) -> io::Result<()> {
         ReparseTarget: WCHAR,
     }
-    extern "system" {
-        fn CreateFileW(lpFileName: LPCWSTR,
-                       dwDesiredAccess: DWORD,
-                       dwShareMode: DWORD,
-                       lpSecurityAttributes: LPSECURITY_ATTRIBUTES,
-                       dwCreationDisposition: DWORD,
-                       dwFlagsAndAttributes: DWORD,
-                       hTemplateFile: HANDLE)
-                       -> HANDLE;
-        fn DeviceIoControl(hDevice: HANDLE,
-                           dwIoControlCode: DWORD,
-                           lpInBuffer: LPVOID,
-                           nInBufferSize: DWORD,
-                           lpOutBuffer: LPVOID,
-                           nOutBufferSize: DWORD,
-                           lpBytesReturned: LPDWORD,
-                           lpOverlapped: LPOVERLAPPED) -> BOOL;
-        fn CloseHandle(hObject: HANDLE) -> BOOL;
-    }
     fn to_u16s<S: AsRef<OsStr>>(s: S) -> io::Result<Vec<u16>> {
         Ok(s.as_ref().encode_wide().chain(Some(0)).collect())
     }
@@ -207,17 +166,18 @@ pub fn symlink_dir(config: &Config, src: &Path, dest: &Path) -> io::Result<()> {
     let path = to_u16s(junction)?;
     unsafe {
-        let h = CreateFileW(path.as_ptr(),
-                            GENERIC_WRITE,
-                            FILE_SHARE_READ | FILE_SHARE_WRITE | FILE_SHARE_DELETE,
-                            ptr::null_mut(),
-                            OPEN_EXISTING,
-                            FILE_FLAG_OPEN_REPARSE_POINT | FILE_FLAG_BACKUP_SEMANTICS,
-                            ptr::null_mut());
+        let h = CreateFileW(
+            path.as_ptr(),
+            GENERIC_WRITE,
+            FILE_SHARE_READ | FILE_SHARE_WRITE | FILE_SHARE_DELETE,
+            ptr::null_mut(),
+            OPEN_EXISTING,
+            FILE_FLAG_OPEN_REPARSE_POINT | FILE_FLAG_BACKUP_SEMANTICS,
+            ptr::null_mut(),
+        );
-        let mut data = [0u8; MAXIMUM_REPARSE_DATA_BUFFER_SIZE];
-        let db = data.as_mut_ptr()
-            as *mut REPARSE_MOUNTPOINT_DATA_BUFFER;
+        let mut data = [0u8; MAXIMUM_REPARSE_DATA_BUFFER_SIZE as usize];
+        let db = data.as_mut_ptr() as *mut REPARSE_MOUNTPOINT_DATA_BUFFER;
         let buf = &mut (*db).ReparseTarget as *mut u16;
         let mut i = 0;
         // FIXME: this conversion is very hacky
@@ -232,23 +192,21 @@ pub fn symlink_dir(config: &Config, src: &Path, dest: &Path) -> io::Result<()> {
         (*db).ReparseTag = IO_REPARSE_TAG_MOUNT_POINT;
         (*db).ReparseTargetMaximumLength = (i * 2) as WORD;
         (*db).ReparseTargetLength = ((i - 1) * 2) as WORD;
-        (*db).ReparseDataLength =
-            (*db).ReparseTargetLength as DWORD + 12;
+        (*db).ReparseDataLength = (*db).ReparseTargetLength as DWORD + 12;
         let mut ret = 0;
-        let res = DeviceIoControl(h as *mut _,
-                                  FSCTL_SET_REPARSE_POINT,
-                                  data.as_ptr() as *mut _,
-                                  (*db).ReparseDataLength + 8,
-                                  ptr::null_mut(), 0,
-                                  &mut ret,
-                                  ptr::null_mut());
+        let res = DeviceIoControl(
+            h as *mut _,
+            FSCTL_SET_REPARSE_POINT,
+            data.as_ptr() as *mut _,
+            (*db).ReparseDataLength + 8,
+            ptr::null_mut(),
+            0,
+            &mut ret,
+            ptr::null_mut(),
+        );
-        let out = if res == 0 {
-            Err(io::Error::last_os_error())
-        } else {
-            Ok(())
-        };
+        let out = if res == 0 { Err(io::Error::last_os_error()) } else { Ok(()) };
         CloseHandle(h);
         out
     }
@@ -270,9 +228,9 @@ pub enum CiEnv {
 impl CiEnv {
     /// Obtains the current CI environment.
     pub fn current() -> CiEnv {
-        if env::var("TF_BUILD").ok().map_or(false, |e| &*e == "True") {
+        if env::var("TF_BUILD").map_or(false, |e| e == "True") {
             CiEnv::AzurePipelines
-        } else if env::var("GITHUB_ACTIONS").ok().map_or(false, |e| &*e == "true") {
+        } else if env::var("GITHUB_ACTIONS").map_or(false, |e| e == "true") {
             CiEnv::GitHubActions
         } else {
             CiEnv::None
@@ -299,8 +257,11 @@ pub fn forcing_clang_based_tests() -> bool {
             "0" | "no" | "off" => false,
             other => {
                 // Let's make sure typos don't go unnoticed
-                panic!("Unrecognized option '{}' set in \
-                        RUSTBUILD_FORCE_CLANG_BASED_TESTS", other)
+                panic!(
+                    "Unrecognized option '{}' set in \
+                     RUSTBUILD_FORCE_CLANG_BASED_TESTS",
+                    other
+                )
             }
         }
     } else {
@@ -311,11 +272,9 @@ pub fn forcing_clang_based_tests() -> bool {
 pub fn use_host_linker(target: &Interned<String>) -> bool {
     // FIXME: this information should be gotten by checking the linker flavor
     // of the rustc target
-    !(
-        target.contains("emscripten") ||
-        target.contains("wasm32") ||
-        target.contains("nvptx") ||
-        target.contains("fortanix") ||
-        target.contains("fuchsia")
-    )
+    !(target.contains("emscripten")
+        || target.contains("wasm32")
+        || target.contains("nvptx")
+        || target.contains("fortanix")
+        || target.contains("fuchsia"))
 }

View File

@ -1,9 +1,7 @@
-use std::fs::File;
 use std::path::{Path, PathBuf};
 use std::process::{Command, Stdio};
 use std::time::{SystemTime, UNIX_EPOCH};
 use std::{env, fs};
-use std::thread;
/// A helper macro to `unwrap` a result except also print out details like: /// A helper macro to `unwrap` a result except also print out details like:
/// ///
@ -64,10 +62,7 @@ pub fn run(cmd: &mut Command) {
 pub fn try_run(cmd: &mut Command) -> bool {
     let status = match cmd.status() {
         Ok(status) => status,
-        Err(e) => fail(&format!(
-            "failed to execute command: {:?}\nerror: {}",
-            cmd, e
-        )),
+        Err(e) => fail(&format!("failed to execute command: {:?}\nerror: {}", cmd, e)),
     };
     if !status.success() {
         println!(
@ -88,10 +83,7 @@ pub fn run_suppressed(cmd: &mut Command) {
 pub fn try_run_suppressed(cmd: &mut Command) -> bool {
     let output = match cmd.output() {
         Ok(status) => status,
-        Err(e) => fail(&format!(
-            "failed to execute command: {:?}\nerror: {}",
-            cmd, e
-        )),
+        Err(e) => fail(&format!("failed to execute command: {:?}\nerror: {}", cmd, e)),
     };
     if !output.status.success() {
         println!(
@ -119,8 +111,10 @@ pub fn gnu_target(target: &str) -> &str {
 }

 pub fn make(host: &str) -> PathBuf {
-    if host.contains("dragonfly") || host.contains("freebsd")
-        || host.contains("netbsd") || host.contains("openbsd")
+    if host.contains("dragonfly")
+        || host.contains("freebsd")
+        || host.contains("netbsd")
+        || host.contains("openbsd")
     {
         PathBuf::from("gmake")
     } else {
@ -131,10 +125,7 @@ pub fn make(host: &str) -> PathBuf {
 pub fn output(cmd: &mut Command) -> String {
     let output = match cmd.stderr(Stdio::inherit()).output() {
         Ok(status) => status,
-        Err(e) => fail(&format!(
-            "failed to execute command: {:?}\nerror: {}",
-            cmd, e
-        )),
+        Err(e) => fail(&format!("failed to execute command: {:?}\nerror: {}", cmd, e)),
     };
     if !output.status.success() {
         panic!(
@ -147,7 +138,8 @@ pub fn output(cmd: &mut Command) -> String {
} }
 pub fn rerun_if_changed_anything_in_dir(dir: &Path) {
-    let mut stack = dir.read_dir()
+    let mut stack = dir
+        .read_dir()
         .unwrap()
         .map(|e| e.unwrap())
         .filter(|e| &*e.file_name() != ".git")
@ -164,9 +156,7 @@ pub fn rerun_if_changed_anything_in_dir(dir: &Path) {
 /// Returns the last-modified time for `path`, or zero if it doesn't exist.
 pub fn mtime(path: &Path) -> SystemTime {
-    fs::metadata(path)
-        .and_then(|f| f.modified())
-        .unwrap_or(UNIX_EPOCH)
+    fs::metadata(path).and_then(|f| f.modified()).unwrap_or(UNIX_EPOCH)
 }
/// Returns `true` if `dst` is up to date given that the file or files in `src` /// Returns `true` if `dst` is up to date given that the file or files in `src`
@ -189,123 +179,6 @@ pub fn up_to_date(src: &Path, dst: &Path) -> bool {
    }
}
#[must_use]
pub struct NativeLibBoilerplate {
pub src_dir: PathBuf,
pub out_dir: PathBuf,
}
impl NativeLibBoilerplate {
/// On macOS we don't want to ship the exact filename that compiler-rt builds.
/// This conflicts with the system and ours is likely a wildly different
/// version, so they can't be substituted.
///
/// As a result, we rename it here but we need to also use
/// `install_name_tool` on macOS to rename the commands listed inside of it to
/// ensure it's linked against correctly.
pub fn fixup_sanitizer_lib_name(&self, sanitizer_name: &str) {
if env::var("TARGET").unwrap() != "x86_64-apple-darwin" {
return
}
let dir = self.out_dir.join("build/lib/darwin");
let name = format!("clang_rt.{}_osx_dynamic", sanitizer_name);
let src = dir.join(&format!("lib{}.dylib", name));
let new_name = format!("lib__rustc__{}.dylib", name);
let dst = dir.join(&new_name);
println!("{} => {}", src.display(), dst.display());
fs::rename(&src, &dst).unwrap();
let status = Command::new("install_name_tool")
.arg("-id")
.arg(format!("@rpath/{}", new_name))
.arg(&dst)
.status()
.expect("failed to execute `install_name_tool`");
assert!(status.success());
}
}
impl Drop for NativeLibBoilerplate {
fn drop(&mut self) {
if !thread::panicking() {
t!(File::create(self.out_dir.join("rustbuild.timestamp")));
}
}
}
// Perform standard preparations for native libraries that are build only once for all stages.
// Emit rerun-if-changed and linking attributes for Cargo, check if any source files are
// updated, calculate paths used later in actual build with CMake/make or C/C++ compiler.
// If Err is returned, then everything is up-to-date and further build actions can be skipped.
// Timestamps are created automatically when the result of `native_lib_boilerplate` goes out
// of scope, so all the build actions should be completed until then.
pub fn native_lib_boilerplate(
src_dir: &Path,
out_name: &str,
link_name: &str,
search_subdir: &str,
) -> Result<NativeLibBoilerplate, ()> {
rerun_if_changed_anything_in_dir(src_dir);
let out_dir = env::var_os("RUSTBUILD_NATIVE_DIR").unwrap_or_else(||
env::var_os("OUT_DIR").unwrap());
let out_dir = PathBuf::from(out_dir).join(out_name);
t!(fs::create_dir_all(&out_dir));
if link_name.contains('=') {
println!("cargo:rustc-link-lib={}", link_name);
} else {
println!("cargo:rustc-link-lib=static={}", link_name);
}
println!(
"cargo:rustc-link-search=native={}",
out_dir.join(search_subdir).display()
);
let timestamp = out_dir.join("rustbuild.timestamp");
if !up_to_date(Path::new("build.rs"), &timestamp) || !up_to_date(src_dir, &timestamp) {
Ok(NativeLibBoilerplate {
src_dir: src_dir.to_path_buf(),
out_dir,
})
} else {
Err(())
}
}
pub fn sanitizer_lib_boilerplate(sanitizer_name: &str)
-> Result<(NativeLibBoilerplate, String), ()>
{
let (link_name, search_path, apple) = match &*env::var("TARGET").unwrap() {
"x86_64-unknown-linux-gnu" => (
format!("clang_rt.{}-x86_64", sanitizer_name),
"build/lib/linux",
false,
),
"x86_64-apple-darwin" => (
format!("clang_rt.{}_osx_dynamic", sanitizer_name),
"build/lib/darwin",
true,
),
_ => return Err(()),
};
let to_link = if apple {
format!("dylib=__rustc__{}", link_name)
} else {
format!("static={}", link_name)
};
// This env var is provided by rustbuild to tell us where `compiler-rt`
// lives.
let dir = env::var_os("RUST_COMPILER_RT_ROOT").unwrap();
let lib = native_lib_boilerplate(
dir.as_ref(),
sanitizer_name,
&to_link,
search_path,
)?;
Ok((lib, link_name))
}
fn dir_up_to_date(src: &Path, threshold: SystemTime) -> bool {
    t!(fs::read_dir(src)).map(|e| t!(e)).all(|e| {
        let meta = t!(e.metadata());

View File

@ -63,7 +63,7 @@ jobs:
 - job: macOS
   timeoutInMinutes: 600
   pool:
-    vmImage: macos-10.13
+    vmImage: macos-10.15
   steps:
   - template: steps/run.yml
   strategy:
@ -85,7 +85,7 @@ jobs:
       dist-x86_64-apple:
         SCRIPT: ./x.py dist
-        RUST_CONFIGURE_ARGS: --target=aarch64-apple-ios,armv7-apple-ios,armv7s-apple-ios,i386-apple-ios,x86_64-apple-ios --enable-full-tools --enable-sanitizers --enable-profiler --set rust.jemalloc
+        RUST_CONFIGURE_ARGS: --target=aarch64-apple-ios,x86_64-apple-ios --enable-full-tools --enable-sanitizers --enable-profiler --set rust.jemalloc
         RUSTC_RETRY_LINKER_ON_SEGFAULT: 1
         MACOSX_DEPLOYMENT_TARGET: 10.7
         NO_LLVM_ASSERTIONS: 1
@ -100,25 +100,6 @@ jobs:
        NO_LLVM_ASSERTIONS: 1
        NO_DEBUG_ASSERTIONS: 1
i686-apple:
SCRIPT: ./x.py test
RUST_CONFIGURE_ARGS: --build=i686-apple-darwin --set rust.jemalloc
RUSTC_RETRY_LINKER_ON_SEGFAULT: 1
MACOSX_DEPLOYMENT_TARGET: 10.8
MACOSX_STD_DEPLOYMENT_TARGET: 10.7
NO_LLVM_ASSERTIONS: 1
NO_DEBUG_ASSERTIONS: 1
dist-i686-apple:
SCRIPT: ./x.py dist
RUST_CONFIGURE_ARGS: --build=i686-apple-darwin --enable-full-tools --enable-profiler --set rust.jemalloc
RUSTC_RETRY_LINKER_ON_SEGFAULT: 1
MACOSX_DEPLOYMENT_TARGET: 10.7
NO_LLVM_ASSERTIONS: 1
NO_DEBUG_ASSERTIONS: 1
DIST_REQUIRE_ALL_TOOLS: 1
- job: Windows
  timeoutInMinutes: 600

View File

@ -51,10 +51,6 @@ steps:
     displayName: Install clang
     condition: and(succeeded(), not(variables.SKIP_JOB))

-  - bash: src/ci/scripts/switch-xcode.sh
-    displayName: Switch to Xcode 9.3
-    condition: and(succeeded(), not(variables.SKIP_JOB))
-
   - bash: src/ci/scripts/install-wix.sh
     displayName: Install wix
     condition: and(succeeded(), not(variables.SKIP_JOB))

View File

@ -25,7 +25,7 @@ jobs:
 # - job: macOS
 #   timeoutInMinutes: 600
 #   pool:
-#     vmImage: macos-10.13
+#     vmImage: macos-10.15
 #   steps:
 #   - template: steps/run.yml
 #   strategy:

View File

@ -2,11 +2,24 @@ FROM ubuntu:16.04
 RUN apt-get update && apt-get install -y --no-install-recommends \
   g++ \
+  automake \
+  bison \
+  bzip2 \
+  flex \
+  help2man \
+  libtool-bin \
+  texinfo \
+  unzip \
+  wget \
+  xz-utils \
+  libncurses-dev \
+  gawk \
   make \
   file \
   curl \
   ca-certificates \
   python2.7 \
+  python3 \
   git \
   cmake \
   sudo \
@ -35,6 +48,18 @@ RUN add-apt-repository ppa:team-gcc-arm-embedded/ppa && \
    apt-get update && \
    apt-get install -y --no-install-recommends gcc-arm-embedded
COPY scripts/rustbuild-setup.sh dist-various-1/build-riscv-toolchain.sh dist-various-1/riscv64-unknown-linux-gnu.config dist-various-1/crosstool-ng.sh /build/
RUN ./crosstool-ng.sh
# Crosstool-ng will refuse to build as root
RUN sh ./rustbuild-setup.sh
USER rustbuild
RUN ./build-riscv-toolchain.sh
USER root
ENV PATH=/x-tools/riscv64-unknown-linux-gnu/bin:$PATH
COPY dist-various-1/build-rumprun.sh /build
RUN ./build-rumprun.sh
@ -129,11 +154,13 @@ ENV TARGETS=$TARGETS,riscv32imc-unknown-none-elf
 ENV TARGETS=$TARGETS,riscv32imac-unknown-none-elf
 ENV TARGETS=$TARGETS,riscv64imac-unknown-none-elf
 ENV TARGETS=$TARGETS,riscv64gc-unknown-none-elf
+ENV TARGETS=$TARGETS,riscv64gc-unknown-linux-gnu
 ENV TARGETS=$TARGETS,armebv7r-none-eabi
 ENV TARGETS=$TARGETS,armebv7r-none-eabihf
 ENV TARGETS=$TARGETS,armv7r-none-eabi
 ENV TARGETS=$TARGETS,armv7r-none-eabihf
 ENV TARGETS=$TARGETS,thumbv7neon-unknown-linux-gnueabihf
+ENV TARGETS=$TARGETS,armv7a-none-eabi
# riscv targets currently do not need a C compiler, as compiler_builtins
# doesn't currently have it enabled, and the riscv gcc compiler is not
@ -147,6 +174,13 @@ ENV CC_mipsel_unknown_linux_musl=mipsel-openwrt-linux-gcc \
     CC_thumbv7neon_unknown_linux_gnueabihf=arm-linux-gnueabihf-gcc \
     AR_thumbv7neon_unknown_linux_gnueabihf=arm-linux-gnueabihf-ar \
     CXX_thumbv7neon_unknown_linux_gnueabihf=arm-linux-gnueabihf-g++ \
+    CC_armv7a_none_eabi=arm-none-eabi-gcc \
+    CC_armv7a_none_eabihf=arm-none-eabi-gcc \
+    CFLAGS_armv7a_none_eabi=-march=armv7-a \
+    CFLAGS_armv7a_none_eabihf=-march=armv7-a+vfpv3 \
+    CC_riscv64gc_unknown_linux_gnu=riscv64-unknown-linux-gnu-gcc \
+    AR_riscv64gc_unknown_linux_gnu=riscv64-unknown-linux-gnu-ar \
+    CXX_riscv64gc_unknown_linux_gnu=riscv64-unknown-linux-gnu-g++ \
     CC_riscv32i_unknown_none_elf=false \
     CC_riscv32imc_unknown_none_elf=false \
     CC_riscv32imac_unknown_none_elf=false \

View File

@ -0,0 +1,27 @@
#!/usr/bin/env bash
set -ex
hide_output() {
set +x
on_err="
echo ERROR: An error was encountered with the build.
cat /tmp/build.log
exit 1
"
trap "$on_err" ERR
bash -c "while true; do sleep 30; echo \$(date) - building ...; done" &
PING_LOOP_PID=$!
$@ &> /tmp/build.log
rm /tmp/build.log
trap - ERR
kill $PING_LOOP_PID
set -x
}
mkdir -p /tmp/build-riscv
cp riscv64-unknown-linux-gnu.config /tmp/build-riscv/.config
cd /tmp/build-riscv
hide_output ct-ng build
cd ..
rm -rf build-riscv

View File

@ -0,0 +1,13 @@
#!/bin/bash
set -ex
# Mirrored from https://github.com/crosstool-ng/crosstool-ng/archive/crosstool-ng-1.24.0.tar.gz
url="https://ci-mirrors.rust-lang.org/rustc/crosstool-ng-1.24.0.tar.gz"
curl -Lf $url | tar xzf -
cd crosstool-ng-crosstool-ng-1.24.0
./bootstrap
./configure --prefix=/usr/local
make -j$(nproc)
make install
cd ..
rm -rf crosstool-ng-crosstool-ng-1.24.0

View File

@ -0,0 +1,908 @@
#
# Automatically generated file; DO NOT EDIT.
# crosstool-NG Configuration
#
CT_CONFIGURE_has_static_link=y
CT_CONFIGURE_has_cxx11=y
CT_CONFIGURE_has_wget=y
CT_CONFIGURE_has_curl=y
CT_CONFIGURE_has_make_3_81_or_newer=y
CT_CONFIGURE_has_make_4_0_or_newer=y
CT_CONFIGURE_has_libtool_2_4_or_newer=y
CT_CONFIGURE_has_libtoolize_2_4_or_newer=y
CT_CONFIGURE_has_autoconf_2_65_or_newer=y
CT_CONFIGURE_has_autoreconf_2_65_or_newer=y
CT_CONFIGURE_has_automake_1_15_or_newer=y
CT_CONFIGURE_has_gnu_m4_1_4_12_or_newer=y
CT_CONFIGURE_has_python_3_4_or_newer=y
CT_CONFIGURE_has_bison_2_7_or_newer=y
CT_CONFIGURE_has_python=y
CT_CONFIGURE_has_dtc=y
CT_CONFIGURE_has_svn=y
CT_CONFIGURE_has_git=y
CT_CONFIGURE_has_md5sum=y
CT_CONFIGURE_has_sha1sum=y
CT_CONFIGURE_has_sha256sum=y
CT_CONFIGURE_has_sha512sum=y
CT_CONFIGURE_has_install_with_strip_program=y
CT_CONFIG_VERSION_CURRENT="3"
CT_CONFIG_VERSION="3"
CT_MODULES=y
#
# Paths and misc options
#
#
# crosstool-NG behavior
#
# CT_OBSOLETE is not set
CT_EXPERIMENTAL=y
# CT_ALLOW_BUILD_AS_ROOT is not set
# CT_DEBUG_CT is not set
#
# Paths
#
CT_LOCAL_TARBALLS_DIR="${HOME}/src"
CT_SAVE_TARBALLS=y
# CT_TARBALLS_BUILDROOT_LAYOUT is not set
CT_WORK_DIR="${CT_TOP_DIR}/.build"
CT_BUILD_TOP_DIR="${CT_WORK_DIR:-${CT_TOP_DIR}/.build}/${CT_HOST:+HOST-${CT_HOST}/}${CT_TARGET}"
CT_PREFIX_DIR="/x-tools/${CT_TARGET}"
CT_RM_RF_PREFIX_DIR=y
CT_REMOVE_DOCS=y
CT_INSTALL_LICENSES=y
CT_PREFIX_DIR_RO=y
CT_STRIP_HOST_TOOLCHAIN_EXECUTABLES=y
# CT_STRIP_TARGET_TOOLCHAIN_EXECUTABLES is not set
#
# Downloading
#
CT_DOWNLOAD_AGENT_WGET=y
# CT_DOWNLOAD_AGENT_CURL is not set
# CT_DOWNLOAD_AGENT_NONE is not set
# CT_FORBID_DOWNLOAD is not set
# CT_FORCE_DOWNLOAD is not set
CT_CONNECT_TIMEOUT=10
CT_DOWNLOAD_WGET_OPTIONS="--passive-ftp --tries=3 -nc --progress=dot:binary"
# CT_ONLY_DOWNLOAD is not set
# CT_USE_MIRROR is not set
CT_VERIFY_DOWNLOAD_DIGEST=y
CT_VERIFY_DOWNLOAD_DIGEST_SHA512=y
# CT_VERIFY_DOWNLOAD_DIGEST_SHA256 is not set
# CT_VERIFY_DOWNLOAD_DIGEST_SHA1 is not set
# CT_VERIFY_DOWNLOAD_DIGEST_MD5 is not set
CT_VERIFY_DOWNLOAD_DIGEST_ALG="sha512"
# CT_VERIFY_DOWNLOAD_SIGNATURE is not set
#
# Extracting
#
# CT_FORCE_EXTRACT is not set
CT_OVERRIDE_CONFIG_GUESS_SUB=y
# CT_ONLY_EXTRACT is not set
CT_PATCH_BUNDLED=y
# CT_PATCH_LOCAL is not set
# CT_PATCH_BUNDLED_LOCAL is not set
# CT_PATCH_LOCAL_BUNDLED is not set
# CT_PATCH_NONE is not set
CT_PATCH_ORDER="bundled"
#
# Build behavior
#
CT_PARALLEL_JOBS=0
CT_LOAD=""
CT_USE_PIPES=y
CT_EXTRA_CFLAGS_FOR_BUILD=""
CT_EXTRA_LDFLAGS_FOR_BUILD=""
CT_EXTRA_CFLAGS_FOR_HOST=""
CT_EXTRA_LDFLAGS_FOR_HOST=""
# CT_CONFIG_SHELL_SH is not set
# CT_CONFIG_SHELL_ASH is not set
CT_CONFIG_SHELL_BASH=y
# CT_CONFIG_SHELL_CUSTOM is not set
CT_CONFIG_SHELL="${bash}"
#
# Logging
#
# CT_LOG_ERROR is not set
# CT_LOG_WARN is not set
# CT_LOG_INFO is not set
# CT_LOG_EXTRA is not set
CT_LOG_ALL=y
# CT_LOG_DEBUG is not set
CT_LOG_LEVEL_MAX="ALL"
# CT_LOG_SEE_TOOLS_WARN is not set
CT_LOG_TO_FILE=y
CT_LOG_FILE_COMPRESS=y
#
# Target options
#
# CT_ARCH_ALPHA is not set
# CT_ARCH_ARC is not set
# CT_ARCH_ARM is not set
# CT_ARCH_AVR is not set
# CT_ARCH_M68K is not set
# CT_ARCH_MICROBLAZE is not set
# CT_ARCH_MIPS is not set
# CT_ARCH_MOXIE is not set
# CT_ARCH_MSP430 is not set
# CT_ARCH_NIOS2 is not set
# CT_ARCH_POWERPC is not set
CT_ARCH_RISCV=y
# CT_ARCH_S390 is not set
# CT_ARCH_SH is not set
# CT_ARCH_SPARC is not set
# CT_ARCH_X86 is not set
# CT_ARCH_XTENSA is not set
CT_ARCH="riscv"
CT_ARCH_CHOICE_KSYM="RISCV"
CT_ARCH_TUNE=""
CT_ARCH_RISCV_SHOW=y
#
# Options for riscv
#
CT_ARCH_RISCV_PKG_KSYM=""
CT_ALL_ARCH_CHOICES="ALPHA ARC ARM AVR M68K MICROBLAZE MIPS MOXIE MSP430 NIOS2 POWERPC RISCV S390 SH SPARC X86 XTENSA"
CT_ARCH_SUFFIX=""
# CT_OMIT_TARGET_VENDOR is not set
#
# Generic target options
#
# CT_MULTILIB is not set
# CT_DEMULTILIB is not set
CT_ARCH_SUPPORTS_BOTH_MMU=y
CT_ARCH_USE_MMU=y
CT_ARCH_SUPPORTS_32=y
CT_ARCH_SUPPORTS_64=y
CT_ARCH_DEFAULT_32=y
CT_ARCH_BITNESS=64
# CT_ARCH_32 is not set
CT_ARCH_64=y
#
# Target optimisations
#
CT_ARCH_SUPPORTS_WITH_ARCH=y
CT_ARCH_SUPPORTS_WITH_ABI=y
CT_ARCH_SUPPORTS_WITH_TUNE=y
CT_ARCH_ARCH="rv64gc"
CT_ARCH_ABI=""
CT_TARGET_CFLAGS=""
CT_TARGET_LDFLAGS=""
#
# Toolchain options
#
#
# General toolchain options
#
CT_FORCE_SYSROOT=y
CT_USE_SYSROOT=y
CT_SYSROOT_NAME="sysroot"
CT_SYSROOT_DIR_PREFIX=""
CT_WANTS_STATIC_LINK=y
CT_WANTS_STATIC_LINK_CXX=y
# CT_STATIC_TOOLCHAIN is not set
CT_SHOW_CT_VERSION=y
CT_TOOLCHAIN_PKGVERSION=""
CT_TOOLCHAIN_BUGURL=""
#
# Tuple completion and aliasing
#
CT_TARGET_VENDOR="unknown"
CT_TARGET_ALIAS_SED_EXPR=""
CT_TARGET_ALIAS=""
#
# Toolchain type
#
# CT_NATIVE is not set
CT_CROSS=y
# CT_CROSS_NATIVE is not set
# CT_CANADIAN is not set
CT_TOOLCHAIN_TYPE="cross"
#
# Build system
#
CT_BUILD=""
CT_BUILD_PREFIX=""
CT_BUILD_SUFFIX=""
#
# Misc options
#
# CT_TOOLCHAIN_ENABLE_NLS is not set
#
# Operating System
#
CT_KERNEL_SUPPORTS_SHARED_LIBS=y
# CT_KERNEL_BARE_METAL is not set
CT_KERNEL_LINUX=y
CT_KERNEL="linux"
CT_KERNEL_CHOICE_KSYM="LINUX"
CT_KERNEL_LINUX_SHOW=y
#
# Options for linux
#
CT_KERNEL_LINUX_PKG_KSYM="LINUX"
CT_LINUX_DIR_NAME="linux"
CT_LINUX_PKG_NAME="linux"
CT_LINUX_SRC_RELEASE=y
# CT_LINUX_SRC_DEVEL is not set
# CT_LINUX_SRC_CUSTOM is not set
CT_LINUX_PATCH_GLOBAL=y
# CT_LINUX_PATCH_BUNDLED is not set
# CT_LINUX_PATCH_LOCAL is not set
# CT_LINUX_PATCH_BUNDLED_LOCAL is not set
# CT_LINUX_PATCH_LOCAL_BUNDLED is not set
# CT_LINUX_PATCH_NONE is not set
CT_LINUX_PATCH_ORDER="global"
CT_LINUX_V_4_20=y
# CT_LINUX_V_4_19 is not set
# CT_LINUX_V_4_18 is not set
# CT_LINUX_V_4_17 is not set
# CT_LINUX_V_4_16 is not set
# CT_LINUX_V_4_15 is not set
# CT_LINUX_V_4_14 is not set
# CT_LINUX_V_4_13 is not set
# CT_LINUX_V_4_12 is not set
# CT_LINUX_V_4_11 is not set
# CT_LINUX_V_4_10 is not set
# CT_LINUX_V_4_9 is not set
# CT_LINUX_V_4_4 is not set
# CT_LINUX_V_4_1 is not set
# CT_LINUX_V_3_16 is not set
# CT_LINUX_V_3_13 is not set
# CT_LINUX_V_3_12 is not set
# CT_LINUX_V_3_10 is not set
# CT_LINUX_V_3_4 is not set
# CT_LINUX_V_3_2 is not set
# CT_LINUX_NO_VERSIONS is not set
CT_LINUX_VERSION="4.20.8"
CT_LINUX_MIRRORS="$(CT_Mirrors kernel.org linux ${CT_LINUX_VERSION})"
CT_LINUX_ARCHIVE_FILENAME="@{pkg_name}-@{version}"
CT_LINUX_ARCHIVE_DIRNAME="@{pkg_name}-@{version}"
CT_LINUX_ARCHIVE_FORMATS=".tar.xz .tar.gz"
CT_LINUX_SIGNATURE_FORMAT="unpacked/.sign"
CT_LINUX_later_than_4_8=y
CT_LINUX_4_8_or_later=y
CT_LINUX_later_than_3_7=y
CT_LINUX_3_7_or_later=y
CT_LINUX_later_than_3_2=y
CT_LINUX_3_2_or_later=y
CT_LINUX_REQUIRE_3_2_or_later=y
CT_KERNEL_LINUX_VERBOSITY_0=y
# CT_KERNEL_LINUX_VERBOSITY_1 is not set
# CT_KERNEL_LINUX_VERBOSITY_2 is not set
CT_KERNEL_LINUX_VERBOSE_LEVEL=0
CT_KERNEL_LINUX_INSTALL_CHECK=y
CT_ALL_KERNEL_CHOICES="BARE_METAL LINUX WINDOWS"
#
# Common kernel options
#
CT_SHARED_LIBS=y
#
# Binary utilities
#
CT_ARCH_BINFMT_ELF=y
CT_BINUTILS_BINUTILS=y
CT_BINUTILS="binutils"
CT_BINUTILS_CHOICE_KSYM="BINUTILS"
CT_BINUTILS_BINUTILS_SHOW=y
#
# Options for binutils
#
CT_BINUTILS_BINUTILS_PKG_KSYM="BINUTILS"
CT_BINUTILS_DIR_NAME="binutils"
CT_BINUTILS_USE_GNU=y
CT_BINUTILS_USE="BINUTILS"
CT_BINUTILS_PKG_NAME="binutils"
CT_BINUTILS_SRC_RELEASE=y
# CT_BINUTILS_SRC_DEVEL is not set
# CT_BINUTILS_SRC_CUSTOM is not set
CT_BINUTILS_PATCH_GLOBAL=y
# CT_BINUTILS_PATCH_BUNDLED is not set
# CT_BINUTILS_PATCH_LOCAL is not set
# CT_BINUTILS_PATCH_BUNDLED_LOCAL is not set
# CT_BINUTILS_PATCH_LOCAL_BUNDLED is not set
# CT_BINUTILS_PATCH_NONE is not set
CT_BINUTILS_PATCH_ORDER="global"
CT_BINUTILS_V_2_32=y
# CT_BINUTILS_V_2_31 is not set
# CT_BINUTILS_V_2_30 is not set
# CT_BINUTILS_V_2_29 is not set
# CT_BINUTILS_V_2_28 is not set
# CT_BINUTILS_V_2_27 is not set
# CT_BINUTILS_V_2_26 is not set
# CT_BINUTILS_NO_VERSIONS is not set
CT_BINUTILS_VERSION="2.32"
CT_BINUTILS_MIRRORS="$(CT_Mirrors GNU binutils) $(CT_Mirrors sourceware binutils/releases)"
CT_BINUTILS_ARCHIVE_FILENAME="@{pkg_name}-@{version}"
CT_BINUTILS_ARCHIVE_DIRNAME="@{pkg_name}-@{version}"
CT_BINUTILS_ARCHIVE_FORMATS=".tar.xz .tar.bz2 .tar.gz"
CT_BINUTILS_SIGNATURE_FORMAT="packed/.sig"
CT_BINUTILS_later_than_2_30=y
CT_BINUTILS_2_30_or_later=y
CT_BINUTILS_later_than_2_27=y
CT_BINUTILS_2_27_or_later=y
CT_BINUTILS_later_than_2_25=y
CT_BINUTILS_2_25_or_later=y
CT_BINUTILS_REQUIRE_2_25_or_later=y
CT_BINUTILS_later_than_2_23=y
CT_BINUTILS_2_23_or_later=y
#
# GNU binutils
#
CT_BINUTILS_HAS_HASH_STYLE=y
CT_BINUTILS_HAS_GOLD=y
CT_BINUTILS_HAS_PLUGINS=y
CT_BINUTILS_HAS_PKGVERSION_BUGURL=y
CT_BINUTILS_FORCE_LD_BFD_DEFAULT=y
CT_BINUTILS_LINKER_LD=y
CT_BINUTILS_LINKERS_LIST="ld"
CT_BINUTILS_LINKER_DEFAULT="bfd"
# CT_BINUTILS_PLUGINS is not set
CT_BINUTILS_RELRO=m
CT_BINUTILS_EXTRA_CONFIG_ARRAY=""
# CT_BINUTILS_FOR_TARGET is not set
CT_ALL_BINUTILS_CHOICES="BINUTILS"
#
# C-library
#
CT_LIBC_GLIBC=y
# CT_LIBC_MUSL is not set
# CT_LIBC_UCLIBC is not set
CT_LIBC="glibc"
CT_LIBC_CHOICE_KSYM="GLIBC"
CT_THREADS="nptl"
CT_LIBC_GLIBC_SHOW=y
#
# Options for glibc
#
CT_LIBC_GLIBC_PKG_KSYM="GLIBC"
CT_GLIBC_DIR_NAME="glibc"
CT_GLIBC_USE_GNU=y
CT_GLIBC_USE="GLIBC"
CT_GLIBC_PKG_NAME="glibc"
CT_GLIBC_SRC_RELEASE=y
# CT_GLIBC_SRC_DEVEL is not set
# CT_GLIBC_SRC_CUSTOM is not set
CT_GLIBC_PATCH_GLOBAL=y
# CT_GLIBC_PATCH_BUNDLED is not set
# CT_GLIBC_PATCH_LOCAL is not set
# CT_GLIBC_PATCH_BUNDLED_LOCAL is not set
# CT_GLIBC_PATCH_LOCAL_BUNDLED is not set
# CT_GLIBC_PATCH_NONE is not set
CT_GLIBC_PATCH_ORDER="global"
CT_GLIBC_V_2_29=y
# CT_GLIBC_NO_VERSIONS is not set
CT_GLIBC_VERSION="2.29"
CT_GLIBC_MIRRORS="$(CT_Mirrors GNU glibc)"
CT_GLIBC_ARCHIVE_FILENAME="@{pkg_name}-@{version}"
CT_GLIBC_ARCHIVE_DIRNAME="@{pkg_name}-@{version}"
CT_GLIBC_ARCHIVE_FORMATS=".tar.xz .tar.bz2 .tar.gz"
CT_GLIBC_SIGNATURE_FORMAT="packed/.sig"
CT_GLIBC_2_29_or_later=y
CT_GLIBC_2_29_or_older=y
CT_GLIBC_REQUIRE_2_29_or_later=y
CT_GLIBC_later_than_2_27=y
CT_GLIBC_2_27_or_later=y
CT_GLIBC_later_than_2_26=y
CT_GLIBC_2_26_or_later=y
CT_GLIBC_later_than_2_25=y
CT_GLIBC_2_25_or_later=y
CT_GLIBC_later_than_2_24=y
CT_GLIBC_2_24_or_later=y
CT_GLIBC_later_than_2_23=y
CT_GLIBC_2_23_or_later=y
CT_GLIBC_later_than_2_20=y
CT_GLIBC_2_20_or_later=y
CT_GLIBC_later_than_2_17=y
CT_GLIBC_2_17_or_later=y
CT_GLIBC_later_than_2_14=y
CT_GLIBC_2_14_or_later=y
CT_GLIBC_DEP_KERNEL_HEADERS_VERSION=y
CT_GLIBC_DEP_BINUTILS=y
CT_GLIBC_DEP_GCC=y
CT_GLIBC_DEP_PYTHON=y
CT_GLIBC_BUILD_SSP=y
CT_GLIBC_HAS_LIBIDN_ADDON=y
# CT_GLIBC_USE_LIBIDN_ADDON is not set
CT_GLIBC_NO_SPARC_V8=y
CT_GLIBC_HAS_OBSOLETE_RPC=y
CT_GLIBC_EXTRA_CONFIG_ARRAY=""
CT_GLIBC_CONFIGPARMS=""
CT_GLIBC_EXTRA_CFLAGS=""
CT_GLIBC_ENABLE_OBSOLETE_RPC=y
# CT_GLIBC_ENABLE_FORTIFIED_BUILD is not set
# CT_GLIBC_DISABLE_VERSIONING is not set
CT_GLIBC_OLDEST_ABI=""
CT_GLIBC_FORCE_UNWIND=y
# CT_GLIBC_LOCALES is not set
CT_GLIBC_KERNEL_VERSION_NONE=y
# CT_GLIBC_KERNEL_VERSION_AS_HEADERS is not set
# CT_GLIBC_KERNEL_VERSION_CHOSEN is not set
CT_GLIBC_MIN_KERNEL=""
CT_GLIBC_SSP_DEFAULT=y
# CT_GLIBC_SSP_NO is not set
# CT_GLIBC_SSP_YES is not set
# CT_GLIBC_SSP_ALL is not set
# CT_GLIBC_SSP_STRONG is not set
# CT_GLIBC_ENABLE_WERROR is not set
CT_ALL_LIBC_CHOICES="AVR_LIBC BIONIC GLIBC MINGW_W64 MOXIEBOX MUSL NEWLIB NONE UCLIBC"
CT_LIBC_SUPPORT_THREADS_ANY=y
CT_LIBC_SUPPORT_THREADS_NATIVE=y
#
# Common C library options
#
CT_THREADS_NATIVE=y
# CT_CREATE_LDSO_CONF is not set
CT_LIBC_XLDD=y
#
# C compiler
#
CT_CC_CORE_PASSES_NEEDED=y
CT_CC_CORE_PASS_1_NEEDED=y
CT_CC_CORE_PASS_2_NEEDED=y
CT_CC_SUPPORT_CXX=y
CT_CC_SUPPORT_FORTRAN=y
CT_CC_SUPPORT_ADA=y
CT_CC_SUPPORT_OBJC=y
CT_CC_SUPPORT_OBJCXX=y
CT_CC_SUPPORT_GOLANG=y
CT_CC_GCC=y
CT_CC="gcc"
CT_CC_CHOICE_KSYM="GCC"
CT_CC_GCC_SHOW=y
#
# Options for gcc
#
CT_CC_GCC_PKG_KSYM="GCC"
CT_GCC_DIR_NAME="gcc"
CT_GCC_USE_GNU=y
# CT_GCC_USE_LINARO is not set
CT_GCC_USE="GCC"
CT_GCC_PKG_NAME="gcc"
CT_GCC_SRC_RELEASE=y
# CT_GCC_SRC_DEVEL is not set
# CT_GCC_SRC_CUSTOM is not set
CT_GCC_PATCH_GLOBAL=y
# CT_GCC_PATCH_BUNDLED is not set
# CT_GCC_PATCH_LOCAL is not set
# CT_GCC_PATCH_BUNDLED_LOCAL is not set
# CT_GCC_PATCH_LOCAL_BUNDLED is not set
# CT_GCC_PATCH_NONE is not set
CT_GCC_PATCH_ORDER="global"
CT_GCC_V_8=y
# CT_GCC_V_7 is not set
# CT_GCC_NO_VERSIONS is not set
CT_GCC_VERSION="8.3.0"
CT_GCC_MIRRORS="$(CT_Mirrors GNU gcc/gcc-${CT_GCC_VERSION}) $(CT_Mirrors sourceware gcc/releases/gcc-${CT_GCC_VERSION})"
CT_GCC_ARCHIVE_FILENAME="@{pkg_name}-@{version}"
CT_GCC_ARCHIVE_DIRNAME="@{pkg_name}-@{version}"
CT_GCC_ARCHIVE_FORMATS=".tar.xz .tar.gz"
CT_GCC_SIGNATURE_FORMAT=""
CT_GCC_later_than_7=y
CT_GCC_7_or_later=y
CT_GCC_REQUIRE_7_or_later=y
CT_GCC_later_than_6=y
CT_GCC_6_or_later=y
CT_GCC_later_than_5=y
CT_GCC_5_or_later=y
CT_GCC_REQUIRE_5_or_later=y
CT_GCC_later_than_4_9=y
CT_GCC_4_9_or_later=y
CT_GCC_REQUIRE_4_9_or_later=y
CT_GCC_later_than_4_8=y
CT_GCC_4_8_or_later=y
CT_CC_GCC_HAS_LIBMPX=y
CT_CC_GCC_ENABLE_CXX_FLAGS=""
CT_CC_GCC_CORE_EXTRA_CONFIG_ARRAY=""
CT_CC_GCC_EXTRA_CONFIG_ARRAY=""
CT_CC_GCC_STATIC_LIBSTDCXX=y
# CT_CC_GCC_SYSTEM_ZLIB is not set
CT_CC_GCC_CONFIG_TLS=m
#
# Optimisation features
#
CT_CC_GCC_USE_GRAPHITE=y
CT_CC_GCC_USE_LTO=y
#
# Settings for libraries running on target
#
CT_CC_GCC_ENABLE_TARGET_OPTSPACE=y
# CT_CC_GCC_LIBMUDFLAP is not set
# CT_CC_GCC_LIBGOMP is not set
# CT_CC_GCC_LIBSSP is not set
# CT_CC_GCC_LIBQUADMATH is not set
# CT_CC_GCC_LIBSANITIZER is not set
#
# Misc. obscure options.
#
CT_CC_CXA_ATEXIT=y
# CT_CC_GCC_DISABLE_PCH is not set
CT_CC_GCC_SJLJ_EXCEPTIONS=m
CT_CC_GCC_LDBL_128=m
# CT_CC_GCC_BUILD_ID is not set
CT_CC_GCC_LNK_HASH_STYLE_DEFAULT=y
# CT_CC_GCC_LNK_HASH_STYLE_SYSV is not set
# CT_CC_GCC_LNK_HASH_STYLE_GNU is not set
# CT_CC_GCC_LNK_HASH_STYLE_BOTH is not set
CT_CC_GCC_LNK_HASH_STYLE=""
CT_CC_GCC_DEC_FLOAT_AUTO=y
# CT_CC_GCC_DEC_FLOAT_BID is not set
# CT_CC_GCC_DEC_FLOAT_DPD is not set
# CT_CC_GCC_DEC_FLOATS_NO is not set
CT_ALL_CC_CHOICES="GCC"
#
# Additional supported languages:
#
CT_CC_LANG_CXX=y
# CT_CC_LANG_FORTRAN is not set
# CT_CC_LANG_ADA is not set
# CT_CC_LANG_OBJC is not set
# CT_CC_LANG_OBJCXX is not set
# CT_CC_LANG_GOLANG is not set
CT_CC_LANG_OTHERS=""
#
# Debug facilities
#
# CT_DEBUG_DUMA is not set
CT_DEBUG_GDB=y
CT_DEBUG_GDB_PKG_KSYM="GDB"
CT_GDB_DIR_NAME="gdb"
CT_GDB_USE_GNU=y
CT_GDB_USE="GDB"
CT_GDB_PKG_NAME="gdb"
CT_GDB_SRC_RELEASE=y
# CT_GDB_SRC_DEVEL is not set
# CT_GDB_SRC_CUSTOM is not set
CT_GDB_PATCH_GLOBAL=y
# CT_GDB_PATCH_BUNDLED is not set
# CT_GDB_PATCH_LOCAL is not set
# CT_GDB_PATCH_BUNDLED_LOCAL is not set
# CT_GDB_PATCH_LOCAL_BUNDLED is not set
# CT_GDB_PATCH_NONE is not set
CT_GDB_PATCH_ORDER="global"
CT_GDB_V_8_2=y
# CT_GDB_V_8_1 is not set
# CT_GDB_V_8_0 is not set
# CT_GDB_NO_VERSIONS is not set
CT_GDB_VERSION="8.2.1"
CT_GDB_MIRRORS="$(CT_Mirrors GNU gdb) $(CT_Mirrors sourceware gdb/releases)"
CT_GDB_ARCHIVE_FILENAME="@{pkg_name}-@{version}"
CT_GDB_ARCHIVE_DIRNAME="@{pkg_name}-@{version}"
CT_GDB_ARCHIVE_FORMATS=".tar.xz .tar.gz"
CT_GDB_SIGNATURE_FORMAT=""
CT_GDB_later_than_8_0=y
CT_GDB_8_0_or_later=y
CT_GDB_REQUIRE_8_0_or_later=y
CT_GDB_later_than_7_12=y
CT_GDB_7_12_or_later=y
CT_GDB_later_than_7_2=y
CT_GDB_7_2_or_later=y
CT_GDB_later_than_7_0=y
CT_GDB_7_0_or_later=y
CT_GDB_CROSS=y
# CT_GDB_CROSS_STATIC is not set
# CT_GDB_CROSS_SIM is not set
# CT_GDB_CROSS_PYTHON is not set
CT_GDB_CROSS_EXTRA_CONFIG_ARRAY=""
# CT_GDB_NATIVE is not set
# CT_GDB_GDBSERVER is not set
CT_GDB_HAS_PKGVERSION_BUGURL=y
CT_GDB_HAS_PYTHON=y
CT_GDB_INSTALL_GDBINIT=y
CT_GDB_HAS_IPA_LIB=y
# CT_DEBUG_LTRACE is not set
# CT_DEBUG_STRACE is not set
CT_ALL_DEBUG_CHOICES="DUMA GDB LTRACE STRACE"
#
# Companion libraries
#
# CT_COMPLIBS_CHECK is not set
# CT_COMP_LIBS_CLOOG is not set
CT_COMP_LIBS_EXPAT=y
CT_COMP_LIBS_EXPAT_PKG_KSYM="EXPAT"
CT_EXPAT_DIR_NAME="expat"
CT_EXPAT_PKG_NAME="expat"
CT_EXPAT_SRC_RELEASE=y
# CT_EXPAT_SRC_DEVEL is not set
# CT_EXPAT_SRC_CUSTOM is not set
CT_EXPAT_PATCH_GLOBAL=y
# CT_EXPAT_PATCH_BUNDLED is not set
# CT_EXPAT_PATCH_LOCAL is not set
# CT_EXPAT_PATCH_BUNDLED_LOCAL is not set
# CT_EXPAT_PATCH_LOCAL_BUNDLED is not set
# CT_EXPAT_PATCH_NONE is not set
CT_EXPAT_PATCH_ORDER="global"
CT_EXPAT_V_2_2=y
# CT_EXPAT_NO_VERSIONS is not set
CT_EXPAT_VERSION="2.2.6"
CT_EXPAT_MIRRORS="http://downloads.sourceforge.net/project/expat/expat/${CT_EXPAT_VERSION}"
CT_EXPAT_ARCHIVE_FILENAME="@{pkg_name}-@{version}"
CT_EXPAT_ARCHIVE_DIRNAME="@{pkg_name}-@{version}"
CT_EXPAT_ARCHIVE_FORMATS=".tar.bz2"
CT_EXPAT_SIGNATURE_FORMAT=""
CT_COMP_LIBS_GETTEXT=y
CT_COMP_LIBS_GETTEXT_PKG_KSYM="GETTEXT"
CT_GETTEXT_DIR_NAME="gettext"
CT_GETTEXT_PKG_NAME="gettext"
CT_GETTEXT_SRC_RELEASE=y
# CT_GETTEXT_SRC_DEVEL is not set
# CT_GETTEXT_SRC_CUSTOM is not set
CT_GETTEXT_PATCH_GLOBAL=y
# CT_GETTEXT_PATCH_BUNDLED is not set
# CT_GETTEXT_PATCH_LOCAL is not set
# CT_GETTEXT_PATCH_BUNDLED_LOCAL is not set
# CT_GETTEXT_PATCH_LOCAL_BUNDLED is not set
# CT_GETTEXT_PATCH_NONE is not set
CT_GETTEXT_PATCH_ORDER="global"
CT_GETTEXT_V_0_19_8_1=y
# CT_GETTEXT_NO_VERSIONS is not set
CT_GETTEXT_VERSION="0.19.8.1"
CT_GETTEXT_MIRRORS="$(CT_Mirrors GNU gettext)"
CT_GETTEXT_ARCHIVE_FILENAME="@{pkg_name}-@{version}"
CT_GETTEXT_ARCHIVE_DIRNAME="@{pkg_name}-@{version}"
CT_GETTEXT_ARCHIVE_FORMATS=".tar.xz .tar.lz .tar.gz"
CT_GETTEXT_SIGNATURE_FORMAT="packed/.sig"
CT_COMP_LIBS_GMP=y
CT_COMP_LIBS_GMP_PKG_KSYM="GMP"
CT_GMP_DIR_NAME="gmp"
CT_GMP_PKG_NAME="gmp"
CT_GMP_SRC_RELEASE=y
# CT_GMP_SRC_DEVEL is not set
# CT_GMP_SRC_CUSTOM is not set
CT_GMP_PATCH_GLOBAL=y
# CT_GMP_PATCH_BUNDLED is not set
# CT_GMP_PATCH_LOCAL is not set
# CT_GMP_PATCH_BUNDLED_LOCAL is not set
# CT_GMP_PATCH_LOCAL_BUNDLED is not set
# CT_GMP_PATCH_NONE is not set
CT_GMP_PATCH_ORDER="global"
CT_GMP_V_6_1=y
# CT_GMP_NO_VERSIONS is not set
CT_GMP_VERSION="6.1.2"
CT_GMP_MIRRORS="https://gmplib.org/download/gmp https://gmplib.org/download/gmp/archive $(CT_Mirrors GNU gmp)"
CT_GMP_ARCHIVE_FILENAME="@{pkg_name}-@{version}"
CT_GMP_ARCHIVE_DIRNAME="@{pkg_name}-@{version}"
CT_GMP_ARCHIVE_FORMATS=".tar.xz .tar.lz .tar.bz2"
CT_GMP_SIGNATURE_FORMAT="packed/.sig"
CT_GMP_later_than_5_1_0=y
CT_GMP_5_1_0_or_later=y
CT_GMP_later_than_5_0_0=y
CT_GMP_5_0_0_or_later=y
CT_GMP_REQUIRE_5_0_0_or_later=y
CT_COMP_LIBS_ISL=y
CT_COMP_LIBS_ISL_PKG_KSYM="ISL"
CT_ISL_DIR_NAME="isl"
CT_ISL_PKG_NAME="isl"
CT_ISL_SRC_RELEASE=y
# CT_ISL_SRC_DEVEL is not set
# CT_ISL_SRC_CUSTOM is not set
CT_ISL_PATCH_GLOBAL=y
# CT_ISL_PATCH_BUNDLED is not set
# CT_ISL_PATCH_LOCAL is not set
# CT_ISL_PATCH_BUNDLED_LOCAL is not set
# CT_ISL_PATCH_LOCAL_BUNDLED is not set
# CT_ISL_PATCH_NONE is not set
CT_ISL_PATCH_ORDER="global"
CT_ISL_V_0_20=y
# CT_ISL_V_0_19 is not set
# CT_ISL_V_0_18 is not set
# CT_ISL_V_0_17 is not set
# CT_ISL_V_0_16 is not set
# CT_ISL_V_0_15 is not set
# CT_ISL_NO_VERSIONS is not set
CT_ISL_VERSION="0.20"
CT_ISL_MIRRORS="http://isl.gforge.inria.fr"
CT_ISL_ARCHIVE_FILENAME="@{pkg_name}-@{version}"
CT_ISL_ARCHIVE_DIRNAME="@{pkg_name}-@{version}"
CT_ISL_ARCHIVE_FORMATS=".tar.xz .tar.bz2 .tar.gz"
CT_ISL_SIGNATURE_FORMAT=""
CT_ISL_later_than_0_18=y
CT_ISL_0_18_or_later=y
CT_ISL_later_than_0_15=y
CT_ISL_0_15_or_later=y
CT_ISL_REQUIRE_0_15_or_later=y
CT_ISL_later_than_0_14=y
CT_ISL_0_14_or_later=y
CT_ISL_REQUIRE_0_14_or_later=y
CT_ISL_later_than_0_13=y
CT_ISL_0_13_or_later=y
CT_ISL_later_than_0_12=y
CT_ISL_0_12_or_later=y
CT_ISL_REQUIRE_0_12_or_later=y
# CT_COMP_LIBS_LIBELF is not set
CT_COMP_LIBS_LIBICONV=y
CT_COMP_LIBS_LIBICONV_PKG_KSYM="LIBICONV"
CT_LIBICONV_DIR_NAME="libiconv"
CT_LIBICONV_PKG_NAME="libiconv"
CT_LIBICONV_SRC_RELEASE=y
# CT_LIBICONV_SRC_DEVEL is not set
# CT_LIBICONV_SRC_CUSTOM is not set
CT_LIBICONV_PATCH_GLOBAL=y
# CT_LIBICONV_PATCH_BUNDLED is not set
# CT_LIBICONV_PATCH_LOCAL is not set
# CT_LIBICONV_PATCH_BUNDLED_LOCAL is not set
# CT_LIBICONV_PATCH_LOCAL_BUNDLED is not set
# CT_LIBICONV_PATCH_NONE is not set
CT_LIBICONV_PATCH_ORDER="global"
CT_LIBICONV_V_1_15=y
# CT_LIBICONV_NO_VERSIONS is not set
CT_LIBICONV_VERSION="1.15"
CT_LIBICONV_MIRRORS="$(CT_Mirrors GNU libiconv)"
CT_LIBICONV_ARCHIVE_FILENAME="@{pkg_name}-@{version}"
CT_LIBICONV_ARCHIVE_DIRNAME="@{pkg_name}-@{version}"
CT_LIBICONV_ARCHIVE_FORMATS=".tar.gz"
CT_LIBICONV_SIGNATURE_FORMAT="packed/.sig"
CT_COMP_LIBS_MPC=y
CT_COMP_LIBS_MPC_PKG_KSYM="MPC"
CT_MPC_DIR_NAME="mpc"
CT_MPC_PKG_NAME="mpc"
CT_MPC_SRC_RELEASE=y
# CT_MPC_SRC_DEVEL is not set
# CT_MPC_SRC_CUSTOM is not set
CT_MPC_PATCH_GLOBAL=y
# CT_MPC_PATCH_BUNDLED is not set
# CT_MPC_PATCH_LOCAL is not set
# CT_MPC_PATCH_BUNDLED_LOCAL is not set
# CT_MPC_PATCH_LOCAL_BUNDLED is not set
# CT_MPC_PATCH_NONE is not set
CT_MPC_PATCH_ORDER="global"
CT_MPC_V_1_1=y
# CT_MPC_V_1_0 is not set
# CT_MPC_NO_VERSIONS is not set
CT_MPC_VERSION="1.1.0"
CT_MPC_MIRRORS="http://www.multiprecision.org/downloads $(CT_Mirrors GNU mpc)"
CT_MPC_ARCHIVE_FILENAME="@{pkg_name}-@{version}"
CT_MPC_ARCHIVE_DIRNAME="@{pkg_name}-@{version}"
CT_MPC_ARCHIVE_FORMATS=".tar.gz"
CT_MPC_SIGNATURE_FORMAT="packed/.sig"
CT_MPC_1_1_0_or_later=y
CT_MPC_1_1_0_or_older=y
CT_COMP_LIBS_MPFR=y
CT_COMP_LIBS_MPFR_PKG_KSYM="MPFR"
CT_MPFR_DIR_NAME="mpfr"
CT_MPFR_PKG_NAME="mpfr"
CT_MPFR_SRC_RELEASE=y
# CT_MPFR_SRC_DEVEL is not set
# CT_MPFR_SRC_CUSTOM is not set
CT_MPFR_PATCH_GLOBAL=y
# CT_MPFR_PATCH_BUNDLED is not set
# CT_MPFR_PATCH_LOCAL is not set
# CT_MPFR_PATCH_BUNDLED_LOCAL is not set
# CT_MPFR_PATCH_LOCAL_BUNDLED is not set
# CT_MPFR_PATCH_NONE is not set
CT_MPFR_PATCH_ORDER="global"
CT_MPFR_V_4_0=y
# CT_MPFR_V_3_1 is not set
# CT_MPFR_NO_VERSIONS is not set
CT_MPFR_VERSION="4.0.2"
CT_MPFR_MIRRORS="http://www.mpfr.org/mpfr-${CT_MPFR_VERSION} $(CT_Mirrors GNU mpfr)"
CT_MPFR_ARCHIVE_FILENAME="@{pkg_name}-@{version}"
CT_MPFR_ARCHIVE_DIRNAME="@{pkg_name}-@{version}"
CT_MPFR_ARCHIVE_FORMATS=".tar.xz .tar.bz2 .tar.gz .zip"
CT_MPFR_SIGNATURE_FORMAT="packed/.asc"
CT_MPFR_later_than_4_0_0=y
CT_MPFR_4_0_0_or_later=y
CT_MPFR_later_than_3_0_0=y
CT_MPFR_3_0_0_or_later=y
CT_MPFR_REQUIRE_3_0_0_or_later=y
CT_COMP_LIBS_NCURSES=y
CT_COMP_LIBS_NCURSES_PKG_KSYM="NCURSES"
CT_NCURSES_DIR_NAME="ncurses"
CT_NCURSES_PKG_NAME="ncurses"
CT_NCURSES_SRC_RELEASE=y
# CT_NCURSES_SRC_DEVEL is not set
# CT_NCURSES_SRC_CUSTOM is not set
CT_NCURSES_PATCH_GLOBAL=y
# CT_NCURSES_PATCH_BUNDLED is not set
# CT_NCURSES_PATCH_LOCAL is not set
# CT_NCURSES_PATCH_BUNDLED_LOCAL is not set
# CT_NCURSES_PATCH_LOCAL_BUNDLED is not set
# CT_NCURSES_PATCH_NONE is not set
CT_NCURSES_PATCH_ORDER="global"
CT_NCURSES_V_6_1=y
# CT_NCURSES_V_6_0 is not set
# CT_NCURSES_NO_VERSIONS is not set
CT_NCURSES_VERSION="6.1"
CT_NCURSES_MIRRORS="ftp://invisible-island.net/ncurses $(CT_Mirrors GNU ncurses)"
CT_NCURSES_ARCHIVE_FILENAME="@{pkg_name}-@{version}"
CT_NCURSES_ARCHIVE_DIRNAME="@{pkg_name}-@{version}"
CT_NCURSES_ARCHIVE_FORMATS=".tar.gz"
CT_NCURSES_SIGNATURE_FORMAT="packed/.sig"
# CT_NCURSES_NEW_ABI is not set
CT_NCURSES_HOST_CONFIG_ARGS=""
CT_NCURSES_HOST_DISABLE_DB=y
CT_NCURSES_HOST_FALLBACKS="linux,xterm,xterm-color,xterm-256color,vt100"
CT_NCURSES_TARGET_CONFIG_ARGS=""
# CT_NCURSES_TARGET_DISABLE_DB is not set
CT_NCURSES_TARGET_FALLBACKS=""
CT_COMP_LIBS_ZLIB=y
CT_COMP_LIBS_ZLIB_PKG_KSYM="ZLIB"
CT_ZLIB_DIR_NAME="zlib"
CT_ZLIB_PKG_NAME="zlib"
CT_ZLIB_SRC_RELEASE=y
# CT_ZLIB_SRC_DEVEL is not set
# CT_ZLIB_SRC_CUSTOM is not set
CT_ZLIB_PATCH_GLOBAL=y
# CT_ZLIB_PATCH_BUNDLED is not set
# CT_ZLIB_PATCH_LOCAL is not set
# CT_ZLIB_PATCH_BUNDLED_LOCAL is not set
# CT_ZLIB_PATCH_LOCAL_BUNDLED is not set
# CT_ZLIB_PATCH_NONE is not set
CT_ZLIB_PATCH_ORDER="global"
CT_ZLIB_V_1_2_11=y
# CT_ZLIB_NO_VERSIONS is not set
CT_ZLIB_VERSION="1.2.11"
CT_ZLIB_MIRRORS="http://downloads.sourceforge.net/project/libpng/zlib/${CT_ZLIB_VERSION}"
CT_ZLIB_ARCHIVE_FILENAME="@{pkg_name}-@{version}"
CT_ZLIB_ARCHIVE_DIRNAME="@{pkg_name}-@{version}"
CT_ZLIB_ARCHIVE_FORMATS=".tar.xz .tar.gz"
CT_ZLIB_SIGNATURE_FORMAT="packed/.asc"
CT_ALL_COMP_LIBS_CHOICES="CLOOG EXPAT GETTEXT GMP ISL LIBELF LIBICONV MPC MPFR NCURSES ZLIB"
CT_LIBICONV_NEEDED=y
CT_GETTEXT_NEEDED=y
CT_GMP_NEEDED=y
CT_MPFR_NEEDED=y
CT_ISL_NEEDED=y
CT_MPC_NEEDED=y
CT_EXPAT_NEEDED=y
CT_NCURSES_NEEDED=y
CT_ZLIB_NEEDED=y
CT_LIBICONV=y
CT_GETTEXT=y
CT_GMP=y
CT_MPFR=y
CT_ISL=y
CT_MPC=y
CT_EXPAT=y
CT_NCURSES=y
CT_ZLIB=y
#
# Companion tools
#
# CT_COMP_TOOLS_FOR_HOST is not set
# CT_COMP_TOOLS_AUTOCONF is not set
# CT_COMP_TOOLS_AUTOMAKE is not set
# CT_COMP_TOOLS_BISON is not set
# CT_COMP_TOOLS_DTC is not set
# CT_COMP_TOOLS_LIBTOOL is not set
# CT_COMP_TOOLS_M4 is not set
# CT_COMP_TOOLS_MAKE is not set
CT_ALL_COMP_TOOLS_CHOICES="AUTOCONF AUTOMAKE BISON DTC LIBTOOL M4 MAKE"
#
# Test suite
#
# CT_TEST_SUITE_GCC is not set
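The options above follow crosstool-NG's one-release-per-package pattern: each companion library gets a `CT_<PKG>_V_*=y` choice plus a resolved `CT_<PKG>_VERSION` string. Assuming the fragment is saved to a file, the pinned releases can be summarized with a small sketch (`list_versions` is an invented helper name):

```sh
#!/bin/sh
# list_versions FILE -- print every CT_*_VERSION pin in a crosstool-NG
# configuration fragment such as the one above, sorted by package.
list_versions() {
    grep -E '^CT_[A-Z0-9_]+_VERSION=' "$1" | sort
}
```

Run against this fragment it would report, for example, expat 2.2.6, gmp 6.1.2, mpfr 4.0.2, and zlib 1.2.11.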
View File
@@ -12,7 +12,7 @@ export PATH=`pwd`/clang+llvm-9.0.0-x86_64-linux-gnu-ubuntu-14.04/bin:$PATH
 git clone https://github.com/CraneStation/wasi-libc
 cd wasi-libc
-git reset --hard f645f498dfbbbc00a7a97874d33082d3605c3f21
+git reset --hard 1fad33890a5e299027ce0eab7b6ad5260585e347
 make -j$(nproc) INSTALL_DIR=/wasm32-wasi install
 cd ..
View File
@@ -1,4 +1,4 @@
-FROM ubuntu:19.04
+FROM ubuntu:19.10

 RUN apt-get update && apt-get install -y --no-install-recommends \
   g++ \
View File
@@ -25,7 +25,9 @@ RUN sh /scripts/sccache.sh
 ENV RUST_CONFIGURE_ARGS \
       --build=x86_64-unknown-linux-gnu \
       --llvm-root=/usr/lib/llvm-7 \
-      --enable-llvm-link-shared
+      --enable-llvm-link-shared \
+      --set rust.thin-lto-import-instr-limit=10

 ENV SCRIPT python2.7 ../x.py test src/tools/tidy && python2.7 ../x.py test
 # The purpose of this container isn't to test with debug assertions and
View File
@@ -1,4 +1,4 @@
-FROM ubuntu:19.04
+FROM ubuntu:19.10

 RUN apt-get update && apt-get install -y --no-install-recommends \
   g++ \
View File
@@ -1,6 +1,9 @@
-#!/bin/sh
-set -eu
+#!/bin/bash
+set -euo pipefail
+IFS=$'\n\t'
+
+source "$(cd "$(dirname "$0")" && pwd)/shared.sh"

 # The following lines are also found in src/bootstrap/toolstate.rs,
 # so if updating here, please also update that file.
@@ -14,12 +17,15 @@ printf 'https://%s:x-oauth-basic@github.com\n' "$TOOLSTATE_REPO_ACCESS_TOKEN" \
     > "$HOME/.git-credentials"
 git clone --depth=1 $TOOLSTATE_REPO
+GIT_COMMIT="$(git rev-parse HEAD)"
+GIT_COMMIT_MSG="$(git log --format=%s -n1 HEAD)"
+
 cd rust-toolstate
 FAILURE=1
 for RETRY_COUNT in 1 2 3 4 5; do
     # The purpose is to publish the new "current" toolstate in the toolstate repo.
-    "$BUILD_SOURCESDIRECTORY/src/tools/publish_toolstate.py" "$(git rev-parse HEAD)" \
-        "$(git log --format=%s -n1 HEAD)" \
+    "$(ciCheckoutPath)/src/tools/publish_toolstate.py" "$GIT_COMMIT" \
+        "$GIT_COMMIT_MSG" \
         "$MESSAGE_FILE" \
         "$TOOLSTATE_REPO_ACCESS_TOKEN"
     # `git commit` failing means nothing to commit.
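The publish step above runs inside a five-attempt retry loop. Distilled into a standalone sketch (the function name `try_with_retries` and the one-second pause are illustrative additions, not part of the CI script):

```sh
#!/bin/sh
# Retry a flaky command up to five times, mirroring the loop above.
try_with_retries() {
    for attempt in 1 2 3 4 5; do
        if "$@"; then
            return 0                              # success: stop retrying
        fi
        echo "attempt $attempt failed; retrying" >&2
        sleep 1
    done
    return 1                                      # all five attempts failed
}
```

In the real script the loop body also distinguishes "nothing to commit" from genuine push failures; this sketch only shows the retry skeleton.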
View File
@@ -44,8 +44,13 @@ fi
 # FIXME: need a scheme for changing this `nightly` value to `beta` and `stable`
 # either automatically or manually.
 export RUST_RELEASE_CHANNEL=stable
+
+# Always set the release channel for bootstrap; this is normally not important (i.e., only dist
+# builds would seem to matter) but in practice bootstrap wants to know whether we're targeting
+# master, beta, or stable with a build to determine whether to run some checks (notably toolstate).
+RUST_CONFIGURE_ARGS="$RUST_CONFIGURE_ARGS --release-channel=$RUST_RELEASE_CHANNEL"
 if [ "$DEPLOY$DEPLOY_ALT" = "1" ]; then
-  RUST_CONFIGURE_ARGS="$RUST_CONFIGURE_ARGS --release-channel=$RUST_RELEASE_CHANNEL"
   RUST_CONFIGURE_ARGS="$RUST_CONFIGURE_ARGS --enable-llvm-static-stdcpp"
   RUST_CONFIGURE_ARGS="$RUST_CONFIGURE_ARGS --set rust.remap-debuginfo"
   RUST_CONFIGURE_ARGS="$RUST_CONFIGURE_ARGS --debuginfo-level-std=1"
View File
@@ -14,6 +14,13 @@ if isMacOS; then
     ciCommandSetEnv CC "$(pwd)/clang+llvm-9.0.0-x86_64-darwin-apple/bin/clang"
     ciCommandSetEnv CXX "$(pwd)/clang+llvm-9.0.0-x86_64-darwin-apple/bin/clang++"

+    # macOS 10.15 onwards doesn't have libraries in /usr/include anymore: those
+    # are now located deep into the filesystem, under Xcode's own files. The
+    # native clang is configured to use the correct path, but our custom one
+    # doesn't. This sets the SDKROOT environment variable to the SDK so that
+    # our own clang can figure out the correct include path on its own.
+    ciCommandSetEnv SDKROOT "$(xcrun --sdk macosx --show-sdk-path)"
+
     # Configure `AR` specifically so rustbuild doesn't try to infer it as
     # `clang-ar` by accident.
     ciCommandSetEnv AR "ar"
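The comment above explains why a non-Xcode clang needs `SDKROOT` on macOS 10.15+. A hedged, standalone equivalent for an arbitrary shell session might look like the sketch below; the `command -v` guard is an addition so the snippet is a no-op on builders without `xcrun`:

```sh
#!/bin/sh
# Point a custom clang at the macOS SDK headers, as the CI script does.
# Does nothing on systems without xcrun (e.g. Linux builders).
if command -v xcrun >/dev/null 2>&1; then
    SDKROOT="$(xcrun --sdk macosx --show-sdk-path)"
    export SDKROOT
fi
```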
View File
@@ -6,7 +6,8 @@ IFS=$'\n\t'
 source "$(cd "$(dirname "$0")" && pwd)/../shared.sh"

 if isWindows; then
-    pacman -S --noconfirm --needed base-devel ca-certificates make diffutils tar
+    pacman -S --noconfirm --needed base-devel ca-certificates make diffutils tar \
+        binutils

     # Make sure we use the native python interpreter instead of some msys equivalent
     # one way or another. The msys interpreters seem to have weird path conversions
View File
@@ -12,8 +12,14 @@ IFS=$'\n\t'
 source "$(cd "$(dirname "$0")" && pwd)/../shared.sh"

 if isWindows; then
-    choco install msys2 --params="/InstallDir:$(ciCheckoutPath)/msys2 /NoPath" -y --no-progress
+    # Pre-followed the api/v2 URL to the CDN since the API can be a bit flakey
+    curl -sSL https://packages.chocolatey.org/msys2.20190524.0.0.20191030.nupkg > \
+        msys2.nupkg
+    curl -sSL https://packages.chocolatey.org/chocolatey-core.extension.1.3.5.1.nupkg > \
+        chocolatey-core.extension.nupkg
+    choco install -s . msys2 \
+        --params="/InstallDir:$(ciCheckoutPath)/msys2 /NoPath" -y --no-progress
+    rm msys2.nupkg chocolatey-core.extension.nupkg

     mkdir -p "$(ciCheckoutPath)/msys2/home/${USERNAME}"
     ciCommandAddPath "$(ciCheckoutPath)/msys2/usr/bin"
 fi
View File
@@ -1,13 +0,0 @@
#!/bin/bash
# Switch to XCode 9.3 on OSX since it seems to be the last version that supports
# i686-apple-darwin. We'll eventually want to upgrade this and it will probably
# force us to drop i686-apple-darwin, but let's keep the wheels turning for now.
set -euo pipefail
IFS=$'\n\t'
source "$(cd "$(dirname "$0")" && pwd)/../shared.sh"
if isMacOS; then
sudo xcode-select --switch /Applications/Xcode_9.3.app
fi
View File
@@ -80,7 +80,7 @@ function ciCommit {

 function ciCheckoutPath {
     if isAzurePipelines; then
-        echo "${SYSTEM_WORKFOLDER}"
+        echo "${BUILD_SOURCESDIRECTORY}"
     elif isGitHubActions; then
         echo "${GITHUB_WORKSPACE}"
     else
@@ -99,7 +99,7 @@ function ciCommandAddPath {
     if isAzurePipelines; then
        echo "##vso[task.prependpath]${path}"
     elif isGitHubActions; then
-        echo "::add-path::${value}"
+        echo "::add-path::${path}"
     else
         echo "ciCommandAddPath only works inside CI!"
         exit 1
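`ciCommandAddPath` above emits a different magic string per CI provider; the fix replaces a stale `${value}` with `${path}`. A simplified, self-contained sketch of the same dispatch follows. The inlined environment-variable checks (`TF_BUILD` for Azure Pipelines, `GITHUB_ACTIONS` for GitHub Actions) are an assumption standing in for the `isAzurePipelines`/`isGitHubActions` helpers defined elsewhere in `shared.sh`:

```sh
#!/bin/sh
# Emit the provider-specific "prepend to PATH" command, as shared.sh does.
ciCommandAddPath() {
    path="$1"
    if [ "${TF_BUILD:-}" = "True" ]; then          # Azure Pipelines sets TF_BUILD=True
        echo "##vso[task.prependpath]${path}"
    elif [ "${GITHUB_ACTIONS:-}" = "true" ]; then  # GitHub Actions sets GITHUB_ACTIONS=true
        echo "::add-path::${path}"
    else
        echo "ciCommandAddPath only works inside CI!" >&2
        return 1
    fi
}
```

The caller never needs to know which CI it is running under; the helper centralizes the per-provider command syntax.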
View File
@@ -13,6 +13,6 @@ addons:
       - aspell
       - aspell-en
 before_script:
-  - (cargo install mdbook --vers 0.2.3 --force || true)
+  - (cargo install mdbook --vers 0.3.5 --force || true)
 script:
   - bash ci/build.sh
View File
@@ -1,142 +0,0 @@
[[package]]
name = "aho-corasick"
version = "0.5.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"memchr 0.1.11 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "docopt"
version = "0.6.86"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"lazy_static 0.2.10 (registry+https://github.com/rust-lang/crates.io-index)",
"regex 0.1.80 (registry+https://github.com/rust-lang/crates.io-index)",
"rustc-serialize 0.3.24 (registry+https://github.com/rust-lang/crates.io-index)",
"strsim 0.5.2 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "kernel32-sys"
version = "0.2.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"winapi 0.2.8 (registry+https://github.com/rust-lang/crates.io-index)",
"winapi-build 0.1.1 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "lazy_static"
version = "0.2.10"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "libc"
version = "0.2.33"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "memchr"
version = "0.1.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"libc 0.2.33 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "regex"
version = "0.1.80"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"aho-corasick 0.5.3 (registry+https://github.com/rust-lang/crates.io-index)",
"memchr 0.1.11 (registry+https://github.com/rust-lang/crates.io-index)",
"regex-syntax 0.3.9 (registry+https://github.com/rust-lang/crates.io-index)",
"thread_local 0.2.7 (registry+https://github.com/rust-lang/crates.io-index)",
"utf8-ranges 0.1.3 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "regex-syntax"
version = "0.3.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "rust-book"
version = "0.0.1"
dependencies = [
"docopt 0.6.86 (registry+https://github.com/rust-lang/crates.io-index)",
"lazy_static 0.2.10 (registry+https://github.com/rust-lang/crates.io-index)",
"regex 0.1.80 (registry+https://github.com/rust-lang/crates.io-index)",
"rustc-serialize 0.3.24 (registry+https://github.com/rust-lang/crates.io-index)",
"walkdir 0.1.8 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "rustc-serialize"
version = "0.3.24"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "strsim"
version = "0.5.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "thread-id"
version = "2.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"kernel32-sys 0.2.2 (registry+https://github.com/rust-lang/crates.io-index)",
"libc 0.2.33 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "thread_local"
version = "0.2.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"thread-id 2.0.0 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "utf8-ranges"
version = "0.1.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "walkdir"
version = "0.1.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"kernel32-sys 0.2.2 (registry+https://github.com/rust-lang/crates.io-index)",
"winapi 0.2.8 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "winapi"
version = "0.2.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "winapi-build"
version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
[metadata]
"checksum aho-corasick 0.5.3 (registry+https://github.com/rust-lang/crates.io-index)" = "ca972c2ea5f742bfce5687b9aef75506a764f61d37f8f649047846a9686ddb66"
"checksum docopt 0.6.86 (registry+https://github.com/rust-lang/crates.io-index)" = "4a7ef30445607f6fc8720f0a0a2c7442284b629cf0d049286860fae23e71c4d9"
"checksum kernel32-sys 0.2.2 (registry+https://github.com/rust-lang/crates.io-index)" = "7507624b29483431c0ba2d82aece8ca6cdba9382bff4ddd0f7490560c056098d"
"checksum lazy_static 0.2.10 (registry+https://github.com/rust-lang/crates.io-index)" = "236eb37a62591d4a41a89b7763d7de3e06ca02d5ab2815446a8bae5d2f8c2d57"
"checksum libc 0.2.33 (registry+https://github.com/rust-lang/crates.io-index)" = "5ba3df4dcb460b9dfbd070d41c94c19209620c191b0340b929ce748a2bcd42d2"
"checksum memchr 0.1.11 (registry+https://github.com/rust-lang/crates.io-index)" = "d8b629fb514376c675b98c1421e80b151d3817ac42d7c667717d282761418d20"
"checksum regex 0.1.80 (registry+https://github.com/rust-lang/crates.io-index)" = "4fd4ace6a8cf7860714a2c2280d6c1f7e6a413486c13298bbc86fd3da019402f"
"checksum regex-syntax 0.3.9 (registry+https://github.com/rust-lang/crates.io-index)" = "f9ec002c35e86791825ed294b50008eea9ddfc8def4420124fbc6b08db834957"
"checksum rustc-serialize 0.3.24 (registry+https://github.com/rust-lang/crates.io-index)" = "dcf128d1287d2ea9d80910b5f1120d0b8eede3fbf1abe91c40d39ea7d51e6fda"
"checksum strsim 0.5.2 (registry+https://github.com/rust-lang/crates.io-index)" = "67f84c44fbb2f91db7fef94554e6b2ac05909c9c0b0bc23bb98d3a1aebfe7f7c"
"checksum thread-id 2.0.0 (registry+https://github.com/rust-lang/crates.io-index)" = "a9539db560102d1cef46b8b78ce737ff0bb64e7e18d35b2a5688f7d097d0ff03"
"checksum thread_local 0.2.7 (registry+https://github.com/rust-lang/crates.io-index)" = "8576dbbfcaef9641452d5cf0df9b0e7eeab7694956dd33bb61515fb8f18cfdd5"
"checksum utf8-ranges 0.1.3 (registry+https://github.com/rust-lang/crates.io-index)" = "a1ca13c08c41c9c3e04224ed9ff80461d97e121589ff27c753a16cb10830ae0f"
"checksum walkdir 0.1.8 (registry+https://github.com/rust-lang/crates.io-index)" = "c66c0b9792f0a765345452775f3adbd28dde9d33f30d13e5dcc5ae17cf6f3780"
"checksum winapi 0.2.8 (registry+https://github.com/rust-lang/crates.io-index)" = "167dc9d6949a9b857f3451275e911c3f44255842c1f7a76f33c55103a909087a"
"checksum winapi-build 0.1.1 (registry+https://github.com/rust-lang/crates.io-index)" = "2d315eee3b34aca4797b2da6b13ed88266e6d612562a0c46390af8299fc699bc"
View File
@@ -1,36 +0,0 @@
[package]
name = "rust-book"
version = "0.0.1"
authors = ["Steve Klabnik <steve@steveklabnik.com>"]
description = "The Rust Book"
[[bin]]
name = "concat_chapters"
path = "tools/src/bin/concat_chapters.rs"
[[bin]]
name = "lfp"
path = "tools/src/bin/lfp.rs"
[[bin]]
name = "link2print"
path = "tools/src/bin/link2print.rs"
[[bin]]
name = "remove_links"
path = "tools/src/bin/remove_links.rs"
[[bin]]
name = "remove_markup"
path = "tools/src/bin/remove_markup.rs"
[[bin]]
name = "convert_quotes"
path = "tools/src/bin/convert_quotes.rs"
[dependencies]
walkdir = "0.1.5"
docopt = "0.6.82"
rustc-serialize = "0.3.19"
regex = "0.1.73"
lazy_static = "0.2.1"
View File
@@ -1,201 +0,0 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright 2010-2017 The Rust Project Developers
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
View File
@@ -1,25 +0,0 @@
Copyright (c) 2010-2017 The Rust Project Developers
Permission is hereby granted, free of charge, to any
person obtaining a copy of this software and associated
documentation files (the "Software"), to deal in the
Software without restriction, including without
limitation the rights to use, copy, modify, merge,
publish, distribute, sublicense, and/or sell copies of
the Software, and to permit persons to whom the Software
is furnished to do so, subject to the following
conditions:
The above copyright notice and this permission notice
shall be included in all copies or substantial portions
of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF
ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED
TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A
PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT
SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR
IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
DEALINGS IN THE SOFTWARE.

@@ -1,13 +0,0 @@
#!/bin/bash
set -eu
dir=$1
mkdir -p "tmp/$dir"
for f in $dir/*.md
do
cargo run --bin convert_quotes < "$f" > "tmp/$f"
mv "tmp/$f" "$f"
done

@@ -1,24 +0,0 @@
#!/bin/bash
set -eu
cargo build --release
mkdir -p tmp
rm -rf tmp/*.md
# Get all the Markdown files in the src dir,
ls src/${1:-""}*.md | \
# except for `SUMMARY.md`.
grep -v SUMMARY.md | \
# Extract just the filename so we can reuse it easily.
xargs -n 1 basename | \
# Remove all links followed by `<!-- ignore -->`, then
# change all remaining links from Markdown to italicized inline text.
while IFS= read -r filename; do
< "src/$filename" ./target/release/remove_links \
| ./target/release/link2print \
| ./target/release/remove_markup > "tmp/$filename"
done
# Concatenate the files into the `nostarch` dir.
./target/release/concat_chapters tmp nostarch

@@ -1,34 +0,0 @@
# Style Guide
## Prose
* Prefer title case for chapter/section headings, ex: `## Generating a Secret
Number` rather than `## Generating a secret number`.
* Prefer italics over single quotes when calling out a term, ex: `is an
*associated function* of` rather than `is an associated function of`.
* When talking about a method in prose, DO NOT include the parentheses, ex:
`read_line` rather than `read_line()`.
* Hard wrap at 80 chars
* Prefer not to mix code and non-code in one word, ex: ``Remember when we wrote
`use std::io`?`` rather than ``Remember when we `use`d `std::io`?``
## Code
* Add the file name before Markdown code blocks to make it clear which file
we're talking about, when applicable.
* When making changes to code, make it clear which parts of the code changed
and which stayed the same... not sure how to do this yet
* Split up long lines as appropriate to keep them under 80 chars if possible
* Use `bash` syntax highlighting for command line output code blocks
## Links
Once all the scripts are done:
* If a link shouldn't be printed, mark it to be ignored
* This includes all "Chapter XX" intra-book links, which *should* be links
for the HTML version
* Make intra-book links and stdlib API doc links relative so they work whether
the book is read offline or on docs.rust-lang.org
* Use Markdown links, and keep in mind that they will be changed into `text at
*url*` in print, so word them so that they read well in that format
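The link-to-print convention described in the Links section above can be illustrated with a small standalone sketch. This is not the repository's actual `link2print` tool (which uses the `regex` crate and handles references, titles, and code blocks); it only shows the `[text](url)` to `text at *url*` rewrite on a single line, using plain string searching:

```rust
// Minimal sketch of the printed-link convention: rewrite each inline
// Markdown link `[text](url)` as `text at *url*`.
fn link_to_print(line: &str) -> String {
    let mut out = String::new();
    let mut rest = line;
    while let Some(open) = rest.find('[') {
        // Look for the `](` separating link text from the URL, then the `)`.
        if let Some(mid) = rest[open..].find("](") {
            let mid = open + mid;
            if let Some(close) = rest[mid..].find(')') {
                let close = mid + close;
                out.push_str(&rest[..open]);
                let text = &rest[open + 1..mid];
                let url = &rest[mid + 2..close];
                out.push_str(&format!("{} at *{}*", text, url));
                rest = &rest[close + 1..];
                continue;
            }
        }
        break; // malformed link: leave the remainder untouched
    }
    out.push_str(rest);
    out
}

fn main() {
    println!(
        "{}",
        link_to_print("See [the installation chapter](ch01-01-installation.html) for details.")
    );
}
```

The filename in the example is purely illustrative; the real tool operates on whatever Markdown it receives on stdin.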

@@ -1,218 +0,0 @@
<?xml version="1.0"?>
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform" xmlns:o="urn:schemas-microsoft-com:office:office" xmlns:r="http://schemas.openxmlformats.org/officeDocument/2006/relationships" xmlns:v="urn:schemas-microsoft-com:vml" xmlns:w="http://schemas.openxmlformats.org/wordprocessingml/2006/main" xmlns:w10="urn:schemas-microsoft-com:office:word" xmlns:wp="http://schemas.openxmlformats.org/drawingml/2006/wordprocessingDrawing" xmlns:wps="http://schemas.microsoft.com/office/word/2010/wordprocessingShape" xmlns:wpg="http://schemas.microsoft.com/office/word/2010/wordprocessingGroup" xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006" xmlns:wp14="http://schemas.microsoft.com/office/word/2010/wordprocessingDrawing" xmlns:w14="http://schemas.microsoft.com/office/word/2010/wordml">
<xsl:output method="text" />
<xsl:template match="/">
<xsl:apply-templates select="/w:document/w:body/*" />
</xsl:template>
<!-- Ignore these -->
<xsl:template match="w:p[starts-with(w:pPr/w:pStyle/@w:val, 'TOC')]" />
<xsl:template match="w:p[starts-with(w:pPr/w:pStyle/@w:val, 'Contents1')]" />
<xsl:template match="w:p[starts-with(w:pPr/w:pStyle/@w:val, 'Contents2')]" />
<xsl:template match="w:p[starts-with(w:pPr/w:pStyle/@w:val, 'Contents3')]" />
<xsl:template match="w:p[w:pPr/w:pStyle/@w:val = 'ChapterStart']" />
<xsl:template match="w:p[w:pPr/w:pStyle/@w:val = 'Normal']" />
<xsl:template match="w:p[w:pPr/w:pStyle/@w:val = 'Standard']" />
<xsl:template match="w:p[w:pPr/w:pStyle/@w:val = 'AuthorQuery']" />
<!-- Paragraph styles -->
<xsl:template match="w:p[w:pPr/w:pStyle/@w:val = 'ChapterTitle']">
<xsl:text>&#10;[TOC]&#10;&#10;</xsl:text>
<xsl:text># </xsl:text>
<xsl:apply-templates select="*" />
<xsl:text>&#10;&#10;</xsl:text>
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle/@w:val = 'HeadA']">
<xsl:text>## </xsl:text>
<xsl:apply-templates select="*" />
<xsl:text>&#10;&#10;</xsl:text>
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle/@w:val = 'HeadB']">
<xsl:text>### </xsl:text>
<xsl:apply-templates select="*" />
<xsl:text>&#10;&#10;</xsl:text>
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle/@w:val = 'HeadC']">
<xsl:text>#### </xsl:text>
<xsl:apply-templates select="*" />
<xsl:text>&#10;&#10;</xsl:text>
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle/@w:val = 'HeadBox']">
<xsl:text>### </xsl:text>
<xsl:apply-templates select="*" />
<xsl:text>&#10;&#10;</xsl:text>
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle[@w:val = 'NumListA' or @w:val = 'NumListB']]">
<xsl:text>1. </xsl:text>
<xsl:apply-templates select="*" />
<xsl:text>&#10;</xsl:text>
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle[@w:val = 'NumListC']]">
<xsl:text>1. </xsl:text>
<xsl:apply-templates select="*" />
<xsl:text>&#10;&#10;</xsl:text>
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle[@w:val = 'BulletA' or @w:val = 'BulletB' or @w:val = 'ListPlainA' or @w:val = 'ListPlainB']]">
<xsl:text>* </xsl:text>
<xsl:apply-templates select="*" />
<xsl:text>&#10;</xsl:text>
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle[@w:val = 'BulletC' or @w:val = 'ListPlainC']]">
<xsl:text>* </xsl:text>
<xsl:apply-templates select="*" />
<xsl:text>&#10;&#10;</xsl:text>
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle[@w:val = 'SubBullet']]">
<xsl:text> * </xsl:text>
<xsl:apply-templates select="*" />
<xsl:text>&#10;</xsl:text>
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle[@w:val = 'BodyFirst' or @w:val = 'Body' or @w:val = 'BodyFirstBox' or @w:val = 'BodyBox' or @w:val = '1stPara']]">
<xsl:if test=".//w:t">
<xsl:apply-templates select="*" />
<xsl:text>&#10;&#10;</xsl:text>
</xsl:if>
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle[@w:val = 'CodeA' or @w:val = 'CodeAWingding']]">
<xsl:text>```&#10;</xsl:text>
<!-- Don't apply Emphasis/etc templates in code blocks -->
<xsl:for-each select="w:r">
<xsl:value-of select="w:t" />
</xsl:for-each>
<xsl:text>&#10;</xsl:text>
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle[@w:val = 'CodeB' or @w:val = 'CodeBWingding']]">
<!-- Don't apply Emphasis/etc templates in code blocks -->
<xsl:for-each select="w:r">
<xsl:value-of select="w:t" />
</xsl:for-each>
<xsl:text>&#10;</xsl:text>
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle[@w:val = 'CodeC' or @w:val = 'CodeCWingding']]">
<!-- Don't apply Emphasis/etc templates in code blocks -->
<xsl:for-each select="w:r">
<xsl:value-of select="w:t" />
</xsl:for-each>
<xsl:text>&#10;```&#10;&#10;</xsl:text>
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle/@w:val = 'CodeSingle']">
<xsl:text>```&#10;</xsl:text>
<xsl:apply-templates select="*" />
<xsl:text>&#10;```&#10;&#10;</xsl:text>
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle/@w:val = 'ProductionDirective']">
<xsl:apply-templates select="*" />
<xsl:text>&#10;&#10;</xsl:text>
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle[@w:val = 'Caption' or @w:val = 'TableTitle' or @w:val = 'Caption1' or @w:val = 'Listing']]">
<xsl:apply-templates select="*" />
<xsl:text>&#10;&#10;</xsl:text>
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle[@w:val = 'BlockQuote']]">
<xsl:text>> </xsl:text>
<xsl:apply-templates select="*" />
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle[@w:val = 'BlockText']]">
<xsl:text>&#10;</xsl:text>
<xsl:text>> </xsl:text>
<xsl:apply-templates select="*" />
<xsl:text>&#10;&#10;</xsl:text>
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle/@w:val = 'Note']">
<xsl:text>> </xsl:text>
<xsl:apply-templates select="*" />
<xsl:text>&#10;&#10;</xsl:text>
</xsl:template>
<xsl:template match="w:p">
Unmatched: <xsl:value-of select="w:pPr/w:pStyle/@w:val" />
<xsl:text>
</xsl:text>
</xsl:template>
<!-- Character styles -->
<xsl:template match="w:r[w:rPr/w:rStyle[@w:val = 'Literal' or @w:val = 'LiteralBold' or @w:val = 'LiteralCaption' or @w:val = 'LiteralBox']]">
<xsl:choose>
<xsl:when test="normalize-space(w:t) != ''">
<xsl:if test="starts-with(w:t, ' ')">
<xsl:text> </xsl:text>
</xsl:if>
<xsl:text>`</xsl:text>
<xsl:value-of select="normalize-space(w:t)" />
<xsl:text>`</xsl:text>
<xsl:if test="substring(w:t, string-length(w:t)) = ' '">
<xsl:text> </xsl:text>
</xsl:if>
</xsl:when>
<xsl:when test="normalize-space(w:t) != w:t and w:t != ''">
<xsl:text> </xsl:text>
</xsl:when>
</xsl:choose>
</xsl:template>
<xsl:template match="w:r[w:rPr/w:rStyle[@w:val = 'EmphasisBold']]">
<xsl:choose>
<xsl:when test="normalize-space(w:t) != ''">
<xsl:if test="starts-with(w:t, ' ')">
<xsl:text> </xsl:text>
</xsl:if>
<xsl:text>**</xsl:text>
<xsl:value-of select="normalize-space(w:t)" />
<xsl:text>**</xsl:text>
<xsl:if test="substring(w:t, string-length(w:t)) = ' '">
<xsl:text> </xsl:text>
</xsl:if>
</xsl:when>
<xsl:when test="normalize-space(w:t) != w:t and w:t != ''">
<xsl:text> </xsl:text>
</xsl:when>
</xsl:choose>
</xsl:template>
<xsl:template match="w:r[w:rPr/w:rStyle[@w:val = 'EmphasisItalic' or @w:val = 'EmphasisItalicBox' or @w:val = 'EmphasisNote' or @w:val = 'EmphasisRevCaption' or @w:val = 'EmphasisRevItal']]">
<xsl:choose>
<xsl:when test="normalize-space(w:t) != ''">
<xsl:if test="starts-with(w:t, ' ')">
<xsl:text> </xsl:text>
</xsl:if>
<xsl:text>*</xsl:text>
<xsl:value-of select="normalize-space(w:t)" />
<xsl:text>*</xsl:text>
<xsl:if test="substring(w:t, string-length(w:t)) = ' '">
<xsl:text> </xsl:text>
</xsl:if>
</xsl:when>
<xsl:otherwise>
<xsl:text> </xsl:text>
</xsl:otherwise>
</xsl:choose>
</xsl:template>
<xsl:template match="w:r">
<xsl:value-of select="w:t" />
</xsl:template>
</xsl:stylesheet>

@@ -1,104 +0,0 @@
#[macro_use] extern crate lazy_static;
extern crate regex;
use std::env;
use std::io;
use std::io::{Read, Write};
use std::process::exit;
use std::fs::{create_dir, read_dir, File};
use std::path::{Path, PathBuf};
use std::collections::BTreeMap;
use regex::Regex;
static PATTERNS: &'static [(&'static str, &'static str)] = &[
(r"ch(\d\d)-\d\d-.*\.md", "chapter$1.md"),
(r"appendix-(\d\d).*\.md", "appendix.md"),
];
lazy_static! {
static ref MATCHERS: Vec<(Regex, &'static str)> = {
PATTERNS.iter()
.map(|&(expr, repl)| (Regex::new(expr).unwrap(), repl))
.collect()
};
}
fn main() {
let args: Vec<String> = env::args().collect();
if args.len() < 3 {
println!("Usage: {} <src-dir> <target-dir>", args[0]);
exit(1);
}
let source_dir = ensure_dir_exists(&args[1]).unwrap();
let target_dir = ensure_dir_exists(&args[2]).unwrap();
let mut matched_files = match_files(source_dir, target_dir);
matched_files.sort();
for (target_path, source_paths) in group_by_target(matched_files) {
concat_files(source_paths, target_path).unwrap();
}
}
fn match_files(source_dir: &Path, target_dir: &Path) -> Vec<(PathBuf, PathBuf)> {
read_dir(source_dir)
.expect("Unable to read source directory")
.filter_map(|maybe_entry| maybe_entry.ok())
.filter_map(|entry| {
let source_filename = entry.file_name();
let source_filename = &source_filename.to_string_lossy().into_owned();
for &(ref regex, replacement) in MATCHERS.iter() {
if regex.is_match(source_filename) {
let target_filename = regex.replace_all(source_filename, replacement);
let source_path = entry.path();
let mut target_path = PathBuf::from(&target_dir);
target_path.push(target_filename);
return Some((source_path, target_path));
}
}
None
})
.collect()
}
fn group_by_target(matched_files: Vec<(PathBuf, PathBuf)>) -> BTreeMap<PathBuf, Vec<PathBuf>> {
let mut grouped: BTreeMap<PathBuf, Vec<PathBuf>> = BTreeMap::new();
for (source, target) in matched_files {
if let Some(source_paths) = grouped.get_mut(&target) {
source_paths.push(source);
continue;
}
let source_paths = vec![source];
grouped.insert(target.clone(), source_paths);
}
grouped
}
fn concat_files(source_paths: Vec<PathBuf>, target_path: PathBuf) -> io::Result<()> {
println!("Concatenating into {}:", target_path.to_string_lossy());
let mut target = try!(File::create(target_path));
try!(target.write_all(b"\n[TOC]\n"));
for path in source_paths {
println!(" {}", path.to_string_lossy());
let mut source = try!(File::open(path));
let mut contents: Vec<u8> = Vec::new();
try!(source.read_to_end(&mut contents));
try!(target.write_all(b"\n"));
try!(target.write_all(&contents));
try!(target.write_all(b"\n"));
}
Ok(())
}
fn ensure_dir_exists(dir_string: &str) -> io::Result<&Path> {
let path = Path::new(dir_string);
if !path.exists() {
try!(create_dir(path));
}
Ok(&path)
}
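The `group_by_target` function in the listing above does the map insertion by hand with `get_mut` and `insert`. A behaviour-preserving sketch of the same grouping using `BTreeMap`'s entry API is more concise:

```rust
use std::collections::BTreeMap;
use std::path::PathBuf;

// Group (source, target) pairs by target, preserving source order
// within each group -- same result as the manual get_mut/insert version.
fn group_by_target(matched_files: Vec<(PathBuf, PathBuf)>) -> BTreeMap<PathBuf, Vec<PathBuf>> {
    let mut grouped: BTreeMap<PathBuf, Vec<PathBuf>> = BTreeMap::new();
    for (source, target) in matched_files {
        // entry() hands back the slot for `target`, creating an empty
        // Vec the first time the key is seen.
        grouped.entry(target).or_insert_with(Vec::new).push(source);
    }
    grouped
}

fn main() {
    let files = vec![
        (PathBuf::from("ch01-01.md"), PathBuf::from("chapter01.md")),
        (PathBuf::from("ch01-02.md"), PathBuf::from("chapter01.md")),
    ];
    let grouped = group_by_target(files);
    println!("{} file(s) feed chapter01.md", grouped[&PathBuf::from("chapter01.md")].len());
}
```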

@@ -1,73 +0,0 @@
use std::io;
use std::io::{Read, Write};
fn main() {
let mut is_in_code_block = false;
let mut is_in_inline_code = false;
let mut is_in_html_tag = false;
let mut buffer = String::new();
if let Err(e) = io::stdin().read_to_string(&mut buffer) {
panic!(e);
}
for line in buffer.lines() {
if line.is_empty() {
is_in_inline_code = false;
}
if line.starts_with("```") {
is_in_code_block = !is_in_code_block;
}
if is_in_code_block {
is_in_inline_code = false;
is_in_html_tag = false;
write!(io::stdout(), "{}\n", line).unwrap();
} else {
let mut modified_line = &mut String::new();
let mut previous_char = std::char::REPLACEMENT_CHARACTER;
let mut chars_in_line = line.chars();
while let Some(possible_match) = chars_in_line.next() {
// check if inside inline code
if possible_match == '`' {
is_in_inline_code = !is_in_inline_code;
}
// check if inside html tag
if possible_match == '<' && !is_in_inline_code {
is_in_html_tag = true;
}
if possible_match == '>' && !is_in_inline_code {
is_in_html_tag = false;
}
// replace with right/left apostrophe/quote
let char_to_push =
if possible_match == '\'' && !is_in_inline_code && !is_in_html_tag {
if (previous_char != std::char::REPLACEMENT_CHARACTER &&
!previous_char.is_whitespace()) ||
previous_char == '‘'
{
'’'
} else {
'‘'
}
} else if possible_match == '"' && !is_in_inline_code && !is_in_html_tag {
if (previous_char != std::char::REPLACEMENT_CHARACTER &&
!previous_char.is_whitespace()) ||
previous_char == '“'
{
'”'
} else {
'“'
}
} else {
// leave untouched
possible_match
};
modified_line.push(char_to_push);
previous_char = char_to_push;
}
write!(io::stdout(), "{}\n", modified_line).unwrap();
}
}
}
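The core decision in the quote converter above -- opening versus closing curly quote based on the preceding character -- can be isolated as a small sketch (simplified: it ignores the inline-code and HTML-tag state tracking the full tool performs):

```rust
// Decide which curly double quote replaces a straight `"`.
// '\u{FFFD}' (the replacement character) is the sentinel the tool
// uses for "no previous character yet", i.e. the start of a line.
fn curly_double(previous: char) -> char {
    if (previous != '\u{FFFD}' && !previous.is_whitespace()) || previous == '“' {
        '”' // closing quote: follows a word character or an opening quote
    } else {
        '“' // opening quote: at line start or after whitespace
    }
}

fn main() {
    println!("{}", curly_double('\u{FFFD}')); // start of line
    println!("{}", curly_double('d'));        // end of a word
}
```

The same shape applies to single quotes, with `‘`/`’` in place of `“`/`”`.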

@@ -1,243 +0,0 @@
// We have some long regex literals, so:
// ignore-tidy-linelength
extern crate docopt;
extern crate rustc_serialize;
extern crate walkdir;
use docopt::Docopt;
use std::{path, fs, io};
use std::io::BufRead;
fn main () {
let args: Args = Docopt::new(USAGE)
.and_then(|d| d.decode())
.unwrap_or_else(|e| e.exit());
let src_dir = &path::Path::new(&args.arg_src_dir);
let found_errs = walkdir::WalkDir::new(src_dir)
.min_depth(1)
.into_iter()
.map(|entry| {
match entry {
Ok(entry) => entry,
Err(err) => {
eprintln!("{:?}", err);
std::process::exit(911)
},
}
})
.map(|entry| {
let path = entry.path();
if is_file_of_interest(path) {
let err_vec = lint_file(path);
for err in &err_vec {
match *err {
LintingError::LineOfInterest(line_num, ref line) =>
eprintln!("{}:{}\t{}", path.display(), line_num, line),
LintingError::UnableToOpenFile =>
eprintln!("Unable to open {}.", path.display()),
}
}
!err_vec.is_empty()
} else {
false
}
})
.collect::<Vec<_>>()
.iter()
.any(|result| *result);
if found_errs {
std::process::exit(1)
} else {
std::process::exit(0)
}
}
const USAGE: &'static str = "
counter
Usage:
lfp <src-dir>
lfp (-h | --help)
Options:
-h --help Show this screen.
";
#[derive(Debug, RustcDecodable)]
struct Args {
arg_src_dir: String,
}
fn lint_file(path: &path::Path) -> Vec<LintingError> {
match fs::File::open(path) {
Ok(file) => lint_lines(io::BufReader::new(&file).lines()),
Err(_) => vec![LintingError::UnableToOpenFile],
}
}
fn lint_lines<I>(lines: I) -> Vec<LintingError>
where I: Iterator<Item=io::Result<String>> {
lines
.enumerate()
.map(|(line_num, line)| {
let raw_line = line.unwrap();
if is_line_of_interest(&raw_line) {
Err(LintingError::LineOfInterest(line_num, raw_line))
} else {
Ok(())
}
})
.filter(|result| result.is_err())
.map(|result| result.unwrap_err())
.collect()
}
fn is_file_of_interest(path: &path::Path) -> bool {
path.extension()
.map_or(false, |ext| ext == "md")
}
fn is_line_of_interest(line: &str) -> bool {
!line.split_whitespace()
.filter(|sub_string|
sub_string.contains("file://") &&
!sub_string.contains("file:///projects/")
)
.collect::<Vec<_>>()
.is_empty()
}
#[derive(Debug)]
enum LintingError {
UnableToOpenFile,
LineOfInterest(usize, String)
}
#[cfg(test)]
mod tests {
use std::path;
#[test]
fn lint_file_returns_a_vec_with_errs_when_lines_of_interest_are_found() {
let string = r#"
$ cargo run
Compiling guessing_game v0.1.0 (file:///home/you/projects/guessing_game)
Running `target/guessing_game`
Guess the number!
The secret number is: 61
Please input your guess.
10
You guessed: 10
Too small!
Please input your guess.
99
You guessed: 99
Too big!
Please input your guess.
foo
Please input your guess.
61
You guessed: 61
You win!
$ cargo run
Compiling guessing_game v0.1.0 (file:///home/you/projects/guessing_game)
Running `target/debug/guessing_game`
Guess the number!
The secret number is: 7
Please input your guess.
4
You guessed: 4
$ cargo run
Running `target/debug/guessing_game`
Guess the number!
The secret number is: 83
Please input your guess.
5
$ cargo run
Compiling guessing_game v0.1.0 (file:///home/you/projects/guessing_game)
Running `target/debug/guessing_game`
Hello, world!
"#;
let raw_lines = string.to_string();
let lines = raw_lines.lines().map(|line| {
Ok(line.to_string())
});
let result_vec = super::lint_lines(lines);
assert!(!result_vec.is_empty());
assert_eq!(3, result_vec.len());
}
#[test]
fn lint_file_returns_an_empty_vec_when_no_lines_of_interest_are_found() {
let string = r#"
$ cargo run
Compiling guessing_game v0.1.0 (file:///projects/guessing_game)
Running `target/guessing_game`
Guess the number!
The secret number is: 61
Please input your guess.
10
You guessed: 10
Too small!
Please input your guess.
99
You guessed: 99
Too big!
Please input your guess.
foo
Please input your guess.
61
You guessed: 61
You win!
"#;
let raw_lines = string.to_string();
let lines = raw_lines.lines().map(|line| {
Ok(line.to_string())
});
let result_vec = super::lint_lines(lines);
assert!(result_vec.is_empty());
}
#[test]
fn is_file_of_interest_returns_false_when_the_path_is_a_directory() {
let uninteresting_fn = "src/img";
assert!(!super::is_file_of_interest(path::Path::new(uninteresting_fn)));
}
#[test]
fn is_file_of_interest_returns_false_when_the_filename_does_not_have_the_md_extension() {
let uninteresting_fn = "src/img/foo1.png";
assert!(!super::is_file_of_interest(path::Path::new(uninteresting_fn)));
}
#[test]
fn is_file_of_interest_returns_true_when_the_filename_has_the_md_extension() {
let interesting_fn = "src/ch01-00-introduction.md";
assert!(super::is_file_of_interest(path::Path::new(interesting_fn)));
}
#[test]
fn is_line_of_interest_does_not_report_a_line_if_the_line_contains_a_file_url_which_is_directly_followed_by_the_project_path() {
let sample_line = "Compiling guessing_game v0.1.0 (file:///projects/guessing_game)";
assert!(!super::is_line_of_interest(sample_line));
}
#[test]
fn is_line_of_interest_reports_a_line_if_the_line_contains_a_file_url_which_is_not_directly_followed_by_the_project_path() {
let sample_line = "Compiling guessing_game v0.1.0 (file:///home/you/projects/guessing_game)";
assert!(super::is_line_of_interest(sample_line));
}
}
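In the lfp linter above, `is_line_of_interest` collects matching substrings into a `Vec` only to test whether it is empty. A behaviour-preserving sketch using `Iterator::any` avoids the intermediate allocation:

```rust
// A line is "of interest" (i.e. a lint error) when any whitespace-separated
// token contains a file:// URL that is not rooted at /projects/.
fn is_line_of_interest(line: &str) -> bool {
    line.split_whitespace()
        .any(|s| s.contains("file://") && !s.contains("file:///projects/"))
}

fn main() {
    // A home-directory path should be flagged; the canonical path should not.
    println!("{}", is_line_of_interest("(file:///home/you/projects/guessing_game)"));
    println!("{}", is_line_of_interest("(file:///projects/guessing_game)"));
}
```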

@@ -1,406 +0,0 @@
// FIXME: we have some long lines that could be refactored, but it's not a big deal.
// ignore-tidy-linelength
extern crate regex;
use std::collections::HashMap;
use std::io;
use std::io::{Read, Write};
use regex::{Regex, Captures};
fn main() {
write_md(parse_links(parse_references(read_md())));
}
fn read_md() -> String {
let mut buffer = String::new();
match io::stdin().read_to_string(&mut buffer) {
Ok(_) => buffer,
Err(error) => panic!(error),
}
}
fn write_md(output: String) {
write!(io::stdout(), "{}", output).unwrap();
}
fn parse_references(buffer: String) -> (String, HashMap<String, String>) {
let mut ref_map = HashMap::new();
// FIXME: currently doesn't handle "title" in following line.
let re = Regex::new(r###"(?m)\n?^ {0,3}\[([^]]+)\]:[[:blank:]]*(.*)$"###).unwrap();
let output = re.replace_all(&buffer, |caps: &Captures| {
let key = caps.at(1).unwrap().to_owned().to_uppercase();
let val = caps.at(2).unwrap().to_owned();
if ref_map.insert(key, val).is_some() {
panic!("Did not expect markdown page to have duplicate reference");
}
"".to_string()
});
(output, ref_map)
}
fn parse_links((buffer, ref_map): (String, HashMap<String, String>)) -> String {
// FIXME: check which punctuation is allowed by spec.
let re = Regex::new(r###"(?:(?P<pre>(?:```(?:[^`]|`[^`])*`?\n```\n)|(?:[^[]`[^`\n]+[\n]?[^`\n]*`))|(?:\[(?P<name>[^]]+)\](?:(?:\([[:blank:]]*(?P<val>[^")]*[^ ])(?:[[:blank:]]*"[^"]*")?\))|(?:\[(?P<key>[^]]*)\]))?))"###).expect("could not create regex");
let error_code = Regex::new(r###"^E\d{4}$"###).expect("could not create regex");
let output = re.replace_all(&buffer, |caps: &Captures| {
match caps.name("pre") {
Some(pre_section) => format!("{}", pre_section.to_owned()),
None => {
let name = caps.name("name").expect("could not get name").to_owned();
// Really we should ignore text inside code blocks,
// this is a hack to not try to treat `#[derive()]`,
// `[profile]`, `[test]`, or `[E\d\d\d\d]` like a link.
if name.starts_with("derive(") ||
name.starts_with("profile") ||
name.starts_with("test") ||
error_code.is_match(&name) {
return name
}
let val = match caps.name("val") {
// `[name](link)`
Some(value) => value.to_owned(),
None => {
match caps.name("key") {
Some(key) => {
match key {
// `[name][]`
"" => format!("{}", ref_map.get(&name.to_uppercase()).expect(&format!("could not find url for the link text `{}`", name))),
// `[name][reference]`
_ => format!("{}", ref_map.get(&key.to_uppercase()).expect(&format!("could not find url for the link text `{}`", key))),
}
}
// `[name]` as reference
None => format!("{}", ref_map.get(&name.to_uppercase()).expect(&format!("could not find url for the link text `{}`", name))),
}
}
};
format!("{} at *{}*", name, val)
}
}
});
output
}
#[cfg(test)]
mod tests {
fn parse(source: String) -> String {
super::parse_links(super::parse_references(source))
}
#[test]
fn parses_inline_link() {
let source = r"This is a [link](http://google.com) that should be expanded".to_string();
let target = r"This is a link at *http://google.com* that should be expanded".to_string();
assert_eq!(parse(source), target);
}
#[test]
fn parses_multiline_links() {
let source = r"This is a [link](http://google.com) that
should appear expanded. Another [location](/here/) and [another](http://gogogo)"
.to_string();
let target = r"This is a link at *http://google.com* that
should appear expanded. Another location at */here/* and another at *http://gogogo*"
.to_string();
assert_eq!(parse(source), target);
}
#[test]
fn parses_reference() {
let source = r"This is a [link][theref].
[theref]: http://example.com/foo
more text"
.to_string();
let target = r"This is a link at *http://example.com/foo*.
more text"
.to_string();
assert_eq!(parse(source), target);
}
#[test]
fn parses_implicit_link() {
let source = r"This is an [implicit][] link.
[implicit]: /The Link/"
.to_string();
let target = r"This is an implicit at */The Link/* link.".to_string();
assert_eq!(parse(source), target);
}
#[test]
fn parses_refs_with_one_space_indentation() {
let source = r"This is a [link][ref]
[ref]: The link"
.to_string();
let target = r"This is a link at *The link*".to_string();
assert_eq!(parse(source), target);
}
#[test]
fn parses_refs_with_two_space_indentation() {
let source = r"This is a [link][ref]
[ref]: The link"
.to_string();
let target = r"This is a link at *The link*".to_string();
assert_eq!(parse(source), target);
}
#[test]
fn parses_refs_with_three_space_indentation() {
let source = r"This is a [link][ref]
[ref]: The link"
.to_string();
let target = r"This is a link at *The link*".to_string();
assert_eq!(parse(source), target);
}
#[test]
#[should_panic]
fn rejects_refs_with_four_space_indentation() {
let source = r"This is a [link][ref]
[ref]: The link"
.to_string();
let target = r"This is a link at *The link*".to_string();
assert_eq!(parse(source), target);
}
#[test]
fn ignores_optional_inline_title() {
let source = r###"This is a titled [link](http://example.com "My title")."###.to_string();
let target = r"This is a titled link at *http://example.com*.".to_string();
assert_eq!(parse(source), target);
}
#[test]
fn parses_title_with_puctuation() {
let source = r###"[link](http://example.com "It's Title")"###.to_string();
let target = r"link at *http://example.com*".to_string();
assert_eq!(parse(source), target);
}
#[test]
fn parses_name_with_punctuation() {
let source = r###"[I'm here](there)"###.to_string();
let target = r###"I'm here at *there*"###.to_string();
assert_eq!(parse(source), target);
}
#[test]
fn parses_name_with_utf8() {
let source = r###"[users forum](the users forum)"###.to_string();
let target = r###"users forum at *the users forum*"###.to_string();
assert_eq!(parse(source), target);
}
#[test]
fn parses_reference_with_punctuation() {
let source = r###"[link][the ref-ref]
[the ref-ref]:http://example.com/ref-ref"###
.to_string();
let target = r###"link at *http://example.com/ref-ref*"###.to_string();
assert_eq!(parse(source), target);
}
#[test]
fn parses_reference_case_insensitively() {
let source = r"[link][Ref]
[ref]: The reference"
.to_string();
let target = r"link at *The reference*".to_string();
assert_eq!(parse(source), target);
}
#[test]
fn parses_link_as_reference_when_reference_is_empty() {
let source = r"[link as reference][]
[link as reference]: the actual reference"
.to_string();
let target = r"link as reference at *the actual reference*".to_string();
assert_eq!(parse(source), target);
}
#[test]
fn parses_link_without_reference_as_reference() {
let source = r"[link] is alone
[link]: The contents"
.to_string();
let target = r"link at *The contents* is alone".to_string();
assert_eq!(parse(source), target);
}
#[test]
#[ignore]
fn parses_link_without_reference_as_reference_with_asterisks() {
let source = r"*[link]* is alone
[link]: The contents"
.to_string();
let target = r"*link* at *The contents* is alone".to_string();
assert_eq!(parse(source), target);
}
#[test]
fn ignores_links_in_pre_sections() {
let source = r###"```toml
[package]
name = "hello_cargo"
version = "0.1.0"
authors = ["Your Name <you@example.com>"]
[dependencies]
```
"###
.to_string();
let target = source.clone();
assert_eq!(parse(source), target);
}
#[test]
fn ignores_links_in_quoted_sections() {
let source = r###"do not change `[package]`."###.to_string();
let target = source.clone();
assert_eq!(parse(source), target);
}
#[test]
fn ignores_links_in_quoted_sections_containing_newlines() {
let source = r"do not change `this [package]
is still here` [link](ref)"
.to_string();
let target = r"do not change `this [package]
is still here` link at *ref*"
.to_string();
assert_eq!(parse(source), target);
}
#[test]
fn ignores_links_in_pre_sections_while_still_handling_links() {
let source = r###"```toml
[package]
name = "hello_cargo"
version = "0.1.0"
authors = ["Your Name <you@example.com>"]
[dependencies]
```
Another [link]
more text
[link]: http://gohere
"###
.to_string();
let target = r###"```toml
[package]
name = "hello_cargo"
version = "0.1.0"
authors = ["Your Name <you@example.com>"]
[dependencies]
```
Another link at *http://gohere*
more text
"###
.to_string();
assert_eq!(parse(source), target);
}
#[test]
fn ignores_quotes_in_pre_sections() {
let source = r###"```bash
$ cargo build
Compiling guessing_game v0.1.0 (file:///projects/guessing_game)
src/main.rs:23:21: 23:35 error: mismatched types [E0308]
src/main.rs:23 match guess.cmp(&secret_number) {
^~~~~~~~~~~~~~
src/main.rs:23:21: 23:35 help: run `rustc --explain E0308` to see a detailed explanation
src/main.rs:23:21: 23:35 note: expected type `&std::string::String`
src/main.rs:23:21: 23:35 note: found type `&_`
error: aborting due to previous error
Could not compile `guessing_game`.
```
"###
.to_string();
let target = source.clone();
assert_eq!(parse(source), target);
}
#[test]
fn ignores_short_quotes() {
let source = r"to `1` at index `[0]` i".to_string();
let target = source.clone();
assert_eq!(parse(source), target);
}
#[test]
fn ignores_pre_sections_with_final_quote() {
let source = r###"```bash
$ cargo run
Compiling points v0.1.0 (file:///projects/points)
error: the trait bound `Point: std::fmt::Display` is not satisfied [--explain E0277]
--> src/main.rs:8:29
8 |> println!("Point 1: {}", p1);
|> ^^
<std macros>:2:27: 2:58: note: in this expansion of format_args!
<std macros>:3:1: 3:54: note: in this expansion of print! (defined in <std macros>)
src/main.rs:8:5: 8:33: note: in this expansion of println! (defined in <std macros>)
note: `Point` cannot be formatted with the default formatter; try using `:?` instead if you are using a format string
note: required by `std::fmt::Display::fmt`
```
`here` is another [link](the ref)
"###.to_string();
let target = r###"```bash
$ cargo run
Compiling points v0.1.0 (file:///projects/points)
error: the trait bound `Point: std::fmt::Display` is not satisfied [--explain E0277]
--> src/main.rs:8:29
8 |> println!("Point 1: {}", p1);
|> ^^
<std macros>:2:27: 2:58: note: in this expansion of format_args!
<std macros>:3:1: 3:54: note: in this expansion of print! (defined in <std macros>)
src/main.rs:8:5: 8:33: note: in this expansion of println! (defined in <std macros>)
note: `Point` cannot be formatted with the default formatter; try using `:?` instead if you are using a format string
note: required by `std::fmt::Display::fmt`
```
`here` is another link at *the ref*
"###.to_string();
assert_eq!(parse(source), target);
}
#[test]
fn parses_adam_p_cheatsheet() {
let source = r###"[I'm an inline-style link](https://www.google.com)
[I'm an inline-style link with title](https://www.google.com "Google's Homepage")
[I'm a reference-style link][Arbitrary case-insensitive reference text]
[I'm a relative reference to a repository file](../blob/master/LICENSE)
[You can use numbers for reference-style link definitions][1]
Or leave it empty and use the [link text itself][].
URLs and URLs in angle brackets will automatically get turned into links.
http://www.example.com or <http://www.example.com> and sometimes
example.com (but not on Github, for example).
Some text to show that the reference links can follow later.
[arbitrary case-insensitive reference text]: https://www.mozilla.org
[1]: http://slashdot.org
[link text itself]: http://www.reddit.com"###
.to_string();
let target = r###"I'm an inline-style link at *https://www.google.com*
I'm an inline-style link with title at *https://www.google.com*
I'm a reference-style link at *https://www.mozilla.org*
I'm a relative reference to a repository file at *../blob/master/LICENSE*
You can use numbers for reference-style link definitions at *http://slashdot.org*
Or leave it empty and use the link text itself at *http://www.reddit.com*.
URLs and URLs in angle brackets will automatically get turned into links.
http://www.example.com or <http://www.example.com> and sometimes
example.com (but not on Github, for example).
Some text to show that the reference links can follow later.
"###
.to_string();
assert_eq!(parse(source), target);
}
}


@ -1,45 +0,0 @@
extern crate regex;
use std::collections::HashSet;
use std::io;
use std::io::{Read, Write};
use regex::{Regex, Captures};
fn main () {
let mut buffer = String::new();
if let Err(e) = io::stdin().read_to_string(&mut buffer) {
panic!(e);
}
let mut refs = HashSet::new();
// Capture all links and link references.
let regex = r"\[([^\]]+)\](?:(?:\[([^\]]+)\])|(?:\([^\)]+\)))(?i)<!-- ignore -->";
let link_regex = Regex::new(regex).unwrap();
let first_pass = link_regex.replace_all(&buffer, |caps: &Captures| {
// Save the link reference we want to delete.
if let Some(reference) = caps.at(2) {
refs.insert(reference.to_owned());
}
// Put the link title back.
caps.at(1).unwrap().to_owned()
});
// Search for the references we need to delete.
let ref_regex = Regex::new(r"\n\[([^\]]+)\]:\s.*\n").unwrap();
let out = ref_regex.replace_all(&first_pass, |caps: &Captures| {
let capture = caps.at(1).unwrap().to_owned();
// Check if we've marked this reference for deletion ...
if refs.contains(capture.as_str()) {
return "".to_string();
}
// ... else we put back everything we captured.
caps.at(0).unwrap().to_owned()
});
write!(io::stdout(), "{}", out).unwrap();
}


@ -1,52 +0,0 @@
extern crate regex;
use std::io;
use std::io::{Read, Write};
use regex::{Regex, Captures};
fn main() {
write_md(remove_markup(read_md()));
}
fn read_md() -> String {
let mut buffer = String::new();
match io::stdin().read_to_string(&mut buffer) {
Ok(_) => buffer,
Err(error) => panic!(error),
}
}
fn write_md(output: String) {
write!(io::stdout(), "{}", output).unwrap();
}
fn remove_markup(input: String) -> String {
let filename_regex = Regex::new(r#"\A<span class="filename">(.*)</span>\z"#).unwrap();
// Captions sometimes take up multiple lines.
let caption_start_regex = Regex::new(r#"\A<span class="caption">(.*)\z"#).unwrap();
let caption_end_regex = Regex::new(r#"(.*)</span>\z"#).unwrap();
let regexen = vec![filename_regex, caption_start_regex, caption_end_regex];
let lines: Vec<_> = input.lines().flat_map(|line| {
// Remove our figure and caption markup.
if line == "<figure>" ||
line == "<figcaption>" ||
line == "</figcaption>" ||
line == "</figure>"
{
None
// Remove our syntax highlighting and rustdoc markers.
} else if line.starts_with("```") {
Some(String::from("```"))
// Remove the span around filenames and captions.
} else {
let result = regexen.iter().fold(line.to_string(), |result, regex| {
regex.replace_all(&result, |caps: &Captures| {
caps.at(1).unwrap().to_owned()
})
});
Some(result)
}
}).collect();
lines.join("\n")
}


@ -9,6 +9,10 @@ edition = "2018"
 name = "concat_chapters"
 path = "tools/src/bin/concat_chapters.rs"
+[[bin]]
+name = "convert_quotes"
+path = "tools/src/bin/convert_quotes.rs"
 [[bin]]
 name = "lfp"
 path = "tools/src/bin/lfp.rs"
@ -25,10 +29,6 @@ path = "tools/src/bin/remove_links.rs"
 name = "remove_markup"
 path = "tools/src/bin/remove_markup.rs"
-[[bin]]
-name = "convert_quotes"
-path = "tools/src/bin/convert_quotes.rs"
 [dependencies]
 walkdir = "0.1.5"
 docopt = "0.6.82"


@ -4,7 +4,7 @@
 This repository contains the source of "The Rust Programming Language" book.
-[The book is available in dead-tree form from No Starch Press][nostarch]
+[The book is available in dead-tree form from No Starch Press][nostarch].
 [nostarch]: https://nostarch.com/rust
@ -17,6 +17,10 @@ releases are updated less frequently.
 [beta]: https://doc.rust-lang.org/beta/book/
 [nightly]: https://doc.rust-lang.org/nightly/book/
+See the [releases] to download just the code of all the code listings that appear in the book.
+[releases]: https://github.com/rust-lang/book/releases
 ## Requirements
 Building the book requires [mdBook], ideally the same 0.3.x version that


@ -1,32 +0,0 @@
Welcome to The Rust Programming Language book! This version of the text assumes
you are using Rust 1.31.0 or later, with `edition="2018"` in *Cargo.toml* of
all projects to use Rust 2018 Edition idioms. See the “Installation” section of
Chapter 1 to install or update Rust, and see the new Appendix E for information
on what editions of Rust are.
The 2018 Edition of the Rust language includes a number of improvements to make
Rust more ergonomic and easier to learn. This printing of the book has a number
of changes to reflect the improvements:
- Chapter 7, "Managing Growing Projects with Packages, Crates, and Modules",
has been mostly rewritten. The module system and the way paths work in the
2018 Edition have been made more consistent.
- Chapter 10 has new sections titled "Traits as Parameters" and "Returning
Types that Implement Traits" that explain the new `impl Trait` syntax.
- Chapter 11 has a new section "Using `Result<T, E>` in Tests" that shows how
to write tests that can use the `?` operator.
- The "Advanced Lifetimes" section of Chapter 19 has been removed as compiler
improvements have made the constructs in that section even more rare.
- The previous Appendix D on macros has been expanded to include procedural
macros, and has been moved to the "Macros" section in Chapter 19.
- Appendix A, "Keywords", also explains the new raw identifiers feature that
enables code written in Rust 2015 and Rust 2018 to interoperate.
- Appendix D now covers useful development tools that have been recently
released.
- We fixed a number of small errors and imprecise wording throughout the book.
Thank you to the readers who reported them!
Note that any code in the first printing of *The Rust Programming Language*
that compiled will continue to compile without `edition="2018"` in the
project's *Cargo.toml*, even as you update the version of the Rust compiler
that you're using. That's Rust's backwards compatibility guarantees at work!


@ -1,142 +0,0 @@
[[package]]
name = "aho-corasick"
version = "0.5.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"memchr 0.1.11 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "docopt"
version = "0.6.86"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"lazy_static 0.2.10 (registry+https://github.com/rust-lang/crates.io-index)",
"regex 0.1.80 (registry+https://github.com/rust-lang/crates.io-index)",
"rustc-serialize 0.3.24 (registry+https://github.com/rust-lang/crates.io-index)",
"strsim 0.5.2 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "kernel32-sys"
version = "0.2.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"winapi 0.2.8 (registry+https://github.com/rust-lang/crates.io-index)",
"winapi-build 0.1.1 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "lazy_static"
version = "0.2.10"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "libc"
version = "0.2.33"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "memchr"
version = "0.1.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"libc 0.2.33 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "regex"
version = "0.1.80"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"aho-corasick 0.5.3 (registry+https://github.com/rust-lang/crates.io-index)",
"memchr 0.1.11 (registry+https://github.com/rust-lang/crates.io-index)",
"regex-syntax 0.3.9 (registry+https://github.com/rust-lang/crates.io-index)",
"thread_local 0.2.7 (registry+https://github.com/rust-lang/crates.io-index)",
"utf8-ranges 0.1.3 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "regex-syntax"
version = "0.3.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "rust-book"
version = "0.0.1"
dependencies = [
"docopt 0.6.86 (registry+https://github.com/rust-lang/crates.io-index)",
"lazy_static 0.2.10 (registry+https://github.com/rust-lang/crates.io-index)",
"regex 0.1.80 (registry+https://github.com/rust-lang/crates.io-index)",
"rustc-serialize 0.3.24 (registry+https://github.com/rust-lang/crates.io-index)",
"walkdir 0.1.8 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "rustc-serialize"
version = "0.3.24"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "strsim"
version = "0.5.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "thread-id"
version = "2.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"kernel32-sys 0.2.2 (registry+https://github.com/rust-lang/crates.io-index)",
"libc 0.2.33 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "thread_local"
version = "0.2.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"thread-id 2.0.0 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "utf8-ranges"
version = "0.1.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "walkdir"
version = "0.1.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"kernel32-sys 0.2.2 (registry+https://github.com/rust-lang/crates.io-index)",
"winapi 0.2.8 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "winapi"
version = "0.2.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "winapi-build"
version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
[metadata]
"checksum aho-corasick 0.5.3 (registry+https://github.com/rust-lang/crates.io-index)" = "ca972c2ea5f742bfce5687b9aef75506a764f61d37f8f649047846a9686ddb66"
"checksum docopt 0.6.86 (registry+https://github.com/rust-lang/crates.io-index)" = "4a7ef30445607f6fc8720f0a0a2c7442284b629cf0d049286860fae23e71c4d9"
"checksum kernel32-sys 0.2.2 (registry+https://github.com/rust-lang/crates.io-index)" = "7507624b29483431c0ba2d82aece8ca6cdba9382bff4ddd0f7490560c056098d"
"checksum lazy_static 0.2.10 (registry+https://github.com/rust-lang/crates.io-index)" = "236eb37a62591d4a41a89b7763d7de3e06ca02d5ab2815446a8bae5d2f8c2d57"
"checksum libc 0.2.33 (registry+https://github.com/rust-lang/crates.io-index)" = "5ba3df4dcb460b9dfbd070d41c94c19209620c191b0340b929ce748a2bcd42d2"
"checksum memchr 0.1.11 (registry+https://github.com/rust-lang/crates.io-index)" = "d8b629fb514376c675b98c1421e80b151d3817ac42d7c667717d282761418d20"
"checksum regex 0.1.80 (registry+https://github.com/rust-lang/crates.io-index)" = "4fd4ace6a8cf7860714a2c2280d6c1f7e6a413486c13298bbc86fd3da019402f"
"checksum regex-syntax 0.3.9 (registry+https://github.com/rust-lang/crates.io-index)" = "f9ec002c35e86791825ed294b50008eea9ddfc8def4420124fbc6b08db834957"
"checksum rustc-serialize 0.3.24 (registry+https://github.com/rust-lang/crates.io-index)" = "dcf128d1287d2ea9d80910b5f1120d0b8eede3fbf1abe91c40d39ea7d51e6fda"
"checksum strsim 0.5.2 (registry+https://github.com/rust-lang/crates.io-index)" = "67f84c44fbb2f91db7fef94554e6b2ac05909c9c0b0bc23bb98d3a1aebfe7f7c"
"checksum thread-id 2.0.0 (registry+https://github.com/rust-lang/crates.io-index)" = "a9539db560102d1cef46b8b78ce737ff0bb64e7e18d35b2a5688f7d097d0ff03"
"checksum thread_local 0.2.7 (registry+https://github.com/rust-lang/crates.io-index)" = "8576dbbfcaef9641452d5cf0df9b0e7eeab7694956dd33bb61515fb8f18cfdd5"
"checksum utf8-ranges 0.1.3 (registry+https://github.com/rust-lang/crates.io-index)" = "a1ca13c08c41c9c3e04224ed9ff80461d97e121589ff27c753a16cb10830ae0f"
"checksum walkdir 0.1.8 (registry+https://github.com/rust-lang/crates.io-index)" = "c66c0b9792f0a765345452775f3adbd28dde9d33f30d13e5dcc5ae17cf6f3780"
"checksum winapi 0.2.8 (registry+https://github.com/rust-lang/crates.io-index)" = "167dc9d6949a9b857f3451275e911c3f44255842c1f7a76f33c55103a909087a"
"checksum winapi-build 0.1.1 (registry+https://github.com/rust-lang/crates.io-index)" = "2d315eee3b34aca4797b2da6b13ed88266e6d612562a0c46390af8299fc699bc"


@ -1,36 +0,0 @@
[package]
name = "rust-book"
version = "0.0.1"
authors = ["Steve Klabnik <steve@steveklabnik.com>"]
description = "The Rust Book"
[[bin]]
name = "concat_chapters"
path = "tools/src/bin/concat_chapters.rs"
[[bin]]
name = "lfp"
path = "tools/src/bin/lfp.rs"
[[bin]]
name = "link2print"
path = "tools/src/bin/link2print.rs"
[[bin]]
name = "remove_links"
path = "tools/src/bin/remove_links.rs"
[[bin]]
name = "remove_markup"
path = "tools/src/bin/remove_markup.rs"
[[bin]]
name = "convert_quotes"
path = "tools/src/bin/convert_quotes.rs"
[dependencies]
walkdir = "0.1.5"
docopt = "0.6.82"
rustc-serialize = "0.3.19"
regex = "0.1.73"
lazy_static = "0.2.1"


@ -1,201 +0,0 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright 2010-2017 The Rust Project Developers
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.


@ -1,25 +0,0 @@
Copyright (c) 2010-2017 The Rust Project Developers
Permission is hereby granted, free of charge, to any
person obtaining a copy of this software and associated
documentation files (the "Software"), to deal in the
Software without restriction, including without
limitation the rights to use, copy, modify, merge,
publish, distribute, sublicense, and/or sell copies of
the Software, and to permit persons to whom the Software
is furnished to do so, subject to the following
conditions:
The above copyright notice and this permission notice
shall be included in all copies or substantial portions
of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF
ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED
TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A
PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT
SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR
IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
DEALINGS IN THE SOFTWARE.


@ -1,13 +0,0 @@
#!/bin/bash
set -eu
dir=$1
mkdir -p "tmp/$dir"
for f in $dir/*.md
do
cat "$f" | cargo run --bin convert_quotes > "tmp/$f"
mv "tmp/$f" "$f"
done


@ -1,20 +0,0 @@
#!/bin/bash
set -eu
# Get all the docx files in the tmp dir.
ls tmp/*.docx | \
# Extract just the filename so we can reuse it easily.
xargs -n 1 basename -s .docx | \
while IFS= read -r filename; do
# Make a directory to put the XML in.
mkdir -p "tmp/$filename"
# Unzip the docx to get at the XML.
unzip -o "tmp/$filename.docx" -d "tmp/$filename"
# Convert to markdown with XSL.
xsltproc tools/docx-to-md.xsl "tmp/$filename/word/document.xml" | \
# Hard wrap at 80 chars at word boundaries.
fold -w 80 -s | \
# Remove trailing whitespace and save in the `nostarch` dir for comparison.
sed -e "s/ *$//" > "nostarch/$filename.md"
done


@ -1,24 +0,0 @@
#!/bin/bash
set -eu
cargo build --release
mkdir -p tmp
rm -rf tmp/*.md
# Get all the Markdown files in the src dir,
ls src/${1:-""}*.md | \
# except for `SUMMARY.md`.
grep -v SUMMARY.md | \
# Extract just the filename so we can reuse it easily.
xargs -n 1 basename | \
# Remove all links followed by `<!-- ignore -->`, then
# Change all remaining links from Markdown to italicized inline text.
while IFS= read -r filename; do
< "src/$filename" ./target/release/remove_links \
| ./target/release/link2print \
| ./target/release/remove_markup > "tmp/$filename"
done
# Concatenate the files into the `nostarch` dir.
./target/release/concat_chapters tmp nostarch


@ -1,34 +0,0 @@
# Style Guide
## Prose
* Prefer title case for chapter/section headings, ex: `## Generating a Secret
Number` rather than `## Generating a secret number`.
* Prefer italics over single quotes when calling out a term, ex: `is an
*associated function* of` rather than `is an associated function of`.
* When talking about a method in prose, DO NOT include the parentheses, ex:
`read_line` rather than `read_line()`.
* Hard wrap at 80 chars
* Prefer not mixing code and not-code in one word, ex: ``Remember when we wrote
`use std::io`?`` rather than ``Remember when we `use`d `std::io`?``
## Code
* Add the file name before markdown blocks to make it clear which file we're
talking about, when applicable.
* When making changes to code, make it clear which parts of the code changed
and which stayed the same... not sure how to do this yet
* Split up long lines as appropriate to keep them under 80 chars if possible
* Use `bash` syntax highlighting for command line output code blocks
## Links
Once all the scripts are done:
* If a link shouldn't be printed, mark it to be ignored
* This includes all "Chapter XX" intra-book links, which *should* be links
for the HTML version
* Make intra-book links and stdlib API doc links relative so they work whether
the book is read offline or on docs.rust-lang.org
* Use markdown links and keep in mind that they will be changed into `text at
*url*` in print, so word them in a way that it reads well in that format
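The `text at *url*` print transformation that this guideline (and the `link2print` tool above) relies on can be sketched roughly as follows. This is a minimal stdlib-only illustration of the rewrite rule, not the actual tool, which uses the `regex` crate and also handles reference-style links; the function name is hypothetical.

```rust
// Sketch: rewrite inline Markdown links `[text](url)` as `text at *url*`,
// the form used for the printed book. Assumes URLs contain no `)`.
fn inline_link_to_print(line: &str) -> String {
    let mut out = String::new();
    let mut rest = line;
    while let Some(open) = rest.find('[') {
        // Look for the `](` separator and the closing `)` after it.
        if let Some(mid) = rest[open..].find("](") {
            let mid = open + mid;
            if let Some(close) = rest[mid..].find(')') {
                let close = mid + close;
                out.push_str(&rest[..open]);        // text before the link
                out.push_str(&rest[open + 1..mid]); // link text
                out.push_str(" at *");
                out.push_str(&rest[mid + 2..close]); // URL
                out.push('*');
                rest = &rest[close + 1..];
                continue;
            }
        }
        break; // `[` without a complete link: leave the remainder untouched
    }
    out.push_str(rest);
    out
}

fn main() {
    let s = inline_link_to_print("[I'm an inline-style link](https://www.google.com)");
    println!("{}", s); // I'm an inline-style link at *https://www.google.com*
}
```

This matches the fixtures in the parser tests above, e.g. an inline link becoming `I'm an inline-style link at *https://www.google.com*`.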


@ -1,218 +0,0 @@
<?xml version="1.0"?>
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform" xmlns:o="urn:schemas-microsoft-com:office:office" xmlns:r="http://schemas.openxmlformats.org/officeDocument/2006/relationships" xmlns:v="urn:schemas-microsoft-com:vml" xmlns:w="http://schemas.openxmlformats.org/wordprocessingml/2006/main" xmlns:w10="urn:schemas-microsoft-com:office:word" xmlns:wp="http://schemas.openxmlformats.org/drawingml/2006/wordprocessingDrawing" xmlns:wps="http://schemas.microsoft.com/office/word/2010/wordprocessingShape" xmlns:wpg="http://schemas.microsoft.com/office/word/2010/wordprocessingGroup" xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006" xmlns:wp14="http://schemas.microsoft.com/office/word/2010/wordprocessingDrawing" xmlns:w14="http://schemas.microsoft.com/office/word/2010/wordml">
<xsl:output method="text" />
<xsl:template match="/">
<xsl:apply-templates select="/w:document/w:body/*" />
</xsl:template>
<!-- Ignore these -->
<xsl:template match="w:p[starts-with(w:pPr/w:pStyle/@w:val, 'TOC')]" />
<xsl:template match="w:p[starts-with(w:pPr/w:pStyle/@w:val, 'Contents1')]" />
<xsl:template match="w:p[starts-with(w:pPr/w:pStyle/@w:val, 'Contents2')]" />
<xsl:template match="w:p[starts-with(w:pPr/w:pStyle/@w:val, 'Contents3')]" />
<xsl:template match="w:p[w:pPr/w:pStyle/@w:val = 'ChapterStart']" />
<xsl:template match="w:p[w:pPr/w:pStyle/@w:val = 'Normal']" />
<xsl:template match="w:p[w:pPr/w:pStyle/@w:val = 'Standard']" />
<xsl:template match="w:p[w:pPr/w:pStyle/@w:val = 'AuthorQuery']" />
<!-- Paragraph styles -->
<xsl:template match="w:p[w:pPr/w:pStyle/@w:val = 'ChapterTitle']">
<xsl:text>&#10;[TOC]&#10;&#10;</xsl:text>
<xsl:text># </xsl:text>
<xsl:apply-templates select="*" />
<xsl:text>&#10;&#10;</xsl:text>
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle/@w:val = 'HeadA']">
<xsl:text>## </xsl:text>
<xsl:apply-templates select="*" />
<xsl:text>&#10;&#10;</xsl:text>
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle/@w:val = 'HeadB']">
<xsl:text>### </xsl:text>
<xsl:apply-templates select="*" />
<xsl:text>&#10;&#10;</xsl:text>
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle/@w:val = 'HeadC']">
<xsl:text>#### </xsl:text>
<xsl:apply-templates select="*" />
<xsl:text>&#10;&#10;</xsl:text>
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle/@w:val = 'HeadBox']">
<xsl:text>### </xsl:text>
<xsl:apply-templates select="*" />
<xsl:text>&#10;&#10;</xsl:text>
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle[@w:val = 'NumListA' or @w:val = 'NumListB']]">
<xsl:text>1. </xsl:text>
<xsl:apply-templates select="*" />
<xsl:text>&#10;</xsl:text>
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle[@w:val = 'NumListC']]">
<xsl:text>1. </xsl:text>
<xsl:apply-templates select="*" />
<xsl:text>&#10;&#10;</xsl:text>
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle[@w:val = 'BulletA' or @w:val = 'BulletB' or @w:val = 'ListPlainA' or @w:val = 'ListPlainB']]">
<xsl:text>* </xsl:text>
<xsl:apply-templates select="*" />
<xsl:text>&#10;</xsl:text>
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle[@w:val = 'BulletC' or @w:val = 'ListPlainC']]">
<xsl:text>* </xsl:text>
<xsl:apply-templates select="*" />
<xsl:text>&#10;&#10;</xsl:text>
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle[@w:val = 'SubBullet']]">
<xsl:text> * </xsl:text>
<xsl:apply-templates select="*" />
<xsl:text>&#10;</xsl:text>
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle[@w:val = 'BodyFirst' or @w:val = 'Body' or @w:val = 'BodyFirstBox' or @w:val = 'BodyBox' or @w:val = '1stPara']]">
<xsl:if test=".//w:t">
<xsl:apply-templates select="*" />
<xsl:text>&#10;&#10;</xsl:text>
</xsl:if>
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle[@w:val = 'CodeA' or @w:val = 'CodeAWingding']]">
<xsl:text>```&#10;</xsl:text>
<!-- Don't apply Emphasis/etc templates in code blocks -->
<xsl:for-each select="w:r">
<xsl:value-of select="w:t" />
</xsl:for-each>
<xsl:text>&#10;</xsl:text>
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle[@w:val = 'CodeB' or @w:val = 'CodeBWingding']]">
<!-- Don't apply Emphasis/etc templates in code blocks -->
<xsl:for-each select="w:r">
<xsl:value-of select="w:t" />
</xsl:for-each>
<xsl:text>&#10;</xsl:text>
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle[@w:val = 'CodeC' or @w:val = 'CodeCWingding']]">
<!-- Don't apply Emphasis/etc templates in code blocks -->
<xsl:for-each select="w:r">
<xsl:value-of select="w:t" />
</xsl:for-each>
<xsl:text>&#10;```&#10;&#10;</xsl:text>
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle/@w:val = 'CodeSingle']">
<xsl:text>```&#10;</xsl:text>
<xsl:apply-templates select="*" />
<xsl:text>&#10;```&#10;&#10;</xsl:text>
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle/@w:val = 'ProductionDirective']">
<xsl:apply-templates select="*" />
<xsl:text>&#10;&#10;</xsl:text>
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle[@w:val = 'Caption' or @w:val = 'TableTitle' or @w:val = 'Caption1' or @w:val = 'Listing']]">
<xsl:apply-templates select="*" />
<xsl:text>&#10;&#10;</xsl:text>
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle[@w:val = 'BlockQuote']]">
<xsl:text>> </xsl:text>
<xsl:apply-templates select="*" />
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle[@w:val = 'BlockText']]">
<xsl:text>&#10;</xsl:text>
<xsl:text>> </xsl:text>
<xsl:apply-templates select="*" />
<xsl:text>&#10;&#10;</xsl:text>
</xsl:template>
<xsl:template match="w:p[w:pPr/w:pStyle/@w:val = 'Note']">
<xsl:text>> </xsl:text>
<xsl:apply-templates select="*" />
<xsl:text>&#10;&#10;</xsl:text>
</xsl:template>
<xsl:template match="w:p">
Unmatched: <xsl:value-of select="w:pPr/w:pStyle/@w:val" />
<xsl:text>
</xsl:text>
</xsl:template>
<!-- Character styles -->
<xsl:template match="w:r[w:rPr/w:rStyle[@w:val = 'Literal' or @w:val = 'LiteralBold' or @w:val = 'LiteralCaption' or @w:val = 'LiteralBox']]">
<xsl:choose>
<xsl:when test="normalize-space(w:t) != ''">
<xsl:if test="starts-with(w:t, ' ')">
<xsl:text> </xsl:text>
</xsl:if>
<xsl:text>`</xsl:text>
<xsl:value-of select="normalize-space(w:t)" />
<xsl:text>`</xsl:text>
<xsl:if test="substring(w:t, string-length(w:t)) = ' '">
<xsl:text> </xsl:text>
</xsl:if>
</xsl:when>
<xsl:when test="normalize-space(w:t) != w:t and w:t != ''">
<xsl:text> </xsl:text>
</xsl:when>
</xsl:choose>
</xsl:template>
<xsl:template match="w:r[w:rPr/w:rStyle[@w:val = 'EmphasisBold']]">
<xsl:choose>
<xsl:when test="normalize-space(w:t) != ''">
<xsl:if test="starts-with(w:t, ' ')">
<xsl:text> </xsl:text>
</xsl:if>
<xsl:text>**</xsl:text>
<xsl:value-of select="normalize-space(w:t)" />
<xsl:text>**</xsl:text>
<xsl:if test="substring(w:t, string-length(w:t)) = ' '">
<xsl:text> </xsl:text>
</xsl:if>
</xsl:when>
<xsl:when test="normalize-space(w:t) != w:t and w:t != ''">
<xsl:text> </xsl:text>
</xsl:when>
</xsl:choose>
</xsl:template>
<xsl:template match="w:r[w:rPr/w:rStyle[@w:val = 'EmphasisItalic' or @w:val = 'EmphasisItalicBox' or @w:val = 'EmphasisNote' or @w:val = 'EmphasisRevCaption' or @w:val = 'EmphasisRevItal']]">
<xsl:choose>
<xsl:when test="normalize-space(w:t) != ''">
<xsl:if test="starts-with(w:t, ' ')">
<xsl:text> </xsl:text>
</xsl:if>
<xsl:text>*</xsl:text>
<xsl:value-of select="normalize-space(w:t)" />
<xsl:text>*</xsl:text>
<xsl:if test="substring(w:t, string-length(w:t)) = ' '">
<xsl:text> </xsl:text>
</xsl:if>
</xsl:when>
<xsl:otherwise>
<xsl:text> </xsl:text>
</xsl:otherwise>
</xsl:choose>
</xsl:template>
<xsl:template match="w:r">
<xsl:value-of select="w:t" />
</xsl:template>
</xsl:stylesheet>

@@ -1,104 +0,0 @@
#[macro_use] extern crate lazy_static;
extern crate regex;
use std::env;
use std::io;
use std::io::{Read, Write};
use std::process::exit;
use std::fs::{create_dir, read_dir, File};
use std::path::{Path, PathBuf};
use std::collections::BTreeMap;
use regex::Regex;
static PATTERNS: &'static [(&'static str, &'static str)] = &[
(r"ch(\d\d)-\d\d-.*\.md", "chapter$1.md"),
(r"appendix-(\d\d).*\.md", "appendix.md"),
];
lazy_static! {
static ref MATCHERS: Vec<(Regex, &'static str)> = {
PATTERNS.iter()
.map(|&(expr, repl)| (Regex::new(expr).unwrap(), repl))
.collect()
};
}
fn main() {
let args: Vec<String> = env::args().collect();
if args.len() < 3 {
println!("Usage: {} <src-dir> <target-dir>", args[0]);
exit(1);
}
let source_dir = ensure_dir_exists(&args[1]).unwrap();
let target_dir = ensure_dir_exists(&args[2]).unwrap();
let mut matched_files = match_files(source_dir, target_dir);
matched_files.sort();
for (target_path, source_paths) in group_by_target(matched_files) {
concat_files(source_paths, target_path).unwrap();
}
}
fn match_files(source_dir: &Path, target_dir: &Path) -> Vec<(PathBuf, PathBuf)> {
read_dir(source_dir)
.expect("Unable to read source directory")
.filter_map(|maybe_entry| maybe_entry.ok())
.filter_map(|entry| {
let source_filename = entry.file_name();
let source_filename = &source_filename.to_string_lossy().into_owned();
for &(ref regex, replacement) in MATCHERS.iter() {
if regex.is_match(source_filename) {
let target_filename = regex.replace_all(source_filename, replacement);
let source_path = entry.path();
let mut target_path = PathBuf::from(&target_dir);
target_path.push(target_filename);
return Some((source_path, target_path));
}
}
None
})
.collect()
}
fn group_by_target(matched_files: Vec<(PathBuf, PathBuf)>) -> BTreeMap<PathBuf, Vec<PathBuf>> {
let mut grouped: BTreeMap<PathBuf, Vec<PathBuf>> = BTreeMap::new();
for (source, target) in matched_files {
if let Some(source_paths) = grouped.get_mut(&target) {
source_paths.push(source);
continue;
}
let source_paths = vec![source];
grouped.insert(target.clone(), source_paths);
}
grouped
}
fn concat_files(source_paths: Vec<PathBuf>, target_path: PathBuf) -> io::Result<()> {
println!("Concatenating into {}:", target_path.to_string_lossy());
let mut target = try!(File::create(target_path));
try!(target.write_all(b"\n[TOC]\n"));
for path in source_paths {
println!(" {}", path.to_string_lossy());
let mut source = try!(File::open(path));
let mut contents: Vec<u8> = Vec::new();
try!(source.read_to_end(&mut contents));
try!(target.write_all(b"\n"));
try!(target.write_all(&contents));
try!(target.write_all(b"\n"));
}
Ok(())
}
fn ensure_dir_exists(dir_string: &str) -> io::Result<&Path> {
let path = Path::new(dir_string);
if !path.exists() {
try!(create_dir(path));
}
Ok(&path)
}
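The `group_by_target` helper above predates the common `entry` idiom; the same grouping can be expressed without the separate `get_mut`/`insert` branches. A std-only sketch (the filenames are illustrative, not taken from the tool):

```rust
use std::collections::BTreeMap;

// Group (source, target) pairs by target, keeping targets sorted.
// Behaviorally equivalent to the get_mut/insert version above.
fn group_by_target(pairs: Vec<(String, String)>) -> BTreeMap<String, Vec<String>> {
    let mut grouped: BTreeMap<String, Vec<String>> = BTreeMap::new();
    for (source, target) in pairs {
        grouped.entry(target).or_insert_with(Vec::new).push(source);
    }
    grouped
}

fn main() {
    let pairs = vec![
        ("ch01-01-intro.md".to_string(), "chapter01.md".to_string()),
        ("ch01-02-more.md".to_string(), "chapter01.md".to_string()),
        ("ch02-01-start.md".to_string(), "chapter02.md".to_string()),
    ];
    let grouped = group_by_target(pairs);
    assert_eq!(grouped["chapter01.md"].len(), 2);
    assert_eq!(grouped["chapter02.md"], vec!["ch02-01-start.md".to_string()]);
}
```

`BTreeMap::entry` returns a handle to the slot whether or not it is occupied, so one call replaces the lookup-then-insert dance.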

@@ -1,73 +0,0 @@
use std::io;
use std::io::{Read, Write};
fn main() {
let mut is_in_code_block = false;
let mut is_in_inline_code = false;
let mut is_in_html_tag = false;
let mut buffer = String::new();
if let Err(e) = io::stdin().read_to_string(&mut buffer) {
panic!(e);
}
for line in buffer.lines() {
if line.is_empty() {
is_in_inline_code = false;
}
if line.starts_with("```") {
is_in_code_block = !is_in_code_block;
}
if is_in_code_block {
is_in_inline_code = false;
is_in_html_tag = false;
write!(io::stdout(), "{}\n", line).unwrap();
} else {
let mut modified_line = &mut String::new();
let mut previous_char = std::char::REPLACEMENT_CHARACTER;
let mut chars_in_line = line.chars();
while let Some(possible_match) = chars_in_line.next() {
// Check if inside inline code.
if possible_match == '`' {
is_in_inline_code = !is_in_inline_code;
}
// Check if inside HTML tag.
if possible_match == '<' && !is_in_inline_code {
is_in_html_tag = true;
}
if possible_match == '>' && !is_in_inline_code {
is_in_html_tag = false;
}
// Replace with right/left apostrophe/quote.
let char_to_push =
if possible_match == '\'' && !is_in_inline_code && !is_in_html_tag {
if (previous_char != std::char::REPLACEMENT_CHARACTER &&
!previous_char.is_whitespace()) ||
previous_char == '‘'
{
'’'
} else {
'‘'
}
} else if possible_match == '"' && !is_in_inline_code && !is_in_html_tag {
if (previous_char != std::char::REPLACEMENT_CHARACTER &&
!previous_char.is_whitespace()) ||
previous_char == '“'
{
'”'
} else {
'“'
}
} else {
// Leave untouched.
possible_match
};
modified_line.push(char_to_push);
previous_char = char_to_push;
}
write!(io::stdout(), "{}\n", modified_line).unwrap();
}
}
}
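The quote-direction rule embedded in the loop above — open after whitespace or at the start of a line, close otherwise — can be factored into a small pure function. A hedged std-only sketch (the function name is ours, not the tool's):

```rust
// Decide whether a straight double quote should curl open or closed,
// based on the character that precedes it (None at the start of a line).
fn curl_double_quote(previous_char: Option<char>) -> char {
    match previous_char {
        // After a non-whitespace character, close the quotation.
        Some(c) if !c.is_whitespace() => '”',
        // At line start or after whitespace, open it.
        _ => '“',
    }
}

fn main() {
    assert_eq!(curl_double_quote(None), '“'); // start of line
    assert_eq!(curl_double_quote(Some(' ')), '“'); // after a space
    assert_eq!(curl_double_quote(Some('d')), '”'); // end of a word
}
```

The tool additionally tracks inline-code and HTML-tag state before applying this rule, which the sketch deliberately leaves out.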

@@ -1,242 +0,0 @@
// We have some long regex literals, so:
// ignore-tidy-linelength
extern crate rustc_serialize;
extern crate docopt;
use docopt::Docopt;
extern crate walkdir;
use std::{path, fs, io};
use std::io::{BufRead, Write};
fn main () {
let args: Args = Docopt::new(USAGE)
.and_then(|d| d.decode())
.unwrap_or_else(|e| e.exit());
let src_dir = &path::Path::new(&args.arg_src_dir);
let found_errs = walkdir::WalkDir::new(src_dir)
.min_depth(1)
.into_iter()
.map(|entry| {
match entry {
Ok(entry) => entry,
Err(err) => {
eprintln!("{:?}", err);
std::process::exit(911)
},
}
})
.map(|entry| {
let path = entry.path();
if is_file_of_interest(path) {
let err_vec = lint_file(path);
for err in &err_vec {
match *err {
LintingError::LineOfInterest(line_num, ref line) =>
eprintln!("{}:{}\t{}", path.display(), line_num, line),
LintingError::UnableToOpenFile =>
eprintln!("Unable to open {}.", path.display()),
}
}
!err_vec.is_empty()
} else {
false
}
})
.collect::<Vec<_>>()
.iter()
.any(|result| *result);
if found_errs {
std::process::exit(1)
} else {
std::process::exit(0)
}
}
const USAGE: &'static str = "
counter
Usage:
lfp <src-dir>
lfp (-h | --help)
Options:
-h --help Show this screen.
";
#[derive(Debug, RustcDecodable)]
struct Args {
arg_src_dir: String,
}
fn lint_file(path: &path::Path) -> Vec<LintingError> {
match fs::File::open(path) {
Ok(file) => lint_lines(io::BufReader::new(&file).lines()),
Err(_) => vec![LintingError::UnableToOpenFile],
}
}
fn lint_lines<I>(lines: I) -> Vec<LintingError>
where I: Iterator<Item=io::Result<String>> {
lines
.enumerate()
.map(|(line_num, line)| {
let raw_line = line.unwrap();
if is_line_of_interest(&raw_line) {
Err(LintingError::LineOfInterest(line_num, raw_line))
} else {
Ok(())
}
})
.filter(|result| result.is_err())
.map(|result| result.unwrap_err())
.collect()
}
fn is_file_of_interest(path: &path::Path) -> bool {
path.extension()
.map_or(false, |ext| ext == "md")
}
fn is_line_of_interest(line: &str) -> bool {
!line.split_whitespace()
.filter(|sub_string|
sub_string.contains("file://") &&
!sub_string.contains("file:///projects/")
)
.collect::<Vec<_>>()
.is_empty()
}
#[derive(Debug)]
enum LintingError {
UnableToOpenFile,
LineOfInterest(usize, String)
}
#[cfg(test)]
mod tests {
use std::path;
#[test]
fn lint_file_returns_a_vec_with_errs_when_lines_of_interest_are_found() {
let string = r#"
$ cargo run
Compiling guessing_game v0.1.0 (file:///home/you/projects/guessing_game)
Running `target/guessing_game`
Guess the number!
The secret number is: 61
Please input your guess.
10
You guessed: 10
Too small!
Please input your guess.
99
You guessed: 99
Too big!
Please input your guess.
foo
Please input your guess.
61
You guessed: 61
You win!
$ cargo run
Compiling guessing_game v0.1.0 (file:///home/you/projects/guessing_game)
Running `target/debug/guessing_game`
Guess the number!
The secret number is: 7
Please input your guess.
4
You guessed: 4
$ cargo run
Running `target/debug/guessing_game`
Guess the number!
The secret number is: 83
Please input your guess.
5
$ cargo run
Compiling guessing_game v0.1.0 (file:///home/you/projects/guessing_game)
Running `target/debug/guessing_game`
Hello, world!
"#;
let raw_lines = string.to_string();
let lines = raw_lines.lines().map(|line| {
Ok(line.to_string())
});
let result_vec = super::lint_lines(lines);
assert!(!result_vec.is_empty());
assert_eq!(3, result_vec.len());
}
#[test]
fn lint_file_returns_an_empty_vec_when_no_lines_of_interest_are_found() {
let string = r#"
$ cargo run
Compiling guessing_game v0.1.0 (file:///projects/guessing_game)
Running `target/guessing_game`
Guess the number!
The secret number is: 61
Please input your guess.
10
You guessed: 10
Too small!
Please input your guess.
99
You guessed: 99
Too big!
Please input your guess.
foo
Please input your guess.
61
You guessed: 61
You win!
"#;
let raw_lines = string.to_string();
let lines = raw_lines.lines().map(|line| {
Ok(line.to_string())
});
let result_vec = super::lint_lines(lines);
assert!(result_vec.is_empty());
}
#[test]
fn is_file_of_interest_returns_false_when_the_path_is_a_directory() {
let uninteresting_fn = "src/img";
assert!(!super::is_file_of_interest(path::Path::new(uninteresting_fn)));
}
#[test]
fn is_file_of_interest_returns_false_when_the_filename_does_not_have_the_md_extension() {
let uninteresting_fn = "src/img/foo1.png";
assert!(!super::is_file_of_interest(path::Path::new(uninteresting_fn)));
}
#[test]
fn is_file_of_interest_returns_true_when_the_filename_has_the_md_extension() {
let interesting_fn = "src/ch01-00-introduction.md";
assert!(super::is_file_of_interest(path::Path::new(interesting_fn)));
}
#[test]
fn is_line_of_interest_does_not_report_a_line_if_the_line_contains_a_file_url_which_is_directly_followed_by_the_project_path() {
let sample_line = "Compiling guessing_game v0.1.0 (file:///projects/guessing_game)";
assert!(!super::is_line_of_interest(sample_line));
}
#[test]
fn is_line_of_interest_reports_a_line_if_the_line_contains_a_file_url_which_is_not_directly_followed_by_the_project_path() {
let sample_line = "Compiling guessing_game v0.1.0 (file:///home/you/projects/guessing_game)";
assert!(super::is_line_of_interest(sample_line));
}
}
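`is_line_of_interest` above collects the matching words into a `Vec` only to test whether it is empty; `Iterator::any` expresses the same check without the intermediate allocation. A sketch under the same semantics:

```rust
// A word is suspect if it embeds a file:// URL that does not point at
// the canonical /projects/ prefix used by the book's listings.
fn is_line_of_interest(line: &str) -> bool {
    line.split_whitespace()
        .any(|word| word.contains("file://") && !word.contains("file:///projects/"))
}

fn main() {
    assert!(is_line_of_interest(
        "Compiling guessing_game v0.1.0 (file:///home/you/projects/guessing_game)"
    ));
    assert!(!is_line_of_interest(
        "Compiling guessing_game v0.1.0 (file:///projects/guessing_game)"
    ));
}
```

`any` also short-circuits on the first match, whereas the filter-and-collect version always scans the whole line.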

@@ -1,406 +0,0 @@
// FIXME: we have some long lines that could be refactored, but it's not a big deal.
// ignore-tidy-linelength
extern crate regex;
use std::collections::HashMap;
use std::io;
use std::io::{Read, Write};
use regex::{Regex, Captures};
fn main() {
write_md(parse_links(parse_references(read_md())));
}
fn read_md() -> String {
let mut buffer = String::new();
match io::stdin().read_to_string(&mut buffer) {
Ok(_) => buffer,
Err(error) => panic!(error),
}
}
fn write_md(output: String) {
write!(io::stdout(), "{}", output).unwrap();
}
fn parse_references(buffer: String) -> (String, HashMap<String, String>) {
let mut ref_map = HashMap::new();
// FIXME: currently doesn't handle "title" in following line.
let re = Regex::new(r###"(?m)\n?^ {0,3}\[([^]]+)\]:[[:blank:]]*(.*)$"###).unwrap();
let output = re.replace_all(&buffer, |caps: &Captures| {
let key = caps.at(1).unwrap().to_owned().to_uppercase();
let val = caps.at(2).unwrap().to_owned();
if ref_map.insert(key, val).is_some() {
panic!("Did not expect markdown page to have duplicate reference");
}
"".to_string()
});
(output, ref_map)
}
fn parse_links((buffer, ref_map): (String, HashMap<String, String>)) -> String {
// FIXME: check which punctuation is allowed by spec.
let re = Regex::new(r###"(?:(?P<pre>(?:```(?:[^`]|`[^`])*`?\n```\n)|(?:[^[]`[^`\n]+[\n]?[^`\n]*`))|(?:\[(?P<name>[^]]+)\](?:(?:\([[:blank:]]*(?P<val>[^")]*[^ ])(?:[[:blank:]]*"[^"]*")?\))|(?:\[(?P<key>[^]]*)\]))?))"###).expect("could not create regex");
let error_code = Regex::new(r###"^E\d{4}$"###).expect("could not create regex");
let output = re.replace_all(&buffer, |caps: &Captures| {
match caps.name("pre") {
Some(pre_section) => format!("{}", pre_section.to_owned()),
None => {
let name = caps.name("name").expect("could not get name").to_owned();
// Really we should ignore text inside code blocks,
// this is a hack to not try to treat `#[derive()]`,
// `[profile]`, `[test]`, or `[E\d\d\d\d]` like a link.
if name.starts_with("derive(") ||
name.starts_with("profile") ||
name.starts_with("test") ||
error_code.is_match(&name) {
return name
}
let val = match caps.name("val") {
// `[name](link)`
Some(value) => value.to_owned(),
None => {
match caps.name("key") {
Some(key) => {
match key {
// [name][]
"" => format!("{}", ref_map.get(&name.to_uppercase()).expect(&format!("could not find url for the link text `{}`", name))),
// [name][reference]
_ => format!("{}", ref_map.get(&key.to_uppercase()).expect(&format!("could not find url for the link text `{}`", key))),
}
}
// `[name]` as reference
None => format!("{}", ref_map.get(&name.to_uppercase()).expect(&format!("could not find url for the link text `{}`", name))),
}
}
};
format!("{} at *{}*", name, val)
}
}
});
output
}
#[cfg(test)]
mod tests {
fn parse(source: String) -> String {
super::parse_links(super::parse_references(source))
}
#[test]
fn parses_inline_link() {
let source = r"This is a [link](http://google.com) that should be expanded".to_string();
let target = r"This is a link at *http://google.com* that should be expanded".to_string();
assert_eq!(parse(source), target);
}
#[test]
fn parses_multiline_links() {
let source = r"This is a [link](http://google.com) that
should appear expanded. Another [location](/here/) and [another](http://gogogo)"
.to_string();
let target = r"This is a link at *http://google.com* that
should appear expanded. Another location at */here/* and another at *http://gogogo*"
.to_string();
assert_eq!(parse(source), target);
}
#[test]
fn parses_reference() {
let source = r"This is a [link][theref].
[theref]: http://example.com/foo
more text"
.to_string();
let target = r"This is a link at *http://example.com/foo*.
more text"
.to_string();
assert_eq!(parse(source), target);
}
#[test]
fn parses_implicit_link() {
let source = r"This is an [implicit][] link.
[implicit]: /The Link/"
.to_string();
let target = r"This is an implicit at */The Link/* link.".to_string();
assert_eq!(parse(source), target);
}
#[test]
fn parses_refs_with_one_space_indentation() {
let source = r"This is a [link][ref]
[ref]: The link"
.to_string();
let target = r"This is a link at *The link*".to_string();
assert_eq!(parse(source), target);
}
#[test]
fn parses_refs_with_two_space_indentation() {
let source = r"This is a [link][ref]
[ref]: The link"
.to_string();
let target = r"This is a link at *The link*".to_string();
assert_eq!(parse(source), target);
}
#[test]
fn parses_refs_with_three_space_indentation() {
let source = r"This is a [link][ref]
[ref]: The link"
.to_string();
let target = r"This is a link at *The link*".to_string();
assert_eq!(parse(source), target);
}
#[test]
#[should_panic]
fn rejects_refs_with_four_space_indentation() {
let source = r"This is a [link][ref]
[ref]: The link"
.to_string();
let target = r"This is a link at *The link*".to_string();
assert_eq!(parse(source), target);
}
#[test]
fn ignores_optional_inline_title() {
let source = r###"This is a titled [link](http://example.com "My title")."###.to_string();
let target = r"This is a titled link at *http://example.com*.".to_string();
assert_eq!(parse(source), target);
}
#[test]
fn parses_title_with_puctuation() {
let source = r###"[link](http://example.com "It's Title")"###.to_string();
let target = r"link at *http://example.com*".to_string();
assert_eq!(parse(source), target);
}
#[test]
fn parses_name_with_punctuation() {
let source = r###"[I'm here](there)"###.to_string();
let target = r###"I'm here at *there*"###.to_string();
assert_eq!(parse(source), target);
}
#[test]
fn parses_name_with_utf8() {
let source = r###"[users forum](the users forum)"###.to_string();
let target = r###"users forum at *the users forum*"###.to_string();
assert_eq!(parse(source), target);
}
#[test]
fn parses_reference_with_punctuation() {
let source = r###"[link][the ref-ref]
[the ref-ref]:http://example.com/ref-ref"###
.to_string();
let target = r###"link at *http://example.com/ref-ref*"###.to_string();
assert_eq!(parse(source), target);
}
#[test]
fn parses_reference_case_insensitively() {
let source = r"[link][Ref]
[ref]: The reference"
.to_string();
let target = r"link at *The reference*".to_string();
assert_eq!(parse(source), target);
}
#[test]
fn parses_link_as_reference_when_reference_is_empty() {
let source = r"[link as reference][]
[link as reference]: the actual reference"
.to_string();
let target = r"link as reference at *the actual reference*".to_string();
assert_eq!(parse(source), target);
}
#[test]
fn parses_link_without_reference_as_reference() {
let source = r"[link] is alone
[link]: The contents"
.to_string();
let target = r"link at *The contents* is alone".to_string();
assert_eq!(parse(source), target);
}
#[test]
#[ignore]
fn parses_link_without_reference_as_reference_with_asterisks() {
let source = r"*[link]* is alone
[link]: The contents"
.to_string();
let target = r"*link* at *The contents* is alone".to_string();
assert_eq!(parse(source), target);
}
#[test]
fn ignores_links_in_pre_sections() {
let source = r###"```toml
[package]
name = "hello_cargo"
version = "0.1.0"
authors = ["Your Name <you@example.com>"]
[dependencies]
```
"###
.to_string();
let target = source.clone();
assert_eq!(parse(source), target);
}
#[test]
fn ignores_links_in_quoted_sections() {
let source = r###"do not change `[package]`."###.to_string();
let target = source.clone();
assert_eq!(parse(source), target);
}
#[test]
fn ignores_links_in_quoted_sections_containing_newlines() {
let source = r"do not change `this [package]
is still here` [link](ref)"
.to_string();
let target = r"do not change `this [package]
is still here` link at *ref*"
.to_string();
assert_eq!(parse(source), target);
}
#[test]
fn ignores_links_in_pre_sections_while_still_handling_links() {
let source = r###"```toml
[package]
name = "hello_cargo"
version = "0.1.0"
authors = ["Your Name <you@example.com>"]
[dependencies]
```
Another [link]
more text
[link]: http://gohere
"###
.to_string();
let target = r###"```toml
[package]
name = "hello_cargo"
version = "0.1.0"
authors = ["Your Name <you@example.com>"]
[dependencies]
```
Another link at *http://gohere*
more text
"###
.to_string();
assert_eq!(parse(source), target);
}
#[test]
fn ignores_quotes_in_pre_sections() {
let source = r###"```bash
$ cargo build
Compiling guessing_game v0.1.0 (file:///projects/guessing_game)
src/main.rs:23:21: 23:35 error: mismatched types [E0308]
src/main.rs:23 match guess.cmp(&secret_number) {
^~~~~~~~~~~~~~
src/main.rs:23:21: 23:35 help: run `rustc --explain E0308` to see a detailed explanation
src/main.rs:23:21: 23:35 note: expected type `&std::string::String`
src/main.rs:23:21: 23:35 note: found type `&_`
error: aborting due to previous error
Could not compile `guessing_game`.
```
"###
.to_string();
let target = source.clone();
assert_eq!(parse(source), target);
}
#[test]
fn ignores_short_quotes() {
let source = r"to `1` at index `[0]` i".to_string();
let target = source.clone();
assert_eq!(parse(source), target);
}
#[test]
fn ignores_pre_sections_with_final_quote() {
let source = r###"```bash
$ cargo run
Compiling points v0.1.0 (file:///projects/points)
error: the trait bound `Point: std::fmt::Display` is not satisfied [--explain E0277]
--> src/main.rs:8:29
8 |> println!("Point 1: {}", p1);
|> ^^
<std macros>:2:27: 2:58: note: in this expansion of format_args!
<std macros>:3:1: 3:54: note: in this expansion of print! (defined in <std macros>)
src/main.rs:8:5: 8:33: note: in this expansion of println! (defined in <std macros>)
note: `Point` cannot be formatted with the default formatter; try using `:?` instead if you are using a format string
note: required by `std::fmt::Display::fmt`
```
`here` is another [link](the ref)
"###.to_string();
let target = r###"```bash
$ cargo run
Compiling points v0.1.0 (file:///projects/points)
error: the trait bound `Point: std::fmt::Display` is not satisfied [--explain E0277]
--> src/main.rs:8:29
8 |> println!("Point 1: {}", p1);
|> ^^
<std macros>:2:27: 2:58: note: in this expansion of format_args!
<std macros>:3:1: 3:54: note: in this expansion of print! (defined in <std macros>)
src/main.rs:8:5: 8:33: note: in this expansion of println! (defined in <std macros>)
note: `Point` cannot be formatted with the default formatter; try using `:?` instead if you are using a format string
note: required by `std::fmt::Display::fmt`
```
`here` is another link at *the ref*
"###.to_string();
assert_eq!(parse(source), target);
}
#[test]
fn parses_adam_p_cheatsheet() {
let source = r###"[I'm an inline-style link](https://www.google.com)
[I'm an inline-style link with title](https://www.google.com "Google's Homepage")
[I'm a reference-style link][Arbitrary case-insensitive reference text]
[I'm a relative reference to a repository file](../blob/master/LICENSE)
[You can use numbers for reference-style link definitions][1]
Or leave it empty and use the [link text itself][].
URLs and URLs in angle brackets will automatically get turned into links.
http://www.example.com or <http://www.example.com> and sometimes
example.com (but not on Github, for example).
Some text to show that the reference links can follow later.
[arbitrary case-insensitive reference text]: https://www.mozilla.org
[1]: http://slashdot.org
[link text itself]: http://www.reddit.com"###
.to_string();
let target = r###"I'm an inline-style link at *https://www.google.com*
I'm an inline-style link with title at *https://www.google.com*
I'm a reference-style link at *https://www.mozilla.org*
I'm a relative reference to a repository file at *../blob/master/LICENSE*
You can use numbers for reference-style link definitions at *http://slashdot.org*
Or leave it empty and use the link text itself at *http://www.reddit.com*.
URLs and URLs in angle brackets will automatically get turned into links.
http://www.example.com or <http://www.example.com> and sometimes
example.com (but not on Github, for example).
Some text to show that the reference links can follow later.
"###
.to_string();
assert_eq!(parse(source), target);
}
}
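The case-insensitive reference resolution that `parse_links` relies on comes down to uppercasing the label on both insert and lookup. A minimal std-only sketch of that convention (function names are illustrative):

```rust
use std::collections::HashMap;

// Store reference definitions under uppercased keys ...
fn define(ref_map: &mut HashMap<String, String>, label: &str, url: &str) {
    ref_map.insert(label.to_uppercase(), url.to_string());
}

// ... and uppercase again when resolving, so `[link][Ref]` matches `[ref]:`.
fn resolve<'a>(ref_map: &'a HashMap<String, String>, label: &str) -> Option<&'a str> {
    ref_map.get(&label.to_uppercase()).map(String::as_str)
}

fn main() {
    let mut ref_map = HashMap::new();
    define(&mut ref_map, "theref", "http://example.com/foo");
    assert_eq!(resolve(&ref_map, "TheRef"), Some("http://example.com/foo"));
    assert_eq!(resolve(&ref_map, "missing"), None);
}
```

Normalizing at both ends keeps the map's keys canonical, which is also why the tool can panic on a duplicate reference during `parse_references`.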

@@ -1,45 +0,0 @@
extern crate regex;
use std::io;
use std::io::{Read, Write};
use regex::{Regex, Captures};
use std::collections::HashSet;
fn main () {
let mut buffer = String::new();
if let Err(e) = io::stdin().read_to_string(&mut buffer) {
panic!(e);
}
let mut refs = HashSet::new();
// Capture all links and link references.
let regex = r"\[([^\]]+)\](?:(?:\[([^\]]+)\])|(?:\([^\)]+\)))(?i)<!-- ignore -->";
let link_regex = Regex::new(regex).unwrap();
let first_pass = link_regex.replace_all(&buffer, |caps: &Captures| {
// Save the link reference we want to delete.
if let Some(reference) = caps.at(2) {
refs.insert(reference.to_owned());
}
// Put the link title back.
caps.at(1).unwrap().to_owned()
});
// Search for the references we need to delete.
let ref_regex = Regex::new(r"\n\[([^\]]+)\]:\s.*\n").unwrap();
let out = ref_regex.replace_all(&first_pass, |caps: &Captures| {
let capture = caps.at(1).unwrap().to_owned();
// Check if we've marked this reference for deletion ...
if refs.contains(capture.as_str()) {
return "".to_string();
}
// ... else we put back everything we captured.
caps.at(0).unwrap().to_owned()
});
write!(io::stdout(), "{}", out).unwrap();
}

@@ -1,52 +0,0 @@
extern crate regex;
use std::io;
use std::io::{Read, Write};
use regex::{Regex, Captures};
fn main() {
write_md(remove_markup(read_md()));
}
fn read_md() -> String {
let mut buffer = String::new();
match io::stdin().read_to_string(&mut buffer) {
Ok(_) => buffer,
Err(error) => panic!(error),
}
}
fn write_md(output: String) {
write!(io::stdout(), "{}", output).unwrap();
}
fn remove_markup(input: String) -> String {
let filename_regex = Regex::new(r#"\A<span class="filename">(.*)</span>\z"#).unwrap();
// Captions sometimes take up multiple lines.
let caption_start_regex = Regex::new(r#"\A<span class="caption">(.*)\z"#).unwrap();
let caption_end_regex = Regex::new(r#"(.*)</span>\z"#).unwrap();
let regexen = vec![filename_regex, caption_start_regex, caption_end_regex];
let lines: Vec<_> = input.lines().flat_map(|line| {
// Remove our figure and caption markup.
if line == "<figure>" ||
line == "<figcaption>" ||
line == "</figcaption>" ||
line == "</figure>"
{
None
// Remove our syntax highlighting and rustdoc markers.
} else if line.starts_with("```") {
Some(String::from("```"))
// Remove the span around filenames and captions.
} else {
let result = regexen.iter().fold(line.to_string(), |result, regex| {
regex.replace_all(&result, |caps: &Captures| {
caps.at(1).unwrap().to_owned()
})
});
Some(result)
}
}).collect();
lines.join("\n")
}

@@ -15,6 +15,8 @@ The following keywords currently have the functionality described.
 * `as` - perform primitive casting, disambiguate the specific trait containing
   an item, or rename items in `use` and `extern crate` statements
+* `async` - return a `Future` instead of blocking the current thread
+* `await` - suspend execution until the result of a `Future` is ready
 * `break` - exit a loop immediately
 * `const` - define constant items or constant raw pointers
 * `continue` - continue to the next loop iteration
@@ -59,8 +61,6 @@ The following keywords do not have any functionality but are reserved by Rust
 for potential future use.
 * `abstract`
-* `async`
-* `await`
 * `become`
 * `box`
 * `do`

@@ -234,11 +234,7 @@ through the last time that reference is used. For instance, this code will
 compile because the last usage of the immutable references occurs before the
 mutable reference is introduced:
-<!-- This example is being ignored because there's a bug in rustdoc making the
-edition2018 not work. The bug is currently fixed in nightly, so when we update
-the book to >= 1.35, `ignore` can be removed from this example. -->
-```rust,edition2018,ignore
+```rust,edition2018
 let mut s = String::from("hello");
 let r1 = &s; // no problem

@@ -34,7 +34,7 @@ same name in the same scope; tools are available to resolve name conflicts.
 Rust has a number of features that allow you to manage your code's
 organization, including which details are exposed, which details are private,
 and what names are in each scope in your programs. These features, sometimes
-collectively referred to as the *module system*, and include:
+collectively referred to as the *module system*, include:
 * **Packages:** A Cargo feature that lets you build, test, and share crates
 * **Crates:** A tree of modules that produces a library or executable

@@ -115,28 +115,39 @@ is printed when the test that passes runs. That output has been captured. The
 output from the test that failed, `I got the value 8`, appears in the section
 of the test summary output, which also shows the cause of the test failure.
 
-If we want to see printed values for passing tests as well, we can disable the
-output capture behavior by using the `--nocapture` flag:
+If we want to see printed values for passing tests as well, we can tell Rust
+to also show the output of successful tests at the end with `--show-output`.
 
 ```text
-$ cargo test -- --nocapture
+$ cargo test -- --show-output
 ```
 
-When we run the tests in Listing 11-10 again with the `--nocapture` flag, we
+When we run the tests in Listing 11-10 again with the `--show-output` flag, we
 see the following output:
 
 ```text
 running 2 tests
-I got the value 4
-I got the value 8
 test tests::this_test_will_pass ... ok
+test tests::this_test_will_fail ... FAILED
+
+successes:
+
+---- tests::this_test_will_pass stdout ----
+I got the value 4
+
+successes:
+    tests::this_test_will_pass
+
+failures:
+
+---- tests::this_test_will_fail stdout ----
+I got the value 8
 thread 'tests::this_test_will_fail' panicked at 'assertion failed: `(left == right)`
   left: `5`,
  right: `10`', src/lib.rs:19:9
-note: Run with `RUST_BACKTRACE=1` for a backtrace.
-test tests::this_test_will_fail ... FAILED
-
-failures:
+note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace.
 
 failures:
     tests::this_test_will_fail
@@ -144,11 +155,6 @@ failures:
 test result: FAILED. 1 passed; 1 failed; 0 ignored; 0 measured; 0 filtered out
 ```
 
-Note that the output for the tests and the test results are interleaved; the
-reason is that the tests are running in parallel, as we talked about in the
-previous section. Try using the `--test-threads=1` option and the `--nocapture`
-flag, and see what the output looks like then!
-
 ### Running a Subset of Tests by Name
 
 Sometimes, running a full test suite can take a long time. If you're working on

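For reference, the test pair this hunk describes (Listing 11-10) can be sketched as follows; `prints_and_returns_10` and the test name come from the chapter, while the failing twin test is left out so the sketch compiles and passes on its own:

```rust
// The function under test prints its input. `cargo test` captures this output
// by default; `cargo test -- --show-output` prints it even for passing tests.
fn prints_and_returns_10(a: i32) -> i32 {
    println!("I got the value {}", a);
    10
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn this_test_will_pass() {
        assert_eq!(10, prints_and_returns_10(4));
    }
}
```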
View File

@@ -281,7 +281,7 @@ fn main() {
     match msg {
         Message::Quit => {
             println!("The Quit variant has no data to destructure.")
-        },
+        }
         Message::Move { x, y } => {
             println!(
                 "Move in the x direction {} and in the y direction {}",
@@ -356,7 +356,7 @@ fn main() {
                 g,
                 b
             )
-        },
+        }
         Message::ChangeColor(Color::Hsv(h, s, v)) => {
             println!(
                 "Change the color to hue {}, saturation {}, and value {}",

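The arms in this hunk come from the chapter's nested-enum example; a condensed, self-contained version looks like the following, with `format!` in place of `println!` so the result can be checked, and with block-bodied arms left without trailing commas, as the diff does:

```rust
enum Color {
    Rgb(i32, i32, i32),
    Hsv(i32, i32, i32),
}

enum Message {
    Quit,
    Move { x: i32, y: i32 },
    ChangeColor(Color),
}

fn describe(msg: &Message) -> String {
    match msg {
        // An arm whose body is a block needs no trailing comma.
        Message::Quit => {
            String::from("The Quit variant has no data to destructure.")
        }
        Message::Move { x, y } => {
            format!("Move in the x direction {} and in the y direction {}", x, y)
        }
        // Nested patterns destructure the inner enum in a single step.
        Message::ChangeColor(Color::Rgb(r, g, b)) => {
            format!("Change the color to red {}, green {}, and blue {}", r, g, b)
        }
        Message::ChangeColor(Color::Hsv(h, s, v)) => {
            format!("Change the color to hue {}, saturation {}, and value {}", h, s, v)
        }
    }
}
```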
View File

@@ -1,6 +1,6 @@
 ## Graceful Shutdown and Cleanup
-The code in Listing 20-21 is responding to requests asynchronously through the
+The code in Listing 20-20 is responding to requests asynchronously through the
 use of a thread pool, as we intended. We get some warnings about the `workers`,
 `id`, and `thread` fields that we're not using in a direct way that reminds us
 we're not cleaning up anything. When we use the less elegant <span
@@ -18,7 +18,7 @@ accept only two requests before gracefully shutting down its thread pool.
 Let's start with implementing `Drop` on our thread pool. When the pool is
 dropped, our threads should all join to make sure they finish their work.
-Listing 20-23 shows a first attempt at a `Drop` implementation; this code won't
+Listing 20-22 shows a first attempt at a `Drop` implementation; this code won't
 quite work yet.
 <span class="filename">Filename: src/lib.rs</span>
@@ -35,7 +35,7 @@ impl Drop for ThreadPool {
 }
 ```
-<span class="caption">Listing 20-23: Joining each thread when the thread pool
+<span class="caption">Listing 20-22: Joining each thread when the thread pool
 goes out of scope</span>
 First, we loop through each of the thread pool `workers`. We use `&mut` for
@@ -178,7 +178,7 @@ thread should run, or it will be a `Terminate` variant that will cause the
 thread to exit its loop and stop.
 We need to adjust the channel to use values of type `Message` rather than type
-`Job`, as shown in Listing 20-24.
+`Job`, as shown in Listing 20-23.
 <span class="filename">Filename: src/lib.rs</span>
@@ -217,7 +217,7 @@ impl Worker {
             Message::NewJob(job) => {
                 println!("Worker {} got a job; executing.", id);
-                job.call_box();
+                job();
             },
             Message::Terminate => {
                 println!("Worker {} was told to terminate.", id);
@@ -236,7 +236,7 @@ impl Worker {
 }
 ```
-<span class="caption">Listing 20-24: Sending and receiving `Message` values and
+<span class="caption">Listing 20-23: Sending and receiving `Message` values and
 exiting the loop if a `Worker` receives `Message::Terminate`</span>
 To incorporate the `Message` enum, we need to change `Job` to `Message` in two
@@ -248,9 +248,9 @@ received, and the thread will break out of the loop if the `Terminate` variant
 is received.
 With these changes, the code will compile and continue to function in the same
-way as it did after Listing 20-21. But we'll get a warning because we aren't
+way as it did after Listing 20-20. But we'll get a warning because we aren't
 creating any messages of the `Terminate` variety. Let's fix this warning by
-changing our `Drop` implementation to look like Listing 20-25.
+changing our `Drop` implementation to look like Listing 20-24.
 <span class="filename">Filename: src/lib.rs</span>
@@ -276,7 +276,7 @@ impl Drop for ThreadPool {
 }
 ```
-<span class="caption">Listing 20-25: Sending `Message::Terminate` to the
+<span class="caption">Listing 20-24: Sending `Message::Terminate` to the
 workers before calling `join` on each worker thread</span>
 We're now iterating over the workers twice: once to send one `Terminate`
@@ -302,7 +302,7 @@ messages as there are workers, each worker will receive a terminate message
 before `join` is called on its thread.
 To see this code in action, let's modify `main` to accept only two requests
-before gracefully shutting down the server, as shown in Listing 20-26.
+before gracefully shutting down the server, as shown in Listing 20-25.
 <span class="filename">Filename: src/bin/main.rs</span>
@@ -323,7 +323,7 @@ fn main() {
 }
 ```
-<span class="caption">Listing 20-26: Shut down the server after serving two
+<span class="caption">Listing 20-25: Shut down the server after serving two
 requests by exiting the loop</span>
 You wouldn't want a real-world web server to shut down after serving only two

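The `Message`-based shutdown this hunk describes can be condensed into a runnable sketch. `run_worker` is a hypothetical stand-in for the chapter's `Worker`, holding the receiver directly; the real pool shares it between workers through `Arc<Mutex<...>>`, which is left out here:

```rust
use std::sync::mpsc;
use std::thread;

// A job is a boxed closure; Terminate tells the worker to exit its loop.
enum Message {
    NewJob(Box<dyn FnOnce() + Send + 'static>),
    Terminate,
}

// Spawn a worker that runs jobs until it receives Terminate, then
// reports how many jobs it ran.
fn run_worker(receiver: mpsc::Receiver<Message>) -> thread::JoinHandle<u32> {
    thread::spawn(move || {
        let mut jobs_run = 0;
        loop {
            match receiver.recv().unwrap() {
                Message::NewJob(job) => {
                    // After the fix in this diff, a boxed FnOnce is called
                    // directly as `job()` rather than via `call_box`.
                    job();
                    jobs_run += 1;
                }
                Message::Terminate => break,
            }
        }
        jobs_run
    })
}
```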
View File

@@ -28,15 +28,8 @@ fn remove_markup(input: String) -> String {
     let regexen = vec![filename_regex, caption_start_regex, caption_end_regex];
     let lines: Vec<_> = input.lines().flat_map(|line| {
-        // Remove our figure and caption markup.
-        if line == "<figure>" ||
-           line == "<figcaption>" ||
-           line == "</figcaption>" ||
-           line == "</figure>"
-        {
-            None
         // Remove our syntax highlighting and rustdoc markers.
-        } else if line.starts_with("```") {
+        if line.starts_with("```") {
             Some(String::from("```"))
         // Remove the span around filenames and captions.
         } else {

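The simplification in this hunk (dropping the `<figure>`/`<figcaption>` branch) leaves the closure's first case doing only fence normalization. In isolation, that case can be sketched without the surrounding regex machinery; `normalize_fences` is a hypothetical name, and plain `map` replaces `flat_map` since this sketch filters nothing out:

```rust
// Collapse any opening code fence such as ```rust or ```text to a bare ```;
// all other lines pass through unchanged.
fn normalize_fences(input: &str) -> Vec<String> {
    input
        .lines()
        .map(|line| {
            if line.starts_with("```") {
                String::from("```")
            } else {
                line.to_string()
            }
        })
        .collect()
}
```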
View File

@@ -14,24 +14,24 @@ To install Rust through Rustup, you can go to
 so on your platform. This will install both `rustup` itself and the `stable`
 version of `rustc` and `cargo`.
-To install a specific Rust version, you can use `rustup install`:
+To install a specific Rust version, you can use `rustup toolchain install`:
 ```console
-$ rustup install 1.30.0
+$ rustup toolchain install 1.30.0
 ```
 This works for a specific nightly, as well:
 ```console
-$ rustup install nightly-2018-08-01
+$ rustup toolchain install nightly-2018-08-01
 ```
 As well as any of our release channels:
 ```console
-$ rustup install stable
-$ rustup install beta
-$ rustup install nightly
+$ rustup toolchain install stable
+$ rustup toolchain install beta
+$ rustup toolchain install nightly
 ```
 ## For updating your installation
@@ -53,6 +53,12 @@ To set the default toolchain to something other than `stable`:
 $ rustup default nightly
 ```
+To uninstall a specific Rust version, you can use `rustup toolchain uninstall`:
+```console
+$ rustup toolchain uninstall 1.30.0
+```
 To use a toolchain other than the default, use `rustup run`:
 ```console

View File

@@ -39,7 +39,7 @@ Interrupt handlers look like plain functions (except for the lack of arguments)
 similar to exception handlers. However they can not be called directly by other
 parts of the firmware due to the special calling conventions. It is however
 possible to generate interrupt requests in software to trigger a diversion to
-to the interrupt handler.
+the interrupt handler.
 Similar to exception handlers it is also possible to declare `static mut`
 variables inside the interrupt handlers for *safe* state keeping.

View File

@@ -25,7 +25,7 @@ struct GpioConfig {
     periph: GPIO_CONFIG,
 }
-impl Gpio {
+impl GpioConfig {
     pub fn set_enable(&mut self, is_enabled: bool) {
         self.periph.modify(|_r, w| {
             w.enable().set_bit(is_enabled)

Some files were not shown because too many files have changed in this diff.