mirror of https://git.proxmox.com/git/rustc, synced 2025-08-14 08:32:57 +00:00

commit e74abb3270 (parent e1599b0ce6)
New upstream version 1.40.0+dfsg1
CONTRIBUTING.md

@ -15,7 +15,7 @@ links to the major sections:
* [Helpful Links and Information](#helpful-links-and-information)

If you have questions, please make a post on [internals.rust-lang.org][internals] or
hop on the [Rust Discord server][rust-discord], [Rust Zulip server][rust-zulip] or [#rust-internals][pound-rust-internals].
hop on the [Rust Discord server][rust-discord] or [Rust Zulip server][rust-zulip].

As a reminder, all contributors are expected to follow our [Code of Conduct][coc].

@ -25,7 +25,6 @@ to contribute to it in more detail than this document.
If this is your first time contributing, the [walkthrough] chapter of the guide
can give you a good example of how a typical contribution would go.

[pound-rust-internals]: https://chat.mibbit.com/?server=irc.mozilla.org&channel=%23rust-internals
[internals]: https://internals.rust-lang.org
[rust-discord]: http://discord.gg/rust-lang
[rust-zulip]: https://rust-lang.zulipchat.com

@ -129,6 +128,14 @@ the master branch to your feature branch.
Also, please make sure that fixup commits are squashed into other related
commits with meaningful commit messages.

GitHub allows [closing issues using keywords][closing-keywords]. This feature
should be used to keep the issue tracker tidy. However, it is generally preferred
to put the "closes #123" text in the PR description rather than the issue commit;
particularly during rebasing, citing the issue number in the commit can "spam"
the issue in question.

[closing-keywords]: https://help.github.com/en/articles/closing-issues-using-keywords

Please make sure your pull request is in compliance with Rust's style
guidelines by running

@ -404,7 +411,7 @@ If you're looking for somewhere to start, check out the [E-easy][eeasy] tag.
There are a number of other ways to contribute to Rust that don't deal with
this repository.

Answer questions in [#rust][pound-rust], or on [users.rust-lang.org][users],
Answer questions in the _Get Help!_ channels from the [Rust Discord server][rust-discord], on [users.rust-lang.org][users],
or on [StackOverflow][so].

Participate in the [RFC process](https://github.com/rust-lang/rfcs).

@ -413,7 +420,7 @@ Find a [requested community library][community-library], build it, and publish
it to [Crates.io](http://crates.io). Easier said than done, but very, very
valuable!

[pound-rust]: http://chat.mibbit.com/?server=irc.mozilla.org&channel=%23rust
[rust-discord]: https://discord.gg/rust-lang
[users]: https://users.rust-lang.org/
[so]: http://stackoverflow.com/questions/tagged/rust
[community-library]: https://github.com/rust-lang/rfcs/labels/A-community-library
Cargo.lock: 813 lines changed (generated; file diff suppressed because it is too large)
README.md: 13 lines changed
@ -33,6 +33,7 @@ or reading the [rustc guide][rustcguidebuild].
* `curl`
* `git`
* `ssl` which comes in `libssl-dev` or `openssl-devel`
* `pkg-config` if you are compiling on Linux and targeting Linux

2. Clone the [source] with `git`:

@ -243,19 +244,17 @@ The Rust community congregates in a few places:

To contribute to Rust, please see [CONTRIBUTING](CONTRIBUTING.md).

Rust has an [IRC] culture and most real-time collaboration happens in a
variety of channels on Mozilla's IRC network, irc.mozilla.org. The
most popular channel is [#rust], a venue for general discussion about
Rust. And a good place to ask for help would be [#rust-beginners].
Most real-time collaboration happens in a variety of channels on the
[Rust Discord server][rust-discord], with channels dedicated for getting help,
community, documentation, and all major contribution areas in the Rust ecosystem.
A good place to ask for help would be the #help channel.

The [rustc guide] might be a good place to start if you want to find out how
various parts of the compiler work.

Also, you may find the [rustdocs for the compiler itself][rustdocs] useful.

[IRC]: https://en.wikipedia.org/wiki/Internet_Relay_Chat
[#rust]: irc://irc.mozilla.org/rust
[#rust-beginners]: irc://irc.mozilla.org/rust-beginners
[rust-discord]: https://discord.gg/rust-lang
[rustc guide]: https://rust-lang.github.io/rustc-guide/about-this-guide.html
[rustdocs]: https://doc.rust-lang.org/nightly/nightly-rustc/rustc/
config.toml.example

@ -258,10 +258,9 @@
[rust]

# Whether or not to optimize the compiler and standard library.
#
# Note: the slowness of the non optimized compiler compiling itself usually
# outweighs the time gains in not doing optimizations, therefore a
# full bootstrap takes much more time with `optimize` set to false.
# WARNING: Building with optimize = false is NOT SUPPORTED. Due to bootstrapping,
# building without optimizations takes much longer than optimizing. Further, some platforms
# fail to build without this optimization (c.f. #65352).
#optimize = true

# Indicates that the build should be configured for debugging Rust. A

@ -341,6 +340,9 @@
# nightly features
#channel = "dev"

# The root location of the MUSL installation directory.
#musl-root = "..."

# By default the `rustc` executable is built with `-Wl,-rpath` flags on Unix
# platforms to ensure that the compiler is usable by default from the build
# directory (as it links to a number of dynamic libraries). This may not be

@ -374,9 +376,7 @@

# This is an array of the codegen backends that will be compiled for the rustc
# that's being compiled. The default is to only build the LLVM codegen backend,
# but you can also optionally enable the "emscripten" backend for asm.js or
# make this an empty array (but that probably won't get too far in the
# bootstrap)
# and currently the only standard option supported is `"llvm"`
#codegen-backends = ["llvm"]

# This is the name of the directory in which codegen backends will get installed
@ -1 +1 @@
4560ea788cb760f0a34127156c78e2552949f734
73528e339aae0f17a15ffa49a8ac608f50c6cf14
@ -5,7 +5,4 @@ This directory contains the source code of the rust project, including:

For more information on how various parts of the compiler work, see the [rustc guide].

There is also useful content in this README:
https://github.com/rust-lang/rust/tree/master/src/librustc/infer/lexical_region_resolve.

[rustc guide]: https://rust-lang.github.io/rustc-guide/about-this-guide.html
src/bootstrap/README.md

@ -328,6 +328,8 @@ are:
`Config` struct.
* Adding a sanity check? Take a look at `bootstrap/sanity.rs`.

If you have any questions feel free to reach out on `#rust-infra` on IRC or ask on
internals.rust-lang.org. When you encounter bugs, please file issues on the
rust-lang/rust issue tracker.
If you have any questions feel free to reach out on the `#infra` channel in the
[Rust Discord server][rust-discord] or ask on internals.rust-lang.org. When
you encounter bugs, please file issues on the rust-lang/rust issue tracker.

[rust-discord]: https://discord.gg/rust-lang
src/bootstrap/bootstrap.py

@ -734,10 +734,6 @@ class RustBuild(object):
if module.endswith("llvm-project"):
if self.get_toml('llvm-config') and self.get_toml('lld') != 'true':
continue
if module.endswith("llvm-emscripten"):
backends = self.get_toml('codegen-backends')
if backends is None or not 'emscripten' in backends:
continue
check = self.check_submodule(module, slow_submodules)
filtered_submodules.append((module, check))
submodules_names.append(module)
src/bootstrap/builder.rs

@ -443,6 +443,7 @@ impl<'a> Builder<'a> {
dist::Rustc,
dist::DebuggerScripts,
dist::Std,
dist::RustcDev,
dist::Analysis,
dist::Src,
dist::PlainSourceTarball,

@ -817,12 +818,22 @@ impl<'a> Builder<'a> {

let mut rustflags = Rustflags::new(&target);
if stage != 0 {
if let Ok(s) = env::var("CARGOFLAGS_NOT_BOOTSTRAP") {
cargo.args(s.split_whitespace());
}
rustflags.env("RUSTFLAGS_NOT_BOOTSTRAP");
} else {
if let Ok(s) = env::var("CARGOFLAGS_BOOTSTRAP") {
cargo.args(s.split_whitespace());
}
rustflags.env("RUSTFLAGS_BOOTSTRAP");
rustflags.arg("--cfg=bootstrap");
}

if let Ok(s) = env::var("CARGOFLAGS") {
cargo.args(s.split_whitespace());
}

match mode {
Mode::Std | Mode::ToolBootstrap | Mode::ToolStd => {},
Mode::Rustc | Mode::Codegen | Mode::ToolRustc => {
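The hunk above layers stage-specific flag sources: `CARGOFLAGS_BOOTSTRAP`/`RUSTFLAGS_BOOTSTRAP` apply only to stage 0, `CARGOFLAGS_NOT_BOOTSTRAP`/`RUSTFLAGS_NOT_BOOTSTRAP` apply only to later stages, plain `CARGOFLAGS` applies to every stage, and stage 0 additionally gets `--cfg=bootstrap`. A minimal standalone sketch of that selection logic, using plain `Vec<String>` stand-ins for the real `Cargo`/`Rustflags` builders:

```rust
use std::env;

/// Collect extra cargo args and rustc flags for a given bootstrap stage,
/// following the environment-variable convention shown in the hunk above.
fn stage_flags(stage: u32) -> (Vec<String>, Vec<String>) {
    let mut cargo_args = Vec::new();
    let mut rustflags = Vec::new();

    // Stage-specific sources: *_BOOTSTRAP only affects stage 0,
    // *_NOT_BOOTSTRAP only affects later stages.
    let (cargo_var, rust_var) = if stage == 0 {
        ("CARGOFLAGS_BOOTSTRAP", "RUSTFLAGS_BOOTSTRAP")
    } else {
        ("CARGOFLAGS_NOT_BOOTSTRAP", "RUSTFLAGS_NOT_BOOTSTRAP")
    };
    if let Ok(s) = env::var(cargo_var) {
        cargo_args.extend(s.split_whitespace().map(String::from));
    }
    if let Ok(s) = env::var(rust_var) {
        rustflags.extend(s.split_whitespace().map(String::from));
    }
    // Stage 0 code is compiled with `--cfg=bootstrap` so sources can gate on it.
    if stage == 0 {
        rustflags.push("--cfg=bootstrap".to_string());
    }
    // Plain CARGOFLAGS applies to every stage.
    if let Ok(s) = env::var("CARGOFLAGS") {
        cargo_args.extend(s.split_whitespace().map(String::from));
    }
    (cargo_args, rustflags)
}

fn main() {
    let (cargo_args, rustflags) = stage_flags(0);
    println!("cargo: {:?}\nrustflags: {:?}", cargo_args, rustflags);
}
```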
@ -875,7 +886,18 @@ impl<'a> Builder<'a> {
// things still build right, please do!
match mode {
Mode::Std => metadata.push_str("std"),
_ => {},
// When we're building rustc tools, they're built with a search path
// that contains things built during the rustc build. For example,
// bitflags is built during the rustc build, and is a dependency of
// rustdoc as well. We're building rustdoc in a different target
// directory, though, which means that Cargo will rebuild the
// dependency. When we go on to build rustdoc, we'll look for
// bitflags, and find two different copies: one built during the
// rustc step and one that we just built. This isn't always a
// problem, somehow -- not really clear why -- but we know that this
// fixes things.
Mode::ToolRustc => metadata.push_str("tool-rustc"),
_ => {}
}
cargo.env("__CARGO_DEFAULT_LIB_METADATA", &metadata);
@ -970,6 +992,7 @@ impl<'a> Builder<'a> {
Some("-Wl,-rpath,@loader_path/../lib")
} else if !target.contains("windows") &&
!target.contains("wasm32") &&
!target.contains("emscripten") &&
!target.contains("fuchsia") {
Some("-Wl,-rpath,$ORIGIN/../lib")
} else {
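The rpath hunk above picks a loader-relative search path so the freshly built compiler can find its dynamic libraries when run from the build directory: the `@loader_path` arm is context from just above the hunk (presumably the Apple branch), `$ORIGIN` covers other Unix-like targets, and windows/wasm32/emscripten/fuchsia get no rpath at all. A hedged sketch of that decision, with a plain `&str` triple instead of the builder's interned target type:

```rust
/// Choose the `-Wl,-rpath,...` flag (if any) for a target triple, mirroring
/// the branch structure in the hunk above. Simplified: the real code threads
/// the result into the actual linker invocation.
fn rpath_flag(target: &str) -> Option<&'static str> {
    if target.contains("apple") {
        // Mach-O loaders resolve @loader_path relative to the loading binary.
        Some("-Wl,-rpath,@loader_path/../lib")
    } else if !target.contains("windows")
        && !target.contains("wasm32")
        && !target.contains("emscripten")
        && !target.contains("fuchsia")
    {
        // ELF platforms resolve $ORIGIN relative to the executable.
        Some("-Wl,-rpath,$ORIGIN/../lib")
    } else {
        // No usable rpath mechanism on these targets.
        None
    }
}

fn main() {
    for t in &["x86_64-apple-darwin", "x86_64-unknown-linux-gnu", "x86_64-pc-windows-msvc"] {
        println!("{}: {:?}", t, rpath_flag(t));
    }
}
```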
src/bootstrap/cache.rs

@ -161,7 +161,7 @@ impl Ord for Interned<String> {
}
}

struct TyIntern<T: Hash + Clone + Eq> {
struct TyIntern<T: Clone + Eq> {
items: Vec<T>,
set: HashMap<T, Interned<T>>,
}
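The `TyIntern` change above drops the `Hash` bound from the struct definition, leaving such bounds to the impl blocks that actually need them, which is the usual Rust guideline for trait bounds on data types. A minimal interner sketch in that style, with hypothetical names rather than the real bootstrap cache types:

```rust
use std::collections::HashMap;
use std::hash::Hash;

// No trait bounds on the type itself...
struct Interner<T> {
    items: Vec<T>,
    set: HashMap<T, usize>,
}

// ...only on the impl whose methods need hashing, equality, and cloning.
impl<T: Hash + Clone + Eq> Interner<T> {
    fn new() -> Self {
        Interner { items: Vec::new(), set: HashMap::new() }
    }

    /// Return a stable index for `item`, inserting it on first sight.
    fn intern(&mut self, item: T) -> usize {
        if let Some(&idx) = self.set.get(&item) {
            return idx;
        }
        let idx = self.items.len();
        self.items.push(item.clone());
        self.set.insert(item, idx);
        idx
    }
}

fn main() {
    let mut interner = Interner::new();
    let a = interner.intern("llvm".to_string());
    let b = interner.intern("llvm".to_string());
    assert_eq!(a, b);
}
```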
src/bootstrap/channel.rs

@ -13,7 +13,7 @@ use build_helper::output;
use crate::Build;

// The version number
pub const CFG_RELEASE_NUM: &str = "1.39.0";
pub const CFG_RELEASE_NUM: &str = "1.40.0";

pub struct GitInfo {
inner: Option<Info>,
src/bootstrap/check.rs

@ -55,6 +55,7 @@ impl Step for Std {
cargo,
args(builder.kind),
&libstd_stamp(builder, compiler, target),
vec![],
true);

let libdir = builder.sysroot_libdir(compiler, target);

@ -103,6 +104,7 @@ impl Step for Rustc {
cargo,
args(builder.kind),
&librustc_stamp(builder, compiler, target),
vec![],
true);

let libdir = builder.sysroot_libdir(compiler, target);

@ -155,6 +157,7 @@ impl Step for CodegenBackend {
cargo,
args(builder.kind),
&codegen_backend_stamp(builder, compiler, target, backend),
vec![],
true);
}
}

@ -199,6 +202,7 @@ impl Step for Rustdoc {
cargo,
args(builder.kind),
&rustdoc_stamp(builder, compiler, target),
vec![],
true);

let libdir = builder.sysroot_libdir(compiler, target);
src/bootstrap/compile.rs

@ -69,7 +69,7 @@ impl Step for Std {
return;
}

builder.ensure(StartupObjects { compiler, target });
let mut target_deps = builder.ensure(StartupObjects { compiler, target });

let compiler_to_use = builder.compiler_for(compiler.stage, compiler.host, target);
if compiler_to_use != compiler {

@ -91,7 +91,7 @@ impl Step for Std {
return;
}

copy_third_party_objects(builder, &compiler, target);
target_deps.extend(copy_third_party_objects(builder, &compiler, target).into_iter());

let mut cargo = builder.cargo(compiler, Mode::Std, target, "build");
std_cargo(builder, &compiler, target, &mut cargo);

@ -102,6 +102,7 @@ impl Step for Std {
cargo,
vec![],
&libstd_stamp(builder, compiler, target),
target_deps,
false);

builder.ensure(StdLink {
@ -113,9 +114,22 @@ impl Step for Std {
}

/// Copies third party objects needed by various targets.
fn copy_third_party_objects(builder: &Builder<'_>, compiler: &Compiler, target: Interned<String>) {
fn copy_third_party_objects(builder: &Builder<'_>, compiler: &Compiler, target: Interned<String>)
-> Vec<PathBuf>
{
let libdir = builder.sysroot_libdir(*compiler, target);

let mut target_deps = vec![];

let mut copy_and_stamp = |sourcedir: &Path, name: &str| {
let target = libdir.join(name);
builder.copy(
&sourcedir.join(name),
&target,
);
target_deps.push(target);
};

// Copies the crt(1,i,n).o startup objects
//
// Since musl supports fully static linking, we can cross link for it even

@ -123,19 +137,13 @@ fn copy_third_party_objects(builder: &Builder<'_>, compiler: &Compiler, target:
// files. As those shipped with glibc won't work, copy the ones provided by
// musl so we have them on linux-gnu hosts.
if target.contains("musl") {
let srcdir = builder.musl_root(target).unwrap().join("lib");
for &obj in &["crt1.o", "crti.o", "crtn.o"] {
builder.copy(
&builder.musl_root(target).unwrap().join("lib").join(obj),
&libdir.join(obj),
);
copy_and_stamp(&srcdir, obj);
}
} else if target.ends_with("-wasi") {
for &obj in &["crt1.o"] {
builder.copy(
&builder.wasi_root(target).unwrap().join("lib/wasm32-wasi").join(obj),
&libdir.join(obj),
);
}
let srcdir = builder.wasi_root(target).unwrap().join("lib/wasm32-wasi");
copy_and_stamp(&srcdir, "crt1.o");
}

// Copies libunwind.a compiled to be linked with x86_64-fortanix-unknown-sgx.

@ -145,11 +153,11 @@ fn copy_third_party_objects(builder: &Builder<'_>, compiler: &Compiler, target:
// which is provided by std for this target.
if target == "x86_64-fortanix-unknown-sgx" {
let src_path_env = "X86_FORTANIX_SGX_LIBS";
let obj = "libunwind.a";
let src = env::var(src_path_env).expect(&format!("{} not found in env", src_path_env));
let src = Path::new(&src).join(obj);
builder.copy(&src, &libdir.join(obj));
copy_and_stamp(Path::new(&src), "libunwind.a");
}

target_deps
}
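The rewritten `copy_third_party_objects` above threads every copied object file back to the caller through a `copy_and_stamp` closure that appends to a shared `target_deps` vector, so the copied startup objects end up recorded as build dependencies. A self-contained sketch of that pattern, with `std::fs::copy` standing in for `builder.copy` and throwaway demo paths:

```rust
use std::fs;
use std::io;
use std::path::{Path, PathBuf};

/// Copy a set of objects from `srcdir` into `libdir`, returning the paths of
/// everything that was copied so the caller can record them as dependencies.
fn copy_objects(srcdir: &Path, libdir: &Path, names: &[&str]) -> io::Result<Vec<PathBuf>> {
    let mut target_deps = Vec::new();
    // This closure plays the role of `copy_and_stamp`: copy one file and
    // remember its destination path.
    let mut copy_and_stamp = |name: &str| -> io::Result<()> {
        let dst = libdir.join(name);
        fs::copy(srcdir.join(name), &dst)?;
        target_deps.push(dst);
        Ok(())
    };
    for &name in names {
        copy_and_stamp(name)?;
    }
    Ok(target_deps)
}

fn main() -> io::Result<()> {
    // Throwaway directories for the demo; in bootstrap these come from
    // musl_root()/wasi_root() and sysroot_libdir().
    let src = Path::new("demo-src");
    let dst = Path::new("demo-libdir");
    fs::create_dir_all(src)?;
    fs::create_dir_all(dst)?;
    for name in ["crt1.o", "crti.o", "crtn.o"] {
        fs::write(src.join(name), b"")?;
    }
    let deps = copy_objects(src, dst, &["crt1.o", "crti.o", "crtn.o"])?;
    println!("recorded {} target deps", deps.len());
    Ok(())
}
```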
/// Configure cargo to compile the standard library, adding appropriate env vars

@ -210,7 +218,6 @@ pub fn std_cargo(builder: &Builder<'_>,
// config.toml equivalent) is used
let llvm_config = builder.ensure(native::Llvm {
target: builder.config.build,
emscripten: false,
});
cargo.env("LLVM_CONFIG", llvm_config);
cargo.env("RUSTC_BUILD_SANITIZERS", "1");

@ -307,7 +314,7 @@ pub struct StartupObjects {
}

impl Step for StartupObjects {
type Output = ();
type Output = Vec<PathBuf>;

fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("src/rtstartup")

@ -326,13 +333,15 @@ impl Step for StartupObjects {
/// They don't require any library support as they're just plain old object
/// files, so we just use the nightly snapshot compiler to always build them (as
/// no other compilers are guaranteed to be available).
fn run(self, builder: &Builder<'_>) {
fn run(self, builder: &Builder<'_>) -> Vec<PathBuf> {
let for_compiler = self.compiler;
let target = self.target;
if !target.contains("windows-gnu") {
return
return vec![]
}

let mut target_deps = vec![];

let src_dir = &builder.src.join("src/rtstartup");
let dst_dir = &builder.native_dir(target).join("rtstartup");
let sysroot_dir = &builder.sysroot_libdir(for_compiler, target);

@ -351,7 +360,9 @@ impl Step for StartupObjects {
.arg(src_file));
}

builder.copy(dst_file, &sysroot_dir.join(file.to_string() + ".o"));
let target = sysroot_dir.join(file.to_string() + ".o");
builder.copy(dst_file, &target);
target_deps.push(target);
}

for obj in ["crt2.o", "dllcrt2.o"].iter() {

@ -359,8 +370,12 @@ impl Step for StartupObjects {
builder.cc(target),
target,
obj);
builder.copy(&src, &sysroot_dir.join(obj));
let target = sysroot_dir.join(obj);
builder.copy(&src, &target);
target_deps.push(target);
}

target_deps
}
}
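The `StartupObjects` change above switches the step's associated `Output` type from `()` to `Vec<PathBuf>`, so `Std::run` can collect the copied startup-object paths into `target_deps`. A stripped-down sketch of that shape, using a hypothetical `Step` trait and placeholder paths rather than the real bootstrap API:

```rust
use std::path::PathBuf;

/// Hypothetical, much-simplified version of the bootstrap `Step` trait:
/// each step declares what its `run` produces.
trait Step {
    type Output;
    fn run(self) -> Self::Output;
}

struct StartupObjects {
    target: String,
}

impl Step for StartupObjects {
    // Returning the produced object paths (instead of `()`) lets the caller
    // record them as dependencies of the standard library build.
    type Output = Vec<PathBuf>;

    fn run(self) -> Vec<PathBuf> {
        // Only windows-gnu needs rtstartup objects; everything else produces none.
        if !self.target.contains("windows-gnu") {
            return vec![];
        }
        // "sysroot/lib" is a placeholder for the real sysroot libdir.
        ["crt2.o", "dllcrt2.o"]
            .iter()
            .map(|obj| PathBuf::from("sysroot/lib").join(obj))
            .collect()
    }
}

fn main() {
    let deps = StartupObjects { target: "x86_64-pc-windows-gnu".into() }.run();
    println!("{} startup objects recorded", deps.len());
}
```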
@ -438,6 +453,7 @@ impl Step for Rustc {
|
||||
cargo,
|
||||
vec![],
|
||||
&librustc_stamp(builder, compiler, target),
|
||||
vec![],
|
||||
false);
|
||||
|
||||
builder.ensure(RustcLink {
|
||||
@ -586,7 +602,7 @@ impl Step for CodegenBackend {
|
||||
|
||||
let tmp_stamp = out_dir.join(".tmp.stamp");
|
||||
|
||||
let files = run_cargo(builder, cargo, vec![], &tmp_stamp, false);
|
||||
let files = run_cargo(builder, cargo, vec![], &tmp_stamp, vec![], false);
|
||||
if builder.config.dry_run {
|
||||
return;
|
||||
}
|
||||
@ -615,46 +631,37 @@ pub fn build_codegen_backend(builder: &Builder<'_>,
|
||||
compiler: &Compiler,
|
||||
target: Interned<String>,
|
||||
backend: Interned<String>) -> String {
|
||||
let mut features = String::new();
|
||||
|
||||
match &*backend {
|
||||
"llvm" | "emscripten" => {
|
||||
"llvm" => {
|
||||
// Build LLVM for our target. This will implicitly build the
|
||||
// host LLVM if necessary.
|
||||
let llvm_config = builder.ensure(native::Llvm {
|
||||
target,
|
||||
emscripten: backend == "emscripten",
|
||||
});
|
||||
|
||||
if backend == "emscripten" {
|
||||
features.push_str(" emscripten");
|
||||
}
|
||||
|
||||
builder.info(&format!("Building stage{} codegen artifacts ({} -> {}, {})",
|
||||
compiler.stage, &compiler.host, target, backend));
|
||||
|
||||
// Pass down configuration from the LLVM build into the build of
|
||||
// librustc_llvm and librustc_codegen_llvm.
|
||||
if builder.is_rust_llvm(target) && backend != "emscripten" {
|
||||
if builder.is_rust_llvm(target) {
|
||||
cargo.env("LLVM_RUSTLLVM", "1");
|
||||
}
|
||||
|
||||
cargo.env("LLVM_CONFIG", &llvm_config);
|
||||
if backend != "emscripten" {
|
||||
let target_config = builder.config.target_config.get(&target);
|
||||
if let Some(s) = target_config.and_then(|c| c.llvm_config.as_ref()) {
|
||||
cargo.env("CFG_LLVM_ROOT", s);
|
||||
}
|
||||
let target_config = builder.config.target_config.get(&target);
|
||||
if let Some(s) = target_config.and_then(|c| c.llvm_config.as_ref()) {
|
||||
cargo.env("CFG_LLVM_ROOT", s);
|
||||
}
|
||||
// Some LLVM linker flags (-L and -l) may be needed to link librustc_llvm.
|
||||
if let Some(ref s) = builder.config.llvm_ldflags {
|
||||
cargo.env("LLVM_LINKER_FLAGS", s);
|
||||
}
|
||||
// Building with a static libstdc++ is only supported on linux right now,
|
||||
// Building with a static libstdc++ is only supported on linux and mingw right now,
|
||||
// not for MSVC or macOS
|
||||
if builder.config.llvm_static_stdcpp &&
|
||||
!target.contains("freebsd") &&
|
||||
!target.contains("windows") &&
|
||||
!target.contains("msvc") &&
|
||||
!target.contains("apple") {
|
||||
let file = compiler_file(builder,
|
||||
builder.cxx(target).unwrap(),
|
||||
@ -662,9 +669,7 @@ pub fn build_codegen_backend(builder: &Builder<'_>,
|
||||
"libstdc++.a");
|
||||
cargo.env("LLVM_STATIC_STDCPP", file);
|
||||
}
|
||||
if builder.config.llvm_link_shared ||
|
||||
(builder.config.llvm_thin_lto && backend != "emscripten")
|
||||
{
|
||||
if builder.config.llvm_link_shared || builder.config.llvm_thin_lto {
|
||||
cargo.env("LLVM_LINK_SHARED", "1");
|
||||
}
|
||||
if builder.config.llvm_use_libcxx {
|
||||
@ -676,8 +681,7 @@ pub fn build_codegen_backend(builder: &Builder<'_>,
|
||||
}
|
||||
_ => panic!("unknown backend: {}", backend),
|
||||
}
|
||||
|
||||
features
|
||||
String::new()
|
||||
}
|
||||
|
||||
/// Creates the `codegen-backends` folder for a compiler that's about to be

@ -954,6 +958,7 @@ pub fn run_cargo(builder: &Builder<'_>,
cargo: Cargo,
tail_args: Vec<String>,
stamp: &Path,
additional_target_deps: Vec<PathBuf>,
is_check: bool)
-> Vec<PathBuf>
{

@ -1070,6 +1075,7 @@ pub fn run_cargo(builder: &Builder<'_>,
deps.push((path_to_add.into(), false));
}

deps.extend(additional_target_deps.into_iter().map(|d| (d, false)));
deps.sort();
let mut new_contents = Vec::new();
for (dep, proc_macro) in deps.iter() {
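`run_cargo` now takes an extra `additional_target_deps: Vec<PathBuf>` and folds it into the same sorted `deps` list that gets written to the stamp file; elsewhere in this commit the stamp is read back by splitting its contents on NUL bytes (see the `read_stamp_file` hunk in `src/bootstrap/lib.rs`). A hedged sketch of writing and re-reading such a NUL-separated stamp, independent of the real bootstrap format (which also tags each entry with a host/target flag that this sketch omits):

```rust
use std::fs;
use std::io;
use std::path::PathBuf;

/// Write a stamp file listing every produced artifact, NUL-separated, after
/// merging in extra dependencies supplied by the caller (the role played by
/// `additional_target_deps` above).
fn write_stamp(stamp: &str, mut deps: Vec<PathBuf>, additional: Vec<PathBuf>) -> io::Result<()> {
    deps.extend(additional);
    deps.sort();
    let mut contents = Vec::new();
    for dep in &deps {
        contents.extend(dep.to_str().expect("non-UTF-8 path").as_bytes());
        contents.push(0);
    }
    fs::write(stamp, contents)
}

/// Read the stamp back, mirroring the `contents.split(|b| *b == 0)` loop
/// used by `read_stamp_file`.
fn read_stamp(stamp: &str) -> io::Result<Vec<PathBuf>> {
    let contents = fs::read(stamp)?;
    Ok(contents
        .split(|b| *b == 0)
        .filter(|part| !part.is_empty())
        .map(|part| PathBuf::from(String::from_utf8_lossy(part).into_owned()))
        .collect())
}

fn main() -> io::Result<()> {
    let stamp = "demo.stamp"; // throwaway file name for the example
    write_stamp(
        stamp,
        vec![PathBuf::from("libstd.rlib")],
        vec![PathBuf::from("crt1.o"), PathBuf::from("crt2.o")],
    )?;
    println!("{:?}", read_stamp(stamp)?);
    Ok(())
}
```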
@ -200,16 +200,15 @@ struct Build {
|
||||
target: Vec<String>,
|
||||
cargo: Option<String>,
|
||||
rustc: Option<String>,
|
||||
low_priority: Option<bool>,
|
||||
compiler_docs: Option<bool>,
|
||||
docs: Option<bool>,
|
||||
compiler_docs: Option<bool>,
|
||||
submodules: Option<bool>,
|
||||
fast_submodules: Option<bool>,
|
||||
gdb: Option<String>,
|
||||
locked_deps: Option<bool>,
|
||||
vendor: Option<bool>,
|
||||
nodejs: Option<String>,
|
||||
python: Option<String>,
|
||||
locked_deps: Option<bool>,
|
||||
vendor: Option<bool>,
|
||||
full_bootstrap: Option<bool>,
|
||||
extended: Option<bool>,
|
||||
tools: Option<HashSet<String>>,
|
||||
@ -217,6 +216,7 @@ struct Build {
|
||||
sanitizers: Option<bool>,
|
||||
profiler: Option<bool>,
|
||||
cargo_native_static: Option<bool>,
|
||||
low_priority: Option<bool>,
|
||||
configure_args: Option<Vec<String>>,
|
||||
local_rebuild: Option<bool>,
|
||||
print_step_timings: Option<bool>,
|
||||
@ -228,11 +228,11 @@ struct Build {
|
||||
struct Install {
|
||||
prefix: Option<String>,
|
||||
sysconfdir: Option<String>,
|
||||
datadir: Option<String>,
|
||||
docdir: Option<String>,
|
||||
bindir: Option<String>,
|
||||
libdir: Option<String>,
|
||||
mandir: Option<String>,
|
||||
datadir: Option<String>,
|
||||
|
||||
// standard paths, currently unused
|
||||
infodir: Option<String>,
|
||||
@ -243,14 +243,14 @@ struct Install {
|
||||
#[derive(Deserialize, Default)]
|
||||
#[serde(deny_unknown_fields, rename_all = "kebab-case")]
|
||||
struct Llvm {
|
||||
ccache: Option<StringOrBool>,
|
||||
ninja: Option<bool>,
|
||||
assertions: Option<bool>,
|
||||
optimize: Option<bool>,
|
||||
thin_lto: Option<bool>,
|
||||
release_debuginfo: Option<bool>,
|
||||
assertions: Option<bool>,
|
||||
ccache: Option<StringOrBool>,
|
||||
version_check: Option<bool>,
|
||||
static_libstdcpp: Option<bool>,
|
||||
ninja: Option<bool>,
|
||||
targets: Option<String>,
|
||||
experimental_targets: Option<String>,
|
||||
link_jobs: Option<u32>,
|
||||
@ -293,6 +293,7 @@ impl Default for StringOrBool {
|
||||
#[serde(deny_unknown_fields, rename_all = "kebab-case")]
|
||||
struct Rust {
|
||||
optimize: Option<bool>,
|
||||
debug: Option<bool>,
|
||||
codegen_units: Option<u32>,
|
||||
codegen_units_std: Option<u32>,
|
||||
debug_assertions: Option<bool>,
|
||||
@ -301,25 +302,24 @@ struct Rust {
|
||||
debuginfo_level_std: Option<u32>,
|
||||
debuginfo_level_tools: Option<u32>,
|
||||
debuginfo_level_tests: Option<u32>,
|
||||
parallel_compiler: Option<bool>,
|
||||
backtrace: Option<bool>,
|
||||
incremental: Option<bool>,
|
||||
parallel_compiler: Option<bool>,
|
||||
default_linker: Option<String>,
|
||||
channel: Option<String>,
|
||||
musl_root: Option<String>,
|
||||
rpath: Option<bool>,
|
||||
verbose_tests: Option<bool>,
|
||||
optimize_tests: Option<bool>,
|
||||
codegen_tests: Option<bool>,
|
||||
ignore_git: Option<bool>,
|
||||
debug: Option<bool>,
|
||||
dist_src: Option<bool>,
|
||||
verbose_tests: Option<bool>,
|
||||
incremental: Option<bool>,
|
||||
save_toolstates: Option<String>,
|
||||
codegen_backends: Option<Vec<String>>,
|
||||
codegen_backends_dir: Option<String>,
|
||||
lld: Option<bool>,
|
||||
lldb: Option<bool>,
|
||||
llvm_tools: Option<bool>,
|
||||
lldb: Option<bool>,
|
||||
deny_warnings: Option<bool>,
|
||||
backtrace_on_ice: Option<bool>,
|
||||
verify_llvm_ir: Option<bool>,
|
||||
@ -333,13 +333,13 @@ struct Rust {
|
||||
#[derive(Deserialize, Default)]
|
||||
#[serde(deny_unknown_fields, rename_all = "kebab-case")]
|
||||
struct TomlTarget {
|
||||
llvm_config: Option<String>,
|
||||
llvm_filecheck: Option<String>,
|
||||
cc: Option<String>,
|
||||
cxx: Option<String>,
|
||||
ar: Option<String>,
|
||||
ranlib: Option<String>,
|
||||
linker: Option<String>,
|
||||
llvm_config: Option<String>,
|
||||
llvm_filecheck: Option<String>,
|
||||
android_ndk: Option<String>,
|
||||
crt_static: Option<bool>,
|
||||
musl_root: Option<String>,
|
||||
@ -668,7 +668,6 @@ impl Config {
|
||||
|
||||
pub fn llvm_enabled(&self) -> bool {
|
||||
self.rust_codegen_backends.contains(&INTERNER.intern_str("llvm"))
|
||||
|| self.rust_codegen_backends.contains(&INTERNER.intern_str("emscripten"))
|
||||
}
|
||||
}
|
||||
|
||||
|
src/bootstrap/configure.py

@ -55,7 +55,6 @@ o("sanitizers", "build.sanitizers", "build the sanitizer runtimes (asan, lsan, m
o("dist-src", "rust.dist-src", "when building tarballs enables building a source tarball")
o("cargo-native-static", "build.cargo-native-static", "static native libraries in cargo")
o("profiler", "build.profiler", "build the profiler runtime")
o("emscripten", None, "compile the emscripten backend as well as LLVM")
o("full-tools", None, "enable all tools")
o("lld", "rust.lld", "build lld")
o("lldb", "rust.lldb", "build lldb")

@ -134,6 +133,10 @@ v("musl-root-mips", "target.mips-unknown-linux-musl.musl-root",
"mips-unknown-linux-musl install directory")
v("musl-root-mipsel", "target.mipsel-unknown-linux-musl.musl-root",
"mipsel-unknown-linux-musl install directory")
v("musl-root-mips64", "target.mips64-unknown-linux-muslabi64.musl-root",
"mips64-unknown-linux-muslabi64 install directory")
v("musl-root-mips64el", "target.mips64el-unknown-linux-muslabi64.musl-root",
"mips64el-unknown-linux-muslabi64 install directory")
v("qemu-armhf-rootfs", "target.arm-unknown-linux-gnueabihf.qemu-rootfs",
"rootfs in qemu testing, you probably don't want to use this")
v("qemu-aarch64-rootfs", "target.aarch64-unknown-linux-gnu.qemu-rootfs",

@ -335,10 +338,8 @@ for key in known_args:
set('build.host', value.split(','))
elif option.name == 'target':
set('build.target', value.split(','))
elif option.name == 'emscripten':
set('rust.codegen-backends', ['llvm', 'emscripten'])
elif option.name == 'full-tools':
set('rust.codegen-backends', ['llvm', 'emscripten'])
set('rust.codegen-backends', ['llvm'])
set('rust.lld', True)
set('rust.llvm-tools', True)
set('build.extended', True)
@ -236,7 +236,7 @@ fn make_win_dist(
|
||||
}
|
||||
|
||||
let target_tools = ["gcc.exe", "ld.exe", "dlltool.exe", "libwinpthread-1.dll"];
|
||||
let mut rustc_dlls = vec!["libstdc++-6.dll", "libwinpthread-1.dll"];
|
||||
let mut rustc_dlls = vec!["libwinpthread-1.dll"];
|
||||
if target_triple.starts_with("i686-") {
|
||||
rustc_dlls.push("libgcc_s_dw2-1.dll");
|
||||
} else {
|
||||
@ -637,6 +637,28 @@ impl Step for DebuggerScripts {
|
||||
}
|
||||
}
|
||||
|
||||
fn skip_host_target_lib(builder: &Builder<'_>, compiler: Compiler) -> bool {
|
||||
// The only true set of target libraries came from the build triple, so
|
||||
// let's reduce redundant work by only producing archives from that host.
|
||||
if compiler.host != builder.config.build {
|
||||
builder.info("\tskipping, not a build host");
|
||||
true
|
||||
} else {
|
||||
false
|
||||
}
|
||||
}
|
||||
|
||||
/// Copy stamped files into an image's `target/lib` directory.
|
||||
fn copy_target_libs(builder: &Builder<'_>, target: &str, image: &Path, stamp: &Path) {
|
||||
let dst = image.join("lib/rustlib").join(target).join("lib");
|
||||
t!(fs::create_dir_all(&dst));
|
||||
for (path, host) in builder.read_stamp_file(stamp) {
|
||||
if !host || builder.config.build == target {
|
||||
builder.copy(&path, &dst.join(path.file_name().unwrap()));
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Debug, PartialOrd, Ord, Copy, Clone, Hash, PartialEq, Eq)]
|
||||
pub struct Std {
|
||||
pub compiler: Compiler,
|
||||
@ -667,44 +689,19 @@ impl Step for Std {
|
||||
let target = self.target;
|
||||
|
||||
let name = pkgname(builder, "rust-std");
|
||||
|
||||
// The only true set of target libraries came from the build triple, so
|
||||
// let's reduce redundant work by only producing archives from that host.
|
||||
if compiler.host != builder.config.build {
|
||||
builder.info("\tskipping, not a build host");
|
||||
return distdir(builder).join(format!("{}-{}.tar.gz", name, target));
|
||||
let archive = distdir(builder).join(format!("{}-{}.tar.gz", name, target));
|
||||
if skip_host_target_lib(builder, compiler) {
|
||||
return archive;
|
||||
}
|
||||
|
||||
// We want to package up as many target libraries as possible
|
||||
// for the `rust-std` package, so if this is a host target we
|
||||
// depend on librustc and otherwise we just depend on libtest.
|
||||
if builder.hosts.iter().any(|t| t == target) {
|
||||
builder.ensure(compile::Rustc { compiler, target });
|
||||
} else {
|
||||
builder.ensure(compile::Std { compiler, target });
|
||||
}
|
||||
builder.ensure(compile::Std { compiler, target });
|
||||
|
||||
let image = tmpdir(builder).join(format!("{}-{}-image", name, target));
|
||||
let _ = fs::remove_dir_all(&image);
|
||||
|
||||
let dst = image.join("lib/rustlib").join(target);
|
||||
t!(fs::create_dir_all(&dst));
|
||||
let mut src = builder.sysroot_libdir(compiler, target).to_path_buf();
|
||||
src.pop(); // Remove the trailing /lib folder from the sysroot_libdir
|
||||
builder.cp_filtered(&src, &dst, &|path| {
|
||||
if let Some(name) = path.file_name().and_then(|s| s.to_str()) {
|
||||
if name == builder.config.rust_codegen_backends_dir.as_str() {
|
||||
return false
|
||||
}
|
||||
if name == "bin" {
|
||||
return false
|
||||
}
|
||||
if name.contains("LLVM") {
|
||||
return false
|
||||
}
|
||||
}
|
||||
true
|
||||
});
|
||||
let compiler_to_use = builder.compiler_for(compiler.stage, compiler.host, target);
|
||||
let stamp = compile::libstd_stamp(builder, compiler_to_use, target);
|
||||
copy_target_libs(builder, &target, &image, &stamp);
|
||||
|
||||
let mut cmd = rust_installer(builder);
|
||||
cmd.arg("generate")
|
||||
@ -723,7 +720,73 @@ impl Step for Std {
|
||||
let _time = timeit(builder);
|
||||
builder.run(&mut cmd);
|
||||
builder.remove_dir(&image);
|
||||
distdir(builder).join(format!("{}-{}.tar.gz", name, target))
|
||||
archive
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Debug, PartialOrd, Ord, Copy, Clone, Hash, PartialEq, Eq)]
|
||||
pub struct RustcDev {
|
||||
pub compiler: Compiler,
|
||||
pub target: Interned<String>,
|
||||
}
|
||||
|
||||
impl Step for RustcDev {
|
||||
type Output = PathBuf;
|
||||
const DEFAULT: bool = true;
|
||||
const ONLY_HOSTS: bool = true;
|
||||
|
||||
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
|
||||
run.path("rustc-dev")
|
||||
}
|
||||
|
||||
fn make_run(run: RunConfig<'_>) {
|
||||
run.builder.ensure(RustcDev {
|
||||
compiler: run.builder.compiler_for(
|
||||
run.builder.top_stage,
|
||||
run.builder.config.build,
|
||||
run.target,
|
||||
),
|
||||
target: run.target,
|
||||
});
|
||||
}
|
||||
|
||||
fn run(self, builder: &Builder<'_>) -> PathBuf {
|
||||
let compiler = self.compiler;
|
||||
let target = self.target;
|
||||
|
||||
let name = pkgname(builder, "rustc-dev");
|
||||
let archive = distdir(builder).join(format!("{}-{}.tar.gz", name, target));
|
||||
if skip_host_target_lib(builder, compiler) {
|
||||
return archive;
|
||||
}
|
||||
|
||||
builder.ensure(compile::Rustc { compiler, target });
|
||||
|
||||
let image = tmpdir(builder).join(format!("{}-{}-image", name, target));
|
||||
let _ = fs::remove_dir_all(&image);
|
||||
|
||||
let compiler_to_use = builder.compiler_for(compiler.stage, compiler.host, target);
|
||||
let stamp = compile::librustc_stamp(builder, compiler_to_use, target);
|
||||
copy_target_libs(builder, &target, &image, &stamp);
|
||||
|
||||
let mut cmd = rust_installer(builder);
|
||||
cmd.arg("generate")
|
||||
.arg("--product-name=Rust")
|
||||
.arg("--rel-manifest-dir=rustlib")
|
||||
.arg("--success-message=Rust-is-ready-to-develop.")
|
||||
.arg("--image-dir").arg(&image)
|
||||
.arg("--work-dir").arg(&tmpdir(builder))
|
||||
.arg("--output-dir").arg(&distdir(builder))
|
||||
.arg(format!("--package-name={}-{}", name, target))
|
||||
.arg(format!("--component-name=rustc-dev-{}", target))
|
||||
.arg("--legacy-manifest-dirs=rustlib,cargo");
|
||||
|
||||
builder.info(&format!("Dist rustc-dev stage{} ({} -> {})",
|
||||
compiler.stage, &compiler.host, target));
|
||||
let _time = timeit(builder);
|
||||
builder.run(&mut cmd);
|
||||
builder.remove_dir(&image);
|
||||
archive
|
||||
}
|
||||
}
|
||||
|
||||
@ -826,7 +889,6 @@ fn copy_src_dirs(builder: &Builder<'_>, src_dirs: &[&str], exclude_dirs: &[&str]
|
||||
|
||||
const LLVM_TEST: &[&str] = &[
|
||||
"llvm-project/llvm/test", "llvm-project\\llvm\\test",
|
||||
"llvm-emscripten/test", "llvm-emscripten\\test",
|
||||
];
|
||||
if LLVM_TEST.iter().any(|path| spath.contains(path)) &&
|
||||
(spath.ends_with(".ll") ||
|
||||
@ -834,9 +896,6 @@ fn copy_src_dirs(builder: &Builder<'_>, src_dirs: &[&str], exclude_dirs: &[&str]
|
||||
spath.ends_with(".s")) {
|
||||
return false
|
||||
}
|
||||
if spath.contains("test/emscripten") || spath.contains("test\\emscripten") {
|
||||
return false
|
||||
}
|
||||
|
||||
let full_path = Path::new(dir).join(path);
|
||||
if exclude_dirs.iter().any(|excl| full_path == Path::new(excl)) {
|
||||
|
@ -160,7 +160,7 @@ mod job {
|
||||
}
|
||||
}
|
||||
|
||||
#[cfg(any(target_os = "haiku", not(any(unix, windows))))]
|
||||
#[cfg(any(target_os = "haiku", target_os = "hermit", not(any(unix, windows))))]
|
||||
mod job {
|
||||
pub unsafe fn setup(_build: &mut crate::Build) {
|
||||
}
|
||||
@ -232,7 +232,6 @@ pub struct Build {
|
||||
miri_info: channel::GitInfo,
|
||||
rustfmt_info: channel::GitInfo,
|
||||
in_tree_llvm_info: channel::GitInfo,
|
||||
emscripten_llvm_info: channel::GitInfo,
|
||||
local_rebuild: bool,
|
||||
fail_fast: bool,
|
||||
doc_tests: DocTests,
|
||||
@ -351,7 +350,6 @@ impl Build {
|
||||
|
||||
// we always try to use git for LLVM builds
|
||||
let in_tree_llvm_info = channel::GitInfo::new(false, &src.join("src/llvm-project"));
|
||||
let emscripten_llvm_info = channel::GitInfo::new(false, &src.join("src/llvm-emscripten"));
|
||||
|
||||
let mut build = Build {
|
||||
initial_rustc: config.initial_rustc.clone(),
|
||||
@ -376,7 +374,6 @@ impl Build {
|
||||
miri_info,
|
||||
rustfmt_info,
|
||||
in_tree_llvm_info,
|
||||
emscripten_llvm_info,
|
||||
cc: HashMap::new(),
|
||||
cxx: HashMap::new(),
|
||||
ar: HashMap::new(),
|
||||
@ -553,10 +550,6 @@ impl Build {
|
||||
self.out.join(&*target).join("llvm")
|
||||
}
|
||||
|
||||
fn emscripten_llvm_out(&self, target: Interned<String>) -> PathBuf {
|
||||
self.out.join(&*target).join("llvm-emscripten")
|
||||
}
|
||||
|
||||
fn lld_out(&self, target: Interned<String>) -> PathBuf {
|
||||
self.out.join(&*target).join("lld")
|
||||
}
|
||||
@ -1087,6 +1080,10 @@ impl Build {
|
||||
/// done. The file is updated immediately after this function completes.
|
||||
pub fn save_toolstate(&self, tool: &str, state: ToolState) {
|
||||
if let Some(ref path) = self.config.save_toolstates {
|
||||
if let Some(parent) = path.parent() {
|
||||
// Ensure the parent directory always exists
|
||||
t!(std::fs::create_dir_all(parent));
|
||||
}
|
||||
let mut file = t!(fs::OpenOptions::new()
|
||||
.create(true)
|
||||
.read(true)
|
||||
@ -1126,7 +1123,7 @@ impl Build {
|
||||
}
|
||||
|
||||
let mut paths = Vec::new();
|
||||
let contents = t!(fs::read(stamp));
|
||||
let contents = t!(fs::read(stamp), &stamp);
|
||||
// This is the method we use for extracting paths from the stamp file passed to us. See
|
||||
// run_cargo for more information (in compile.rs).
|
||||
for part in contents.split(|b| *b == 0) {
|
||||
@ -1144,6 +1141,7 @@ impl Build {
|
||||
pub fn copy(&self, src: &Path, dst: &Path) {
|
||||
if self.config.dry_run { return; }
|
||||
self.verbose_than(1, &format!("Copy {:?} to {:?}", src, dst));
|
||||
if src == dst { return; }
|
||||
let _ = fs::remove_file(&dst);
|
||||
let metadata = t!(src.symlink_metadata());
|
||||
if metadata.file_type().is_symlink() {
|
||||
|
@ -28,7 +28,6 @@ use crate::GitRepo;
|
||||
#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
|
||||
pub struct Llvm {
|
||||
pub target: Interned<String>,
|
||||
pub emscripten: bool,
|
||||
}
|
||||
|
||||
impl Step for Llvm {
|
||||
@ -40,46 +39,35 @@ impl Step for Llvm {
|
||||
run.path("src/llvm-project")
|
||||
.path("src/llvm-project/llvm")
|
||||
.path("src/llvm")
|
||||
.path("src/llvm-emscripten")
|
||||
}
|
||||
|
||||
fn make_run(run: RunConfig<'_>) {
|
||||
let emscripten = run.path.ends_with("llvm-emscripten");
|
||||
run.builder.ensure(Llvm {
|
||||
target: run.target,
|
||||
emscripten,
|
||||
});
|
||||
}
|
||||
|
||||
/// Compile LLVM for `target`.
|
||||
fn run(self, builder: &Builder<'_>) -> PathBuf {
|
||||
let target = self.target;
|
||||
let emscripten = self.emscripten;
|
||||
|
||||
// If we're using a custom LLVM bail out here, but we can only use a
|
||||
// custom LLVM for the build triple.
|
||||
if !self.emscripten {
|
||||
if let Some(config) = builder.config.target_config.get(&target) {
|
||||
if let Some(ref s) = config.llvm_config {
|
||||
check_llvm_version(builder, s);
|
||||
return s.to_path_buf()
|
||||
}
|
||||
if let Some(config) = builder.config.target_config.get(&target) {
|
||||
if let Some(ref s) = config.llvm_config {
|
||||
check_llvm_version(builder, s);
|
||||
return s.to_path_buf()
|
||||
}
|
||||
}
|
||||
|
||||
let (llvm_info, root, out_dir, llvm_config_ret_dir) = if emscripten {
|
||||
let info = &builder.emscripten_llvm_info;
|
||||
let dir = builder.emscripten_llvm_out(target);
|
||||
let config_dir = dir.join("bin");
|
||||
(info, "src/llvm-emscripten", dir, config_dir)
|
||||
} else {
|
||||
let info = &builder.in_tree_llvm_info;
|
||||
let mut dir = builder.llvm_out(builder.config.build);
|
||||
if !builder.config.build.contains("msvc") || builder.config.ninja {
|
||||
dir.push("build");
|
||||
}
|
||||
(info, "src/llvm-project/llvm", builder.llvm_out(target), dir.join("bin"))
|
||||
};
|
||||
let llvm_info = &builder.in_tree_llvm_info;
|
||||
let root = "src/llvm-project/llvm";
|
||||
let out_dir = builder.llvm_out(target);
|
||||
let mut llvm_config_ret_dir = builder.llvm_out(builder.config.build);
|
||||
if !builder.config.build.contains("msvc") || builder.config.ninja {
|
||||
llvm_config_ret_dir.push("build");
|
||||
}
|
||||
llvm_config_ret_dir.push("bin");
|
||||
|
||||
let build_llvm_config = llvm_config_ret_dir
|
||||
.join(exe("llvm-config", &*builder.config.build));
|
||||
@ -107,8 +95,7 @@ impl Step for Llvm {
|
||||
}
|
||||
}
|
||||
|
||||
let descriptor = if emscripten { "Emscripten " } else { "" };
|
||||
builder.info(&format!("Building {}LLVM for {}", descriptor, target));
|
||||
builder.info(&format!("Building LLVM for {}", target));
|
||||
let _time = util::timeit(&builder);
|
||||
t!(fs::create_dir_all(&out_dir));
|
||||
|
||||
@ -123,23 +110,15 @@ impl Step for Llvm {
|
||||
|
||||
// NOTE: remember to also update `config.toml.example` when changing the
|
||||
// defaults!
|
||||
let llvm_targets = if self.emscripten {
|
||||
"JSBackend"
|
||||
} else {
|
||||
match builder.config.llvm_targets {
|
||||
Some(ref s) => s,
|
||||
None => "AArch64;ARM;Hexagon;MSP430;Mips;NVPTX;PowerPC;RISCV;\
|
||||
Sparc;SystemZ;WebAssembly;X86",
|
||||
}
|
||||
let llvm_targets = match &builder.config.llvm_targets {
|
||||
Some(s) => s,
|
||||
None => "AArch64;ARM;Hexagon;MSP430;Mips;NVPTX;PowerPC;RISCV;\
|
||||
Sparc;SystemZ;WebAssembly;X86",
|
||||
};
|
||||
|
||||
let llvm_exp_targets = if self.emscripten {
|
||||
""
|
||||
} else {
|
||||
match builder.config.llvm_experimental_targets {
|
||||
Some(ref s) => s,
|
||||
None => "",
|
||||
}
|
||||
let llvm_exp_targets = match builder.config.llvm_experimental_targets {
|
||||
Some(ref s) => s,
|
||||
None => "",
|
||||
};
|
||||
|
||||
let assertions = if builder.config.llvm_assertions {"ON"} else {"OFF"};
|
||||
@ -157,40 +136,30 @@ impl Step for Llvm {
|
||||
.define("WITH_POLLY", "OFF")
|
||||
.define("LLVM_ENABLE_TERMINFO", "OFF")
|
||||
.define("LLVM_ENABLE_LIBEDIT", "OFF")
|
||||
.define("LLVM_ENABLE_BINDINGS", "OFF")
|
||||
.define("LLVM_ENABLE_Z3_SOLVER", "OFF")
|
||||
.define("LLVM_PARALLEL_COMPILE_JOBS", builder.jobs().to_string())
|
||||
.define("LLVM_TARGET_ARCH", target.split('-').next().unwrap())
|
||||
.define("LLVM_DEFAULT_TARGET_TRIPLE", target);
|
||||
|
||||
if builder.config.llvm_thin_lto && !emscripten {
|
||||
if builder.config.llvm_thin_lto {
|
||||
cfg.define("LLVM_ENABLE_LTO", "Thin");
|
||||
if !target.contains("apple") {
|
||||
cfg.define("LLVM_ENABLE_LLD", "ON");
|
||||
}
|
||||
}
|
||||
|
||||
// By default, LLVM will automatically find OCaml and, if it finds it,
|
||||
// install the LLVM bindings in LLVM_OCAML_INSTALL_PATH, which defaults
|
||||
// to /usr/bin/ocaml.
|
||||
// This causes problem for non-root builds of Rust. Side-step the issue
|
||||
// by setting LLVM_OCAML_INSTALL_PATH to a relative path, so it installs
|
||||
// in the prefix.
|
||||
cfg.define("LLVM_OCAML_INSTALL_PATH",
|
||||
env::var_os("LLVM_OCAML_INSTALL_PATH").unwrap_or_else(|| "usr/lib/ocaml".into()));
|
||||
|
||||
let want_lldb = builder.config.lldb_enabled && !self.emscripten;
|
||||
|
||||
// This setting makes the LLVM tools link to the dynamic LLVM library,
|
||||
// which saves both memory during parallel links and overall disk space
|
||||
// for the tools. We don't do this on every platform as it doesn't work
|
||||
// equally well everywhere.
|
||||
if builder.llvm_link_tools_dynamically(target) && !emscripten {
|
||||
if builder.llvm_link_tools_dynamically(target) {
|
||||
cfg.define("LLVM_LINK_LLVM_DYLIB", "ON");
|
||||
}
|
||||
|
||||
// For distribution we want the LLVM tools to be *statically* linked to libstdc++
|
||||
if builder.config.llvm_tools_enabled || want_lldb {
|
||||
if !target.contains("windows") {
|
||||
if builder.config.llvm_tools_enabled || builder.config.lldb_enabled {
|
||||
if !target.contains("msvc") {
|
||||
if target.contains("apple") {
|
||||
cfg.define("CMAKE_EXE_LINKER_FLAGS", "-static-libstdc++");
|
||||
} else {
|
||||
@ -217,7 +186,7 @@ impl Step for Llvm {
|
||||
enabled_llvm_projects.push("compiler-rt");
|
||||
}
|
||||
|
||||
if want_lldb {
|
||||
if builder.config.lldb_enabled {
|
||||
enabled_llvm_projects.push("clang");
|
||||
enabled_llvm_projects.push("lldb");
|
||||
// For the time being, disable code signing.
|
||||
@ -242,10 +211,9 @@ impl Step for Llvm {
|
||||
}
|
||||
|
||||
// http://llvm.org/docs/HowToCrossCompileLLVM.html
|
||||
if target != builder.config.build && !emscripten {
|
||||
if target != builder.config.build {
|
||||
builder.ensure(Llvm {
|
||||
target: builder.config.build,
|
||||
emscripten: false,
|
||||
});
|
||||
// FIXME: if the llvm root for the build triple is overridden then we
|
||||
// should use llvm-tblgen from there, also should verify that it
|
||||
@ -427,7 +395,7 @@ fn configure_cmake(builder: &Builder<'_>,
|
||||
cfg.define("CMAKE_C_FLAGS", cflags);
|
||||
let mut cxxflags = builder.cflags(target, GitRepo::Llvm).join(" ");
|
||||
if builder.config.llvm_static_stdcpp &&
|
||||
!target.contains("windows") &&
|
||||
!target.contains("msvc") &&
|
||||
!target.contains("netbsd")
|
||||
{
|
||||
cxxflags.push_str(" -static-libstdc++");
|
||||
@ -489,7 +457,6 @@ impl Step for Lld {
|
||||
|
||||
let llvm_config = builder.ensure(Llvm {
|
||||
target: self.target,
|
||||
emscripten: false,
|
||||
});
|
||||
|
||||
let out_dir = builder.lld_out(target);
|
||||
@ -567,6 +534,10 @@ impl Step for TestHelpers {
|
||||
builder.info("Building test helpers");
|
||||
t!(fs::create_dir_all(&dst));
|
||||
let mut cfg = cc::Build::new();
|
||||
// FIXME: Workaround for https://github.com/emscripten-core/emscripten/issues/9013
|
||||
if target.contains("emscripten") {
|
||||
cfg.pic(false);
|
||||
}
|
||||
|
||||
// We may have found various cross-compilers a little differently due to our
|
||||
// extra configuration, so inform gcc of these compilers. Note, though, that
|
||||
|
@ -386,8 +386,17 @@ impl Step for Miri {
|
||||
extra_features: Vec::new(),
|
||||
});
|
||||
if let Some(miri) = miri {
|
||||
let mut cargo = builder.cargo(compiler, Mode::ToolRustc, host, "install");
|
||||
cargo.arg("xargo");
|
||||
// Configure `cargo install` path. cargo adds a `bin/`.
|
||||
cargo.env("CARGO_INSTALL_ROOT", &builder.out);
|
||||
|
||||
let mut cargo = Command::from(cargo);
|
||||
if !try_run(builder, &mut cargo) {
|
||||
return;
|
||||
}
|
||||
|
||||
// # Run `cargo miri setup`.
|
||||
// As a side-effect, this will install xargo.
|
||||
let mut cargo = tool::prepare_tool_cargo(
|
||||
builder,
|
||||
compiler,
|
||||
@ -412,9 +421,7 @@ impl Step for Miri {
|
||||
cargo.env("XARGO_RUST_SRC", builder.src.join("src"));
|
||||
// Debug things.
|
||||
cargo.env("RUST_BACKTRACE", "1");
|
||||
// Configure `cargo install` path, and let cargo-miri know that that's where
|
||||
// xargo ends up.
|
||||
cargo.env("CARGO_INSTALL_ROOT", &builder.out); // cargo adds a `bin/`
|
||||
// Let cargo-miri know where xargo ended up.
|
||||
cargo.env("XARGO", builder.out.join("bin").join("xargo"));
|
||||
|
||||
let mut cargo = Command::from(cargo);
|
||||
@ -427,7 +434,7 @@ impl Step for Miri {
|
||||
// (We do this separately from the above so that when the setup actually
|
||||
// happens we get some output.)
|
||||
// We re-use the `cargo` from above.
|
||||
cargo.arg("--env");
|
||||
cargo.arg("--print-sysroot");
|
||||
|
||||
// FIXME: Is there a way in which we can re-use the usual `run` helpers?
|
||||
let miri_sysroot = if builder.config.dry_run {
|
||||
@ -437,13 +444,11 @@ impl Step for Miri {
|
||||
let out = cargo.output()
|
||||
.expect("We already ran `cargo miri setup` before and that worked");
|
||||
assert!(out.status.success(), "`cargo miri setup` returned with non-0 exit code");
|
||||
// Output is "MIRI_SYSROOT=<str>\n".
|
||||
// Output is "<sysroot>\n".
|
||||
let stdout = String::from_utf8(out.stdout)
|
||||
.expect("`cargo miri setup` stdout is not valid UTF-8");
|
||||
let stdout = stdout.trim();
|
||||
builder.verbose(&format!("`cargo miri setup --env` returned: {:?}", stdout));
|
||||
let sysroot = stdout.splitn(2, '=')
|
||||
.nth(1).expect("`cargo miri setup` stdout did not contain '='");
|
||||
let sysroot = stdout.trim_end();
|
||||
builder.verbose(&format!("`cargo miri setup --print-sysroot` said: {:?}", sysroot));
|
||||
sysroot.to_owned()
|
||||
};
|
||||
|
||||
@ -1047,10 +1052,11 @@ impl Step for Compiletest {
|
||||
// Also provide `rust_test_helpers` for the host.
|
||||
builder.ensure(native::TestHelpers { target: compiler.host });
|
||||
|
||||
// wasm32 can't build the test helpers
|
||||
if !target.contains("wasm32") {
|
||||
// As well as the target, except for plain wasm32, which can't build it
|
||||
if !target.contains("wasm32") || target.contains("emscripten") {
|
||||
builder.ensure(native::TestHelpers { target });
|
||||
}
|
||||
|
||||
builder.ensure(RemoteCopyLibs { compiler, target });
|
||||
|
||||
let mut cmd = builder.tool_cmd(Tool::Compiletest);
|
||||
@ -1164,7 +1170,7 @@ impl Step for Compiletest {
|
||||
}).to_string()
|
||||
})
|
||||
};
|
||||
let lldb_exe = if builder.config.lldb_enabled && !target.contains("emscripten") {
|
||||
let lldb_exe = if builder.config.lldb_enabled {
|
||||
// Test against the lldb that was just built.
|
||||
builder.llvm_out(target).join("bin").join("lldb")
|
||||
} else {
|
||||
@ -1233,7 +1239,6 @@ impl Step for Compiletest {
|
||||
if builder.config.llvm_enabled() {
|
||||
let llvm_config = builder.ensure(native::Llvm {
|
||||
target: builder.config.build,
|
||||
emscripten: false,
|
||||
});
|
||||
if !builder.config.dry_run {
|
||||
let llvm_version = output(Command::new(&llvm_config).arg("--version"));
|
||||
|
@ -244,6 +244,7 @@ pub fn prepare_tool_cargo(
|
||||
path.ends_with("rls") ||
|
||||
path.ends_with("clippy") ||
|
||||
path.ends_with("miri") ||
|
||||
path.ends_with("rustbook") ||
|
||||
path.ends_with("rustfmt")
|
||||
{
|
||||
cargo.env("LIBZ_SYS_STATIC", "1");
|
||||
|
@ -21,6 +21,13 @@ macro_rules! t {
Err(e) => panic!("{} failed with {}", stringify!($e), e),
}
};
// it can show extra info in the second parameter
($e:expr, $extra:expr) => {
match $e {
Ok(e) => e,
Err(e) => panic!("{} failed with {} ({:?})", stringify!($e), e, $extra),
}
};
}

// Because Cargo adds the compiler's dylib path to our library search path, llvm-config may
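The new second arm of `t!` lets callers attach context to the panic message; the `read_stamp_file` hunk earlier in this commit uses it as `t!(fs::read(stamp), &stamp)`. A tiny usage sketch of the same two-arm macro outside bootstrap, with a hypothetical file name for the demo:

```rust
use std::fs;

// Same shape as the `t!` macro in the hunk above: unwrap a Result, panicking
// with the expression text and, optionally, extra context.
macro_rules! t {
    ($e:expr) => {
        match $e {
            Ok(e) => e,
            Err(e) => panic!("{} failed with {}", stringify!($e), e),
        }
    };
    ($e:expr, $extra:expr) => {
        match $e {
            Ok(e) => e,
            Err(e) => panic!("{} failed with {} ({:?})", stringify!($e), e, $extra),
        }
    };
}

fn main() {
    let stamp = "demo.stamp"; // hypothetical path for the example
    fs::write(stamp, b"contents").unwrap();
    // Plain form: the panic message names only the failing expression.
    let contents = t!(fs::read(stamp));
    // Two-argument form: the message also shows which stamp file was being read.
    let again = t!(fs::read(stamp), &stamp);
    assert_eq!(contents, again);
}
```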
@ -130,6 +130,8 @@ jobs:
|
||||
IMAGE: i686-gnu-nopt
|
||||
test-various:
|
||||
IMAGE: test-various
|
||||
wasm32:
|
||||
IMAGE: wasm32
|
||||
x86_64-gnu:
|
||||
IMAGE: x86_64-gnu
|
||||
x86_64-gnu-full-bootstrap:
|
||||
@ -138,6 +140,7 @@ jobs:
|
||||
IMAGE: x86_64-gnu-aux
|
||||
x86_64-gnu-tools:
|
||||
IMAGE: x86_64-gnu-tools
|
||||
DEPLOY_TOOLSTATES_JSON: toolstates-linux.json
|
||||
x86_64-gnu-debug:
|
||||
IMAGE: x86_64-gnu-debug
|
||||
x86_64-gnu-nopt:
|
||||
@ -260,8 +263,9 @@ jobs:
|
||||
# MSVC tools tests
|
||||
x86_64-msvc-tools:
|
||||
MSYS_BITS: 64
|
||||
SCRIPT: src/ci/docker/x86_64-gnu-tools/checktools.sh x.py /tmp/toolstates.json windows
|
||||
RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-msvc --save-toolstates=/tmp/toolstates.json
|
||||
SCRIPT: src/ci/docker/x86_64-gnu-tools/checktools.sh x.py /tmp/toolstate/toolstates.json windows
|
||||
RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-msvc --save-toolstates=/tmp/toolstate/toolstates.json
|
||||
DEPLOY_TOOLSTATES_JSON: toolstates-windows.json
|
||||
|
||||
# 32/64-bit MinGW builds.
|
||||
#
|
||||
@ -313,6 +317,7 @@ jobs:
|
||||
|
||||
# 32/64 bit MSVC and GNU deployment
|
||||
dist-x86_64-msvc:
|
||||
MSYS_BITS: 64
|
||||
RUST_CONFIGURE_ARGS: >
|
||||
--build=x86_64-pc-windows-msvc
|
||||
--target=x86_64-pc-windows-msvc,aarch64-pc-windows-msvc
|
||||
@ -322,6 +327,7 @@ jobs:
|
||||
DIST_REQUIRE_ALL_TOOLS: 1
|
||||
DEPLOY: 1
|
||||
dist-i686-msvc:
|
||||
MSYS_BITS: 32
|
||||
RUST_CONFIGURE_ARGS: >
|
||||
--build=i686-pc-windows-msvc
|
||||
--target=i586-pc-windows-msvc
|
||||
|
@ -22,14 +22,6 @@ jobs:
|
||||
IMAGE: x86_64-gnu-llvm-6.0
|
||||
mingw-check:
|
||||
IMAGE: mingw-check
|
||||
|
||||
- job: LinuxTools
|
||||
timeoutInMinutes: 600
|
||||
pool:
|
||||
vmImage: ubuntu-16.04
|
||||
steps:
|
||||
- template: steps/run.yml
|
||||
parameters:
|
||||
only_on_updated_submodules: 'yes'
|
||||
variables:
|
||||
IMAGE: x86_64-gnu-tools
|
||||
x86_64-gnu-tools:
|
||||
IMAGE: x86_64-gnu-tools
|
||||
CI_ONLY_WHEN_SUBMODULES_CHANGED: 1
|
||||
|
@ -1,46 +0,0 @@
|
||||
steps:
|
||||
|
||||
- bash: |
|
||||
set -e
|
||||
curl -f http://releases.llvm.org/7.0.0/clang+llvm-7.0.0-x86_64-apple-darwin.tar.xz | tar xJf -
|
||||
|
||||
export CC=`pwd`/clang+llvm-7.0.0-x86_64-apple-darwin/bin/clang
|
||||
echo "##vso[task.setvariable variable=CC]$CC"
|
||||
|
||||
export CXX=`pwd`/clang+llvm-7.0.0-x86_64-apple-darwin/bin/clang++
|
||||
echo "##vso[task.setvariable variable=CXX]$CXX"
|
||||
|
||||
# Configure `AR` specifically so rustbuild doesn't try to infer it as
|
||||
# `clang-ar` by accident.
|
||||
echo "##vso[task.setvariable variable=AR]ar"
|
||||
displayName: Install clang (OSX)
|
||||
condition: and(succeeded(), eq(variables['Agent.OS'], 'Darwin'))
|
||||
|
||||
# If we're compiling for MSVC then we, like most other distribution builders,
|
||||
# switch to clang as the compiler. This'll allow us eventually to enable LTO
|
||||
# amongst LLVM and rustc. Note that we only do this on MSVC as I don't think
|
||||
# clang has an output mode compatible with MinGW that we need. If it does we
|
||||
# should switch to clang for MinGW as well!
|
||||
#
|
||||
# Note that the LLVM installer is an NSIS installer
|
||||
#
|
||||
# Original downloaded here came from
|
||||
# http://releases.llvm.org/7.0.0/LLVM-7.0.0-win64.exe
|
||||
# That installer was run through `wine` on Linux and then the resulting
|
||||
# installation directory (found in `$HOME/.wine/drive_c/Program Files/LLVM`) was
|
||||
# packaged up into a tarball. We've had issues otherwise that the installer will
|
||||
# randomly hang, provide not a lot of useful information, pollute global state,
|
||||
# etc. In general the tarball is just more confined and easier to deal with when
|
||||
# working with various CI environments.
|
||||
- bash: |
|
||||
set -e
|
||||
mkdir -p citools
|
||||
cd citools
|
||||
curl -f https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc/LLVM-7.0.0-win64.tar.gz | tar xzf -
|
||||
echo "##vso[task.setvariable variable=RUST_CONFIGURE_ARGS]$RUST_CONFIGURE_ARGS --set llvm.clang-cl=`pwd`/clang-rust/bin/clang-cl.exe"
|
||||
condition: and(succeeded(), eq(variables['Agent.OS'], 'Windows_NT'), eq(variables['MINGW_URL'],''))
|
||||
displayName: Install clang (Windows)
|
||||
|
||||
# Note that we don't install clang on Linux since its compiler story is just so
|
||||
# different. Each container has its own toolchain configured appropriately
|
||||
# already.
|
@ -1,21 +0,0 @@
|
||||
steps:
|
||||
|
||||
- bash: |
|
||||
set -e
|
||||
curl -fo /usr/local/bin/sccache https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc/2018-04-02-sccache-x86_64-apple-darwin
|
||||
chmod +x /usr/local/bin/sccache
|
||||
displayName: Install sccache (OSX)
|
||||
condition: and(succeeded(), eq(variables['Agent.OS'], 'Darwin'))
|
||||
|
||||
- script: |
|
||||
md sccache
|
||||
powershell -Command "$ProgressPreference = 'SilentlyContinue'; iwr -outf sccache\sccache.exe https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc/2018-04-26-sccache-x86_64-pc-windows-msvc"
|
||||
echo ##vso[task.prependpath]%CD%\sccache
|
||||
displayName: Install sccache (Windows)
|
||||
condition: and(succeeded(), eq(variables['Agent.OS'], 'Windows_NT'))
|
||||
|
||||
# Note that we don't install sccache on Linux since it's installed elsewhere
|
||||
# through all the containers.
|
||||
#
|
||||
# FIXME: we should probably install sccache outside the containers and then
|
||||
# mount it inside the containers so we can centralize all installation here.
|
@ -1,120 +0,0 @@
|
||||
steps:
|
||||
# We use the WIX toolset to create combined installers for Windows, and these
|
||||
# binaries are downloaded from
|
||||
# https://github.com/wixtoolset/wix3 originally
|
||||
- bash: |
|
||||
set -e
|
||||
curl -O https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc/wix311-binaries.zip
|
||||
echo "##vso[task.setvariable variable=WIX]`pwd`/wix"
|
||||
mkdir -p wix/bin
|
||||
cd wix/bin
|
||||
7z x ../../wix311-binaries.zip
|
||||
displayName: Install wix
|
||||
condition: and(succeeded(), eq(variables['Agent.OS'], 'Windows_NT'))
|
||||
|
||||
# We use InnoSetup and its `iscc` program to also create combined installers.
|
||||
# Honestly at this point WIX above and `iscc` are just holdovers from
|
||||
# oh-so-long-ago and are required for creating installers on Windows. I think
|
||||
# one is MSI installers and one is EXE, but they're not used so frequently at
|
||||
# this point anyway so perhaps it's a wash!
|
||||
- script: |
|
||||
echo ##vso[task.prependpath]C:\Program Files (x86)\Inno Setup 5
|
||||
curl.exe -o is-install.exe https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc/2017-08-22-is.exe
|
||||
is-install.exe /VERYSILENT /SUPPRESSMSGBOXES /NORESTART /SP-
|
||||
displayName: Install InnoSetup
|
||||
condition: and(succeeded(), eq(variables['Agent.OS'], 'Windows_NT'))
|
||||
|
||||
# We've had issues with the default drive in use running out of space during a
|
||||
# build, and it looks like the `C:` drive has more space than the default `D:`
|
||||
# drive. We should probably confirm this with the azure pipelines team at some
|
||||
# point, but this seems to fix our "disk space full" problems.
|
||||
- script: |
|
||||
mkdir c:\MORE_SPACE
|
||||
mklink /J build c:\MORE_SPACE
|
||||
displayName: "Ensure build happens on C:/ instead of D:/"
|
||||
condition: and(succeeded(), eq(variables['Agent.OS'], 'Windows_NT'))
|
||||
|
||||
- bash: git config --replace-all --global core.autocrlf false
|
||||
displayName: "Disable git automatic line ending conversion (on C:/)"
|
||||
|
||||
# Download and install MSYS2, needed primarily for the test suite (run-make) but
|
||||
# also used by the MinGW toolchain for assembling things.
|
||||
#
|
||||
# FIXME: we should probe the default azure image and see if we can use the MSYS2
|
||||
# toolchain there. (if there's even one there). For now though this gets the job
|
||||
# done.
|
||||
- bash: |
|
||||
set -e
|
||||
choco install msys2 --params="/InstallDir:$(System.Workfolder)/msys2 /NoPath" -y --no-progress
|
||||
echo "##vso[task.prependpath]$(System.Workfolder)/msys2/usr/bin"
|
||||
mkdir -p "$(System.Workfolder)/msys2/home/$USERNAME"
|
||||
displayName: Install msys2
|
||||
condition: and(succeeded(), eq(variables['Agent.OS'], 'Windows_NT'))
|
||||
|
||||
- bash: pacman -S --noconfirm --needed base-devel ca-certificates make diffutils tar
|
||||
displayName: Install msys2 base deps
|
||||
condition: and(succeeded(), eq(variables['Agent.OS'], 'Windows_NT'))
|
||||
|
||||
# If we need to download a custom MinGW, do so here and set the path
|
||||
# appropriately.
|
||||
#
|
||||
# Here we also do a pretty heinous thing which is to mangle the MinGW
|
||||
# installation we just downloaded. Currently, as of this writing, we're using
|
||||
# MinGW-w64 builds of gcc, and that's currently at 6.3.0. We use 6.3.0 as it
|
||||
# appears to be the first version which contains a fix for #40546, builds
|
||||
# randomly failing during LLVM due to ar.exe/ranlib.exe failures.
|
||||
#
|
||||
# Unfortunately, though, 6.3.0 *also* is the first version of MinGW-w64 builds
|
||||
# to contain a regression in gdb (#40184). As a result if we were to use the
|
||||
# gdb provided (7.11.1) then we would fail all debuginfo tests.
|
||||
#
|
||||
# In order to fix spurious failures (pretty high priority) we use 6.3.0. To
|
||||
# avoid disabling gdb tests we download an *old* version of gdb, specifically
|
||||
# that found inside the 6.2.0 distribution. We then overwrite the 6.3.0 gdb
|
||||
# with the 6.2.0 gdb to get tests passing.
|
||||
#
|
||||
# Note that we don't literally overwrite the gdb.exe binary because it appears
|
||||
# to just use gdborig.exe, so that's the binary we deal with instead.
|
||||
- bash: |
|
||||
set -e
|
||||
curl -o mingw.7z $MINGW_URL/$MINGW_ARCHIVE
|
||||
7z x -y mingw.7z > /dev/null
|
||||
curl -o $MINGW_DIR/bin/gdborig.exe $MINGW_URL/2017-04-20-${MSYS_BITS}bit-gdborig.exe
|
||||
echo "##vso[task.prependpath]`pwd`/$MINGW_DIR/bin"
|
||||
condition: and(succeeded(), eq(variables['Agent.OS'], 'Windows_NT'), ne(variables['MINGW_URL'],''))
|
||||
displayName: Download custom MinGW
|
||||
|
||||
# Otherwise install MinGW through `pacman`
|
||||
- bash: |
|
||||
set -e
|
||||
arch=i686
|
||||
if [ "$MSYS_BITS" = "64" ]; then
|
||||
arch=x86_64
|
||||
fi
|
||||
pacman -S --noconfirm --needed mingw-w64-$arch-toolchain mingw-w64-$arch-cmake mingw-w64-$arch-gcc mingw-w64-$arch-python2
|
||||
echo "##vso[task.prependpath]$(System.Workfolder)/msys2/mingw$MSYS_BITS/bin"
|
||||
condition: and(succeeded(), eq(variables['Agent.OS'], 'Windows_NT'), eq(variables['MINGW_URL'],''))
|
||||
displayName: Download standard MinGW
|
||||
|
||||
# Make sure we use the native python interpreter instead of some msys equivalent
|
||||
# one way or another. The msys interpreters seem to have weird path conversions
|
||||
# baked in which break LLVM's build system one way or another, so let's use the
|
||||
# native version which keeps everything as native as possible.
|
||||
- bash: |
|
||||
set -e
|
||||
cp C:/Python27amd64/python.exe C:/Python27amd64/python2.7.exe
|
||||
echo "##vso[task.prependpath]C:/Python27amd64"
|
||||
displayName: Prefer the "native" Python as LLVM has trouble building with MSYS sometimes
|
||||
condition: and(succeeded(), eq(variables['Agent.OS'], 'Windows_NT'))
|
||||
|
||||
# Note that this is originally from the github releases page of Ninja
|
||||
- bash: |
|
||||
set -e
|
||||
mkdir ninja
|
||||
curl -o ninja.zip https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc/2017-03-15-ninja-win.zip
|
||||
7z x -oninja ninja.zip
|
||||
rm ninja.zip
|
||||
echo "##vso[task.setvariable variable=RUST_CONFIGURE_ARGS]$RUST_CONFIGURE_ARGS --enable-ninja"
|
||||
echo "##vso[task.prependpath]`pwd`/ninja"
|
||||
displayName: Download and install ninja
|
||||
condition: and(succeeded(), eq(variables['Agent.OS'], 'Windows_NT'))
|
@ -6,11 +6,6 @@
|
||||
#
|
||||
# Check travis config for `gdb --batch` command to print all crash logs
|
||||
|
||||
parameters:
|
||||
# When this parameter is set to anything other than an empty string the tests
|
||||
# will only be executed when the commit updates submodules
|
||||
only_on_updated_submodules: ''
|
||||
|
||||
steps:
|
||||
|
||||
# Disable automatic line ending conversion, which is enabled by default on
|
||||
@ -26,21 +21,8 @@ steps:
|
||||
- checkout: self
|
||||
fetchDepth: 2
|
||||
|
||||
# Set the SKIP_JOB environment variable if this job is supposed to only run
|
||||
# when submodules are updated and they were not. The following time consuming
|
||||
# tasks will be skipped when the environment variable is present.
|
||||
- ${{ if parameters.only_on_updated_submodules }}:
|
||||
- bash: |
|
||||
set -e
|
||||
# Submodules pseudo-files inside git have the 160000 permissions, so when
|
||||
# those files are present in the diff a submodule was updated.
|
||||
if git diff HEAD^ | grep "^index .* 160000" >/dev/null 2>&1; then
|
||||
echo "Executing the job since submodules are updated"
|
||||
else
|
||||
echo "Not executing this job since no submodules were updated"
|
||||
echo "##vso[task.setvariable variable=SKIP_JOB;]1"
|
||||
fi
|
||||
displayName: Decide whether to run this job
|
||||
- bash: src/ci/scripts/should-skip-this.sh
|
||||
displayName: Decide whether to run this job
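For reference, a submodule bump shows up in `git diff` as a "gitlink" entry with mode 160000, which is what the script above greps for. A minimal check along the same lines (the submodule path is only an example):

```bash
# Prints a line such as "index 0123abc..4567def 160000" when a submodule
# like src/llvm-project was bumped by the last commit.
git diff HEAD^ -- src/llvm-project | grep "^index .* 160000"
```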
|
||||
|
||||
# Spawn a background process to collect CPU usage statistics which we'll upload
|
||||
# at the end of the build. See the comments in the script here for more
|
||||
@ -48,86 +30,106 @@ steps:
|
||||
- bash: python src/ci/cpu-usage-over-time.py &> cpu-usage.csv &
|
||||
displayName: "Collect CPU-usage statistics in the background"
|
||||
|
||||
- bash: printenv | sort
|
||||
displayName: Show environment variables
|
||||
- bash: src/ci/scripts/dump-environment.sh
|
||||
displayName: Show the current environment
|
||||
|
||||
- bash: |
|
||||
set -e
|
||||
df -h
|
||||
du . | sort -nr | head -n100
|
||||
displayName: Show disk usage
|
||||
# FIXME: this hasn't been tested, but maybe it works on Windows? Should test!
|
||||
condition: and(succeeded(), ne(variables['Agent.OS'], 'Windows_NT'))
|
||||
- bash: src/ci/scripts/install-sccache.sh
|
||||
env:
|
||||
AGENT_OS: $(Agent.OS)
|
||||
displayName: Install sccache
|
||||
condition: and(succeeded(), not(variables.SKIP_JOB))
|
||||
|
||||
- template: install-sccache.yml
|
||||
- template: install-clang.yml
|
||||
- bash: src/ci/scripts/install-clang.sh
|
||||
env:
|
||||
AGENT_OS: $(Agent.OS)
|
||||
displayName: Install clang
|
||||
condition: and(succeeded(), not(variables.SKIP_JOB))
|
||||
|
||||
# Switch to XCode 9.3 on OSX since it seems to be the last version that supports
|
||||
# i686-apple-darwin. We'll eventually want to upgrade this and it will probably
|
||||
# force us to drop i686-apple-darwin, but let's keep the wheels turning for now.
|
||||
- bash: |
|
||||
set -e
|
||||
sudo xcode-select --switch /Applications/Xcode_9.3.app
|
||||
displayName: Switch to Xcode 9.3 (OSX)
|
||||
condition: and(succeeded(), eq(variables['Agent.OS'], 'Darwin'))
|
||||
- bash: src/ci/scripts/switch-xcode.sh
|
||||
env:
|
||||
AGENT_OS: $(Agent.OS)
|
||||
displayName: Switch to Xcode 9.3
|
||||
condition: and(succeeded(), not(variables.SKIP_JOB))
|
||||
|
||||
- template: install-windows-build-deps.yml
|
||||
- bash: src/ci/scripts/install-wix.sh
|
||||
env:
|
||||
AGENT_OS: $(Agent.OS)
|
||||
displayName: Install wix
|
||||
condition: and(succeeded(), not(variables.SKIP_JOB))
|
||||
|
||||
# Looks like docker containers have IPv6 disabled by default, so let's turn it
|
||||
# on since libstd tests require it
|
||||
- bash: |
|
||||
set -e
|
||||
sudo mkdir -p /etc/docker
|
||||
echo '{"ipv6":true,"fixed-cidr-v6":"fd9a:8454:6789:13f7::/64"}' | sudo tee /etc/docker/daemon.json
|
||||
sudo service docker restart
|
||||
displayName: Enable IPv6
|
||||
condition: and(succeeded(), not(variables.SKIP_JOB), eq(variables['Agent.OS'], 'Linux'))
|
||||
- bash: src/ci/scripts/install-innosetup.sh
|
||||
env:
|
||||
AGENT_OS: $(Agent.OS)
|
||||
displayName: Install InnoSetup
|
||||
condition: and(succeeded(), not(variables.SKIP_JOB))
|
||||
|
||||
- bash: src/ci/scripts/windows-symlink-build-dir.sh
|
||||
env:
|
||||
AGENT_OS: $(Agent.OS)
|
||||
displayName: Ensure the build happens on C:\ instead of D:\
|
||||
condition: and(succeeded(), not(variables.SKIP_JOB))
|
||||
|
||||
- bash: src/ci/scripts/disable-git-crlf-conversion.sh
|
||||
displayName: "Disable git automatic line ending conversion (on C:/)"
|
||||
condition: and(succeeded(), not(variables.SKIP_JOB))
|
||||
|
||||
- bash: src/ci/scripts/install-msys2.sh
|
||||
env:
|
||||
AGENT_OS: $(Agent.OS)
|
||||
SYSTEM_WORKFOLDER: $(System.Workfolder)
|
||||
displayName: Install msys2
|
||||
condition: and(succeeded(), not(variables.SKIP_JOB))
|
||||
|
||||
- bash: src/ci/scripts/install-msys2-packages.sh
|
||||
env:
|
||||
AGENT_OS: $(Agent.OS)
|
||||
SYSTEM_WORKFOLDER: $(System.Workfolder)
|
||||
displayName: Install msys2 packages
|
||||
condition: and(succeeded(), not(variables.SKIP_JOB))
|
||||
|
||||
- bash: src/ci/scripts/install-mingw.sh
|
||||
env:
|
||||
AGENT_OS: $(Agent.OS)
|
||||
SYSTEM_WORKFOLDER: $(System.Workfolder)
|
||||
displayName: Install MinGW
|
||||
condition: and(succeeded(), not(variables.SKIP_JOB))
|
||||
|
||||
- bash: src/ci/scripts/install-ninja.sh
|
||||
env:
|
||||
AGENT_OS: $(Agent.OS)
|
||||
displayName: Install ninja
|
||||
condition: and(succeeded(), not(variables.SKIP_JOB))
|
||||
|
||||
- bash: src/ci/scripts/enable-docker-ipv6.sh
|
||||
env:
|
||||
AGENT_OS: $(Agent.OS)
|
||||
displayName: Enable IPv6 on Docker
|
||||
condition: and(succeeded(), not(variables.SKIP_JOB))
|
||||
|
||||
# Disable automatic line ending conversion (again). On Windows, when we're
|
||||
# installing dependencies, something switches the git configuration directory or
|
||||
# re-enables autocrlf. We've not tracked down the exact cause -- and there may
|
||||
# be multiple -- but this should ensure submodules are checked out with the
|
||||
# appropriate line endings.
|
||||
- bash: git config --replace-all --global core.autocrlf false
|
||||
displayName: "Disable git automatic line ending conversion"
|
||||
- bash: src/ci/scripts/disable-git-crlf-conversion.sh
|
||||
displayName: Disable git automatic line ending conversion
|
||||
condition: and(succeeded(), not(variables.SKIP_JOB))
|
||||
|
||||
# Check out all our submodules, but more quickly than using git by using one of
|
||||
# our custom scripts
|
||||
- bash: |
|
||||
set -e
|
||||
mkdir -p $HOME/rustsrc
|
||||
$BUILD_SOURCESDIRECTORY/src/ci/init_repo.sh . $HOME/rustsrc
|
||||
condition: and(succeeded(), not(variables.SKIP_JOB), ne(variables['Agent.OS'], 'Windows_NT'))
|
||||
displayName: Check out submodules (Unix)
|
||||
- script: |
|
||||
if not exist C:\cache\rustsrc\NUL mkdir C:\cache\rustsrc
|
||||
sh src/ci/init_repo.sh . /c/cache/rustsrc
|
||||
condition: and(succeeded(), not(variables.SKIP_JOB), eq(variables['Agent.OS'], 'Windows_NT'))
|
||||
displayName: Check out submodules (Windows)
|
||||
- bash: src/ci/scripts/checkout-submodules.sh
|
||||
env:
|
||||
AGENT_OS: $(Agent.OS)
|
||||
displayName: Checkout submodules
|
||||
condition: and(succeeded(), not(variables.SKIP_JOB))
|
||||
|
||||
# See also the disable for autocrlf above, this just checks that it worked
|
||||
#
|
||||
# We check both in rust-lang/rust and in a submodule to make sure both are
|
||||
# accurate. Submodules are checked out significantly later than the main
|
||||
# repository in this script, so settings can (and do!) change between then.
|
||||
#
|
||||
# Linux (and maybe macOS) builders don't currently have dos2unix so just only
|
||||
# run this step on Windows.
|
||||
- bash: |
|
||||
set -x
|
||||
# print out the git configuration so we can better investigate failures in
|
||||
# the following
|
||||
git config --list --show-origin
|
||||
dos2unix -ih Cargo.lock src/tools/rust-installer/install-template.sh
|
||||
endings=$(dos2unix -ic Cargo.lock src/tools/rust-installer/install-template.sh)
|
||||
# if endings has non-zero length, error out
|
||||
if [ -n "$endings" ]; then exit 1 ; fi
|
||||
condition: and(succeeded(), eq(variables['Agent.OS'], 'Windows_NT'))
|
||||
displayName: Verify line endings are LF
|
||||
- bash: src/ci/scripts/verify-line-endings.sh
|
||||
env:
|
||||
AGENT_OS: $(Agent.OS)
|
||||
displayName: Verify line endings
|
||||
condition: and(succeeded(), not(variables.SKIP_JOB))
|
||||
|
||||
# Ensure the `aws` CLI is installed so we can deploy later on, cache docker
|
||||
# images, etc.
|
||||
- bash: src/ci/install-awscli.sh
|
||||
- bash: src/ci/scripts/install-awscli.sh
|
||||
env:
|
||||
AGENT_OS: $(Agent.OS)
|
||||
condition: and(succeeded(), not(variables.SKIP_JOB))
|
||||
@ -181,37 +183,21 @@ steps:
|
||||
condition: and(succeeded(), not(variables.SKIP_JOB))
|
||||
displayName: Run build
|
||||
|
||||
# If we're a deploy builder, use the `aws` command to publish everything to our
|
||||
# bucket.
|
||||
- bash: |
|
||||
set -e
|
||||
source src/ci/shared.sh
|
||||
if [ "$AGENT_OS" = "Linux" ]; then
|
||||
rm -rf obj/build/dist/doc
|
||||
upload_dir=obj/build/dist
|
||||
else
|
||||
rm -rf build/dist/doc
|
||||
upload_dir=build/dist
|
||||
fi
|
||||
ls -la $upload_dir
|
||||
deploy_dir=rustc-builds
|
||||
if [ "$DEPLOY_ALT" == "1" ]; then
|
||||
deploy_dir=rustc-builds-alt
|
||||
fi
|
||||
retry aws s3 cp --no-progress --recursive --acl public-read ./$upload_dir s3://$DEPLOY_BUCKET/$deploy_dir/$BUILD_SOURCEVERSION
|
||||
- bash: src/ci/scripts/upload-artifacts.sh
|
||||
env:
|
||||
AWS_ACCESS_KEY_ID: $(UPLOAD_AWS_ACCESS_KEY_ID)
|
||||
AWS_SECRET_ACCESS_KEY: $(UPLOAD_AWS_SECRET_ACCESS_KEY)
|
||||
condition: and(succeeded(), not(variables.SKIP_JOB), or(eq(variables.DEPLOY, '1'), eq(variables.DEPLOY_ALT, '1')))
|
||||
displayName: Upload artifacts
|
||||
|
||||
# Upload CPU usage statistics that we've been gathering this whole time. Always
|
||||
# execute this step in case we want to inspect failed builds, but don't let
|
||||
# errors here ever fail the build since this is just informational.
|
||||
- bash: aws s3 cp --acl public-read cpu-usage.csv s3://$DEPLOY_BUCKET/rustc-builds/$BUILD_SOURCEVERSION/cpu-$CI_JOB_NAME.csv
|
||||
env:
|
||||
AWS_ACCESS_KEY_ID: $(UPLOAD_AWS_ACCESS_KEY_ID)
|
||||
AWS_SECRET_ACCESS_KEY: $(UPLOAD_AWS_SECRET_ACCESS_KEY)
|
||||
condition: variables['UPLOAD_AWS_SECRET_ACCESS_KEY']
|
||||
continueOnError: true
|
||||
displayName: Upload CPU usage statistics
|
||||
# Adding a condition on DEPLOY=1 or DEPLOY_ALT=1 is not needed as all deploy
|
||||
# builders *should* have the AWS credentials available. Still, explicitly
|
||||
# adding the condition is helpful as this way CI will not silently skip
|
||||
# deploying artifacts from a dist builder if the variables are misconfigured,
|
||||
# erroring about invalid credentials instead.
|
||||
condition: |
|
||||
and(
|
||||
succeeded(), not(variables.SKIP_JOB),
|
||||
or(
|
||||
variables.UPLOAD_AWS_SECRET_ACCESS_KEY,
|
||||
eq(variables.DEPLOY, '1'), eq(variables.DEPLOY_ALT, '1')
|
||||
)
|
||||
)
|
||||
|
@ -1,47 +0,0 @@
|
||||
FROM ubuntu:16.04
|
||||
|
||||
RUN apt-get update && apt-get install -y --no-install-recommends \
|
||||
g++ \
|
||||
make \
|
||||
file \
|
||||
curl \
|
||||
ca-certificates \
|
||||
python \
|
||||
git \
|
||||
cmake \
|
||||
sudo \
|
||||
gdb \
|
||||
xz-utils
|
||||
|
||||
COPY scripts/emscripten.sh /scripts/
|
||||
RUN bash /scripts/emscripten.sh
|
||||
|
||||
COPY scripts/sccache.sh /scripts/
|
||||
RUN sh /scripts/sccache.sh
|
||||
|
||||
ENV PATH=$PATH:/emsdk-portable
|
||||
ENV PATH=$PATH:/emsdk-portable/clang/e1.38.15_64bit/
|
||||
ENV PATH=$PATH:/emsdk-portable/emscripten/1.38.15/
|
||||
ENV PATH=$PATH:/emsdk-portable/node/8.9.1_64bit/bin/
|
||||
ENV EMSCRIPTEN=/emsdk-portable/emscripten/1.38.15/
|
||||
ENV BINARYEN_ROOT=/emsdk-portable/clang/e1.38.15_64bit/binaryen/
|
||||
ENV EM_CONFIG=/emsdk-portable/.emscripten
|
||||
|
||||
ENV TARGETS=asmjs-unknown-emscripten
|
||||
|
||||
ENV RUST_CONFIGURE_ARGS --enable-emscripten --disable-optimize-tests
|
||||
|
||||
ENV SCRIPT python2.7 ../x.py test --target $TARGETS \
|
||||
src/test/ui \
|
||||
src/test/run-fail \
|
||||
src/libstd \
|
||||
src/liballoc \
|
||||
src/libcore
|
||||
|
||||
# Debug assertions in rustc are largely covered by other builders, and LLVM
|
||||
# assertions cause this builder to slow down by quite a large amount and don't
|
||||
# buy us a huge amount over other builders (not sure if we've ever seen an
|
||||
# asmjs-specific backend assertion trip), so disable assertions for these
|
||||
# tests.
|
||||
ENV NO_LLVM_ASSERTIONS=1
|
||||
ENV NO_DEBUG_ASSERTIONS=1
|
41
src/ci/docker/disabled/asmjs/Dockerfile
Normal file
@ -0,0 +1,41 @@
|
||||
FROM ubuntu:16.04
|
||||
|
||||
RUN apt-get update && apt-get install -y --no-install-recommends \
|
||||
g++ \
|
||||
make \
|
||||
file \
|
||||
curl \
|
||||
ca-certificates \
|
||||
python \
|
||||
git \
|
||||
cmake \
|
||||
sudo \
|
||||
gdb \
|
||||
xz-utils \
|
||||
bzip2
|
||||
|
||||
COPY scripts/emscripten.sh /scripts/
|
||||
RUN bash /scripts/emscripten.sh
|
||||
|
||||
COPY scripts/sccache.sh /scripts/
|
||||
RUN sh /scripts/sccache.sh
|
||||
|
||||
ENV PATH=$PATH:/emsdk-portable
|
||||
ENV PATH=$PATH:/emsdk-portable/upstream/emscripten/
|
||||
ENV PATH=$PATH:/emsdk-portable/node/12.9.1_64bit/bin/
|
||||
ENV BINARYEN_ROOT=/emsdk-portable/upstream/
|
||||
|
||||
ENV TARGETS=asmjs-unknown-emscripten
|
||||
|
||||
# Use -O1 optimizations in the link step to reduce time spent optimizing JS.
|
||||
ENV EMCC_CFLAGS=-O1
|
||||
|
||||
# Emscripten installation is user-specific
|
||||
ENV NO_CHANGE_USER=1
|
||||
|
||||
ENV SCRIPT python2.7 ../x.py test --target $TARGETS
|
||||
|
||||
# This is almost identical to the wasm32-unknown-emscripten target, so
|
||||
# running with assertions again is not useful
|
||||
ENV NO_DEBUG_ASSERTIONS=1
|
||||
ENV NO_LLVM_ASSERTIONS=1
|
@ -1,35 +0,0 @@
|
||||
FROM ubuntu:16.04
|
||||
|
||||
RUN apt-get update && apt-get install -y --no-install-recommends \
|
||||
g++ \
|
||||
make \
|
||||
file \
|
||||
curl \
|
||||
ca-certificates \
|
||||
python \
|
||||
git \
|
||||
cmake \
|
||||
sudo \
|
||||
gdb \
|
||||
xz-utils \
|
||||
jq \
|
||||
bzip2
|
||||
|
||||
# emscripten
|
||||
COPY scripts/emscripten-wasm.sh /scripts/
|
||||
COPY wasm32-exp/node.sh /usr/local/bin/node
|
||||
RUN bash /scripts/emscripten-wasm.sh
|
||||
|
||||
# cache
|
||||
COPY scripts/sccache.sh /scripts/
|
||||
RUN sh /scripts/sccache.sh
|
||||
|
||||
# env
|
||||
ENV PATH=/wasm-install/emscripten:/wasm-install/bin:$PATH
|
||||
ENV EM_CONFIG=/root/.emscripten
|
||||
|
||||
ENV TARGETS=wasm32-experimental-emscripten
|
||||
|
||||
ENV RUST_CONFIGURE_ARGS --experimental-targets=WebAssembly
|
||||
|
||||
ENV SCRIPT python2.7 ../x.py test --target $TARGETS
|
@ -1,9 +0,0 @@
|
||||
#!/usr/bin/env bash
|
||||
|
||||
path="$(dirname $1)"
|
||||
file="$(basename $1)"
|
||||
|
||||
shift
|
||||
|
||||
cd "$path"
|
||||
exec /node-v8.0.0-linux-x64/bin/node "$file" "$@"
|
@ -1,32 +0,0 @@
|
||||
FROM ubuntu:16.04
|
||||
|
||||
RUN apt-get update && apt-get install -y --no-install-recommends \
|
||||
g++ \
|
||||
make \
|
||||
file \
|
||||
curl \
|
||||
ca-certificates \
|
||||
python \
|
||||
git \
|
||||
cmake \
|
||||
sudo \
|
||||
gdb \
|
||||
xz-utils
|
||||
|
||||
# emscripten
|
||||
COPY scripts/emscripten.sh /scripts/
|
||||
RUN bash /scripts/emscripten.sh
|
||||
|
||||
COPY scripts/sccache.sh /scripts/
|
||||
RUN sh /scripts/sccache.sh
|
||||
|
||||
ENV PATH=$PATH:/emsdk-portable
|
||||
ENV PATH=$PATH:/emsdk-portable/clang/e1.38.15_64bit/
|
||||
ENV PATH=$PATH:/emsdk-portable/emscripten/1.38.15/
|
||||
ENV PATH=$PATH:/emsdk-portable/node/8.9.1_64bit/bin/
|
||||
ENV EMSCRIPTEN=/emsdk-portable/emscripten/1.38.15/
|
||||
ENV BINARYEN_ROOT=/emsdk-portable/clang/e1.38.15_64bit/binaryen/
|
||||
ENV EM_CONFIG=/emsdk-portable/.emscripten
|
||||
|
||||
ENV TARGETS=wasm32-unknown-emscripten
|
||||
ENV SCRIPT python2.7 ../x.py test --target $TARGETS
|
@ -15,6 +15,8 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
|
||||
g++-arm-linux-gnueabi \
|
||||
g++-arm-linux-gnueabihf \
|
||||
g++-aarch64-linux-gnu \
|
||||
g++-mips64-linux-gnuabi64 \
|
||||
g++-mips64el-linux-gnuabi64 \
|
||||
gcc-sparc64-linux-gnu \
|
||||
libc6-dev-sparc64-cross \
|
||||
bzip2 \
|
||||
@ -77,6 +79,14 @@ RUN env \
|
||||
CC=mipsel-openwrt-linux-gcc \
|
||||
CXX=mipsel-openwrt-linux-g++ \
|
||||
bash musl.sh mipsel && \
|
||||
env \
|
||||
CC=mips64-linux-gnuabi64-gcc \
|
||||
CXX=mips64-linux-gnuabi64-g++ \
|
||||
bash musl.sh mips64 && \
|
||||
env \
|
||||
CC=mips64el-linux-gnuabi64-gcc \
|
||||
CXX=mips64el-linux-gnuabi64-g++ \
|
||||
bash musl.sh mips64el && \
|
||||
rm -rf /build/*
|
||||
|
||||
# FIXME(mozilla/sccache#235) this shouldn't be necessary but is currently
|
||||
@ -97,6 +107,8 @@ ENV TARGETS=$TARGETS,wasm32-unknown-emscripten
|
||||
ENV TARGETS=$TARGETS,x86_64-rumprun-netbsd
|
||||
ENV TARGETS=$TARGETS,mips-unknown-linux-musl
|
||||
ENV TARGETS=$TARGETS,mipsel-unknown-linux-musl
|
||||
ENV TARGETS=$TARGETS,mips64-unknown-linux-muslabi64
|
||||
ENV TARGETS=$TARGETS,mips64el-unknown-linux-muslabi64
|
||||
ENV TARGETS=$TARGETS,arm-unknown-linux-musleabi
|
||||
ENV TARGETS=$TARGETS,arm-unknown-linux-musleabihf
|
||||
ENV TARGETS=$TARGETS,armv5te-unknown-linux-gnueabi
|
||||
@ -125,6 +137,8 @@ ENV TARGETS=$TARGETS,thumbv7neon-unknown-linux-gnueabihf
|
||||
|
||||
ENV CC_mipsel_unknown_linux_musl=mipsel-openwrt-linux-gcc \
|
||||
CC_mips_unknown_linux_musl=mips-openwrt-linux-gcc \
|
||||
CC_mips64el_unknown_linux_muslabi64=mips64el-linux-gnuabi64-gcc \
|
||||
CC_mips64_unknown_linux_muslabi64=mips64-linux-gnuabi64-gcc \
|
||||
CC_sparc64_unknown_linux_gnu=sparc64-linux-gnu-gcc \
|
||||
CC_x86_64_unknown_redox=x86_64-unknown-redox-gcc \
|
||||
CC_thumbv7neon_unknown_linux_gnueabihf=arm-linux-gnueabihf-gcc \
|
||||
@ -139,7 +153,8 @@ ENV RUST_CONFIGURE_ARGS \
|
||||
--musl-root-aarch64=/musl-aarch64 \
|
||||
--musl-root-mips=/musl-mips \
|
||||
--musl-root-mipsel=/musl-mipsel \
|
||||
--enable-emscripten \
|
||||
--musl-root-mips64=/musl-mips64 \
|
||||
--musl-root-mips64el=/musl-mips64el \
|
||||
--disable-docs
|
||||
|
||||
ENV SCRIPT \
|
||||
|
@ -106,6 +106,7 @@ fi
|
||||
mkdir -p $HOME/.cargo
|
||||
mkdir -p $objdir/tmp
|
||||
mkdir -p $objdir/cores
|
||||
mkdir -p /tmp/toolstate
|
||||
|
||||
args=
|
||||
if [ "$SCCACHE_BUCKET" != "" ]; then
|
||||
@ -156,6 +157,7 @@ else
|
||||
args="$args --volume $objdir:/checkout/obj"
|
||||
args="$args --volume $HOME/.cargo:/cargo"
|
||||
args="$args --volume $HOME/rustsrc:$HOME/rustsrc"
|
||||
args="$args --volume /tmp/toolstate:/tmp/toolstate"
|
||||
args="$args --env LOCAL_USER_ID=`id -u`"
|
||||
fi
|
||||
|
||||
|
@ -1,37 +0,0 @@
|
||||
set -ex

hide_output() {
    set +x
    on_err="
echo ERROR: An error was encountered with the build.
cat /tmp/build.log
exit 1
"
    trap "$on_err" ERR
    bash -c "while true; do sleep 30; echo \$(date) - building ...; done" &
    PING_LOOP_PID=$!
    $@ &> /tmp/build.log
    trap - ERR
    kill $PING_LOOP_PID
    rm -f /tmp/build.log
    set -x
}

# Download last known good emscripten from WebAssembly waterfall
BUILD=$(curl -fL https://storage.googleapis.com/wasm-llvm/builds/linux/lkgr.json | \
    jq '.build | tonumber')
curl -sL https://storage.googleapis.com/wasm-llvm/builds/linux/$BUILD/wasm-binaries.tbz2 | \
    hide_output tar xvkj

# node 8 is required to run wasm
cd /
curl -sL https://nodejs.org/dist/v8.0.0/node-v8.0.0-linux-x64.tar.xz | \
    tar -xJ

# Make emscripten use wasm-ready node and LLVM tools
echo "EMSCRIPTEN_ROOT = '/wasm-install/emscripten'" >> /root/.emscripten
echo "NODE_JS='/usr/local/bin/node'" >> /root/.emscripten
echo "LLVM_ROOT='/wasm-install/bin'" >> /root/.emscripten
echo "BINARYEN_ROOT = '/wasm-install'" >> /root/.emscripten
echo "COMPILER_ENGINE = NODE_JS" >> /root/.emscripten
echo "JS_ENGINES = [NODE_JS]" >> /root/.emscripten
|
@ -17,22 +17,7 @@ exit 1
|
||||
set -x
|
||||
}
|
||||
|
||||
cd /
|
||||
curl -fL https://mozilla-games.s3.amazonaws.com/emscripten/releases/emsdk-portable.tar.gz | \
|
||||
tar -xz
|
||||
|
||||
git clone https://github.com/emscripten-core/emsdk.git /emsdk-portable
|
||||
cd /emsdk-portable
|
||||
./emsdk update
|
||||
hide_output ./emsdk install sdk-1.38.15-64bit
|
||||
./emsdk activate sdk-1.38.15-64bit
|
||||
|
||||
# Compile and cache libc
|
||||
source ./emsdk_env.sh
|
||||
echo "main(){}" > a.c
|
||||
HOME=/emsdk-portable/ emcc a.c
|
||||
HOME=/emsdk-portable/ emcc -s BINARYEN=1 a.c
|
||||
rm -f a.*
|
||||
|
||||
# Make emsdk usable by any user
|
||||
cp /root/.emscripten /emsdk-portable
|
||||
chmod a+rxw -R /emsdk-portable
|
||||
hide_output ./emsdk install 1.38.46-upstream
|
||||
./emsdk activate 1.38.46-upstream
|
||||
|
44
src/ci/docker/wasm32/Dockerfile
Normal file
@ -0,0 +1,44 @@
|
||||
FROM ubuntu:16.04
|
||||
|
||||
RUN apt-get update && apt-get install -y --no-install-recommends \
|
||||
g++ \
|
||||
make \
|
||||
file \
|
||||
curl \
|
||||
ca-certificates \
|
||||
python \
|
||||
git \
|
||||
cmake \
|
||||
sudo \
|
||||
gdb \
|
||||
xz-utils \
|
||||
bzip2
|
||||
|
||||
COPY scripts/emscripten.sh /scripts/
|
||||
RUN bash /scripts/emscripten.sh
|
||||
|
||||
COPY scripts/sccache.sh /scripts/
|
||||
RUN sh /scripts/sccache.sh
|
||||
|
||||
ENV PATH=$PATH:/emsdk-portable
|
||||
ENV PATH=$PATH:/emsdk-portable/upstream/emscripten/
|
||||
ENV PATH=$PATH:/emsdk-portable/node/12.9.1_64bit/bin/
|
||||
ENV BINARYEN_ROOT=/emsdk-portable/upstream/
|
||||
|
||||
ENV TARGETS=wasm32-unknown-emscripten
|
||||
|
||||
# Use -O1 optimizations in the link step to reduce time spent optimizing.
|
||||
ENV EMCC_CFLAGS=-O1
|
||||
|
||||
# Emscripten installation is user-specific
|
||||
ENV NO_CHANGE_USER=1
|
||||
|
||||
# FIXME: Re-enable these tests once https://github.com/rust-lang/cargo/pull/7476
|
||||
# is picked up by CI
|
||||
ENV SCRIPT python2.7 ../x.py test --target $TARGETS \
|
||||
--exclude src/libcore \
|
||||
--exclude src/liballoc \
|
||||
--exclude src/libproc_macro \
|
||||
--exclude src/libstd \
|
||||
--exclude src/libterm \
|
||||
--exclude src/libtest
|
@ -26,5 +26,5 @@ ENV CHECK_LINKS 1
|
||||
|
||||
ENV RUST_CONFIGURE_ARGS \
|
||||
--build=x86_64-unknown-linux-gnu \
|
||||
--save-toolstates=/tmp/toolstates.json
|
||||
ENV SCRIPT /tmp/checktools.sh ../x.py /tmp/toolstates.json linux
|
||||
--save-toolstates=/tmp/toolstate/toolstates.json
|
||||
ENV SCRIPT /tmp/checktools.sh ../x.py /tmp/toolstate/toolstates.json linux
|
||||
|
@ -3,7 +3,7 @@
|
||||
set -eu
|
||||
|
||||
X_PY="$1"
|
||||
TOOLSTATE_FILE="$(realpath $2)"
|
||||
TOOLSTATE_FILE="$(realpath -m $2)"
|
||||
OS="$3"
|
||||
COMMIT="$(git rev-parse HEAD)"
|
||||
CHANGED_FILES="$(git diff --name-status HEAD HEAD^)"
|
||||
@ -13,6 +13,7 @@ SIX_WEEK_CYCLE="$(( ($(date +%s) / 86400 - 20) % 42 ))"
|
||||
# The Wednesday after this has value 0.
|
||||
# We track this value to prevent regressing tools in the last week of the 6-week cycle.
|
||||
|
||||
mkdir -p "$(dirname $TOOLSTATE_FILE)"
|
||||
touch "$TOOLSTATE_FILE"
|
||||
|
||||
# Try to test all the tools and store the build/test success in the TOOLSTATE_FILE
|
||||
|
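The switch from `realpath` to `realpath -m` matters because the toolstate file may not exist yet at this point; `-m` (canonicalize-missing) resolves the path anyway, and the `mkdir -p "$(dirname ...)"` plus `touch` pair then creates it. A quick illustration of the difference:

```bash
# GNU realpath: -m resolves a path even when its directories do not exist yet.
realpath -m /tmp/toolstate/toolstates.json   # ok even if /tmp/toolstate is absent
realpath    /tmp/toolstate/toolstates.json   # errors when /tmp/toolstate is missing
```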
@ -47,7 +47,7 @@ function fetch_github_commit_archive {
|
||||
rm $cached
|
||||
}
|
||||
|
||||
included="src/llvm-project src/llvm-emscripten src/doc/book src/doc/rust-by-example"
|
||||
included="src/llvm-project src/doc/book src/doc/rust-by-example"
|
||||
modules="$(git config --file .gitmodules --get-regexp '\.path$' | cut -d' ' -f2)"
|
||||
modules=($modules)
|
||||
use_git=""
|
||||
|
@ -55,6 +55,9 @@ if [ "$DEPLOY$DEPLOY_ALT" = "1" ]; then
|
||||
if [ "$NO_LLVM_ASSERTIONS" = "1" ]; then
|
||||
RUST_CONFIGURE_ARGS="$RUST_CONFIGURE_ARGS --disable-llvm-assertions"
|
||||
elif [ "$DEPLOY_ALT" != "" ]; then
|
||||
if [ "$NO_PARALLEL_COMPILER" = "" ]; then
|
||||
RUST_CONFIGURE_ARGS="$RUST_CONFIGURE_ARGS --set rust.parallel-compiler"
|
||||
fi
|
||||
RUST_CONFIGURE_ARGS="$RUST_CONFIGURE_ARGS --enable-llvm-assertions"
|
||||
RUST_CONFIGURE_ARGS="$RUST_CONFIGURE_ARGS --set rust.verify-llvm-ir"
|
||||
fi
|
||||
@ -114,7 +117,7 @@ make check-bootstrap
|
||||
|
||||
# Display the CPU and memory information. This helps us know why the CI timing
|
||||
# is fluctuating.
|
||||
if isOSX; then
|
||||
if isMacOS; then
|
||||
system_profiler SPHardwareDataType || true
|
||||
sysctl hw || true
|
||||
ncpus=$(sysctl -n hw.ncpu)
|
||||
|
17
src/ci/scripts/checkout-submodules.sh
Executable file
@ -0,0 +1,17 @@
|
||||
#!/bin/bash
|
||||
# Check out all our submodules, but more quickly than using git by using one of
|
||||
# our custom scripts
|
||||
|
||||
set -euo pipefail
|
||||
IFS=$'\n\t'
|
||||
|
||||
source "$(cd "$(dirname "$0")" && pwd)/../shared.sh"
|
||||
|
||||
if isWindows; then
|
||||
path="/c/cache/rustsrc"
|
||||
else
|
||||
path="${HOME}/rustsrc"
|
||||
fi
|
||||
|
||||
mkdir -p "${path}"
|
||||
"$(cd "$(dirname "$0")" && pwd)/../init_repo.sh" . "${path}"
|
13
src/ci/scripts/disable-git-crlf-conversion.sh
Executable file
@ -0,0 +1,13 @@
|
||||
#!/bin/bash
|
||||
# Disable automatic line ending conversion, which is enabled by default on
|
||||
# Azure's Windows image. Having the conversion enabled caused regressions both
|
||||
# in our test suite (it broke miri tests) and in the ecosystem, since we
|
||||
# started shipping install scripts with CRLF endings instead of the old LF.
|
||||
#
|
||||
# Note that we do this a couple times during the build as the PATH and current
|
||||
# user/directory change, e.g. when mingw is enabled.
|
||||
|
||||
set -euo pipefail
|
||||
IFS=$'\n\t'
|
||||
|
||||
git config --replace-all --global core.autocrlf false
|
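Since the CRLF setting has bitten this pipeline more than once, a quick way to confirm the override actually took effect (purely a verification aid, not part of the script) is:

```bash
# Should print "false"; --show-origin also reveals which config file set it.
git config --global --get core.autocrlf
git config --list --show-origin | grep autocrlf
```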
19
src/ci/scripts/dump-environment.sh
Executable file
@ -0,0 +1,19 @@
|
||||
#!/bin/bash
|
||||
# This script dumps information about the build environment to stdout.
|
||||
|
||||
set -euo pipefail
|
||||
IFS=$'\n\t'
|
||||
|
||||
echo "environment variables:"
|
||||
printenv | sort
|
||||
echo
|
||||
|
||||
echo "disk usage:"
|
||||
df -h
|
||||
echo
|
||||
|
||||
echo "biggest files in the working dir:"
|
||||
set +o pipefail
|
||||
du . | sort -nr | head -n100
|
||||
set -o pipefail
|
||||
echo
|
15
src/ci/scripts/enable-docker-ipv6.sh
Executable file
@ -0,0 +1,15 @@
|
||||
#!/bin/bash
|
||||
# Looks like docker containers have IPv6 disabled by default, so let's turn it
|
||||
# on since libstd tests require it
|
||||
|
||||
set -euo pipefail
|
||||
IFS=$'\n\t'
|
||||
|
||||
source "$(cd "$(dirname "$0")" && pwd)/../shared.sh"
|
||||
|
||||
if isLinux; then
|
||||
sudo mkdir -p /etc/docker
|
||||
echo '{"ipv6":true,"fixed-cidr-v6":"fd9a:8454:6789:13f7::/64"}' \
|
||||
| sudo tee /etc/docker/daemon.json
|
||||
sudo service docker restart
|
||||
fi
|
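One way to check that the daemon picked up the new `daemon.json` after the restart is to look inside a fresh container; this is an illustrative spot-check rather than something the CI scripts run:

```bash
# A non-empty /proc/net/if_inet6 inside the container means IPv6 is enabled.
docker run --rm ubuntu:16.04 cat /proc/net/if_inet6
```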
@ -16,12 +16,14 @@
|
||||
set -euo pipefail
|
||||
IFS=$'\n\t'
|
||||
|
||||
MIRROR="https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc/2019-07-27-awscli.tar"
|
||||
source "$(cd "$(dirname "$0")" && pwd)/../shared.sh"
|
||||
|
||||
MIRROR="${MIRRORS_BASE}/2019-07-27-awscli.tar"
|
||||
DEPS_DIR="/tmp/awscli-deps"
|
||||
|
||||
pip="pip"
|
||||
pipflags=""
|
||||
if [[ "${AGENT_OS}" == "Linux" ]]; then
|
||||
if isLinux; then
|
||||
pip="pip3"
|
||||
pipflags="--user"
|
||||
|
43
src/ci/scripts/install-clang.sh
Executable file
@ -0,0 +1,43 @@
|
||||
#!/bin/bash
|
||||
# This script installs clang on the local machine. Note that we don't install
|
||||
# clang on Linux since its compiler story is just so different. Each container
|
||||
# has its own toolchain configured appropriately already.
|
||||
|
||||
set -euo pipefail
|
||||
IFS=$'\n\t'
|
||||
|
||||
source "$(cd "$(dirname "$0")" && pwd)/../shared.sh"
|
||||
|
||||
if isMacOS; then
|
||||
curl -f "${MIRRORS_BASE}/clang%2Bllvm-7.0.0-x86_64-apple-darwin.tar.xz" | tar xJf -
|
||||
|
||||
ciCommandSetEnv CC "$(pwd)/clang+llvm-7.0.0-x86_64-apple-darwin/bin/clang"
|
||||
ciCommandSetEnv CXX "$(pwd)/clang+llvm-7.0.0-x86_64-apple-darwin/bin/clang++"
|
||||
|
||||
# Configure `AR` specifically so rustbuild doesn't try to infer it as
|
||||
# `clang-ar` by accident.
|
||||
ciCommandSetEnv AR "ar"
|
||||
elif isWindows && [[ -z ${MINGW_URL+x} ]]; then
|
||||
# If we're compiling for MSVC then we, like most other distribution builders,
|
||||
# switch to clang as the compiler. This'll allow us eventually to enable LTO
|
||||
# amongst LLVM and rustc. Note that we only do this on MSVC as I don't think
|
||||
# clang has an output mode compatible with MinGW that we need. If it does we
|
||||
# should switch to clang for MinGW as well!
|
||||
#
|
||||
# Note that the LLVM installer is an NSIS installer
|
||||
#
|
||||
# Original downloaded here came from
|
||||
# http://releases.llvm.org/7.0.0/LLVM-7.0.0-win64.exe
|
||||
# That installer was run through `wine` on Linux and then the resulting
|
||||
# installation directory (found in `$HOME/.wine/drive_c/Program Files/LLVM`) was
|
||||
# packaged up into a tarball. We've had issues otherwise that the installer will
|
||||
# randomly hang, provide not a lot of useful information, pollute global state,
|
||||
# etc. In general the tarball is just more confined and easier to deal with when
|
||||
# working with various CI environments.
|
||||
|
||||
mkdir -p citools
|
||||
cd citools
|
||||
curl -f "${MIRRORS_BASE}/LLVM-7.0.0-win64.tar.gz" | tar xzf -
|
||||
ciCommandSetEnv RUST_CONFIGURE_ARGS \
|
||||
"${RUST_CONFIGURE_ARGS} --set llvm.clang-cl=$(pwd)/clang-rust/bin/clang-cl.exe"
|
||||
fi
|
18
src/ci/scripts/install-innosetup.sh
Executable file
@ -0,0 +1,18 @@
|
||||
#!/bin/bash
|
||||
# We use InnoSetup and its `iscc` program to also create combined installers.
|
||||
# Honestly at this point WIX above and `iscc` are just holdovers from
|
||||
# oh-so-long-ago and are required for creating installers on Windows. I think
|
||||
# one is MSI installers and one is EXE, but they're not used so frequently at
|
||||
# this point anyway so perhaps it's a wash!
|
||||
|
||||
set -euo pipefail
|
||||
IFS=$'\n\t'
|
||||
|
||||
source "$(cd "$(dirname "$0")" && pwd)/../shared.sh"
|
||||
|
||||
if isWindows; then
|
||||
curl.exe -o is-install.exe "${MIRRORS_BASE}/2017-08-22-is.exe"
|
||||
cmd.exe //c "is-install.exe /VERYSILENT /SUPPRESSMSGBOXES /NORESTART /SP-"
|
||||
|
||||
ciCommandAddPath "C:\\Program Files (x86)\\Inno Setup 5"
|
||||
fi
|
45
src/ci/scripts/install-mingw.sh
Executable file
@ -0,0 +1,45 @@
|
||||
#!/bin/bash
|
||||
# If we need to download a custom MinGW, do so here and set the path
|
||||
# appropriately.
|
||||
#
|
||||
# Here we also do a pretty heinous thing which is to mangle the MinGW
|
||||
# installation we just downloaded. Currently, as of this writing, we're using
|
||||
# MinGW-w64 builds of gcc, and that's currently at 6.3.0. We use 6.3.0 as it
|
||||
# appears to be the first version which contains a fix for #40546, builds
|
||||
# randomly failing during LLVM due to ar.exe/ranlib.exe failures.
|
||||
#
|
||||
# Unfortunately, though, 6.3.0 *also* is the first version of MinGW-w64 builds
|
||||
# to contain a regression in gdb (#40184). As a result if we were to use the
|
||||
# gdb provided (7.11.1) then we would fail all debuginfo tests.
|
||||
#
|
||||
# In order to fix spurious failures (pretty high priority) we use 6.3.0. To
|
||||
# avoid disabling gdb tests we download an *old* version of gdb, specifically
|
||||
# that found inside the 6.2.0 distribution. We then overwrite the 6.3.0 gdb
|
||||
# with the 6.2.0 gdb to get tests passing.
|
||||
#
|
||||
# Note that we don't literally overwrite the gdb.exe binary because it appears
|
||||
# to just use gdborig.exe, so that's the binary we deal with instead.
|
||||
#
|
||||
# Otherwise install MinGW through `pacman`
|
||||
|
||||
set -euo pipefail
|
||||
IFS=$'\n\t'
|
||||
|
||||
source "$(cd "$(dirname "$0")" && pwd)/../shared.sh"
|
||||
|
||||
if isWindows; then
|
||||
if [[ -z "${MINGW_URL+x}" ]]; then
|
||||
arch=i686
|
||||
if [ "$MSYS_BITS" = "64" ]; then
|
||||
arch=x86_64
|
||||
fi
|
||||
pacman -S --noconfirm --needed mingw-w64-$arch-toolchain mingw-w64-$arch-cmake \
|
||||
mingw-w64-$arch-gcc mingw-w64-$arch-python2
|
||||
ciCommandAddPath "${SYSTEM_WORKFOLDER}/msys2/mingw${MSYS_BITS}/bin"
|
||||
else
|
||||
curl -o mingw.7z "${MINGW_URL}/${MINGW_ARCHIVE}"
|
||||
7z x -y mingw.7z > /dev/null
|
||||
curl -o "${MINGW_DIR}/bin/gdborig.exe" "${MINGW_URL}/2017-04-20-${MSYS_BITS}bit-gdborig.exe"
|
||||
ciCommandAddPath "$(pwd)/${MINGW_DIR}/bin"
|
||||
fi
|
||||
fi
|
17
src/ci/scripts/install-msys2-packages.sh
Executable file
@ -0,0 +1,17 @@
|
||||
#!/bin/bash
|
||||
|
||||
set -euo pipefail
|
||||
IFS=$'\n\t'
|
||||
|
||||
source "$(cd "$(dirname "$0")" && pwd)/../shared.sh"
|
||||
|
||||
if isWindows; then
|
||||
pacman -S --noconfirm --needed base-devel ca-certificates make diffutils tar
|
||||
|
||||
# Make sure we use the native python interpreter instead of some msys equivalent
|
||||
# one way or another. The msys interpreters seem to have weird path conversions
|
||||
# baked in which break LLVM's build system one way or another, so let's use the
|
||||
# native version which keeps everything as native as possible.
|
||||
cp C:/Python27amd64/python.exe C:/Python27amd64/python2.7.exe
|
||||
ciCommandAddPath "C:\\Python27amd64"
|
||||
fi
|
19
src/ci/scripts/install-msys2.sh
Executable file
@ -0,0 +1,19 @@
|
||||
#!/bin/bash
|
||||
# Download and install MSYS2, needed primarily for the test suite (run-make) but
|
||||
# also used by the MinGW toolchain for assembling things.
|
||||
#
|
||||
# FIXME: we should probe the default azure image and see if we can use the MSYS2
|
||||
# toolchain there. (if there's even one there). For now though this gets the job
|
||||
# done.
|
||||
|
||||
set -euo pipefail
|
||||
IFS=$'\n\t'
|
||||
|
||||
source "$(cd "$(dirname "$0")" && pwd)/../shared.sh"
|
||||
|
||||
if isWindows; then
|
||||
choco install msys2 --params="/InstallDir:${SYSTEM_WORKFOLDER}/msys2 /NoPath" -y --no-progress
|
||||
mkdir -p "${SYSTEM_WORKFOLDER}/msys2/home/${USERNAME}"
|
||||
|
||||
ciCommandAddPath "${SYSTEM_WORKFOLDER}/msys2/usr/bin"
|
||||
fi
|
16
src/ci/scripts/install-ninja.sh
Executable file
@ -0,0 +1,16 @@
|
||||
#!/bin/bash
|
||||
# Note that this is originally from the github releases page of Ninja
|
||||
|
||||
set -euo pipefail
|
||||
IFS=$'\n\t'
|
||||
|
||||
source "$(cd "$(dirname "$0")" && pwd)/../shared.sh"
|
||||
|
||||
if isWindows; then
|
||||
mkdir ninja
|
||||
curl -o ninja.zip "${MIRRORS_BASE}/2017-03-15-ninja-win.zip"
|
||||
7z x -oninja ninja.zip
|
||||
rm ninja.zip
|
||||
ciCommandSetEnv "RUST_CONFIGURE_ARGS" "${RUST_CONFIGURE_ARGS} --enable-ninja"
|
||||
ciCommandAddPath "$(pwd)/ninja"
|
||||
fi
|
20
src/ci/scripts/install-sccache.sh
Executable file
@ -0,0 +1,20 @@
|
||||
#!/bin/bash
|
||||
# This script installs sccache on the local machine. Note that we don't install
|
||||
# sccache on Linux since it's installed elsewhere through all the containers.
|
||||
|
||||
set -euo pipefail
|
||||
IFS=$'\n\t'
|
||||
|
||||
source "$(cd "$(dirname "$0")" && pwd)/../shared.sh"
|
||||
|
||||
if isMacOS; then
|
||||
curl -fo /usr/local/bin/sccache "${MIRRORS_BASE}/2018-04-02-sccache-x86_64-apple-darwin"
|
||||
chmod +x /usr/local/bin/sccache
|
||||
elif isWindows; then
|
||||
mkdir -p sccache
|
||||
curl -fo sccache/sccache.exe "${MIRRORS_BASE}/2018-04-26-sccache-x86_64-pc-windows-msvc"
|
||||
ciCommandAddPath "$(pwd)/sccache"
|
||||
fi
|
||||
|
||||
# FIXME: we should probably install sccache outside the containers and then
|
||||
# mount it inside the containers so we can centralize all installation here.
|
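After this step the binary is expected to be on `PATH` for the rest of the job; a simple smoke test (not part of the pipeline itself) would be:

```bash
sccache --version      # confirms the binary is reachable on PATH
sccache --show-stats   # prints cache hit/miss counters once builds have run
```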
17
src/ci/scripts/install-wix.sh
Executable file
@ -0,0 +1,17 @@
|
||||
#!/bin/bash
|
||||
# We use the WIX toolset to create combined installers for Windows, and these
|
||||
# binaries are downloaded from https://github.com/wixtoolset/wix3 originally
|
||||
|
||||
set -euo pipefail
|
||||
IFS=$'\n\t'
|
||||
|
||||
source "$(cd "$(dirname "$0")" && pwd)/../shared.sh"
|
||||
|
||||
if isWindows; then
|
||||
ciCommandSetEnv WIX "$(pwd)/wix"
|
||||
|
||||
curl -O "${MIRRORS_BASE}/wix311-binaries.zip"
|
||||
mkdir -p wix/bin
|
||||
cd wix/bin
|
||||
7z x ../../wix311-binaries.zip
|
||||
fi
|
20
src/ci/scripts/should-skip-this.sh
Executable file
@ -0,0 +1,20 @@
|
||||
#!/bin/bash
# Set the SKIP_JOB environment variable if this job is supposed to only run
# when submodules are updated and they were not. The following time consuming
# tasks will be skipped when the environment variable is present.

set -euo pipefail
IFS=$'\n\t'

source "$(cd "$(dirname "$0")" && pwd)/../shared.sh"

if [[ -z "${CI_ONLY_WHEN_SUBMODULES_CHANGED+x}" ]]; then
    echo "Executing the job since there is no skip rule in effect"
elif git diff HEAD^ | grep --quiet "^index .* 160000"; then
    # Submodules pseudo-files inside git have the 160000 permissions, so when
    # those files are present in the diff a submodule was updated.
    echo "Executing the job since submodules are updated"
else
    echo "Not executing this job since no submodules were updated"
    ciCommandSetEnv SKIP_JOB 1
fi
|
13
src/ci/scripts/switch-xcode.sh
Executable file
@ -0,0 +1,13 @@
|
||||
#!/bin/bash
|
||||
# Switch to XCode 9.3 on OSX since it seems to be the last version that supports
|
||||
# i686-apple-darwin. We'll eventually want to upgrade this and it will probably
|
||||
# force us to drop i686-apple-darwin, but let's keep the wheels turning for now.
|
||||
|
||||
set -euo pipefail
|
||||
IFS=$'\n\t'
|
||||
|
||||
source "$(cd "$(dirname "$0")" && pwd)/../shared.sh"
|
||||
|
||||
if isMacOS; then
|
||||
sudo xcode-select --switch /Applications/Xcode_9.3.app
|
||||
fi
|
41
src/ci/scripts/upload-artifacts.sh
Executable file
@ -0,0 +1,41 @@
|
||||
#!/bin/bash
|
||||
# Upload all the artifacts to our S3 bucket. All the files inside ${upload_dir}
|
||||
# will be uploaded to the deploy bucket and eventually signed and released in
|
||||
# static.rust-lang.org.
|
||||
|
||||
set -euo pipefail
|
||||
IFS=$'\n\t'
|
||||
|
||||
source "$(cd "$(dirname "$0")" && pwd)/../shared.sh"
|
||||
|
||||
upload_dir="$(mktemp -d)"
|
||||
|
||||
# Release tarballs produced by a dist builder.
|
||||
if [[ "${DEPLOY-0}" -eq "1" ]] || [[ "${DEPLOY_ALT-0}" -eq "1" ]]; then
|
||||
dist_dir=build/dist
|
||||
if isLinux; then
|
||||
dist_dir=obj/build/dist
|
||||
fi
|
||||
rm -rf "${dist_dir}/doc"
|
||||
cp -r "${dist_dir}"/* "${upload_dir}"
|
||||
fi
|
||||
|
||||
# CPU usage statistics.
|
||||
cp cpu-usage.csv "${upload_dir}/cpu-${CI_JOB_NAME}.csv"
|
||||
|
||||
# Toolstate data.
|
||||
if [[ -n "${DEPLOY_TOOLSTATES_JSON+x}" ]]; then
|
||||
cp /tmp/toolstate/toolstates.json "${upload_dir}/${DEPLOY_TOOLSTATES_JSON}"
|
||||
fi
|
||||
|
||||
echo "Files that will be uploaded:"
|
||||
ls -lah "${upload_dir}"
|
||||
echo
|
||||
|
||||
deploy_dir="rustc-builds"
|
||||
if [[ "${DEPLOY_ALT-0}" -eq "1" ]]; then
|
||||
deploy_dir="rustc-builds-alt"
|
||||
fi
|
||||
deploy_url="s3://${DEPLOY_BUCKET}/${deploy_dir}/$(ciCommit)"
|
||||
|
||||
retry aws s3 cp --no-progress --recursive --acl public-read "${upload_dir}" "${deploy_url}"
|
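The net effect is one S3 prefix per commit: everything staged in `${upload_dir}` lands under `rustc-builds/<full sha>` (or `rustc-builds-alt/` for the alternate builds). To inspect what a given build published, something like the following works, assuming credentials and the same `DEPLOY_BUCKET` and `BUILD_SOURCEVERSION` variables the pipeline uses:

```bash
# List the artifacts uploaded for one commit.
aws s3 ls "s3://${DEPLOY_BUCKET}/rustc-builds/${BUILD_SOURCEVERSION}/"
```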
24
src/ci/scripts/verify-line-endings.sh
Executable file
@ -0,0 +1,24 @@
|
||||
#!/bin/bash
|
||||
# See also the disable for autocrlf, this just checks that it worked.
|
||||
#
|
||||
# We check both in rust-lang/rust and in a submodule to make sure both are
|
||||
# accurate. Submodules are checked out significantly later than the main
|
||||
# repository in this script, so settings can (and do!) change between then.
|
||||
#
|
||||
# Linux (and maybe macOS) builders don't currently have dos2unix so just only
|
||||
# run this step on Windows.
|
||||
|
||||
set -euo pipefail
|
||||
IFS=$'\n\t'
|
||||
|
||||
source "$(cd "$(dirname "$0")" && pwd)/../shared.sh"
|
||||
|
||||
if isWindows; then
|
||||
# print out the git configuration so we can better investigate failures in
|
||||
# the following
|
||||
git config --list --show-origin
|
||||
dos2unix -ih Cargo.lock src/tools/rust-installer/install-template.sh
|
||||
endings=$(dos2unix -ic Cargo.lock src/tools/rust-installer/install-template.sh)
|
||||
# if endings has non-zero length, error out
|
||||
if [ -n "$endings" ]; then exit 1 ; fi
|
||||
fi
|
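The two `dos2unix` invocations here run in information mode only and never rewrite anything: `-ih` prints a per-file count header, while `-ic` lists just the files that still contain DOS (CRLF) line endings, so an empty result means the check passes. For example:

```bash
# Lists offending files without modifying them; the exit status is still 0,
# which is why the script tests whether the output is non-empty instead.
dos2unix -ic Cargo.lock src/tools/rust-installer/install-template.sh
```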
15
src/ci/scripts/windows-symlink-build-dir.sh
Executable file
@ -0,0 +1,15 @@
|
||||
#!/bin/bash
|
||||
# We've had issues with the default drive in use running out of space during a
|
||||
# build, and it looks like the `C:` drive has more space than the default `D:`
|
||||
# drive. We should probably confirm this with the azure pipelines team at some
|
||||
# point, but this seems to fix our "disk space full" problems.
|
||||
|
||||
set -euo pipefail
|
||||
IFS=$'\n\t'
|
||||
|
||||
source "$(cd "$(dirname "$0")" && pwd)/../shared.sh"
|
||||
|
||||
if isWindows; then
|
||||
cmd //c "mkdir c:\\MORE_SPACE"
|
||||
cmd //c "mklink /J build c:\\MORE_SPACE"
|
||||
fi
|
@ -4,6 +4,8 @@
|
||||
# `source shared.sh`, hence the invalid shebang and not being
# marked as an executable file in git.

export MIRRORS_BASE="https://rust-lang-ci-mirrors.s3-us-west-1.amazonaws.com/rustc"

# See http://unix.stackexchange.com/questions/82598
# Duplicated in docker/dist-various-2/shared.sh
function retry {
@ -28,10 +30,43 @@ function isCI {
    [ "$CI" = "true" ] || [ "$TF_BUILD" = "True" ]
}

function isOSX {
function isMacOS {
    [ "$AGENT_OS" = "Darwin" ]
}

function isWindows {
    [ "$AGENT_OS" = "Windows_NT" ]
}

function isLinux {
    [ "$AGENT_OS" = "Linux" ]
}

function getCIBranch {
    echo "$BUILD_SOURCEBRANCHNAME"
}

function ciCommit {
    echo "${BUILD_SOURCEVERSION}"
}

function ciCommandAddPath {
    if [[ $# -ne 1 ]]; then
        echo "usage: $0 <path>"
        exit 1
    fi
    path="$1"

    echo "##vso[task.prependpath]${path}"
}

function ciCommandSetEnv {
    if [[ $# -ne 2 ]]; then
        echo "usage: $0 <name> <value>"
        exit 1
    fi
    name="$1"
    value="$2"

    echo "##vso[task.setvariable variable=${name}]${value}"
}
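These helpers wrap Azure Pipelines "logging commands": the `##vso[...]` lines they echo are interpreted by the agent, so a value set this way becomes visible to subsequent steps of the job rather than to the remainder of the current script. A typical call site, in the style of the install scripts above:

```bash
source "$(cd "$(dirname "$0")" && pwd)/../shared.sh"

ciCommandSetEnv RUST_CONFIGURE_ARGS "${RUST_CONFIGURE_ARGS} --enable-ninja"
ciCommandAddPath "$(pwd)/ninja"   # later steps see the updated PATH
```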
2
src/doc/book/Cargo.lock
generated
@ -1,3 +1,5 @@
|
||||
# This file is automatically @generated by Cargo.
|
||||
# It is not intended for manual editing.
|
||||
[[package]]
|
||||
name = "aho-corasick"
|
||||
version = "0.5.3"
|
||||
|
12
src/doc/book/ci/build.sh
Normal file → Executable file
@ -4,13 +4,6 @@ set -e
|
||||
|
||||
export PATH=$PATH:/home/travis/.cargo/bin;
|
||||
|
||||
# Feature check
|
||||
cd ci/stable-check
|
||||
|
||||
cargo run -- ../../src
|
||||
|
||||
cd ../..
|
||||
|
||||
echo 'Spellchecking...'
|
||||
bash ci/spellcheck.sh list
|
||||
echo 'Testing...'
|
||||
@ -19,3 +12,8 @@ echo 'Building...'
|
||||
mdbook build
|
||||
echo 'Linting for local file paths...'
|
||||
cargo run --bin lfp src
|
||||
echo 'Validating references'
|
||||
for file in src/*.md ; do
|
||||
echo Checking references in $file
|
||||
cargo run --quiet --bin link2print < $file > /dev/null
|
||||
done
|
||||
|
4
src/doc/book/ci/stable-check/Cargo.lock
generated
@ -1,4 +0,0 @@
|
||||
[root]
|
||||
name = "stable-check"
|
||||
version = "0.1.0"
|
||||
|
@ -1,6 +0,0 @@
|
||||
[package]
|
||||
name = "stable-check"
|
||||
version = "0.1.0"
|
||||
authors = ["steveklabnik <steve@steveklabnik.com>"]
|
||||
|
||||
[dependencies]
|
@ -1,43 +0,0 @@
|
||||
use std::error::Error;
|
||||
use std::env;
|
||||
use std::fs;
|
||||
use std::fs::File;
|
||||
use std::io::prelude::*;
|
||||
use std::path::Path;
|
||||
|
||||
fn main() {
|
||||
let arg = env::args().nth(1).unwrap_or_else(|| {
|
||||
println!("Please pass a src directory as the first argument");
|
||||
std::process::exit(1);
|
||||
});
|
||||
|
||||
match check_directory(&Path::new(&arg)) {
|
||||
Ok(()) => println!("passed!"),
|
||||
Err(e) => {
|
||||
println!("Error: {}", e);
|
||||
std::process::exit(1);
|
||||
}
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
fn check_directory(dir: &Path) -> Result<(), Box<Error>> {
|
||||
for entry in fs::read_dir(dir)? {
|
||||
let entry = entry?;
|
||||
let path = entry.path();
|
||||
|
||||
if path.is_dir() {
|
||||
continue;
|
||||
}
|
||||
|
||||
let mut file = File::open(&path)?;
|
||||
let mut contents = String::new();
|
||||
file.read_to_string(&mut contents)?;
|
||||
|
||||
if contents.contains("#![feature") {
|
||||
return Err(From::from(format!("Feature flag found in {:?}", path)));
|
||||
}
|
||||
}
|
||||
|
||||
Ok(())
|
||||
}
|
1
src/doc/book/rust-toolchain
Normal file
@ -0,0 +1 @@
|
||||
1.37.0
|
@ -1,10 +1,10 @@
|
||||
# Appendix D - Useful Development Tools
|
||||
## Appendix D - Useful Development Tools
|
||||
|
||||
In this appendix, we talk about some useful development tools that the Rust
|
||||
project provides. We’ll look at automatic formatting, quick ways to apply
|
||||
warning fixes, a linter, and integrating with IDEs.
|
||||
|
||||
## Automatic Formatting with `rustfmt`
|
||||
### Automatic Formatting with `rustfmt`
|
||||
|
||||
The `rustfmt` tool reformats your code according to the community code style.
|
||||
Many collaborative projects use `rustfmt` to prevent arguments about which
|
||||
@ -29,7 +29,7 @@ on `rustfmt`, see [its documentation][rustfmt].
|
||||
|
||||
[rustfmt]: https://github.com/rust-lang/rustfmt
|
||||
|
||||
## Fix Your Code with `rustfix`
|
||||
### Fix Your Code with `rustfix`
|
||||
|
||||
The rustfix tool is included with Rust installations and can automatically fix
|
||||
some compiler warnings. If you’ve written code in Rust, you’ve probably seen
|
||||
@ -96,7 +96,7 @@ The `for` loop variable is now named `_i`, and the warning no longer appears.
|
||||
You can also use the `cargo fix` command to transition your code between
|
||||
different Rust editions. Editions are covered in Appendix E.
|
||||
|
||||
## More Lints with Clippy
|
||||
### More Lints with Clippy
|
||||
|
||||
The Clippy tool is a collection of lints to analyze your code so you can catch
|
||||
common mistakes and improve your Rust code.
|
||||
@ -158,7 +158,7 @@ For more information on Clippy, see [its documentation][clippy].
|
||||
|
||||
[clippy]: https://github.com/rust-lang/rust-clippy
|
||||
|
||||
## IDE Integration Using the Rust Language Server
|
||||
### IDE Integration Using the Rust Language Server
|
||||
|
||||
To help IDE integration, the Rust project distributes the *Rust Language
|
||||
Server* (`rls`). This tool speaks the [Language Server
|
||||
|
@ -1,4 +1,4 @@
|
||||
# Appendix E - Editions
|
||||
## Appendix E - Editions
|
||||
|
||||
In Chapter 1, you saw that `cargo new` adds a bit of metadata to your
|
||||
*Cargo.toml* file about an edition. This appendix talks about what that means!
|
||||
|
@ -1,4 +1,4 @@
|
||||
# Appendix G - How Rust is Made and “Nightly Rust”
|
||||
## Appendix G - How Rust is Made and “Nightly Rust”
|
||||
|
||||
This appendix is about how Rust is made and how that affects you as a Rust
|
||||
developer.
|
||||
|
@ -104,7 +104,7 @@ chapters. In concept chapters, you’ll learn about an aspect of Rust. In projec
|
||||
chapters, we’ll build small programs together, applying what you’ve learned so
|
||||
far. Chapters 2, 12, and 20 are project chapters; the rest are concept chapters.
|
||||
|
||||
Chapter 1 explains how to install Rust, how to write a Hello, world! program,
|
||||
Chapter 1 explains how to install Rust, how to write a “Hello, world!” program,
|
||||
and how to use Cargo, Rust’s package manager and build tool. Chapter 2 is a
|
||||
hands-on introduction to the Rust language. Here we cover concepts at a high
|
||||
level, and later chapters will provide additional detail. If you want to get
|
||||
|
@ -126,9 +126,9 @@ resources include [the Users forum][users] and [Stack Overflow][stackoverflow].
|
||||
|
||||
### Local Documentation
|
||||
|
||||
The installer also includes a copy of the documentation locally, so you can
|
||||
read it offline. Run `rustup doc` to open the local documentation in your
|
||||
browser.
|
||||
The installation of Rust also includes a copy of the documentation locally, so
|
||||
you can read it offline. Run `rustup doc` to open the local documentation in
|
||||
your browser.
|
||||
|
||||
Any time a type or function is provided by the standard library and you’re not
|
||||
sure what it does or how to use it, use the application programming interface
|
||||
|
@ -20,7 +20,7 @@ we suggest making a *projects* directory in your home directory and keeping all
|
||||
your projects there.
|
||||
|
||||
Open a terminal and enter the following commands to make a *projects* directory
|
||||
and a directory for the Hello, world! project within the *projects* directory.
|
||||
and a directory for the “Hello, world!” project within the *projects* directory.
|
||||
|
||||
For Linux, macOS, and PowerShell on Windows, enter this:
|
||||
|
||||
@ -86,7 +86,7 @@ program. That makes you a Rust programmer—welcome!

### Anatomy of a Rust Program

Let’s review in detail what just happened in your Hello, world! program.
Let’s review in detail what just happened in your “Hello, world!” program.
Here’s the first piece of the puzzle:

```rust
@ -178,7 +178,7 @@ From here, you run the *main* or *main.exe* file, like this:
$ ./main # or .\main.exe on Windows
```

If *main.rs* was your Hello, world! program, this line would print `Hello,
If *main.rs* was your “Hello, world!” program, this line would print `Hello,
world!` to your terminal.

If you’re more familiar with a dynamic language, such as Ruby, Python, or
@ -6,9 +6,9 @@ such as building your code, downloading the libraries your code depends on, and
|
||||
building those libraries. (We call libraries your code needs *dependencies*.)
|
||||
|
||||
The simplest Rust programs, like the one we’ve written so far, don’t have any
|
||||
dependencies. So if we had built the Hello, world! project with Cargo, it would
|
||||
only use the part of Cargo that handles building your code. As you write more
|
||||
complex Rust programs, you’ll add dependencies, and if you start a project
|
||||
dependencies. So if we had built the “Hello, world!” project with Cargo, it
|
||||
would only use the part of Cargo that handles building your code. As you write
|
||||
more complex Rust programs, you’ll add dependencies, and if you start a project
|
||||
using Cargo, adding dependencies will be much easier to do.
|
||||
|
||||
Because the vast majority of Rust projects use Cargo, the rest of this book
|
||||
@ -29,7 +29,7 @@ determine how to install Cargo separately.
|
||||
### Creating a Project with Cargo
|
||||
|
||||
Let’s create a new project using Cargo and look at how it differs from our
|
||||
original Hello, world! project. Navigate back to your *projects* directory (or
|
||||
original “Hello, world!” project. Navigate back to your *projects* directory (or
|
||||
wherever you decided to store your code). Then, on any operating system, run
|
||||
the following:
|
||||
|
||||
@ -99,10 +99,10 @@ fn main() {
|
||||
}
|
||||
```
|
||||
|
||||
Cargo has generated a Hello, world! program for you, just like the one we wrote
|
||||
in Listing 1-1! So far, the differences between our previous project and the
|
||||
project Cargo generates are that Cargo placed the code in the *src* directory,
|
||||
and we have a *Cargo.toml* configuration file in the top directory.
|
||||
Cargo has generated a “Hello, world!” program for you, just like the one we
|
||||
wrote in Listing 1-1! So far, the differences between our previous project and
|
||||
the project Cargo generates are that Cargo placed the code in the *src*
|
||||
directory, and we have a *Cargo.toml* configuration file in the top directory.
|
||||
|
||||
Cargo expects your source files to live inside the *src* directory. The
|
||||
top-level project directory is just for README files, license information,
|
||||
@ -110,14 +110,14 @@ configuration files, and anything else not related to your code. Using Cargo
|
||||
helps you organize your projects. There’s a place for everything, and
|
||||
everything is in its place.
|
||||
|
||||
If you started a project that doesn’t use Cargo, as we did with the Hello,
|
||||
world! project, you can convert it to a project that does use Cargo. Move the
|
||||
If you started a project that doesn’t use Cargo, as we did with the “Hello,
|
||||
world!” project, you can convert it to a project that does use Cargo. Move the
|
||||
project code into the *src* directory and create an appropriate *Cargo.toml*
|
||||
file.
|
||||
|
||||
### Building and Running a Cargo Project
|
||||
|
||||
Now let’s look at what’s different when we build and run the Hello, world!
|
||||
Now let’s look at what’s different when we build and run the “Hello, world!”
|
||||
program with Cargo! From your *hello_cargo* directory, build your project by
|
||||
entering the following command:
|
||||
|
||||
@ -237,7 +237,7 @@ you’ve learned how to:
|
||||
* Install the latest stable version of Rust using `rustup`
|
||||
* Update to a newer Rust version
|
||||
* Open locally installed documentation
|
||||
* Write and run a Hello, world! program using `rustc` directly
|
||||
* Write and run a “Hello, world!” program using `rustc` directly
|
||||
* Create and run a new project using the conventions of Cargo
|
||||
|
||||
This is a great time to build a more substantial program to get used to reading
|
||||
|
@ -201,7 +201,7 @@ io::stdin().read_line(&mut guess)
|
||||
.expect("Failed to read line");
|
||||
```
|
||||
|
||||
If we hadn’t listed the `use std::io` line at the beginning of the program, we
|
||||
If we hadn’t put the `use std::io` line at the beginning of the program, we
|
||||
could have written this function call as `std::io::stdin`. The `stdin` function
|
||||
returns an instance of [`std::io::Stdin`][iostdin]<!-- ignore -->, which is a
|
||||
type that represents a handle to the standard input for your terminal.
|
||||
@ -373,23 +373,28 @@ code that uses `rand`, we need to modify the *Cargo.toml* file to include the
|
||||
the bottom beneath the `[dependencies]` section header that Cargo created for
|
||||
you:
|
||||
|
||||
<!-- When updating the version of `rand` used, also update the version of
|
||||
`rand` used in these files so they all match:
|
||||
* ch07-04-bringing-paths-into-scope-with-the-use-keyword.md
|
||||
* ch14-03-cargo-workspaces.md
|
||||
-->
|
||||
|
||||
<span class="filename">Filename: Cargo.toml</span>
|
||||
|
||||
```toml
|
||||
[dependencies]
|
||||
|
||||
rand = "0.3.14"
|
||||
rand = "0.5.5"
|
||||
```

In the *Cargo.toml* file, everything that follows a header is part of a section
that continues until another section starts. The `[dependencies]` section is
where you tell Cargo which external crates your project depends on and which
versions of those crates you require. In this case, we’ll specify the `rand`
crate with the semantic version specifier `0.3.14`. Cargo understands [Semantic
crate with the semantic version specifier `0.5.5`. Cargo understands [Semantic
Versioning][semver]<!-- ignore --> (sometimes called *SemVer*), which is a
standard for writing version numbers. The number `0.3.14` is actually shorthand
for `^0.3.14`, which means “any version that has a public API compatible with
version 0.3.14.”
standard for writing version numbers. The number `0.5.5` is actually shorthand
for `^0.5.5`, which means “any version that has a public API compatible with
version 0.5.5.”

[semver]: http://semver.org
@ -398,13 +403,19 @@ Listing 2-2.
|
||||
|
||||
```text
|
||||
$ cargo build
|
||||
Updating registry `https://github.com/rust-lang/crates.io-index`
|
||||
Downloading rand v0.3.14
|
||||
Downloading libc v0.2.14
|
||||
Compiling libc v0.2.14
|
||||
Compiling rand v0.3.14
|
||||
Updating crates.io index
|
||||
Downloaded rand v0.5.5
|
||||
Downloaded libc v0.2.62
|
||||
Downloaded rand_core v0.2.2
|
||||
Downloaded rand_core v0.3.1
|
||||
Downloaded rand_core v0.4.2
|
||||
Compiling rand_core v0.4.2
|
||||
Compiling libc v0.2.62
|
||||
Compiling rand_core v0.3.1
|
||||
Compiling rand_core v0.2.2
|
||||
Compiling rand v0.5.5
|
||||
Compiling guessing_game v0.1.0 (file:///projects/guessing_game)
|
||||
Finished dev [unoptimized + debuginfo] target(s) in 2.53 secs
|
||||
Finished dev [unoptimized + debuginfo] target(s) in 2.53 s
|
||||
```
|
||||
|
||||
<span class="caption">Listing 2-2: The output from running `cargo build` after
|
||||
@ -422,8 +433,8 @@ their open source Rust projects for others to use.
|
||||
|
||||
After updating the registry, Cargo checks the `[dependencies]` section and
|
||||
downloads any crates you don’t have yet. In this case, although we only listed
|
||||
`rand` as a dependency, Cargo also grabbed a copy of `libc`, because `rand`
|
||||
depends on `libc` to work. After downloading the crates, Rust compiles them and
|
||||
`rand` as a dependency, Cargo also grabbed `libc` and `rand_core`, because `rand`
|
||||
depends on those to work. After downloading the crates, Rust compiles them and
|
||||
then compiles the project with the dependencies available.
|
||||
|
||||
If you immediately run `cargo build` again without making any changes, you
|
||||
@ -439,7 +450,7 @@ and build again, you’ll only see two lines of output:
|
||||
```text
|
||||
$ cargo build
|
||||
Compiling guessing_game v0.1.0 (file:///projects/guessing_game)
|
||||
Finished dev [unoptimized + debuginfo] target(s) in 2.53 secs
|
||||
Finished dev [unoptimized + debuginfo] target(s) in 2.53s
|
||||
```
|
||||
|
||||
These lines show Cargo only updates the build with your tiny change to the
|
||||
@ -452,7 +463,7 @@ your part of the code.
|
||||
Cargo has a mechanism that ensures you can rebuild the same artifact every time
|
||||
you or anyone else builds your code: Cargo will use only the versions of the
|
||||
dependencies you specified until you indicate otherwise. For example, what
|
||||
happens if next week version 0.3.15 of the `rand` crate comes out and
|
||||
happens if next week version 0.5.6 of the `rand` crate comes out and
|
||||
contains an important bug fix but also contains a regression that will break
|
||||
your code?
|
||||
|
||||
@ -464,7 +475,7 @@ the *Cargo.lock* file. When you build your project in the future, Cargo will
see that the *Cargo.lock* file exists and use the versions specified there
rather than doing all the work of figuring out versions again. This lets you
have a reproducible build automatically. In other words, your project will
remain at `0.3.14` until you explicitly upgrade, thanks to the *Cargo.lock*
remain at `0.5.5` until you explicitly upgrade, thanks to the *Cargo.lock*
file.

#### Updating a Crate to Get a New Version
@ -474,26 +485,25 @@ which will ignore the *Cargo.lock* file and figure out all the latest versions
|
||||
that fit your specifications in *Cargo.toml*. If that works, Cargo will write
|
||||
those versions to the *Cargo.lock* file.
|
||||
|
||||
But by default, Cargo will only look for versions greater than `0.3.0` and less
|
||||
than `0.4.0`. If the `rand` crate has released two new versions, `0.3.15` and
|
||||
`0.4.0`, you would see the following if you ran `cargo update`:
|
||||
But by default, Cargo will only look for versions greater than `0.5.5` and less
|
||||
than `0.6.0`. If the `rand` crate has released two new versions, `0.5.6` and
|
||||
`0.6.0`, you would see the following if you ran `cargo update`:
|
||||
|
||||
```text
|
||||
$ cargo update
|
||||
Updating registry `https://github.com/rust-lang/crates.io-index`
|
||||
Updating rand v0.3.14 -> v0.3.15
|
||||
Updating crates.io index
|
||||
Updating rand v0.5.5 -> v0.5.6
|
||||
```
|
||||
|
||||
At this point, you would also notice a change in your *Cargo.lock* file noting
|
||||
that the version of the `rand` crate you are now using is `0.3.15`.
|
||||
that the version of the `rand` crate you are now using is `0.5.6`.
|
||||
|
||||
If you wanted to use `rand` version `0.4.0` or any version in the `0.4.x`
|
||||
If you wanted to use `rand` version `0.6.0` or any version in the `0.6.x`
|
||||
series, you’d have to update the *Cargo.toml* file to look like this instead:
|
||||
|
||||
```toml
|
||||
[dependencies]
|
||||
|
||||
rand = "0.4.0"
|
||||
rand = "0.6.0"
|
||||
```
|
||||
|
||||
The next time you run `cargo build`, Cargo will update the registry of crates
|
||||
|
@ -65,7 +65,7 @@ But mutability can be very useful. Variables are immutable only by default; as
you did in Chapter 2, you can make them mutable by adding `mut` in front of the
variable name. In addition to allowing this value to change, `mut` conveys
intent to future readers of the code by indicating that other parts of the code
will be changing this variable's value.
will be changing this variable’s value.
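
For quick reference, a minimal sketch of the kind of reassignment this paragraph describes; it mirrors the change the chapter makes to *src/main.rs* just below:

```rust
fn main() {
    // Without `mut`, the reassignment below would be a compile-time error.
    let mut x = 5;
    println!("The value of x is: {}", x);
    x = 6;
    println!("The value of x is: {}", x);
}
```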

For example, let’s change *src/main.rs* to the following:
@ -228,9 +228,9 @@ primitive compound types: tuples and arrays.

#### The Tuple Type

A tuple is a general way of grouping together some number of other values
with a variety of types into one compound type. Tuples have a fixed length:
once declared, they cannot grow or shrink in size.
A tuple is a general way of grouping together a number of values with a variety
of types into one compound type. Tuples have a fixed length: once declared,
they cannot grow or shrink in size.

We create a tuple by writing a comma-separated list of values inside
parentheses. Each position in the tuple has a type, and the types of the
@ -286,8 +286,8 @@ fn main() {
```

This program creates a tuple, `x`, and then makes new variables for each
element by using their index. As with most programming languages, the first
index in a tuple is 0.
element by using their respective indices. As with most programming languages,
the first index in a tuple is 0.
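
For reference, a small self-contained sketch of this kind of index access (the variable names are illustrative):

```rust
fn main() {
    let x: (i32, f64, u8) = (500, 6.4, 1);

    // Tuple elements are accessed by position, starting at index 0.
    let five_hundred = x.0;
    let six_point_four = x.1;
    let one = x.2;

    println!("{}, {}, {}", five_hundred, six_point_four, one);
}
```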

#### The Array Type
@ -318,7 +318,7 @@ vector. Chapter 8 discusses vectors in more detail.
An example of when you might want to use an array rather than a vector is in a
program that needs to know the names of the months of the year. It’s very
unlikely that such a program will need to add or remove months, so you can use
an array because you know it will always contain 12 items:
an array because you know it will always contain 12 elements:

```rust
let months = ["January", "February", "March", "April", "May", "June", "July",
@ -334,7 +334,7 @@ let a: [i32; 5] = [1, 2, 3, 4, 5];
```

Here, `i32` is the type of each element. After the semicolon, the number `5`
indicates the element contains five items.
indicates the array contains five elements.

Writing an array’s type this way looks similar to an alternative syntax for
initializing an array: if you want to create an array that contains the same
@ -1,7 +1,7 @@
# Enums and Pattern Matching

In this chapter we’ll look at *enumerations*, also referred to as *enums*.
Enums allow you to define a type by enumerating its possible values. First,
Enums allow you to define a type by enumerating its possible *variants*. First,
we’ll define and use an enum to show how an enum can encode meaning along with
data. Next, we’ll explore a particularly useful enum, called `Option`, which
expresses that a value can be either something or nothing. Then we’ll look at
@ -5,18 +5,18 @@ are useful and more appropriate than structs in this case. Say we need to work
with IP addresses. Currently, two major standards are used for IP addresses:
version four and version six. These are the only possibilities for an IP
address that our program will come across: we can *enumerate* all possible
values, which is where enumeration gets its name.
variants, which is where enumeration gets its name.

Any IP address can be either a version four or a version six address, but not
both at the same time. That property of IP addresses makes the enum data
structure appropriate, because enum values can only be one of the variants.
structure appropriate, because enum values can only be one of its variants.
Both version four and version six addresses are still fundamentally IP
addresses, so they should be treated as the same type when the code is handling
situations that apply to any kind of IP address.

We can express this concept in code by defining an `IpAddrKind` enumeration and
listing the possible kinds an IP address can be, `V4` and `V6`. These are known
as the *variants* of the enum:
listing the possible kinds an IP address can be, `V4` and `V6`. These are the
variants of the enum:

```rust
enum IpAddrKind {
@ -241,6 +241,12 @@ In Chapter 2, we programmed a guessing game project that used an external
|
||||
package called `rand` to get random numbers. To use `rand` in our project, we
|
||||
added this line to *Cargo.toml*:
|
||||
|
||||
<!-- When updating the version of `rand` used, also update the version of
|
||||
`rand` used in these files so they all match:
|
||||
* ch02-00-guessing-game-tutorial.md
|
||||
* ch14-03-cargo-workspaces.md
|
||||
-->
|
||||
|
||||
<span class="filename">Filename: Cargo.toml</span>
|
||||
|
||||
```toml
|
||||
|
@ -76,7 +76,7 @@ that module.
|
||||
|
||||
## Summary
|
||||
|
||||
Rust lets you organize your packages into crates and your crates into modules
|
||||
Rust lets you split a package into multiple crates and a crate into modules
|
||||
so you can refer to items defined in one module from another module. You can do
|
||||
this by specifying absolute or relative paths. These paths can be brought into
|
||||
scope with a `use` statement so you can use a shorter path for multiple uses of
|
||||
|
@ -470,13 +470,13 @@ and returns it. Of course, using `fs::read_to_string` doesn’t give us the
opportunity to explain all the error handling, so we did it the longer way
first.

#### The `?` Operator Can Only Be Used in Functions That Return `Result`
#### The `?` Operator Can Be Used in Functions That Return `Result`

The `?` operator can only be used in functions that have a return type of
The `?` operator can be used in functions that have a return type of
`Result`, because it is defined to work in the same way as the `match`
expression we defined in Listing 9-6. The part of the `match` that requires a
return type of `Result` is `return Err(e)`, so the return type of the function
must be a `Result` to be compatible with this `return`.
can be a `Result` to be compatible with this `return`.
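
As a reminder of the shape being discussed, a minimal sketch of `?` inside a function whose return type is `Result` (the file name is illustrative):

```rust
use std::fs::File;
use std::io::{self, Read};

// Each `?` either unwraps an `Ok` value or returns the `Err` to the caller,
// which is why the enclosing function must itself return a `Result`.
fn read_username_from_file() -> Result<String, io::Error> {
    let mut s = String::new();
    File::open("hello.txt")?.read_to_string(&mut s)?;
    Ok(s)
}

fn main() {
    match read_username_from_file() {
        Ok(name) => println!("read: {}", name),
        Err(e) => println!("error: {}", e),
    }
}
```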

Let’s look at what happens if we use the `?` operator in the `main` function,
which you’ll recall has a return type of `()`:
@ -505,8 +505,9 @@ error[E0277]: the `?` operator can only be used in a function that returns
```

This error points out that we’re only allowed to use the `?` operator in a
function that returns `Result<T, E>`. When you’re writing code in a function
that doesn’t return `Result<T, E>`, and you want to use `?` when you call other
function that returns `Result` or `Option` or another type that implements
`std::ops::Try`. When you’re writing code in a function
that doesn’t return one of these types, and you want to use `?` when you call other
functions that return `Result<T, E>`, you have two choices to fix this problem.
One technique is to change the return type of your function to be `Result<T,
E>` if you have no restrictions preventing that. The other technique is to use
@ -611,12 +611,12 @@ reduce duplication but also specify to the compiler that we want the generic
|
||||
type to have particular behavior. The compiler can then use the trait bound
|
||||
information to check that all the concrete types used with our code provide the
|
||||
correct behavior. In dynamically typed languages, we would get an error at
|
||||
runtime if we called a method on a type that the type didn’t implement. But
|
||||
Rust moves these errors to compile time so we’re forced to fix the problems
|
||||
before our code is even able to run. Additionally, we don’t have to write code
|
||||
that checks for behavior at runtime because we’ve already checked at compile
|
||||
time. Doing so improves performance without having to give up the flexibility
|
||||
of generics.
|
||||
runtime if we called a method on a type which didn’t implement the type which
|
||||
defines the method. But Rust moves these errors to compile time so we’re forced
|
||||
to fix the problems before our code is even able to run. Additionally, we don’t
|
||||
have to write code that checks for behavior at runtime because we’ve already
|
||||
checked at compile time. Doing so improves performance without having to give
|
||||
up the flexibility of generics.
|
||||
|
||||
Another kind of generic that we’ve already been using is called *lifetimes*.
|
||||
Rather than ensuring that a type has the behavior we want, lifetimes ensure
|
||||
|
@ -196,7 +196,7 @@ pub fn search<'a>(query: &str, contents: &'a str) -> Vec<&'a str> {
|
||||
</span>
|
||||
|
||||
The `lines` method returns an iterator. We’ll talk about iterators in depth in
|
||||
[Chapter 13][ch13]<!-- ignore -->, but recall that you saw this way of using an
|
||||
[Chapter 13][ch13-iterators]<!-- ignore -->, but recall that you saw this way of using an
|
||||
iterator in [Listing 3-5][ch3-iter]<!-- ignore -->, where we used a `for` loop
|
||||
with an iterator to run some code on each item in a collection.
|
||||
|
||||
@ -266,7 +266,7 @@ At this point, we could consider opportunities for refactoring the
|
||||
implementation of the search function while keeping the tests passing to
|
||||
maintain the same functionality. The code in the search function isn’t too bad,
|
||||
but it doesn’t take advantage of some useful features of iterators. We’ll
|
||||
return to this example in [Chapter 13][ch13]<!-- ignore -->, where we’ll
|
||||
return to this example in [Chapter 13][ch13-iterators]<!-- ignore -->, where we’ll
|
||||
explore iterators in detail, and look at how to improve it.
|
||||
|
||||
#### Using the `search` Function in the `run` Function
|
||||
@ -336,3 +336,4 @@ ch10-03-lifetime-syntax.html#validating-references-with-lifetimes
|
||||
[ch11-anatomy]: ch11-01-writing-tests.html#the-anatomy-of-a-test-function
|
||||
[ch10-lifetimes]: ch10-03-lifetime-syntax.html
|
||||
[ch3-iter]: ch03-05-control-flow.html#looping-through-a-collection-with-for
|
||||
[ch13-iterators]: ch13-02-iterators.html
|
||||
|
@ -133,11 +133,9 @@ The first `if` block calls `simulated_expensive_calculation` twice, the `if`
|
||||
inside the outer `else` doesn’t call it at all, and the code inside the
|
||||
second `else` case calls it once.
|
||||
|
||||
<!-- NEXT PARAGRAPH WRAPPED WEIRD INTENTIONALLY SEE #199 -->
|
||||
|
||||
The desired behavior of the `generate_workout` function is to first check
|
||||
whether the user wants a low-intensity workout (indicated by a number less
|
||||
than 25) or a high-intensity workout (a number of 25 or greater).
|
||||
whether the user wants a low-intensity workout (indicated by a number less than
|
||||
25) or a high-intensity workout (a number of 25 or greater).
|
||||
|
||||
Low-intensity workout plans will recommend a number of push-ups and sit-ups
|
||||
based on the complex algorithm we’re simulating.
|
||||
|
@ -192,12 +192,17 @@ each other. Let’s add the `rand` crate to the `[dependencies]` section in the
|
||||
*add-one/Cargo.toml* file to be able to use the `rand` crate in the `add-one`
|
||||
crate:
|
||||
|
||||
<!-- When updating the version of `rand` used, also update the version of
|
||||
`rand` used in these files so they all match:
|
||||
* ch02-00-guessing-game-tutorial.md
|
||||
* ch07-04-bringing-paths-into-scope-with-the-use-keyword.md
|
||||
-->
|
||||
|
||||
<span class="filename">Filename: add-one/Cargo.toml</span>
|
||||
|
||||
```toml
|
||||
[dependencies]
|
||||
|
||||
rand = "0.3.14"
|
||||
rand = "0.5.5"
|
||||
```
|
||||
|
||||
We can now add `use rand;` to the *add-one/src/lib.rs* file, and building the
|
||||
@ -206,10 +211,10 @@ and compile the `rand` crate:
|
||||
|
||||
```text
|
||||
$ cargo build
|
||||
Updating registry `https://github.com/rust-lang/crates.io-index`
|
||||
Downloading rand v0.3.14
|
||||
Updating crates.io index
|
||||
Downloaded rand v0.5.5
|
||||
--snip--
|
||||
Compiling rand v0.3.14
|
||||
Compiling rand v0.5.5
|
||||
Compiling add-one v0.1.0 (file:///projects/add/add-one)
|
||||
Compiling adder v0.1.0 (file:///projects/add/adder)
|
||||
Finished dev [unoptimized + debuginfo] target(s) in 10.18 secs
|
||||
|
@ -58,7 +58,7 @@ an instance of your type goes out of scope. We’re printing some text here to
|
||||
demonstrate when Rust will call `drop`.
|
||||
|
||||
In `main`, we create two instances of `CustomSmartPointer` and then print
|
||||
`CustomSmartPointers created.`. At the end of `main`, our instances of
|
||||
`CustomSmartPointers created`. At the end of `main`, our instances of
|
||||
`CustomSmartPointer` will go out of scope, and Rust will call the code we put
|
||||
in the `drop` method, printing our final message. Note that we didn’t need to
|
||||
call the `drop` method explicitly.
|
||||
@ -84,7 +84,7 @@ functionality. Disabling `drop` isn’t usually necessary; the whole point of th
`Drop` trait is that it’s taken care of automatically. Occasionally, however,
you might want to clean up a value early. One example is when using smart
pointers that manage locks: you might want to force the `drop` method that
releases the lock to run so other code in the same scope can acquire the lock.
releases the lock so that other code in the same scope can acquire the lock.
Rust doesn’t let you call the `Drop` trait’s `drop` method manually; instead
you have to call the `std::mem::drop` function provided by the standard library
if you want to force a value to be dropped before the end of its scope.
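
A small self-contained sketch of forcing an early drop with the prelude’s `drop` function (the `Noisy` type here is illustrative, not the chapter’s `CustomSmartPointer`):

```rust
struct Noisy(&'static str);

impl Drop for Noisy {
    fn drop(&mut self) {
        println!("Dropping {}", self.0);
    }
}

fn main() {
    let guard = Noisy("lock guard");
    // `drop` here is std::mem::drop from the prelude: it takes ownership,
    // so the value is dropped now instead of at the end of the scope.
    drop(guard);
    println!("Continuing after the early drop.");
}
```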

@ -146,7 +146,7 @@ an argument. The function is in the prelude, so we can modify `main` in Listing
#
# impl Drop for CustomSmartPointer {
#     fn drop(&mut self) {
#         println!("Dropping CustomSmartPointer!");
#         println!("Dropping CustomSmartPointer with data `{}`!", self.data);
#     }
# }
#
@ -1,16 +1,14 @@
|
||||
## `RefCell<T>` and the Interior Mutability Pattern
|
||||
|
||||
<!-- NEXT PARAGRAPH WRAPPED WEIRD INTENTIONALLY SEE #199 -->
|
||||
|
||||
*Interior mutability* is a design pattern in Rust that allows you to mutate
|
||||
data even when there are immutable references to that data; normally, this
|
||||
action is disallowed by the borrowing rules. To mutate data, the pattern uses
|
||||
`unsafe` code inside a data structure to bend Rust’s usual rules that govern
|
||||
mutation and borrowing. We haven’t yet covered unsafe code; we will in
|
||||
Chapter 19. We can use types that use the interior mutability pattern when we
|
||||
can ensure that the borrowing rules will be followed at runtime, even though
|
||||
the compiler can’t guarantee that. The `unsafe` code involved is then wrapped
|
||||
in a safe API, and the outer type is still immutable.
|
||||
mutation and borrowing. We haven’t yet covered unsafe code; we will in Chapter
|
||||
19. We can use types that use the interior mutability pattern when we can
|
||||
ensure that the borrowing rules will be followed at runtime, even though the
|
||||
compiler can’t guarantee that. The `unsafe` code involved is then wrapped in a
|
||||
safe API, and the outer type is still immutable.
|
||||
|
||||
Let’s explore this concept by looking at the `RefCell<T>` type that follows the
|
||||
interior mutability pattern.
|
||||
|
@ -55,16 +55,14 @@ of the streams will end up in one river at the end. We’ll start with a single
producer for now, but we’ll add multiple producers when we get this example
working.

<!-- NEXT PARAGRAPH WRAPPED WEIRD INTENTIONALLY SEE #199 -->

The `mpsc::channel` function returns a tuple, the first element of which is the
sending end and the second element is the receiving end. The abbreviations `tx`
and `rx` are traditionally used in many fields for *transmitter* and *receiver*
respectively, so we name our variables as such to indicate each end. We’re
using a `let` statement with a pattern that destructures the tuples; we’ll
discuss the use of patterns in `let` statements and destructuring in
Chapter 18. Using a `let` statement this way is a convenient approach to
extract the pieces of the tuple returned by `mpsc::channel`.
discuss the use of patterns in `let` statements and destructuring in Chapter
18. Using a `let` statement this way is a convenient approach to extract the
pieces of the tuple returned by `mpsc::channel`.
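
For orientation, a minimal sketch of the destructuring described above; sending from a spawned thread is the step the chapter introduces next:

```rust
use std::sync::mpsc;
use std::thread;

fn main() {
    // Destructure the (sending, receiving) halves returned by `channel`.
    let (tx, rx) = mpsc::channel();

    thread::spawn(move || {
        tx.send(String::from("hi")).unwrap();
    });

    println!("Got: {}", rx.recv().unwrap());
}
```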

Let’s move the transmitting end into a spawned thread and have it send one
string so the spawned thread is communicating with the main thread, as shown in
@ -275,14 +275,15 @@ new type and draw it because `SelectBox` implements the `Draw` trait, which
|
||||
means it implements the `draw` method.
|
||||
|
||||
This concept—of being concerned only with the messages a value responds to
|
||||
rather than the value’s concrete type—is similar to the concept *duck typing*
|
||||
in dynamically typed languages: if it walks like a duck and quacks like a duck,
|
||||
then it must be a duck! In the implementation of `run` on `Screen` in Listing
|
||||
17-5, `run` doesn’t need to know what the concrete type of each component is.
|
||||
It doesn’t check whether a component is an instance of a `Button` or a
|
||||
`SelectBox`, it just calls the `draw` method on the component. By specifying
|
||||
`Box<dyn Draw>` as the type of the values in the `components` vector, we’ve
|
||||
defined `Screen` to need values that we can call the `draw` method on.
|
||||
rather than the value’s concrete type—is similar to the concept of *duck
|
||||
typing* in dynamically typed languages: if it walks like a duck and quacks
|
||||
like a duck, then it must be a duck! In the implementation of `run` on `Screen`
|
||||
in Listing 17-5, `run` doesn’t need to know what the concrete type of each
|
||||
component is. It doesn’t check whether a component is an instance of a `Button`
|
||||
or a `SelectBox`, it just calls the `draw` method on the component. By
|
||||
specifying `Box<dyn Draw>` as the type of the values in the `components`
|
||||
vector, we’ve defined `Screen` to need values that we can call the `draw`
|
||||
method on.
|
||||
|
||||
The advantage of using trait objects and Rust’s type system to write code
|
||||
similar to code using duck typing is that we never have to check whether a
|
||||
|
@ -10,8 +10,9 @@ a_value` because if the value in the `a_value` variable is `None` rather than
|
||||
|
||||
Function parameters, `let` statements, and `for` loops can only accept
|
||||
irrefutable patterns, because the program cannot do anything meaningful when
|
||||
values don’t match. The `if let` and `while let` expressions only accept
|
||||
refutable patterns, because by definition they’re intended to handle possible
|
||||
values don’t match. The `if let` and `while let` expressions accept
|
||||
refutable and irrefutable patterns, but the compiler warns against
|
||||
irrefutable patterns because by definition they’re intended to handle possible
|
||||
failure: the functionality of a conditional is in its ability to perform
|
||||
differently depending on success or failure.
|
||||
|
||||
@ -69,9 +70,9 @@ patterns instead of `let`</span>
|
||||
We’ve given the code an out! This code is perfectly valid, although it means we
|
||||
cannot use an irrefutable pattern without receiving an error. If we give `if
|
||||
let` a pattern that will always match, such as `x`, as shown in Listing 18-10,
|
||||
it will not compile.
|
||||
the compiler will give a warning.
|
||||
|
||||
```rust,ignore,does_not_compile
|
||||
```rust,ignore
|
||||
if let x = 5 {
|
||||
println!("{}", x);
|
||||
};
|
||||
@ -84,11 +85,15 @@ Rust complains that it doesn’t make sense to use `if let` with an irrefutable
|
||||
pattern:
|
||||
|
||||
```text
|
||||
error[E0162]: irrefutable if-let pattern
|
||||
--> <anon>:2:8
|
||||
warning: irrefutable if-let pattern
|
||||
--> <anon>:2:5
|
||||
|
|
||||
2 | if let x = 5 {
|
||||
| ^ irrefutable pattern
|
||||
2 | / if let x = 5 {
|
||||
3 | | println!("{}", x);
|
||||
4 | | };
|
||||
| |_^
|
||||
|
|
||||
= note: #[warn(irrefutable_let_patterns)] on by default
|
||||
```
|
||||
|
||||
For this reason, match arms must use refutable patterns, except for the last
|
||||
|
@ -711,11 +711,11 @@ fn main() {
|
||||
|
||||
match x {
|
||||
Some(50) => println!("Got 50"),
|
||||
Some(n) if n == y => println!("Matched, n = {:?}", n),
|
||||
Some(n) if n == y => println!("Matched, n = {}", n),
|
||||
_ => println!("Default case, x = {:?}", x),
|
||||
}
|
||||
|
||||
println!("at the end: x = {:?}, y = {:?}", x, y);
|
||||
println!("at the end: x = {:?}, y = {}", x, y);
|
||||
}
|
||||
```
|
||||
|
||||
|
@ -251,7 +251,7 @@ fn split_at_mut(slice: &mut [i32], mid: usize) -> (&mut [i32], &mut [i32]) {
This function first gets the total length of the slice. Then it asserts that
the index given as a parameter is within the slice by checking whether it’s
less than or equal to the length. The assertion means that if we pass an index
that is greater than the index to split the slice at, the function will panic
that is greater than the length to split the slice at, the function will panic
before it attempts to use that index.

Then we return two mutable slices in a tuple: one from the start of the
@ -364,7 +364,7 @@ impl ThreadPool {
```

We still use the `()` after `FnOnce` because this `FnOnce` represents a closure
that takes no parameters and doesn’t return a value. Just like function
that takes no parameters and returns the unit type `()`. Just like function
definitions, the return type can be omitted from the signature, but even if we
have no parameters, we still need the parentheses.
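
A minimal self-contained sketch of such a bound (the function name is illustrative and not the chapter’s `ThreadPool` code):

```rust
// `FnOnce()` means: a closure that takes no parameters and returns `()`.
fn run_job<F>(job: F)
where
    F: FnOnce() + Send + 'static,
{
    job();
}

fn main() {
    run_job(|| println!("job executed"));
}
```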
@ -1,812 +1,7 @@
|
||||
% Grammar
|
||||
|
||||
# Introduction
|
||||
The Rust grammar may now be found in the [reference]. Additionally, the [grammar
|
||||
working group] is working on producing a testable grammar.
|
||||
|
||||
This document is the primary reference for the Rust programming language grammar. It
|
||||
provides only one kind of material:
|
||||
|
||||
- Chapters that formally define the language grammar.
|
||||
|
||||
This document does not serve as an introduction to the language. Background
|
||||
familiarity with the language is assumed. A separate [guide] is available to
|
||||
help acquire such background.
|
||||
|
||||
This document also does not serve as a reference to the [standard] library
|
||||
included in the language distribution. Those libraries are documented
|
||||
separately by extracting documentation attributes from their source code. Many
|
||||
of the features that one might expect to be language features are library
|
||||
features in Rust, so what you're looking for may be there, not here.
|
||||
|
||||
[guide]: guide.html
|
||||
[standard]: std/index.html
|
||||
|
||||
# Notation
|
||||
|
||||
Rust's grammar is defined over Unicode codepoints, each conventionally denoted
|
||||
`U+XXXX`, for 4 or more hexadecimal digits `X`. _Most_ of Rust's grammar is
|
||||
confined to the ASCII range of Unicode, and is described in this document by a
|
||||
dialect of Extended Backus-Naur Form (EBNF), specifically a dialect of EBNF
|
||||
supported by common automated LL(k) parsing tools such as `llgen`, rather than
|
||||
the dialect given in ISO 14977. The dialect can be defined self-referentially
|
||||
as follows:
|
||||
|
||||
```antlr
|
||||
grammar : rule + ;
|
||||
rule : nonterminal ':' productionrule ';' ;
|
||||
productionrule : production [ '|' production ] * ;
|
||||
production : term * ;
|
||||
term : element repeats ;
|
||||
element : LITERAL | IDENTIFIER | '[' productionrule ']' ;
|
||||
repeats : [ '*' | '+' ] NUMBER ? | NUMBER ? | '?' ;
|
||||
```
|
||||
|
||||
Where:
|
||||
|
||||
- Whitespace in the grammar is ignored.
|
||||
- Square brackets are used to group rules.
|
||||
- `LITERAL` is a single printable ASCII character, or an escaped hexadecimal
|
||||
ASCII code of the form `\xQQ`, in single quotes, denoting the corresponding
|
||||
Unicode codepoint `U+00QQ`.
|
||||
- `IDENTIFIER` is a nonempty string of ASCII letters and underscores.
|
||||
- The `repeat` forms apply to the adjacent `element`, and are as follows:
|
||||
- `?` means zero or one repetition
|
||||
- `*` means zero or more repetitions
|
||||
- `+` means one or more repetitions
|
||||
- NUMBER trailing a repeat symbol gives a maximum repetition count
|
||||
- NUMBER on its own gives an exact repetition count
|
||||
|
||||
This EBNF dialect should hopefully be familiar to many readers.
|
||||
|
||||
## Unicode productions
|
||||
|
||||
A few productions in Rust's grammar permit Unicode codepoints outside the ASCII
|
||||
range. We define these productions in terms of character properties specified
|
||||
in the Unicode standard, rather than in terms of ASCII-range codepoints. The
|
||||
section [Special Unicode Productions](#special-unicode-productions) lists these
|
||||
productions.
|
||||
|
||||
## String table productions
|
||||
|
||||
Some rules in the grammar — notably [unary
|
||||
operators](#unary-operator-expressions), [binary
|
||||
operators](#binary-operator-expressions), and [keywords](#keywords) — are
|
||||
given in a simplified form: as a listing of a table of unquoted, printable
|
||||
whitespace-separated strings. These cases form a subset of the rules regarding
|
||||
the [token](#tokens) rule, and are assumed to be the result of a
|
||||
lexical-analysis phase feeding the parser, driven by a DFA, operating over the
|
||||
disjunction of all such string table entries.
|
||||
|
||||
When such a string enclosed in double-quotes (`"`) occurs inside the grammar,
|
||||
it is an implicit reference to a single member of such a string table
|
||||
production. See [tokens](#tokens) for more information.
|
||||
|
||||
# Lexical structure
|
||||
|
||||
## Input format
|
||||
|
||||
Rust input is interpreted as a sequence of Unicode codepoints encoded in UTF-8.
|
||||
Most Rust grammar rules are defined in terms of printable ASCII-range
|
||||
codepoints, but a small number are defined in terms of Unicode properties or
|
||||
explicit codepoint lists. [^inputformat]
|
||||
|
||||
[^inputformat]: Substitute definitions for the special Unicode productions are
|
||||
provided to the grammar verifier, restricted to ASCII range, when verifying the
|
||||
grammar in this document.
|
||||
|
||||
## Special Unicode Productions
|
||||
|
||||
The following productions in the Rust grammar are defined in terms of Unicode
|
||||
properties: `ident`, `non_null`, `non_eol`, `non_single_quote` and
|
||||
`non_double_quote`.
|
||||
|
||||
### Identifiers
|
||||
|
||||
The `ident` production is any nonempty Unicode string of
|
||||
the following form:
|
||||
|
||||
- The first character is in one of the following ranges `U+0041` to `U+005A`
|
||||
("A" to "Z"), `U+0061` to `U+007A` ("a" to "z"), or `U+005F` ("\_").
|
||||
- The remaining characters are in the range `U+0030` to `U+0039` ("0" to "9"),
|
||||
or any of the prior valid initial characters.
|
||||
|
||||
as long as the identifier does _not_ occur in the set of [keywords](#keywords).
|
||||
|
||||
### Delimiter-restricted productions
|
||||
|
||||
Some productions are defined by exclusion of particular Unicode characters:
|
||||
|
||||
- `non_null` is any single Unicode character aside from `U+0000` (null)
|
||||
- `non_eol` is any single Unicode character aside from `U+000A` (`'\n'`)
|
||||
- `non_single_quote` is any single Unicode character aside from `U+0027` (`'`)
|
||||
- `non_double_quote` is any single Unicode character aside from `U+0022` (`"`)
|
||||
|
||||
## Comments
|
||||
|
||||
```antlr
|
||||
comment : block_comment | line_comment ;
|
||||
block_comment : "/*" block_comment_body * "*/" ;
|
||||
block_comment_body : [block_comment | character] * ;
|
||||
line_comment : "//" non_eol * ;
|
||||
```
|
||||
|
||||
**FIXME:** add doc grammar?
|
||||
|
||||
## Whitespace
|
||||
|
||||
```antlr
|
||||
whitespace_char : '\x20' | '\x09' | '\x0a' | '\x0d' ;
|
||||
whitespace : [ whitespace_char | comment ] + ;
|
||||
```
|
||||
|
||||
## Tokens
|
||||
|
||||
```antlr
|
||||
simple_token : keyword | unop | binop ;
|
||||
token : simple_token | ident | literal | symbol | whitespace token ;
|
||||
```
|
||||
|
||||
### Keywords
|
||||
|
||||
<p id="keyword-table-marker"></p>
|
||||
|
||||
| | | | | |
|
||||
|----------|----------|----------|----------|----------|
|
||||
| _ | abstract | alignof | as | become |
|
||||
| box | break | const | continue | crate |
|
||||
| do | else | enum | extern | false |
|
||||
| final | fn | for | if | impl |
|
||||
| in | let | loop | macro | match |
|
||||
| mod | move | mut | offsetof | override |
|
||||
| priv | proc | pub | pure | ref |
|
||||
| return | Self | self | sizeof | static |
|
||||
| struct | super | trait | true | type |
|
||||
| typeof | unsafe | unsized | use | virtual |
|
||||
| where | while | yield | | |
|
||||
|
||||
|
||||
Each of these keywords has special meaning in its grammar, and all of them are
|
||||
excluded from the `ident` rule.
|
||||
|
||||
Not all of these keywords are used by the language. Some of them were used
|
||||
before Rust 1.0, and were left reserved once their implementations were
|
||||
removed. Some of them were reserved before 1.0 to make space for possible
|
||||
future features.
|
||||
|
||||
### Literals
|
||||
|
||||
```antlr
|
||||
lit_suffix : ident;
|
||||
literal : [ string_lit | char_lit | byte_string_lit | byte_lit | num_lit | bool_lit ] lit_suffix ?;
|
||||
```
|
||||
|
||||
The optional `lit_suffix` production is only used for certain numeric literals,
|
||||
but is reserved for future extension. That is, the above gives the lexical
|
||||
grammar, but a Rust parser will reject everything but the 12 special cases
|
||||
mentioned in [Number literals](reference/tokens.html#number-literals) in the
|
||||
reference.
|
||||
|
||||
#### Character and string literals
|
||||
|
||||
```antlr
|
||||
char_lit : '\x27' char_body '\x27' ;
|
||||
string_lit : '"' string_body * '"' | 'r' raw_string ;
|
||||
|
||||
char_body : non_single_quote
|
||||
| '\x5c' [ '\x27' | common_escape | unicode_escape ] ;
|
||||
|
||||
string_body : non_double_quote
|
||||
| '\x5c' [ '\x22' | common_escape | unicode_escape ] ;
|
||||
raw_string : '"' raw_string_body '"' | '#' raw_string '#' ;
|
||||
|
||||
common_escape : '\x5c'
|
||||
| 'n' | 'r' | 't' | '0'
|
||||
| 'x' hex_digit 2
|
||||
unicode_escape : 'u' '{' hex_digit+ 6 '}';
|
||||
|
||||
hex_digit : 'a' | 'b' | 'c' | 'd' | 'e' | 'f'
|
||||
| 'A' | 'B' | 'C' | 'D' | 'E' | 'F'
|
||||
| dec_digit ;
|
||||
oct_digit : '0' | '1' | '2' | '3' | '4' | '5' | '6' | '7' ;
|
||||
dec_digit : '0' | nonzero_dec ;
|
||||
nonzero_dec: '1' | '2' | '3' | '4'
|
||||
| '5' | '6' | '7' | '8' | '9' ;
|
||||
```
|
||||
|
||||
#### Byte and byte string literals
|
||||
|
||||
```antlr
|
||||
byte_lit : "b\x27" byte_body '\x27' ;
|
||||
byte_string_lit : "b\x22" string_body * '\x22' | "br" raw_byte_string ;
|
||||
|
||||
byte_body : ascii_non_single_quote
|
||||
| '\x5c' [ '\x27' | common_escape ] ;
|
||||
|
||||
byte_string_body : ascii_non_double_quote
|
||||
| '\x5c' [ '\x22' | common_escape ] ;
|
||||
raw_byte_string : '"' raw_byte_string_body '"' | '#' raw_byte_string '#' ;
|
||||
|
||||
```
|
||||
|
||||
#### Number literals
|
||||
|
||||
```antlr
|
||||
num_lit : nonzero_dec [ dec_digit | '_' ] * float_suffix ?
|
||||
| '0' [ [ dec_digit | '_' ] * float_suffix ?
|
||||
| 'b' [ '1' | '0' | '_' ] +
|
||||
| 'o' [ oct_digit | '_' ] +
|
||||
| 'x' [ hex_digit | '_' ] + ] ;
|
||||
|
||||
float_suffix : [ exponent | '.' dec_lit exponent ? ] ? ;
|
||||
|
||||
exponent : ['E' | 'e'] ['-' | '+' ] ? dec_lit ;
|
||||
dec_lit : [ dec_digit | '_' ] + ;
|
||||
```
|
||||
|
||||
#### Boolean literals
|
||||
|
||||
```antlr
|
||||
bool_lit : [ "true" | "false" ] ;
|
||||
```
|
||||
|
||||
The two values of the boolean type are written `true` and `false`.
|
||||
|
||||
### Symbols
|
||||
|
||||
```antlr
|
||||
symbol : "::" | "->"
|
||||
| '#' | '[' | ']' | '(' | ')' | '{' | '}'
|
||||
| ',' | ';' ;
|
||||
```
|
||||
|
||||
Symbols are a general class of printable [tokens](#tokens) that play structural
|
||||
roles in a variety of grammar productions. They are cataloged here for
|
||||
completeness as the set of remaining miscellaneous printable tokens that do not
|
||||
otherwise appear as [unary operators](#unary-operator-expressions), [binary
|
||||
operators](#binary-operator-expressions), or [keywords](#keywords).
|
||||
|
||||
## Paths
|
||||
|
||||
```antlr
|
||||
expr_path : [ "::" ] ident [ "::" expr_path_tail ] + ;
|
||||
expr_path_tail : '<' type_expr [ ',' type_expr ] + '>'
|
||||
| expr_path ;
|
||||
|
||||
type_path : ident [ type_path_tail ] + ;
|
||||
type_path_tail : '<' type_expr [ ',' type_expr ] + '>'
|
||||
| "::" type_path ;
|
||||
```
|
||||
|
||||
# Syntax extensions
|
||||
|
||||
## Macros
|
||||
|
||||
```antlr
|
||||
expr_macro_rules : "macro_rules" '!' ident '(' macro_rule * ')' ';'
|
||||
| "macro_rules" '!' ident '{' macro_rule * '}' ;
|
||||
macro_rule : '(' matcher * ')' "=>" '(' transcriber * ')' ';' ;
|
||||
matcher : '(' matcher * ')' | '[' matcher * ']'
|
||||
| '{' matcher * '}' | '$' ident ':' ident
|
||||
| '$' '(' matcher * ')' sep_token? [ '*' | '+' ]
|
||||
| non_special_token ;
|
||||
transcriber : '(' transcriber * ')' | '[' transcriber * ']'
|
||||
| '{' transcriber * '}' | '$' ident
|
||||
| '$' '(' transcriber * ')' sep_token? [ '*' | '+' ]
|
||||
| non_special_token ;
|
||||
```
|
||||
|
||||
# Crates and source files
|
||||
|
||||
**FIXME:** grammar? What production covers #![crate_id = "foo"] ?
|
||||
|
||||
# Items and attributes
|
||||
|
||||
**FIXME:** grammar?
|
||||
|
||||
## Items
|
||||
|
||||
```antlr
|
||||
item : vis ? mod_item | fn_item | type_item | struct_item | enum_item
|
||||
| const_item | static_item | trait_item | impl_item | extern_block_item ;
|
||||
```
|
||||
|
||||
### Type Parameters
|
||||
|
||||
**FIXME:** grammar?
|
||||
|
||||
### Modules
|
||||
|
||||
```antlr
|
||||
mod_item : "mod" ident ( ';' | '{' mod '}' );
|
||||
mod : [ view_item | item ] * ;
|
||||
```
|
||||
|
||||
#### View items
|
||||
|
||||
```antlr
|
||||
view_item : extern_crate_decl | use_decl ';' ;
|
||||
```
|
||||
|
||||
##### Extern crate declarations
|
||||
|
||||
```antlr
|
||||
extern_crate_decl : "extern" "crate" crate_name
|
||||
crate_name: ident | ( ident "as" ident )
|
||||
```
|
||||
|
||||
##### Use declarations
|
||||
|
||||
```antlr
|
||||
use_decl : vis ? "use" [ path "as" ident
|
||||
| path_glob ] ;
|
||||
|
||||
path_glob : ident [ "::" [ path_glob
|
||||
| '*' ] ] ?
|
||||
| '{' path_item [ ',' path_item ] * '}' ;
|
||||
|
||||
path_item : ident | "self" ;
|
||||
```
|
||||
|
||||
### Functions
|
||||
|
||||
**FIXME:** grammar?
|
||||
|
||||
#### Generic functions
|
||||
|
||||
**FIXME:** grammar?
|
||||
|
||||
#### Unsafety
|
||||
|
||||
**FIXME:** grammar?
|
||||
|
||||
##### Unsafe functions
|
||||
|
||||
**FIXME:** grammar?
|
||||
|
||||
##### Unsafe blocks
|
||||
|
||||
**FIXME:** grammar?
|
||||
|
||||
#### Diverging functions
|
||||
|
||||
**FIXME:** grammar?
|
||||
|
||||
### Type definitions
|
||||
|
||||
**FIXME:** grammar?
|
||||
|
||||
### Structures
|
||||
|
||||
**FIXME:** grammar?
|
||||
|
||||
### Enumerations
|
||||
|
||||
**FIXME:** grammar?
|
||||
|
||||
### Constant items
|
||||
|
||||
```antlr
|
||||
const_item : "const" ident ':' type '=' expr ';' ;
|
||||
```
|
||||
|
||||
### Static items
|
||||
|
||||
```antlr
|
||||
static_item : "static" ident ':' type '=' expr ';' ;
|
||||
```
|
||||
|
||||
#### Mutable statics
|
||||
|
||||
**FIXME:** grammar?
|
||||
|
||||
### Traits
|
||||
|
||||
**FIXME:** grammar?
|
||||
|
||||
### Implementations
|
||||
|
||||
**FIXME:** grammar?
|
||||
|
||||
### External blocks
|
||||
|
||||
```antlr
|
||||
extern_block_item : "extern" '{' extern_block '}' ;
|
||||
extern_block : [ foreign_fn ] * ;
|
||||
```
|
||||
|
||||
## Visibility and Privacy
|
||||
|
||||
```antlr
|
||||
vis : "pub" ;
|
||||
```
|
||||
### Re-exporting and Visibility
|
||||
|
||||
See [Use declarations](#use-declarations).
|
||||
|
||||
## Attributes
|
||||
|
||||
```antlr
|
||||
attribute : '#' '!' ? '[' meta_item ']' ;
|
||||
meta_item : ident [ '=' literal
|
||||
| '(' meta_seq ')' ] ? ;
|
||||
meta_seq : meta_item [ ',' meta_seq ] ? ;
|
||||
```
|
||||
|
||||
# Statements and expressions
|
||||
|
||||
## Statements
|
||||
|
||||
```antlr
|
||||
stmt : decl_stmt | expr_stmt | ';' ;
|
||||
```
|
||||
|
||||
### Declaration statements
|
||||
|
||||
```antlr
|
||||
decl_stmt : item | let_decl ;
|
||||
```
|
||||
|
||||
#### Item declarations
|
||||
|
||||
See [Items](#items).
|
||||
|
||||
#### Variable declarations
|
||||
|
||||
```antlr
|
||||
let_decl : "let" pat [':' type ] ? [ init ] ? ';' ;
|
||||
init : [ '=' ] expr ;
|
||||
```
|
||||
|
||||
### Expression statements
|
||||
|
||||
```antlr
|
||||
expr_stmt : expr ';' ;
|
||||
```
|
||||
|
||||
## Expressions
|
||||
|
||||
```antlr
|
||||
expr : literal | path | tuple_expr | unit_expr | struct_expr
|
||||
| block_expr | method_call_expr | field_expr | array_expr
|
||||
| idx_expr | range_expr | unop_expr | binop_expr
|
||||
| paren_expr | call_expr | lambda_expr | while_expr
|
||||
| loop_expr | break_expr | continue_expr | for_expr
|
||||
| if_expr | match_expr | if_let_expr | while_let_expr
|
||||
| return_expr ;
|
||||
```
|
||||
|
||||
#### Lvalues, rvalues and temporaries
|
||||
|
||||
**FIXME:** grammar?
|
||||
|
||||
#### Moved and copied types
|
||||
|
||||
**FIXME:** Do we want to capture this in the grammar as different productions?
|
||||
|
||||
### Literal expressions
|
||||
|
||||
See [Literals](#literals).
|
||||
|
||||
### Path expressions
|
||||
|
||||
See [Paths](#paths).
|
||||
|
||||
### Tuple expressions
|
||||
|
||||
```antlr
|
||||
tuple_expr : '(' [ expr [ ',' expr ] * | expr ',' ] ? ')' ;
|
||||
```
|
||||
|
||||
### Unit expressions
|
||||
|
||||
```antlr
|
||||
unit_expr : "()" ;
|
||||
```
|
||||
|
||||
### Structure expressions
|
||||
|
||||
```antlr
|
||||
struct_expr_field_init : ident | ident ':' expr ;
|
||||
struct_expr : expr_path '{' struct_expr_field_init
|
||||
[ ',' struct_expr_field_init ] *
|
||||
[ ".." expr ] '}' |
|
||||
expr_path '(' expr
|
||||
[ ',' expr ] * ')' |
|
||||
expr_path ;
|
||||
```
|
||||
|
||||
### Block expressions
|
||||
|
||||
```antlr
|
||||
block_expr : '{' [ stmt | item ] *
|
||||
[ expr ] '}' ;
|
||||
```
|
||||
|
||||
### Method-call expressions
|
||||
|
||||
```antlr
|
||||
method_call_expr : expr '.' ident paren_expr_list ;
|
||||
```
|
||||
|
||||
### Field expressions
|
||||
|
||||
```antlr
|
||||
field_expr : expr '.' ident ;
|
||||
```
|
||||
|
||||
### Array expressions
|
||||
|
||||
```antlr
|
||||
array_expr : '[' "mut" ? array_elems? ']' ;
|
||||
|
||||
array_elems : [expr [',' expr]*] | [expr ';' expr] ;
|
||||
```
|
||||
|
||||
### Index expressions
|
||||
|
||||
```antlr
|
||||
idx_expr : expr '[' expr ']' ;
|
||||
```
|
||||
|
||||
### Range expressions
|
||||
|
||||
```antlr
|
||||
range_expr : expr ".." expr |
|
||||
expr ".." |
|
||||
".." expr |
|
||||
".." ;
|
||||
```
|
||||
|
||||
### Unary operator expressions
|
||||
|
||||
```antlr
|
||||
unop_expr : unop expr ;
|
||||
unop : '-' | '*' | '!' ;
|
||||
```
|
||||
|
||||
### Binary operator expressions
|
||||
|
||||
```antlr
|
||||
binop_expr : expr binop expr | type_cast_expr
|
||||
| assignment_expr | compound_assignment_expr ;
|
||||
binop : arith_op | bitwise_op | lazy_bool_op | comp_op
|
||||
```
|
||||
|
||||
#### Arithmetic operators
|
||||
|
||||
```antlr
|
||||
arith_op : '+' | '-' | '*' | '/' | '%' ;
|
||||
```
|
||||
|
||||
#### Bitwise operators
|
||||
|
||||
```antlr
|
||||
bitwise_op : '&' | '|' | '^' | "<<" | ">>" ;
|
||||
```
|
||||
|
||||
#### Lazy boolean operators
|
||||
|
||||
```antlr
|
||||
lazy_bool_op : "&&" | "||" ;
|
||||
```
|
||||
|
||||
#### Comparison operators
|
||||
|
||||
```antlr
|
||||
comp_op : "==" | "!=" | '<' | '>' | "<=" | ">=" ;
|
||||
```
|
||||
|
||||
#### Type cast expressions
|
||||
|
||||
```antlr
|
||||
type_cast_expr : value "as" type ;
|
||||
```
|
||||
|
||||
#### Assignment expressions
|
||||
|
||||
```antlr
|
||||
assignment_expr : expr '=' expr ;
|
||||
```
|
||||
|
||||
#### Compound assignment expressions
|
||||
|
||||
```antlr
|
||||
compound_assignment_expr : expr [ arith_op | bitwise_op ] '=' expr ;
|
||||
```
|
||||
|
||||
### Grouped expressions
|
||||
|
||||
```antlr
|
||||
paren_expr : '(' expr ')' ;
|
||||
```
|
||||
|
||||
### Call expressions
|
||||
|
||||
```antlr
|
||||
expr_list : [ expr [ ',' expr ]* ] ? ;
|
||||
paren_expr_list : '(' expr_list ')' ;
|
||||
call_expr : expr paren_expr_list ;
|
||||
```
|
||||
|
||||
### Lambda expressions
|
||||
|
||||
```antlr
|
||||
ident_list : [ ident [ ',' ident ]* ] ? ;
|
||||
lambda_expr : '|' ident_list '|' expr ;
|
||||
```
|
||||
|
||||
### While loops
|
||||
|
||||
```antlr
|
||||
while_expr : [ lifetime ':' ] ? "while" no_struct_literal_expr '{' block '}' ;
|
||||
```
|
||||
|
||||
### Infinite loops
|
||||
|
||||
```antlr
|
||||
loop_expr : [ lifetime ':' ] ? "loop" '{' block '}';
|
||||
```
|
||||
|
||||
### Break expressions
|
||||
|
||||
```antlr
|
||||
break_expr : "break" [ lifetime ] ?;
|
||||
```
|
||||
|
||||
### Continue expressions
|
||||
|
||||
```antlr
|
||||
continue_expr : "continue" [ lifetime ] ?;
|
||||
```
|
||||
|
||||
### For expressions
|
||||
|
||||
```antlr
|
||||
for_expr : [ lifetime ':' ] ? "for" pat "in" no_struct_literal_expr '{' block '}' ;
|
||||
```
|
||||
|
||||
### If expressions
|
||||
|
||||
```antlr
|
||||
if_expr : "if" no_struct_literal_expr '{' block '}'
|
||||
else_tail ? ;
|
||||
|
||||
else_tail : "else" [ if_expr | if_let_expr
|
||||
| '{' block '}' ] ;
|
||||
```
|
||||
|
||||
### Match expressions
|
||||
|
||||
```antlr
|
||||
match_expr : "match" no_struct_literal_expr '{' match_arm * '}' ;
|
||||
|
||||
match_arm : attribute * match_pat "=>" [ expr "," | '{' block '}' ] ;
|
||||
|
||||
match_pat : pat [ '|' pat ] * [ "if" expr ] ? ;
|
||||
```
|
||||
|
||||
### If let expressions
|
||||
|
||||
```antlr
|
||||
if_let_expr : "if" "let" pat '=' expr '{' block '}'
|
||||
else_tail ? ;
|
||||
```
|
||||
|
||||
### While let loops
|
||||
|
||||
```antlr
|
||||
while_let_expr : [ lifetime ':' ] ? "while" "let" pat '=' expr '{' block '}' ;
|
||||
```
|
||||
|
||||
### Return expressions
|
||||
|
||||
```antlr
|
||||
return_expr : "return" expr ? ;
|
||||
```
|
||||
|
||||
# Type system
|
||||
|
||||
**FIXME:** is this entire chapter relevant here? Or should it all have been covered by some production already?
|
||||
|
||||
## Types
|
||||
|
||||
### Primitive types
|
||||
|
||||
**FIXME:** grammar?
|
||||
|
||||
#### Machine types
|
||||
|
||||
**FIXME:** grammar?
|
||||
|
||||
#### Machine-dependent integer types
|
||||
|
||||
**FIXME:** grammar?
|
||||
|
||||
### Textual types
|
||||
|
||||
**FIXME:** grammar?
|
||||
|
||||
### Tuple types
|
||||
|
||||
**FIXME:** grammar?
|
||||
|
||||
### Array, and Slice types
|
||||
|
||||
**FIXME:** grammar?
|
||||
|
||||
### Structure types
|
||||
|
||||
**FIXME:** grammar?
|
||||
|
||||
### Enumerated types
|
||||
|
||||
**FIXME:** grammar?
|
||||
|
||||
### Pointer types
|
||||
|
||||
**FIXME:** grammar?
|
||||
|
||||
### Function types
|
||||
|
||||
**FIXME:** grammar?
|
||||
|
||||
### Closure types
|
||||
|
||||
```antlr
|
||||
closure_type := [ 'unsafe' ] [ '<' lifetime-list '>' ] '|' arg-list '|'
|
||||
[ ':' bound-list ] [ '->' type ]
|
||||
lifetime-list := lifetime | lifetime ',' lifetime-list
|
||||
arg-list := ident ':' type | ident ':' type ',' arg-list
|
||||
```
|
||||
|
||||
### Never type
|
||||
An empty type
|
||||
|
||||
```antlr
|
||||
never_type : "!" ;
|
||||
```
|
||||
|
||||
### Object types
|
||||
|
||||
**FIXME:** grammar?
|
||||
|
||||
### Type parameters
|
||||
|
||||
**FIXME:** grammar?
|
||||
|
||||
### Type parameter bounds
|
||||
|
||||
```antlr
|
||||
bound-list := bound | bound '+' bound-list '+' ?
|
||||
bound := ty_bound | lt_bound
|
||||
lt_bound := lifetime
|
||||
ty_bound := ty_bound_noparen | (ty_bound_noparen)
|
||||
ty_bound_noparen := [?] [ for<lt_param_defs> ] simple_path
|
||||
```

### Self types

**FIXME:** grammar?

## Type kinds

**FIXME:** this is probably not relevant to the grammar...

# Memory and concurrency models

**FIXME:** is this entire chapter relevant here? Or should it all have been covered by some production already?

## Memory model

### Memory allocation and lifetime

### Memory ownership

### Variables

### Boxes

## Threads

### Communication between threads

### Thread lifecycle

[reference]: https://doc.rust-lang.org/reference/
[grammar working group]: https://github.com/rust-lang/wg-grammar

@ -1,20 +1,23 @@

# Atomics

Rust pretty blatantly just inherits C11's memory model for atomics. This is not
Rust pretty blatantly just inherits the memory model for atomics from C++20. This is not
due to this model being particularly excellent or easy to understand. Indeed,
this model is quite complex and known to have [several flaws][C11-busted].
Rather, it is a pragmatic concession to the fact that *everyone* is pretty bad
at modeling atomics. At the very least, we can benefit from existing tooling and
research around C.
research around the C/C++ memory model.
(You'll often see this model referred to as "C/C++11" or just "C11". C just copies
the C++ memory model, and C++11 was the first version of the model, though it has
received some bugfixes since then.)

Trying to fully explain the model in this book is fairly hopeless. It's defined
in terms of madness-inducing causality graphs that require a full book to
properly understand in a practical way. If you want all the nitty-gritty
details, you should check out [C's specification (Section 7.17)][C11-model].
details, you should check out the [C++20 draft specification (Section 31)][C++-model].
Still, we'll try to cover the basics and some of the problems Rust developers
face.

The C11 memory model is fundamentally about trying to bridge the gap between the
The C++ memory model is fundamentally about trying to bridge the gap between the
semantics we want, the optimizations compilers want, and the inconsistent chaos
our hardware wants. *We* would like to just write programs and have them do
exactly what we said but, you know, fast. Wouldn't that be great?

@ -113,7 +116,7 @@ programming:

# Data Accesses

The C11 memory model attempts to bridge the gap by allowing us to talk about the
The C++ memory model attempts to bridge the gap by allowing us to talk about the
*causality* of our program. Generally, this is by establishing a *happens
before* relationship between parts of the program and the threads that are
running them. This gives the hardware and compiler room to optimize the program

@ -148,7 +151,7 @@ propagated to other threads. The set of orderings Rust exposes are:
* Acquire
* Relaxed

(Note: We explicitly do not expose the C11 *consume* ordering)
(Note: We explicitly do not expose the C++ *consume* ordering)
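
By way of illustration (a minimal sketch, not taken from the chapter itself; the values and thread structure are invented), a Release store paired with an Acquire load is how one thread publishes its writes to another:

```rust
use std::sync::atomic::{AtomicBool, AtomicUsize, Ordering};
use std::sync::Arc;
use std::thread;

fn main() {
    let data = Arc::new(AtomicUsize::new(0));
    let ready = Arc::new(AtomicBool::new(false));

    let producer = {
        let data = Arc::clone(&data);
        let ready = Arc::clone(&ready);
        thread::spawn(move || {
            data.store(42, Ordering::Relaxed);
            // The Release store publishes every write sequenced before it.
            ready.store(true, Ordering::Release);
        })
    };

    // Once the Acquire load observes `true`, it synchronizes-with the
    // Release store, so the store of 42 is guaranteed to be visible.
    while !ready.load(Ordering::Acquire) {
        thread::yield_now();
    }
    assert_eq!(data.load(Ordering::Relaxed), 42);

    producer.join().unwrap();
}
```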

TODO: negative reasoning vs positive reasoning? TODO: "can't forget to
synchronize"

@ -252,4 +255,4 @@ relaxed operations can be cheaper on weakly-ordered platforms.

[C11-busted]: http://plv.mpi-sws.org/c11comp/popl15.pdf
[C11-model]: http://www.open-std.org/jtc1/sc22/wg14/www/standards.html#9899
[C++-model]: http://eel.is/c++draft/atomics.order

@ -16,10 +16,14 @@ fn index(idx: usize, arr: &[u8]) -> Option<u8> {
}
```

This function is safe and correct. We check that the index is in bounds, and if it
is, index into the array in an unchecked manner. But even in such a trivial
function, the scope of the unsafe block is questionable. Consider changing the
`<` to a `<=`:
This function is safe and correct. We check that the index is in bounds, and if
it is, index into the array in an unchecked manner. We say that such a correct
unsafely implemented function is *sound*, meaning that safe code cannot cause
Undefined Behavior through it (which, remember, is the single fundamental
property of Safe Rust).
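
The hunks above elide most of the function body, so as a point of reference here is a sketch of the kind of checked-then-unchecked indexing function the text is describing (reconstructed for illustration, not quoted from the file):

```rust
fn index(idx: usize, arr: &[u8]) -> Option<u8> {
    if idx < arr.len() {
        unsafe {
            // Sound only because the `idx < arr.len()` check above has
            // already established that the access is in bounds.
            Some(*arr.get_unchecked(idx))
        }
    } else {
        None
    }
}

fn main() {
    let arr = [10u8, 20, 30];
    assert_eq!(index(1, &arr), Some(20));
    assert_eq!(index(9, &arr), None);
}
```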

But even in such a trivial function, the scope of the unsafe block is
questionable. Consider changing the `<` to a `<=`:

```rust
fn index(idx: usize, arr: &[u8]) -> Option<u8> {
@ -33,10 +37,10 @@ fn index(idx: usize, arr: &[u8]) -> Option<u8> {
}
```

This program is now unsound, and yet *we only modified safe code*. This is the
fundamental problem of safety: it's non-local. The soundness of our unsafe
operations necessarily depends on the state established by otherwise
"safe" operations.
This program is now *unsound*: Safe Rust can cause Undefined Behavior, and yet
*we only modified safe code*. This is the fundamental problem of safety: it's
non-local. The soundness of our unsafe operations necessarily depends on the
state established by otherwise "safe" operations.

Safety is modular in the sense that opting into unsafety doesn't require you
to consider arbitrary other kinds of badness. For instance, doing an unchecked

@ -1,7 +1,8 @@
language: rust
language: shell

rust:
- nightly
before_install:
- curl -sSL https://sh.rustup.rs | sh -s -- -y --default-toolchain=nightly --profile=minimal -c rust-docs
- export PATH="$HOME/.cargo/bin:$PATH"

install:
- travis_retry curl -Lf https://github.com/rust-lang-nursery/mdBook/releases/download/v0.3.1/mdbook-v0.3.1-x86_64-unknown-linux-gnu.tar.gz | tar -xz --directory=$HOME/.cargo/bin