New upstream version 1.21.0+dfsg1

This commit is contained in:
Ximin Luo 2017-10-16 16:39:26 +02:00
parent 041b39d230
commit 3b2f29766f
2023 changed files with 114908 additions and 42293 deletions


@ -99,7 +99,7 @@ Before you can start building the compiler you need to configure the build for
your system. In most cases, that will just mean using the defaults provided
for Rust.
To change configuration, you must copy the file `config.toml.example`
to `config.toml` in the directory from which you will be running the build, and
change the settings provided.
@ -237,10 +237,13 @@ Some common invocations of `x.py` are:
## Pull Requests
Pull requests are the primary mechanism we use to change Rust. GitHub itself
has some [great documentation][pull-requests] on using the Pull Request feature.
We use the "fork and pull" model [described here][development-models], where
contributors push changes to their personal fork and create pull requests to
bring those changes into the source repository.
[pull-requests]: https://help.github.com/articles/about-pull-requests/
[development-models]: https://help.github.com/articles/about-collaborative-development-models/
Please make pull requests against the `master` branch.
@ -289,7 +292,7 @@ been approved. The PR then enters the [merge queue][merge-queue], where @bors
will run all the tests on every platform we support. If it all works out,
@bors will merge your code into `master` and close the pull request.
[merge-queue]: https://buildbot2.rust-lang.org/homu/queue/rust
Speaking of tests, Rust has a comprehensive test suite. More information about
it can be found
@ -412,4 +415,4 @@ are:
[tlgba]: http://tomlee.co/2014/04/a-more-detailed-tour-of-the-rust-compiler/
[ro]: http://www.rustaceans.org/
[rctd]: ./src/test/COMPILER_TESTS.md
[cheatsheet]: https://buildbot2.rust-lang.org/homu/


@ -6,16 +6,17 @@ terms.
Longer version:
Copyrights in the Rust project are retained by their contributors. No
copyright assignment is required to contribute to the Rust project.
Some files include explicit copyright notices and/or license notices.
For full authorship information, see AUTHORS.txt and the version control
history.
Except as otherwise noted (below and/or in individual files), Rust is
licensed under the Apache License, Version 2.0 <LICENSE-APACHE> or
<http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
<LICENSE-MIT> or <http://opensource.org/licenses/MIT>, at your option.
The Rust Project includes packages written by third parties.
@ -282,25 +283,3 @@ their own copyright notices and license terms:
NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE
USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY
OF SUCH DAMAGE.
* Additional copyright may be retained by contributors other
than Mozilla, the Rust Project Developers, or the parties
enumerated in this file. Such copyright can be determined
on a case-by-case basis by examining the author of each
portion of a file in the revision-control commit records
of the project, or by consulting representative comments
claiming copyright ownership for a file.
For example, the text:
"Copyright (c) 2011 Google Inc."
appears in some files, and these files thereby denote
that their author and copyright-holder is Google Inc.
In all such cases, the absence of explicit licensing text
indicates that the contributor chose to license their work
for distribution under identical terms to those Mozilla
has chosen for the collective work, enumerated at the top
of this file. The only difference is the retention of
copyright itself, held by the contributor.


@ -1,5 +1,3 @@
Copyright (c) 2010 The Rust Project Developers
Permission is hereby granted, free of charge, to any
person obtaining a copy of this software and associated
documentation files (the "Software"), to deal in the


@ -39,7 +39,7 @@ Read ["Installation"] from [The Book].
```
> ***Note:*** Install locations can be adjusted by copying the config file
> from `./config.toml.example` to `./config.toml`, and
> adjusting the `prefix` option under `[install]`. Various other options, such
> as enabling debug information, are also supported, and are documented in
> the config file.
@ -135,7 +135,7 @@ Windows build triples are:
- `i686-pc-windows-msvc`
- `x86_64-pc-windows-msvc`
The build triple can be specified by either specifying `--build=<triple>` when
invoking `x.py` commands, or by copying the `config.toml` file (as described
in Building From Source), and modifying the `build` option under the `[build]`
section.


@ -1,3 +1,370 @@
Version 1.20.0 (2017-08-31)
===========================
Language
--------
- [Associated constants in traits are now stabilised.][42809]
- [A lot of macro bugs are now fixed.][42913]
Compiler
--------
- [Struct fields are now properly coerced to the expected field type.][42807]
- [Enabled wasm LLVM backend][42571] WASM can now be built with the
`wasm32-experimental-emscripten` target.
- [Changed some of the error messages to be more helpful.][42033]
- [Added support for RELRO (RELocation Read-Only) for platforms that support
it.][43170]
- [rustc now reports the total number of errors on compilation failure][43015];
previously this was only the number of errors in the pass that failed.
- [Expansion in rustc has been sped up 29x.][42533]
- [added `msp430-none-elf` target.][43099]
- [rustc will now suggest the one-argument enum variant to fix a type mismatch
when applicable][43178]
- [Fixes backtraces on Redox][43228]
- [rustc now identifies different versions of the same crate when absolute paths
of different types match in an error message.][42826]
Libraries
---------
- [Relaxed Debug constraints on `{HashMap,BTreeMap}::{Keys,Values}`.][42854]
- [Impl `PartialEq`, `Eq`, `PartialOrd`, `Ord`, `Debug`, `Hash` for unsized
tuples.][43011]
- [Impl `fmt::{Display, Debug}` for `Ref`, `RefMut`, `MutexGuard`,
`RwLockReadGuard`, `RwLockWriteGuard`][42822]
- [Impl `Clone` for `DefaultHasher`.][42799]
- [Impl `Sync` for `SyncSender`.][42397]
- [Impl `FromStr` for `char`][42271]
- [Fixed how `{f32, f64}::{is_sign_negative, is_sign_positive}` handles
NaN.][42431]
- [Allow messages in the `unimplemented!()` macro.][42155]
e.g. `unimplemented!("Waiting for 1.21 to be stable")`
- [`pub(restricted)` is now supported in the `thread_local!` macro.][43185]
- [Upgrade to Unicode 10.0.0][42999]
- [Reimplemented `{f32, f64}::{min, max}` in Rust instead of using CMath.][42430]
- [Skip the main thread's manual stack guard on Linux][43072]
- [`Iterator::nth` for `ops::{Range, RangeFrom}` is now done in O(1) time][43077]
- [`#[repr(align(N))]` attribute max number is now 2^31 - 1.][43097] This was
previously 2^15.
- [`{OsStr, Path}::Display` now avoids allocations where possible][42613]
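A few of these library changes are easy to exercise directly. The sketch below is illustrative (not from the release notes) and assumes a compiler recent enough that `#[repr(align(N))]` is stable, which happened after 1.20; it touches `FromStr` for `char`, `Clone` for `DefaultHasher`, and an alignment above the old 2^15 cap:

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};
use std::mem;

// An alignment above the old 2^15 cap: 65536 = 2^16.
#[allow(dead_code)]
#[repr(align(65536))]
struct Page([u8; 16]);

fn main() {
    // `FromStr` for `char`: a one-character string parses, anything else errors.
    let c: char = "x".parse().unwrap();
    assert_eq!(c, 'x');
    assert!("xy".parse::<char>().is_err());

    // `Clone` for `DefaultHasher`: fork a hasher mid-stream; both forks agree
    // when fed the same remaining input.
    let mut h1 = DefaultHasher::new();
    1u32.hash(&mut h1);
    let mut h2 = h1.clone();
    2u32.hash(&mut h1);
    2u32.hash(&mut h2);
    assert_eq!(h1.finish(), h2.finish());

    assert_eq!(mem::align_of::<Page>(), 65536);
}
```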
Stabilized APIs
---------------
- [`CStr::into_c_string`]
- [`CString::as_c_str`]
- [`CString::into_boxed_c_str`]
- [`Chain::get_mut`]
- [`Chain::get_ref`]
- [`Chain::into_inner`]
- [`Option::get_or_insert_with`]
- [`Option::get_or_insert`]
- [`OsStr::into_os_string`]
- [`OsString::into_boxed_os_str`]
- [`Take::get_mut`]
- [`Take::get_ref`]
- [`Utf8Error::error_len`]
- [`char::EscapeDebug`]
- [`char::escape_debug`]
- [`compile_error!`]
- [`f32::from_bits`]
- [`f32::to_bits`]
- [`f64::from_bits`]
- [`f64::to_bits`]
- [`mem::ManuallyDrop`]
- [`slice::sort_unstable_by_key`]
- [`slice::sort_unstable_by`]
- [`slice::sort_unstable`]
- [`str::from_boxed_utf8_unchecked`]
- [`str::as_bytes_mut`]
- [`str::from_utf8_mut`]
- [`str::from_utf8_unchecked_mut`]
- [`str::get_mut`]
- [`str::get_unchecked_mut`]
- [`str::get_unchecked`]
- [`str::get`]
- [`str::into_boxed_bytes`]
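Several of the newly stabilized APIs can be shown together; a short illustrative sketch:

```rust
fn main() {
    // `f32::to_bits` / `f32::from_bits`: lossless round-trip through the raw
    // IEEE-754 representation.
    let bits = 1.5f32.to_bits();
    assert_eq!(f32::from_bits(bits), 1.5);

    // `slice::sort_unstable`: in-place sort that is typically faster than
    // `sort` and allocates no auxiliary memory.
    let mut v = [3, 1, 2];
    v.sort_unstable();
    assert_eq!(v, [1, 2, 3]);

    // `Option::get_or_insert_with`: insert a computed default if `None`,
    // then hand back a mutable reference either way.
    let mut opt: Option<u32> = None;
    *opt.get_or_insert_with(|| 7) += 1;
    assert_eq!(opt, Some(8));

    // `str::get`: non-panicking slicing; `None` off a char boundary.
    let s = "héllo";
    assert_eq!(s.get(0..1), Some("h"));
    assert_eq!(s.get(0..2), None); // byte 2 falls inside the 'é'
}
```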
Cargo
-----
- [Cargo API token location moved from `~/.cargo/config` to
`~/.cargo/credentials`.][cargo/3978]
- [Cargo will now build `main.rs` binaries that are in sub-directories of
`src/bin`.][cargo/4214] e.g. having `src/bin/server/main.rs` and
`src/bin/client/main.rs` generates `target/debug/server` and `target/debug/client`.
- [You can now specify the version of a binary to install through
`cargo install` using `--vers`.][cargo/4229]
- [Added `--no-fail-fast` flag to cargo to run all benchmarks regardless of
failure.][cargo/4248]
- [Changed the convention around which file is the crate root.][cargo/4259]
- [The `include`/`exclude` property in `Cargo.toml` now accepts gitignore paths
instead of glob patterns][cargo/4270]. Glob patterns are now deprecated.
Compatibility Notes
-------------------
- [Functions with `'static` in their return types will now not be as usable as
if they were using lifetime parameters instead.][42417]
- [The reimplementation of `{f32, f64}::is_sign_{negative, positive}` now
takes the sign of NaN into account where it previously didn't.][42430]
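The NaN change can be observed directly. A small illustrative sketch, building NaNs with explicit sign bits via the also-new `f32::from_bits` so the results are well defined:

```rust
fn main() {
    // Since 1.20, `is_sign_positive`/`is_sign_negative` report the actual
    // sign bit even for NaN, where they previously special-cased it.
    let pos_nan = f32::from_bits(0x7fc0_0000); // quiet NaN, sign bit clear
    let neg_nan = f32::from_bits(0xffc0_0000); // quiet NaN, sign bit set
    assert!(pos_nan.is_nan() && pos_nan.is_sign_positive());
    assert!(neg_nan.is_nan() && neg_nan.is_sign_negative());
}
```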
[42033]: https://github.com/rust-lang/rust/pull/42033
[42155]: https://github.com/rust-lang/rust/pull/42155
[42271]: https://github.com/rust-lang/rust/pull/42271
[42397]: https://github.com/rust-lang/rust/pull/42397
[42417]: https://github.com/rust-lang/rust/pull/42417
[42430]: https://github.com/rust-lang/rust/pull/42430
[42431]: https://github.com/rust-lang/rust/pull/42431
[42533]: https://github.com/rust-lang/rust/pull/42533
[42571]: https://github.com/rust-lang/rust/pull/42571
[42613]: https://github.com/rust-lang/rust/pull/42613
[42799]: https://github.com/rust-lang/rust/pull/42799
[42807]: https://github.com/rust-lang/rust/pull/42807
[42809]: https://github.com/rust-lang/rust/pull/42809
[42822]: https://github.com/rust-lang/rust/pull/42822
[42826]: https://github.com/rust-lang/rust/pull/42826
[42854]: https://github.com/rust-lang/rust/pull/42854
[42913]: https://github.com/rust-lang/rust/pull/42913
[42999]: https://github.com/rust-lang/rust/pull/42999
[43011]: https://github.com/rust-lang/rust/pull/43011
[43015]: https://github.com/rust-lang/rust/pull/43015
[43072]: https://github.com/rust-lang/rust/pull/43072
[43077]: https://github.com/rust-lang/rust/pull/43077
[43097]: https://github.com/rust-lang/rust/pull/43097
[43099]: https://github.com/rust-lang/rust/pull/43099
[43170]: https://github.com/rust-lang/rust/pull/43170
[43178]: https://github.com/rust-lang/rust/pull/43178
[43185]: https://github.com/rust-lang/rust/pull/43185
[43228]: https://github.com/rust-lang/rust/pull/43228
[cargo/3978]: https://github.com/rust-lang/cargo/pull/3978
[cargo/4214]: https://github.com/rust-lang/cargo/pull/4214
[cargo/4229]: https://github.com/rust-lang/cargo/pull/4229
[cargo/4248]: https://github.com/rust-lang/cargo/pull/4248
[cargo/4259]: https://github.com/rust-lang/cargo/pull/4259
[cargo/4270]: https://github.com/rust-lang/cargo/pull/4270
[`CStr::into_c_string`]: https://doc.rust-lang.org/std/ffi/struct.CStr.html#method.into_c_string
[`CString::as_c_str`]: https://doc.rust-lang.org/std/ffi/struct.CString.html#method.as_c_str
[`CString::into_boxed_c_str`]: https://doc.rust-lang.org/std/ffi/struct.CString.html#method.into_boxed_c_str
[`Chain::get_mut`]: https://doc.rust-lang.org/std/io/struct.Chain.html#method.get_mut
[`Chain::get_ref`]: https://doc.rust-lang.org/std/io/struct.Chain.html#method.get_ref
[`Chain::into_inner`]: https://doc.rust-lang.org/std/io/struct.Chain.html#method.into_inner
[`Option::get_or_insert_with`]: https://doc.rust-lang.org/std/option/enum.Option.html#method.get_or_insert_with
[`Option::get_or_insert`]: https://doc.rust-lang.org/std/option/enum.Option.html#method.get_or_insert
[`OsStr::into_os_string`]: https://doc.rust-lang.org/std/ffi/struct.OsStr.html#method.into_os_string
[`OsString::into_boxed_os_str`]: https://doc.rust-lang.org/std/ffi/struct.OsString.html#method.into_boxed_os_str
[`Take::get_mut`]: https://doc.rust-lang.org/std/io/struct.Take.html#method.get_mut
[`Take::get_ref`]: https://doc.rust-lang.org/std/io/struct.Take.html#method.get_ref
[`Utf8Error::error_len`]: https://doc.rust-lang.org/std/str/struct.Utf8Error.html#method.error_len
[`char::EscapeDebug`]: https://doc.rust-lang.org/std/char/struct.EscapeDebug.html
[`char::escape_debug`]: https://doc.rust-lang.org/std/primitive.char.html#method.escape_debug
[`compile_error!`]: https://doc.rust-lang.org/std/macro.compile_error.html
[`f32::from_bits`]: https://doc.rust-lang.org/std/primitive.f32.html#method.from_bits
[`f32::to_bits`]: https://doc.rust-lang.org/std/primitive.f32.html#method.to_bits
[`f64::from_bits`]: https://doc.rust-lang.org/std/primitive.f64.html#method.from_bits
[`f64::to_bits`]: https://doc.rust-lang.org/std/primitive.f64.html#method.to_bits
[`mem::ManuallyDrop`]: https://doc.rust-lang.org/std/mem/union.ManuallyDrop.html
[`slice::sort_unstable_by_key`]: https://doc.rust-lang.org/std/primitive.slice.html#method.sort_unstable_by_key
[`slice::sort_unstable_by`]: https://doc.rust-lang.org/std/primitive.slice.html#method.sort_unstable_by
[`slice::sort_unstable`]: https://doc.rust-lang.org/std/primitive.slice.html#method.sort_unstable
[`str::from_boxed_utf8_unchecked`]: https://doc.rust-lang.org/std/str/fn.from_boxed_utf8_unchecked.html
[`str::as_bytes_mut`]: https://doc.rust-lang.org/std/primitive.str.html#method.as_bytes_mut
[`str::from_utf8_mut`]: https://doc.rust-lang.org/std/str/fn.from_utf8_mut.html
[`str::from_utf8_unchecked_mut`]: https://doc.rust-lang.org/std/str/fn.from_utf8_unchecked_mut.html
[`str::get_mut`]: https://doc.rust-lang.org/std/primitive.str.html#method.get_mut
[`str::get_unchecked_mut`]: https://doc.rust-lang.org/std/primitive.str.html#method.get_unchecked_mut
[`str::get_unchecked`]: https://doc.rust-lang.org/std/primitive.str.html#method.get_unchecked
[`str::get`]: https://doc.rust-lang.org/std/primitive.str.html#method.get
[`str::into_boxed_bytes`]: https://doc.rust-lang.org/std/primitive.str.html#method.into_boxed_bytes
Version 1.19.0 (2017-07-20)
===========================
Language
--------
- [Numeric fields can now be used for creating tuple structs.][41145] [RFC 1506]
For example `struct Point(u32, u32); let x = Point { 0: 7, 1: 0 };`.
- [Macro recursion limit increased to 1024 from 64.][41676]
- [Added lint for detecting unused macros.][41907]
- [`loop` can now return a value with `break`.][42016] [RFC 1624]
For example: `let x = loop { break 7; };`
- [C compatible `union`s are now available.][42068] [RFC 1444] They can only
contain `Copy` types and cannot have a `Drop` implementation.
Example: `union Foo { bar: u8, baz: usize }`
- [Non-capturing closures can now be coerced into `fn`s,][42162] [RFC 1558]
Example: `let foo: fn(u8) -> u8 = |v: u8| { v };`
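The language additions above can be combined in one short illustrative sketch (assumes 1.19 or later):

```rust
// C-compatible unions (RFC 1444): fields are restricted to `Copy` types.
#[allow(dead_code)]
union Foo {
    bar: u8,
    baz: usize,
}

fn main() {
    // `loop` is now an expression: `break` carries a value out (RFC 1624).
    let x = loop { break 7; };
    assert_eq!(x, 7);

    // Non-capturing closures coerce to plain `fn` pointers (RFC 1558).
    let f: fn(u8) -> u8 = |v| v + 1;
    assert_eq!(f(1), 2);

    // Numeric field names in struct expressions build tuple structs (RFC 1506).
    struct Point(u32, u32);
    let p = Point { 0: 7, 1: 0 };
    assert_eq!((p.0, p.1), (7, 0));

    // Reading a union field is `unsafe`: the compiler cannot know which
    // field was last written.
    let u = Foo { bar: 1 };
    assert_eq!(unsafe { u.bar }, 1);
}
```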
Compiler
--------
- [Add support for bootstrapping the Rust compiler toolchain on Android.][41370]
- [Change `arm-linux-androideabi` to correspond to the `armeabi`
official ABI.][41656] If you wish to continue targeting the `armeabi-v7a` ABI
you should use `--target armv7-linux-androideabi`.
- [Fixed ICE when removing a source file between compilation sessions.][41873]
- [Minor optimisation of string operations.][42037]
- [Compiler error message is now `aborting due to previous error(s)` instead of
`aborting due to N previous errors`][42150] This was previously inaccurate and
would only count certain kinds of errors.
- [The compiler now supports Visual Studio 2017][42225]
- [The compiler is now built against LLVM 4.0.1 by default][42948]
- [Added a lot][42264] of [new error codes][42302]
- [Added the `target-feature=+crt-static` option,][37406] [RFC 1721] which
allows the C Run-time Library (CRT) to be statically linked.
- [Fixed various ARM codegen bugs][42740]
Libraries
---------
- [`String` now implements `FromIterator<Cow<'a, str>>` and
`Extend<Cow<'a, str>>`][41449]
- [`Vec` now implements `From<&mut [T]>`][41530]
- [`Box<[u8]>` now implements `From<Box<str>>`][41258]
- [`SplitWhitespace` now implements `Clone`][41659]
- [`[u8]::reverse` is now 5x faster and `[u16]::reverse` is now
1.5x faster][41764]
- [`eprint!` and `eprintln!` macros added to prelude.][41192] Same as the `print!`
macros, but for printing to stderr.
Stabilized APIs
---------------
- [`OsString::shrink_to_fit`]
- [`cmp::Reverse`]
- [`Command::envs`]
- [`thread::ThreadId`]
Cargo
-----
- [Build scripts can now add environment variables to the environment
the crate is being compiled in.
Example: `println!("cargo:rustc-env=FOO=bar");`][cargo/3929]
- [Subcommands now replace the current process rather than spawning a new
child process][cargo/3970]
- [Workspace members can now accept glob file patterns][cargo/3979]
- [Added `--all` flag to the `cargo bench` subcommand to run benchmarks of all
the members in a given workspace.][cargo/3988]
- [Updated `libssh2-sys` to 0.2.6][cargo/4008]
- [Target directory path is now in the cargo metadata][cargo/4022]
- [Cargo no longer checks out a local working directory for the
crates.io index][cargo/4026] This should provide smaller file size for the
registry, and improve cloning times, especially on Windows machines.
- [Added an `--exclude` option for excluding certain packages when using the
`--all` option][cargo/4031]
- [Cargo will now automatically retry when receiving a 5xx error
from crates.io][cargo/4032]
- [The `--features` option now accepts multiple comma or space
delimited values.][cargo/4084]
- [Added support for custom target specific runners][cargo/3954]
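The build-script environment feature above uses Cargo's line-oriented stdout protocol. A minimal sketch (the helper function is ours, for illustration; only the printed directive is Cargo's interface):

```rust
// A `build.rs` prints `cargo:rustc-env=KEY=value` lines on stdout; Cargo then
// compiles the crate with those variables set, readable via `env!("KEY")`.
fn rustc_env_directive(key: &str, value: &str) -> String {
    format!("cargo:rustc-env={}={}", key, value)
}

fn main() {
    // In a real `build.rs` this is the line Cargo parses:
    println!("{}", rustc_env_directive("FOO", "bar"));
}
```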
Misc
----
- [Added `rust-windbg.cmd`][39983] for loading rust `.natvis` files in the
Windows Debugger.
- [Rust will now release XZ compressed packages][rust-installer/57]
- [rustup will now prefer to download rust packages with
XZ compression][rustup/1100] over GZip packages.
- [Added the ability to escape `#` in rust documentation][41785] by adding
additional `#`s, e.g. `##` renders as `#`.
Compatibility Notes
-------------------
- [`MutexGuard<T>` may only be `Sync` if `T` is `Sync`.][41624]
- [`-Z` flags are now no longer allowed to be used on the stable
compiler.][41751] This has been a warning for a year previous to this.
- [As a result of the `-Z` flag change, the `cargo-check` plugin no
longer works][42844]. Users should migrate to the built-in `check`
command, which has been available since 1.16.
- [Ending a float literal with `._` is now a hard error.
Example: `42._`.][41946]
- [Any use of a private `extern crate` outside of its module is now a
hard error.][36886] This was previously a warning.
- [`use ::self::foo;` is now a hard error.][36888] `self` paths are always
relative while the `::` prefix makes a path absolute, but was ignored and the
path was relative regardless.
- [Floating point constants in match patterns are now a hard error][36890]
This was previously a warning.
- [Struct or enum constants that don't derive `PartialEq` & `Eq` used in
match patterns are now a hard error][36891] This was previously a warning.
- [Lifetimes named `'_` are no longer allowed.][36892] This was previously
a warning.
- [From the pound escape, lines consisting of multiple `#`s are
now visible][41785]
- [It is an error to reexport private enum variants][42460]. This is
known to break a number of crates that depend on an older version of
mustache.
- [On Windows, if `VCINSTALLDIR` is set incorrectly, `rustc` will try
to use it to find the linker, and the build will fail where it did
not previously][42607]
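The `MutexGuard` change is a compile-time tightening rather than a runtime one. A small illustrative sketch of the case that still compiles (with a non-`Sync` payload such as `Cell<i32>`, the same call would be rejected):

```rust
use std::sync::Mutex;

// A helper that only accepts values of `Sync` types.
fn assert_sync<T: Sync>(_: &T) {}

fn main() {
    let m = Mutex::new(0i32);
    let guard = m.lock().unwrap();
    // Compiles because `i32: Sync`, so the guard is `Sync` too.
    assert_sync(&guard);
    assert_eq!(*guard, 0);
}
```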
[36886]: https://github.com/rust-lang/rust/issues/36886
[36888]: https://github.com/rust-lang/rust/issues/36888
[36890]: https://github.com/rust-lang/rust/issues/36890
[36891]: https://github.com/rust-lang/rust/issues/36891
[36892]: https://github.com/rust-lang/rust/issues/36892
[37406]: https://github.com/rust-lang/rust/issues/37406
[39983]: https://github.com/rust-lang/rust/pull/39983
[41145]: https://github.com/rust-lang/rust/pull/41145
[41192]: https://github.com/rust-lang/rust/pull/41192
[41258]: https://github.com/rust-lang/rust/pull/41258
[41370]: https://github.com/rust-lang/rust/pull/41370
[41449]: https://github.com/rust-lang/rust/pull/41449
[41530]: https://github.com/rust-lang/rust/pull/41530
[41624]: https://github.com/rust-lang/rust/pull/41624
[41656]: https://github.com/rust-lang/rust/pull/41656
[41659]: https://github.com/rust-lang/rust/pull/41659
[41676]: https://github.com/rust-lang/rust/pull/41676
[41751]: https://github.com/rust-lang/rust/pull/41751
[41764]: https://github.com/rust-lang/rust/pull/41764
[41785]: https://github.com/rust-lang/rust/pull/41785
[41873]: https://github.com/rust-lang/rust/pull/41873
[41907]: https://github.com/rust-lang/rust/pull/41907
[41946]: https://github.com/rust-lang/rust/pull/41946
[42016]: https://github.com/rust-lang/rust/pull/42016
[42037]: https://github.com/rust-lang/rust/pull/42037
[42068]: https://github.com/rust-lang/rust/pull/42068
[42150]: https://github.com/rust-lang/rust/pull/42150
[42162]: https://github.com/rust-lang/rust/pull/42162
[42225]: https://github.com/rust-lang/rust/pull/42225
[42264]: https://github.com/rust-lang/rust/pull/42264
[42302]: https://github.com/rust-lang/rust/pull/42302
[42460]: https://github.com/rust-lang/rust/issues/42460
[42607]: https://github.com/rust-lang/rust/issues/42607
[42740]: https://github.com/rust-lang/rust/pull/42740
[42844]: https://github.com/rust-lang/rust/issues/42844
[42948]: https://github.com/rust-lang/rust/pull/42948
[RFC 1444]: https://github.com/rust-lang/rfcs/pull/1444
[RFC 1506]: https://github.com/rust-lang/rfcs/pull/1506
[RFC 1558]: https://github.com/rust-lang/rfcs/pull/1558
[RFC 1624]: https://github.com/rust-lang/rfcs/pull/1624
[RFC 1721]: https://github.com/rust-lang/rfcs/pull/1721
[`Command::envs`]: https://doc.rust-lang.org/std/process/struct.Command.html#method.envs
[`OsString::shrink_to_fit`]: https://doc.rust-lang.org/std/ffi/struct.OsString.html#method.shrink_to_fit
[`cmp::Reverse`]: https://doc.rust-lang.org/std/cmp/struct.Reverse.html
[`thread::ThreadId`]: https://doc.rust-lang.org/std/thread/struct.ThreadId.html
[cargo/3929]: https://github.com/rust-lang/cargo/pull/3929
[cargo/3954]: https://github.com/rust-lang/cargo/pull/3954
[cargo/3970]: https://github.com/rust-lang/cargo/pull/3970
[cargo/3979]: https://github.com/rust-lang/cargo/pull/3979
[cargo/3988]: https://github.com/rust-lang/cargo/pull/3988
[cargo/4008]: https://github.com/rust-lang/cargo/pull/4008
[cargo/4022]: https://github.com/rust-lang/cargo/pull/4022
[cargo/4026]: https://github.com/rust-lang/cargo/pull/4026
[cargo/4031]: https://github.com/rust-lang/cargo/pull/4031
[cargo/4032]: https://github.com/rust-lang/cargo/pull/4032
[cargo/4084]: https://github.com/rust-lang/cargo/pull/4084
[rust-installer/57]: https://github.com/rust-lang/rust-installer/pull/57
[rustup/1100]: https://github.com/rust-lang-nursery/rustup.rs/pull/1100
Version 1.18.0 (2017-06-08)
===========================
@ -530,6 +897,9 @@ Compatibility Notes
* [Ctrl-Z returns from `Stdin.read()` when reading from the console on
Windows][38274]
* [Clean up semantics of `self` in an import list][38313]
* Reimplemented lifetime elision. This change was almost entirely compatible
with existing code, but it did close a number of small bugs and loopholes,
as well as being more accepting in some other [cases][41105].
[37057]: https://github.com/rust-lang/rust/pull/37057
[37761]: https://github.com/rust-lang/rust/pull/37761
@ -564,6 +934,7 @@ Compatibility Notes
[39048]: https://github.com/rust-lang/rust/pull/39048
[39282]: https://github.com/rust-lang/rust/pull/39282
[39379]: https://github.com/rust-lang/rust/pull/39379
[41105]: https://github.com/rust-lang/rust/issues/41105
[`<*const T>::wrapping_offset`]: https://doc.rust-lang.org/std/primitive.pointer.html#method.wrapping_offset
[`<*mut T>::wrapping_offset`]: https://doc.rust-lang.org/std/primitive.pointer.html#method.wrapping_offset
[`Duration::checked_add`]: https://doc.rust-lang.org/std/time/struct.Duration.html#method.checked_add

configure

@ -437,7 +437,6 @@ opt local-rust 0 "use an installed rustc rather than downloading a snapshot"
opt local-rebuild 0 "assume local-rust matches the current version, for rebuilds; implies local-rust, and is implied if local-rust already matches the current version"
opt llvm-static-stdcpp 0 "statically link to libstdc++ for LLVM"
opt llvm-link-shared 0 "prefer shared linking to LLVM (llvm-config --link-shared)"
opt llvm-clean-rebuild 0 "delete LLVM build directory on rebuild"
opt rpath 1 "build rpaths into rustc itself"
opt stage0-landing-pads 1 "enable landing pads during bootstrap with stage0"
# This is used by the automation to produce single-target nightlies
@ -490,6 +489,7 @@ valopt musl-root-armhf "" "arm-unknown-linux-musleabihf install directory"
valopt musl-root-armv7 "" "armv7-unknown-linux-musleabihf install directory"
valopt extra-filename "" "Additional data that is hashed and passed to the -C extra-filename flag"
valopt qemu-armhf-rootfs "" "rootfs in qemu testing, you probably don't want to use this"
valopt qemu-aarch64-rootfs "" "rootfs in qemu testing, you probably don't want to use this"
valopt experimental-targets "" "experimental LLVM targets to build"
if [ -e ${CFG_SRC_DIR}.git ] if [ -e ${CFG_SRC_DIR}.git ]
@ -560,8 +560,8 @@ case "$CFG_RELEASE_CHANNEL" in
*-pc-windows-gnu)
;;
*)
enable_if_not_disabled debuginfo-lines
enable_if_not_disabled debuginfo-only-std
;;
esac
@ -572,8 +572,8 @@ case "$CFG_RELEASE_CHANNEL" in
*-pc-windows-gnu)
;;
*)
enable_if_not_disabled debuginfo-lines
enable_if_not_disabled debuginfo-only-std
;;
esac
;;

src/Cargo.lock (generated): diff suppressed because it is too large.


@ -16,6 +16,27 @@ members = [
"tools/remote-test-server", "tools/remote-test-server",
"tools/rust-installer", "tools/rust-installer",
"tools/cargo", "tools/cargo",
"tools/rustdoc",
"tools/rls",
# FIXME(https://github.com/rust-lang/cargo/issues/4089): move these to exclude
"tools/rls/test_data/borrow_error",
"tools/rls/test_data/completion",
"tools/rls/test_data/find_all_refs",
"tools/rls/test_data/find_all_refs_no_cfg_test",
"tools/rls/test_data/goto_def",
"tools/rls/test_data/highlight",
"tools/rls/test_data/hover",
"tools/rls/test_data/rename",
"tools/rls/test_data/reformat",
"tools/rls/test_data/bin_lib_no_cfg_test",
"tools/rls/test_data/multiple_bins",
"tools/rls/test_data/bin_lib",
"tools/rls/test_data/reformat_with_range",
"tools/rls/test_data/find_impls",
"tools/rls/test_data/infer_bin",
"tools/rls/test_data/infer_custom_bin",
"tools/rls/test_data/infer_lib",
"tools/rls/test_data/omit_init_build",
]
# Curiously, compiletest will segfault if compiled with opt-level=3 on 64-bit


@ -33,8 +33,11 @@ build_helper = { path = "../build_helper" }
cmake = "0.1.23" cmake = "0.1.23"
filetime = "0.1" filetime = "0.1"
num_cpus = "1.0" num_cpus = "1.0"
toml = "0.1"
getopts = "0.2" getopts = "0.2"
rustc-serialize = "0.3"
gcc = "0.3.50" gcc = "0.3.50"
libc = "0.2" libc = "0.2"
serde = "1.0.8"
serde_derive = "1.0.8"
serde_json = "1.0.2"
toml = "0.4"
lazy_static = "0.2"


@ -73,16 +73,19 @@ The script accepts commands, flags, and arguments to determine what to do:
## Configuring rustbuild
There are currently two methods for configuring the rustbuild build system.
First, rustbuild offers a TOML-based configuration system with a `config.toml`
file in the same location as `config.mk`. An example of this configuration can
be found at `config.toml.example`, and the configuration file can also be passed
as `--config path/to/config.toml` if the build system is being invoked manually
(via the python script).
Next, the `./configure` options serialized in `config.mk` will be
parsed and read. That is, if any `./configure` options are passed, they'll be
handled naturally. `./configure` should almost never be used for local
installations, and is primarily useful for CI. Prefer to customize behavior
using `config.toml`.
Finally, rustbuild makes use of the [gcc-rs crate] which has [its own Finally, rustbuild makes use of the [gcc-rs crate] which has [its own
method][env-vars] of configuring C compilers and C flags via environment method][env-vars] of configuring C compilers and C flags via environment
@ -310,17 +313,18 @@ After that, each module in rustbuild should have enough documentation to keep
you up and running. Some general areas that you may be interested in modifying you up and running. Some general areas that you may be interested in modifying
are: are:
* Adding a new build tool? Take a look at `bootstrap/step.rs` for examples of * Adding a new build tool? Take a look at `bootstrap/tool.rs` for examples of
other tools. other tools.
* Adding a new compiler crate? Look no further! Adding crates can be done by * Adding a new compiler crate? Look no further! Adding crates can be done by
adding a new directory with `Cargo.toml` followed by configuring all adding a new directory with `Cargo.toml` followed by configuring all
`Cargo.toml` files accordingly. `Cargo.toml` files accordingly.
* Adding a new dependency from crates.io? We're still working on that, so hold * Adding a new dependency from crates.io? This should just work inside the
off on that for now. compiler artifacts stage (everything other than libtest and libstd).
* Adding a new configuration option? Take a look at `bootstrap/config.rs` or * Adding a new configuration option? You'll want to modify `bootstrap/flags.rs`
perhaps `bootstrap/flags.rs` and then modify the build elsewhere to read that for command line flags and then `bootstrap/config.rs` to copy the flags to the
option. `Config` struct.
* Adding a sanity check? Take a look at `bootstrap/sanity.rs`. * Adding a sanity check? Take a look at `bootstrap/sanity.rs`.
If you have any questions feel free to reach out on `#rust-internals` on IRC or If you have any questions feel free to reach out on `#rust-infra` on IRC or ask on
open an issue in the bug tracker! internals.rust-lang.org. When you encounter bugs, please file issues on the
rust-lang/rust issue tracker.


@@ -21,11 +21,10 @@ extern crate bootstrap;

use std::env;

use bootstrap::{Config, Build};

fn main() {
    let args = env::args().skip(1).collect::<Vec<_>>();
    let config = Config::parse(&args);
    Build::new(config).build();
}


@@ -185,7 +185,10 @@ fn main() {
    // Emit save-analysis info.
    if env::var("RUSTC_SAVE_ANALYSIS") == Ok("api".to_string()) {
        cmd.arg("-Zsave-analysis");
        cmd.env("RUST_SAVE_ANALYSIS_CONFIG",
                "{\"output_file\": null,\"full_docs\": false,\"pub_only\": true,\
                 \"distro_crate\": true,\"signatures\": false,\"borrow_data\": false}");
    }

    // Dealing with rpath here is a little special, so let's go into some

@@ -234,10 +237,14 @@ fn main() {
        }
    }

    if let Ok(s) = env::var("RUSTC_CRT_STATIC") {
        if s == "true" {
            cmd.arg("-C").arg("target-feature=+crt-static");
        }
        if s == "false" {
            cmd.arg("-C").arg("target-feature=-crt-static");
        }
    }

    // Force all crates compiled by this compiler to (a) be unstable and (b)
    // allow the `rustc_private` feature to link to other unstable crates
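The `RUSTC_CRT_STATIC` handling above is a three-way switch driven by an environment variable that the build system sets on the rustc wrapper. A sketch of the same decision table, written in Python purely for illustration (the real logic is the Rust code in the diff, and `crt_static_args` is a hypothetical helper name):

```python
def crt_static_args(env_value):
    """Mirror the RUSTC_CRT_STATIC handling added to the rustc wrapper:
    "true" forces +crt-static, "false" forces -crt-static, and anything
    else (including an unset variable, modeled here as None) adds no flags."""
    args = []
    if env_value == "true":
        args += ["-C", "target-feature=+crt-static"]
    if env_value == "false":
        args += ["-C", "target-feature=-crt-static"]
    return args

print(crt_static_args("true"))   # ['-C', 'target-feature=+crt-static']
print(crt_static_args(None))     # []
```

Note that an unrecognized value falls through silently, matching the wrapper's behavior of adding nothing rather than erroring out.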


@@ -37,12 +37,12 @@ def get(url, path, verbose=False):
    if os.path.exists(path):
        if verify(path, sha_path, False):
            if verbose:
                print("using already-downloaded file", path)
            return
        else:
            if verbose:
                print("ignoring already-downloaded file",
                      path, "due to failed verification")
            os.unlink(path)
    download(temp_path, url, True, verbose)
    if not verify(temp_path, sha_path, verbose):
@@ -59,12 +59,12 @@ def delete_if_present(path, verbose):
    """Remove the given file if present"""
    if os.path.isfile(path):
        if verbose:
            print("removing", path)
        os.unlink(path)


def download(path, url, probably_big, verbose):
    for _ in range(0, 4):
        try:
            _download(path, url, probably_big, verbose, True)
            return
@@ -96,7 +96,7 @@ def _download(path, url, probably_big, verbose, exception):
def verify(path, sha_path, verbose):
    """Check if the sha256 sum of the given path is valid"""
    if verbose:
        print("verifying", path)
    with open(path, "rb") as source:
        found = hashlib.sha256(source.read()).hexdigest()
    with open(sha_path, "r") as sha256sum:
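The `verify` helper above compares the computed digest of a downloaded tarball against the recorded `.sha256` file. A minimal standalone sketch of the same check (`verify_sha256` and the temporary file are illustrative, not part of the diff):

```python
import hashlib
import tempfile

def verify_sha256(data_path, expected_hex):
    """Return True when the file's SHA-256 digest matches expected_hex."""
    with open(data_path, "rb") as source:
        found = hashlib.sha256(source.read()).hexdigest()
    return found == expected_hex

# Write a small payload to a temporary file and round-trip the check.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b"hello")
    path = tmp.name

expected = hashlib.sha256(b"hello").hexdigest()
print(verify_sha256(path, expected))  # True
```

Reading the whole file into memory is fine here because the stage0 tarballs are modest in size; a chunked update loop would be the alternative for very large files.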
@@ -111,29 +111,30 @@ def verify(path, sha_path, verbose):
def unpack(tarball, dst, verbose=False, match=None):
    """Unpack the given tarball file"""
    print("extracting", tarball)
    fname = os.path.basename(tarball).replace(".tar.gz", "")
    with contextlib.closing(tarfile.open(tarball)) as tar:
        for member in tar.getnames():
            if "/" not in member:
                continue
            name = member.replace(fname + "/", "", 1)
            if match is not None and not name.startswith(match):
                continue
            name = name[len(match) + 1:]
            dst_path = os.path.join(dst, name)
            if verbose:
                print("  extracting", member)
            tar.extract(member, dst)
            src_path = os.path.join(dst, member)
            if os.path.isdir(src_path) and os.path.exists(dst_path):
                continue
            shutil.move(src_path, dst_path)
    shutil.rmtree(os.path.join(dst, fname))


def run(args, verbose=False, exception=False, **kwargs):
    """Run a child program in a new process"""
    if verbose:
        print("running: " + ' '.join(args))
    sys.stdout.flush()
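The path rewriting inside `unpack` is the subtle part: it strips the archive's top-level directory and, when `match` is given, the matched component as well, so `rust-std-<triple>/rust-std-<triple>/lib/...` lands directly under the stage0 root. A sketch of just that rewriting, with the member skipping folded in (`dest_name` is a hypothetical helper, and the guard on `match is None` is an addition for safety, since `unpack` is always called with a `match` in the real script):

```python
def dest_name(member, fname, match=None):
    """Mirror unpack()'s path rewriting: drop the archive's top-level
    directory, and when `match` is given, drop that prefix too.
    Returns None for members that would be skipped."""
    if "/" not in member:
        return None
    name = member.replace(fname + "/", "", 1)
    if match is not None:
        if not name.startswith(match):
            return None
        name = name[len(match) + 1:]
    return name

print(dest_name("rustc-beta/rustc/bin/rustc", "rustc-beta", match="rustc"))
# bin/rustc
```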
@@ -149,97 +150,118 @@ def run(args, verbose=False, exception=False, **kwargs):
def stage0_data(rust_root):
    """Build a dictionary from stage0.txt"""
    nightlies = os.path.join(rust_root, "src/stage0.txt")
    with open(nightlies, 'r') as nightlies:
        lines = [line.rstrip() for line in nightlies
                 if not line.startswith("#")]
        return dict([line.split(": ", 1) for line in lines if line])


def format_build_time(duration):
    """Return a nicer format for build time

    >>> format_build_time('300')
    '0:05:00'
    """
    return str(datetime.timedelta(seconds=int(duration)))
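The rewritten `stage0_data` collapses the old explicit loop into a comprehension plus `dict()`: comment lines are filtered out, then every remaining non-empty `key: value` line becomes one dict entry. The same parse, sketched against an in-memory string instead of `src/stage0.txt` (`parse_stage0` is an illustrative name):

```python
def parse_stage0(text):
    """Parse stage0.txt-style content ('key: value' lines, '#' comments)
    into a dict, mirroring the stage0_data() rewrite above."""
    lines = [line.rstrip() for line in text.splitlines(True)
             if not line.startswith("#")]
    # Empty lines are dropped by the `if line` filter before splitting.
    return dict([line.split(": ", 1) for line in lines if line])

sample = "# stage0 snapshot\ndate: 2017-07-19\nrustc: beta\ncargo: beta\n"
print(parse_stage0(sample))
```

Splitting with `": "` and `maxsplit=1` keeps any further colons inside the value intact.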
class RustBuild(object):
    """Provide all the methods required to build Rust"""
    def __init__(self):
        self.cargo_channel = ''
        self.date = ''
        self._download_url = 'https://static.rust-lang.org'
        self.rustc_channel = ''
        self.build = ''
        self.build_dir = os.path.join(os.getcwd(), "build")
        self.clean = False
        self.config_mk = ''
        self.config_toml = ''
        self.printed = False
        self.rust_root = os.path.abspath(os.path.join(__file__, '../../..'))
        self.use_locked_deps = ''
        self.use_vendored_sources = ''
        self.verbose = False
    def download_stage0(self):
        """Fetch the build system for Rust, written in Rust

        This method will build a cache directory, then it will fetch the
        tarball which has the stage0 compiler used to then bootstrap the Rust
        compiler itself.

        Each downloaded tarball is extracted, after that, the script
        will move all the content to the right place.
        """
        rustc_channel = self.rustc_channel
        cargo_channel = self.cargo_channel

        if self.rustc().startswith(self.bin_root()) and \
                (not os.path.exists(self.rustc()) or
                 self.program_out_of_date(self.rustc_stamp())):
            self.print_what_bootstrap_means()
            if os.path.exists(self.bin_root()):
                shutil.rmtree(self.bin_root())
            filename = "rust-std-{}-{}.tar.gz".format(
                rustc_channel, self.build)
            pattern = "rust-std-{}".format(self.build)
            self._download_stage0_helper(filename, pattern)

            filename = "rustc-{}-{}.tar.gz".format(rustc_channel, self.build)
            self._download_stage0_helper(filename, "rustc")
            self.fix_executable("{}/bin/rustc".format(self.bin_root()))
            self.fix_executable("{}/bin/rustdoc".format(self.bin_root()))
            with open(self.rustc_stamp(), 'w') as rust_stamp:
                rust_stamp.write(self.date)

            if "pc-windows-gnu" in self.build:
                filename = "rust-mingw-{}-{}.tar.gz".format(
                    rustc_channel, self.build)
                self._download_stage0_helper(filename, "rust-mingw")

        if self.cargo().startswith(self.bin_root()) and \
                (not os.path.exists(self.cargo()) or
                 self.program_out_of_date(self.cargo_stamp())):
            self.print_what_bootstrap_means()
            filename = "cargo-{}-{}.tar.gz".format(cargo_channel, self.build)
            self._download_stage0_helper(filename, "cargo")
            self.fix_executable("{}/bin/cargo".format(self.bin_root()))
            with open(self.cargo_stamp(), 'w') as cargo_stamp:
                cargo_stamp.write(self.date)

    def _download_stage0_helper(self, filename, pattern):
        cache_dst = os.path.join(self.build_dir, "cache")
        rustc_cache = os.path.join(cache_dst, self.date)
        if not os.path.exists(rustc_cache):
            os.makedirs(rustc_cache)

        url = "{}/dist/{}".format(self._download_url, self.date)
        tarball = os.path.join(rustc_cache, filename)
        if not os.path.exists(tarball):
            get("{}/{}".format(url, filename), tarball, verbose=self.verbose)
        unpack(tarball, self.bin_root(), match=pattern, verbose=self.verbose)
    @staticmethod
    def fix_executable(fname):
        """Modifies the interpreter section of 'fname' to fix the dynamic linker

        This method is only required on NixOS and uses the PatchELF utility to
        change the dynamic linker of ELF executables.

        Please see https://nixos.org/patchelf.html for more information
        """
        default_encoding = sys.getdefaultencoding()
        try:
            ostype = subprocess.check_output(
                ['uname', '-s']).strip().decode(default_encoding)
        except subprocess.CalledProcessError:
            return
        except OSError as reason:
            if getattr(reason, 'winerror', None) is not None:
                return
            raise reason

        if ostype != "Linux":
            return
@@ -257,8 +279,8 @@ class RustBuild(object):
            interpreter = subprocess.check_output(
                ["patchelf", "--print-interpreter", fname])
            interpreter = interpreter.strip().decode(default_encoding)
        except subprocess.CalledProcessError as reason:
            print("warning: failed to call patchelf:", reason)
            return

        loader = interpreter.split("/")[-1]

@@ -267,8 +289,8 @@ class RustBuild(object):
            ldd_output = subprocess.check_output(
                ['ldd', '/run/current-system/sw/bin/sh'])
            ldd_output = ldd_output.strip().decode(default_encoding)
        except subprocess.CalledProcessError as reason:
            print("warning: unable to call ldd:", reason)
            return

        for line in ldd_output.splitlines():

@@ -285,45 +307,66 @@ class RustBuild(object):
        try:
            subprocess.check_output(
                ["patchelf", "--set-interpreter", correct_interpreter, fname])
        except subprocess.CalledProcessError as reason:
            print("warning: failed to call patchelf:", reason)
            return
    def rustc_stamp(self):
        """Return the path for .rustc-stamp

        >>> rb = RustBuild()
        >>> rb.build_dir = "build"
        >>> rb.rustc_stamp() == os.path.join("build", "stage0", ".rustc-stamp")
        True
        """
        return os.path.join(self.bin_root(), '.rustc-stamp')

    def cargo_stamp(self):
        """Return the path for .cargo-stamp

        >>> rb = RustBuild()
        >>> rb.build_dir = "build"
        >>> rb.cargo_stamp() == os.path.join("build", "stage0", ".cargo-stamp")
        True
        """
        return os.path.join(self.bin_root(), '.cargo-stamp')

    def program_out_of_date(self, stamp_path):
        """Check if the given program stamp is out of date"""
        if not os.path.exists(stamp_path) or self.clean:
            return True
        with open(stamp_path, 'r') as stamp:
            return self.date != stamp.read()
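The refactor above merges the old `rustc_out_of_date`/`cargo_out_of_date` pair into one `program_out_of_date(stamp_path)`: a component is stale when no stamp file exists, a clean build was requested, or the stamp's recorded date no longer matches the stage0 date. A free-function sketch of the same check (names and paths here are illustrative):

```python
import os
import tempfile

def program_out_of_date(stamp_path, date, clean=False):
    """Stale when no stamp exists, a clean build was requested, or the
    stamp's recorded date differs from the expected stage0 date."""
    if not os.path.exists(stamp_path) or clean:
        return True
    with open(stamp_path, "r") as stamp:
        return date != stamp.read()

stamp = os.path.join(tempfile.mkdtemp(), ".rustc-stamp")
print(program_out_of_date(stamp, "2017-07-19"))  # True: no stamp yet
with open(stamp, "w") as stamp_file:
    stamp_file.write("2017-07-19")
print(program_out_of_date(stamp, "2017-07-19"))  # False: dates match
```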
    def bin_root(self):
        """Return the binary root directory

        >>> rb = RustBuild()
        >>> rb.build_dir = "build"
        >>> rb.bin_root() == os.path.join("build", "stage0")
        True

        When the 'build' property is given should be a nested directory:

        >>> rb.build = "devel"
        >>> rb.bin_root() == os.path.join("build", "devel", "stage0")
        True
        """
        return os.path.join(self.build_dir, self.build, "stage0")

    def get_toml(self, key):
        """Returns the value of the given key in config.toml, otherwise returns None

        >>> rb = RustBuild()
        >>> rb.config_toml = 'key1 = "value1"\\nkey2 = "value2"'
        >>> rb.get_toml("key2")
        'value2'

        If the key does not exists, the result is None:

        >>> rb.get_toml("key3") == None
        True
        """
        for line in self.config_toml.splitlines():
            match = re.match(r'^{}\s*=(.*)$'.format(key), line)
            if match is not None:

@@ -332,6 +375,18 @@ class RustBuild(object):
        return None
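`get_toml` deliberately avoids a real TOML parser at this stage of bootstrap (no third-party modules are available yet): it scans line by line with a regex and pulls the value out from between double quotes. A standalone sketch of that lookup, with the quote extraction inlined (the function name and sample config are illustrative):

```python
import re

def get_toml(config_toml, key):
    """Line-oriented lookup of `key = "value"` in a config.toml string,
    as in RustBuild.get_toml above; returns None when the key is absent."""
    for line in config_toml.splitlines():
        match = re.match(r'^{}\s*=(.*)$'.format(key), line)
        if match is not None:
            value = match.group(1)
            start = value.find('"')
            end = start + 1 + value[start + 1:].find('"')
            return value[start + 1:end]
    return None

toml_text = 'channel = "beta"\nbuild = "x86_64-unknown-linux-gnu"'
print(get_toml(toml_text, "build"))  # x86_64-unknown-linux-gnu
```

The anchored `^` keeps `build` from matching inside section bodies on lines like `cflags = ...`, though a key that is a prefix of another key still relies on the `\s*=` that must follow it.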
    def get_mk(self, key):
        """Returns the value of the given key in config.mk, otherwise returns None

        >>> rb = RustBuild()
        >>> rb.config_mk = 'key := value\\n'
        >>> rb.get_mk('key')
        'value'

        If the key does not exists, the result is None:

        >>> rb.get_mk('does_not_exists') == None
        True
        """
        for line in iter(self.config_mk.splitlines()):
            if line.startswith(key + ' '):
                var = line[line.find(':=') + 2:].strip()

@@ -340,36 +395,64 @@ class RustBuild(object):
        return None
    def cargo(self):
        """Return config path for cargo"""
        return self.program_config('cargo')

    def rustc(self):
        """Return config path for rustc"""
        return self.program_config('rustc')

    def program_config(self, program):
        """Return config path for the given program

        >>> rb = RustBuild()
        >>> rb.config_toml = 'rustc = "rustc"\\n'
        >>> rb.config_mk = 'CFG_LOCAL_RUST_ROOT := /tmp/rust\\n'
        >>> rb.program_config('rustc')
        'rustc'
        >>> cargo_path = rb.program_config('cargo')
        >>> cargo_path.rstrip(".exe") == os.path.join("/tmp/rust",
        ... "bin", "cargo")
        True
        >>> rb.config_toml = ''
        >>> rb.config_mk = ''
        >>> cargo_path = rb.program_config('cargo')
        >>> cargo_path.rstrip(".exe") == os.path.join(rb.bin_root(),
        ... "bin", "cargo")
        True
        """
        config = self.get_toml(program)
        if config:
            return config
        config = self.get_mk('CFG_LOCAL_RUST_ROOT')
        if config:
            return os.path.join(config, "bin", "{}{}".format(
                program, self.exe_suffix()))
        return os.path.join(self.bin_root(), "bin", "{}{}".format(
            program, self.exe_suffix()))

    @staticmethod
    def get_string(line):
        """Return the value between double quotes

        >>> RustBuild.get_string(' "devel" ')
        'devel'
        """
        start = line.find('"')
        if start == -1:
            return None
        end = start + 1 + line[start + 1:].find('"')
        return line[start + 1:end]

    @staticmethod
    def exe_suffix():
        """Return a suffix for executables"""
        if sys.platform == 'win32':
            return '.exe'
        return ''
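The new `program_config` collapses the duplicated `cargo`/`rustc` lookups into one three-level resolution: an explicit path in `config.toml`, then `CFG_LOCAL_RUST_ROOT` from `config.mk`, then the downloaded stage0 binary directory. A free-function sketch of just that ordering (the parameter names and fallback path are illustrative stand-ins for the instance state):

```python
import os
import sys

def program_config(program, config_toml_value=None, local_rust_root=None,
                   bin_root=os.path.join("build", "stage0")):
    """Resolution order used by RustBuild.program_config(): explicit
    config.toml path, then CFG_LOCAL_RUST_ROOT from config.mk, then the
    downloaded stage0 bin directory."""
    suffix = '.exe' if sys.platform == 'win32' else ''
    if config_toml_value:
        return config_toml_value
    if local_rust_root:
        return os.path.join(local_rust_root, "bin", program + suffix)
    return os.path.join(bin_root, "bin", program + suffix)

print(program_config("cargo", local_rust_root="/tmp/rust"))
```

An explicit `config.toml` value is returned verbatim, with no suffix handling, because the user is expected to have written a complete path.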
    def print_what_bootstrap_means(self):
        """Prints more information about the build system"""
        if hasattr(self, 'printed'):
            return
        self.printed = True

@@ -386,10 +469,19 @@ class RustBuild(object):
        print('  src/bootstrap/README.md before the download finishes')

    def bootstrap_binary(self):
        """Return the path of the bootstrap binary

        >>> rb = RustBuild()
        >>> rb.build_dir = "build"
        >>> rb.bootstrap_binary() == os.path.join("build", "bootstrap",
        ... "debug", "bootstrap")
        True
        """
        return os.path.join(self.build_dir, "bootstrap", "debug", "bootstrap")

    def build_bootstrap(self):
        """Build bootstrap"""
        self.print_what_bootstrap_means()
        build_dir = os.path.join(self.build_dir, "bootstrap")
        if self.clean and os.path.exists(build_dir):
            shutil.rmtree(build_dir)

@@ -409,7 +501,8 @@ class RustBuild(object):
        env["PATH"] = os.path.join(self.bin_root(), "bin") + \
            os.pathsep + env["PATH"]
        if not os.path.isfile(self.cargo()):
            raise Exception("no cargo executable found at `{}`".format(
                self.cargo()))
        args = [self.cargo(), "build", "--manifest-path",
                os.path.join(self.rust_root, "src/bootstrap/Cargo.toml")]
        if self.verbose:
@@ -423,6 +516,7 @@ class RustBuild(object):
        run(args, env=env, verbose=self.verbose)

    def build_triple(self):
        """Build triple as in LLVM"""
        default_encoding = sys.getdefaultencoding()
        config = self.get_toml('build')
        if config:

@@ -445,23 +539,26 @@ class RustBuild(object):
        # The goal here is to come up with the same triple as LLVM would,
        # at least for the subset of platforms we're willing to target.
        ostype_mapper = {
            'Bitrig': 'unknown-bitrig',
            'Darwin': 'apple-darwin',
            'DragonFly': 'unknown-dragonfly',
            'FreeBSD': 'unknown-freebsd',
            'Haiku': 'unknown-haiku',
            'NetBSD': 'unknown-netbsd',
            'OpenBSD': 'unknown-openbsd'
        }

        # Consider the direct transformation first and then the special cases
        if ostype in ostype_mapper:
            ostype = ostype_mapper[ostype]
        elif ostype == 'Linux':
            os_from_sp = subprocess.check_output(
                ['uname', '-o']).strip().decode(default_encoding)
            if os_from_sp == 'Android':
                ostype = 'linux-android'
            else:
                ostype = 'unknown-linux-gnu'
        elif ostype == 'SunOS':
            ostype = 'sun-solaris'
            # On Solaris, uname -m will return a machine classification instead

@@ -477,10 +574,6 @@ class RustBuild(object):
            if self.verbose:
                raise Exception(err)
            sys.exit(err)
        elif ostype.startswith('MINGW'):
            # msys' `uname` does not print gcc configuration, but prints msys
            # configuration. so we cannot believe `uname -m`:
@@ -499,13 +592,36 @@ class RustBuild(object):
            cputype = 'x86_64'
            ostype = 'pc-windows-gnu'
        else:
            err = "unknown OS type: {}".format(ostype)
            if self.verbose:
                raise ValueError(err)
            sys.exit(err)

        cputype_mapper = {
            'BePC': 'i686',
            'aarch64': 'aarch64',
            'amd64': 'x86_64',
            'arm64': 'aarch64',
            'i386': 'i686',
            'i486': 'i686',
            'i686': 'i686',
            'i786': 'i686',
            'powerpc': 'powerpc',
            'powerpc64': 'powerpc64',
            'powerpc64le': 'powerpc64le',
            'ppc': 'powerpc',
            'ppc64': 'powerpc64',
            'ppc64le': 'powerpc64le',
            's390x': 's390x',
            'x64': 'x86_64',
            'x86': 'i686',
            'x86-64': 'x86_64',
            'x86_64': 'x86_64'
        }

        # Consider the direct transformation first and then the special cases
        if cputype in cputype_mapper:
            cputype = cputype_mapper[cputype]
        elif cputype in {'xscale', 'arm'}:
            cputype = 'arm'
            if ostype == 'linux-android':
@@ -522,40 +638,26 @@ class RustBuild(object):
                ostype = 'linux-androideabi'
            else:
                ostype += 'eabihf'
        elif cputype == 'mips':
            if sys.byteorder == 'big':
                cputype = 'mips'
            elif sys.byteorder == 'little':
                cputype = 'mipsel'
            else:
                raise ValueError("unknown byteorder: {}".format(sys.byteorder))
        elif cputype == 'mips64':
            if sys.byteorder == 'big':
                cputype = 'mips64'
            elif sys.byteorder == 'little':
                cputype = 'mips64el'
            else:
                raise ValueError('unknown byteorder: {}'.format(sys.byteorder))
            # only the n64 ABI is supported, indicate it
            ostype += 'abi64'
        elif cputype == 'sparcv9':
            pass
        else:
            err = "unknown cpu type: {}".format(cputype)
            if self.verbose:
                raise ValueError(err)
            sys.exit(err)

@@ -563,6 +665,7 @@ class RustBuild(object):
        return "{}-{}".format(cputype, ostype)
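The refactor replaces the long `elif` chains with two lookup tables, keeping explicit branches only for the cases that need extra probing (Linux vs. Android, MINGW, and the byte-order dependent MIPS variants). A reduced sketch of the table-driven path, with abbreviated tables (`build_triple` here is a simplified illustrative function, not the method itself):

```python
def build_triple(ostype, cputype):
    """Map `uname -s` and `uname -m` output to a target triple via the
    lookup tables, as RustBuild.build_triple() now does. Only the
    table-driven cases are sketched; Linux/Android, MINGW, and the MIPS
    byte-order cases need the extra branches shown in the diff."""
    ostype_mapper = {
        'Darwin': 'apple-darwin',
        'FreeBSD': 'unknown-freebsd',
        'NetBSD': 'unknown-netbsd',
        'OpenBSD': 'unknown-openbsd',
    }
    cputype_mapper = {
        'amd64': 'x86_64',
        'arm64': 'aarch64',
        'i386': 'i686',
        'x86_64': 'x86_64',
    }
    return "{}-{}".format(cputype_mapper[cputype], ostype_mapper[ostype])

print(build_triple('Darwin', 'x86_64'))  # x86_64-apple-darwin
```

Tables make the common "direct transformation" cases data rather than control flow, which is why the diff checks `if ostype in ostype_mapper` before falling into the special-case branches.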
    def update_submodules(self):
        """Update submodules"""
        if (not os.path.exists(os.path.join(self.rust_root, ".git"))) or \
                self.get_toml('submodules') == "false" or \
                self.get_mk('CFG_DISABLE_MANAGE_SUBMODULES') == "1":

@@ -592,10 +695,16 @@ class RustBuild(object):
                 "clean", "-qdfx"],
                cwd=self.rust_root, verbose=self.verbose)

    def set_dev_environment(self):
        """Set download URL for development environment"""
        self._download_url = 'https://dev-static.rust-lang.org'


def bootstrap():
    """Configure, fetch, build and run the initial bootstrap"""
    parser = argparse.ArgumentParser(description='Build rust')
    parser.add_argument('--config')
    parser.add_argument('--build')
    parser.add_argument('--clean', action='store_true')
    parser.add_argument('-v', '--verbose', action='store_true')

@@ -603,107 +712,103 @@ def bootstrap():
    args, _ = parser.parse_known_args(args)
    # Configure initial bootstrap
    build = RustBuild()
    build.verbose = args.verbose
    build.clean = args.clean

    try:
        with open(args.config or 'config.toml') as config:
            build.config_toml = config.read()
    except:
        pass
    try:
        build.config_mk = open('config.mk').read()
    except:
        pass

    if '\nverbose = 2' in build.config_toml:
        build.verbose = 2
    elif '\nverbose = 1' in build.config_toml:
        build.verbose = 1

    build.use_vendored_sources = '\nvendor = true' in build.config_toml or \
        'CFG_ENABLE_VENDOR' in build.config_mk

    build.use_locked_deps = '\nlocked-deps = true' in build.config_toml or \
        'CFG_ENABLE_LOCKED_DEPS' in build.config_mk

    if 'SUDO_USER' in os.environ and not build.use_vendored_sources:
        if os.environ.get('USER') != os.environ['SUDO_USER']:
            build.use_vendored_sources = True
            print('info: looks like you are running this command under `sudo`')
            print('      and so in order to preserve your $HOME this will now')
            print('      use vendored sources by default. Note that if this')
            print('      does not work you should run a normal build first')
            print('      before running a command like `sudo make install`')

    if build.use_vendored_sources:
        if not os.path.exists('.cargo'):
            os.makedirs('.cargo')
        with open('.cargo/config', 'w') as cargo_config:
            cargo_config.write("""
[source.crates-io]
replace-with = 'vendored-sources'
registry = 'https://example.com'

[source.vendored-sources]
directory = '{}/src/vendor'
""".format(build.rust_root))
    else:
        if os.path.exists('.cargo'):
            shutil.rmtree('.cargo')

    data = stage0_data(build.rust_root)
    build.date = data['date']
    build.rustc_channel = data['rustc']
    build.cargo_channel = data['cargo']

    if 'dev' in data:
        build.set_dev_environment()

    build.update_submodules()

    # Fetch/build the bootstrap
    build.build = args.build or build.build_triple()
rb.download_stage0() build.download_stage0()
sys.stdout.flush() sys.stdout.flush()
rb.build_bootstrap() build.build_bootstrap()
sys.stdout.flush() sys.stdout.flush()
# Run the bootstrap # Run the bootstrap
args = [rb.bootstrap_binary()] args = [build.bootstrap_binary()]
args.extend(sys.argv[1:]) args.extend(sys.argv[1:])
env = os.environ.copy() env = os.environ.copy()
env["BUILD"] = rb.build env["BUILD"] = build.build
env["SRC"] = rb.rust_root env["SRC"] = build.rust_root
env["BOOTSTRAP_PARENT_ID"] = str(os.getpid()) env["BOOTSTRAP_PARENT_ID"] = str(os.getpid())
env["BOOTSTRAP_PYTHON"] = sys.executable env["BOOTSTRAP_PYTHON"] = sys.executable
run(args, env=env, verbose=rb.verbose) run(args, env=env, verbose=build.verbose)
def main(): def main():
"""Entry point for the bootstrap process"""
start_time = time() start_time = time()
help_triggered = ( help_triggered = (
'-h' in sys.argv) or ('--help' in sys.argv) or (len(sys.argv) == 1) '-h' in sys.argv) or ('--help' in sys.argv) or (len(sys.argv) == 1)
try: try:
bootstrap() bootstrap()
if not help_triggered: if not help_triggered:
print("Build completed successfully in %s" % print("Build completed successfully in {}".format(
format_build_time(time() - start_time)) format_build_time(time() - start_time)))
except (SystemExit, KeyboardInterrupt) as e: except (SystemExit, KeyboardInterrupt) as error:
if hasattr(e, 'code') and isinstance(e.code, int): if hasattr(error, 'code') and isinstance(error.code, int):
exit_code = e.code exit_code = error.code
else: else:
exit_code = 1 exit_code = 1
print(e) print(error)
if not help_triggered: if not help_triggered:
print("Build completed unsuccessfully in %s" % print("Build completed unsuccessfully in {}".format(
format_build_time(time() - start_time)) format_build_time(time() - start_time)))
sys.exit(exit_code) sys.exit(exit_code)
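The `'\nverbose = 2' in build.config_toml` checks above deliberately avoid a TOML parser: this script runs before any build dependencies exist, so it probes the raw config text with substrings. A standalone sketch of the same pattern (the function name is illustrative, not from bootstrap.py):

```python
def probe_config(config_toml, config_mk=""):
    """Dependency-free substring probing of config files."""
    # The leading "\n" in each needle anchors the match to the start of a
    # line, so a commented-out "# verbose = 2" is not picked up; the flip
    # side is that any extra whitespace defeats the match entirely.
    verbose = 0
    if '\nverbose = 2' in config_toml:
        verbose = 2
    elif '\nverbose = 1' in config_toml:
        verbose = 1
    use_vendored = ('\nvendor = true' in config_toml or
                    'CFG_ENABLE_VENDOR' in config_mk)
    return verbose, use_vendored
```

For example, `probe_config("[build]\nverbose = 2\nvendor = true\n")` reports verbosity 2 with vendored sources enabled, while a config that merely mentions `vendor = true` mid-line does not.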

src/bootstrap/bootstrap_test.py Normal file

@@ -0,0 +1,114 @@
# Copyright 2015-2016 The Rust Project Developers. See the COPYRIGHT
# file at the top-level directory of this distribution and at
# http://rust-lang.org/COPYRIGHT.
#
# Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
# http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
# <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
# option. This file may not be copied, modified, or distributed
# except according to those terms.
"""Bootstrap tests"""
import os
import doctest
import unittest
import tempfile
import hashlib
from shutil import rmtree
import bootstrap
class Stage0DataTestCase(unittest.TestCase):
"""Test Case for stage0_data"""
def setUp(self):
self.rust_root = tempfile.mkdtemp()
os.mkdir(os.path.join(self.rust_root, "src"))
with open(os.path.join(self.rust_root, "src",
"stage0.txt"), "w") as stage0:
stage0.write("#ignore\n\ndate: 2017-06-15\nrustc: beta\ncargo: beta")
def tearDown(self):
rmtree(self.rust_root)
def test_stage0_data(self):
"""Extract data from stage0.txt"""
expected = {"date": "2017-06-15", "rustc": "beta", "cargo": "beta"}
data = bootstrap.stage0_data(self.rust_root)
self.assertDictEqual(data, expected)
class VerifyTestCase(unittest.TestCase):
"""Test Case for verify"""
def setUp(self):
self.container = tempfile.mkdtemp()
self.src = os.path.join(self.container, "src.txt")
self.sums = os.path.join(self.container, "sums")
self.bad_src = os.path.join(self.container, "bad.txt")
content = "Hello world"
with open(self.src, "w") as src:
src.write(content)
with open(self.sums, "w") as sums:
sums.write(hashlib.sha256(content.encode("utf-8")).hexdigest())
with open(self.bad_src, "w") as bad:
bad.write("Hello!")
def tearDown(self):
rmtree(self.container)
def test_valid_file(self):
"""Check if the sha256 sum of the given file is valid"""
self.assertTrue(bootstrap.verify(self.src, self.sums, False))
def test_invalid_file(self):
"""Should verify that the file is invalid"""
self.assertFalse(bootstrap.verify(self.bad_src, self.sums, False))
class ProgramOutOfDate(unittest.TestCase):
"""Test if a program is out of date"""
def setUp(self):
self.container = tempfile.mkdtemp()
os.mkdir(os.path.join(self.container, "stage0"))
self.build = bootstrap.RustBuild()
self.build.date = "2017-06-15"
self.build.build_dir = self.container
self.rustc_stamp_path = os.path.join(self.container, "stage0",
".rustc-stamp")
def tearDown(self):
rmtree(self.container)
def test_stamp_path_does_not_exists(self):
"""Return True when the stamp file does not exists"""
if os.path.exists(self.rustc_stamp_path):
os.unlink(self.rustc_stamp_path)
self.assertTrue(self.build.program_out_of_date(self.rustc_stamp_path))
def test_dates_are_different(self):
"""Return True when the dates are different"""
with open(self.rustc_stamp_path, "w") as rustc_stamp:
rustc_stamp.write("2017-06-14")
self.assertTrue(self.build.program_out_of_date(self.rustc_stamp_path))
def test_same_dates(self):
"""Return False both dates match"""
with open(self.rustc_stamp_path, "w") as rustc_stamp:
rustc_stamp.write("2017-06-15")
self.assertFalse(self.build.program_out_of_date(self.rustc_stamp_path))
if __name__ == '__main__':
SUITE = unittest.TestSuite()
TEST_LOADER = unittest.TestLoader()
SUITE.addTest(doctest.DocTestSuite(bootstrap))
SUITE.addTests([
TEST_LOADER.loadTestsFromTestCase(Stage0DataTestCase),
TEST_LOADER.loadTestsFromTestCase(VerifyTestCase),
TEST_LOADER.loadTestsFromTestCase(ProgramOutOfDate)])
RUNNER = unittest.TextTestRunner(verbosity=2)
RUNNER.run(SUITE)
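The `VerifyTestCase` above pins down the checksum behaviour: hash the file, compare against the recorded sum. The core of such a helper can be sketched as follows (a hypothetical name and signature, not the exact `bootstrap.verify`, which also takes a sums-file path and a verbose flag):

```python
import hashlib

def sha256_matches(path, expected_hex):
    """Hash a file in chunks and compare against an expected hex digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as handle:
        # Read in 64 KiB chunks so large tarballs are not loaded into memory.
        for chunk in iter(lambda: handle.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected_hex
```

This mirrors the fixture in the tests: a file containing "Hello world" matches its own sha256 digest, and any other digest fails.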

src/bootstrap/builder.rs Normal file

@@ -0,0 +1,630 @@
// Copyright 2017 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
use std::fmt::Debug;
use std::hash::Hash;
use std::cell::RefCell;
use std::path::{Path, PathBuf};
use std::process::Command;
use std::fs;
use std::ops::Deref;
use std::any::Any;
use std::collections::BTreeSet;
use compile;
use install;
use dist;
use util::{exe, libdir, add_lib_path};
use {Build, Mode};
use cache::{INTERNER, Interned, Cache};
use check;
use flags::Subcommand;
use doc;
use tool;
use native;
pub use Compiler;
pub struct Builder<'a> {
pub build: &'a Build,
pub top_stage: u32,
pub kind: Kind,
cache: Cache,
stack: RefCell<Vec<Box<Any>>>,
}
impl<'a> Deref for Builder<'a> {
type Target = Build;
fn deref(&self) -> &Self::Target {
self.build
}
}
pub trait Step: 'static + Clone + Debug + PartialEq + Eq + Hash {
/// The output type of this step; used, for example, to return a
/// `PathBuf` when directories are created or to return a `Compiler` once
/// it's been assembled.
type Output: Clone;
const DEFAULT: bool = false;
/// Run this rule for all hosts without cross compiling.
const ONLY_HOSTS: bool = false;
/// Run this rule for all targets, but only with the native host.
const ONLY_BUILD_TARGETS: bool = false;
/// Only run this step with the build triple as host and target.
const ONLY_BUILD: bool = false;
/// Primary function to execute this rule. Can call `builder.ensure(...)`
/// with other steps to run those.
fn run(self, builder: &Builder) -> Self::Output;
/// When bootstrap is passed a set of paths, this controls whether this rule
/// will execute. However, it does not get called in a "default" context
/// when we are not passed any paths; in that case, make_run is called
/// directly.
fn should_run(run: ShouldRun) -> ShouldRun;
/// Build up a "root" rule, either as a default rule or from a path passed
/// to us.
///
/// When path is `None`, we are executing in a context where no paths were
/// passed. When `./x.py build` is run, for example, this rule could get
/// called if it is in the correct list below with a path of `None`.
fn make_run(_run: RunConfig) {
// It is reasonable to not have an implementation of make_run for rules
// who do not want to get called from the root context. This means that
// they are likely dependencies (e.g., sysroot creation) or similar, and
// as such calling them from ./x.py isn't logical.
unimplemented!()
}
}
pub struct RunConfig<'a> {
pub builder: &'a Builder<'a>,
pub host: Interned<String>,
pub target: Interned<String>,
pub path: Option<&'a Path>,
}
struct StepDescription {
default: bool,
only_hosts: bool,
only_build_targets: bool,
only_build: bool,
should_run: fn(ShouldRun) -> ShouldRun,
make_run: fn(RunConfig),
}
impl StepDescription {
fn from<S: Step>() -> StepDescription {
StepDescription {
default: S::DEFAULT,
only_hosts: S::ONLY_HOSTS,
only_build_targets: S::ONLY_BUILD_TARGETS,
only_build: S::ONLY_BUILD,
should_run: S::should_run,
make_run: S::make_run,
}
}
fn maybe_run(&self, builder: &Builder, path: Option<&Path>) {
let build = builder.build;
let hosts = if self.only_build_targets || self.only_build {
build.build_triple()
} else {
&build.hosts
};
// Determine the targets participating in this rule.
let targets = if self.only_hosts {
if build.config.run_host_only {
&[]
} else if self.only_build {
build.build_triple()
} else {
&build.hosts
}
} else {
&build.targets
};
for host in hosts {
for target in targets {
let run = RunConfig {
builder,
path,
host: *host,
target: *target,
};
(self.make_run)(run);
}
}
}
fn run(v: &[StepDescription], builder: &Builder, paths: &[PathBuf]) {
let should_runs = v.iter().map(|desc| {
(desc.should_run)(ShouldRun::new(builder))
}).collect::<Vec<_>>();
if paths.is_empty() {
for (desc, should_run) in v.iter().zip(should_runs) {
if desc.default && should_run.is_really_default {
desc.maybe_run(builder, None);
}
}
} else {
for path in paths {
let mut attempted_run = false;
for (desc, should_run) in v.iter().zip(&should_runs) {
if should_run.run(path) {
attempted_run = true;
desc.maybe_run(builder, Some(path));
}
}
if !attempted_run {
eprintln!("Warning: no rules matched {}.", path.display());
}
}
}
}
}
#[derive(Clone)]
pub struct ShouldRun<'a> {
pub builder: &'a Builder<'a>,
// use a BTreeSet to maintain sort order
paths: BTreeSet<PathBuf>,
// If this is a default rule, this is an additional constraint placed on
// its run. Generally something like compiler docs being enabled.
is_really_default: bool,
}
impl<'a> ShouldRun<'a> {
fn new(builder: &'a Builder) -> ShouldRun<'a> {
ShouldRun {
builder,
paths: BTreeSet::new(),
is_really_default: true, // by default no additional conditions
}
}
pub fn default_condition(mut self, cond: bool) -> Self {
self.is_really_default = cond;
self
}
pub fn krate(mut self, name: &str) -> Self {
for (_, krate_path) in self.builder.crates(name) {
self.paths.insert(PathBuf::from(krate_path));
}
self
}
pub fn path(mut self, path: &str) -> Self {
self.paths.insert(PathBuf::from(path));
self
}
// allows being more explicit about why should_run in Step returns the value passed to it
pub fn never(self) -> ShouldRun<'a> {
self
}
fn run(&self, path: &Path) -> bool {
self.paths.iter().any(|p| path.ends_with(p))
}
}
#[derive(Copy, Clone, PartialEq, Eq, Debug)]
pub enum Kind {
Build,
Test,
Bench,
Dist,
Doc,
Install,
}
impl<'a> Builder<'a> {
fn get_step_descriptions(kind: Kind) -> Vec<StepDescription> {
macro_rules! describe {
($($rule:ty),+ $(,)*) => {{
vec![$(StepDescription::from::<$rule>()),+]
}};
}
match kind {
Kind::Build => describe!(compile::Std, compile::Test, compile::Rustc,
compile::StartupObjects, tool::BuildManifest, tool::Rustbook, tool::ErrorIndex,
tool::UnstableBookGen, tool::Tidy, tool::Linkchecker, tool::CargoTest,
tool::Compiletest, tool::RemoteTestServer, tool::RemoteTestClient,
tool::RustInstaller, tool::Cargo, tool::Rls, tool::Rustdoc,
native::Llvm),
Kind::Test => describe!(check::Tidy, check::Bootstrap, check::DefaultCompiletest,
check::HostCompiletest, check::Crate, check::CrateLibrustc, check::Linkcheck,
check::Cargotest, check::Cargo, check::Rls, check::Docs, check::ErrorIndex,
check::Distcheck),
Kind::Bench => describe!(check::Crate, check::CrateLibrustc),
Kind::Doc => describe!(doc::UnstableBook, doc::UnstableBookGen, doc::TheBook,
doc::Standalone, doc::Std, doc::Test, doc::Rustc, doc::ErrorIndex, doc::Nomicon,
doc::Reference, doc::Rustdoc, doc::CargoBook),
Kind::Dist => describe!(dist::Docs, dist::Mingw, dist::Rustc, dist::DebuggerScripts,
dist::Std, dist::Analysis, dist::Src, dist::PlainSourceTarball, dist::Cargo,
dist::Rls, dist::Extended, dist::HashSign),
Kind::Install => describe!(install::Docs, install::Std, install::Cargo, install::Rls,
install::Analysis, install::Src, install::Rustc),
}
}
pub fn get_help(build: &Build, subcommand: &str) -> Option<String> {
let kind = match subcommand {
"build" => Kind::Build,
"doc" => Kind::Doc,
"test" => Kind::Test,
"bench" => Kind::Bench,
"dist" => Kind::Dist,
"install" => Kind::Install,
_ => return None,
};
let builder = Builder {
build,
top_stage: build.config.stage.unwrap_or(2),
kind,
cache: Cache::new(),
stack: RefCell::new(Vec::new()),
};
let builder = &builder;
let mut should_run = ShouldRun::new(builder);
for desc in Builder::get_step_descriptions(builder.kind) {
should_run = (desc.should_run)(should_run);
}
let mut help = String::from("Available paths:\n");
for path in should_run.paths {
help.push_str(format!(" ./x.py {} {}\n", subcommand, path.display()).as_str());
}
Some(help)
}
pub fn run(build: &Build) {
let (kind, paths) = match build.config.cmd {
Subcommand::Build { ref paths } => (Kind::Build, &paths[..]),
Subcommand::Doc { ref paths } => (Kind::Doc, &paths[..]),
Subcommand::Test { ref paths, .. } => (Kind::Test, &paths[..]),
Subcommand::Bench { ref paths, .. } => (Kind::Bench, &paths[..]),
Subcommand::Dist { ref paths } => (Kind::Dist, &paths[..]),
Subcommand::Install { ref paths } => (Kind::Install, &paths[..]),
Subcommand::Clean => panic!(),
};
let builder = Builder {
build,
top_stage: build.config.stage.unwrap_or(2),
kind,
cache: Cache::new(),
stack: RefCell::new(Vec::new()),
};
StepDescription::run(&Builder::get_step_descriptions(builder.kind), &builder, paths);
}
pub fn default_doc(&self, paths: Option<&[PathBuf]>) {
let paths = paths.unwrap_or(&[]);
StepDescription::run(&Builder::get_step_descriptions(Kind::Doc), self, paths);
}
/// Obtain a compiler at a given stage and for a given host. Explicitly does
/// not take `Compiler` since all `Compiler` instances are meant to be
/// obtained through this function, since it ensures that they are valid
/// (i.e., built and assembled).
pub fn compiler(&self, stage: u32, host: Interned<String>) -> Compiler {
self.ensure(compile::Assemble { target_compiler: Compiler { stage, host } })
}
pub fn sysroot(&self, compiler: Compiler) -> Interned<PathBuf> {
self.ensure(compile::Sysroot { compiler })
}
/// Returns the libdir where the standard library and other artifacts are
/// found for a compiler's sysroot.
pub fn sysroot_libdir(
&self, compiler: Compiler, target: Interned<String>
) -> Interned<PathBuf> {
#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
struct Libdir {
compiler: Compiler,
target: Interned<String>,
}
impl Step for Libdir {
type Output = Interned<PathBuf>;
fn should_run(run: ShouldRun) -> ShouldRun {
run.never()
}
fn run(self, builder: &Builder) -> Interned<PathBuf> {
let compiler = self.compiler;
let lib = if compiler.stage >= 2 && builder.build.config.libdir_relative.is_some() {
builder.build.config.libdir_relative.clone().unwrap()
} else {
PathBuf::from("lib")
};
let sysroot = builder.sysroot(self.compiler).join(lib)
.join("rustlib").join(self.target).join("lib");
let _ = fs::remove_dir_all(&sysroot);
t!(fs::create_dir_all(&sysroot));
INTERNER.intern_path(sysroot)
}
}
self.ensure(Libdir { compiler, target })
}
/// Returns the compiler's libdir where it stores the dynamic libraries that
/// it itself links against.
///
/// For example this returns `<sysroot>/lib` on Unix and `<sysroot>/bin` on
/// Windows.
pub fn rustc_libdir(&self, compiler: Compiler) -> PathBuf {
if compiler.is_snapshot(self) {
self.build.rustc_snapshot_libdir()
} else {
self.sysroot(compiler).join(libdir(&compiler.host))
}
}
/// Adds the compiler's directory of dynamic libraries to `cmd`'s dynamic
/// library lookup path.
pub fn add_rustc_lib_path(&self, compiler: Compiler, cmd: &mut Command) {
// Windows doesn't need dylib path munging because the dlls for the
// compiler live next to the compiler and the system will find them
// automatically.
if cfg!(windows) {
return
}
add_lib_path(vec![self.rustc_libdir(compiler)], cmd);
}
/// Get a path to the compiler specified.
pub fn rustc(&self, compiler: Compiler) -> PathBuf {
if compiler.is_snapshot(self) {
self.initial_rustc.clone()
} else {
self.sysroot(compiler).join("bin").join(exe("rustc", &compiler.host))
}
}
pub fn rustdoc(&self, host: Interned<String>) -> PathBuf {
self.ensure(tool::Rustdoc { host })
}
pub fn rustdoc_cmd(&self, host: Interned<String>) -> Command {
let mut cmd = Command::new(&self.out.join("bootstrap/debug/rustdoc"));
let compiler = self.compiler(self.top_stage, host);
cmd
.env("RUSTC_STAGE", compiler.stage.to_string())
.env("RUSTC_SYSROOT", self.sysroot(compiler))
.env("RUSTC_LIBDIR", self.sysroot_libdir(compiler, self.build.build))
.env("CFG_RELEASE_CHANNEL", &self.build.config.channel)
.env("RUSTDOC_REAL", self.rustdoc(host));
cmd
}
/// Prepares an invocation of `cargo` to be run.
///
/// This will create a `Command` that represents a pending execution of
/// Cargo. This cargo will be configured to use `compiler` as the actual
/// rustc compiler, its output will be scoped by `mode`'s output directory,
/// it will pass the `--target` flag for the specified `target`, and will be
/// executing the Cargo command `cmd`.
pub fn cargo(&self,
compiler: Compiler,
mode: Mode,
target: Interned<String>,
cmd: &str) -> Command {
let mut cargo = Command::new(&self.initial_cargo);
let out_dir = self.stage_out(compiler, mode);
cargo.env("CARGO_TARGET_DIR", out_dir)
.arg(cmd)
.arg("-j").arg(self.jobs().to_string())
.arg("--target").arg(target);
// FIXME: Temporary fix for https://github.com/rust-lang/cargo/issues/3005
// Force cargo to output binaries with disambiguating hashes in the name
cargo.env("__CARGO_DEFAULT_LIB_METADATA", &self.config.channel);
let stage;
if compiler.stage == 0 && self.local_rebuild {
// Assume the local-rebuild rustc already has stage1 features.
stage = 1;
} else {
stage = compiler.stage;
}
// Customize the compiler we're running. Specify the compiler to cargo
// as our shim and then pass it some various options used to configure
// how the actual compiler itself is called.
//
// These variables are primarily all read by
// src/bootstrap/bin/{rustc.rs,rustdoc.rs}
cargo.env("RUSTBUILD_NATIVE_DIR", self.native_dir(target))
.env("RUSTC", self.out.join("bootstrap/debug/rustc"))
.env("RUSTC_REAL", self.rustc(compiler))
.env("RUSTC_STAGE", stage.to_string())
.env("RUSTC_CODEGEN_UNITS",
self.config.rust_codegen_units.to_string())
.env("RUSTC_DEBUG_ASSERTIONS",
self.config.rust_debug_assertions.to_string())
.env("RUSTC_SYSROOT", self.sysroot(compiler))
.env("RUSTC_LIBDIR", self.rustc_libdir(compiler))
.env("RUSTC_RPATH", self.config.rust_rpath.to_string())
.env("RUSTDOC", self.out.join("bootstrap/debug/rustdoc"))
.env("RUSTDOC_REAL", if cmd == "doc" || cmd == "test" {
self.rustdoc(compiler.host)
} else {
PathBuf::from("/path/to/nowhere/rustdoc/not/required")
})
.env("RUSTC_FLAGS", self.rustc_flags(target).join(" "));
if mode != Mode::Tool {
// Tools don't get debuginfo right now, e.g. cargo and rls don't
// get compiled with debuginfo.
cargo.env("RUSTC_DEBUGINFO", self.config.rust_debuginfo.to_string())
.env("RUSTC_DEBUGINFO_LINES", self.config.rust_debuginfo_lines.to_string())
.env("RUSTC_FORCE_UNSTABLE", "1");
// Currently the compiler depends on crates from crates.io, and
// then other crates can depend on the compiler (e.g. proc-macro
// crates). Let's say, for example that rustc itself depends on the
// bitflags crate. If an external crate then depends on the
// bitflags crate as well, we need to make sure they don't
// conflict, even if they pick the same version of bitflags. We'll
// want to make sure that e.g. a plugin and rustc each get their
// own copy of bitflags.
// Cargo ensures that this works in general through the -C metadata
// flag. This flag will frob the symbols in the binary to make sure
// they're different, even though the source code is the exact
// same. To solve this problem for the compiler we extend Cargo's
// already-passed -C metadata flag with our own. Our rustc.rs
// wrapper around the actual rustc will detect -C metadata being
// passed and frob it with this extra string we're passing in.
cargo.env("RUSTC_METADATA_SUFFIX", "rustc");
}
if let Some(x) = self.crt_static(target) {
cargo.env("RUSTC_CRT_STATIC", x.to_string());
}
// Enable usage of unstable features
cargo.env("RUSTC_BOOTSTRAP", "1");
self.add_rust_test_threads(&mut cargo);
// Almost all of the crates that we compile as part of the bootstrap may
// have a build script, including the standard library. To compile a
// build script, however, it itself needs a standard library! This
// introduces a bit of a pickle when we're compiling the standard
// library itself.
//
// To work around this we actually end up using the snapshot compiler
// (stage0) for compiling build scripts of the standard library itself.
// The stage0 compiler is guaranteed to have a libstd available for use.
//
// For other crates, however, we know that we've already got a standard
// library up and running, so we can use the normal compiler to compile
// build scripts in that situation.
if mode == Mode::Libstd {
cargo.env("RUSTC_SNAPSHOT", &self.initial_rustc)
.env("RUSTC_SNAPSHOT_LIBDIR", self.rustc_snapshot_libdir());
} else {
cargo.env("RUSTC_SNAPSHOT", self.rustc(compiler))
.env("RUSTC_SNAPSHOT_LIBDIR", self.rustc_libdir(compiler));
}
// Ignore incremental modes except for stage0, since we're
// not guaranteeing correctness across builds if the compiler
// is changing under your feet.
if self.config.incremental && compiler.stage == 0 {
let incr_dir = self.incremental_dir(compiler);
cargo.env("RUSTC_INCREMENTAL", incr_dir);
}
if let Some(ref on_fail) = self.config.on_fail {
cargo.env("RUSTC_ON_FAIL", on_fail);
}
cargo.env("RUSTC_VERBOSE", format!("{}", self.verbosity));
// Specify some various options for build scripts used throughout
// the build.
//
// FIXME: the guard against msvc shouldn't need to be here
if !target.contains("msvc") {
cargo.env(format!("CC_{}", target), self.cc(target))
.env(format!("AR_{}", target), self.ar(target).unwrap()) // only msvc is None
.env(format!("CFLAGS_{}", target), self.cflags(target).join(" "));
if let Ok(cxx) = self.cxx(target) {
cargo.env(format!("CXX_{}", target), cxx);
}
}
if mode == Mode::Libstd && self.config.extended && compiler.is_final_stage(self) {
cargo.env("RUSTC_SAVE_ANALYSIS", "api".to_string());
}
// Environment variables *required* throughout the build
//
// FIXME: should update code to not require this env var
cargo.env("CFG_COMPILER_HOST_TRIPLE", target);
// Set this for all builds to make sure doc builds also get it.
cargo.env("CFG_RELEASE_CHANNEL", &self.build.config.channel);
if self.is_verbose() {
cargo.arg("-v");
}
// FIXME: cargo bench does not accept `--release`
if self.config.rust_optimize && cmd != "bench" {
cargo.arg("--release");
}
if self.config.locked_deps {
cargo.arg("--locked");
}
if self.config.vendor || self.is_sudo {
cargo.arg("--frozen");
}
self.ci_env.force_coloring_in_ci(&mut cargo);
cargo
}
/// Ensure that a given step is built, returning its output. This will
/// cache the step, so it is safe (and good!) to call this as often as
/// needed to ensure that all dependencies are built.
pub fn ensure<S: Step>(&'a self, step: S) -> S::Output {
{
let mut stack = self.stack.borrow_mut();
for stack_step in stack.iter() {
// should skip
if stack_step.downcast_ref::<S>().map_or(true, |stack_step| *stack_step != step) {
continue;
}
let mut out = String::new();
out += &format!("\n\nCycle in build detected when adding {:?}\n", step);
for el in stack.iter().rev() {
out += &format!("\t{:?}\n", el);
}
panic!(out);
}
if let Some(out) = self.cache.get(&step) {
self.build.verbose(&format!("{}c {:?}", " ".repeat(stack.len()), step));
return out;
}
self.build.verbose(&format!("{}> {:?}", " ".repeat(stack.len()), step));
stack.push(Box::new(step.clone()));
}
let out = step.clone().run(self);
{
let mut stack = self.stack.borrow_mut();
let cur_step = stack.pop().expect("step stack empty");
assert_eq!(cur_step.downcast_ref(), Some(&step));
}
self.build.verbose(&format!("{}< {:?}", " ".repeat(self.stack.borrow().len()), step));
self.cache.put(step, out.clone());
out
}
}
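The `ensure` method above is the heart of the rewritten rustbuild: memoize each step's output in a write-once cache, and panic when a step re-enters itself via the in-progress stack. The same control flow, reduced to a Python sketch (the names are borrowed for illustration only, not an API of this codebase):

```python
class StepRunner:
    """Memoizing step executor with cycle detection, mirroring ensure()."""

    def __init__(self, steps):
        # steps maps a name to a (dependency names, compute function) pair.
        self.steps = steps
        self.cache = {}   # write-once: outputs are never evicted
        self.stack = []   # steps currently being evaluated

    def ensure(self, name):
        if name in self.cache:      # already built: return the cached output
            return self.cache[name]
        if name in self.stack:      # re-entered a running step: a cycle
            raise RuntimeError("cycle detected adding %r via %r" % (name, self.stack))
        self.stack.append(name)
        deps, compute = self.steps[name]
        output = compute([self.ensure(dep) for dep in deps])
        assert self.stack.pop() == name
        self.cache[name] = output
        return output
```

With `rustc` depending on `std`, calling `ensure('rustc')` builds `std` exactly once and caches it; any later `ensure('std')` is a cache hit, just as repeated `builder.ensure(...)` calls are above.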

src/bootstrap/cache.rs Normal file

@@ -0,0 +1,267 @@
// Copyright 2017 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
use std::any::{Any, TypeId};
use std::borrow::Borrow;
use std::cell::RefCell;
use std::collections::HashMap;
use std::convert::AsRef;
use std::ffi::OsStr;
use std::fmt;
use std::hash::{Hash, Hasher};
use std::marker::PhantomData;
use std::mem;
use std::ops::Deref;
use std::path::{Path, PathBuf};
use std::sync::Mutex;
use builder::Step;
pub struct Interned<T>(usize, PhantomData<*const T>);
impl Default for Interned<String> {
fn default() -> Self {
INTERNER.intern_string(String::default())
}
}
impl Default for Interned<PathBuf> {
fn default() -> Self {
INTERNER.intern_path(PathBuf::default())
}
}
impl<T> Copy for Interned<T> {}
impl<T> Clone for Interned<T> {
fn clone(&self) -> Interned<T> {
*self
}
}
impl<T> PartialEq for Interned<T> {
fn eq(&self, other: &Self) -> bool {
self.0 == other.0
}
}
impl<T> Eq for Interned<T> {}
impl PartialEq<str> for Interned<String> {
fn eq(&self, other: &str) -> bool {
*self == other
}
}
impl<'a> PartialEq<&'a str> for Interned<String> {
fn eq(&self, other: &&str) -> bool {
**self == **other
}
}
impl<'a, T> PartialEq<&'a Interned<T>> for Interned<T> {
fn eq(&self, other: &&Self) -> bool {
self.0 == other.0
}
}
impl<'a, T> PartialEq<Interned<T>> for &'a Interned<T> {
fn eq(&self, other: &Interned<T>) -> bool {
self.0 == other.0
}
}
unsafe impl<T> Send for Interned<T> {}
unsafe impl<T> Sync for Interned<T> {}
impl fmt::Display for Interned<String> {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
let s: &str = &*self;
f.write_str(s)
}
}
impl fmt::Debug for Interned<String> {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
let s: &str = &*self;
f.write_fmt(format_args!("{:?}", s))
}
}
impl fmt::Debug for Interned<PathBuf> {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
let s: &Path = &*self;
f.write_fmt(format_args!("{:?}", s))
}
}
impl Hash for Interned<String> {
fn hash<H: Hasher>(&self, state: &mut H) {
let l = INTERNER.strs.lock().unwrap();
l.get(*self).hash(state)
}
}
impl Hash for Interned<PathBuf> {
fn hash<H: Hasher>(&self, state: &mut H) {
let l = INTERNER.paths.lock().unwrap();
l.get(*self).hash(state)
}
}
impl Deref for Interned<String> {
type Target = str;
fn deref(&self) -> &'static str {
let l = INTERNER.strs.lock().unwrap();
unsafe { mem::transmute::<&str, &'static str>(l.get(*self)) }
}
}
impl Deref for Interned<PathBuf> {
type Target = Path;
fn deref(&self) -> &'static Path {
let l = INTERNER.paths.lock().unwrap();
unsafe { mem::transmute::<&Path, &'static Path>(l.get(*self)) }
}
}
impl AsRef<Path> for Interned<PathBuf> {
fn as_ref(&self) -> &'static Path {
let l = INTERNER.paths.lock().unwrap();
unsafe { mem::transmute::<&Path, &'static Path>(l.get(*self)) }
}
}
impl AsRef<Path> for Interned<String> {
fn as_ref(&self) -> &'static Path {
let l = INTERNER.strs.lock().unwrap();
unsafe { mem::transmute::<&Path, &'static Path>(l.get(*self).as_ref()) }
}
}
impl AsRef<OsStr> for Interned<PathBuf> {
fn as_ref(&self) -> &'static OsStr {
let l = INTERNER.paths.lock().unwrap();
unsafe { mem::transmute::<&OsStr, &'static OsStr>(l.get(*self).as_ref()) }
}
}
impl AsRef<OsStr> for Interned<String> {
fn as_ref(&self) -> &'static OsStr {
let l = INTERNER.strs.lock().unwrap();
unsafe { mem::transmute::<&OsStr, &'static OsStr>(l.get(*self).as_ref()) }
}
}
struct TyIntern<T> {
items: Vec<T>,
set: HashMap<T, Interned<T>>,
}
impl<T: Hash + Clone + Eq> TyIntern<T> {
fn new() -> TyIntern<T> {
TyIntern {
items: Vec::new(),
set: HashMap::new(),
}
}
fn intern_borrow<B>(&mut self, item: &B) -> Interned<T>
where
B: Eq + Hash + ToOwned<Owned=T> + ?Sized,
T: Borrow<B>,
{
if let Some(i) = self.set.get(&item) {
return *i;
}
let item = item.to_owned();
let interned = Interned(self.items.len(), PhantomData::<*const T>);
self.set.insert(item.clone(), interned);
self.items.push(item);
interned
}
fn intern(&mut self, item: T) -> Interned<T> {
if let Some(i) = self.set.get(&item) {
return *i;
}
let interned = Interned(self.items.len(), PhantomData::<*const T>);
self.set.insert(item.clone(), interned);
self.items.push(item);
interned
}
fn get(&self, i: Interned<T>) -> &T {
&self.items[i.0]
}
}
pub struct Interner {
strs: Mutex<TyIntern<String>>,
paths: Mutex<TyIntern<PathBuf>>,
}
impl Interner {
fn new() -> Interner {
Interner {
strs: Mutex::new(TyIntern::new()),
paths: Mutex::new(TyIntern::new()),
}
}
pub fn intern_str(&self, s: &str) -> Interned<String> {
self.strs.lock().unwrap().intern_borrow(s)
}
pub fn intern_string(&self, s: String) -> Interned<String> {
self.strs.lock().unwrap().intern(s)
}
pub fn intern_path(&self, s: PathBuf) -> Interned<PathBuf> {
self.paths.lock().unwrap().intern(s)
}
}
lazy_static! {
pub static ref INTERNER: Interner = Interner::new();
}
/// This is essentially a HashMap which allows storing any type in its input and
/// any type in its output. It is a write-once cache; values are never evicted,
/// which means that references to the value can safely be returned from the
/// get() method.
#[derive(Debug)]
pub struct Cache(
RefCell<HashMap<
TypeId,
Box<Any>, // actually a HashMap<Step, Interned<Step::Output>>
>>
);
impl Cache {
pub fn new() -> Cache {
Cache(RefCell::new(HashMap::new()))
}
pub fn put<S: Step>(&self, step: S, value: S::Output) {
let mut cache = self.0.borrow_mut();
let type_id = TypeId::of::<S>();
let stepcache = cache.entry(type_id)
.or_insert_with(|| Box::new(HashMap::<S, S::Output>::new()))
.downcast_mut::<HashMap<S, S::Output>>()
.expect("invalid type mapped");
assert!(!stepcache.contains_key(&step), "processing {:?} a second time", step);
stepcache.insert(step, value);
}
pub fn get<S: Step>(&self, step: &S) -> Option<S::Output> {
let mut cache = self.0.borrow_mut();
let type_id = TypeId::of::<S>();
let stepcache = cache.entry(type_id)
.or_insert_with(|| Box::new(HashMap::<S, S::Output>::new()))
.downcast_mut::<HashMap<S, S::Output>>()
.expect("invalid type mapped");
stepcache.get(step).cloned()
}
}
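The `Cache` above combines two tricks: a `TypeId`-keyed outer map and `Box<Any>` type erasure, with a downcast back to the concrete `HashMap<S, S::Output>` on every access. A minimal, runnable sketch of the same pattern (the `MiniCache` name and the generic `K`/`V` keys are illustrative, not bootstrap's actual API):

```rust
use std::any::{Any, TypeId};
use std::cell::RefCell;
use std::collections::HashMap;
use std::hash::Hash;

// Values of any type are stored behind `Box<dyn Any>`, keyed by the
// `TypeId` of the key type, and downcast back to the concrete inner
// `HashMap<K, V>` on every access.
struct MiniCache(RefCell<HashMap<TypeId, Box<dyn Any>>>);

impl MiniCache {
    fn new() -> MiniCache {
        MiniCache(RefCell::new(HashMap::new()))
    }

    fn put<K, V>(&self, key: K, value: V)
    where
        K: Eq + Hash + 'static,
        V: Clone + 'static,
    {
        let mut map = self.0.borrow_mut();
        let inner = map
            .entry(TypeId::of::<K>())
            .or_insert_with(|| Box::new(HashMap::<K, V>::new()))
            .downcast_mut::<HashMap<K, V>>()
            .expect("invalid type mapped");
        inner.insert(key, value);
    }

    fn get<K, V>(&self, key: &K) -> Option<V>
    where
        K: Eq + Hash + 'static,
        V: Clone + 'static,
    {
        let map = self.0.borrow();
        map.get(&TypeId::of::<K>())
            .and_then(|any| any.downcast_ref::<HashMap<K, V>>())
            .and_then(|inner| inner.get(key).cloned())
    }
}

fn main() {
    let cache = MiniCache::new();
    cache.put("std".to_string(), 1u32);
    assert_eq!(cache.get::<String, u32>(&"std".to_string()), Some(1));
    assert_eq!(cache.get::<String, u32>(&"test".to_string()), None);
}
```

As in the real `Cache`, this is only sound if each key type is always paired with one value type; in bootstrap that invariant comes from the `Step::Output` associated type.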

View File

@@ -32,25 +32,24 @@
 //! everything.
 
 use std::process::Command;
+use std::iter;
 
 use build_helper::{cc2ar, output};
 use gcc;
 
 use Build;
 use config::Target;
+use cache::Interned;
 
 pub fn find(build: &mut Build) {
     // For all targets we're going to need a C compiler for building some shims
     // and such as well as for being a linker for Rust code.
-    //
-    // This includes targets that aren't necessarily passed on the commandline
-    // (FIXME: Perhaps it shouldn't?)
-    for target in &build.config.target {
+    for target in build.targets.iter().chain(&build.hosts).cloned().chain(iter::once(build.build)) {
         let mut cfg = gcc::Config::new();
         cfg.cargo_metadata(false).opt_level(0).debug(false)
-           .target(target).host(&build.build);
+           .target(&target).host(&build.build);
 
-        let config = build.config.target_config.get(target);
+        let config = build.config.target_config.get(&target);
         if let Some(cc) = config.and_then(|c| c.cc.as_ref()) {
             cfg.compiler(cc);
         } else {
@@ -58,23 +57,20 @@ pub fn find(build: &mut Build) {
         }
 
         let compiler = cfg.get_compiler();
-        let ar = cc2ar(compiler.path(), target);
+        let ar = cc2ar(compiler.path(), &target);
-        build.verbose(&format!("CC_{} = {:?}", target, compiler.path()));
+        build.verbose(&format!("CC_{} = {:?}", &target, compiler.path()));
         if let Some(ref ar) = ar {
-            build.verbose(&format!("AR_{} = {:?}", target, ar));
+            build.verbose(&format!("AR_{} = {:?}", &target, ar));
         }
-        build.cc.insert(target.to_string(), (compiler, ar));
+        build.cc.insert(target, (compiler, ar));
     }
 
     // For all host triples we need to find a C++ compiler as well
-    //
-    // This includes hosts that aren't necessarily passed on the commandline
-    // (FIXME: Perhaps it shouldn't?)
-    for host in &build.config.host {
+    for host in build.hosts.iter().cloned().chain(iter::once(build.build)) {
         let mut cfg = gcc::Config::new();
         cfg.cargo_metadata(false).opt_level(0).debug(false).cpp(true)
-           .target(host).host(&build.build);
+           .target(&host).host(&build.build);
-        let config = build.config.target_config.get(host);
+        let config = build.config.target_config.get(&host);
         if let Some(cxx) = config.and_then(|c| c.cxx.as_ref()) {
             cfg.compiler(cxx);
         } else {
@@ -82,16 +78,16 @@ pub fn find(build: &mut Build) {
         }
 
         let compiler = cfg.get_compiler();
         build.verbose(&format!("CXX_{} = {:?}", host, compiler.path()));
-        build.cxx.insert(host.to_string(), compiler);
+        build.cxx.insert(host, compiler);
     }
 }
 
 fn set_compiler(cfg: &mut gcc::Config,
                 gnu_compiler: &str,
-                target: &str,
+                target: Interned<String>,
                 config: Option<&Target>,
                 build: &Build) {
-    match target {
+    match &*target {
         // When compiling for android we may have the NDK configured in the
         // config.toml in which case we look there. Otherwise the default
         // compiler already takes into account the triple in question.
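The replacement loop above chains the configured cross-compilation targets, the host triples, and the build triple itself into a single pass. A standalone sketch of that iterator pattern with hypothetical triples (the real code deduplicates implicitly by inserting into maps keyed by triple; here a `HashSet` makes it explicit):

```rust
use std::collections::HashSet;
use std::iter;

// Visit targets, then hosts, then the build triple, once each.
fn all_triples(targets: &[String], hosts: &[String], build: String) -> Vec<String> {
    let mut seen = HashSet::new();
    targets
        .iter()
        .chain(hosts)                 // hosts may overlap targets
        .cloned()
        .chain(iter::once(build))     // the build triple is always included
        .filter(|t| seen.insert(t.clone()))
        .collect()
}

fn main() {
    let triples = all_triples(
        &["aarch64-unknown-linux-gnu".to_string()],
        &["x86_64-unknown-linux-gnu".to_string()],
        "x86_64-unknown-linux-gnu".to_string(),
    );
    assert_eq!(triples.len(), 2);
}
```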

View File

@@ -21,14 +21,15 @@ use std::process::Command;
 
 use build_helper::output;
 
 use Build;
+use config::Config;
 
 // The version number
-pub const CFG_RELEASE_NUM: &str = "1.20.0";
+pub const CFG_RELEASE_NUM: &str = "1.21.0";
 
 // An optional number to put after the label, e.g. '.2' -> '-beta.2'
 // Be sure to make this starts with a dot to conform to semver pre-release
 // versions (section 9)
-pub const CFG_PRERELEASE_VERSION: &str = ".3";
+pub const CFG_PRERELEASE_VERSION: &str = ".4";
 
 pub struct GitInfo {
     inner: Option<Info>,
@@ -41,9 +42,9 @@ struct Info {
 }
 
 impl GitInfo {
-    pub fn new(dir: &Path) -> GitInfo {
+    pub fn new(config: &Config, dir: &Path) -> GitInfo {
         // See if this even begins to look like a git dir
-        if !dir.join(".git").exists() {
+        if config.ignore_git || !dir.join(".git").exists() {
             return GitInfo { inner: None }
         }
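As a rough illustration of how the constants above combine: the dot-prefixed pre-release number is appended after the channel label, so the result stays a valid semver pre-release (e.g. `1.21.0-beta.4` on the beta channel). The helper below is hypothetical, not bootstrap's actual release-string code:

```rust
// Hypothetical sketch: assemble a release string from a version number,
// a release channel, and a dot-prefixed pre-release suffix like ".4".
fn release(num: &str, channel: &str, prerelease: &str) -> String {
    match channel {
        "stable" => num.to_string(),               // stable carries no suffix
        "beta" => format!("{}-beta{}", num, prerelease),
        _ => format!("{}-{}", num, channel),       // e.g. "-nightly"
    }
}

fn main() {
    assert_eq!(release("1.21.0", "stable", ".4"), "1.21.0");
    assert_eq!(release("1.21.0", "beta", ".4"), "1.21.0-beta.4");
    assert_eq!(release("1.21.0", "nightly", ".4"), "1.21.0-nightly");
}
```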

File diff suppressed because it is too large

View File

@@ -26,7 +26,7 @@ pub fn clean(build: &Build) {
     rm_rf(&build.out.join("tmp"));
     rm_rf(&build.out.join("dist"));
 
-    for host in build.config.host.iter() {
+    for host in &build.hosts {
         let entries = match build.out.join(host).read_dir() {
             Ok(iter) => iter,
             Err(_) => continue,

View File

@@ -23,31 +23,121 @@ use std::io::prelude::*;
 use std::path::{Path, PathBuf};
 use std::process::{Command, Stdio};
 use std::str;
+use std::cmp::min;
 
 use build_helper::{output, mtime, up_to_date};
 use filetime::FileTime;
-use rustc_serialize::json;
+use serde_json;
 
-use channel::GitInfo;
 use util::{exe, libdir, is_dylib, copy};
 use {Build, Compiler, Mode};
+use native;
+use tool;
 
-/// Build the standard library.
-///
-/// This will build the standard library for a particular stage of the build
-/// using the `compiler` targeting the `target` architecture. The artifacts
-/// created will also be linked into the sysroot directory.
-pub fn std(build: &Build, target: &str, compiler: &Compiler) {
-    let libdir = build.sysroot_libdir(compiler, target);
-    t!(fs::create_dir_all(&libdir));
+use cache::{INTERNER, Interned};
+use builder::{Step, RunConfig, ShouldRun, Builder};
+
+#[derive(Debug, Copy, Clone, PartialEq, Eq, Hash)]
+pub struct Std {
+    pub target: Interned<String>,
+    pub compiler: Compiler,
+}
+
+impl Step for Std {
+    type Output = ();
+    const DEFAULT: bool = true;
+
+    fn should_run(run: ShouldRun) -> ShouldRun {
+        run.path("src/libstd").krate("std")
+    }
+
+    fn make_run(run: RunConfig) {
+        run.builder.ensure(Std {
+            compiler: run.builder.compiler(run.builder.top_stage, run.host),
+            target: run.target,
+        });
+    }
+
+    /// Build the standard library.
+    ///
+    /// This will build the standard library for a particular stage of the build
+    /// using the `compiler` targeting the `target` architecture. The artifacts
+    /// created will also be linked into the sysroot directory.
+    fn run(self, builder: &Builder) {
+        let build = builder.build;
+        let target = self.target;
+        let compiler = self.compiler;
+
+        builder.ensure(StartupObjects { compiler, target });
+
+        if build.force_use_stage1(compiler, target) {
+            let from = builder.compiler(1, build.build);
+            builder.ensure(Std {
+                compiler: from,
+                target,
+            });
+            println!("Uplifting stage1 std ({} -> {})", from.host, target);
+
+            // Even if we're not building std this stage, the new sysroot must
+            // still contain the musl startup objects.
+            if target.contains("musl") && !target.contains("mips") {
+                let libdir = builder.sysroot_libdir(compiler, target);
+                copy_musl_third_party_objects(build, target, &libdir);
+            }
+
+            builder.ensure(StdLink {
+                compiler: from,
+                target_compiler: compiler,
+                target,
+            });
+            return;
+        }
 
         let _folder = build.fold_output(|| format!("stage{}-std", compiler.stage));
         println!("Building stage{} std artifacts ({} -> {})", compiler.stage,
-             compiler.host, target);
+                 &compiler.host, target);
+
+        if target.contains("musl") && !target.contains("mips") {
+            let libdir = builder.sysroot_libdir(compiler, target);
+            copy_musl_third_party_objects(build, target, &libdir);
+        }
 
         let out_dir = build.cargo_out(compiler, Mode::Libstd, target);
-        build.clear_if_dirty(&out_dir, &build.compiler_path(compiler));
+        build.clear_if_dirty(&out_dir, &builder.rustc(compiler));
-        let mut cargo = build.cargo(compiler, Mode::Libstd, target, "build");
+        let mut cargo = builder.cargo(compiler, Mode::Libstd, target, "build");
+        std_cargo(build, &compiler, target, &mut cargo);
+        run_cargo(build,
+                  &mut cargo,
+                  &libstd_stamp(build, compiler, target));
+
+        builder.ensure(StdLink {
+            compiler: builder.compiler(compiler.stage, build.build),
+            target_compiler: compiler,
+            target,
+        });
+    }
+}
+
+/// Copies the crt(1,i,n).o startup objects
+///
+/// Since musl supports fully static linking, we can cross link for it even
+/// with a glibc-targeting toolchain, given we have the appropriate startup
+/// files. As those shipped with glibc won't work, copy the ones provided by
+/// musl so we have them on linux-gnu hosts.
+fn copy_musl_third_party_objects(build: &Build,
+                                 target: Interned<String>,
+                                 into: &Path) {
+    for &obj in &["crt1.o", "crti.o", "crtn.o"] {
+        copy(&build.musl_root(target).unwrap().join("lib").join(obj), &into.join(obj));
+    }
+}
+
+/// Configure cargo to compile the standard library, adding appropriate env vars
+/// and such.
+pub fn std_cargo(build: &Build,
+                 compiler: &Compiler,
+                 target: Interned<String>,
+                 cargo: &mut Command) {
     let mut features = build.std_features();
 
     if let Some(target) = env::var_os("MACOSX_STD_DEPLOYMENT_TARGET") {
@@ -73,11 +163,12 @@ pub fn std(build: &Build, target: &str, compiler: &Compiler) {
     // config.toml equivalent) is used
     cargo.env("LLVM_CONFIG", build.llvm_config(target));
     }
 
     cargo.arg("--features").arg(features)
         .arg("--manifest-path")
         .arg(build.src.join("src/libstd/Cargo.toml"));
 
-    if let Some(target) = build.config.target_config.get(target) {
+    if let Some(target) = build.config.target_config.get(&target) {
         if let Some(ref jemalloc) = target.jemalloc {
             cargo.env("JEMALLOC_OVERRIDE", jemalloc);
         }
@@ -87,51 +178,56 @@ pub fn std(build: &Build, target: &str, compiler: &Compiler) {
             cargo.env("MUSL_ROOT", p);
         }
     }
-
-    run_cargo(build,
-              &mut cargo,
-              &libstd_stamp(build, &compiler, target));
 }
 
-/// Link all libstd rlibs/dylibs into the sysroot location.
-///
-/// Links those artifacts generated by `compiler` to a the `stage` compiler's
-/// sysroot for the specified `host` and `target`.
-///
-/// Note that this assumes that `compiler` has already generated the libstd
-/// libraries for `target`, and this method will find them in the relevant
-/// output directory.
-pub fn std_link(build: &Build,
-                compiler: &Compiler,
-                target_compiler: &Compiler,
-                target: &str) {
+#[derive(Debug, Copy, Clone, PartialEq, Eq, Hash)]
+struct StdLink {
+    pub compiler: Compiler,
+    pub target_compiler: Compiler,
+    pub target: Interned<String>,
+}
+
+impl Step for StdLink {
+    type Output = ();
+
+    fn should_run(run: ShouldRun) -> ShouldRun {
+        run.never()
+    }
+
+    /// Link all libstd rlibs/dylibs into the sysroot location.
+    ///
+    /// Links those artifacts generated by `compiler` to a the `stage` compiler's
+    /// sysroot for the specified `host` and `target`.
+    ///
+    /// Note that this assumes that `compiler` has already generated the libstd
+    /// libraries for `target`, and this method will find them in the relevant
+    /// output directory.
+    fn run(self, builder: &Builder) {
+        let build = builder.build;
+        let compiler = self.compiler;
+        let target_compiler = self.target_compiler;
+        let target = self.target;
         println!("Copying stage{} std from stage{} ({} -> {} / {})",
                  target_compiler.stage,
                  compiler.stage,
-                 compiler.host,
+                 &compiler.host,
                  target_compiler.host,
                  target);
-    let libdir = build.sysroot_libdir(target_compiler, target);
+        let libdir = builder.sysroot_libdir(target_compiler, target);
         add_to_sysroot(&libdir, &libstd_stamp(build, compiler, target));
-
-    if target.contains("musl") && !target.contains("mips") {
-        copy_musl_third_party_objects(build, target, &libdir);
-    }
 
         if build.config.sanitizers && compiler.stage != 0 && target == "x86_64-apple-darwin" {
             // The sanitizers are only built in stage1 or above, so the dylibs will
             // be missing in stage0 and causes panic. See the `std()` function above
             // for reason why the sanitizers are not built in stage0.
             copy_apple_sanitizer_dylibs(&build.native_dir(target), "osx", &libdir);
         }
-}
 
-/// Copies the crt(1,i,n).o startup objects
-///
-/// Only required for musl targets that statically link to libc
-fn copy_musl_third_party_objects(build: &Build, target: &str, into: &Path) {
-    for &obj in &["crt1.o", "crti.o", "crtn.o"] {
-        copy(&build.musl_root(target).unwrap().join("lib").join(obj), &into.join(obj));
-    }
-}
+        builder.ensure(tool::CleanTools {
+            compiler: target_compiler,
+            target,
+            mode: Mode::Libstd,
+        });
+    }
+}
@@ -147,35 +243,55 @@ fn copy_apple_sanitizer_dylibs(native_dir: &Path, platform: &str, into: &Path) {
     }
 }
 
-/// Build and prepare startup objects like rsbegin.o and rsend.o
-///
-/// These are primarily used on Windows right now for linking executables/dlls.
-/// They don't require any library support as they're just plain old object
-/// files, so we just use the nightly snapshot compiler to always build them (as
-/// no other compilers are guaranteed to be available).
-pub fn build_startup_objects(build: &Build, for_compiler: &Compiler, target: &str) {
+#[derive(Debug, Copy, Clone, PartialEq, Eq, Hash)]
+pub struct StartupObjects {
+    pub compiler: Compiler,
+    pub target: Interned<String>,
+}
+
+impl Step for StartupObjects {
+    type Output = ();
+
+    fn should_run(run: ShouldRun) -> ShouldRun {
+        run.path("src/rtstartup")
+    }
+
+    fn make_run(run: RunConfig) {
+        run.builder.ensure(StartupObjects {
+            compiler: run.builder.compiler(run.builder.top_stage, run.host),
+            target: run.target,
+        });
+    }
+
+    /// Build and prepare startup objects like rsbegin.o and rsend.o
+    ///
+    /// These are primarily used on Windows right now for linking executables/dlls.
+    /// They don't require any library support as they're just plain old object
+    /// files, so we just use the nightly snapshot compiler to always build them (as
+    /// no other compilers are guaranteed to be available).
+    fn run(self, builder: &Builder) {
+        let build = builder.build;
+        let for_compiler = self.compiler;
+        let target = self.target;
         if !target.contains("pc-windows-gnu") {
             return
         }
 
-    let compiler = Compiler::new(0, &build.build);
-    let compiler_path = build.compiler_path(&compiler);
         let src_dir = &build.src.join("src/rtstartup");
         let dst_dir = &build.native_dir(target).join("rtstartup");
-    let sysroot_dir = &build.sysroot_libdir(for_compiler, target);
+        let sysroot_dir = &builder.sysroot_libdir(for_compiler, target);
         t!(fs::create_dir_all(dst_dir));
+        t!(fs::create_dir_all(sysroot_dir));
 
         for file in &["rsbegin", "rsend"] {
             let src_file = &src_dir.join(file.to_string() + ".rs");
             let dst_file = &dst_dir.join(file.to_string() + ".o");
             if !up_to_date(src_file, dst_file) {
-                let mut cmd = Command::new(&compiler_path);
+                let mut cmd = Command::new(&build.initial_rustc);
                 build.run(cmd.env("RUSTC_BOOTSTRAP", "1")
-                          .arg("--cfg").arg(format!("stage{}", compiler.stage))
+                          .arg("--cfg").arg("stage0")
                           .arg("--target").arg(target)
                           .arg("--emit=obj")
-                          .arg("--out-dir").arg(dst_dir)
+                          .arg("-o").arg(dst_file)
                           .arg(src_file));
             }
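The rebuild guard above hinges on `up_to_date(src_file, dst_file)`. A minimal sketch of such an mtime comparison (a simplification; the real `build_helper::up_to_date` also handles directories and reports errors rather than swallowing them):

```rust
use std::fs;
use std::path::Path;
use std::time::SystemTime;

// The output is considered current when it exists and is at least as
// new as its input; a missing or unreadable file forces a rebuild.
fn up_to_date(src: &Path, dst: &Path) -> bool {
    let mtime = |p: &Path| -> Option<SystemTime> {
        fs::metadata(p).and_then(|m| m.modified()).ok()
    };
    match (mtime(src), mtime(dst)) {
        (Some(s), Some(d)) => d >= s,
        _ => false, // no output yet (or no readable input): rebuild
    }
}

fn main() {
    let dir = std::env::temp_dir();
    let src = dir.join("rsbegin_demo.rs");
    let dst = dir.join("rsbegin_demo.o");
    fs::write(&src, "// input").unwrap();
    let _ = fs::remove_file(&dst);
    assert!(!up_to_date(&src, &dst)); // no object file yet
    fs::write(&dst, "obj").unwrap(); // written after src, so at least as new
    assert!(up_to_date(&src, &dst));
}
```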
@@ -185,59 +301,207 @@ pub fn build_startup_objects(build: &Build, for_compiler: &Compiler, target: &st
         for obj in ["crt2.o", "dllcrt2.o"].iter() {
             copy(&compiler_file(build.cc(target), obj), &sysroot_dir.join(obj));
         }
+    }
 }
 
-/// Build libtest.
-///
-/// This will build libtest and supporting libraries for a particular stage of
-/// the build using the `compiler` targeting the `target` architecture. The
-/// artifacts created will also be linked into the sysroot directory.
-pub fn test(build: &Build, target: &str, compiler: &Compiler) {
+#[derive(Debug, Copy, Clone, PartialEq, Eq, Hash)]
+pub struct Test {
+    pub compiler: Compiler,
+    pub target: Interned<String>,
+}
+
+impl Step for Test {
+    type Output = ();
+    const DEFAULT: bool = true;
+
+    fn should_run(run: ShouldRun) -> ShouldRun {
+        run.path("src/libtest").krate("test")
+    }
+
+    fn make_run(run: RunConfig) {
+        run.builder.ensure(Test {
+            compiler: run.builder.compiler(run.builder.top_stage, run.host),
+            target: run.target,
+        });
+    }
+
+    /// Build libtest.
+    ///
+    /// This will build libtest and supporting libraries for a particular stage of
+    /// the build using the `compiler` targeting the `target` architecture. The
+    /// artifacts created will also be linked into the sysroot directory.
+    fn run(self, builder: &Builder) {
+        let build = builder.build;
+        let target = self.target;
+        let compiler = self.compiler;
+
+        builder.ensure(Std { compiler, target });
+
+        if build.force_use_stage1(compiler, target) {
+            builder.ensure(Test {
+                compiler: builder.compiler(1, build.build),
+                target,
+            });
+            println!("Uplifting stage1 test ({} -> {})", &build.build, target);
+            builder.ensure(TestLink {
+                compiler: builder.compiler(1, build.build),
+                target_compiler: compiler,
+                target,
+            });
+            return;
+        }
+
         let _folder = build.fold_output(|| format!("stage{}-test", compiler.stage));
         println!("Building stage{} test artifacts ({} -> {})", compiler.stage,
-             compiler.host, target);
+                 &compiler.host, target);
         let out_dir = build.cargo_out(compiler, Mode::Libtest, target);
         build.clear_if_dirty(&out_dir, &libstd_stamp(build, compiler, target));
-    let mut cargo = build.cargo(compiler, Mode::Libtest, target, "build");
+        let mut cargo = builder.cargo(compiler, Mode::Libtest, target, "build");
+        test_cargo(build, &compiler, target, &mut cargo);
+        run_cargo(build,
+                  &mut cargo,
+                  &libtest_stamp(build, compiler, target));
+
+        builder.ensure(TestLink {
+            compiler: builder.compiler(compiler.stage, build.build),
+            target_compiler: compiler,
+            target,
+        });
+    }
+}
+
+/// Same as `std_cargo`, but for libtest
+pub fn test_cargo(build: &Build,
+                  _compiler: &Compiler,
+                  _target: Interned<String>,
+                  cargo: &mut Command) {
     if let Some(target) = env::var_os("MACOSX_STD_DEPLOYMENT_TARGET") {
         cargo.env("MACOSX_DEPLOYMENT_TARGET", target);
     }
     cargo.arg("--manifest-path")
         .arg(build.src.join("src/libtest/Cargo.toml"));
-
-    run_cargo(build,
-              &mut cargo,
-              &libtest_stamp(build, compiler, target));
 }
 
-/// Same as `std_link`, only for libtest
-pub fn test_link(build: &Build,
-                 compiler: &Compiler,
-                 target_compiler: &Compiler,
-                 target: &str) {
+#[derive(Debug, Copy, Clone, PartialEq, Eq, Hash)]
+pub struct TestLink {
+    pub compiler: Compiler,
+    pub target_compiler: Compiler,
+    pub target: Interned<String>,
+}
+
+impl Step for TestLink {
+    type Output = ();
+
+    fn should_run(run: ShouldRun) -> ShouldRun {
+        run.never()
+    }
+
+    /// Same as `std_link`, only for libtest
+    fn run(self, builder: &Builder) {
+        let build = builder.build;
+        let compiler = self.compiler;
+        let target_compiler = self.target_compiler;
+        let target = self.target;
         println!("Copying stage{} test from stage{} ({} -> {} / {})",
                  target_compiler.stage,
                  compiler.stage,
-                 compiler.host,
+                 &compiler.host,
                  target_compiler.host,
                  target);
-    add_to_sysroot(&build.sysroot_libdir(target_compiler, target),
-               &libtest_stamp(build, compiler, target));
+        add_to_sysroot(&builder.sysroot_libdir(target_compiler, target),
+                       &libtest_stamp(build, compiler, target));
+        builder.ensure(tool::CleanTools {
+            compiler: target_compiler,
+            target,
+            mode: Mode::Libtest,
+        });
+    }
 }
 
-/// Build the compiler.
-///
-/// This will build the compiler for a particular stage of the build using
-/// the `compiler` targeting the `target` architecture. The artifacts
-/// created will also be linked into the sysroot directory.
-pub fn rustc(build: &Build, target: &str, compiler: &Compiler) {
+#[derive(Debug, Copy, Clone, PartialEq, Eq, Hash)]
+pub struct Rustc {
+    pub compiler: Compiler,
+    pub target: Interned<String>,
+}
+
+impl Step for Rustc {
+    type Output = ();
+    const ONLY_HOSTS: bool = true;
+    const DEFAULT: bool = true;
+
+    fn should_run(run: ShouldRun) -> ShouldRun {
+        run.path("src/librustc").krate("rustc-main")
+    }
+
+    fn make_run(run: RunConfig) {
+        run.builder.ensure(Rustc {
+            compiler: run.builder.compiler(run.builder.top_stage, run.host),
+            target: run.target,
+        });
+    }
+
+    /// Build the compiler.
+    ///
+    /// This will build the compiler for a particular stage of the build using
+    /// the `compiler` targeting the `target` architecture. The artifacts
+    /// created will also be linked into the sysroot directory.
+    fn run(self, builder: &Builder) {
+        let build = builder.build;
+        let compiler = self.compiler;
+        let target = self.target;
+
+        builder.ensure(Test { compiler, target });
+
+        // Build LLVM for our target. This will implicitly build the host LLVM
+        // if necessary.
+        builder.ensure(native::Llvm { target });
+
+        if build.force_use_stage1(compiler, target) {
+            builder.ensure(Rustc {
+                compiler: builder.compiler(1, build.build),
+                target,
+            });
+            println!("Uplifting stage1 rustc ({} -> {})", &build.build, target);
+            builder.ensure(RustcLink {
+                compiler: builder.compiler(1, build.build),
+                target_compiler: compiler,
+                target,
+            });
+            return;
+        }
+
+        // Ensure that build scripts have a std to link against.
+        builder.ensure(Std {
+            compiler: builder.compiler(self.compiler.stage, build.build),
+            target: build.build,
+        });
+
         let _folder = build.fold_output(|| format!("stage{}-rustc", compiler.stage));
         println!("Building stage{} compiler artifacts ({} -> {})",
-             compiler.stage, compiler.host, target);
+                 compiler.stage, &compiler.host, target);
         let out_dir = build.cargo_out(compiler, Mode::Librustc, target);
         build.clear_if_dirty(&out_dir, &libtest_stamp(build, compiler, target));
-    let mut cargo = build.cargo(compiler, Mode::Librustc, target, "build");
+        let mut cargo = builder.cargo(compiler, Mode::Librustc, target, "build");
+        rustc_cargo(build, &compiler, target, &mut cargo);
+        run_cargo(build,
+                  &mut cargo,
+                  &librustc_stamp(build, compiler, target));
+
+        builder.ensure(RustcLink {
+            compiler: builder.compiler(compiler.stage, build.build),
+            target_compiler: compiler,
+            target,
+        });
+    }
+}
+
+/// Same as `std_cargo`, but for libtest
+pub fn rustc_cargo(build: &Build,
+                   compiler: &Compiler,
+                   target: Interned<String>,
+                   cargo: &mut Command) {
     cargo.arg("--features").arg(build.rustc_features())
          .arg("--manifest-path")
          .arg(build.src.join("src/rustc/Cargo.toml"));
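The hunks above replace free functions with `Step` types wired together through `builder.ensure(...)`: each step ensures its prerequisites first, and the builder runs any distinct step at most once. A toy model of that idiom (the `Step` enum and `Builder` struct here are illustrative stand-ins, not the real trait, which also has `should_run`/`make_run` and an associated `Output` type):

```rust
use std::collections::HashSet;

#[derive(Debug, Clone, PartialEq, Eq, Hash)]
enum Step {
    Std,
    Test,
    Rustc,
}

struct Builder {
    done: HashSet<Step>, // steps already run (the real builder caches outputs)
    log: Vec<Step>,      // completion order, for demonstration
}

impl Builder {
    fn new() -> Builder {
        Builder { done: HashSet::new(), log: Vec::new() }
    }

    // Run a step at most once, ensuring its dependencies first.
    fn ensure(&mut self, step: Step) {
        if !self.done.insert(step.clone()) {
            return; // already ensured
        }
        match step {
            Step::Std => {}
            Step::Test => self.ensure(Step::Std),
            Step::Rustc => self.ensure(Step::Test),
        }
        self.log.push(step);
    }
}

fn main() {
    let mut b = Builder::new();
    b.ensure(Step::Rustc);
    b.ensure(Step::Std); // no-op: already ensured as a dependency
    assert_eq!(b.log, vec![Step::Std, Step::Test, Step::Rustc]);
}
```

Dependencies thus complete before their dependents, and redundant requests (such as ensuring `Std` both directly and via `Rustc`) cost nothing.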
@@ -252,7 +516,8 @@ pub fn rustc(build: &Build, target: &str, compiler: &Compiler) {
     if compiler.stage == 0 {
         cargo.env("CFG_LIBDIR_RELATIVE", "lib");
     } else {
-        let libdir_relative = build.config.libdir_relative.clone().unwrap_or(PathBuf::from("lib"));
+        let libdir_relative =
+            build.config.libdir_relative.clone().unwrap_or(PathBuf::from("lib"));
         cargo.env("CFG_LIBDIR_RELATIVE", libdir_relative);
     }
@@ -277,7 +542,7 @@ pub fn rustc(build: &Build, target: &str, compiler: &Compiler) {
         cargo.env("LLVM_RUSTLLVM", "1");
     }
     cargo.env("LLVM_CONFIG", build.llvm_config(target));
-    let target_config = build.config.target_config.get(target);
+    let target_config = build.config.target_config.get(&target);
     if let Some(s) = target_config.and_then(|c| c.llvm_config.as_ref()) {
         cargo.env("CFG_LLVM_ROOT", s);
     }
@@ -298,41 +563,59 @@ pub fn rustc(build: &Build, target: &str, compiler: &Compiler) {
     if let Some(ref s) = build.config.rustc_default_ar {
         cargo.env("CFG_DEFAULT_AR", s);
     }
-
-    run_cargo(build,
-              &mut cargo,
-              &librustc_stamp(build, compiler, target));
 }
 
-/// Same as `std_link`, only for librustc
-pub fn rustc_link(build: &Build,
-                  compiler: &Compiler,
-                  target_compiler: &Compiler,
-                  target: &str) {
+#[derive(Debug, Copy, Clone, PartialEq, Eq, Hash)]
+struct RustcLink {
+    pub compiler: Compiler,
+    pub target_compiler: Compiler,
+    pub target: Interned<String>,
+}
+
+impl Step for RustcLink {
+    type Output = ();
+
+    fn should_run(run: ShouldRun) -> ShouldRun {
+        run.never()
+    }
+
+    /// Same as `std_link`, only for librustc
+    fn run(self, builder: &Builder) {
+        let build = builder.build;
+        let compiler = self.compiler;
+        let target_compiler = self.target_compiler;
+        let target = self.target;
         println!("Copying stage{} rustc from stage{} ({} -> {} / {})",
                  target_compiler.stage,
                  compiler.stage,
-                 compiler.host,
+                 &compiler.host,
                  target_compiler.host,
                  target);
-    add_to_sysroot(&build.sysroot_libdir(target_compiler, target),
-               &librustc_stamp(build, compiler, target));
+        add_to_sysroot(&builder.sysroot_libdir(target_compiler, target),
+                       &librustc_stamp(build, compiler, target));
+        builder.ensure(tool::CleanTools {
+            compiler: target_compiler,
+            target,
+            mode: Mode::Librustc,
+        });
+    }
 }
 
 /// Cargo's output path for the standard library in a given stage, compiled
 /// by a particular compiler for the specified target.
-fn libstd_stamp(build: &Build, compiler: &Compiler, target: &str) -> PathBuf {
+pub fn libstd_stamp(build: &Build, compiler: Compiler, target: Interned<String>) -> PathBuf {
     build.cargo_out(compiler, Mode::Libstd, target).join(".libstd.stamp")
 }
 
 /// Cargo's output path for libtest in a given stage, compiled by a particular
 /// compiler for the specified target.
-fn libtest_stamp(build: &Build, compiler: &Compiler, target: &str) -> PathBuf {
+pub fn libtest_stamp(build: &Build, compiler: Compiler, target: Interned<String>) -> PathBuf {
     build.cargo_out(compiler, Mode::Libtest, target).join(".libtest.stamp")
 }
 
 /// Cargo's output path for librustc in a given stage, compiled by a particular
 /// compiler for the specified target.
-fn librustc_stamp(build: &Build, compiler: &Compiler, target: &str) -> PathBuf {
+pub fn librustc_stamp(build: &Build, compiler: Compiler, target: Interned<String>) -> PathBuf {
     build.cargo_out(compiler, Mode::Librustc, target).join(".librustc.stamp")
 }
@ -342,36 +625,114 @@ fn compiler_file(compiler: &Path, file: &str) -> PathBuf {
PathBuf::from(out.trim()) PathBuf::from(out.trim())
} }
pub fn create_sysroot(build: &Build, compiler: &Compiler) { #[derive(Debug, Copy, Clone, PartialEq, Eq, Hash)]
let sysroot = build.sysroot(compiler); pub struct Sysroot {
let _ = fs::remove_dir_all(&sysroot); pub compiler: Compiler,
t!(fs::create_dir_all(&sysroot));
} }
/// Prepare a new compiler from the artifacts in `stage` impl Step for Sysroot {
/// type Output = Interned<PathBuf>;
/// This will assemble a compiler in `build/$host/stage$stage`. The compiler
/// must have been previously produced by the `stage - 1` build.build fn should_run(run: ShouldRun) -> ShouldRun {
/// compiler. run.never()
pub fn assemble_rustc(build: &Build, stage: u32, host: &str) {
// nothing to do in stage0
if stage == 0 {
return
} }
println!("Copying stage{} compiler ({})", stage, host); /// Returns the sysroot for the `compiler` specified that *this build system
/// generates*.
///
/// That is, the sysroot for the stage0 compiler is not what the compiler
/// thinks it is by default, but it's the same as the default for stages
/// 1-3.
fn run(self, builder: &Builder) -> Interned<PathBuf> {
let build = builder.build;
let compiler = self.compiler;
let sysroot = if compiler.stage == 0 {
build.out.join(&compiler.host).join("stage0-sysroot")
} else {
build.out.join(&compiler.host).join(format!("stage{}", compiler.stage))
};
let _ = fs::remove_dir_all(&sysroot);
t!(fs::create_dir_all(&sysroot));
INTERNER.intern_path(sysroot)
}
}
-    // The compiler that we're assembling
-    let target_compiler = Compiler::new(stage, host);
+#[derive(Debug, Copy, Clone, PartialEq, Eq, Hash)]
+pub struct Assemble {
+    /// The compiler which we will produce in this step. Assemble itself will
+    /// take care of ensuring that the necessary prerequisites to do so exist,
+    /// that is, this target can be a stage2 compiler and Assemble will build
+    /// previous stages for you.
+    pub target_compiler: Compiler,
+}
-    // The compiler that compiled the compiler we're assembling
-    let build_compiler = Compiler::new(stage - 1, &build.build);
+impl Step for Assemble {
+    type Output = Compiler;
+
+    fn should_run(run: ShouldRun) -> ShouldRun {
+        run.path("src/rustc")
+    }
+
+    /// Prepare a new compiler from the artifacts in `stage`
+    ///
+    /// This will assemble a compiler in `build/$host/stage$stage`. The compiler
+    /// must have been previously produced by the `stage - 1` build.build
+    /// compiler.
+    fn run(self, builder: &Builder) -> Compiler {
+        let build = builder.build;
+        let target_compiler = self.target_compiler;
+
+        if target_compiler.stage == 0 {
+            assert_eq!(build.build, target_compiler.host,
+                "Cannot obtain compiler for non-native build triple at stage 0");
+            // The stage 0 compiler for the build triple is always pre-built.
+            return target_compiler;
+        }
+
+        // Get the compiler that we'll use to bootstrap ourselves.
+        let build_compiler = if target_compiler.host != build.build {
+            // Build a compiler for the host platform. We cannot use the stage0
+            // compiler for the host platform for this because it doesn't have
+            // the libraries we need. FIXME: Perhaps we should download those
+            // libraries? It would make builds faster...
+            // FIXME: It may be faster if we build just a stage 1
+            // compiler and then use that to bootstrap this compiler
+            // forward.
+            builder.compiler(target_compiler.stage - 1, build.build)
+        } else {
+            // Build the compiler we'll use to build the stage requested. This
+            // may build more than one compiler (going down to stage 0).
+            builder.compiler(target_compiler.stage - 1, target_compiler.host)
+        };
+
+        // Build the libraries for this compiler to link to (i.e., the libraries
+        // it uses at runtime). NOTE: Crates the target compiler compiles don't
+        // link to these. (FIXME: Is that correct? It seems to be correct most
+        // of the time but I think we do link to these for stage2/bin compilers
+        // when not performing a full bootstrap).
+        if builder.build.config.keep_stage.map_or(false, |s| target_compiler.stage <= s) {
+            builder.verbose("skipping compilation of compiler due to --keep-stage");
+            let compiler = build_compiler;
+            for stage in 0..min(target_compiler.stage, builder.config.keep_stage.unwrap()) {
+                let target_compiler = builder.compiler(stage, target_compiler.host);
+                let target = target_compiler.host;
+                builder.ensure(StdLink { compiler, target_compiler, target });
+                builder.ensure(TestLink { compiler, target_compiler, target });
+                builder.ensure(RustcLink { compiler, target_compiler, target });
+            }
+        } else {
+            builder.ensure(Rustc { compiler: build_compiler, target: target_compiler.host });
+        }
+
+        let stage = target_compiler.stage;
+        let host = target_compiler.host;
+        println!("Assembling stage{} compiler ({})", stage, host);
         // Link in all dylibs to the libdir
-    let sysroot = build.sysroot(&target_compiler);
-    let sysroot_libdir = sysroot.join(libdir(host));
+        let sysroot = builder.sysroot(target_compiler);
+        let sysroot_libdir = sysroot.join(libdir(&*host));
         t!(fs::create_dir_all(&sysroot_libdir));
-    let src_libdir = build.sysroot_libdir(&build_compiler, host);
+        let src_libdir = builder.sysroot_libdir(build_compiler, host);
         for f in t!(fs::read_dir(&src_libdir)).map(|f| t!(f)) {
             let filename = f.file_name().into_string().unwrap();
             if is_dylib(&filename) {
@@ -379,23 +740,17 @@ pub fn assemble_rustc(build: &Build, stage: u32, host: &str) {
             }
         }

-    let out_dir = build.cargo_out(&build_compiler, Mode::Librustc, host);
+        let out_dir = build.cargo_out(build_compiler, Mode::Librustc, host);

         // Link the compiler binary itself into place
-    let rustc = out_dir.join(exe("rustc", host));
+        let rustc = out_dir.join(exe("rustc", &*host));
         let bindir = sysroot.join("bin");
         t!(fs::create_dir_all(&bindir));
-    let compiler = build.compiler_path(&target_compiler);
+        let compiler = builder.rustc(target_compiler);
         let _ = fs::remove_file(&compiler);
         copy(&rustc, &compiler);

-    // See if rustdoc exists to link it into place
-    let rustdoc = exe("rustdoc", host);
-    let rustdoc_src = out_dir.join(&rustdoc);
-    let rustdoc_dst = bindir.join(&rustdoc);
-    if fs::metadata(&rustdoc_src).is_ok() {
-        let _ = fs::remove_file(&rustdoc_dst);
-        copy(&rustdoc_src, &rustdoc_dst);
-    }
-}
+        target_compiler
+    }
+}
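The `Assemble` step above encodes the bootstrap rule: a stage N compiler is produced by a stage N-1 compiler, and a cross-compiler (host different from the build triple) is bootstrapped from compilers that run on the build triple. A sketch of which `(stage, host)` compilers that rule implies, using a hypothetical helper rather than rustbuild's API:

```rust
// Returns the (stage, host) compilers that must exist to assemble the
// requested one, earliest first: everything below the requested stage runs
// on the build triple, per Assemble's build_compiler selection.
fn bootstrap_chain(stage: u32, host: &str, build_triple: &str) -> Vec<(u32, String)> {
    let mut chain = vec![(stage, host.to_string())];
    let mut stage = stage;
    while stage > 0 {
        stage -= 1;
        chain.push((stage, build_triple.to_string()));
    }
    chain.reverse();
    chain
}

fn main() {
    // Cross stage2 compiler: stage0 and stage1 stay on the build triple.
    let chain = bootstrap_chain(2, "arm-linux-androideabi", "x86_64-unknown-linux-gnu");
    assert_eq!(chain, vec![
        (0, "x86_64-unknown-linux-gnu".to_string()),
        (1, "x86_64-unknown-linux-gnu".to_string()),
        (2, "arm-linux-androideabi".to_string()),
    ]);
}
```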
@@ -418,64 +773,6 @@ fn add_to_sysroot(sysroot_dst: &Path, stamp: &Path) {
     }
 }
-/// Build a tool in `src/tools`
-///
-/// This will build the specified tool with the specified `host` compiler in
-/// `stage` into the normal cargo output directory.
-pub fn maybe_clean_tools(build: &Build, stage: u32, target: &str, mode: Mode) {
-    let compiler = Compiler::new(stage, &build.build);
-
-    let stamp = match mode {
-        Mode::Libstd => libstd_stamp(build, &compiler, target),
-        Mode::Libtest => libtest_stamp(build, &compiler, target),
-        Mode::Librustc => librustc_stamp(build, &compiler, target),
-        _ => panic!(),
-    };
-    let out_dir = build.cargo_out(&compiler, Mode::Tool, target);
-    build.clear_if_dirty(&out_dir, &stamp);
-}
-
-/// Build a tool in `src/tools`
-///
-/// This will build the specified tool with the specified `host` compiler in
-/// `stage` into the normal cargo output directory.
-pub fn tool(build: &Build, stage: u32, target: &str, tool: &str) {
-    let _folder = build.fold_output(|| format!("stage{}-{}", stage, tool));
-    println!("Building stage{} tool {} ({})", stage, tool, target);
-
-    let compiler = Compiler::new(stage, &build.build);
-
-    let mut cargo = build.cargo(&compiler, Mode::Tool, target, "build");
-    let dir = build.src.join("src/tools").join(tool);
-    cargo.arg("--manifest-path").arg(dir.join("Cargo.toml"));
-
-    // We don't want to build tools dynamically as they'll be running across
-    // stages and such and it's just easier if they're not dynamically linked.
-    cargo.env("RUSTC_NO_PREFER_DYNAMIC", "1");
-
-    if let Some(dir) = build.openssl_install_dir(target) {
-        cargo.env("OPENSSL_STATIC", "1");
-        cargo.env("OPENSSL_DIR", dir);
-        cargo.env("LIBZ_SYS_STATIC", "1");
-    }
-
-    cargo.env("CFG_RELEASE_CHANNEL", &build.config.channel);
-
-    let info = GitInfo::new(&dir);
-    if let Some(sha) = info.sha() {
-        cargo.env("CFG_COMMIT_HASH", sha);
-    }
-    if let Some(sha_short) = info.sha_short() {
-        cargo.env("CFG_SHORT_COMMIT_HASH", sha_short);
-    }
-    if let Some(date) = info.commit_date() {
-        cargo.env("CFG_COMMIT_DATE", date);
-    }
-
-    build.run(&mut cargo);
-}
 // Avoiding a dependency on winapi to keep compile times down
 #[cfg(unix)]
 fn stderr_isatty() -> bool {
@@ -535,18 +832,18 @@ fn run_cargo(build: &Build, cargo: &mut Command, stamp: &Path) {
     let stdout = BufReader::new(child.stdout.take().unwrap());
     for line in stdout.lines() {
         let line = t!(line);
-        let json = if line.starts_with("{") {
-            t!(line.parse::<json::Json>())
+        let json: serde_json::Value = if line.starts_with("{") {
+            t!(serde_json::from_str(&line))
         } else {
             // If this was informational, just print it out and continue
             println!("{}", line);
             continue
         };
-        if json.find("reason").and_then(|j| j.as_string()) != Some("compiler-artifact") {
+        if json["reason"].as_str() != Some("compiler-artifact") {
             continue
         }
         for filename in json["filenames"].as_array().unwrap() {
-            let filename = filename.as_string().unwrap();
+            let filename = filename.as_str().unwrap();
             // Skip files like executables
             if !filename.ends_with(".rlib") &&
                !filename.ends_with(".lib") &&
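The rewritten loop now parses cargo's `--message-format=json` stream with serde_json and keeps only `compiler-artifact` records. A dependency-free sketch of the same filtering, using a naive substring check in place of real JSON parsing (an approximation, not the actual implementation):

```rust
// True for cargo JSON records describing produced compiler artifacts;
// informational (non-JSON) lines and other record kinds are skipped.
fn is_artifact_message(line: &str) -> bool {
    line.starts_with('{') && line.contains("\"reason\":\"compiler-artifact\"")
}

fn main() {
    assert!(is_artifact_message(
        r#"{"reason":"compiler-artifact","filenames":["libstd.rlib"]}"#));
    assert!(!is_artifact_message("Compiling std v0.0.0"));
    assert!(!is_artifact_message(r#"{"reason":"build-script-executed"}"#));
}
```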

View File

@@ -19,11 +19,14 @@ use std::fs::{self, File};
 use std::io::prelude::*;
 use std::path::PathBuf;
 use std::process;
+use std::cmp;

 use num_cpus;
-use rustc_serialize::Decodable;
-use toml::{Parser, Decoder, Value};
+use toml;
 use util::{exe, push_exe_path};
+use cache::{INTERNER, Interned};
+use flags::Flags;
+pub use flags::Subcommand;
 /// Global configuration for the entire build and/or bootstrap.
 ///
@@ -35,7 +38,7 @@ use util::{exe, push_exe_path};
 /// Note that this structure is not decoded directly into, but rather it is
 /// filled out from the decoded forms of the structs below. For documentation
 /// each field, see the corresponding fields in
-/// `src/bootstrap/config.toml.example`.
+/// `config.toml.example`.
 #[derive(Default)]
 pub struct Config {
     pub ccache: Option<String>,
@@ -46,13 +49,25 @@ pub struct Config {
     pub docs: bool,
     pub locked_deps: bool,
     pub vendor: bool,
-    pub target_config: HashMap<String, Target>,
+    pub target_config: HashMap<Interned<String>, Target>,
     pub full_bootstrap: bool,
     pub extended: bool,
     pub sanitizers: bool,
     pub profiler: bool,
+    pub ignore_git: bool,
+
+    pub run_host_only: bool,
+
+    pub on_fail: Option<String>,
+    pub stage: Option<u32>,
+    pub keep_stage: Option<u32>,
+    pub src: PathBuf,
+    pub jobs: Option<u32>,
+    pub cmd: Subcommand,
+    pub incremental: bool,

     // llvm codegen options
+    pub llvm_enabled: bool,
     pub llvm_assertions: bool,
     pub llvm_optimize: bool,
     pub llvm_release_debuginfo: bool,
@@ -62,7 +77,6 @@ pub struct Config {
     pub llvm_targets: Option<String>,
     pub llvm_experimental_targets: Option<String>,
     pub llvm_link_jobs: Option<u32>,
-    pub llvm_clean_rebuild: bool,

     // rust codegen options
     pub rust_optimize: bool,
@@ -78,9 +92,9 @@ pub struct Config {
     pub rust_debuginfo_tests: bool,
     pub rust_dist_src: bool,

-    pub build: String,
-    pub host: Vec<String>,
-    pub target: Vec<String>,
+    pub build: Interned<String>,
+    pub hosts: Vec<Interned<String>>,
+    pub targets: Vec<Interned<String>>,
     pub local_rebuild: bool,

     // dist misc
@@ -129,6 +143,7 @@ pub struct Target {
     pub cc: Option<PathBuf>,
     pub cxx: Option<PathBuf>,
     pub ndk: Option<PathBuf>,
+    pub crt_static: Option<bool>,
     pub musl_root: Option<PathBuf>,
     pub qemu_rootfs: Option<PathBuf>,
 }
@@ -138,7 +153,8 @@ pub struct Target {
 /// This structure uses `Decodable` to automatically decode a TOML configuration
 /// file into this format, and then this is traversed and written into the above
 /// `Config` structure.
-#[derive(RustcDecodable, Default)]
+#[derive(Deserialize, Default)]
+#[serde(deny_unknown_fields, rename_all = "kebab-case")]
 struct TomlConfig {
     build: Option<Build>,
     install: Option<Install>,
@@ -149,10 +165,13 @@ struct TomlConfig {
 }

 /// TOML representation of various global build decisions.
-#[derive(RustcDecodable, Default, Clone)]
+#[derive(Deserialize, Default, Clone)]
+#[serde(deny_unknown_fields, rename_all = "kebab-case")]
 struct Build {
     build: Option<String>,
+    #[serde(default)]
     host: Vec<String>,
+    #[serde(default)]
     target: Vec<String>,
     cargo: Option<String>,
     rustc: Option<String>,
@@ -174,7 +193,8 @@ struct Build {
 }

 /// TOML representation of various global install decisions.
-#[derive(RustcDecodable, Default, Clone)]
+#[derive(Deserialize, Default, Clone)]
+#[serde(deny_unknown_fields, rename_all = "kebab-case")]
 struct Install {
     prefix: Option<String>,
     sysconfdir: Option<String>,
@@ -185,8 +205,10 @@ struct Install {
 }

 /// TOML representation of how the LLVM build is configured.
-#[derive(RustcDecodable, Default)]
+#[derive(Deserialize, Default)]
+#[serde(deny_unknown_fields, rename_all = "kebab-case")]
 struct Llvm {
+    enabled: Option<bool>,
     ccache: Option<StringOrBool>,
     ninja: Option<bool>,
     assertions: Option<bool>,
@@ -197,10 +219,10 @@ struct Llvm {
     targets: Option<String>,
     experimental_targets: Option<String>,
     link_jobs: Option<u32>,
-    clean_rebuild: Option<bool>,
 }

-#[derive(RustcDecodable, Default, Clone)]
+#[derive(Deserialize, Default, Clone)]
+#[serde(deny_unknown_fields, rename_all = "kebab-case")]
 struct Dist {
     sign_folder: Option<String>,
     gpg_password_file: Option<String>,
@@ -208,7 +230,8 @@ struct Dist {
     src_tarball: Option<bool>,
 }

-#[derive(RustcDecodable)]
+#[derive(Deserialize)]
+#[serde(untagged)]
 enum StringOrBool {
     String(String),
     Bool(bool),
@@ -221,7 +244,8 @@ impl Default for StringOrBool {
 }

 /// TOML representation of how the Rust build is configured.
-#[derive(RustcDecodable, Default)]
+#[derive(Deserialize, Default)]
+#[serde(deny_unknown_fields, rename_all = "kebab-case")]
 struct Rust {
     optimize: Option<bool>,
     codegen_units: Option<u32>,
@@ -240,23 +264,29 @@ struct Rust {
     optimize_tests: Option<bool>,
     debuginfo_tests: Option<bool>,
     codegen_tests: Option<bool>,
+    ignore_git: Option<bool>,
 }

 /// TOML representation of how each build target is configured.
-#[derive(RustcDecodable, Default)]
+#[derive(Deserialize, Default)]
+#[serde(deny_unknown_fields, rename_all = "kebab-case")]
 struct TomlTarget {
     llvm_config: Option<String>,
     jemalloc: Option<String>,
     cc: Option<String>,
     cxx: Option<String>,
     android_ndk: Option<String>,
+    crt_static: Option<bool>,
     musl_root: Option<String>,
     qemu_rootfs: Option<String>,
 }
 impl Config {
-    pub fn parse(build: &str, file: Option<PathBuf>) -> Config {
+    pub fn parse(args: &[String]) -> Config {
+        let flags = Flags::parse(&args);
+        let file = flags.config.clone();
         let mut config = Config::default();
+        config.llvm_enabled = true;
         config.llvm_optimize = true;
         config.use_jemalloc = true;
         config.backtrace = true;
@@ -266,52 +296,69 @@ impl Config {
         config.docs = true;
         config.rust_rpath = true;
         config.rust_codegen_units = 1;
-        config.build = build.to_string();
         config.channel = "dev".to_string();
         config.codegen_tests = true;
+        config.ignore_git = false;
         config.rust_dist_src = true;

+        config.on_fail = flags.on_fail;
+        config.stage = flags.stage;
+        config.src = flags.src;
+        config.jobs = flags.jobs;
+        config.cmd = flags.cmd;
+        config.incremental = flags.incremental;
+        config.keep_stage = flags.keep_stage;
+
+        // If --target was specified but --host wasn't specified, don't run any host-only tests.
+        config.run_host_only = flags.host.is_empty() && !flags.target.is_empty();
+
         let toml = file.map(|file| {
             let mut f = t!(File::open(&file));
-            let mut toml = String::new();
-            t!(f.read_to_string(&mut toml));
-            let mut p = Parser::new(&toml);
-            let table = match p.parse() {
-                Some(table) => table,
-                None => {
-                    println!("failed to parse TOML configuration '{}':", file.to_str().unwrap());
-                    for err in p.errors.iter() {
-                        let (loline, locol) = p.to_linecol(err.lo);
-                        let (hiline, hicol) = p.to_linecol(err.hi);
-                        println!("{}:{}-{}:{}: {}", loline, locol, hiline,
-                                 hicol, err.desc);
-                    }
-                    process::exit(2);
-                }
-            };
-            let mut d = Decoder::new(Value::Table(table));
-            match Decodable::decode(&mut d) {
-                Ok(cfg) => cfg,
-                Err(e) => {
-                    println!("failed to decode TOML: {}", e);
-                    process::exit(2);
-                }
-            }
+            let mut contents = String::new();
+            t!(f.read_to_string(&mut contents));
+            match toml::from_str(&contents) {
+                Ok(table) => table,
+                Err(err) => {
+                    println!("failed to parse TOML configuration '{}': {}",
+                        file.display(), err);
+                    process::exit(2);
+                }
+            }
         }).unwrap_or_else(|| TomlConfig::default());
         let build = toml.build.clone().unwrap_or(Build::default());
-        set(&mut config.build, build.build.clone());
-        config.host.push(config.build.clone());
+        set(&mut config.build, build.build.clone().map(|x| INTERNER.intern_string(x)));
+        set(&mut config.build, flags.build);
+        if config.build.is_empty() {
+            // set by bootstrap.py
+            config.build = INTERNER.intern_str(&env::var("BUILD").unwrap());
+        }
+        config.hosts.push(config.build.clone());
         for host in build.host.iter() {
-            if !config.host.contains(host) {
-                config.host.push(host.clone());
+            let host = INTERNER.intern_str(host);
+            if !config.hosts.contains(&host) {
+                config.hosts.push(host);
             }
         }
-        for target in config.host.iter().chain(&build.target) {
-            if !config.target.contains(target) {
-                config.target.push(target.clone());
+        for target in config.hosts.iter().cloned()
+            .chain(build.target.iter().map(|s| INTERNER.intern_str(s)))
+        {
+            if !config.targets.contains(&target) {
+                config.targets.push(target);
             }
         }
+        config.hosts = if !flags.host.is_empty() {
+            flags.host
+        } else {
+            config.hosts
+        };
+        config.targets = if !flags.target.is_empty() {
+            flags.target
+        } else {
+            config.targets
+        };
         config.nodejs = build.nodejs.map(PathBuf::from);
         config.gdb = build.gdb.map(PathBuf::from);
         config.python = build.python.map(PathBuf::from);
@@ -327,6 +374,7 @@ impl Config {
         set(&mut config.sanitizers, build.sanitizers);
         set(&mut config.profiler, build.profiler);
         set(&mut config.openssl_static, build.openssl_static);
+        config.verbose = cmp::max(config.verbose, flags.verbose);

         if let Some(ref install) = toml.install {
             config.prefix = install.prefix.clone().map(PathBuf::from);
@@ -348,12 +396,12 @@ impl Config {
                 Some(StringOrBool::Bool(false)) | None => {}
             }
             set(&mut config.ninja, llvm.ninja);
+            set(&mut config.llvm_enabled, llvm.enabled);
             set(&mut config.llvm_assertions, llvm.assertions);
             set(&mut config.llvm_optimize, llvm.optimize);
             set(&mut config.llvm_release_debuginfo, llvm.release_debuginfo);
             set(&mut config.llvm_version_check, llvm.version_check);
             set(&mut config.llvm_static_stdcpp, llvm.static_libstdcpp);
-            set(&mut config.llvm_clean_rebuild, llvm.clean_rebuild);
             config.llvm_targets = llvm.targets.clone();
             config.llvm_experimental_targets = llvm.experimental_targets.clone();
             config.llvm_link_jobs = llvm.link_jobs;
@@ -373,6 +421,7 @@ impl Config {
             set(&mut config.use_jemalloc, rust.use_jemalloc);
             set(&mut config.backtrace, rust.backtrace);
             set(&mut config.channel, rust.channel.clone());
+            set(&mut config.ignore_git, rust.ignore_git);
             config.rustc_default_linker = rust.default_linker.clone();
             config.rustc_default_ar = rust.default_ar.clone();
             config.musl_root = rust.musl_root.clone().map(PathBuf::from);
@@ -399,10 +448,11 @@ impl Config {
                 }
                 target.cxx = cfg.cxx.clone().map(PathBuf::from);
                 target.cc = cfg.cc.clone().map(PathBuf::from);
+                target.crt_static = cfg.crt_static.clone();
                 target.musl_root = cfg.musl_root.clone().map(PathBuf::from);
                 target.qemu_rootfs = cfg.qemu_rootfs.clone().map(PathBuf::from);

-                config.target_config.insert(triple.clone(), target);
+                config.target_config.insert(INTERNER.intern_string(triple.clone()), target);
             }
         }
@@ -478,7 +528,6 @@ impl Config {
             ("LLVM_VERSION_CHECK", self.llvm_version_check),
             ("LLVM_STATIC_STDCPP", self.llvm_static_stdcpp),
             ("LLVM_LINK_SHARED", self.llvm_link_shared),
-            ("LLVM_CLEAN_REBUILD", self.llvm_clean_rebuild),
             ("OPTIMIZE", self.rust_optimize),
             ("DEBUG_ASSERTIONS", self.rust_debug_assertions),
             ("DEBUGINFO", self.rust_debuginfo),
@@ -504,13 +553,13 @@ impl Config {
         }

         match key {
-            "CFG_BUILD" if value.len() > 0 => self.build = value.to_string(),
+            "CFG_BUILD" if value.len() > 0 => self.build = INTERNER.intern_str(value),
             "CFG_HOST" if value.len() > 0 => {
-                self.host.extend(value.split(" ").map(|s| s.to_string()));
+                self.hosts.extend(value.split(" ").map(|s| INTERNER.intern_str(s)));
             }
             "CFG_TARGET" if value.len() > 0 => {
-                self.target.extend(value.split(" ").map(|s| s.to_string()));
+                self.targets.extend(value.split(" ").map(|s| INTERNER.intern_str(s)));
             }
             "CFG_EXPERIMENTAL_TARGETS" if value.len() > 0 => {
                 self.llvm_experimental_targets = Some(value.to_string());
@@ -519,33 +568,28 @@ impl Config {
                 self.musl_root = Some(parse_configure_path(value));
             }
             "CFG_MUSL_ROOT_X86_64" if value.len() > 0 => {
-                let target = "x86_64-unknown-linux-musl".to_string();
-                let target = self.target_config.entry(target)
-                                 .or_insert(Target::default());
+                let target = INTERNER.intern_str("x86_64-unknown-linux-musl");
+                let target = self.target_config.entry(target).or_insert(Target::default());
                 target.musl_root = Some(parse_configure_path(value));
             }
             "CFG_MUSL_ROOT_I686" if value.len() > 0 => {
-                let target = "i686-unknown-linux-musl".to_string();
-                let target = self.target_config.entry(target)
-                                 .or_insert(Target::default());
+                let target = INTERNER.intern_str("i686-unknown-linux-musl");
+                let target = self.target_config.entry(target).or_insert(Target::default());
                 target.musl_root = Some(parse_configure_path(value));
             }
             "CFG_MUSL_ROOT_ARM" if value.len() > 0 => {
-                let target = "arm-unknown-linux-musleabi".to_string();
-                let target = self.target_config.entry(target)
-                                 .or_insert(Target::default());
+                let target = INTERNER.intern_str("arm-unknown-linux-musleabi");
+                let target = self.target_config.entry(target).or_insert(Target::default());
                 target.musl_root = Some(parse_configure_path(value));
             }
             "CFG_MUSL_ROOT_ARMHF" if value.len() > 0 => {
-                let target = "arm-unknown-linux-musleabihf".to_string();
-                let target = self.target_config.entry(target)
-                                 .or_insert(Target::default());
+                let target = INTERNER.intern_str("arm-unknown-linux-musleabihf");
+                let target = self.target_config.entry(target).or_insert(Target::default());
                 target.musl_root = Some(parse_configure_path(value));
             }
             "CFG_MUSL_ROOT_ARMV7" if value.len() > 0 => {
-                let target = "armv7-unknown-linux-musleabihf".to_string();
-                let target = self.target_config.entry(target)
-                                 .or_insert(Target::default());
+                let target = INTERNER.intern_str("armv7-unknown-linux-musleabihf");
+                let target = self.target_config.entry(target).or_insert(Target::default());
                 target.musl_root = Some(parse_configure_path(value));
             }
             "CFG_DEFAULT_AR" if value.len() > 0 => {
@@ -593,33 +637,28 @@ impl Config {
                 target.jemalloc = Some(parse_configure_path(value).join("libjemalloc_pic.a"));
             }
             "CFG_ARM_LINUX_ANDROIDEABI_NDK" if value.len() > 0 => {
-                let target = "arm-linux-androideabi".to_string();
-                let target = self.target_config.entry(target)
-                                 .or_insert(Target::default());
+                let target = INTERNER.intern_str("arm-linux-androideabi");
+                let target = self.target_config.entry(target).or_insert(Target::default());
                 target.ndk = Some(parse_configure_path(value));
             }
             "CFG_ARMV7_LINUX_ANDROIDEABI_NDK" if value.len() > 0 => {
-                let target = "armv7-linux-androideabi".to_string();
-                let target = self.target_config.entry(target)
-                                 .or_insert(Target::default());
+                let target = INTERNER.intern_str("armv7-linux-androideabi");
+                let target = self.target_config.entry(target).or_insert(Target::default());
                 target.ndk = Some(parse_configure_path(value));
             }
             "CFG_I686_LINUX_ANDROID_NDK" if value.len() > 0 => {
-                let target = "i686-linux-android".to_string();
-                let target = self.target_config.entry(target)
-                                 .or_insert(Target::default());
+                let target = INTERNER.intern_str("i686-linux-android");
+                let target = self.target_config.entry(target).or_insert(Target::default());
                 target.ndk = Some(parse_configure_path(value));
             }
             "CFG_AARCH64_LINUX_ANDROID_NDK" if value.len() > 0 => {
-                let target = "aarch64-linux-android".to_string();
-                let target = self.target_config.entry(target)
-                                 .or_insert(Target::default());
+                let target = INTERNER.intern_str("aarch64-linux-android");
+                let target = self.target_config.entry(target).or_insert(Target::default());
                 target.ndk = Some(parse_configure_path(value));
             }
             "CFG_X86_64_LINUX_ANDROID_NDK" if value.len() > 0 => {
-                let target = "x86_64-linux-android".to_string();
-                let target = self.target_config.entry(target)
-                                 .or_insert(Target::default());
+                let target = INTERNER.intern_str("x86_64-linux-android");
+                let target = self.target_config.entry(target).or_insert(Target::default());
                 target.ndk = Some(parse_configure_path(value));
             }
             "CFG_LOCAL_RUST_ROOT" if value.len() > 0 => {
@@ -643,9 +682,13 @@ impl Config {
                     .collect();
             }
             "CFG_QEMU_ARMHF_ROOTFS" if value.len() > 0 => {
-                let target = "arm-unknown-linux-gnueabihf".to_string();
-                let target = self.target_config.entry(target)
-                                 .or_insert(Target::default());
+                let target = INTERNER.intern_str("arm-unknown-linux-gnueabihf");
+                let target = self.target_config.entry(target).or_insert(Target::default());
+                target.qemu_rootfs = Some(parse_configure_path(value));
+            }
+            "CFG_QEMU_AARCH64_ROOTFS" if value.len() > 0 => {
+                let target = INTERNER.intern_str("aarch64-unknown-linux-gnu");
+                let target = self.target_config.entry(target).or_insert(Target::default());
                 target.qemu_rootfs = Some(parse_configure_path(value));
             }
             _ => {}

View File

@ -1,333 +0,0 @@
# Sample TOML configuration file for building Rust.
#
# To configure rustbuild, copy this file to the directory from which you will be
# running the build, and name it config.toml.
#
# All options are commented out by default in this file, and they're commented
# out with their default values. The build system by default looks for
# `config.toml` in the current directory of a build for build configuration, but
# a custom configuration file can also be specified with `--config` to the build
# system.
# =============================================================================
# Tweaking how LLVM is compiled
# =============================================================================
[llvm]
# Indicates whether the LLVM build is a Release or Debug build
#optimize = true
# Indicates whether an LLVM Release build should include debug info
#release-debuginfo = false
# Indicates whether the LLVM assertions are enabled or not
#assertions = false
# Indicates whether ccache is used when building LLVM
#ccache = false
# or alternatively ...
#ccache = "/path/to/ccache"
# If an external LLVM root is specified, we automatically check the version by
# default to make sure it's within the range that we're expecting, but setting
# this flag will indicate that this version check should not be done.
#version-check = false
# Link libstdc++ statically into the librustc_llvm instead of relying on a
# dynamic version to be available.
#static-libstdcpp = false
# Tell the LLVM build system to use Ninja instead of the platform default for
# the generated build system. This can sometimes be faster than make, for
# example.
#ninja = false
# LLVM targets to build support for.
# Note: this is NOT related to Rust compilation targets. However, as Rust is
# dependent on LLVM for code generation, turning targets off here WILL lead to
# the resulting rustc being unable to compile for the disabled architectures.
# Also worth pointing out is that, in case support for new targets is added to
# LLVM, enabling them here doesn't mean Rust is automatically gaining said
# support. You'll need to write a target specification at least, and most
# likely, teach rustc about the C ABI of the target. Get in touch with the
# Rust team and file an issue if you need assistance in porting!
#targets = "X86;ARM;AArch64;Mips;PowerPC;SystemZ;JSBackend;MSP430;Sparc;NVPTX;Hexagon"
# LLVM experimental targets to build support for. These targets are specified in
# the same format as above, but since these targets are experimental, they are
# not built by default and the experimental Rust compilation targets that depend
# on them will not work unless the user opts in to building them. Possible
# experimental LLVM targets include WebAssembly for the
# wasm32-experimental-emscripten Rust target.
#experimental-targets = ""
# Cap the number of parallel linker invocations when compiling LLVM.
# This can be useful when building LLVM with debug info, which significantly
# increases the size of binaries and consequently the memory required by
# each linker process.
# If absent or 0, linker invocations are treated like any other job and
# controlled by rustbuild's -j parameter.
#link-jobs = 0
# Delete LLVM build directory on LLVM rebuild.
# This option defaults to `false` for local development, but CI may want to
# always perform clean full builds (possibly accelerated by (s)ccache).
#clean-rebuild = false
# =============================================================================
# General build configuration options
# =============================================================================
[build]
# Build triple for the original snapshot compiler. This must be a compiler that
# nightlies are already produced for. The current platform must be able to run
# binaries of this build triple and the nightly will be used to bootstrap the
# first compiler.
#build = "x86_64-unknown-linux-gnu" # defaults to your host platform
# In addition to the build triple, other triples to produce full compiler
# toolchains for. Each of these triples will be bootstrapped from the build
# triple and then will continue to bootstrap themselves. This platform must
# currently be able to run all of the triples provided here.
#host = ["x86_64-unknown-linux-gnu"] # defaults to just the build triple
# In addition to all host triples, other triples to produce the standard library
# for. Each host triple will be used to produce a copy of the standard library
# for each target triple.
#target = ["x86_64-unknown-linux-gnu"] # defaults to just the build triple
# Instead of downloading the src/stage0.txt version of Cargo specified, use
# this Cargo binary instead to build all Rust code
#cargo = "/path/to/bin/cargo"
# Instead of downloading the src/stage0.txt version of the compiler
# specified, use this rustc binary instead as the stage0 snapshot compiler.
#rustc = "/path/to/bin/rustc"
# Flag to specify whether any documentation is built. If false, rustdoc and
# friends will still be compiled but they will not be used to generate any
# documentation.
#docs = true
# Indicate whether the compiler should be documented in addition to the standard
# library and facade crates.
#compiler-docs = false
# Indicate whether submodules are managed and updated automatically.
#submodules = true
# The path to (or name of) the GDB executable to use. This is only used for
# executing the debuginfo test suite.
#gdb = "gdb"
# The node.js executable to use. Note that this is only used for the emscripten
# target when running tests, otherwise this can be omitted.
#nodejs = "node"
# Python interpreter to use for various tasks throughout the build, notably
# rustdoc tests, the lldb python interpreter, and some dist bits and pieces.
# Note that Python 2 is currently required.
#python = "python2.7"
# Force Cargo to check that Cargo.lock describes the precise dependency
# set that all the Cargo.toml files create, instead of updating it.
#locked-deps = false
# Indicate whether the vendored sources are used for Rust dependencies or not
#vendor = false
# Typically the build system will build the rust compiler twice. The second
# compiler, however, will simply use its own libraries to link against. If you
# would rather to perform a full bootstrap, compiling the compiler three times,
# then you can set this option to true. You shouldn't ever need to set this
# option to true.
#full-bootstrap = false
# Enable a build of the extended Rust tool set, which is not only the
# compiler but also tools such as Cargo. This will also produce "combined
# installers" which are used to install Rust and Cargo together. This is
# disabled by default.
#extended = false
# Verbosity level: 0 == not verbose, 1 == verbose, 2 == very verbose
#verbose = 0
# Build the sanitizer runtimes
#sanitizers = false
# Build the profiler runtime
#profiler = false
# Indicates whether the OpenSSL linked into Cargo will be statically linked or
# not. If static linkage is specified then the build system will download a
# known-good version of OpenSSL, compile it, and link it to Cargo.
#openssl-static = false
# Run the build with low priority, by setting the process group's "nice" value
# to +10 on Unix platforms, and by using a "low priority" job object on Windows.
#low-priority = false
# =============================================================================
# General install configuration options
# =============================================================================
[install]
# Instead of installing to /usr/local, install to this path instead.
#prefix = "/usr/local"
# Where to install system configuration files
# If this is a relative path, it will get installed in `prefix` above
#sysconfdir = "/etc"
# Where to install documentation in `prefix` above
#docdir = "share/doc/rust"
# Where to install binaries in `prefix` above
#bindir = "bin"
# Where to install libraries in `prefix` above
#libdir = "lib"
# Where to install man pages in `prefix` above
#mandir = "share/man"
# =============================================================================
# Options for compiling Rust code itself
# =============================================================================
[rust]
# Whether or not to optimize the compiler and standard library
# Note: the slowness of the non-optimized compiler compiling itself usually
# outweighs the time gains in not doing optimizations, therefore a
# full bootstrap takes much more time with optimize set to false.
#optimize = true
# Number of codegen units to use for each compiler invocation. A value of 0
# means "the number of cores on this machine", and 1+ is passed through to the
# compiler.
#codegen-units = 1
# Whether or not debug assertions are enabled for the compiler and standard
# library
#debug-assertions = false
# Whether or not debuginfo is emitted
#debuginfo = false
# Whether or not line number debug information is emitted
#debuginfo-lines = false
# Whether or not to only build debuginfo for the standard library if enabled.
# If enabled, this will not compile the compiler with debuginfo, just the
# standard library.
#debuginfo-only-std = false
# Whether or not jemalloc is built and enabled
#use-jemalloc = true
# Whether or not jemalloc is built with its debug option set
#debug-jemalloc = false
# Whether or not `panic!`s generate backtraces (RUST_BACKTRACE)
#backtrace = true
# The default linker that will be used by the generated compiler. Note that this
# is not the linker used to link said compiler.
#default-linker = "cc"
# The default ar utility that will be used by the generated compiler if LLVM
# cannot be used. Note that this is not used to assemble said compiler.
#default-ar = "ar"
# The "channel" for the Rust build to produce. The stable/beta channels only
# allow using stable features, whereas the nightly and dev channels allow using
# nightly features
#channel = "dev"
# By default the `rustc` executable is built with `-Wl,-rpath` flags on Unix
# platforms to ensure that the compiler is usable by default from the build
# directory (as it links to a number of dynamic libraries). This may not be
# desired in distributions, for example.
#rpath = true
# Flag indicating whether tests are compiled with optimizations (the -O flag) or
# with debuginfo (the -g flag)
#optimize-tests = true
#debuginfo-tests = true
# Flag indicating whether codegen tests will be run or not. If you get an error
# saying that the FileCheck executable is missing, you may want to disable this.
#codegen-tests = true
# =============================================================================
# Options for specific targets
#
# Each of the following options is scoped to the specific target triple in
# question and is used for determining how to compile each target.
# =============================================================================
[target.x86_64-unknown-linux-gnu]
# C compiler to be used to compile C code and link Rust code. Note that the
# default value is platform specific, and if not specified it may also depend on
# what platform is crossing to what platform.
#cc = "cc"
# C++ compiler to be used to compile C++ code (e.g. LLVM and our LLVM shims).
# This is only used for host targets.
#cxx = "c++"
# Path to the `llvm-config` binary of the installation of a custom LLVM to link
# against. Note that if this is specified we don't compile LLVM at all for this
# target.
#llvm-config = "../path/to/llvm/root/bin/llvm-config"
# Path to the custom jemalloc static library to link into the standard library
# by default. This is only used if jemalloc is still enabled above
#jemalloc = "/path/to/jemalloc/libjemalloc_pic.a"
# If this target is for Android, this option will be required to specify where
# the NDK for the target lives. This is used to find the C compiler to link and
# build native code.
#android-ndk = "/path/to/ndk"
# The root location of the MUSL installation directory. The library directory
# will also need to contain libunwind.a for an unwinding implementation. Note
# that this option only makes sense for MUSL targets that produce statically
# linked binaries
#musl-root = "..."
# =============================================================================
# Distribution options
#
# These options are related to distribution, mostly for the Rust project itself.
# You probably won't need to concern yourself with any of these options
# =============================================================================
[dist]
# This is the folder of artifacts that the build system will sign. All files in
# this directory will be signed with the default gpg key using the system `gpg`
# binary. The `asc` and `sha256` files will all be output into the standard dist
# output folder (currently `build/dist`)
#
# This folder should be populated ahead of time before the build system is
# invoked.
#sign-folder = "path/to/folder/to/sign"
# This is a file which contains the password of the default gpg key. This will
# be passed to `gpg` down the road when signing all files in `sign-folder`
# above. This should be stored in plaintext.
#gpg-password-file = "path/to/gpg/password"
# The remote address that all artifacts will eventually be uploaded to. The
# build system generates manifests which will point to these urls, and for the
# manifests to be correct they'll have to have the right URLs encoded.
#
# Note that this address should not contain a trailing slash as file names will
# be appended to it.
#upload-addr = "https://example.com/folder"
# Whether to build a plain source tarball to upload
# We disable this on Windows so as not to override the one already uploaded to S3,
# as the one built on Windows would contain backslashes in paths, causing problems
# on Linux.
#src-tarball = true
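Pulling a few of the options above together, a hypothetical minimal `config.toml` (every key shown is documented in the sample file above; the chosen values are illustrative, not recommendations) might look like:

```toml
[llvm]
assertions = true
ninja = true

[build]
docs = false
low-priority = true

[rust]
optimize = true
debuginfo-lines = true
```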

View File

@ -28,7 +28,11 @@ use build_helper::output;
 use {Build, Compiler, Mode};
 use channel;
-use util::{cp_r, libdir, is_dylib, cp_filtered, copy, exe};
+use util::{cp_r, libdir, is_dylib, cp_filtered, copy};
+use builder::{Builder, RunConfig, ShouldRun, Step};
+use compile;
+use tool::{self, Tool};
+use cache::{INTERNER, Interned};
 
 pub fn pkgname(build: &Build, component: &str) -> String {
     if component == "cargo" {
@ -49,21 +53,47 @@ pub fn tmpdir(build: &Build) -> PathBuf {
     build.out.join("tmp/dist")
 }
 
-fn rust_installer(build: &Build) -> Command {
-    build.tool_cmd(&Compiler::new(0, &build.build), "rust-installer")
+fn rust_installer(builder: &Builder) -> Command {
+    builder.tool_cmd(Tool::RustInstaller)
 }
 
-/// Builds the `rust-docs` installer component.
-///
-/// Slurps up documentation from the `stage`'s `host`.
-pub fn docs(build: &Build, stage: u32, host: &str) {
-    println!("Dist docs stage{} ({})", stage, host);
-    if !build.config.docs {
-        println!("\tskipping - docs disabled");
-        return
-    }
+#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
+pub struct Docs {
+    pub stage: u32,
+    pub host: Interned<String>,
+}
+
+impl Step for Docs {
+    type Output = PathBuf;
+    const DEFAULT: bool = true;
+    const ONLY_BUILD_TARGETS: bool = true;
+
+    fn should_run(run: ShouldRun) -> ShouldRun {
+        run.path("src/doc")
+    }
+
+    fn make_run(run: RunConfig) {
+        run.builder.ensure(Docs {
+            stage: run.builder.top_stage,
+            host: run.target,
+        });
+    }
+
+    /// Builds the `rust-docs` installer component.
+    fn run(self, builder: &Builder) -> PathBuf {
+        let build = builder.build;
+        let host = self.host;
 
     let name = pkgname(build, "rust-docs");
+
+    println!("Dist docs ({})", host);
+    if !build.config.docs {
+        println!("\tskipping - docs disabled");
+        return distdir(build).join(format!("{}-{}.tar.gz", name, host));
+    }
+
+    builder.default_doc(None);
 
     let image = tmpdir(build).join(format!("{}-{}-image", name, host));
     let _ = fs::remove_dir_all(&image);
@ -72,7 +102,7 @@ pub fn docs(build: &Build, stage: u32, host: &str) {
     let src = build.out.join(host).join("doc");
     cp_r(&src, &dst);
 
-    let mut cmd = rust_installer(build);
+    let mut cmd = rust_installer(builder);
     cmd.arg("generate")
        .arg("--product-name=Rust-Documentation")
        .arg("--rel-manifest-dir=rustlib")
@ -94,6 +124,9 @@ pub fn docs(build: &Build, stage: u32, host: &str) {
         t!(fs::create_dir_all(&dst));
         cp_r(&src, &dst);
     }
+
+        distdir(build).join(format!("{}-{}.tar.gz", name, host))
+    }
 }
 fn find_files(files: &[&str], path: &[PathBuf]) -> Vec<PathBuf> {
@ -115,7 +148,9 @@ fn find_files(files: &[&str], path: &[PathBuf]) -> Vec<PathBuf> {
     found
 }
-fn make_win_dist(rust_root: &Path, plat_root: &Path, target_triple: &str, build: &Build) {
+fn make_win_dist(
+    rust_root: &Path, plat_root: &Path, target_triple: Interned<String>, build: &Build
+) {
     //Ask gcc where it keeps its stuff
     let mut cmd = Command::new(build.cc(target_triple));
     cmd.arg("-print-search-dirs");
@ -222,11 +257,36 @@ fn make_win_dist(rust_root: &Path, plat_root: &Path, target_triple: &str, build:
     }
 }
-/// Build the `rust-mingw` installer component.
-///
-/// This contains all the bits and pieces to run the MinGW Windows targets
-/// without any extra installed software (e.g. we bundle gcc, libraries, etc).
-pub fn mingw(build: &Build, host: &str) {
+#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
+pub struct Mingw {
+    host: Interned<String>,
+}
+
+impl Step for Mingw {
+    type Output = Option<PathBuf>;
+    const DEFAULT: bool = true;
+    const ONLY_BUILD_TARGETS: bool = true;
+
+    fn should_run(run: ShouldRun) -> ShouldRun {
+        run.never()
+    }
+
+    fn make_run(run: RunConfig) {
+        run.builder.ensure(Mingw { host: run.target });
+    }
+
+    /// Build the `rust-mingw` installer component.
+    ///
+    /// This contains all the bits and pieces to run the MinGW Windows targets
+    /// without any extra installed software (e.g. we bundle gcc, libraries, etc).
+    fn run(self, builder: &Builder) -> Option<PathBuf> {
+        let build = builder.build;
+        let host = self.host;
+
+        if !host.contains("pc-windows-gnu") {
+            return None;
+        }
+
     println!("Dist mingw ({})", host);
     let name = pkgname(build, "rust-mingw");
     let image = tmpdir(build).join(format!("{}-{}-image", name, host));
@ -239,7 +299,7 @@ pub fn mingw(build: &Build, host: &str) {
     // (which is what we want).
     make_win_dist(&tmpdir(build), &image, host, &build);
 
-    let mut cmd = rust_installer(build);
+    let mut cmd = rust_installer(builder);
     cmd.arg("generate")
        .arg("--product-name=Rust-MinGW")
        .arg("--rel-manifest-dir=rustlib")
@ -252,11 +312,38 @@ pub fn mingw(build: &Build, host: &str) {
        .arg("--legacy-manifest-dirs=rustlib,cargo");
     build.run(&mut cmd);
     t!(fs::remove_dir_all(&image));
+
+        Some(distdir(build).join(format!("{}-{}.tar.gz", name, host)))
+    }
 }
 
-/// Creates the `rustc` installer component.
-pub fn rustc(build: &Build, stage: u32, host: &str) {
-    println!("Dist rustc stage{} ({})", stage, host);
+#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
+pub struct Rustc {
+    pub compiler: Compiler,
+}
+
+impl Step for Rustc {
+    type Output = PathBuf;
+    const DEFAULT: bool = true;
+    const ONLY_HOSTS: bool = true;
+    const ONLY_BUILD_TARGETS: bool = true;
+
+    fn should_run(run: ShouldRun) -> ShouldRun {
+        run.path("src/librustc")
+    }
+
+    fn make_run(run: RunConfig) {
+        run.builder.ensure(Rustc {
+            compiler: run.builder.compiler(run.builder.top_stage, run.target),
+        });
+    }
+
+    /// Creates the `rustc` installer component.
+    fn run(self, builder: &Builder) -> PathBuf {
+        let build = builder.build;
+        let compiler = self.compiler;
+        let host = self.compiler.host;
+
+        println!("Dist rustc stage{} ({})", compiler.stage, compiler.host);
     let name = pkgname(build, "rustc");
     let image = tmpdir(build).join(format!("{}-{}-image", name, host));
     let _ = fs::remove_dir_all(&image);
@ -264,7 +351,7 @@ pub fn rustc(build: &Build, stage: u32, host: &str) {
     let _ = fs::remove_dir_all(&overlay);
 
     // Prepare the rustc "image", what will actually end up getting installed
-    prepare_image(build, stage, host, &image);
+    prepare_image(builder, compiler, &image);
 
     // Prepare the overlay which is part of the tarball but won't actually be
     // installed
@ -298,7 +385,7 @@ pub fn rustc(build: &Build, stage: u32, host: &str) {
     }
 
     // Finally, wrap everything up in a nice tarball!
-    let mut cmd = rust_installer(build);
+    let mut cmd = rust_installer(builder);
     cmd.arg("generate")
        .arg("--product-name=Rust")
        .arg("--rel-manifest-dir=rustlib")
@ -314,14 +401,20 @@ pub fn rustc(build: &Build, stage: u32, host: &str) {
     t!(fs::remove_dir_all(&image));
     t!(fs::remove_dir_all(&overlay));
 
-    fn prepare_image(build: &Build, stage: u32, host: &str, image: &Path) {
-        let src = build.sysroot(&Compiler::new(stage, host));
-        let libdir = libdir(host);
+        return distdir(build).join(format!("{}-{}.tar.gz", name, host));
+
+        fn prepare_image(builder: &Builder, compiler: Compiler, image: &Path) {
+            let host = compiler.host;
+            let build = builder.build;
+            let src = builder.sysroot(compiler);
+            let libdir = libdir(&host);
 
         // Copy rustc/rustdoc binaries
         t!(fs::create_dir_all(image.join("bin")));
         cp_r(&src.join("bin"), &image.join("bin"));
+        install(&builder.rustdoc(compiler.host), &image.join("bin"), 0o755);
 
         // Copy runtime DLLs needed by the compiler
         if libdir != "bin" {
             for entry in t!(src.join(libdir).read_dir()).map(|e| t!(e)) {
@ -339,7 +432,10 @@ pub fn rustc(build: &Build, stage: u32, host: &str) {
         cp_r(&build.src.join("man"), &image.join("share/man/man1"));
 
         // Debugger scripts
-        debugger_scripts(build, &image, host);
+        builder.ensure(DebuggerScripts {
+            sysroot: INTERNER.intern_path(image.to_owned()),
+            host,
+        });
 
         // Misc license info
         let cp = |file: &str| {
@ -350,12 +446,34 @@ pub fn rustc(build: &Build, stage: u32, host: &str) {
         cp("LICENSE-MIT");
         cp("README.md");
     }
+    }
 }
 
-/// Copies debugger scripts for `host` into the `sysroot` specified.
-pub fn debugger_scripts(build: &Build,
-                        sysroot: &Path,
-                        host: &str) {
+#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
+pub struct DebuggerScripts {
+    pub sysroot: Interned<PathBuf>,
+    pub host: Interned<String>,
+}
+
+impl Step for DebuggerScripts {
+    type Output = ();
+
+    fn should_run(run: ShouldRun) -> ShouldRun {
+        run.path("src/lldb_batchmode.py")
+    }
+
+    fn make_run(run: RunConfig) {
+        run.builder.ensure(DebuggerScripts {
+            sysroot: run.builder.sysroot(run.builder.compiler(run.builder.top_stage, run.host)),
+            host: run.target,
+        });
+    }
+
+    /// Copies debugger scripts for `target` into the `sysroot` specified.
+    fn run(self, builder: &Builder) {
+        let build = builder.build;
+        let host = self.host;
+        let sysroot = self.sysroot;
     let dst = sysroot.join("lib/rustlib/etc");
     t!(fs::create_dir_all(&dst));
     let cp_debugger_script = |file: &str| {
@ -384,32 +502,65 @@ pub fn debugger_scripts(build: &Build,
         cp_debugger_script("lldb_rust_formatters.py");
     }
+    }
 }
-/// Creates the `rust-std` installer component as compiled by `compiler` for the
-/// target `target`.
-pub fn std(build: &Build, compiler: &Compiler, target: &str) {
-    println!("Dist std stage{} ({} -> {})", compiler.stage, compiler.host,
-             target);
+#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
+pub struct Std {
+    pub compiler: Compiler,
+    pub target: Interned<String>,
+}
+
+impl Step for Std {
+    type Output = PathBuf;
+    const DEFAULT: bool = true;
+    const ONLY_BUILD_TARGETS: bool = true;
+
+    fn should_run(run: ShouldRun) -> ShouldRun {
+        run.path("src/libstd")
+    }
+
+    fn make_run(run: RunConfig) {
+        run.builder.ensure(Std {
+            compiler: run.builder.compiler(run.builder.top_stage, run.host),
+            target: run.target,
+        });
+    }
+
+    fn run(self, builder: &Builder) -> PathBuf {
+        let build = builder.build;
+        let compiler = self.compiler;
+        let target = self.target;
+
+        let name = pkgname(build, "rust-std");
+        println!("Dist std stage{} ({} -> {})", compiler.stage, &compiler.host, target);
 
     // The only true set of target libraries came from the build triple, so
     // let's reduce redundant work by only producing archives from that host.
     if compiler.host != build.build {
         println!("\tskipping, not a build host");
-        return
+        return distdir(build).join(format!("{}-{}.tar.gz", name, target));
+    }
+
+    // We want to package up as many target libraries as possible
+    // for the `rust-std` package, so if this is a host target we
+    // depend on librustc and otherwise we just depend on libtest.
+    if build.hosts.iter().any(|t| t == target) {
+        builder.ensure(compile::Rustc { compiler, target });
+    } else {
+        builder.ensure(compile::Test { compiler, target });
     }
 
-    let name = pkgname(build, "rust-std");
     let image = tmpdir(build).join(format!("{}-{}-image", name, target));
     let _ = fs::remove_dir_all(&image);
     let dst = image.join("lib/rustlib").join(target);
     t!(fs::create_dir_all(&dst));
-    let mut src = build.sysroot_libdir(compiler, target);
+    let mut src = builder.sysroot_libdir(compiler, target).to_path_buf();
     src.pop(); // Remove the trailing /lib folder from the sysroot_libdir
     cp_r(&src, &dst);
 
-    let mut cmd = rust_installer(build);
+    let mut cmd = rust_installer(builder);
     cmd.arg("generate")
        .arg("--product-name=Rust")
        .arg("--rel-manifest-dir=rustlib")
@ -422,42 +573,61 @@ pub fn std(build: &Build, compiler: &Compiler, target: &str) {
        .arg("--legacy-manifest-dirs=rustlib,cargo");
     build.run(&mut cmd);
     t!(fs::remove_dir_all(&image));
+
+        distdir(build).join(format!("{}-{}.tar.gz", name, target))
+    }
 }
 
-/// The path to the complete rustc-src tarball
-pub fn rust_src_location(build: &Build) -> PathBuf {
-    let plain_name = format!("rustc-{}-src", build.rust_package_vers());
-    distdir(build).join(&format!("{}.tar.gz", plain_name))
-}
-
-/// The path to the rust-src component installer
-pub fn rust_src_installer(build: &Build) -> PathBuf {
-    let name = pkgname(build, "rust-src");
-    distdir(build).join(&format!("{}.tar.gz", name))
-}
-
-/// Creates a tarball of save-analysis metadata, if available.
-pub fn analysis(build: &Build, compiler: &Compiler, target: &str) {
+#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
+pub struct Analysis {
+    pub compiler: Compiler,
+    pub target: Interned<String>,
+}
+
+impl Step for Analysis {
+    type Output = PathBuf;
+    const DEFAULT: bool = true;
+    const ONLY_BUILD_TARGETS: bool = true;
+
+    fn should_run(run: ShouldRun) -> ShouldRun {
+        let builder = run.builder;
+        run.path("analysis").default_condition(builder.build.config.extended)
+    }
+
+    fn make_run(run: RunConfig) {
+        run.builder.ensure(Analysis {
+            compiler: run.builder.compiler(run.builder.top_stage, run.host),
+            target: run.target,
+        });
+    }
+
+    /// Creates a tarball of save-analysis metadata, if available.
+    fn run(self, builder: &Builder) -> PathBuf {
+        let build = builder.build;
+        let compiler = self.compiler;
+        let target = self.target;
     assert!(build.config.extended);
 
     println!("Dist analysis");
+    let name = pkgname(build, "rust-analysis");
 
-    if compiler.host != build.build {
+    if &compiler.host != build.build {
         println!("\tskipping, not a build host");
-        return;
+        return distdir(build).join(format!("{}-{}.tar.gz", name, target));
     }
 
+    builder.ensure(Std { compiler, target });
+
     // Package save-analysis from stage1 if not doing a full bootstrap, as the
     // stage2 artifacts is simply copied from stage1 in that case.
     let compiler = if build.force_use_stage1(compiler, target) {
-        Compiler::new(1, compiler.host)
+        builder.compiler(1, compiler.host)
     } else {
         compiler.clone()
     };
 
-    let name = pkgname(build, "rust-analysis");
     let image = tmpdir(build).join(format!("{}-{}-image", name, target));
 
-    let src = build.stage_out(&compiler, Mode::Libstd).join(target).join("release").join("deps");
+    let src = build.stage_out(compiler, Mode::Libstd)
+        .join(target).join("release").join("deps");
 
     let image_src = src.join("save-analysis");
     let dst = image.join("lib/rustlib").join(target).join("analysis");
@ -465,7 +635,7 @@ pub fn analysis(build: &Build, compiler: &Compiler, target: &str) {
     println!("image_src: {:?}, dst: {:?}", image_src, dst);
     cp_r(&image_src, &dst);
 
-    let mut cmd = rust_installer(build);
+    let mut cmd = rust_installer(builder);
     cmd.arg("generate")
        .arg("--product-name=Rust")
        .arg("--rel-manifest-dir=rustlib")
@ -478,6 +648,8 @@ pub fn analysis(build: &Build, compiler: &Compiler, target: &str) {
        .arg("--legacy-manifest-dirs=rustlib,cargo");
     build.run(&mut cmd);
     t!(fs::remove_dir_all(&image));
+
+        distdir(build).join(format!("{}-{}.tar.gz", name, target))
+    }
 }
 
 fn copy_src_dirs(build: &Build, src_dirs: &[&str], exclude_dirs: &[&str], dst_dir: &Path) {
@ -520,8 +692,28 @@ fn copy_src_dirs(build: &Build, src_dirs: &[&str], exclude_dirs: &[&str], dst_di
     }
 }
 
-/// Creates the `rust-src` installer component
-pub fn rust_src(build: &Build) {
+#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
+pub struct Src;
+
+impl Step for Src {
+    /// The output path of the src installer tarball
+    type Output = PathBuf;
+    const DEFAULT: bool = true;
+    const ONLY_HOSTS: bool = true;
+    const ONLY_BUILD_TARGETS: bool = true;
+    const ONLY_BUILD: bool = true;
+
+    fn should_run(run: ShouldRun) -> ShouldRun {
+        run.path("src")
+    }
+
+    fn make_run(run: RunConfig) {
+        run.builder.ensure(Src);
+    }
+
+    /// Creates the `rust-src` installer component
+    fn run(self, builder: &Builder) -> PathBuf {
+        let build = builder.build;
     println!("Dist src");
 
     let name = pkgname(build, "rust-src");
@ -569,7 +761,7 @@ pub fn rust_src(build: &Build) {
     copy_src_dirs(build, &std_src_dirs[..], &std_src_dirs_exclude[..], &dst_src);
 
     // Create source tarball in rust-installer format
-    let mut cmd = rust_installer(build);
+    let mut cmd = rust_installer(builder);
     cmd.arg("generate")
        .arg("--product-name=Rust")
        .arg("--rel-manifest-dir=rustlib")
@ -583,12 +775,35 @@ pub fn rust_src(build: &Build) {
build.run(&mut cmd); build.run(&mut cmd);
t!(fs::remove_dir_all(&image)); t!(fs::remove_dir_all(&image));
distdir(build).join(&format!("{}.tar.gz", name))
}
} }
const CARGO_VENDOR_VERSION: &str = "0.1.4"; const CARGO_VENDOR_VERSION: &str = "0.1.4";
/// Creates the plain source tarball #[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
pub fn plain_source_tarball(build: &Build) { pub struct PlainSourceTarball;
impl Step for PlainSourceTarball {
/// Produces the location of the tarball generated
type Output = PathBuf;
const DEFAULT: bool = true;
const ONLY_HOSTS: bool = true;
const ONLY_BUILD_TARGETS: bool = true;
const ONLY_BUILD: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
let builder = run.builder;
run.path("src").default_condition(builder.config.rust_dist_src)
}
fn make_run(run: RunConfig) {
run.builder.ensure(PlainSourceTarball);
}
/// Creates the plain source tarball
fn run(self, builder: &Builder) -> PathBuf {
let build = builder.build;
println!("Create plain source tarball");
// Make sure that the root folder of tarball has the correct name
@@ -650,19 +865,23 @@ pub fn plain_source_tarball(build: &Build) {
}
// Create plain source tarball
let plain_name = format!("rustc-{}-src", build.rust_package_vers());
let mut tarball = distdir(build).join(&format!("{}.tar.gz", plain_name));
tarball.set_extension(""); // strip .gz
tarball.set_extension(""); // strip .tar
if let Some(dir) = tarball.parent() {
t!(fs::create_dir_all(dir));
}
println!("running installer");
let mut cmd = rust_installer(builder);
cmd.arg("tarball")
.arg("--input").arg(&plain_name)
.arg("--output").arg(&tarball)
.arg("--work-dir=.")
.current_dir(tmpdir(build));
build.run(&mut cmd);
distdir(build).join(&format!("{}.tar.gz", plain_name))
}
}
fn install(src: &Path, dstdir: &Path, perms: u32) {
@@ -704,15 +923,39 @@ fn write_file(path: &Path, data: &[u8]) {
t!(vf.write_all(data));
}
#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
pub struct Cargo {
pub stage: u32,
pub target: Interned<String>,
}
impl Step for Cargo {
type Output = PathBuf;
const ONLY_BUILD_TARGETS: bool = true;
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
run.path("cargo")
}
fn make_run(run: RunConfig) {
run.builder.ensure(Cargo {
stage: run.builder.top_stage,
target: run.target,
});
}
fn run(self, builder: &Builder) -> PathBuf {
let build = builder.build;
let stage = self.stage;
let target = self.target;
println!("Dist cargo stage{} ({})", stage, target);
let src = build.src.join("src/tools/cargo");
let etc = src.join("src/etc");
let release_num = build.release_num("cargo");
let name = pkgname(build, "cargo");
let version = builder.cargo_info.version(build, &release_num);
let tmp = tmpdir(build);
let image = tmp.join("cargo-image");
@@ -722,8 +965,10 @@ pub fn cargo(build: &Build, stage: u32, target: &str) {
// Prepare the image directory
t!(fs::create_dir_all(image.join("share/zsh/site-functions")));
t!(fs::create_dir_all(image.join("etc/bash_completion.d")));
let cargo = builder.ensure(tool::Cargo {
compiler: builder.compiler(stage, build.build),
target
});
install(&cargo, &image.join("bin"), 0o755);
for man in t!(etc.join("man").read_dir()) {
let man = t!(man);
@@ -749,7 +994,7 @@ pub fn cargo(build: &Build, stage: u32, target: &str) {
t!(t!(File::create(overlay.join("version"))).write_all(version.as_bytes()));
// Generate the installer tarball
let mut cmd = rust_installer(builder);
cmd.arg("generate")
.arg("--product-name=Rust")
.arg("--rel-manifest-dir=rustlib")
@@ -762,13 +1007,39 @@ pub fn cargo(build: &Build, stage: u32, target: &str) {
.arg("--component-name=cargo")
.arg("--legacy-manifest-dirs=rustlib,cargo");
build.run(&mut cmd);
distdir(build).join(format!("{}-{}.tar.gz", name, target))
}
}
#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
pub struct Rls {
pub stage: u32,
pub target: Interned<String>,
}
impl Step for Rls {
type Output = PathBuf;
const ONLY_BUILD_TARGETS: bool = true;
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
run.path("rls")
}
fn make_run(run: RunConfig) {
run.builder.ensure(Rls {
stage: run.builder.top_stage,
target: run.target,
});
}
fn run(self, builder: &Builder) -> PathBuf {
let build = builder.build;
let stage = self.stage;
let target = self.target;
assert!(build.config.extended);
println!("Dist RLS stage{} ({})", stage, target);
let src = build.src.join("src/tools/rls");
let release_num = build.release_num("rls");
let name = pkgname(build, "rls");
@@ -780,8 +1051,10 @@ pub fn rls(build: &Build, stage: u32, target: &str) {
t!(fs::create_dir_all(&image));
// Prepare the image directory
let rls = builder.ensure(tool::Rls {
compiler: builder.compiler(stage, build.build),
target
});
install(&rls, &image.join("bin"), 0o755);
let doc = image.join("share/doc/rls");
install(&src.join("README.md"), &doc, 0o644);
@@ -798,7 +1071,7 @@ pub fn rls(build: &Build, stage: u32, target: &str) {
t!(t!(File::create(overlay.join("version"))).write_all(version.as_bytes()));
// Generate the installer tarball
let mut cmd = rust_installer(builder);
cmd.arg("generate")
.arg("--product-name=Rust")
.arg("--rel-manifest-dir=rustlib")
@@ -808,34 +1081,69 @@ pub fn rls(build: &Build, stage: u32, target: &str) {
.arg("--output-dir").arg(&distdir(build))
.arg("--non-installed-overlay").arg(&overlay)
.arg(format!("--package-name={}-{}", name, target))
.arg("--legacy-manifest-dirs=rustlib,cargo");
if build.config.channel == "nightly" {
cmd.arg("--component-name=rls");
} else {
cmd.arg("--component-name=rls-preview");
}
build.run(&mut cmd);
distdir(build).join(format!("{}-{}.tar.gz", name, target))
}
}
#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
pub struct Extended {
stage: u32,
host: Interned<String>,
target: Interned<String>,
}
impl Step for Extended {
type Output = ();
const DEFAULT: bool = true;
const ONLY_BUILD_TARGETS: bool = true;
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
let builder = run.builder;
run.path("extended").default_condition(builder.config.extended)
}
fn make_run(run: RunConfig) {
run.builder.ensure(Extended {
stage: run.builder.top_stage,
host: run.host,
target: run.target,
});
}
/// Creates a combined installer for the specified target in the provided stage.
fn run(self, builder: &Builder) {
let build = builder.build;
let stage = self.stage;
let target = self.target;
println!("Dist extended stage{} ({})", stage, target);
let rustc_installer = builder.ensure(Rustc {
compiler: builder.compiler(stage, target),
});
let cargo_installer = builder.ensure(Cargo { stage, target });
let rls_installer = builder.ensure(Rls { stage, target });
let mingw_installer = builder.ensure(Mingw { host: target });
let analysis_installer = builder.ensure(Analysis {
compiler: builder.compiler(stage, self.host),
target
});
let docs_installer = builder.ensure(Docs { stage, host: target, });
let std_installer = builder.ensure(Std {
compiler: builder.compiler(stage, self.host),
target,
});
let tmp = tmpdir(build);
let overlay = tmp.join("extended-overlay");
@@ -854,10 +1162,10 @@ pub fn extended(build: &Build, stage: u32, target: &str) {
// upgrades rustc was upgraded before rust-std. To avoid rustc clobbering
// the std files during uninstall. To do this ensure that rustc comes
// before rust-std in the list below.
let mut tarballs = vec![rustc_installer, cargo_installer, rls_installer,
analysis_installer, docs_installer, std_installer];
if target.contains("pc-windows-gnu") {
tarballs.push(mingw_installer.unwrap());
}
let mut input_tarballs = tarballs[0].as_os_str().to_owned();
for tarball in &tarballs[1..] {
@@ -865,7 +1173,7 @@ pub fn extended(build: &Build, stage: u32, target: &str) {
input_tarballs.push(tarball);
}
let mut cmd = rust_installer(builder);
cmd.arg("combine")
.arg("--product-name=Rust")
.arg("--rel-manifest-dir=rustlib")
@@ -901,6 +1209,8 @@ pub fn extended(build: &Build, stage: u32, target: &str) {
t!(fs::create_dir_all(pkg.join("cargo")));
t!(fs::create_dir_all(pkg.join("rust-docs")));
t!(fs::create_dir_all(pkg.join("rust-std")));
t!(fs::create_dir_all(pkg.join("rls")));
t!(fs::create_dir_all(pkg.join("rust-analysis")));
cp_r(&work.join(&format!("{}-{}", pkgname(build, "rustc"), target)),
&pkg.join("rustc"));
@@ -910,11 +1220,17 @@ pub fn extended(build: &Build, stage: u32, target: &str) {
&pkg.join("rust-docs"));
cp_r(&work.join(&format!("{}-{}", pkgname(build, "rust-std"), target)),
&pkg.join("rust-std"));
cp_r(&work.join(&format!("{}-{}", pkgname(build, "rls"), target)),
&pkg.join("rls"));
cp_r(&work.join(&format!("{}-{}", pkgname(build, "rust-analysis"), target)),
&pkg.join("rust-analysis"));
install(&etc.join("pkg/postinstall"), &pkg.join("rustc"), 0o755);
install(&etc.join("pkg/postinstall"), &pkg.join("cargo"), 0o755);
install(&etc.join("pkg/postinstall"), &pkg.join("rust-docs"), 0o755);
install(&etc.join("pkg/postinstall"), &pkg.join("rust-std"), 0o755);
install(&etc.join("pkg/postinstall"), &pkg.join("rls"), 0o755);
install(&etc.join("pkg/postinstall"), &pkg.join("rust-analysis"), 0o755);
let pkgbuild = |component: &str| {
let mut cmd = Command::new("pkgbuild");
@@ -928,6 +1244,8 @@ pub fn extended(build: &Build, stage: u32, target: &str) {
pkgbuild("cargo");
pkgbuild("rust-docs");
pkgbuild("rust-std");
pkgbuild("rls");
pkgbuild("rust-analysis");
// create an 'uninstall' package
install(&etc.join("pkg/postinstall"), &pkg.join("uninstall"), 0o755);
@@ -951,6 +1269,8 @@ pub fn extended(build: &Build, stage: u32, target: &str) {
let _ = fs::remove_dir_all(&exe);
t!(fs::create_dir_all(exe.join("rustc")));
t!(fs::create_dir_all(exe.join("cargo")));
t!(fs::create_dir_all(exe.join("rls")));
t!(fs::create_dir_all(exe.join("rust-analysis")));
t!(fs::create_dir_all(exe.join("rust-docs")));
t!(fs::create_dir_all(exe.join("rust-std")));
cp_r(&work.join(&format!("{}-{}", pkgname(build, "rustc"), target))
@@ -965,11 +1285,22 @@ pub fn extended(build: &Build, stage: u32, target: &str) {
cp_r(&work.join(&format!("{}-{}", pkgname(build, "rust-std"), target))
.join(format!("rust-std-{}", target)),
&exe.join("rust-std"));
let rls_path = if build.config.channel == "nightly" {
work.join(&format!("{}-{}", pkgname(build, "rls"), target)).join("rls")
} else {
work.join(&format!("{}-{}", pkgname(build, "rls"), target)).join("rls-preview")
};
cp_r(&rls_path, &exe.join("rls"));
cp_r(&work.join(&format!("{}-{}", pkgname(build, "rust-analysis"), target))
.join(format!("rust-analysis-{}", target)),
&exe.join("rust-analysis"));
t!(fs::remove_file(exe.join("rustc/manifest.in")));
t!(fs::remove_file(exe.join("cargo/manifest.in")));
t!(fs::remove_file(exe.join("rust-docs/manifest.in")));
t!(fs::remove_file(exe.join("rust-std/manifest.in")));
t!(fs::remove_file(exe.join("rls/manifest.in")));
t!(fs::remove_file(exe.join("rust-analysis/manifest.in")));
if target.contains("windows-gnu") {
t!(fs::create_dir_all(exe.join("rust-mingw")));
@@ -1043,6 +1374,26 @@ pub fn extended(build: &Build, stage: u32, target: &str) {
.arg("-dr").arg("Std")
.arg("-var").arg("var.StdDir")
.arg("-out").arg(exe.join("StdGroup.wxs")));
build.run(Command::new(&heat)
.current_dir(&exe)
.arg("dir")
.arg("rls")
.args(&heat_flags)
.arg("-cg").arg("RlsGroup")
.arg("-dr").arg("Rls")
.arg("-var").arg("var.RlsDir")
.arg("-out").arg(exe.join("RlsGroup.wxs"))
.arg("-t").arg(etc.join("msi/remove-duplicates.xsl")));
build.run(Command::new(&heat)
.current_dir(&exe)
.arg("dir")
.arg("rust-analysis")
.args(&heat_flags)
.arg("-cg").arg("AnalysisGroup")
.arg("-dr").arg("Analysis")
.arg("-var").arg("var.AnalysisDir")
.arg("-out").arg(exe.join("AnalysisGroup.wxs"))
.arg("-t").arg(etc.join("msi/remove-duplicates.xsl")));
if target.contains("windows-gnu") {
build.run(Command::new(&heat)
.current_dir(&exe)
@@ -1066,6 +1417,8 @@ pub fn extended(build: &Build, stage: u32, target: &str) {
.arg("-dDocsDir=rust-docs")
.arg("-dCargoDir=cargo")
.arg("-dStdDir=rust-std")
.arg("-dRlsDir=rls")
.arg("-dAnalysisDir=rust-analysis")
.arg("-arch").arg(&arch)
.arg("-out").arg(&output)
.arg(&input);
@@ -1083,6 +1436,8 @@ pub fn extended(build: &Build, stage: u32, target: &str) {
candle("DocsGroup.wxs".as_ref());
candle("CargoGroup.wxs".as_ref());
candle("StdGroup.wxs".as_ref());
candle("RlsGroup.wxs".as_ref());
candle("AnalysisGroup.wxs".as_ref());
if target.contains("windows-gnu") {
candle("GccGroup.wxs".as_ref());
@@ -1105,6 +1460,8 @@ pub fn extended(build: &Build, stage: u32, target: &str) {
.arg("DocsGroup.wixobj")
.arg("CargoGroup.wixobj")
.arg("StdGroup.wixobj")
.arg("RlsGroup.wixobj")
.arg("AnalysisGroup.wixobj")
.current_dir(&exe);
if target.contains("windows-gnu") {
@@ -1117,9 +1474,10 @@ pub fn extended(build: &Build, stage: u32, target: &str) {
t!(fs::rename(exe.join(&filename), distdir(build).join(&filename)));
}
}
}
fn add_env(build: &Build, cmd: &mut Command, target: Interned<String>) {
let mut parts = channel::CFG_RELEASE_NUM.split('.');
cmd.env("CFG_RELEASE_INFO", build.rust_version())
.env("CFG_RELEASE_NUM", channel::CFG_RELEASE_NUM)
@@ -1149,9 +1507,26 @@ fn add_env(build: &Build, cmd: &mut Command, target: &str) {
}
}
#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
pub struct HashSign;
impl Step for HashSign {
type Output = ();
const ONLY_BUILD_TARGETS: bool = true;
const ONLY_HOSTS: bool = true;
const ONLY_BUILD: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
run.path("hash-and-sign")
}
fn make_run(run: RunConfig) {
run.builder.ensure(HashSign);
}
fn run(self, builder: &Builder) {
let build = builder.build;
let mut cmd = builder.tool_cmd(Tool::BuildManifest);
let sign = build.config.dist_sign_folder.as_ref().unwrap_or_else(|| {
panic!("\n\nfailed to specify `dist.sign-folder` in `config.toml`\n\n")
});
@@ -1171,6 +1546,7 @@ pub fn hash_and_sign(build: &Build) {
cmd.arg(today.trim());
cmd.arg(build.rust_package_vers());
cmd.arg(build.package_vers(&build.release_num("cargo")));
cmd.arg(build.package_vers(&build.release_num("rls")));
cmd.arg(addr);
t!(fs::create_dir_all(distdir(build)));
@@ -1179,4 +1555,5 @@ pub fn hash_and_sign(build: &Build) {
t!(child.stdin.take().unwrap().write_all(pass.as_bytes()));
let status = t!(child.wait());
assert!(status.success());
}
}
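All of the dist steps above follow the same `Step` protocol: `should_run` registers the paths that select the step, `make_run` constructs it, and `builder.ensure(...)` runs it at most once and hands back the cached output. The following is a hypothetical, minimal re-creation of that ensure/caching pattern, self-contained and independent of the real bootstrap crate (the `Builder`, `Step`, and `Src` here are illustrative stand-ins, not the actual rustbuild types):

```rust
use std::any::{Any, TypeId};
use std::cell::RefCell;
use std::collections::HashMap;
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// A step produces an output; `ensure` deduplicates repeated requests.
trait Step: Hash + 'static {
    type Output: Clone + 'static;
    fn run(self, builder: &Builder) -> Self::Output;
}

#[derive(Default)]
struct Builder {
    // Cache keyed by (step type, hash of the step's fields).
    cache: RefCell<HashMap<(TypeId, u64), Box<dyn Any>>>,
}

impl Builder {
    fn ensure<S: Step>(&self, step: S) -> S::Output {
        let mut h = DefaultHasher::new();
        step.hash(&mut h);
        let key = (TypeId::of::<S>(), h.finish());
        if let Some(cached) = self.cache.borrow().get(&key) {
            // Already built: return the cached output without re-running.
            return cached.downcast_ref::<S::Output>().unwrap().clone();
        }
        let out = step.run(self);
        self.cache.borrow_mut().insert(key, Box::new(out.clone()));
        out
    }
}

#[derive(Hash)]
struct Src; // stand-in for the dist::Src step above

impl Step for Src {
    type Output = String; // the real step returns the tarball's PathBuf
    fn run(self, _builder: &Builder) -> String {
        println!("Dist src"); // executed only on the first `ensure`
        "dist/rust-src.tar.gz".to_string()
    }
}

fn main() {
    let builder = Builder::default();
    let first = builder.ensure(Src);
    let second = builder.ensure(Src); // cache hit; `run` is not re-executed
    assert_eq!(first, second);
}
```

This is why `Extended::run` above can simply `ensure` Rustc, Cargo, Rls, and the rest: any installer already built for another invocation is reused rather than rebuilt.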


@@ -20,66 +20,213 @@
use std::fs::{self, File};
use std::io::prelude::*;
use std::io;
use std::path::{PathBuf, Path};
use Mode;
use build_helper::up_to_date;
use util::{cp_r, symlink_dir};
use builder::{Builder, Compiler, RunConfig, ShouldRun, Step};
use tool::Tool;
use compile;
use cache::{INTERNER, Interned};
macro_rules! book {
($($name:ident, $path:expr, $book_name:expr;)+) => {
$(
#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
pub struct $name {
target: Interned<String>,
}
impl Step for $name {
type Output = ();
const DEFAULT: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
let builder = run.builder;
run.path($path).default_condition(builder.build.config.docs)
}
fn make_run(run: RunConfig) {
run.builder.ensure($name {
target: run.target,
});
}
fn run(self, builder: &Builder) {
builder.ensure(Rustbook {
target: self.target,
name: INTERNER.intern_str($book_name),
})
}
}
)+
}
}
book!(
Nomicon, "src/doc/book", "nomicon";
Reference, "src/doc/reference", "reference";
Rustdoc, "src/doc/rustdoc", "rustdoc";
);
#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
struct Rustbook {
target: Interned<String>,
name: Interned<String>,
}
impl Step for Rustbook {
type Output = ();
// rustbook is never directly called, and only serves as a shim for the nomicon and the
// reference.
fn should_run(run: ShouldRun) -> ShouldRun {
run.never()
}
/// Invoke `rustbook` for `target` for the doc book `name`.
///
/// This will not actually generate any documentation if the documentation has
/// already been generated.
fn run(self, builder: &Builder) {
let src = builder.build.src.join("src/doc");
builder.ensure(RustbookSrc {
target: self.target,
name: self.name,
src: INTERNER.intern_path(src),
});
}
}
#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
pub struct UnstableBook {
target: Interned<String>,
}
impl Step for UnstableBook {
type Output = ();
const DEFAULT: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
let builder = run.builder;
run.path("src/doc/unstable-book").default_condition(builder.build.config.docs)
}
fn make_run(run: RunConfig) {
run.builder.ensure(UnstableBook {
target: run.target,
});
}
fn run(self, builder: &Builder) {
builder.ensure(UnstableBookGen {
target: self.target,
});
builder.ensure(RustbookSrc {
target: self.target,
name: INTERNER.intern_str("unstable-book"),
src: builder.build.md_doc_out(self.target),
})
}
}
#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
struct RustbookSrc {
target: Interned<String>,
name: Interned<String>,
src: Interned<PathBuf>,
}
impl Step for RustbookSrc {
type Output = ();
fn should_run(run: ShouldRun) -> ShouldRun {
run.never()
}
/// Invoke `rustbook` for `target` for the doc book `name` from the `src` path.
///
/// This will not actually generate any documentation if the documentation has
/// already been generated.
fn run(self, builder: &Builder) {
let build = builder.build;
let target = self.target;
let name = self.name;
let src = self.src;
let out = build.doc_out(target);
t!(fs::create_dir_all(&out));
let out = out.join(name);
let src = src.join(name);
let index = out.join("index.html");
let rustbook = builder.tool_exe(Tool::Rustbook);
if up_to_date(&src, &index) && up_to_date(&rustbook, &index) {
return
}
println!("Rustbook ({}) - {}", target, name);
let _ = fs::remove_dir_all(&out);
build.run(builder.tool_cmd(Tool::Rustbook)
.arg("build")
.arg(&src)
.arg("-d")
.arg(out));
}
}
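`RustbookSrc` skips regeneration when the book's `index.html` is newer than both the sources and the rustbook binary, via build_helper's `up_to_date`. A self-contained sketch of such an mtime comparison follows; this is my own minimal stand-in, not the real build_helper implementation:

```rust
use std::fs;
use std::path::Path;

// Freshness check: the output is up to date if it exists and its
// modification time is at least as recent as the input's.
fn up_to_date(src: &Path, dst: &Path) -> bool {
    let mtime = |p: &Path| fs::metadata(p).and_then(|m| m.modified()).ok();
    match (mtime(src), mtime(dst)) {
        (Some(s), Some(d)) => d >= s,
        _ => false, // a missing output (or input) forces a rebuild
    }
}

fn main() {
    let dir = std::env::temp_dir();
    let src = dir.join("up_to_date_src.txt");
    let dst = dir.join("up_to_date_dst.txt");
    fs::write(&src, "in").unwrap();
    fs::write(&dst, "out").unwrap(); // written after src, so it is fresh
    assert!(up_to_date(&src, &dst));
    assert!(!up_to_date(&src, &dir.join("missing-index.html")));
}
```

Comparing against the rustbook binary as well (as the step above does) means a rebuilt tool also invalidates previously generated books.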
#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
pub struct TheBook {
compiler: Compiler,
target: Interned<String>,
name: &'static str,
}
impl Step for TheBook {
type Output = ();
const DEFAULT: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
let builder = run.builder;
run.path("src/doc/book").default_condition(builder.build.config.docs)
}
fn make_run(run: RunConfig) {
run.builder.ensure(TheBook {
compiler: run.builder.compiler(run.builder.top_stage, run.builder.build.build),
target: run.target,
name: "book",
});
}
/// Build the book and associated stuff.
///
/// We need to build:
///
/// * Book (first edition)
/// * Book (second edition)
/// * Index page
/// * Redirect pages
fn run(self, builder: &Builder) {
let build = builder.build;
let target = self.target;
let name = self.name;
// build book first edition
builder.ensure(Rustbook {
target,
name: INTERNER.intern_string(format!("{}/first-edition", name)),
});
// build book second edition
builder.ensure(Rustbook {
target,
name: INTERNER.intern_string(format!("{}/second-edition", name)),
});
// build the index page
let index = format!("{}/index.md", name);
println!("Documenting book index ({})", target);
invoke_rustdoc(builder, self.compiler, target, &index);
// build the redirect pages
println!("Documenting book redirect pages ({})", target);
@@ -88,19 +235,62 @@ pub fn book(build: &Build, target: &str, name: &str) {
let path = file.path();
let path = path.to_str().unwrap();
invoke_rustdoc(builder, self.compiler, target, path);
}
}
}
#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
pub struct CargoBook {
target: Interned<String>,
}
impl Step for CargoBook {
type Output = ();
const DEFAULT: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
let builder = run.builder;
run.path("src/doc/cargo").default_condition(builder.build.config.docs)
}
fn make_run(run: RunConfig) {
run.builder.ensure(CargoBook {
target: run.target,
});
}
/// Create a placeholder for the cargo documentation so that doc.rust-lang.org/cargo will
/// redirect to doc.crates.io. We want to publish doc.rust-lang.org/cargo in the paper
/// version of the book, but we don't want to rush the process of switching cargo's docs
/// over to mdbook and deploying them. When the cargo book is ready, this implementation
/// should build the mdbook instead of this redirect page.
fn run(self, builder: &Builder) {
let build = builder.build;
let out = build.doc_out(self.target);
let cargo_dir = out.join("cargo");
t!(fs::create_dir_all(&cargo_dir));
let index = cargo_dir.join("index.html");
let redirect_html = r#"
<html>
<head>
<meta http-equiv="refresh" content="0; URL='http://doc.crates.io'" />
</head>
</html>"#;
println!("Creating cargo book redirect page");
t!(t!(File::create(&index)).write_all(redirect_html.as_bytes()));
}
}
}
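The `CargoBook` step above publishes nothing but a static meta-refresh page. Generating such a redirect comes down to writing one file; a minimal sketch under illustrative paths (the helper name and demo directory are hypothetical, not part of the bootstrap code):

```rust
use std::fs::{self, File};
use std::io::Write;
use std::path::Path;

// Write an index.html that immediately redirects the browser to `url`,
// the same shape of page the CargoBook step emits for doc.crates.io.
fn write_redirect(dir: &Path, url: &str) -> std::io::Result<()> {
    fs::create_dir_all(dir)?;
    let html = format!(
        "<html>\n<head>\n<meta http-equiv=\"refresh\" content=\"0; URL='{}'\" />\n</head>\n</html>",
        url);
    File::create(dir.join("index.html"))?.write_all(html.as_bytes())
}

fn main() -> std::io::Result<()> {
    let dir = std::env::temp_dir().join("cargo-redirect-demo");
    write_redirect(&dir, "http://doc.crates.io")?;
    let page = fs::read_to_string(dir.join("index.html"))?;
    assert!(page.contains("doc.crates.io"));
    Ok(())
}
```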
fn invoke_rustdoc(builder: &Builder, compiler: Compiler, target: Interned<String>, markdown: &str) {
let build = builder.build;
let out = build.doc_out(target);
let path = build.src.join("src/doc").join(markdown);
let favicon = build.src.join("src/doc/favicon.inc");
let footer = build.src.join("src/doc/footer.inc");
@@ -116,9 +306,7 @@ fn invoke_rustdoc(build: &Build, target: &str, markdown: &str) {
t!(t!(File::create(&version_info)).write_all(info.as_bytes()));
}
let mut cmd = builder.rustdoc_cmd(compiler.host);
let out = out.join("book");
@@ -137,21 +325,44 @@ fn invoke_rustdoc(build: &Build, target: &str, markdown: &str) {
build.run(&mut cmd);
}
#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
pub struct Standalone {
compiler: Compiler,
target: Interned<String>,
}
impl Step for Standalone {
type Output = ();
const DEFAULT: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
let builder = run.builder;
run.path("src/doc").default_condition(builder.build.config.docs)
}
fn make_run(run: RunConfig) {
run.builder.ensure(Standalone {
compiler: run.builder.compiler(run.builder.top_stage, run.builder.build.build),
target: run.target,
});
}
/// Generates all standalone documentation as compiled by the rustdoc in `stage`
/// for the `target` into `out`.
///
/// This will list all of `src/doc` looking for markdown files and appropriately
/// perform transformations like substituting `VERSION`, `SHORT_HASH`, and
/// `STAMP` along with providing the various header/footer HTML we've customized.
///
/// In the end, this is just a glorified wrapper around rustdoc!
fn run(self, builder: &Builder) {
let build = builder.build;
let target = self.target;
let compiler = self.compiler;
println!("Documenting standalone ({})", target);
let out = build.doc_out(target);
t!(fs::create_dir_all(&out));
let favicon = build.src.join("src/doc/favicon.inc");
let footer = build.src.join("src/doc/footer.inc");
let full_toc = build.src.join("src/doc/full-toc.inc");
@@ -178,7 +389,7 @@ pub fn standalone(build: &Build, target: &str) {
            }

            let html = out.join(filename).with_extension("html");
            let rustdoc = builder.rustdoc(compiler.host);
            if up_to_date(&path, &html) &&
               up_to_date(&footer, &html) &&
               up_to_date(&favicon, &html) &&
@@ -188,8 +399,7 @@ pub fn standalone(build: &Build, target: &str) {
                continue
            }

            let mut cmd = builder.rustdoc_cmd(compiler.host);
            cmd.arg("--html-after-content").arg(&footer)
               .arg("--html-before-content").arg(&version_info)
               .arg("--html-in-header").arg(&favicon)
@@ -207,25 +417,53 @@ pub fn standalone(build: &Build, target: &str) {
            }

            build.run(&mut cmd);
        }
    }
}
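Every documentation step above follows the same shape: a plain data struct plus a `Step` impl whose `run` may `ensure` other steps before doing its own work. A minimal, self-contained sketch of that ensure/memoize pattern (the `Builder` and `Step` types here are simplified stand-ins, not the real rustbuild API):

```rust
use std::cell::RefCell;
use std::collections::HashSet;

// Hypothetical, simplified stand-in for rustbuild's Step trait.
trait Step: std::fmt::Debug {
    fn run(&self, builder: &Builder);
    // A cheap dedup key; the real builder hashes the step struct itself.
    fn key(&self) -> String { format!("{:?}", self) }
}

struct Builder {
    done: RefCell<HashSet<String>>, // steps already executed (memoization)
    log: RefCell<Vec<String>>,
}

impl Builder {
    fn new() -> Builder {
        Builder { done: RefCell::new(HashSet::new()), log: RefCell::new(Vec::new()) }
    }
    // `ensure` runs a step at most once, like rustbuild's Builder::ensure.
    fn ensure<S: Step>(&self, step: S) {
        if !self.done.borrow_mut().insert(step.key()) {
            return; // already ran; skip
        }
        step.run(self);
    }
}

#[derive(Debug)]
struct Std { stage: u32 }
#[derive(Debug)]
struct Doc { stage: u32 }

impl Step for Std {
    fn run(&self, builder: &Builder) {
        builder.log.borrow_mut().push(format!("build std stage{}", self.stage));
    }
}

impl Step for Doc {
    fn run(&self, builder: &Builder) {
        // Documenting std first ensures the library is built, mirroring how
        // the real Doc steps call builder.ensure(compile::Std { .. }).
        builder.ensure(Std { stage: self.stage });
        builder.log.borrow_mut().push(format!("doc std stage{}", self.stage));
    }
}

fn main() {
    let b = Builder::new();
    b.ensure(Doc { stage: 1 });
    b.ensure(Std { stage: 1 }); // no-op: already run via Doc
    assert_eq!(*b.log.borrow(), vec!["build std stage1", "doc std stage1"]);
    println!("{:?}", b.log.borrow());
}
```

The memoization is what lets many steps `ensure` the same prerequisite without rebuilding it.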
#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
pub struct Std {
    stage: u32,
    target: Interned<String>,
}

impl Step for Std {
    type Output = ();
    const DEFAULT: bool = true;

    fn should_run(run: ShouldRun) -> ShouldRun {
        let builder = run.builder;
        run.krate("std").default_condition(builder.build.config.docs)
    }

    fn make_run(run: RunConfig) {
        run.builder.ensure(Std {
            stage: run.builder.top_stage,
            target: run.target
        });
    }

    /// Compile all standard library documentation.
    ///
    /// This will generate all documentation for the standard library and its
    /// dependencies. This is largely just a wrapper around `cargo doc`.
    fn run(self, builder: &Builder) {
        let build = builder.build;
        let stage = self.stage;
        let target = self.target;
        println!("Documenting stage{} std ({})", stage, target);
        let out = build.doc_out(target);
        t!(fs::create_dir_all(&out));
        let compiler = builder.compiler(stage, build.build);
        let rustdoc = builder.rustdoc(compiler.host);
        let compiler = if build.force_use_stage1(compiler, target) {
            builder.compiler(1, compiler.host)
        } else {
            compiler
        };

        builder.ensure(compile::Std { compiler, target });
        let out_dir = build.stage_out(compiler, Mode::Libstd)
                           .join(target).join("doc");

        // Here what we're doing is creating a *symlink* (directory junction on
        // Windows) to the final output location. This is not done as an
@@ -244,10 +482,8 @@ pub fn std(build: &Build, stage: u32, target: &str) {
        build.clear_if_dirty(&my_out, &rustdoc);
        t!(symlink_dir_force(&my_out, &out_dir));

        let mut cargo = builder.cargo(compiler, Mode::Libstd, target, "doc");
        compile::std_cargo(build, &compiler, target, &mut cargo);

        // We don't want to build docs for internal std dependencies unless
        // in compiler-docs mode. When not in that mode, we whitelist the crates
@@ -266,65 +502,125 @@ pub fn std(build: &Build, stage: u32, target: &str) {
        build.run(&mut cargo);
        cp_r(&my_out, &out);
    }
}
#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
pub struct Test {
    stage: u32,
    target: Interned<String>,
}

impl Step for Test {
    type Output = ();
    const DEFAULT: bool = true;

    fn should_run(run: ShouldRun) -> ShouldRun {
        let builder = run.builder;
        run.krate("test").default_condition(builder.config.compiler_docs)
    }

    fn make_run(run: RunConfig) {
        run.builder.ensure(Test {
            stage: run.builder.top_stage,
            target: run.target,
        });
    }

    /// Compile all libtest documentation.
    ///
    /// This will generate all documentation for libtest and its dependencies. This
    /// is largely just a wrapper around `cargo doc`.
    fn run(self, builder: &Builder) {
        let build = builder.build;
        let stage = self.stage;
        let target = self.target;
        println!("Documenting stage{} test ({})", stage, target);
        let out = build.doc_out(target);
        t!(fs::create_dir_all(&out));
        let compiler = builder.compiler(stage, build.build);
        let rustdoc = builder.rustdoc(compiler.host);
        let compiler = if build.force_use_stage1(compiler, target) {
            builder.compiler(1, compiler.host)
        } else {
            compiler
        };

        // Build libstd docs so that we generate relative links
        builder.ensure(Std { stage, target });

        builder.ensure(compile::Test { compiler, target });
        let out_dir = build.stage_out(compiler, Mode::Libtest)
                           .join(target).join("doc");

        // See docs in std above for why we symlink
        let my_out = build.crate_doc_out(target);
        build.clear_if_dirty(&my_out, &rustdoc);
        t!(symlink_dir_force(&my_out, &out_dir));

        let mut cargo = builder.cargo(compiler, Mode::Libtest, target, "doc");
        compile::test_cargo(build, &compiler, target, &mut cargo);
        build.run(&mut cargo);
        cp_r(&my_out, &out);
    }
}
#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
pub struct Rustc {
    stage: u32,
    target: Interned<String>,
}

impl Step for Rustc {
    type Output = ();
    const DEFAULT: bool = true;
    const ONLY_HOSTS: bool = true;

    fn should_run(run: ShouldRun) -> ShouldRun {
        let builder = run.builder;
        run.krate("rustc-main").default_condition(builder.build.config.docs)
    }

    fn make_run(run: RunConfig) {
        run.builder.ensure(Rustc {
            stage: run.builder.top_stage,
            target: run.target,
        });
    }

    /// Generate all compiler documentation.
    ///
    /// This will generate all documentation for the compiler libraries and their
    /// dependencies. This is largely just a wrapper around `cargo doc`.
    fn run(self, builder: &Builder) {
        let build = builder.build;
        let stage = self.stage;
        let target = self.target;
        println!("Documenting stage{} compiler ({})", stage, target);
        let out = build.doc_out(target);
        t!(fs::create_dir_all(&out));
        let compiler = builder.compiler(stage, build.build);
        let rustdoc = builder.rustdoc(compiler.host);
        let compiler = if build.force_use_stage1(compiler, target) {
            builder.compiler(1, compiler.host)
        } else {
            compiler
        };

        // Build libstd docs so that we generate relative links
        builder.ensure(Std { stage, target });

        builder.ensure(compile::Rustc { compiler, target });
        let out_dir = build.stage_out(compiler, Mode::Librustc)
                           .join(target).join("doc");

        // See docs in std above for why we symlink
        let my_out = build.crate_doc_out(target);
        build.clear_if_dirty(&my_out, &rustdoc);
        t!(symlink_dir_force(&my_out, &out_dir));

        let mut cargo = builder.cargo(compiler, Mode::Librustc, target, "doc");
        compile::rustc_cargo(build, &compiler, target, &mut cargo);

        if build.config.compiler_docs {
            // src/rustc/Cargo.toml contains bin crates called rustc and rustdoc
@@ -343,16 +639,45 @@ pub fn rustc(build: &Build, stage: u32, target: &str) {
        build.run(&mut cargo);
        cp_r(&my_out, &out);
    }
}
#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
pub struct ErrorIndex {
    target: Interned<String>,
}

impl Step for ErrorIndex {
    type Output = ();
    const DEFAULT: bool = true;
    const ONLY_HOSTS: bool = true;

    fn should_run(run: ShouldRun) -> ShouldRun {
        let builder = run.builder;
        run.path("src/tools/error_index_generator").default_condition(builder.build.config.docs)
    }

    fn make_run(run: RunConfig) {
        run.builder.ensure(ErrorIndex {
            target: run.target,
        });
    }

    /// Generates the HTML rendered error-index by running the
    /// `error_index_generator` tool.
    fn run(self, builder: &Builder) {
        let build = builder.build;
        let target = self.target;

        builder.ensure(compile::Rustc {
            compiler: builder.compiler(0, build.build),
            target,
        });
        println!("Documenting error index ({})", target);
        let out = build.doc_out(target);
        t!(fs::create_dir_all(&out));
        let mut index = builder.tool_cmd(Tool::ErrorIndex);
        index.arg("html");
        index.arg(out.join("error-index.html"));
@@ -360,19 +685,49 @@ pub fn error_index(build: &Build, target: &str) {
        index.env("CFG_BUILD", &build.build);

        build.run(&mut index);
    }
}
#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
pub struct UnstableBookGen {
    target: Interned<String>,
}

impl Step for UnstableBookGen {
    type Output = ();
    const DEFAULT: bool = true;
    const ONLY_HOSTS: bool = true;

    fn should_run(run: ShouldRun) -> ShouldRun {
        let builder = run.builder;
        run.path("src/tools/unstable-book-gen").default_condition(builder.build.config.docs)
    }

    fn make_run(run: RunConfig) {
        run.builder.ensure(UnstableBookGen {
            target: run.target,
        });
    }

    fn run(self, builder: &Builder) {
        let build = builder.build;
        let target = self.target;

        builder.ensure(compile::Std {
            compiler: builder.compiler(builder.top_stage, build.build),
            target,
        });
        println!("Generating unstable book md files ({})", target);
        let out = build.md_doc_out(target).join("unstable-book");
        t!(fs::create_dir_all(&out));
        t!(fs::remove_dir_all(&out));
        let mut cmd = builder.tool_cmd(Tool::UnstableBookGen);
        cmd.arg(build.src.join("src"));
        cmd.arg(out);

        build.run(&mut cmd);
    }
}

fn symlink_dir_force(src: &Path, dst: &Path) -> io::Result<()> {

View File

@@ -23,7 +23,9 @@ use getopts::Options;
use Build;
use config::Config;
use metadata;
use builder::Builder;
use cache::{Interned, INTERNER};
/// Deserialized version of all flags for this compile.
pub struct Flags {
@@ -31,9 +33,10 @@ pub struct Flags {
    pub on_fail: Option<String>,
    pub stage: Option<u32>,
    pub keep_stage: Option<u32>,
    pub build: Option<Interned<String>>,
    pub host: Vec<Interned<String>>,
    pub target: Vec<Interned<String>>,
    pub config: Option<PathBuf>,
    pub src: PathBuf,
    pub jobs: Option<u32>,
@@ -66,6 +69,14 @@ pub enum Subcommand {
    },
}
impl Default for Subcommand {
    fn default() -> Subcommand {
        Subcommand::Build {
            paths: vec![PathBuf::from("nowhere")],
        }
    }
}
impl Flags {
    pub fn parse(args: &[String]) -> Flags {
        let mut extra_help = String::new();
@@ -241,15 +252,12 @@ Arguments:
        // All subcommands can have an optional "Available paths" section
        if matches.opt_present("verbose") {
            let config = Config::parse(&["build".to_string()]);
            let mut build = Build::new(config);
            metadata::build(&mut build);

            let maybe_rules_help = Builder::get_help(&build, subcommand.as_str());
            extra_help.push_str(maybe_rules_help.unwrap_or_default().as_str());
        } else {
            extra_help.push_str(format!("Run `./x.py {} -h -v` to see a list of available paths.",
                                        subcommand).as_str());
@@ -266,14 +274,14 @@ Arguments:
        }
        "test" => {
            Subcommand::Test {
                paths,
                test_args: matches.opt_strs("test-args"),
                fail_fast: !matches.opt_present("no-fail-fast"),
            }
        }
        "bench" => {
            Subcommand::Bench {
                paths,
                test_args: matches.opt_strs("test-args"),
            }
        }
@@ -289,12 +297,12 @@ Arguments:
        }
        "dist" => {
            Subcommand::Dist {
                paths,
            }
        }
        "install" => {
            Subcommand::Install {
                paths,
            }
        }
        _ => {
@@ -316,18 +324,18 @@ Arguments:
        Flags {
            verbose: matches.opt_count("verbose"),
            stage,
            on_fail: matches.opt_str("on-fail"),
            keep_stage: matches.opt_str("keep-stage").map(|j| j.parse().unwrap()),
            build: matches.opt_str("build").map(|s| INTERNER.intern_string(s)),
            host: split(matches.opt_strs("host"))
                .into_iter().map(|x| INTERNER.intern_string(x)).collect::<Vec<_>>(),
            target: split(matches.opt_strs("target"))
                .into_iter().map(|x| INTERNER.intern_string(x)).collect::<Vec<_>>(),
            config: cfg_file,
            src,
            jobs: matches.opt_str("jobs").map(|j| j.parse().unwrap()),
            cmd,
            incremental: matches.opt_present("incremental"),
        }
    }
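The `INTERNER.intern_string` calls above come from the new `cache` module: target triples and host names are interned once, so the many step structs can store and compare them cheaply. A sketch of the underlying idea (a hypothetical simplified interner, not the real `cache::Interned` type, which also supports hashing and ordering):

```rust
use std::collections::HashMap;
use std::sync::Mutex;

// Leak each unique string once and hand out a 'static reference; duplicate
// strings then share one allocation and can be compared by pointer.
struct Interner {
    map: Mutex<HashMap<String, &'static str>>,
}

impl Interner {
    fn new() -> Interner {
        Interner { map: Mutex::new(HashMap::new()) }
    }

    fn intern(&self, s: &str) -> &'static str {
        let mut map = self.map.lock().unwrap();
        if let Some(&interned) = map.get(s) {
            return interned; // already interned: reuse the same allocation
        }
        let leaked: &'static str = Box::leak(s.to_string().into_boxed_str());
        map.insert(s.to_string(), leaked);
        leaked
    }
}

fn main() {
    let interner = Interner::new();
    let a = interner.intern("x86_64-unknown-linux-gnu");
    let b = interner.intern("x86_64-unknown-linux-gnu");
    // Same allocation: pointer equality holds for interned duplicates.
    assert!(std::ptr::eq(a, b));
    println!("interned: {}", a);
}
```

Because every copy is a pointer-sized `'static` reference, step structs can derive `Copy` while still holding "strings".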

View File

@@ -18,28 +18,50 @@ use std::fs;
use std::path::{Path, PathBuf, Component};
use std::process::Command;

use dist::{self, pkgname, sanitize_sh, tmpdir};
use builder::{Builder, RunConfig, ShouldRun, Step};
use cache::Interned;

pub fn install_docs(builder: &Builder, stage: u32, host: Interned<String>) {
    install_sh(builder, "docs", "rust-docs", stage, Some(host));
}
pub fn install_std(builder: &Builder, stage: u32) {
    for target in &builder.build.targets {
        install_sh(builder, "std", "rust-std", stage, Some(*target));
    }
}

pub fn install_cargo(builder: &Builder, stage: u32, host: Interned<String>) {
    install_sh(builder, "cargo", "cargo", stage, Some(host));
}

pub fn install_rls(builder: &Builder, stage: u32, host: Interned<String>) {
    install_sh(builder, "rls", "rls", stage, Some(host));
}

pub fn install_analysis(builder: &Builder, stage: u32, host: Interned<String>) {
    install_sh(builder, "analysis", "rust-analysis", stage, Some(host));
}

pub fn install_src(builder: &Builder, stage: u32) {
    install_sh(builder, "src", "rust-src", stage, None);
}

pub fn install_rustc(builder: &Builder, stage: u32, host: Interned<String>) {
    install_sh(builder, "rustc", "rustc", stage, Some(host));
}

fn install_sh(
    builder: &Builder,
    package: &str,
    name: &str,
    stage: u32,
    host: Option<Interned<String>>
) {
    let build = builder.build;
    println!("Install {} stage{} ({:?})", package, stage, host);
    let prefix_default = PathBuf::from("/usr/local");
    let sysconfdir_default = PathBuf::from("/etc");
    let docdir_default = PathBuf::from("share/doc/rust");
@@ -71,68 +93,24 @@ impl<'a> Installer<'a> {
    let empty_dir = build.out.join("tmp/empty_dir");

    t!(fs::create_dir_all(&empty_dir));
    let package_name = if let Some(host) = host {
        format!("{}-{}", pkgname(build, name), host)
    } else {
        pkgname(build, name)
    };

    let mut cmd = Command::new("sh");
    cmd.current_dir(&empty_dir)
       .arg(sanitize_sh(&tmpdir(build).join(&package_name).join("install.sh")))
       .arg(format!("--prefix={}", sanitize_sh(&prefix)))
       .arg(format!("--sysconfdir={}", sanitize_sh(&sysconfdir)))
       .arg(format!("--docdir={}", sanitize_sh(&docdir)))
       .arg(format!("--bindir={}", sanitize_sh(&bindir)))
       .arg(format!("--libdir={}", sanitize_sh(&libdir)))
       .arg(format!("--mandir={}", sanitize_sh(&mandir)))
       .arg("--disable-ldconfig");
    build.run(&mut cmd);
    t!(fs::remove_dir_all(&empty_dir));
}
fn add_destdir(path: &Path, destdir: &Option<PathBuf>) -> PathBuf {
@@ -148,3 +126,84 @@ fn add_destdir(path: &Path, destdir: &Option<PathBuf>) -> PathBuf {
    }
    ret
}
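`add_destdir` above prepends an optional DESTDIR-style staging root to an absolute install path. A self-contained sketch of the same technique (the `with_destdir` helper is illustrative and takes `Option<&Path>` rather than the real function's `&Option<PathBuf>`):

```rust
use std::path::{Component, Path, PathBuf};

// Prepend a staging root to an absolute install path by keeping only the
// path's normal components, so the result always stays under `destdir`.
fn with_destdir(path: &Path, destdir: Option<&Path>) -> PathBuf {
    let destdir = match destdir {
        Some(d) => d,
        None => return path.to_path_buf(), // no staging root: install in place
    };
    let mut ret = destdir.to_path_buf();
    for part in path.components() {
        match part {
            Component::Normal(s) => ret.push(s),
            // Drop RootDir (and, on Windows, Prefix) so `/usr/bin` does not
            // replace `destdir` when pushed onto it.
            _ => {}
        }
    }
    ret
}

fn main() {
    let staged = with_destdir(Path::new("/usr/local/bin"), Some(Path::new("/tmp/stage")));
    assert_eq!(staged, PathBuf::from("/tmp/stage/usr/local/bin"));
    println!("{}", staged.display());
}
```

Skipping the root component is the crux: `PathBuf::push` with an absolute path would otherwise discard the staging prefix entirely.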
macro_rules! install {
    (($sel:ident, $builder:ident, $_config:ident),
        $($name:ident,
        $path:expr,
        $default_cond:expr,
        only_hosts: $only_hosts:expr,
        $run_item:block $(, $c:ident)*;)+) => {
        $(
        #[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
        pub struct $name {
            pub stage: u32,
            pub target: Interned<String>,
            pub host: Interned<String>,
        }

        impl Step for $name {
            type Output = ();
            const DEFAULT: bool = true;
            const ONLY_BUILD_TARGETS: bool = true;
            const ONLY_HOSTS: bool = $only_hosts;
            $(const $c: bool = true;)*

            fn should_run(run: ShouldRun) -> ShouldRun {
                let $_config = &run.builder.config;
                run.path($path).default_condition($default_cond)
            }

            fn make_run(run: RunConfig) {
                run.builder.ensure($name {
                    stage: run.builder.top_stage,
                    target: run.target,
                    host: run.host,
                });
            }

            fn run($sel, $builder: &Builder) {
                $run_item
            }
        })+
    }
}
install!((self, builder, _config),
    Docs, "src/doc", _config.docs, only_hosts: false, {
        builder.ensure(dist::Docs { stage: self.stage, host: self.target });
        install_docs(builder, self.stage, self.target);
    };
    Std, "src/libstd", true, only_hosts: true, {
        builder.ensure(dist::Std {
            compiler: builder.compiler(self.stage, self.host),
            target: self.target
        });
        install_std(builder, self.stage);
    };
    Cargo, "cargo", _config.extended, only_hosts: true, {
        builder.ensure(dist::Cargo { stage: self.stage, target: self.target });
        install_cargo(builder, self.stage, self.target);
    };
    Rls, "rls", _config.extended, only_hosts: true, {
        builder.ensure(dist::Rls { stage: self.stage, target: self.target });
        install_rls(builder, self.stage, self.target);
    };
    Analysis, "analysis", _config.extended, only_hosts: false, {
        builder.ensure(dist::Analysis {
            compiler: builder.compiler(self.stage, self.host),
            target: self.target
        });
        install_analysis(builder, self.stage, self.target);
    };
    Src, "src", _config.extended, only_hosts: true, {
        builder.ensure(dist::Src);
        install_src(builder, self.stage);
    }, ONLY_BUILD;
    Rustc, "src/librustc", true, only_hosts: true, {
        builder.ensure(dist::Rustc {
            compiler: builder.compiler(self.stage, self.target),
        });
        install_rustc(builder, self.stage, self.target);
    };
);
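The `install!` macro invocation above stamps out one `Step` type per installable component. A stripped-down illustration of the same macros-by-example technique, generating a struct and an impl per listed arm (all names here are invented for the example):

```rust
// Generate a unit struct with a `name()` method for each listed component,
// mirroring how install! generates a Step struct per distfile.
macro_rules! components {
    ($($ty:ident => $label:expr;)+) => {
        $(
            #[derive(Debug, Copy, Clone, PartialEq)]
            struct $ty;

            impl $ty {
                fn name(&self) -> &'static str { $label }
            }
        )+
    }
}

components!(
    Docs => "rust-docs";
    Std => "rust-std";
    Cargo => "cargo";
);

fn main() {
    // Each invocation arm produced a distinct type with its own impl.
    assert_eq!(Docs.name(), "rust-docs");
    assert_eq!(Std.name(), "rust-std");
    assert_eq!(Cargo.name(), "cargo");
    println!("{} {} {}", Docs.name(), Std.name(), Cargo.name());
}
```

The `$( ... )+` repetition is what lets one declaration site expand into many near-identical items without copy-paste drift.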

View File

@@ -23,38 +23,87 @@
//!
//! ## Architecture
//!
//! The build system defers most of the complicated logic managing invocations
//! of rustc and rustdoc to Cargo itself. However, moving through various stages
//! and copying artifacts is still necessary for it to do. Each time rustbuild
//! is invoked, it will iterate through the list of predefined steps and execute
//! each serially in turn if it matches the paths passed or is a default rule.
//! For each step rustbuild relies on the step internally being incremental and
//! parallel. Note, though, that the `-j` parameter to rustbuild gets forwarded
//! to appropriate test harnesses and such.
//!
//! Most of the "meaty" steps that matter are backed by Cargo, which does indeed
//! have its own parallelism and incremental management. Later steps, like
//! tests, aren't incremental and simply run the entire suite currently.
//! However, compiletest itself tries to avoid running tests when the artifacts
//! that are involved (mainly the compiler) haven't changed.
//!
//! When you execute `x.py build`, the steps which are executed are:
//!
//! * First, the python script is run. This will automatically download the
//!   stage0 rustc and cargo according to `src/stage0.txt`, or use the cached
//!   versions if they're available. These are then used to compile rustbuild
//!   itself (using Cargo). Finally, control is then transferred to rustbuild.
//!
//! * Rustbuild takes over, performs sanity checks, probes the environment,
//!   reads configuration, and starts executing steps as it reads the command
//!   line arguments (paths) or going through the default rules.
//!
//! The build output will be something like the following:
//!
//!     Building stage0 std artifacts
//!     Copying stage0 std
//!     Building stage0 test artifacts
//!     Copying stage0 test
//!     Building stage0 compiler artifacts
//!     Copying stage0 rustc
//!     Assembling stage1 compiler
//!     Building stage1 std artifacts
//!     Copying stage1 std
//!     Building stage1 test artifacts
//!     Copying stage1 test
//!     Building stage1 compiler artifacts
//!     Copying stage1 rustc
//!     Assembling stage2 compiler
//!     Uplifting stage1 std
//!     Uplifting stage1 test
//!     Uplifting stage1 rustc
//!
//! Let's dissect that a little:
//!
//! ## Building stage0 {std,test,compiler} artifacts
//!
//! These steps use the provided (downloaded, usually) compiler to compile the
//! local Rust source into libraries we can use.
//!
//! ## Copying stage0 {std,test,rustc}
//!
//! This copies the build output from Cargo into
//! `build/$HOST/stage0-sysroot/lib/rustlib/$ARCH/lib`. FIXME: This step's
//! documentation should be expanded -- the information already here may be
//! incorrect.
//!
//! ## Assembling stage1 compiler
//!
//! This copies the libraries we built in "building stage0 ... artifacts" into
//! the stage1 compiler's lib directory. These are the host libraries that the
//! compiler itself uses to run. These aren't actually used by artifacts the new
//! compiler generates. This step also copies the rustc and rustdoc binaries we
//! generated into build/$HOST/stage/bin.
//!
//! The stage1/bin/rustc is a fully functional compiler, but it doesn't yet have
//! any libraries to link built binaries or libraries to. The next 3 steps will
//! provide those libraries for it; they are mostly equivalent to constructing
//! the stage1/bin compiler so we don't go through them individually.
//!
//! ## Uplifting stage1 {std,test,rustc}
//!
//! This step copies the libraries from the stage1 compiler sysroot into the
//! stage2 compiler. This is done to avoid rebuilding the compiler; libraries
//! we'd build in this step should be identical (in function, if not necessarily
//! identical on disk) so there's no need to recompile the compiler again. Note
//! that if you want to, you can enable the full-bootstrap option to change this
//! behavior.
//!
//! Each step is driven by a separate Cargo project and rustbuild orchestrates
//! copying files between steps and otherwise preparing for Cargo to run.
@@ -65,33 +114,38 @@
//! also check out the `src/bootstrap/README.md` file for more information.
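The staged pipeline the module docs describe boils down to: each stage N compiler builds the stage N libraries, which are assembled into the stage N+1 compiler, and the final stage uplifts the previous stage's libraries rather than rebuilding them. A toy model of that sequence (purely illustrative, not rustbuild code):

```rust
// Reproduce the "Building/Assembling/Uplifting" log sequence the docs show.
fn bootstrap(stages: u32) -> Vec<String> {
    let mut log = Vec::new();
    for stage in 0..stages {
        for lib in &["std", "test", "compiler"] {
            log.push(format!("Building stage{} {} artifacts", stage, lib));
        }
        log.push(format!("Assembling stage{} compiler", stage + 1));
    }
    // The final compiler reuses ("uplifts") the previous stage's libraries
    // instead of rebuilding them, unless full-bootstrap is enabled.
    for lib in &["std", "test", "rustc"] {
        log.push(format!("Uplifting stage{} {}", stages - 1, lib));
    }
    log
}

fn main() {
    let log = bootstrap(2);
    assert_eq!(log[0], "Building stage0 std artifacts");
    assert_eq!(log[3], "Assembling stage1 compiler");
    assert_eq!(log.last().unwrap(), "Uplifting stage1 rustc");
    println!("{:#?}", log);
}
```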
#![deny(warnings)]
#![allow(stable_features)]
#![feature(associated_consts)]

#[macro_use]
extern crate build_helper;
#[macro_use]
extern crate serde_derive;
#[macro_use]
extern crate lazy_static;
extern crate serde_json;
extern crate cmake;
extern crate filetime;
extern crate gcc;
extern crate getopts;
extern crate num_cpus;
extern crate toml;

#[cfg(unix)]
extern crate libc;
use std::cell::Cell;
use std::collections::{HashSet, HashMap};
use std::env;
use std::ffi::OsString;
use std::fs::{self, File};
use std::io::Read;
use std::path::{PathBuf, Path};
use std::process::Command;

use build_helper::{run_silent, run_suppressed, try_run_silent, try_run_suppressed, output, mtime};

use util::{exe, libdir, OutputFolder, CiEnv};
mod cc;
mod channel;
@@ -106,8 +160,10 @@ mod flags;
mod install;
mod native;
mod sanity;
pub mod util;
mod builder;
mod cache;
mod tool;

#[cfg(windows)]
mod job;
@@ -130,7 +186,8 @@ mod job {
} }
pub use config::Config; pub use config::Config;
pub use flags::{Flags, Subcommand}; use flags::Subcommand;
use cache::{Interned, INTERNER};
/// A structure representing a Rust compiler. /// A structure representing a Rust compiler.
/// ///
@@ -138,9 +195,9 @@ pub use flags::{Flags, Subcommand};
/// corresponds to the platform the compiler runs on. This structure is used as /// corresponds to the platform the compiler runs on. This structure is used as
/// a parameter to many methods below. /// a parameter to many methods below.
#[derive(Eq, PartialEq, Clone, Copy, Hash, Debug)] #[derive(Eq, PartialEq, Clone, Copy, Hash, Debug)]
pub struct Compiler<'a> { pub struct Compiler {
stage: u32, stage: u32,
host: &'a str, host: Interned<String>,
} }
/// Global configuration for the build system. /// Global configuration for the build system.
@@ -157,9 +214,6 @@ pub struct Build {
// User-specified configuration via config.toml // User-specified configuration via config.toml
config: Config, config: Config,
// User-specified configuration via CLI flags
flags: Flags,
// Derived properties from the above two configurations // Derived properties from the above two configurations
src: PathBuf, src: PathBuf,
out: PathBuf, out: PathBuf,
@@ -171,9 +225,9 @@ pub struct Build {
verbosity: usize, verbosity: usize,
// Targets for which to build. // Targets for which to build.
build: String, build: Interned<String>,
hosts: Vec<String>, hosts: Vec<Interned<String>>,
targets: Vec<String>, targets: Vec<Interned<String>>,
// Stage 0 (downloaded) compiler and cargo or their local rust equivalents. // Stage 0 (downloaded) compiler and cargo or their local rust equivalents.
initial_rustc: PathBuf, initial_rustc: PathBuf,
@@ -185,10 +239,10 @@ pub struct Build {
// Runtime state filled in later on // Runtime state filled in later on
// target -> (cc, ar) // target -> (cc, ar)
cc: HashMap<String, (gcc::Tool, Option<PathBuf>)>, cc: HashMap<Interned<String>, (gcc::Tool, Option<PathBuf>)>,
// host -> (cc, ar) // host -> (cc, ar)
cxx: HashMap<String, gcc::Tool>, cxx: HashMap<Interned<String>, gcc::Tool>,
crates: HashMap<String, Crate>, crates: HashMap<Interned<String>, Crate>,
is_sudo: bool, is_sudo: bool,
ci_env: CiEnv, ci_env: CiEnv,
delayed_failures: Cell<usize>, delayed_failures: Cell<usize>,
@@ -196,9 +250,9 @@ pub struct Build {
#[derive(Debug)] #[derive(Debug)]
struct Crate { struct Crate {
name: String, name: Interned<String>,
version: String, version: String,
deps: Vec<String>, deps: Vec<Interned<String>>,
path: PathBuf, path: PathBuf,
doc_step: String, doc_step: String,
build_step: String, build_step: String,
@@ -210,7 +264,7 @@ struct Crate {
/// ///
/// These entries currently correspond to the various output directories of the /// These entries currently correspond to the various output directories of the
/// build system, with each mod generating output in a different directory. /// build system, with each mod generating output in a different directory.
#[derive(Clone, Copy, PartialEq, Eq)] #[derive(Debug, Hash, Clone, Copy, PartialEq, Eq)]
pub enum Mode { pub enum Mode {
/// Build the standard library, placing output in the "stageN-std" directory. /// Build the standard library, placing output in the "stageN-std" directory.
Libstd, Libstd,
@@ -230,9 +284,9 @@ impl Build {
/// line and the filesystem `config`. /// line and the filesystem `config`.
/// ///
/// By default all build output will be placed in the current directory. /// By default all build output will be placed in the current directory.
pub fn new(flags: Flags, config: Config) -> Build { pub fn new(config: Config) -> Build {
let cwd = t!(env::current_dir()); let cwd = t!(env::current_dir());
let src = flags.src.clone(); let src = config.src.clone();
let out = cwd.join("build"); let out = cwd.join("build");
let is_sudo = match env::var_os("SUDO_USER") { let is_sudo = match env::var_os("SUDO_USER") {
@@ -244,64 +298,42 @@ impl Build {
} }
None => false, None => false,
}; };
let rust_info = channel::GitInfo::new(&src); let rust_info = channel::GitInfo::new(&config, &src);
let cargo_info = channel::GitInfo::new(&src.join("src/tools/cargo")); let cargo_info = channel::GitInfo::new(&config, &src.join("src/tools/cargo"));
let rls_info = channel::GitInfo::new(&src.join("src/tools/rls")); let rls_info = channel::GitInfo::new(&config, &src.join("src/tools/rls"));
let hosts = if !flags.host.is_empty() {
for host in flags.host.iter() {
if !config.host.contains(host) {
panic!("specified host `{}` is not in configuration", host);
}
}
flags.host.clone()
} else {
config.host.clone()
};
let targets = if !flags.target.is_empty() {
for target in flags.target.iter() {
if !config.target.contains(target) {
panic!("specified target `{}` is not in configuration", target);
}
}
flags.target.clone()
} else {
config.target.clone()
};
Build { Build {
initial_rustc: config.initial_rustc.clone(), initial_rustc: config.initial_rustc.clone(),
initial_cargo: config.initial_cargo.clone(), initial_cargo: config.initial_cargo.clone(),
local_rebuild: config.local_rebuild, local_rebuild: config.local_rebuild,
fail_fast: flags.cmd.fail_fast(), fail_fast: config.cmd.fail_fast(),
verbosity: cmp::max(flags.verbose, config.verbose), verbosity: config.verbose,
build: config.host[0].clone(), build: config.build,
hosts: hosts, hosts: config.hosts.clone(),
targets: targets, targets: config.targets.clone(),
flags: flags, config,
config: config, src,
src: src, out,
out: out,
rust_info: rust_info, rust_info,
cargo_info: cargo_info, cargo_info,
rls_info: rls_info, rls_info,
cc: HashMap::new(), cc: HashMap::new(),
cxx: HashMap::new(), cxx: HashMap::new(),
crates: HashMap::new(), crates: HashMap::new(),
lldb_version: None, lldb_version: None,
lldb_python_dir: None, lldb_python_dir: None,
is_sudo: is_sudo, is_sudo,
ci_env: CiEnv::current(), ci_env: CiEnv::current(),
delayed_failures: Cell::new(0), delayed_failures: Cell::new(0),
} }
} }
fn build_slice(&self) -> &[String] { pub fn build_triple(&self) -> &[Interned<String>] {
unsafe { unsafe {
std::slice::from_raw_parts(&self.build, 1) slice::from_raw_parts(&self.build, 1)
} }
} }
@@ -311,7 +343,7 @@ impl Build {
job::setup(self); job::setup(self);
} }
if let Subcommand::Clean = self.flags.cmd { if let Subcommand::Clean = self.config.cmd {
return clean::clean(self); return clean::clean(self);
} }
@@ -333,7 +365,7 @@ impl Build {
self.verbose("learning about cargo"); self.verbose("learning about cargo");
metadata::build(self); metadata::build(self);
step::run(self); builder::Builder::run(&self);
} }
/// Clear out `dir` if `input` is newer. /// Clear out `dir` if `input` is newer.
@@ -351,242 +383,6 @@ impl Build {
t!(File::create(stamp)); t!(File::create(stamp));
} }
/// Prepares an invocation of `cargo` to be run.
///
/// This will create a `Command` that represents a pending execution of
/// Cargo. This cargo will be configured to use `compiler` as the actual
/// rustc compiler, its output will be scoped by `mode`'s output directory,
/// it will pass the `--target` flag for the specified `target`, and will be
/// executing the Cargo command `cmd`.
fn cargo(&self,
compiler: &Compiler,
mode: Mode,
target: &str,
cmd: &str) -> Command {
let mut cargo = Command::new(&self.initial_cargo);
let out_dir = self.stage_out(compiler, mode);
cargo.env("CARGO_TARGET_DIR", out_dir)
.arg(cmd)
.arg("-j").arg(self.jobs().to_string())
.arg("--target").arg(target);
// FIXME: Temporary fix for https://github.com/rust-lang/cargo/issues/3005
// Force cargo to output binaries with disambiguating hashes in the name
cargo.env("__CARGO_DEFAULT_LIB_METADATA", &self.config.channel);
let stage;
if compiler.stage == 0 && self.local_rebuild {
// Assume the local-rebuild rustc already has stage1 features.
stage = 1;
} else {
stage = compiler.stage;
}
// Customize the compiler we're running. Specify the compiler to cargo
// as our shim and then pass it some various options used to configure
// how the actual compiler itself is called.
//
// These variables are primarily all read by
// src/bootstrap/bin/{rustc.rs,rustdoc.rs}
cargo.env("RUSTBUILD_NATIVE_DIR", self.native_dir(target))
.env("RUSTC", self.out.join("bootstrap/debug/rustc"))
.env("RUSTC_REAL", self.compiler_path(compiler))
.env("RUSTC_STAGE", stage.to_string())
.env("RUSTC_CODEGEN_UNITS",
self.config.rust_codegen_units.to_string())
.env("RUSTC_DEBUG_ASSERTIONS",
self.config.rust_debug_assertions.to_string())
.env("RUSTC_SYSROOT", self.sysroot(compiler))
.env("RUSTC_LIBDIR", self.rustc_libdir(compiler))
.env("RUSTC_RPATH", self.config.rust_rpath.to_string())
.env("RUSTDOC", self.out.join("bootstrap/debug/rustdoc"))
.env("RUSTDOC_REAL", self.rustdoc(compiler))
.env("RUSTC_FLAGS", self.rustc_flags(target).join(" "));
if mode != Mode::Tool {
// Tools don't get debuginfo right now, e.g. cargo and rls don't
// get compiled with debuginfo.
cargo.env("RUSTC_DEBUGINFO", self.config.rust_debuginfo.to_string())
.env("RUSTC_DEBUGINFO_LINES", self.config.rust_debuginfo_lines.to_string())
.env("RUSTC_FORCE_UNSTABLE", "1");
// Currently the compiler depends on crates from crates.io, and
// then other crates can depend on the compiler (e.g. proc-macro
// crates). Let's say, for example that rustc itself depends on the
// bitflags crate. If an external crate then depends on the
// bitflags crate as well, we need to make sure they don't
// conflict, even if they pick the same version of bitflags. We'll
// want to make sure that e.g. a plugin and rustc each get their
// own copy of bitflags.
// Cargo ensures that this works in general through the -C metadata
// flag. This flag will frob the symbols in the binary to make sure
// they're different, even though the source code is the exact
// same. To solve this problem for the compiler we extend Cargo's
// already-passed -C metadata flag with our own. Our rustc.rs
// wrapper around the actual rustc will detect -C metadata being
// passed and frob it with this extra string we're passing in.
cargo.env("RUSTC_METADATA_SUFFIX", "rustc");
}
// Enable usage of unstable features
cargo.env("RUSTC_BOOTSTRAP", "1");
self.add_rust_test_threads(&mut cargo);
// Almost all of the crates that we compile as part of the bootstrap may
// have a build script, including the standard library. To compile a
// build script, however, it itself needs a standard library! This
// introduces a bit of a pickle when we're compiling the standard
// library itself.
//
// To work around this we actually end up using the snapshot compiler
// (stage0) for compiling build scripts of the standard library itself.
// The stage0 compiler is guaranteed to have a libstd available for use.
//
// For other crates, however, we know that we've already got a standard
// library up and running, so we can use the normal compiler to compile
// build scripts in that situation.
if mode == Mode::Libstd {
cargo.env("RUSTC_SNAPSHOT", &self.initial_rustc)
.env("RUSTC_SNAPSHOT_LIBDIR", self.rustc_snapshot_libdir());
} else {
cargo.env("RUSTC_SNAPSHOT", self.compiler_path(compiler))
.env("RUSTC_SNAPSHOT_LIBDIR", self.rustc_libdir(compiler));
}
// Ignore incremental modes except for stage0, since we're
// not guaranteeing correctness across builds if the compiler
// is changing under your feet.
if self.flags.incremental && compiler.stage == 0 {
let incr_dir = self.incremental_dir(compiler);
cargo.env("RUSTC_INCREMENTAL", incr_dir);
}
if let Some(ref on_fail) = self.flags.on_fail {
cargo.env("RUSTC_ON_FAIL", on_fail);
}
cargo.env("RUSTC_VERBOSE", format!("{}", self.verbosity));
// Specify some various options for build scripts used throughout
// the build.
//
// FIXME: the guard against msvc shouldn't need to be here
if !target.contains("msvc") {
cargo.env(format!("CC_{}", target), self.cc(target))
.env(format!("AR_{}", target), self.ar(target).unwrap()) // only msvc is None
.env(format!("CFLAGS_{}", target), self.cflags(target).join(" "));
if let Ok(cxx) = self.cxx(target) {
cargo.env(format!("CXX_{}", target), cxx);
}
}
if mode == Mode::Libstd &&
self.config.extended &&
compiler.is_final_stage(self) {
cargo.env("RUSTC_SAVE_ANALYSIS", "api".to_string());
}
// When being built Cargo will at some point call `nmake.exe` on Windows
// MSVC. Unfortunately `nmake` will read these two environment variables
// below and try to interpret them. We're likely being run, however, from
// MSYS `make` which uses the same variables.
//
// As a result, to prevent confusion and errors, we remove these
// variables from our environment to prevent passing MSYS make flags to
// nmake, causing it to blow up.
if cfg!(target_env = "msvc") {
cargo.env_remove("MAKE");
cargo.env_remove("MAKEFLAGS");
}
// Environment variables *required* throughout the build
//
// FIXME: should update code to not require this env var
cargo.env("CFG_COMPILER_HOST_TRIPLE", target);
if self.is_verbose() {
cargo.arg("-v");
}
// FIXME: cargo bench does not accept `--release`
if self.config.rust_optimize && cmd != "bench" {
cargo.arg("--release");
}
if self.config.locked_deps {
cargo.arg("--locked");
}
if self.config.vendor || self.is_sudo {
cargo.arg("--frozen");
}
self.ci_env.force_coloring_in_ci(&mut cargo);
cargo
}
/// Get a path to the compiler specified.
fn compiler_path(&self, compiler: &Compiler) -> PathBuf {
if compiler.is_snapshot(self) {
self.initial_rustc.clone()
} else {
self.sysroot(compiler).join("bin").join(exe("rustc", compiler.host))
}
}
/// Get the specified tool built by the specified compiler
fn tool(&self, compiler: &Compiler, tool: &str) -> PathBuf {
self.cargo_out(compiler, Mode::Tool, compiler.host)
.join(exe(tool, compiler.host))
}
/// Get the `rustdoc` executable next to the specified compiler
fn rustdoc(&self, compiler: &Compiler) -> PathBuf {
let mut rustdoc = self.compiler_path(compiler);
rustdoc.pop();
rustdoc.push(exe("rustdoc", compiler.host));
rustdoc
}
/// Get a `Command` which is ready to run `tool` in `stage` built for
/// `host`.
fn tool_cmd(&self, compiler: &Compiler, tool: &str) -> Command {
let mut cmd = Command::new(self.tool(&compiler, tool));
self.prepare_tool_cmd(compiler, &mut cmd);
cmd
}
/// Prepares the `cmd` provided to be able to run the `compiler` provided.
///
/// Notably this munges the dynamic library lookup path to point to the
/// right location to run `compiler`.
fn prepare_tool_cmd(&self, compiler: &Compiler, cmd: &mut Command) {
let host = compiler.host;
let mut paths = vec![
self.sysroot_libdir(compiler, compiler.host),
self.cargo_out(compiler, Mode::Tool, host).join("deps"),
];
// On MSVC a tool may invoke a C compiler (e.g. compiletest in run-make
// mode) and that C compiler may need some extra PATH modification. Do
// so here.
if compiler.host.contains("msvc") {
let curpaths = env::var_os("PATH").unwrap_or(OsString::new());
let curpaths = env::split_paths(&curpaths).collect::<Vec<_>>();
for &(ref k, ref v) in self.cc[compiler.host].0.env() {
if k != "PATH" {
continue
}
for path in env::split_paths(v) {
if !curpaths.contains(&path) {
paths.push(path);
}
}
}
}
add_lib_path(paths, cmd);
}
/// Get the space-separated set of activated features for the standard /// Get the space-separated set of activated features for the standard
/// library. /// library.
fn std_features(&self) -> String { fn std_features(&self) -> String {
@@ -613,6 +409,9 @@ impl Build {
if self.config.use_jemalloc { if self.config.use_jemalloc {
features.push_str(" jemalloc"); features.push_str(" jemalloc");
} }
if self.config.llvm_enabled {
features.push_str(" llvm");
}
features features
} }
@@ -622,94 +421,67 @@ impl Build {
if self.config.rust_optimize {"release"} else {"debug"} if self.config.rust_optimize {"release"} else {"debug"}
} }
/// Returns the sysroot for the `compiler` specified that *this build system
/// generates*.
///
/// That is, the sysroot for the stage0 compiler is not what the compiler
/// thinks it is by default, but it's the same as the default for stages
/// 1-3.
fn sysroot(&self, compiler: &Compiler) -> PathBuf {
if compiler.stage == 0 {
self.out.join(compiler.host).join("stage0-sysroot")
} else {
self.out.join(compiler.host).join(format!("stage{}", compiler.stage))
}
}
/// Get the directory for incremental by-products when using the /// Get the directory for incremental by-products when using the
/// given compiler. /// given compiler.
fn incremental_dir(&self, compiler: &Compiler) -> PathBuf { fn incremental_dir(&self, compiler: Compiler) -> PathBuf {
self.out.join(compiler.host).join(format!("stage{}-incremental", compiler.stage)) self.out.join(&*compiler.host).join(format!("stage{}-incremental", compiler.stage))
}
/// Returns the libdir where the standard library and other artifacts are
/// found for a compiler's sysroot.
fn sysroot_libdir(&self, compiler: &Compiler, target: &str) -> PathBuf {
if compiler.stage >= 2 {
if let Some(ref libdir_relative) = self.config.libdir_relative {
return self.sysroot(compiler).join(libdir_relative)
.join("rustlib").join(target).join("lib")
}
}
self.sysroot(compiler).join("lib").join("rustlib")
.join(target).join("lib")
} }
/// Returns the root directory for all output generated in a particular /// Returns the root directory for all output generated in a particular
/// stage when running with a particular host compiler. /// stage when running with a particular host compiler.
/// ///
/// The mode indicates what the root directory is for. /// The mode indicates what the root directory is for.
fn stage_out(&self, compiler: &Compiler, mode: Mode) -> PathBuf { fn stage_out(&self, compiler: Compiler, mode: Mode) -> PathBuf {
let suffix = match mode { let suffix = match mode {
Mode::Libstd => "-std", Mode::Libstd => "-std",
Mode::Libtest => "-test", Mode::Libtest => "-test",
Mode::Tool => "-tools", Mode::Tool => "-tools",
Mode::Librustc => "-rustc", Mode::Librustc => "-rustc",
}; };
self.out.join(compiler.host) self.out.join(&*compiler.host)
.join(format!("stage{}{}", compiler.stage, suffix)) .join(format!("stage{}{}", compiler.stage, suffix))
} }
/// Returns the root output directory for all Cargo output in a given stage, /// Returns the root output directory for all Cargo output in a given stage,
/// running a particular compiler, wehther or not we're building the /// running a particular compiler, whether or not we're building the
/// standard library, and targeting the specified architecture. /// standard library, and targeting the specified architecture.
fn cargo_out(&self, fn cargo_out(&self,
compiler: &Compiler, compiler: Compiler,
mode: Mode, mode: Mode,
target: &str) -> PathBuf { target: Interned<String>) -> PathBuf {
self.stage_out(compiler, mode).join(target).join(self.cargo_dir()) self.stage_out(compiler, mode).join(&*target).join(self.cargo_dir())
} }
/// Root output directory for LLVM compiled for `target` /// Root output directory for LLVM compiled for `target`
/// ///
/// Note that if LLVM is configured externally then the directory returned /// Note that if LLVM is configured externally then the directory returned
/// will likely be empty. /// will likely be empty.
fn llvm_out(&self, target: &str) -> PathBuf { fn llvm_out(&self, target: Interned<String>) -> PathBuf {
self.out.join(target).join("llvm") self.out.join(&*target).join("llvm")
} }
/// Output directory for all documentation for a target /// Output directory for all documentation for a target
fn doc_out(&self, target: &str) -> PathBuf { fn doc_out(&self, target: Interned<String>) -> PathBuf {
self.out.join(target).join("doc") self.out.join(&*target).join("doc")
} }
/// Output directory for some generated md crate documentation for a target (temporary) /// Output directory for some generated md crate documentation for a target (temporary)
fn md_doc_out(&self, target: &str) -> PathBuf { fn md_doc_out(&self, target: Interned<String>) -> Interned<PathBuf> {
self.out.join(target).join("md-doc") INTERNER.intern_path(self.out.join(&*target).join("md-doc"))
} }
/// Output directory for all crate documentation for a target (temporary) /// Output directory for all crate documentation for a target (temporary)
/// ///
/// The artifacts here are then copied into `doc_out` above. /// The artifacts here are then copied into `doc_out` above.
fn crate_doc_out(&self, target: &str) -> PathBuf { fn crate_doc_out(&self, target: Interned<String>) -> PathBuf {
self.out.join(target).join("crate-docs") self.out.join(&*target).join("crate-docs")
} }
/// Returns true if no custom `llvm-config` is set for the specified target. /// Returns true if no custom `llvm-config` is set for the specified target.
/// ///
/// If no custom `llvm-config` was specified then Rust's llvm will be used. /// If no custom `llvm-config` was specified then Rust's llvm will be used.
fn is_rust_llvm(&self, target: &str) -> bool { fn is_rust_llvm(&self, target: Interned<String>) -> bool {
match self.config.target_config.get(target) { match self.config.target_config.get(&target) {
Some(ref c) => c.llvm_config.is_none(), Some(ref c) => c.llvm_config.is_none(),
None => true None => true
} }
@@ -719,25 +491,25 @@ impl Build {
/// ///
/// If a custom `llvm-config` was specified for target then that's returned /// If a custom `llvm-config` was specified for target then that's returned
/// instead. /// instead.
fn llvm_config(&self, target: &str) -> PathBuf { fn llvm_config(&self, target: Interned<String>) -> PathBuf {
let target_config = self.config.target_config.get(target); let target_config = self.config.target_config.get(&target);
if let Some(s) = target_config.and_then(|c| c.llvm_config.as_ref()) { if let Some(s) = target_config.and_then(|c| c.llvm_config.as_ref()) {
s.clone() s.clone()
} else { } else {
self.llvm_out(&self.config.build).join("bin") self.llvm_out(self.config.build).join("bin")
.join(exe("llvm-config", target)) .join(exe("llvm-config", &*target))
} }
} }
/// Returns the path to `FileCheck` binary for the specified target /// Returns the path to `FileCheck` binary for the specified target
fn llvm_filecheck(&self, target: &str) -> PathBuf { fn llvm_filecheck(&self, target: Interned<String>) -> PathBuf {
let target_config = self.config.target_config.get(target); let target_config = self.config.target_config.get(&target);
if let Some(s) = target_config.and_then(|c| c.llvm_config.as_ref()) { if let Some(s) = target_config.and_then(|c| c.llvm_config.as_ref()) {
let llvm_bindir = output(Command::new(s).arg("--bindir")); let llvm_bindir = output(Command::new(s).arg("--bindir"));
Path::new(llvm_bindir.trim()).join(exe("FileCheck", target)) Path::new(llvm_bindir.trim()).join(exe("FileCheck", &*target))
} else { } else {
let base = self.llvm_out(&self.config.build).join("build"); let base = self.llvm_out(self.config.build).join("build");
let exe = exe("FileCheck", target); let exe = exe("FileCheck", &*target);
if !self.config.ninja && self.config.build.contains("msvc") { if !self.config.ninja && self.config.build.contains("msvc") {
base.join("Release/bin").join(exe) base.join("Release/bin").join(exe)
} else { } else {
@@ -747,29 +519,16 @@ impl Build {
} }
/// Directory for libraries built from C/C++ code and shared between stages. /// Directory for libraries built from C/C++ code and shared between stages.
fn native_dir(&self, target: &str) -> PathBuf { fn native_dir(&self, target: Interned<String>) -> PathBuf {
self.out.join(target).join("native") self.out.join(&*target).join("native")
} }
/// Root output directory for rust_test_helpers library compiled for /// Root output directory for rust_test_helpers library compiled for
/// `target` /// `target`
fn test_helpers_out(&self, target: &str) -> PathBuf { fn test_helpers_out(&self, target: Interned<String>) -> PathBuf {
self.native_dir(target).join("rust-test-helpers") self.native_dir(target).join("rust-test-helpers")
} }
/// Adds the compiler's directory of dynamic libraries to `cmd`'s dynamic
/// library lookup path.
fn add_rustc_lib_path(&self, compiler: &Compiler, cmd: &mut Command) {
// Windows doesn't need dylib path munging because the dlls for the
// compiler live next to the compiler and the system will find them
// automatically.
if cfg!(windows) {
return
}
add_lib_path(vec![self.rustc_libdir(compiler)], cmd);
}
/// Adds the `RUST_TEST_THREADS` env var if necessary /// Adds the `RUST_TEST_THREADS` env var if necessary
fn add_rust_test_threads(&self, cmd: &mut Command) { fn add_rust_test_threads(&self, cmd: &mut Command) {
if env::var_os("RUST_TEST_THREADS").is_none() { if env::var_os("RUST_TEST_THREADS").is_none() {
@@ -777,19 +536,6 @@ impl Build {
} }
} }
/// Returns the compiler's libdir where it stores the dynamic libraries that
/// it itself links against.
///
/// For example this returns `<sysroot>/lib` on Unix and `<sysroot>/bin` on
/// Windows.
fn rustc_libdir(&self, compiler: &Compiler) -> PathBuf {
if compiler.is_snapshot(self) {
self.rustc_snapshot_libdir()
} else {
self.sysroot(compiler).join(libdir(compiler.host))
}
}
/// Returns the libdir of the snapshot compiler. /// Returns the libdir of the snapshot compiler.
fn rustc_snapshot_libdir(&self) -> PathBuf { fn rustc_snapshot_libdir(&self) -> PathBuf {
self.initial_rustc.parent().unwrap().parent().unwrap() self.initial_rustc.parent().unwrap().parent().unwrap()
@@ -842,20 +588,20 @@ impl Build {
/// Returns the number of parallel jobs that have been configured for this /// Returns the number of parallel jobs that have been configured for this
/// build. /// build.
fn jobs(&self) -> u32 { fn jobs(&self) -> u32 {
self.flags.jobs.unwrap_or_else(|| num_cpus::get() as u32) self.config.jobs.unwrap_or_else(|| num_cpus::get() as u32)
} }
/// Returns the path to the C compiler for the target specified. /// Returns the path to the C compiler for the target specified.
fn cc(&self, target: &str) -> &Path { fn cc(&self, target: Interned<String>) -> &Path {
self.cc[target].0.path() self.cc[&target].0.path()
} }
/// Returns a list of flags to pass to the C compiler for the target /// Returns a list of flags to pass to the C compiler for the target
/// specified. /// specified.
fn cflags(&self, target: &str) -> Vec<String> { fn cflags(&self, target: Interned<String>) -> Vec<String> {
// Filter out -O and /O (the optimization flags) that we picked up from // Filter out -O and /O (the optimization flags) that we picked up from
// gcc-rs because the build scripts will determine that for themselves. // gcc-rs because the build scripts will determine that for themselves.
let mut base = self.cc[target].0.args().iter() let mut base = self.cc[&target].0.args().iter()
.map(|s| s.to_string_lossy().into_owned()) .map(|s| s.to_string_lossy().into_owned())
.filter(|s| !s.starts_with("-O") && !s.starts_with("/O")) .filter(|s| !s.starts_with("-O") && !s.starts_with("/O"))
.collect::<Vec<_>>(); .collect::<Vec<_>>();
@@ -871,20 +617,20 @@ impl Build {
// Work around an apparently bad MinGW / GCC optimization, // Work around an apparently bad MinGW / GCC optimization,
// See: http://lists.llvm.org/pipermail/cfe-dev/2016-December/051980.html // See: http://lists.llvm.org/pipermail/cfe-dev/2016-December/051980.html
// See: https://gcc.gnu.org/bugzilla/show_bug.cgi?id=78936 // See: https://gcc.gnu.org/bugzilla/show_bug.cgi?id=78936
if target == "i686-pc-windows-gnu" { if &*target == "i686-pc-windows-gnu" {
base.push("-fno-omit-frame-pointer".into()); base.push("-fno-omit-frame-pointer".into());
} }
base base
} }
/// Returns the path to the `ar` archive utility for the target specified. /// Returns the path to the `ar` archive utility for the target specified.
fn ar(&self, target: &str) -> Option<&Path> { fn ar(&self, target: Interned<String>) -> Option<&Path> {
self.cc[target].1.as_ref().map(|p| &**p) self.cc[&target].1.as_ref().map(|p| &**p)
} }
/// Returns the path to the C++ compiler for the target specified. /// Returns the path to the C++ compiler for the target specified.
fn cxx(&self, target: &str) -> Result<&Path, String> { fn cxx(&self, target: Interned<String>) -> Result<&Path, String> {
match self.cxx.get(target) { match self.cxx.get(&target) {
Some(p) => Ok(p.path()), Some(p) => Ok(p.path()),
None => Err(format!( None => Err(format!(
"target `{}` is not configured as a host, only as a target", "target `{}` is not configured as a host, only as a target",
@@ -893,7 +639,7 @@ impl Build {
} }
/// Returns flags to pass to the compiler to generate code for `target`. /// Returns flags to pass to the compiler to generate code for `target`.
fn rustc_flags(&self, target: &str) -> Vec<String> { fn rustc_flags(&self, target: Interned<String>) -> Vec<String> {
// New flags should be added here with great caution! // New flags should be added here with great caution!
// //
// It's quite unfortunate to **require** flags to generate code for a // It's quite unfortunate to **require** flags to generate code for a
@@ -909,9 +655,19 @@ impl Build {
base base
} }
/// Returns if this target should statically link the C runtime, if specified
fn crt_static(&self, target: Interned<String>) -> Option<bool> {
if target.contains("pc-windows-msvc") {
Some(true)
} else {
self.config.target_config.get(&target)
.and_then(|t| t.crt_static)
}
}
/// Returns the "musl root" for this `target`, if defined /// Returns the "musl root" for this `target`, if defined
fn musl_root(&self, target: &str) -> Option<&Path> { fn musl_root(&self, target: Interned<String>) -> Option<&Path> {
self.config.target_config.get(target) self.config.target_config.get(&target)
.and_then(|t| t.musl_root.as_ref()) .and_then(|t| t.musl_root.as_ref())
.or(self.config.musl_root.as_ref()) .or(self.config.musl_root.as_ref())
.map(|p| &**p) .map(|p| &**p)
@@ -919,8 +675,9 @@ impl Build {
/// Returns whether the target will be tested using the `remote-test-client` /// Returns whether the target will be tested using the `remote-test-client`
/// and `remote-test-server` binaries. /// and `remote-test-server` binaries.
fn remote_tested(&self, target: &str) -> bool { fn remote_tested(&self, target: Interned<String>) -> bool {
self.qemu_rootfs(target).is_some() || target.contains("android") self.qemu_rootfs(target).is_some() || target.contains("android") ||
env::var_os("TEST_DEVICE_ADDR").is_some()
} }
/// Returns the root of the "rootfs" image that this target will be using, /// Returns the root of the "rootfs" image that this target will be using,
@@ -928,8 +685,8 @@ impl Build {
/// ///
/// If `Some` is returned then that means that tests for this target are /// If `Some` is returned then that means that tests for this target are
/// emulated with QEMU and binaries will need to be shipped to the emulator. /// emulated with QEMU and binaries will need to be shipped to the emulator.
fn qemu_rootfs(&self, target: &str) -> Option<&Path> { fn qemu_rootfs(&self, target: Interned<String>) -> Option<&Path> {
self.config.target_config.get(target) self.config.target_config.get(&target)
.and_then(|t| t.qemu_rootfs.as_ref()) .and_then(|t| t.qemu_rootfs.as_ref())
.map(|p| &**p) .map(|p| &**p)
} }
@@ -957,20 +714,20 @@ impl Build {
/// ///
/// When all of these conditions are met the build will lift artifacts from /// When all of these conditions are met the build will lift artifacts from
/// the previous stage forward. /// the previous stage forward.
fn force_use_stage1(&self, compiler: &Compiler, target: &str) -> bool { fn force_use_stage1(&self, compiler: Compiler, target: Interned<String>) -> bool {
!self.config.full_bootstrap && !self.config.full_bootstrap &&
compiler.stage >= 2 && compiler.stage >= 2 &&
self.config.host.iter().any(|h| h == target) self.hosts.iter().any(|h| *h == target)
} }
/// Returns the directory that OpenSSL artifacts are compiled into if /// Returns the directory that OpenSSL artifacts are compiled into if
/// configured to do so. /// configured to do so.
fn openssl_dir(&self, target: &str) -> Option<PathBuf> { fn openssl_dir(&self, target: Interned<String>) -> Option<PathBuf> {
// OpenSSL not used on Windows // OpenSSL not used on Windows
if target.contains("windows") { if target.contains("windows") {
None None
} else if self.config.openssl_static { } else if self.config.openssl_static {
Some(self.out.join(target).join("openssl")) Some(self.out.join(&*target).join("openssl"))
} else { } else {
None None
} }
@ -978,7 +735,7 @@ impl Build {
/// Returns the directory that OpenSSL artifacts are installed into if /// Returns the directory that OpenSSL artifacts are installed into if
/// configured as such. /// configured as such.
fn openssl_install_dir(&self, target: &str) -> Option<PathBuf> { fn openssl_install_dir(&self, target: Interned<String>) -> Option<PathBuf> {
self.openssl_dir(target).map(|p| p.join("install")) self.openssl_dir(target).map(|p| p.join("install"))
} }
@ -1077,16 +834,38 @@ impl Build {
None None
} }
} }
/// Get a list of crates from a root crate.
///
/// Returns Vec<(crate, path to crate, is_root_crate)>
fn crates(&self, root: &str) -> Vec<(Interned<String>, &Path)> {
let interned = INTERNER.intern_string(root.to_owned());
let mut ret = Vec::new();
let mut list = vec![interned];
let mut visited = HashSet::new();
while let Some(krate) = list.pop() {
let krate = &self.crates[&krate];
// If we can't strip prefix, then out-of-tree path
let path = krate.path.strip_prefix(&self.src).unwrap_or(&krate.path);
ret.push((krate.name, path));
for dep in &krate.deps {
if visited.insert(dep) && dep != "build_helper" {
list.push(*dep);
}
}
}
ret
}
} }
impl<'a> Compiler<'a> { impl<'a> Compiler {
/// Creates a new complier for the specified stage/host pub fn with_stage(mut self, stage: u32) -> Compiler {
fn new(stage: u32, host: &'a str) -> Compiler<'a> { self.stage = stage;
Compiler { stage: stage, host: host } self
} }
/// Returns whether this is a snapshot compiler for `build`'s configuration /// Returns whether this is a snapshot compiler for `build`'s configuration
fn is_snapshot(&self, build: &Build) -> bool { pub fn is_snapshot(&self, build: &Build) -> bool {
self.stage == 0 && self.host == build.build self.stage == 0 && self.host == build.build
} }
@ -1094,7 +873,7 @@ impl<'a> Compiler<'a> {
/// current build session. /// current build session.
/// This takes into account whether we're performing a full bootstrap or /// This takes into account whether we're performing a full bootstrap or
/// not; don't directly compare the stage with `2`! /// not; don't directly compare the stage with `2`!
fn is_final_stage(&self, build: &Build) -> bool { pub fn is_final_stage(&self, build: &Build) -> bool {
let final_stage = if build.config.full_bootstrap { 2 } else { 1 }; let final_stage = if build.config.full_bootstrap { 2 } else { 1 };
self.stage >= final_stage self.stage >= final_stage
} }


@@ -13,17 +13,18 @@ use std::process::Command;
 use std::path::PathBuf;

 use build_helper::output;
-use rustc_serialize::json;
+use serde_json;

 use {Build, Crate};
+use cache::INTERNER;

-#[derive(RustcDecodable)]
+#[derive(Deserialize)]
 struct Output {
     packages: Vec<Package>,
     resolve: Resolve,
 }

-#[derive(RustcDecodable)]
+#[derive(Deserialize)]
 struct Package {
     id: String,
     name: String,
@@ -32,12 +33,12 @@ struct Package {
     manifest_path: String,
 }

-#[derive(RustcDecodable)]
+#[derive(Deserialize)]
 struct Resolve {
     nodes: Vec<ResolveNode>,
 }

-#[derive(RustcDecodable)]
+#[derive(Deserialize)]
 struct ResolveNode {
     id: String,
     dependencies: Vec<String>,
@@ -61,22 +62,23 @@ fn build_krate(build: &mut Build, krate: &str) {
          .arg("--format-version").arg("1")
          .arg("--manifest-path").arg(build.src.join(krate).join("Cargo.toml"));
     let output = output(&mut cargo);
-    let output: Output = json::decode(&output).unwrap();
+    let output: Output = serde_json::from_str(&output).unwrap();
     let mut id2name = HashMap::new();
     for package in output.packages {
         if package.source.is_none() {
-            id2name.insert(package.id, package.name.clone());
+            let name = INTERNER.intern_string(package.name);
+            id2name.insert(package.id, name);
             let mut path = PathBuf::from(package.manifest_path);
             path.pop();
-            build.crates.insert(package.name.clone(), Crate {
-                build_step: format!("build-crate-{}", package.name),
-                doc_step: format!("doc-crate-{}", package.name),
-                test_step: format!("test-crate-{}", package.name),
-                bench_step: format!("bench-crate-{}", package.name),
-                name: package.name,
+            build.crates.insert(name, Crate {
+                build_step: format!("build-crate-{}", name),
+                doc_step: format!("doc-crate-{}", name),
+                test_step: format!("test-crate-{}", name),
+                bench_step: format!("bench-crate-{}", name),
+                name,
                 version: package.version,
                 deps: Vec::new(),
-                path: path,
+                path,
             });
         }
     }
@@ -93,7 +95,7 @@ fn build_krate(build: &mut Build, krate: &str) {
             Some(dep) => dep,
             None => continue,
         };
-        krate.deps.push(dep.clone());
+        krate.deps.push(*dep);
     }
 }


@@ -56,6 +56,7 @@ check-aux:
 	$(Q)$(BOOTSTRAP) test \
 		src/tools/cargotest \
 		src/tools/cargo \
+		src/tools/rls \
 		src/test/pretty \
 		src/test/run-pass/pretty \
 		src/test/run-fail/pretty \
@@ -63,6 +64,8 @@ check-aux:
 		src/test/run-pass-fulldeps/pretty \
 		src/test/run-fail-fulldeps/pretty \
 		$(BOOTSTRAP_ARGS)
+check-bootstrap:
+	$(Q)$(CFG_PYTHON) $(CFG_SRC_DIR)src/bootstrap/bootstrap_test.py
 dist:
 	$(Q)$(BOOTSTRAP) dist $(BOOTSTRAP_ARGS)
 distcheck:


@@ -11,7 +11,7 @@
 //! Compilation of native dependencies like LLVM.
 //!
 //! Native projects like LLVM unfortunately aren't suited just yet for
-//! compilation in build scripts that Cargo has. This is because thie
+//! compilation in build scripts that Cargo has. This is because the
 //! compilation takes a *very* long time but also because we don't want to
 //! compile LLVM 3 times as part of a normal bootstrap (we want it cached).
 //!
@@ -32,12 +32,39 @@ use gcc;
 use Build;
 use util;
 use build_helper::up_to_date;
+use builder::{Builder, RunConfig, ShouldRun, Step};
+use cache::Interned;
+
+#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
+pub struct Llvm {
+    pub target: Interned<String>,
+}
+
+impl Step for Llvm {
+    type Output = ();
+    const ONLY_HOSTS: bool = true;
+
+    fn should_run(run: ShouldRun) -> ShouldRun {
+        run.path("src/llvm")
+    }
+
+    fn make_run(run: RunConfig) {
+        run.builder.ensure(Llvm { target: run.target })
+    }
+
+    /// Compile LLVM for `target`.
+    fn run(self, builder: &Builder) {
+        let build = builder.build;
+        let target = self.target;
+
+        // If we're not compiling for LLVM bail out here.
+        if !build.config.llvm_enabled {
+            return;
+        }
-
-/// Compile LLVM for `target`.
-pub fn llvm(build: &Build, target: &str) {
     // If we're using a custom LLVM bail out here, but we can only use a
     // custom LLVM for the build triple.
-    if let Some(config) = build.config.target_config.get(target) {
+    if let Some(config) = build.config.target_config.get(&target) {
         if let Some(ref s) = config.llvm_config {
             return check_llvm_version(build, s);
         }
@@ -59,9 +86,6 @@ pub fn llvm(build: &Build, target: &str) {
             return
         }
     }
-
-    if build.config.llvm_clean_rebuild {
-        drop(fs::remove_dir_all(&out_dir));
-    }

     let _folder = build.fold_output(|| "llvm");
     println!("Building LLVM for {}", target);
@@ -93,7 +117,7 @@ pub fn llvm(build: &Build, target: &str) {
     let assertions = if build.config.llvm_assertions {"ON"} else {"OFF"};

-    cfg.target(target)
+    cfg.target(&target)
        .host(&build.build)
        .out_dir(&out_dir)
        .profile(profile)
@@ -111,6 +135,15 @@ pub fn llvm(build: &Build, target: &str) {
        .define("LLVM_TARGET_ARCH", target.split('-').next().unwrap())
        .define("LLVM_DEFAULT_TARGET_TRIPLE", target);

+    // This setting makes the LLVM tools link to the dynamic LLVM library,
+    // which saves both memory during parallel links and overall disk space
+    // for the tools. We don't distribute any of those tools, so this is
+    // just a local concern. However, it doesn't work well everywhere.
+    if target.contains("linux-gnu") || target.contains("apple-darwin") {
+        cfg.define("LLVM_LINK_LLVM_DYLIB", "ON");
+    }
+
     if target.contains("msvc") {
         cfg.define("LLVM_USE_CRT_DEBUG", "MT");
         cfg.define("LLVM_USE_CRT_RELEASE", "MT");
@@ -130,12 +163,21 @@ pub fn llvm(build: &Build, target: &str) {
     // http://llvm.org/docs/HowToCrossCompileLLVM.html
     if target != build.build {
+        builder.ensure(Llvm { target: build.build });
+
         // FIXME: if the llvm root for the build triple is overridden then we
         // should use llvm-tblgen from there, also should verify that it
         // actually exists most of the time in normal installs of LLVM.
-        let host = build.llvm_out(&build.build).join("bin/llvm-tblgen");
+        let host = build.llvm_out(build.build).join("bin/llvm-tblgen");
         cfg.define("CMAKE_CROSSCOMPILING", "True")
            .define("LLVM_TABLEGEN", &host);
+
+        if target.contains("netbsd") {
+            cfg.define("CMAKE_SYSTEM_NAME", "NetBSD");
+        } else if target.contains("freebsd") {
+            cfg.define("CMAKE_SYSTEM_NAME", "FreeBSD");
+        }
+
+        cfg.define("LLVM_NATIVE_BUILD", build.llvm_out(build.build).join("build"));
     }

     let sanitize_cc = |cc: &Path| {
@@ -200,6 +242,7 @@ pub fn llvm(build: &Build, target: &str) {
     cfg.build();

     t!(t!(File::create(&done_stamp)).write_all(rebuild_trigger_contents.as_bytes()));
+    }
 }

 fn check_llvm_version(build: &Build, llvm_config: &Path) {
@@ -216,9 +259,27 @@ fn check_llvm_version(build: &Build, llvm_config: &Path) {
         panic!("\n\nbad LLVM version: {}, need >=3.5\n\n", version)
     }
 }

-/// Compiles the `rust_test_helpers.c` library which we used in various
-/// `run-pass` test suites for ABI testing.
-pub fn test_helpers(build: &Build, target: &str) {
+#[derive(Debug, Copy, Clone, PartialEq, Eq, Hash)]
+pub struct TestHelpers {
+    pub target: Interned<String>,
+}
+
+impl Step for TestHelpers {
+    type Output = ();
+
+    fn should_run(run: ShouldRun) -> ShouldRun {
+        run.path("src/rt/rust_test_helpers.c")
+    }
+
+    fn make_run(run: RunConfig) {
+        run.builder.ensure(TestHelpers { target: run.target })
+    }
+
+    /// Compiles the `rust_test_helpers.c` library which we used in various
+    /// `run-pass` test suites for ABI testing.
+    fn run(self, builder: &Builder) {
+        let build = builder.build;
+        let target = self.target;
         let dst = build.test_helpers_out(target);
         let src = build.src.join("src/rt/rust_test_helpers.c");
         if up_to_date(&src, &dst.join("librust_test_helpers.a")) {
@@ -242,18 +303,34 @@ pub fn test_helpers(build: &Build, target: &str) {
     cfg.cargo_metadata(false)
        .out_dir(&dst)
-       .target(target)
+       .target(&target)
        .host(&build.build)
        .opt_level(0)
        .debug(false)
        .file(build.src.join("src/rt/rust_test_helpers.c"))
        .compile("librust_test_helpers.a");
+    }
 }

 const OPENSSL_VERS: &'static str = "1.0.2k";
 const OPENSSL_SHA256: &'static str =
     "6b3977c61f2aedf0f96367dcfb5c6e578cf37e7b8d913b4ecb6643c3cb88d8c0";

-pub fn openssl(build: &Build, target: &str) {
+#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
+pub struct Openssl {
+    pub target: Interned<String>,
+}
+
+impl Step for Openssl {
+    type Output = ();
+
+    fn should_run(run: ShouldRun) -> ShouldRun {
+        run.never()
+    }
+
+    fn run(self, builder: &Builder) {
+        let build = builder.build;
+        let target = self.target;
         let out = match build.openssl_dir(target) {
             Some(dir) => dir,
             None => return,
@@ -272,7 +349,7 @@ pub fn openssl(build: &Build, target: &str) {
         if !tarball.exists() {
             let tmp = tarball.with_extension("tmp");
             // originally from https://www.openssl.org/source/...
-            let url = format!("https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/{}",
+            let url = format!("https://s3-us-west-1.amazonaws.com/rust-lang-ci2/rust-ci-mirror/{}",
                               name);
             let mut ok = false;
             for _ in 0..3 {
@@ -317,7 +394,7 @@ pub fn openssl(build: &Build, target: &str) {
         configure.arg("no-ssl2");
         configure.arg("no-ssl3");

-        let os = match target {
+        let os = match &*target {
             "aarch64-linux-android" => "linux-aarch64",
             "aarch64-unknown-linux-gnu" => "linux-aarch64",
             "arm-linux-androideabi" => "android",
@@ -373,4 +450,5 @@ pub fn openssl(build: &Build, target: &str) {
         let mut f = t!(File::create(&stamp));
         t!(f.write_all(OPENSSL_VERS.as_bytes()));
     }
+}


@@ -85,7 +85,7 @@ pub fn check(build: &mut Build) {
     }

     // We need cmake, but only if we're actually building LLVM or sanitizers.
-    let building_llvm = build.config.host.iter()
+    let building_llvm = build.hosts.iter()
         .filter_map(|host| build.config.target_config.get(host))
         .any(|config| config.llvm_config.is_none());
     if building_llvm || build.config.sanitizers {
@@ -93,11 +93,28 @@ pub fn check(build: &mut Build) {
     }

     // Ninja is currently only used for LLVM itself.
+    if building_llvm {
+        if build.config.ninja {
             // Some Linux distros rename `ninja` to `ninja-build`.
             // CMake can work with either binary name.
-    if building_llvm && build.config.ninja && cmd_finder.maybe_have("ninja-build").is_none() {
+            if cmd_finder.maybe_have("ninja-build").is_none() {
                 cmd_finder.must_have("ninja");
             }
+        }
+
+        // If ninja isn't enabled but we're building for MSVC then we try
+        // doubly hard to enable it. It was realized in #43767 that the msbuild
+        // CMake generator for MSVC doesn't respect configuration options like
+        // disabling LLVM assertions, which can often be quite important!
+        //
+        // In these cases we automatically enable Ninja if we find it in the
+        // environment.
+        if !build.config.ninja && build.config.build.contains("msvc") {
+            if cmd_finder.maybe_have("ninja").is_some() {
+                build.config.ninja = true;
+            }
+        }
+    }

     build.config.python = build.config.python.take().map(|p| cmd_finder.must_have(p))
         .or_else(|| env::var_os("BOOTSTRAP_PYTHON").map(PathBuf::from)) // set by bootstrap.py
@@ -114,7 +131,7 @@ pub fn check(build: &mut Build) {
     // We're gonna build some custom C code here and there, host triples
     // also build some C++ shims for LLVM so we need a C++ compiler.
-    for target in &build.config.target {
+    for target in &build.targets {
         // On emscripten we don't actually need the C compiler to just
         // build the target artifacts, only for testing. For the sake
         // of easier bot configuration, just skip detection.
@@ -122,14 +139,14 @@ pub fn check(build: &mut Build) {
             continue;
         }

-        cmd_finder.must_have(build.cc(target));
-        if let Some(ar) = build.ar(target) {
+        cmd_finder.must_have(build.cc(*target));
+        if let Some(ar) = build.ar(*target) {
             cmd_finder.must_have(ar);
         }
     }

-    for host in build.config.host.iter() {
-        cmd_finder.must_have(build.cxx(host).unwrap());
+    for host in &build.hosts {
+        cmd_finder.must_have(build.cxx(*host).unwrap());

         // The msvc hosts don't use jemalloc, turn it off globally to
         // avoid packaging the dummy liballoc_jemalloc on that platform.
@@ -139,21 +156,28 @@ pub fn check(build: &mut Build) {
     }

     // Externally configured LLVM requires FileCheck to exist
-    let filecheck = build.llvm_filecheck(&build.build);
+    let filecheck = build.llvm_filecheck(build.build);
     if !filecheck.starts_with(&build.out) && !filecheck.exists() && build.config.codegen_tests {
         panic!("FileCheck executable {:?} does not exist", filecheck);
     }

-    for target in &build.config.target {
+    for target in &build.targets {
         // Can't compile for iOS unless we're on macOS
         if target.contains("apple-ios") &&
            !build.build.contains("apple-darwin") {
             panic!("the iOS target is only supported on macOS");
         }

-        // Make sure musl-root is valid if specified
+        // Make sure musl-root is valid
         if target.contains("musl") && !target.contains("mips") {
-            match build.musl_root(target) {
+            // If this is a native target (host is also musl) and no musl-root is given,
+            // fall back to the system toolchain in /usr before giving up
+            if build.musl_root(*target).is_none() && build.config.build == *target {
+                let target = build.config.target_config.entry(target.clone())
+                    .or_insert(Default::default());
+                target.musl_root = Some("/usr".into());
+            }
+            match build.musl_root(*target) {
                 Some(root) => {
                     if fs::metadata(root.join("lib/libc.a")).is_err() {
                         panic!("couldn't find libc.a in musl dir: {}",

(File diff suppressed because it is too large.)

src/bootstrap/tool.rs (new file, 425 lines)

@ -0,0 +1,425 @@
// Copyright 2017 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
use std::fs;
use std::env;
use std::path::PathBuf;
use std::process::Command;
use Mode;
use Compiler;
use builder::{Step, RunConfig, ShouldRun, Builder};
use util::{copy, exe, add_lib_path};
use compile::{self, libtest_stamp, libstd_stamp, librustc_stamp};
use native;
use channel::GitInfo;
use cache::Interned;
#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
pub struct CleanTools {
pub compiler: Compiler,
pub target: Interned<String>,
pub mode: Mode,
}
impl Step for CleanTools {
type Output = ();
fn should_run(run: ShouldRun) -> ShouldRun {
run.never()
}
/// Clean up stale tool build output.
///
/// This clears the cargo output directory used for tools whenever the given
/// `mode`'s library stamp is newer, so tools are rebuilt against fresh libraries.
fn run(self, builder: &Builder) {
let build = builder.build;
let compiler = self.compiler;
let target = self.target;
let mode = self.mode;
let stamp = match mode {
Mode::Libstd => libstd_stamp(build, compiler, target),
Mode::Libtest => libtest_stamp(build, compiler, target),
Mode::Librustc => librustc_stamp(build, compiler, target),
_ => panic!(),
};
let out_dir = build.cargo_out(compiler, Mode::Tool, target);
build.clear_if_dirty(&out_dir, &stamp);
}
}
#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
struct ToolBuild {
compiler: Compiler,
target: Interned<String>,
tool: &'static str,
mode: Mode,
}
impl Step for ToolBuild {
type Output = PathBuf;
fn should_run(run: ShouldRun) -> ShouldRun {
run.never()
}
/// Build a tool in `src/tools`
///
/// This will build the specified tool with the specified `host` compiler in
/// `stage` into the normal cargo output directory.
fn run(self, builder: &Builder) -> PathBuf {
let build = builder.build;
let compiler = self.compiler;
let target = self.target;
let tool = self.tool;
match self.mode {
Mode::Libstd => builder.ensure(compile::Std { compiler, target }),
Mode::Libtest => builder.ensure(compile::Test { compiler, target }),
Mode::Librustc => builder.ensure(compile::Rustc { compiler, target }),
Mode::Tool => panic!("unexpected Mode::Tool for tool build")
}
let _folder = build.fold_output(|| format!("stage{}-{}", compiler.stage, tool));
println!("Building stage{} tool {} ({})", compiler.stage, tool, target);
let mut cargo = prepare_tool_cargo(builder, compiler, target, tool);
build.run(&mut cargo);
build.cargo_out(compiler, Mode::Tool, target).join(exe(tool, &compiler.host))
}
}
fn prepare_tool_cargo(
builder: &Builder,
compiler: Compiler,
target: Interned<String>,
tool: &'static str,
) -> Command {
let build = builder.build;
let mut cargo = builder.cargo(compiler, Mode::Tool, target, "build");
let dir = build.src.join("src/tools").join(tool);
cargo.arg("--manifest-path").arg(dir.join("Cargo.toml"));
// We don't want to build tools dynamically as they'll be running across
// stages and such and it's just easier if they're not dynamically linked.
cargo.env("RUSTC_NO_PREFER_DYNAMIC", "1");
if let Some(dir) = build.openssl_install_dir(target) {
cargo.env("OPENSSL_STATIC", "1");
cargo.env("OPENSSL_DIR", dir);
cargo.env("LIBZ_SYS_STATIC", "1");
}
cargo.env("CFG_RELEASE_CHANNEL", &build.config.channel);
let info = GitInfo::new(&build.config, &dir);
if let Some(sha) = info.sha() {
cargo.env("CFG_COMMIT_HASH", sha);
}
if let Some(sha_short) = info.sha_short() {
cargo.env("CFG_SHORT_COMMIT_HASH", sha_short);
}
if let Some(date) = info.commit_date() {
cargo.env("CFG_COMMIT_DATE", date);
}
cargo
}
macro_rules! tool {
($($name:ident, $path:expr, $tool_name:expr, $mode:expr;)+) => {
#[derive(Copy, Clone)]
pub enum Tool {
$(
$name,
)+
}
impl<'a> Builder<'a> {
pub fn tool_exe(&self, tool: Tool) -> PathBuf {
match tool {
$(Tool::$name =>
self.ensure($name {
compiler: self.compiler(0, self.build.build),
target: self.build.build,
}),
)+
}
}
}
$(
#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
pub struct $name {
pub compiler: Compiler,
pub target: Interned<String>,
}
impl Step for $name {
type Output = PathBuf;
fn should_run(run: ShouldRun) -> ShouldRun {
run.path($path)
}
fn make_run(run: RunConfig) {
run.builder.ensure($name {
compiler: run.builder.compiler(run.builder.top_stage, run.builder.build.build),
target: run.target,
});
}
fn run(self, builder: &Builder) -> PathBuf {
builder.ensure(ToolBuild {
compiler: self.compiler,
target: self.target,
tool: $tool_name,
mode: $mode,
})
}
}
)+
}
}
tool!(
Rustbook, "src/tools/rustbook", "rustbook", Mode::Librustc;
ErrorIndex, "src/tools/error_index_generator", "error_index_generator", Mode::Librustc;
UnstableBookGen, "src/tools/unstable-book-gen", "unstable-book-gen", Mode::Libstd;
Tidy, "src/tools/tidy", "tidy", Mode::Libstd;
Linkchecker, "src/tools/linkchecker", "linkchecker", Mode::Libstd;
CargoTest, "src/tools/cargotest", "cargotest", Mode::Libstd;
Compiletest, "src/tools/compiletest", "compiletest", Mode::Libtest;
BuildManifest, "src/tools/build-manifest", "build-manifest", Mode::Libstd;
RemoteTestClient, "src/tools/remote-test-client", "remote-test-client", Mode::Libstd;
RustInstaller, "src/tools/rust-installer", "rust-installer", Mode::Libstd;
);
#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
pub struct RemoteTestServer {
pub compiler: Compiler,
pub target: Interned<String>,
}
impl Step for RemoteTestServer {
type Output = PathBuf;
fn should_run(run: ShouldRun) -> ShouldRun {
run.path("src/tools/remote-test-server")
}
fn make_run(run: RunConfig) {
run.builder.ensure(RemoteTestServer {
compiler: run.builder.compiler(run.builder.top_stage, run.builder.build.build),
target: run.target,
});
}
fn run(self, builder: &Builder) -> PathBuf {
builder.ensure(ToolBuild {
compiler: self.compiler,
target: self.target,
tool: "remote-test-server",
mode: Mode::Libstd,
})
}
}
#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
pub struct Rustdoc {
pub host: Interned<String>,
}
impl Step for Rustdoc {
type Output = PathBuf;
const DEFAULT: bool = true;
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
run.path("src/tools/rustdoc")
}
fn make_run(run: RunConfig) {
run.builder.ensure(Rustdoc {
host: run.host,
});
}
fn run(self, builder: &Builder) -> PathBuf {
let build = builder.build;
let target_compiler = builder.compiler(builder.top_stage, self.host);
let target = target_compiler.host;
let build_compiler = if target_compiler.stage == 0 {
builder.compiler(0, builder.build.build)
} else if target_compiler.stage >= 2 {
// Past stage 2, we consider the compiler to be ABI-compatible and hence capable of
// building rustdoc itself.
builder.compiler(target_compiler.stage, builder.build.build)
} else {
// Similar to `compile::Assemble`, build with the previous stage's compiler. Otherwise
// we'd have stageN/bin/rustc and stageN/bin/rustdoc be effectively different stage
// compilers, which isn't what we want.
builder.compiler(target_compiler.stage - 1, builder.build.build)
};
builder.ensure(compile::Rustc { compiler: build_compiler, target });
let _folder = build.fold_output(|| format!("stage{}-rustdoc", target_compiler.stage));
println!("Building rustdoc for stage{} ({})", target_compiler.stage, target_compiler.host);
let mut cargo = prepare_tool_cargo(builder, build_compiler, target, "rustdoc");
build.run(&mut cargo);
// Cargo adds a number of paths to the dylib search path on windows, which results in
// the wrong rustdoc being executed. To avoid the conflicting rustdocs, we name the "tool"
// rustdoc a different name.
let tool_rustdoc = build.cargo_out(build_compiler, Mode::Tool, target)
.join(exe("rustdoc-tool-binary", &target_compiler.host));
// don't create a stage0-sysroot/bin directory.
if target_compiler.stage > 0 {
let sysroot = builder.sysroot(target_compiler);
let bindir = sysroot.join("bin");
t!(fs::create_dir_all(&bindir));
let bin_rustdoc = bindir.join(exe("rustdoc", &*target_compiler.host));
let _ = fs::remove_file(&bin_rustdoc);
copy(&tool_rustdoc, &bin_rustdoc);
bin_rustdoc
} else {
tool_rustdoc
}
}
}
#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
pub struct Cargo {
pub compiler: Compiler,
pub target: Interned<String>,
}
impl Step for Cargo {
type Output = PathBuf;
const DEFAULT: bool = true;
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
let builder = run.builder;
run.path("src/tools/cargo").default_condition(builder.build.config.extended)
}
fn make_run(run: RunConfig) {
run.builder.ensure(Cargo {
compiler: run.builder.compiler(run.builder.top_stage, run.builder.build.build),
target: run.target,
});
}
fn run(self, builder: &Builder) -> PathBuf {
builder.ensure(native::Openssl {
target: self.target,
});
// Cargo depends on procedural macros, which requires a full host
// compiler to be available, so we need to depend on that.
builder.ensure(compile::Rustc {
compiler: self.compiler,
target: builder.build.build,
});
builder.ensure(ToolBuild {
compiler: self.compiler,
target: self.target,
tool: "cargo",
mode: Mode::Librustc,
})
}
}
#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
pub struct Rls {
pub compiler: Compiler,
pub target: Interned<String>,
}
impl Step for Rls {
type Output = PathBuf;
const DEFAULT: bool = true;
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
let builder = run.builder;
run.path("src/tools/rls").default_condition(builder.build.config.extended)
}
fn make_run(run: RunConfig) {
run.builder.ensure(Rls {
compiler: run.builder.compiler(run.builder.top_stage, run.builder.build.build),
target: run.target,
});
}
fn run(self, builder: &Builder) -> PathBuf {
builder.ensure(native::Openssl {
target: self.target,
});
// RLS depends on procedural macros, which requires a full host
// compiler to be available, so we need to depend on that.
builder.ensure(compile::Rustc {
compiler: self.compiler,
target: builder.build.build,
});
builder.ensure(ToolBuild {
compiler: self.compiler,
target: self.target,
tool: "rls",
mode: Mode::Librustc,
})
}
}
impl<'a> Builder<'a> {
/// Get a `Command` which is ready to run `tool` in `stage` built for
/// `host`.
pub fn tool_cmd(&self, tool: Tool) -> Command {
let mut cmd = Command::new(self.tool_exe(tool));
let compiler = self.compiler(0, self.build.build);
self.prepare_tool_cmd(compiler, &mut cmd);
cmd
}
/// Prepares the `cmd` provided to be able to run the `compiler` provided.
///
/// Notably this munges the dynamic library lookup path to point to the
/// right location to run `compiler`.
fn prepare_tool_cmd(&self, compiler: Compiler, cmd: &mut Command) {
let host = &compiler.host;
let mut paths: Vec<PathBuf> = vec![
PathBuf::from(&self.sysroot_libdir(compiler, compiler.host)),
self.cargo_out(compiler, Mode::Tool, *host).join("deps"),
];
// On MSVC a tool may invoke a C compiler (e.g. compiletest in run-make
// mode) and that C compiler may need some extra PATH modification. Do
// so here.
if compiler.host.contains("msvc") {
let curpaths = env::var_os("PATH").unwrap_or_default();
let curpaths = env::split_paths(&curpaths).collect::<Vec<_>>();
for &(ref k, ref v) in self.cc[&compiler.host].0.env() {
if k != "PATH" {
continue
}
for path in env::split_paths(v) {
if !curpaths.contains(&path) {
paths.push(path);
}
}
}
}
add_lib_path(paths, cmd);
}
}
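The MSVC branch of `prepare_tool_cmd` above merges the C compiler's `PATH` entries into the tool's lookup paths while skipping duplicates. A standalone sketch of that dedup logic, outside rustbuild (the `merge_paths` name and the sample paths are illustrative, and the `:`-separated strings assume a Unix host; on Windows `env::split_paths` splits on `;`):

```rust
use std::env;
use std::ffi::OsString;
use std::path::PathBuf;

// Append entries from `extra` that are not already present in `existing`,
// mirroring how prepare_tool_cmd folds in the C compiler's PATH additions.
fn merge_paths(existing: &OsString, extra: &OsString) -> Vec<PathBuf> {
    let current: Vec<PathBuf> = env::split_paths(existing).collect();
    let mut merged = current.clone();
    for path in env::split_paths(extra) {
        if !current.contains(&path) {
            merged.push(path);
        }
    }
    merged
}

fn main() {
    let existing = OsString::from("/usr/bin:/bin");
    let extra = OsString::from("/opt/cc/bin:/usr/bin");
    // /usr/bin is already present, so only /opt/cc/bin is appended.
    let merged = merge_paths(&existing, &extra);
    assert_eq!(merged.last(), Some(&PathBuf::from("/opt/cc/bin")));
    println!("{:?}", merged);
}
```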


@@ -13,7 +13,6 @@
extern crate filetime;
use std::fs::File;
-use std::io;
use std::path::{Path, PathBuf};
use std::process::{Command, Stdio};
use std::{fs, env};
@@ -211,7 +210,7 @@ pub fn native_lib_boilerplate(src_name: &str,
let out_dir = env::var_os("RUSTBUILD_NATIVE_DIR").unwrap_or(env::var_os("OUT_DIR").unwrap());
let out_dir = PathBuf::from(out_dir).join(out_name);
-t!(create_dir_racy(&out_dir));
+t!(fs::create_dir_all(&out_dir));
if link_name.contains('=') {
println!("cargo:rustc-link-lib={}", link_name);
} else {
@@ -260,21 +259,3 @@ fn fail(s: &str) -> ! {
println!("\n\n{}\n\n", s);
std::process::exit(1);
}
-fn create_dir_racy(path: &Path) -> io::Result<()> {
-    match fs::create_dir(path) {
-        Ok(()) => return Ok(()),
-        Err(ref e) if e.kind() == io::ErrorKind::AlreadyExists => return Ok(()),
-        Err(ref e) if e.kind() == io::ErrorKind::NotFound => {}
-        Err(e) => return Err(e),
-    }
-    match path.parent() {
-        Some(p) => try!(create_dir_racy(p)),
-        None => return Err(io::Error::new(io::ErrorKind::Other, "failed to create whole tree")),
-    }
-    match fs::create_dir(path) {
-        Ok(()) => Ok(()),
-        Err(ref e) if e.kind() == io::ErrorKind::AlreadyExists => Ok(()),
-        Err(e) => Err(e),
-    }
-}
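The hand-rolled `create_dir_racy` above can be dropped because `std::fs::create_dir_all` covers the same ground: it creates any missing parent directories and succeeds if the directory already exists. A minimal illustration (the `rustbuild-demo` scratch path is made up for the example):

```rust
use std::env;
use std::fs;

fn main() {
    let dir = env::temp_dir().join("rustbuild-demo").join("nested").join("out");
    // One call creates the whole tree, parents included.
    fs::create_dir_all(&dir).unwrap();
    assert!(dir.is_dir());
    // Calling it again on an existing tree is not an error.
    fs::create_dir_all(&dir).unwrap();
    // Clean up the scratch directory.
    fs::remove_dir_all(env::temp_dir().join("rustbuild-demo")).unwrap();
}
```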


@@ -3,9 +3,6 @@ FROM ubuntu:16.04
COPY scripts/android-base-apt-get.sh /scripts/
RUN sh /scripts/android-base-apt-get.sh
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
COPY scripts/android-ndk.sh /scripts/
RUN . /scripts/android-ndk.sh && \
download_and_make_toolchain android-ndk-r13b-linux-x86_64.zip arm 9
@@ -38,4 +35,4 @@ COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh
COPY scripts/android-start-emulator.sh /scripts/
-ENTRYPOINT ["/usr/bin/dumb-init", "--", "/scripts/android-start-emulator.sh"]
+ENTRYPOINT ["/scripts/android-start-emulator.sh"]


@@ -63,24 +63,19 @@ RUN curl http://cdimage.ubuntu.com/ubuntu-base/releases/16.04/release/ubuntu-bas
# Copy over our init script, which starts up our test server and also a few
# other misc tasks.
-COPY armhf-gnu/rcS rootfs/etc/init.d/rcS
+COPY scripts/qemu-bare-bones-rcS rootfs/etc/init.d/rcS
RUN chmod +x rootfs/etc/init.d/rcS
# Helper to quickly fill the entropy pool in the kernel.
-COPY armhf-gnu/addentropy.c /tmp/
+COPY scripts/qemu-bare-bones-addentropy.c /tmp/addentropy.c
RUN arm-linux-gnueabihf-gcc addentropy.c -o rootfs/addentropy -static
# TODO: What is this?!
RUN curl -O http://ftp.nl.debian.org/debian/dists/jessie/main/installer-armhf/current/images/device-tree/vexpress-v2p-ca15-tc1.dtb
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]
ENV RUST_CONFIGURE_ARGS \
--target=arm-unknown-linux-gnueabihf \
--qemu-armhf-rootfs=/tmp/rootfs


@@ -13,9 +13,6 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
gdb \
xz-utils
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
COPY scripts/emscripten.sh /scripts/
RUN bash /scripts/emscripten.sh
@@ -35,5 +32,3 @@ ENV SCRIPT python2.7 ../x.py test --target $TARGETS
COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]


@@ -21,9 +21,6 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
libssl-dev \
pkg-config
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
WORKDIR /tmp
COPY cross/build-rumprun.sh /tmp/
@@ -38,6 +35,9 @@ RUN ./install-mips-musl.sh
COPY cross/install-mipsel-musl.sh /tmp/
RUN ./install-mipsel-musl.sh
+COPY cross/install-x86_64-redox.sh /tmp/
+RUN ./install-x86_64-redox.sh
ENV TARGETS=asmjs-unknown-emscripten
ENV TARGETS=$TARGETS,wasm32-unknown-emscripten
ENV TARGETS=$TARGETS,x86_64-rumprun-netbsd
@@ -47,10 +47,12 @@ ENV TARGETS=$TARGETS,arm-unknown-linux-musleabi
ENV TARGETS=$TARGETS,arm-unknown-linux-musleabihf
ENV TARGETS=$TARGETS,armv7-unknown-linux-musleabihf
ENV TARGETS=$TARGETS,sparc64-unknown-linux-gnu
+ENV TARGETS=$TARGETS,x86_64-unknown-redox
ENV CC_mipsel_unknown_linux_musl=mipsel-openwrt-linux-gcc \
CC_mips_unknown_linux_musl=mips-openwrt-linux-gcc \
-CC_sparc64_unknown_linux_gnu=sparc64-linux-gnu-gcc
+CC_sparc64_unknown_linux_gnu=sparc64-linux-gnu-gcc \
+CC_x86_64_unknown_redox=x86_64-unknown-redox-gcc
# Suppress some warnings in the openwrt toolchains we downloaded
ENV STAGING_DIR=/tmp
@@ -66,5 +68,3 @@ ENV SCRIPT python2.7 ../x.py dist --target $TARGETS
# sccache
COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]


@@ -15,7 +15,7 @@ mkdir /usr/local/mips-linux-musl
# originally from
# https://downloads.openwrt.org/snapshots/trunk/ar71xx/generic/
# OpenWrt-Toolchain-ar71xx-generic_gcc-5.3.0_musl-1.1.16.Linux-x86_64.tar.bz2
-URL="https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror"
+URL="https://s3-us-west-1.amazonaws.com/rust-lang-ci2/rust-ci-mirror"
FILE="OpenWrt-Toolchain-ar71xx-generic_gcc-5.3.0_musl-1.1.16.Linux-x86_64.tar.bz2"
curl -L "$URL/$FILE" | tar xjf - -C /usr/local/mips-linux-musl --strip-components=2


@@ -15,7 +15,7 @@ mkdir /usr/local/mipsel-linux-musl
# Note that this originally came from:
# https://downloads.openwrt.org/snapshots/trunk/malta/generic/
# OpenWrt-Toolchain-malta-le_gcc-5.3.0_musl-1.1.15.Linux-x86_64.tar.bz2
-URL="https://s3.amazonaws.com/rust-lang-ci/libc"
+URL="https://s3-us-west-1.amazonaws.com/rust-lang-ci2/libc"
FILE="OpenWrt-Toolchain-malta-le_gcc-5.3.0_musl-1.1.15.Linux-x86_64.tar.bz2"
curl -L "$URL/$FILE" | tar xjf - -C /usr/local/mipsel-linux-musl --strip-components=2


@@ -0,0 +1,23 @@
#!/bin/bash
# Copyright 2017 The Rust Project Developers. See the COPYRIGHT
# file at the top-level directory of this distribution and at
# http://rust-lang.org/COPYRIGHT.
#
# Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
# http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
# <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
# option. This file may not be copied, modified, or distributed
# except according to those terms.
# ignore-tidy-linelength
set -ex
apt-get update
apt-get install -y --no-install-recommends software-properties-common apt-transport-https
apt-key adv --batch --yes --keyserver keyserver.ubuntu.com --recv-keys AA12E97F0881517F
add-apt-repository -y 'deb https://static.redox-os.org/toolchain/apt /'
apt-get update
apt-get install -y x86-64-unknown-redox-gcc


@@ -0,0 +1,80 @@
FROM ubuntu:16.04
RUN apt-get update -y && apt-get install -y --no-install-recommends \
bc \
bzip2 \
ca-certificates \
cmake \
cpio \
curl \
file \
g++ \
gcc-aarch64-linux-gnu \
git \
libc6-dev \
libc6-dev-arm64-cross \
make \
python2.7 \
qemu-system-aarch64 \
xz-utils
ENV ARCH=arm64 \
CROSS_COMPILE=aarch64-linux-gnu-
WORKDIR /build
# Compile the kernel that we're going to run and be emulating with. This is
# basically just done to be compatible with the QEMU target that we're going
# to be using when running tests. If any other kernel works or if any
# other QEMU target works with some other stock kernel, we can use that too!
#
# The `config` config file was a previously generated config file for
# the kernel. This file was generated by running `make defconfig`
# followed by `make menuconfig` and then enabling the IPv6 protocol page.
COPY disabled/aarch64-gnu/config /build/.config
RUN curl https://cdn.kernel.org/pub/linux/kernel/v4.x/linux-4.4.42.tar.xz | \
tar xJf - && \
cd /build/linux-4.4.42 && \
cp /build/.config . && \
make -j$(nproc) all && \
cp arch/arm64/boot/Image /tmp && \
cd /build && \
rm -rf linux-4.4.42
# Compile an instance of busybox as this provides a lightweight system and init
# binary which we will boot into. Only trick here is configuring busybox to
# build static binaries.
RUN curl https://www.busybox.net/downloads/busybox-1.21.1.tar.bz2 | tar xjf - && \
cd busybox-1.21.1 && \
make defconfig && \
sed -i 's/.*CONFIG_STATIC.*/CONFIG_STATIC=y/' .config && \
make -j$(nproc) && \
make install && \
mv _install /tmp/rootfs && \
cd /build && \
rm -rf busybox-1.12.1
# Download the ubuntu rootfs, which we'll use as a chroot for all our tests.
WORKDIR /tmp
RUN mkdir rootfs/ubuntu
RUN curl http://cdimage.ubuntu.com/ubuntu-base/releases/16.04/release/ubuntu-base-16.04-core-arm64.tar.gz | \
tar xzf - -C rootfs/ubuntu && \
cd rootfs && mkdir proc sys dev etc etc/init.d
# Copy over our init script, which starts up our test server and also a few
# other misc tasks.
COPY scripts/qemu-bare-bones-rcS rootfs/etc/init.d/rcS
RUN chmod +x rootfs/etc/init.d/rcS
# Helper to quickly fill the entropy pool in the kernel.
COPY scripts/qemu-bare-bones-addentropy.c /tmp/addentropy.c
RUN aarch64-linux-gnu-gcc addentropy.c -o rootfs/addentropy -static
COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh
ENV RUST_CONFIGURE_ARGS \
--target=aarch64-unknown-linux-gnu \
--qemu-aarch64-rootfs=/tmp/rootfs
ENV SCRIPT python2.7 ../x.py test --target aarch64-unknown-linux-gnu
ENV NO_CHANGE_USER=1

File diff suppressed because it is too large


@@ -3,9 +3,6 @@ FROM ubuntu:16.04
COPY scripts/android-base-apt-get.sh /scripts/
RUN sh /scripts/android-base-apt-get.sh
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
COPY scripts/android-ndk.sh /scripts/
RUN . /scripts/android-ndk.sh && \
download_and_make_toolchain android-ndk-r13b-linux-x86_64.zip arm64 21
@@ -28,5 +25,3 @@ ENV SCRIPT python2.7 ../x.py dist --target $HOSTS --host $HOSTS
COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]


@@ -3,9 +3,6 @@ FROM ubuntu:16.04
COPY scripts/android-base-apt-get.sh /scripts/
RUN sh /scripts/android-base-apt-get.sh
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
COPY scripts/android-ndk.sh /scripts/
RUN . /scripts/android-ndk.sh && \
download_ndk android-ndk-r13b-linux-x86_64.zip && \
@@ -46,5 +43,3 @@ ENV SCRIPT \
COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]


@@ -3,9 +3,6 @@ FROM ubuntu:16.04
COPY scripts/android-base-apt-get.sh /scripts/
RUN sh /scripts/android-base-apt-get.sh
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
COPY scripts/android-ndk.sh /scripts/
RUN . /scripts/android-ndk.sh && \
download_ndk android-ndk-r13b-linux-x86_64.zip && \
@@ -46,5 +43,3 @@ ENV SCRIPT \
COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]


@@ -3,9 +3,6 @@ FROM ubuntu:16.04
COPY scripts/android-base-apt-get.sh /scripts/
RUN sh /scripts/android-base-apt-get.sh
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
COPY scripts/android-ndk.sh /scripts/
RUN . /scripts/android-ndk.sh && \
download_and_make_toolchain android-ndk-r13b-linux-x86_64.zip x86_64 21
@@ -28,5 +25,3 @@ ENV SCRIPT python2.7 ../x.py dist --target $HOSTS --host $HOSTS
COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]


@@ -0,0 +1,22 @@
FROM ubuntu:16.04
COPY scripts/cross-apt-packages.sh /scripts/
RUN sh /scripts/cross-apt-packages.sh
COPY scripts/crosstool-ng.sh /scripts/
RUN sh /scripts/crosstool-ng.sh
WORKDIR /tmp
COPY cross/install-x86_64-redox.sh /tmp/
RUN ./install-x86_64-redox.sh
COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh
ENV \
AR_x86_64_unknown_redox=x86_64-unknown-redox-ar \
CC_x86_64_unknown_redox=x86_64-unknown-redox-gcc \
CXX_x86_64_unknown_redox=x86_64-unknown-redox-g++
ENV RUST_CONFIGURE_ARGS --target=x86_64-unknown-redox --enable-extended
ENV SCRIPT python2.7 ../x.py dist --target x86_64-unknown-redox


@@ -15,10 +15,6 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
jq \
bzip2
-# dumb-init
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
# emscripten
COPY scripts/emscripten-wasm.sh /scripts/
COPY disabled/wasm32-exp/node.sh /usr/local/bin/node
@@ -37,6 +33,3 @@ ENV TARGETS=wasm32-experimental-emscripten
ENV RUST_CONFIGURE_ARGS --target=$TARGETS --experimental-targets=WebAssembly
ENV SCRIPT python2.7 ../x.py test --target $TARGETS
-# init
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]


@@ -13,13 +13,9 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
gdb \
xz-utils
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
# emscripten
COPY scripts/emscripten.sh /scripts/
RUN bash /scripts/emscripten.sh
-COPY disabled/wasm32/node.sh /usr/local/bin/node
COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh
@@ -27,6 +23,7 @@ RUN sh /scripts/sccache.sh
ENV PATH=$PATH:/emsdk-portable
ENV PATH=$PATH:/emsdk-portable/clang/e1.37.13_64bit/
ENV PATH=$PATH:/emsdk-portable/emscripten/1.37.13/
+ENV PATH=$PATH:/node-v8.0.0-linux-x64/bin/
ENV EMSCRIPTEN=/emsdk-portable/emscripten/1.37.13/
ENV BINARYEN_ROOT=/emsdk-portable/clang/e1.37.13_64bit/binaryen/
ENV EM_CONFIG=/emsdk-portable/.emscripten
@@ -36,5 +33,3 @@ ENV TARGETS=wasm32-unknown-emscripten
ENV RUST_CONFIGURE_ARGS --target=$TARGETS
ENV SCRIPT python2.7 ../x.py test --target $TARGETS
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]


@@ -3,11 +3,6 @@ FROM ubuntu:16.04
COPY scripts/cross-apt-packages.sh /scripts/
RUN sh /scripts/cross-apt-packages.sh
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]
# Ubuntu 16.04 (this container) ships with make 4, but something in the
# toolchains we build below chokes on that, so go back to make 3
COPY scripts/make3.sh /scripts/


@@ -3,9 +3,6 @@ FROM ubuntu:16.04
COPY scripts/android-base-apt-get.sh /scripts/
RUN sh /scripts/android-base-apt-get.sh
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
# ndk
COPY scripts/android-ndk.sh /scripts/
RUN . /scripts/android-ndk.sh && \
@@ -36,5 +33,3 @@ ENV SCRIPT python2.7 ../x.py dist --target $TARGETS
COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]


@@ -3,11 +3,6 @@ FROM ubuntu:16.04
COPY scripts/cross-apt-packages.sh /scripts/
RUN sh /scripts/cross-apt-packages.sh
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]
# Ubuntu 16.04 (this container) ships with make 4, but something in the
# toolchains we build below chokes on that, so go back to make 3
COPY scripts/make3.sh /scripts/


@@ -3,11 +3,6 @@ FROM ubuntu:16.04
COPY scripts/cross-apt-packages.sh /scripts/
RUN sh /scripts/cross-apt-packages.sh
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]
# Ubuntu 16.04 (this container) ships with make 4, but something in the
# toolchains we build below chokes on that, so go back to make 3
COPY scripts/make3.sh /scripts/


@@ -3,11 +3,6 @@ FROM ubuntu:16.04
COPY scripts/cross-apt-packages.sh /scripts/
RUN sh /scripts/cross-apt-packages.sh
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]
# Ubuntu 16.04 (this container) ships with make 4, but something in the
# toolchains we build below chokes on that, so go back to make 3
COPY scripts/make3.sh /scripts/


@@ -24,11 +24,6 @@ WORKDIR /tmp
COPY dist-fuchsia/shared.sh dist-fuchsia/build-toolchain.sh dist-fuchsia/compiler-rt-dso-handle.patch /tmp/
RUN /tmp/build-toolchain.sh
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]
COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh


@@ -20,11 +20,6 @@ WORKDIR /build/
COPY dist-i586-gnu-i686-musl/musl-libunwind-patch.patch dist-i586-gnu-i686-musl/build-musl.sh /build/
RUN sh /build/build-musl.sh && rm -rf /build
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]
COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh


@@ -19,11 +19,6 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
COPY dist-i686-freebsd/build-toolchain.sh /tmp/
RUN /tmp/build-toolchain.sh i686
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]
COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh


@@ -13,7 +13,7 @@ set -ex
ARCH=$1
BINUTILS=2.25.1
-GCC=5.3.0
+GCC=6.4.0
hide_output() {
set +x
@@ -86,7 +86,7 @@ rm -rf freebsd
# Finally, download and build gcc to target FreeBSD
mkdir gcc
cd gcc
-curl https://ftp.gnu.org/gnu/gcc/gcc-$GCC/gcc-$GCC.tar.bz2 | tar xjf -
+curl https://ftp.gnu.org/gnu/gcc/gcc-$GCC/gcc-$GCC.tar.gz | tar xzf -
cd gcc-$GCC
./contrib/download_prerequisites


@@ -76,11 +76,6 @@ RUN ./build-cmake.sh
COPY dist-i686-linux/build-headers.sh /tmp/
RUN ./build-headers.sh
-RUN curl -Lo /rustroot/dumb-init \
-    https://github.com/Yelp/dumb-init/releases/download/v1.2.0/dumb-init_1.2.0_amd64 && \
-    chmod +x /rustroot/dumb-init
-ENTRYPOINT ["/rustroot/dumb-init", "--"]
COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh


@@ -13,7 +13,7 @@ set -ex
source shared.sh
VERSION=1.0.2k
-URL=https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/openssl-$VERSION.tar.gz
+URL=https://s3-us-west-1.amazonaws.com/rust-lang-ci2/rust-ci-mirror/openssl-$VERSION.tar.gz
curl $URL | tar xzf -


@@ -17,14 +17,9 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
pkg-config
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]
ENV HOSTS=mips-unknown-linux-gnu
ENV RUST_CONFIGURE_ARGS --host=$HOSTS --enable-extended


@@ -16,14 +16,9 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
libssl-dev \
pkg-config
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]
ENV HOSTS=mips64-unknown-linux-gnuabi64
ENV RUST_CONFIGURE_ARGS --host=$HOSTS --enable-extended


@@ -17,14 +17,9 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
pkg-config
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]
ENV HOSTS=mips64el-unknown-linux-gnuabi64
ENV RUST_CONFIGURE_ARGS --host=$HOSTS --enable-extended


@@ -16,14 +16,9 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
libssl-dev \
pkg-config
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]
ENV HOSTS=mipsel-unknown-linux-gnu
ENV RUST_CONFIGURE_ARGS --host=$HOSTS --enable-extended


@@ -3,11 +3,6 @@ FROM ubuntu:16.04
COPY scripts/cross-apt-packages.sh /scripts/
RUN sh /scripts/cross-apt-packages.sh
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]
# Ubuntu 16.04 (this container) ships with make 4, but something in the
# toolchains we build below chokes on that, so go back to make 3
COPY scripts/make3.sh /scripts/


@@ -3,10 +3,6 @@ FROM ubuntu:16.04
COPY scripts/cross-apt-packages.sh /scripts/
RUN sh /scripts/cross-apt-packages.sh
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]
# Ubuntu 16.04 (this container) ships with make 4, but something in the
# toolchains we build below chokes on that, so go back to make 3


@@ -3,11 +3,6 @@ FROM ubuntu:16.04
COPY scripts/cross-apt-packages.sh /scripts/
RUN sh /scripts/cross-apt-packages.sh
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]
# Ubuntu 16.04 (this container) ships with make 4, but something in the
# toolchains we build below chokes on that, so go back to make 3
COPY scripts/make3.sh /scripts/


@@ -3,11 +3,6 @@ FROM ubuntu:16.04
COPY scripts/cross-apt-packages.sh /scripts/
RUN sh /scripts/cross-apt-packages.sh
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]
# Ubuntu 16.04 (this container) ships with make 4, but something in the
# toolchains we build below chokes on that, so go back to make 3
COPY scripts/make3.sh /scripts/


@@ -19,11 +19,6 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
COPY dist-x86_64-freebsd/build-toolchain.sh /tmp/
RUN /tmp/build-toolchain.sh x86_64
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]
COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh


@@ -13,7 +13,7 @@ set -ex
ARCH=$1
BINUTILS=2.25.1
-GCC=5.3.0
+GCC=6.4.0
hide_output() {
set +x
@@ -86,7 +86,7 @@ rm -rf freebsd
# Finally, download and build gcc to target FreeBSD
mkdir gcc
cd gcc
-curl https://ftp.gnu.org/gnu/gcc/gcc-$GCC/gcc-$GCC.tar.bz2 | tar xjf -
+curl https://ftp.gnu.org/gnu/gcc/gcc-$GCC/gcc-$GCC.tar.gz | tar xzf -
cd gcc-$GCC
./contrib/download_prerequisites


@@ -76,11 +76,6 @@ RUN ./build-cmake.sh
COPY dist-x86_64-linux/build-headers.sh /tmp/
RUN ./build-headers.sh
-RUN curl -Lo /rustroot/dumb-init \
-    https://github.com/Yelp/dumb-init/releases/download/v1.2.0/dumb-init_1.2.0_amd64 && \
-    chmod +x /rustroot/dumb-init
-ENTRYPOINT ["/rustroot/dumb-init", "--"]
COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh


@@ -13,7 +13,7 @@ set -ex
source shared.sh
VERSION=1.0.2k
-URL=https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/openssl-$VERSION.tar.gz
+URL=https://s3-us-west-1.amazonaws.com/rust-lang-ci2/rust-ci-mirror/openssl-$VERSION.tar.gz
curl $URL | tar xzf -


@@ -20,11 +20,6 @@ WORKDIR /build/
COPY dist-x86_64-musl/build-musl.sh /build/
RUN sh /build/build-musl.sh && rm -rf /build
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]
COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh


@@ -3,11 +3,6 @@ FROM ubuntu:16.04
COPY scripts/cross-apt-packages.sh /scripts/
RUN sh /scripts/cross-apt-packages.sh
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]
# Ubuntu 16.04 (this container) ships with make 4, but something in the
# toolchains we build below chokes on that, so go back to make 3
COPY scripts/make3.sh /scripts/


@@ -35,7 +35,7 @@ cd netbsd
mkdir -p /x-tools/x86_64-unknown-netbsd/sysroot
-URL=https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror
+URL=https://s3-us-west-1.amazonaws.com/rust-lang-ci2/rust-ci-mirror
# Originally from ftp://ftp.netbsd.org/pub/NetBSD/NetBSD-$BSD/source/sets/*.tgz
curl $URL/2017-03-17-netbsd-src.tgz | tar xzf -

View File

@@ -14,13 +14,8 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
 xz-utils
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
 COPY scripts/sccache.sh /scripts/
 RUN sh /scripts/sccache.sh
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]
 ENV RUST_CONFIGURE_ARGS --build=i686-unknown-linux-gnu --disable-optimize-tests
 ENV RUST_CHECK_TARGET check

View File

@@ -14,13 +14,8 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
 xz-utils
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
 COPY scripts/sccache.sh /scripts/
 RUN sh /scripts/sccache.sh
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]
 ENV RUST_CONFIGURE_ARGS --build=i686-unknown-linux-gnu
 ENV SCRIPT python2.7 ../x.py test

View File

@@ -57,9 +57,10 @@ mkdir -p $objdir/tmp
 args=
 if [ "$SCCACHE_BUCKET" != "" ]; then
-args="$args --env SCCACHE_BUCKET=$SCCACHE_BUCKET"
-args="$args --env AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID"
-args="$args --env AWS_SECRET_ACCESS_KEY=$AWS_SECRET_ACCESS_KEY"
+args="$args --env SCCACHE_BUCKET"
+args="$args --env SCCACHE_REGION"
+args="$args --env AWS_ACCESS_KEY_ID"
+args="$args --env AWS_SECRET_ACCESS_KEY"
 args="$args --env SCCACHE_ERROR_LOG=/tmp/sccache/sccache.log"
 args="$args --volume $objdir/tmp:/tmp/sccache"
 else
@@ -67,6 +68,13 @@ else
 args="$args --env SCCACHE_DIR=/sccache --volume $HOME/.cache/sccache:/sccache"
 fi
+# Run containers as privileged as it should give them access to some more
+# syscalls such as ptrace and whatnot. In the upgrade to LLVM 5.0 it was
+# discovered that the leak sanitizer apparently needs these syscalls nowadays so
+# we'll need `--privileged` for at least the `x86_64-gnu` builder, so this just
+# goes ahead and sets it for all builders.
+args="$args --privileged"
 exec docker \
 run \
 --volume "$root_dir:/checkout:ro" \
@@ -75,13 +83,14 @@ exec docker \
 --env SRC=/checkout \
 $args \
 --env CARGO_HOME=/cargo \
---env DEPLOY=$DEPLOY \
---env DEPLOY_ALT=$DEPLOY_ALT \
+--env DEPLOY \
+--env DEPLOY_ALT \
 --env LOCAL_USER_ID=`id -u` \
---env TRAVIS=${TRAVIS-false} \
+--env TRAVIS \
 --env TRAVIS_BRANCH \
 --volume "$HOME/.cargo:/cargo" \
 --volume "$HOME/rustsrc:$HOME/rustsrc" \
+--init \
 --rm \
 rust-ci \
 /checkout/src/ci/run.sh
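The hunk above moves every secret from the `--env NAME=$NAME` form to the bare `--env NAME` form. A minimal sketch of the difference (the bucket value below is a made-up placeholder, and no docker is actually invoked): with the bare form, `docker run` resolves the variable from the calling environment at container start, so the value never appears in the command line.

```shell
#!/bin/sh
# Sketch of the two --env forms accepted by `docker run`.
export SCCACHE_BUCKET=example-bucket   # placeholder value for illustration

args=""
# Bare form: docker reads the value from the calling environment, so it
# never shows up in the argument list (or in `ps` output on the host).
args="$args --env SCCACHE_BUCKET"
# Explicit form: the value is baked into the command line itself.
args="$args --env SCCACHE_ERROR_LOG=/tmp/sccache/sccache.log"

echo "docker run$args rust-ci"
```

Both forms set the same variable inside the container; the bare form just keeps secrets out of process listings and CI logs.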

View File

@@ -15,7 +15,7 @@ URL=https://dl.google.com/android/repository
 download_ndk() {
 mkdir -p /android/ndk
 cd /android/ndk
-curl -O $URL/$1
+curl -fO $URL/$1
 unzip -q $1
 rm $1
 mv android-ndk-* ndk

View File

@@ -15,7 +15,7 @@ URL=https://dl.google.com/android/repository
 download_sdk() {
 mkdir -p /android/sdk
 cd /android/sdk
-curl -O $URL/$1
+curl -fO $URL/$1
 unzip -q $1
 rm -rf $1
 }

View File

@@ -11,7 +11,7 @@
 set -ex
 url="http://crosstool-ng.org/download/crosstool-ng/crosstool-ng-1.22.0.tar.bz2"
-curl $url | tar xjf -
+curl -f $url | tar xjf -
 cd crosstool-ng
 ./configure --prefix=/usr/local
 make -j$(nproc)
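Several hunks in this commit add `-f`/`--fail` to `curl` calls like the one above. A sketch of why, with no network involved: by default curl exits 0 even when the server returns an HTTP error page, and in a POSIX pipeline only the last command's exit status counts, so a broken download sails straight past `set -e`. With `-f`, curl exits non-zero and stops emitting the error page, so the right-hand `tar` fails loudly on the truncated stream.

```shell
#!/bin/sh
# Sketch of the masking problem: `false` stands in for a failing download
# piped into tar. Under plain `set -e` (no pipefail), the pipeline's status
# is taken from `cat`, so the script keeps going.
set -e
false | cat
status_masked=yes
echo "still running: left-hand failure was masked ($status_masked)"
```

This is exactly the gap `-f` narrows: it cannot change pipeline semantics, but it ensures the failing fetch produces output that the downstream command refuses to accept.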

View File

@@ -28,14 +28,14 @@ exit 1
 }
 # Download last known good emscripten from WebAssembly waterfall
-BUILD=$(curl -L https://storage.googleapis.com/wasm-llvm/builds/linux/lkgr.json | \
+BUILD=$(curl -fL https://storage.googleapis.com/wasm-llvm/builds/linux/lkgr.json | \
 jq '.build | tonumber')
-curl -L https://storage.googleapis.com/wasm-llvm/builds/linux/$BUILD/wasm-binaries.tbz2 | \
+curl -sL https://storage.googleapis.com/wasm-llvm/builds/linux/$BUILD/wasm-binaries.tbz2 | \
 hide_output tar xvkj
 # node 8 is required to run wasm
 cd /
-curl -L https://nodejs.org/dist/v8.0.0/node-v8.0.0-linux-x64.tar.xz | \
+curl -sL https://nodejs.org/dist/v8.0.0/node-v8.0.0-linux-x64.tar.xz | \
 tar -xJ
 # Make emscripten use wasm-ready node and LLVM tools

View File

@@ -28,7 +28,7 @@ exit 1
 }
 cd /
-curl -L https://s3.amazonaws.com/mozilla-games/emscripten/releases/emsdk-portable.tar.gz | \
+curl -fL https://s3.amazonaws.com/mozilla-games/emscripten/releases/emsdk-portable.tar.gz | \
 tar -xz
 cd /emsdk-portable
@@ -49,5 +49,5 @@ chmod a+rxw -R /emsdk-portable
 # node 8 is required to run wasm
 cd /
-curl -L https://nodejs.org/dist/v8.0.0/node-v8.0.0-linux-x64.tar.xz | \
+curl -sL https://nodejs.org/dist/v8.0.0/node-v8.0.0-linux-x64.tar.xz | \
 tar -xJ

View File

@@ -10,7 +10,7 @@
 set -ex
-curl https://ftp.gnu.org/gnu/make/make-3.81.tar.gz | tar xzf -
+curl -f https://ftp.gnu.org/gnu/make/make-3.81.tar.gz | tar xzf -
 cd make-3.81
 ./configure --prefix=/usr
 make

View File

@@ -8,9 +8,11 @@
 # option. This file may not be copied, modified, or distributed
 # except according to those terms.
+# ignore-tidy-linelength
 set -ex
-curl -o /usr/local/bin/sccache \
-      https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-05-12-sccache-x86_64-unknown-linux-musl
+curl -fo /usr/local/bin/sccache \
+      https://s3-us-west-1.amazonaws.com/rust-lang-ci2/rust-ci-mirror/2017-05-12-sccache-x86_64-unknown-linux-musl
 chmod +x /usr/local/bin/sccache

View File

@@ -14,13 +14,8 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
 xz-utils \
 pkg-config
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
 COPY scripts/sccache.sh /scripts/
 RUN sh /scripts/sccache.sh
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]
 ENV RUST_CONFIGURE_ARGS --build=x86_64-unknown-linux-gnu
 ENV RUST_CHECK_TARGET check-aux

View File

@@ -13,14 +13,9 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
 gdb \
 xz-utils
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
 COPY scripts/sccache.sh /scripts/
 RUN sh /scripts/sccache.sh
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]
 ENV RUST_CONFIGURE_ARGS \
 --build=x86_64-unknown-linux-gnu \
 --enable-debug \

View File

@@ -15,14 +15,9 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
 libssl-dev \
 pkg-config
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
 COPY scripts/sccache.sh /scripts/
 RUN sh /scripts/sccache.sh
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]
 ENV RUST_CONFIGURE_ARGS --build=x86_64-unknown-linux-gnu
 ENV SCRIPT python2.7 ../x.py test distcheck
 ENV DIST_SRC 1

View File

@@ -13,14 +13,9 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
 gdb \
 xz-utils
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
 COPY scripts/sccache.sh /scripts/
 RUN sh /scripts/sccache.sh
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]
 ENV RUST_CONFIGURE_ARGS \
 --build=x86_64-unknown-linux-gnu \
 --enable-full-bootstrap

View File

@@ -13,14 +13,9 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
 gdb \
 xz-utils
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
 COPY scripts/sccache.sh /scripts/
 RUN sh /scripts/sccache.sh
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]
 ENV RUST_CONFIGURE_ARGS --build=x86_64-unknown-linux-gnu
 ENV RUSTFLAGS -Zincremental=/tmp/rust-incr-cache
 ENV RUST_CHECK_TARGET check

View File

@@ -16,14 +16,9 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
 zlib1g-dev \
 xz-utils
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
 COPY scripts/sccache.sh /scripts/
 RUN sh /scripts/sccache.sh
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]
 ENV RUST_CONFIGURE_ARGS \
 --build=x86_64-unknown-linux-gnu \
 --llvm-root=/usr/lib/llvm-3.7

View File

@@ -13,13 +13,8 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
 gdb \
 xz-utils
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
 COPY scripts/sccache.sh /scripts/
 RUN sh /scripts/sccache.sh
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]
 ENV RUST_CONFIGURE_ARGS --build=x86_64-unknown-linux-gnu --disable-optimize-tests
 ENV RUST_CHECK_TARGET check

View File

@@ -13,13 +13,8 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
 gdb \
 xz-utils
-COPY scripts/dumb-init.sh /scripts/
-RUN sh /scripts/dumb-init.sh
 COPY scripts/sccache.sh /scripts/
 RUN sh /scripts/sccache.sh
-ENTRYPOINT ["/usr/bin/dumb-init", "--"]
 ENV RUST_CONFIGURE_ARGS --build=x86_64-unknown-linux-gnu --enable-sanitizers --enable-profiler
 ENV SCRIPT python2.7 ../x.py test

View File

@@ -31,7 +31,6 @@ RUST_CONFIGURE_ARGS="$RUST_CONFIGURE_ARGS --enable-sccache"
 RUST_CONFIGURE_ARGS="$RUST_CONFIGURE_ARGS --disable-manage-submodules"
 RUST_CONFIGURE_ARGS="$RUST_CONFIGURE_ARGS --enable-locked-deps"
 RUST_CONFIGURE_ARGS="$RUST_CONFIGURE_ARGS --enable-cargo-openssl-static"
-RUST_CONFIGURE_ARGS="$RUST_CONFIGURE_ARGS --enable-llvm-clean-rebuild"
 if [ "$DIST_SRC" = "" ]; then
 RUST_CONFIGURE_ARGS="$RUST_CONFIGURE_ARGS --disable-dist-src"
@@ -74,6 +73,12 @@ retry make prepare
 travis_fold end make-prepare
 travis_time_finish
+travis_fold start check-bootstrap
+travis_time_start
+make check-bootstrap
+travis_fold end check-bootstrap
+travis_time_finish
 if [ "$TRAVIS_OS_NAME" = "osx" ]; then
 ncpus=$(sysctl -n hw.ncpu)
 else

View File

@@ -20,8 +20,14 @@ cargo run -- ../../second-edition/src
 cd ../..
-# tests for the second edition
+# tests for the first edition
+cd first-edition
+mdbook test
+mdbook build
+cd ..
+
+# tests for the second edition
 cd second-edition
 bash spellcheck.sh list
 mdbook test
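The resulting script checks the two book editions one after another. A condensed sketch of the shared shape of those sections (commands are collected as strings rather than executed, and per-edition extras such as the second edition's `spellcheck.sh` pass are omitted):

```shell
#!/bin/sh
# Condensed sketch of the per-edition steps from the script above;
# each edition gets an mdbook test followed by an mdbook build.
cmds=""
for edition in first-edition second-edition; do
  cmds="$cmds(cd $edition && mdbook test && mdbook build); "
done
echo "$cmds"
```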

View File

@@ -53,7 +53,7 @@ Uh oh! Your reference is pointing to an invalid resource. This is called a
 dangling pointer or use after free, when the resource is memory. A small
 example of such a situation would be:
-```rust,compile_fail
+```rust,ignore
 let r; // Introduce reference: `r`.
 {
 let i = 1; // Introduce scoped value: `i`.
@@ -70,7 +70,7 @@ as it can see the lifetimes of the various values in the function.
 When we have a function that takes arguments by reference the situation becomes
 more complex. Consider the following example:
-```rust,compile_fail,E0106
+```rust,ignore
 fn skip_prefix(line: &str, prefix: &str) -> &str {
 // ...
 # line

Some files were not shown because too many files have changed in this diff.