New upstream version 1.34.2+dfsg1

Ximin Luo 2019-05-19 01:32:59 -07:00
parent 0731742a19
commit 9fa0177847
3499 changed files with 104681 additions and 143705 deletions


@ -19,9 +19,16 @@ hop on [#rust-internals][pound-rust-internals].
As a reminder, all contributors are expected to follow our [Code of Conduct][coc].
The [rustc-guide] is your friend! It describes how the compiler works and how
to contribute to it in more detail than this document.
If this is your first time contributing, the [walkthrough] chapter of the guide
can give you a good example of how a typical contribution would go.
[pound-rust-internals]: https://chat.mibbit.com/?server=irc.mozilla.org&channel=%23rust-internals
[internals]: https://internals.rust-lang.org
[coc]: https://www.rust-lang.org/conduct.html
[walkthrough]: https://rust-lang.github.io/rustc-guide/walkthrough.html
## Feature Requests
[feature-requests]: #feature-requests
@ -89,222 +96,14 @@ $ RUST_BACKTRACE=1 rustc ...
```
## The Build System
[the-build-system]: #the-build-system
Rust's build system allows you to bootstrap the compiler, run tests &
benchmarks, generate documentation, install a fresh build of Rust, and more.
It's your best friend when working on Rust, allowing you to compile & test
your contributions before submission.
For information on how to configure and build the compiler, please see [this
chapter][rustcguidebuild] of the rustc-guide. It covers contributions to both the
compiler and the standard library, and lists some really useful commands for the
build system (`./x.py`) that can save you a lot of time.
The build system lives in [the `src/bootstrap` directory][bootstrap] in the
project root. Our build system is itself written in Rust and is based on Cargo
to actually build all the compiler's crates. If you have questions on the build
system internals, try asking in [`#rust-internals`][pound-rust-internals].
[bootstrap]: https://github.com/rust-lang/rust/tree/master/src/bootstrap/
### Configuration
[configuration]: #configuration
Before you can start building the compiler you need to configure the build for
your system. In most cases, that will just mean using the defaults provided
for Rust.
To change configuration, you must copy the file `config.toml.example`
to `config.toml` in the directory from which you will be running the build, and
change the settings provided.
There are a large number of options provided in this config file that will alter the
configuration used in the build process. Some options to note:
#### `[llvm]`:
- `assertions = true` - This enables LLVM assertions, which make LLVM misuse cause an assertion failure instead of weird misbehavior. This also slows down the compiler's runtime by ~20%.
- `ccache = true` - Use ccache when building LLVM
#### `[build]`:
- `compiler-docs = true` - Build compiler documentation
#### `[rust]`:
- `debuginfo = true` - Build a compiler with debuginfo. Makes building rustc slower, but then you can use a debugger to debug `rustc`.
- `debuginfo-lines = true` - An alternative to `debuginfo = true` that doesn't let you use a debugger, but doesn't make building rustc slower and still gives you line numbers in backtraces.
- `debuginfo-tools = true` - Build the extended tools with debuginfo.
- `debug-assertions = true` - Makes the log output of `debug!` work.
- `optimize = false` - Disable optimizations to speed up compilation of stage1 rust, but note that this makes the stage1 compiler about 100x slower.
For more options, the `config.toml` file contains commented out defaults, with
descriptions of what each option will do.
Note: Previously the `./configure` script was used to configure this
project. It can still be used, but it's recommended to use a `config.toml`
file. If you still have a `config.mk` file in your directory - from
`./configure` - you may need to delete it for `config.toml` to work.
### Building
[building]: #building
A default configuration requires around 3.5 GB of disk space, whereas building a debug configuration may require more than 30 GB.
Dependencies
- [build dependencies](README.md#building-from-source)
- `gdb` 6.2.0 minimum, 7.1 or later recommended for test builds
The build system uses the `x.py` script to control the build process. This script
is used to build, test, and document various parts of the compiler. You can
execute it as:
```sh
python x.py build
```
On some systems you can also use the shorter version:
```sh
./x.py build
```
To learn more about the driver and top-level targets, you can execute:
```sh
python x.py --help
```
The general format for the driver script is:
```sh
python x.py <command> [<directory>]
```
Some example commands are `build`, `test`, and `doc`. These will build, test,
and document the specified directory. The second argument, `<directory>`, is
optional and defaults to working over the entire compiler. If specified,
however, only that specific directory will be built. For example:
```sh
# build the entire compiler
python x.py build
# build all documentation
python x.py doc
# run all test suites
python x.py test
# build only the standard library
python x.py build src/libstd
# test only one particular test suite
python x.py test src/test/rustdoc
# build only the stage0 libcore library
python x.py build src/libcore --stage 0
```
You can explore the build system through the various `--help` pages for each
subcommand. For example, to learn more about a command you can run:
```
python x.py build --help
```
To learn about all possible rules you can execute, run:
```
python x.py build --help --verbose
```
Note: Previously `./configure` and `make` were used to build this project.
They are still available, but `x.py` is the recommended build system.
### Useful commands
[useful-commands]: #useful-commands
Some common invocations of `x.py` are:
- `x.py build --help` - show the help message and explain the subcommand
- `x.py build src/libtest --stage 1` - build up to (and including) the first
stage. For most cases we don't need to build the stage2 compiler, so we can
save time by not building it. The stage1 compiler is a fully functioning
compiler and (probably) will be enough to determine if your change works as
expected.
- `x.py build src/rustc --stage 1` - This will build just rustc, without libstd.
This is the fastest way to recompile after you changed only rustc source code.
Note however that the resulting rustc binary won't have a stdlib to link
against by default. You can build libstd once with `x.py build src/libstd`,
but it is only guaranteed to work if recompiled, so if there are any issues
recompile it.
- `x.py test` - build the full compiler & run all tests (takes a while). This
is what gets run by the continuous integration system against your pull
request. You should run this before submitting to make sure your tests pass
& everything builds in the correct manner.
- `x.py test src/libstd --stage 1` - test the standard library without
recompiling stage 2.
- `x.py test src/test/run-pass --test-args TESTNAME` - Run a matching set of
tests.
- `TESTNAME` should be a substring of the tests to match against e.g. it could
be the fully qualified test name, or just a part of it.
`TESTNAME=collections::hash::map::test_map::test_capacity_not_less_than_len`
or `TESTNAME=test_capacity_not_less_than_len`.
- `x.py test src/test/run-pass --stage 1 --test-args <substring-of-test-name>` -
Run a single rpass test with the stage1 compiler (this will be quicker than
running the command above as we only build the stage1 compiler, not the entire
thing). You can also leave off the directory argument to run all stage1 test
types.
- `x.py test src/libcore --stage 1` - Run stage1 tests in `libcore`.
- `x.py test src/tools/tidy` - Check that the source code is in compliance with
Rust's style guidelines. There is no official document describing Rust's full
guidelines as of yet, but basic rules like 4 spaces for indentation and no
more than 99 characters in a single line should be kept in mind when writing
code.
### Using your local build
[using-local-build]: #using-local-build
If you use Rustup to manage your rust install, it has a feature called ["custom
toolchains"][toolchain-link] that you can use to access your newly-built compiler
without having to install it to your system or user PATH. If you've run `python
x.py build`, then you can add your custom rustc to a new toolchain like this:
[toolchain-link]: https://github.com/rust-lang-nursery/rustup.rs#working-with-custom-toolchains-and-local-builds
```
rustup toolchain link <name> build/<host-triple>/stage2
```
Where `<host-triple>` is the build triple for the host (the triple of your
computer, by default), and `<name>` is the name for your custom toolchain. (If you
added `--stage 1` to your build command, the compiler will be in the `stage1`
folder instead.) You'll only need to do this once - it will automatically point
to the latest build you've done.
Once this is set up, you can use your custom toolchain just like any other. For
example, if you've named your toolchain `local`, running `cargo +local build` will
compile a project with your custom rustc, setting `rustup override set local` will
override the toolchain for your current directory, and `cargo +local doc` will use
your custom rustc and rustdoc to generate docs. (If you do this with a `--stage 1`
build, you'll need to build rustdoc specially, since it's not normally built in
stage 1. `python x.py build --stage 1 src/libstd src/tools/rustdoc` will build
rustdoc and libstd, which will allow rustdoc to be run with that toolchain.)
### Out-of-tree builds
[out-of-tree-builds]: #out-of-tree-builds
Rust's `x.py` script fully supports out-of-tree builds - it looks for
the Rust source code from the directory `x.py` was found in, but it
reads the `config.toml` configuration file from the directory it's
run in, and places all build artifacts within a subdirectory named `build`.
This means that if you want to do an out-of-tree build, you can just do it:
```
$ cd my/build/dir
$ cp ~/my-config.toml config.toml # Or fill in config.toml otherwise
$ path/to/rust/x.py build
...
$ # This will use the Rust source code in `path/to/rust`, but build
$ # artifacts will now be in ./build
```
It's absolutely fine to have multiple build directories with different
`config.toml` configurations using the same code.
[rustcguidebuild]: https://rust-lang.github.io/rustc-guide/how-to-build-and-run.html
## Pull Requests
[pull-requests]: #pull-requests
@ -320,26 +119,13 @@ bring those changes into the source repository.
Please make pull requests against the `master` branch.
Running all of `./x.py test` can take a while. When testing your pull request,
consider using one of the more specialized `./x.py` targets to cut down on the
amount of time you have to wait. You need to have built the compiler at least
once before these will work, but that's only one full build rather than
one each time.
$ python x.py test --stage 1
is one such example, which builds just `rustc`, and then runs the tests. If
you're adding something to the standard library, try
$ python x.py test src/libstd --stage 1
Please make sure your pull request is in compliance with Rust's style
guidelines by running
$ python x.py test src/tools/tidy
Make this check before every pull request (and every new commit in a pull
request) ; you can add [git hooks](https://git-scm.com/book/en/v2/Customizing-Git-Git-Hooks)
request); you can add [git hooks](https://git-scm.com/book/en/v2/Customizing-Git-Git-Hooks)
before every push to make sure you never forget to make this check.
All pull requests are reviewed by another person. We have a bot,
@ -375,10 +161,10 @@ it can be found [here][rctd].
Currently building Rust will also build the following external projects:
* [clippy](https://github.com/rust-lang-nursery/rust-clippy)
* [miri](https://github.com/solson/miri)
* [rustfmt](https://github.com/rust-lang-nursery/rustfmt)
* [rls](https://github.com/rust-lang-nursery/rls/)
* [clippy](https://github.com/rust-lang/rust-clippy)
* [miri](https://github.com/rust-lang/miri)
* [rustfmt](https://github.com/rust-lang/rustfmt)
* [rls](https://github.com/rust-lang/rls/)
We allow breakage of these tools in the nightly channel. Maintainers of these
projects will be notified of the breakages and should fix them as soon as
@ -405,9 +191,9 @@ before the PR is merged.
Rust's build system builds a number of tools that make use of the
internals of the compiler. This includes
[Clippy](https://github.com/rust-lang-nursery/rust-clippy),
[RLS](https://github.com/rust-lang-nursery/rls) and
[rustfmt](https://github.com/rust-lang-nursery/rustfmt). If these tools
[Clippy](https://github.com/rust-lang/rust-clippy),
[RLS](https://github.com/rust-lang/rls) and
[rustfmt](https://github.com/rust-lang/rustfmt). If these tools
break because of your changes, you may run into a sort of "chicken and egg"
problem. These tools rely on the latest compiler to be built so you can't update
them to reflect your changes to the compiler until those changes are merged into
@ -467,10 +253,10 @@ to complete a few more steps which are outlined with their rationale below.
*(This error may change in the future to include more information.)*
```
error: failed to resolve patches for `https://github.com/rust-lang-nursery/rustfmt`
error: failed to resolve patches for `https://github.com/rust-lang/rustfmt`
Caused by:
patch for `rustfmt-nightly` in `https://github.com/rust-lang-nursery/rustfmt` did not resolve to any crates
patch for `rustfmt-nightly` in `https://github.com/rust-lang/rustfmt` did not resolve to any crates
failed to run: ~/rust/build/x86_64-unknown-linux-gnu/stage0/bin/cargo build --manifest-path ~/rust/src/bootstrap/Cargo.toml
```
@ -532,6 +318,12 @@ to check small fixes. For example, `rustdoc src/doc/reference.md` will render
the reference to `doc/reference.html`. The CSS might be messed up, but you can
verify that the HTML is right.
Additionally, contributions to the [rustc-guide] are always welcome. Contributions
can be made directly at [the
rust-lang/rustc-guide](https://github.com/rust-lang/rustc-guide) repo. The issue
tracker in that repo is also a great way to find things that need doing. There
are issues for beginners and advanced compiler devs alike!
## Issue Triage
[issue-triage]: #issue-triage
@ -627,7 +419,7 @@ For people new to Rust, and just starting to contribute, or even for
more seasoned developers, some useful places to look for information
are:
* The [rustc guide] contains information about how various parts of the compiler work
* The [rustc guide] contains information about how various parts of the compiler work and how to contribute to the compiler
* [Rust Forge][rustforge] contains additional documentation, including write-ups of how to achieve common tasks
* The [Rust Internals forum][rif], a place to ask questions and
discuss Rust's internals


@ -23,7 +23,7 @@ The Rust Project includes packages written by third parties.
The following third party packages are included, and carry
their own copyright notices and license terms:
* LLVM. Code for this package is found in src/llvm.
* LLVM. Code for this package is found in src/llvm-project.
Copyright (c) 2003-2013 University of Illinois at
Urbana-Champaign. All rights reserved.
@ -73,8 +73,8 @@ their own copyright notices and license terms:
OTHER DEALINGS WITH THE SOFTWARE.
* Additional libraries included in LLVM carry separate
BSD-compatible licenses. See src/llvm/LICENSE.txt for
details.
BSD-compatible licenses. See src/llvm-project/llvm/LICENSE.TXT
for details.
* compiler-rt, in src/compiler-rt is dual licensed under
LLVM's license and MIT:

Cargo.lock (generated, 985 lines changed): diff suppressed because it is too large.


@ -13,9 +13,13 @@ Read ["Installation"] from [The Book].
["Installation"]: https://doc.rust-lang.org/book/ch01-01-installation.html
[The Book]: https://doc.rust-lang.org/book/index.html
## Building from Source
## Installing from Source
[building-from-source]: #building-from-source
_Note: If you wish to contribute to the compiler, you should read
[this chapter](https://rust-lang.github.io/rustc-guide/how-to-build-and-run.html)
of the rustc-guide instead._
### Building on *nix
1. Make sure you have installed the dependencies:


@ -1,3 +1,168 @@
Version 1.34.2 (2019-05-14)
===========================
* [Destabilize the `Error::type_id` function due to a security
vulnerability][60785]
[60785]: https://github.com/rust-lang/rust/pull/60785
Version 1.34.1 (2019-04-25)
===========================
* [Fix false positives for the `redundant_closure` Clippy lint][clippy/3821]
* [Fix false positives for the `missing_const_for_fn` Clippy lint][clippy/3844]
* [Fix Clippy panic when checking some macros][clippy/3805]
[clippy/3821]: https://github.com/rust-lang/rust-clippy/pull/3821
[clippy/3844]: https://github.com/rust-lang/rust-clippy/pull/3844
[clippy/3805]: https://github.com/rust-lang/rust-clippy/pull/3805
Version 1.34.0 (2019-04-11)
==========================
Language
--------
- [You can now use `#[deprecated = "reason"]`][58166] as a shorthand for
`#[deprecated(note = "reason")]`. This was previously allowed by mistake
but had no effect.
- [You can now accept token streams in `#[attr()]`, `#[attr[]]`, and
`#[attr{}]` procedural macros.][57367]
- [You can now write `extern crate self as foo;`][57407] to import your
crate's root into the extern prelude.
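As a rough illustration of the attribute shorthand and `extern crate self` items above, the following compiles on 1.34 (the function and alias names are made up for the example):
```rust
// `#[deprecated = "reason"]` now behaves like `#[deprecated(note = "reason")]`
// instead of being accepted but ignored.
#[deprecated = "use `new_api` instead"]
pub fn old_api() {}

pub fn new_api() {}

// Imports the crate's own root into the extern prelude under a new name.
extern crate self as this_crate;

fn main() {
    new_api();
    this_crate::new_api();
    old_api(); // compiles, but warns with the note given above
}
```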
Compiler
--------
- [You can now target `riscv64imac-unknown-none-elf` and
`riscv64gc-unknown-none-elf`.][58406]
- [You can now enable linker plugin LTO optimisations with
`-C linker-plugin-lto`.][58057] This allows rustc to compile your Rust code
into LLVM bitcode allowing LLVM to perform LTO optimisations across C/C++ FFI
boundaries.
- [You can now target `powerpc64-unknown-freebsd`.][57809]
Libraries
---------
- [The trait bounds have been removed on some of `HashMap<K, V, S>`'s and
`HashSet<T, S>`'s basic methods.][58370] Most notably you no longer require
the `Hash` trait to create an iterator.
- [The `Ord` trait bounds have been removed on some of `BinaryHeap<T>`'s basic
methods.][58421] Most notably you no longer require the `Ord` trait to create
an iterator.
- [The methods `overflowing_neg` and `wrapping_neg` are now `const` functions
for all numeric types.][58044]
- [Indexing a `str` is now generic over all types that
implement `SliceIndex<str>`.][57604]
- [`str::trim`, `str::trim_matches`, `str::trim_{start, end}`, and
`str::trim_{start, end}_matches` are now `#[must_use]`][57106] and will
produce a warning if their return value is unused.
- [The methods `checked_pow`, `saturating_pow`, `wrapping_pow`, and
`overflowing_pow` are now available for all numeric types.][57873] These are
equivalent to methods such as `wrapping_add` for the `pow` operation.
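A brief sketch of two of the library changes above: calling `HashMap` accessors from generic code without `Hash`/`BuildHasher` bounds, and the `*_pow` methods on a small integer type (names here are illustrative only):
```rust
use std::collections::HashMap;
use std::fmt::Debug;

// With the relaxed bounds, iterating a `HashMap` in generic code no longer
// requires `K: Hash + Eq` or `S: BuildHasher`.
fn dump<K: Debug, V: Debug, S>(map: &HashMap<K, V, S>) {
    for (k, v) in map.iter() {
        println!("{:?} => {:?}", k, v);
    }
}

fn main() {
    let mut scores = HashMap::new();
    scores.insert("alice", 3u32);
    dump(&scores);

    // The `*_pow` methods are now available on every numeric type.
    assert_eq!(2u8.checked_pow(7), Some(128));
    assert_eq!(2u8.checked_pow(8), None); // 256 does not fit in a u8
    assert_eq!(2u8.saturating_pow(8), 255);
    assert_eq!(3u8.wrapping_pow(6), 217); // 729 mod 256
}
```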
Stabilized APIs
---------------
#### std & core
* [`Any::type_id`]
* [`Error::type_id`]
* [`atomic::AtomicI16`]
* [`atomic::AtomicI32`]
* [`atomic::AtomicI64`]
* [`atomic::AtomicI8`]
* [`atomic::AtomicU16`]
* [`atomic::AtomicU32`]
* [`atomic::AtomicU64`]
* [`atomic::AtomicU8`]
* [`convert::Infallible`]
* [`convert::TryFrom`]
* [`convert::TryInto`]
* [`iter::from_fn`]
* [`iter::successors`]
* [`num::NonZeroI128`]
* [`num::NonZeroI16`]
* [`num::NonZeroI32`]
* [`num::NonZeroI64`]
* [`num::NonZeroI8`]
* [`num::NonZeroIsize`]
* [`slice::sort_by_cached_key`]
* [`str::escape_debug`]
* [`str::escape_default`]
* [`str::escape_unicode`]
* [`str::split_ascii_whitespace`]
#### std
* [`Instant::checked_add`]
* [`Instant::checked_sub`]
* [`SystemTime::checked_add`]
* [`SystemTime::checked_sub`]
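A small usage sketch touching a few of the APIs stabilized above (`TryFrom`, `iter::successors`, and `str::split_ascii_whitespace`); nothing here is specific to this repository:
```rust
use std::convert::TryFrom;
use std::iter;

fn main() {
    // Fallible integer conversions via `TryFrom`.
    assert!(u8::try_from(300i32).is_err());
    assert_eq!(u8::try_from(42i32).unwrap(), 42u8);

    // `iter::successors` builds an iterator from a seed and a closure.
    let powers: Vec<u64> = iter::successors(Some(1u64), |&x| x.checked_mul(2))
        .take(5)
        .collect();
    assert_eq!(powers, [1, 2, 4, 8, 16]);

    // `str::split_ascii_whitespace` splits on ASCII whitespace only.
    let words: Vec<&str> = " a\tb  c ".split_ascii_whitespace().collect();
    assert_eq!(words, ["a", "b", "c"]);
}
```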
Cargo
-----
- [You can now use alternative registries to crates.io.][cargo/6654]
Misc
----
- [You can now use the `?` operator in your documentation tests without manually
adding `fn main() -> Result<(), _> {}`.][56470]
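A hedged sketch of the doc-test change above: the hidden trailing `Ok::<(), E>(())` line is what lets rustdoc infer a `Result`-returning wrapper, so `?` can be used without writing `fn main()` yourself (the item below is invented for the example):
````rust
/// Example item carrying a doc test that uses `?`.
///
/// ```
/// let minor: u32 = "1.34".split('.').nth(1).unwrap().parse()?;
/// assert_eq!(minor, 34);
/// # Ok::<(), std::num::ParseIntError>(())
/// ```
pub fn doc_example() {}
````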
Compatibility Notes
-------------------
- [`Command::before_exec` is now deprecated in favor of the
unsafe method `Command::pre_exec`.][58059]
- [Use of `ATOMIC_{BOOL, ISIZE, USIZE}_INIT` is now deprecated][57425], as you
can now use `const` functions in `static` variables.
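For the `ATOMIC_*_INIT` deprecation above, the replacement is to initialize the `static` by calling the now-`const` constructor directly; a minimal example:
```rust
use std::sync::atomic::{AtomicUsize, Ordering};

// `AtomicUsize::new` is a `const fn`, so the deprecated `ATOMIC_USIZE_INIT`
// constant is no longer needed to initialize a static.
static REQUESTS: AtomicUsize = AtomicUsize::new(0);

fn main() {
    REQUESTS.fetch_add(1, Ordering::SeqCst);
    assert_eq!(REQUESTS.load(Ordering::SeqCst), 1);
}
```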
[58370]: https://github.com/rust-lang/rust/pull/58370/
[58406]: https://github.com/rust-lang/rust/pull/58406/
[58421]: https://github.com/rust-lang/rust/pull/58421/
[58166]: https://github.com/rust-lang/rust/pull/58166/
[58044]: https://github.com/rust-lang/rust/pull/58044/
[58057]: https://github.com/rust-lang/rust/pull/58057/
[58059]: https://github.com/rust-lang/rust/pull/58059/
[57809]: https://github.com/rust-lang/rust/pull/57809/
[57873]: https://github.com/rust-lang/rust/pull/57873/
[57604]: https://github.com/rust-lang/rust/pull/57604/
[57367]: https://github.com/rust-lang/rust/pull/57367/
[57407]: https://github.com/rust-lang/rust/pull/57407/
[57425]: https://github.com/rust-lang/rust/pull/57425/
[57106]: https://github.com/rust-lang/rust/pull/57106/
[56470]: https://github.com/rust-lang/rust/pull/56470/
[cargo/6654]: https://github.com/rust-lang/cargo/pull/6654/
[`Any::type_id`]: https://doc.rust-lang.org/std/any/trait.Any.html#tymethod.type_id
[`Error::type_id`]: https://doc.rust-lang.org/std/error/trait.Error.html#method.type_id
[`atomic::AtomicI16`]: https://doc.rust-lang.org/std/sync/atomic/struct.AtomicI16.html
[`atomic::AtomicI32`]: https://doc.rust-lang.org/std/sync/atomic/struct.AtomicI32.html
[`atomic::AtomicI64`]: https://doc.rust-lang.org/std/sync/atomic/struct.AtomicI64.html
[`atomic::AtomicI8`]: https://doc.rust-lang.org/std/sync/atomic/struct.AtomicI8.html
[`atomic::AtomicU16`]: https://doc.rust-lang.org/std/sync/atomic/struct.AtomicU16.html
[`atomic::AtomicU32`]: https://doc.rust-lang.org/std/sync/atomic/struct.AtomicU32.html
[`atomic::AtomicU64`]: https://doc.rust-lang.org/std/sync/atomic/struct.AtomicU64.html
[`atomic::AtomicU8`]: https://doc.rust-lang.org/std/sync/atomic/struct.AtomicU8.html
[`convert::Infallible`]: https://doc.rust-lang.org/std/convert/enum.Infallible.html
[`convert::TryFrom`]: https://doc.rust-lang.org/std/convert/trait.TryFrom.html
[`convert::TryInto`]: https://doc.rust-lang.org/std/convert/trait.TryInto.html
[`iter::from_fn`]: https://doc.rust-lang.org/std/iter/fn.from_fn.html
[`iter::successors`]: https://doc.rust-lang.org/std/iter/fn.successors.html
[`num::NonZeroI128`]: https://doc.rust-lang.org/std/num/struct.NonZeroI128.html
[`num::NonZeroI16`]: https://doc.rust-lang.org/std/num/struct.NonZeroI16.html
[`num::NonZeroI32`]: https://doc.rust-lang.org/std/num/struct.NonZeroI32.html
[`num::NonZeroI64`]: https://doc.rust-lang.org/std/num/struct.NonZeroI64.html
[`num::NonZeroI8`]: https://doc.rust-lang.org/std/num/struct.NonZeroI8.html
[`num::NonZeroIsize`]: https://doc.rust-lang.org/std/num/struct.NonZeroIsize.html
[`slice::sort_by_cached_key`]: https://doc.rust-lang.org/std/primitive.slice.html#method.sort_by_cached_key
[`str::escape_debug`]: https://doc.rust-lang.org/std/primitive.str.html#method.escape_debug
[`str::escape_default`]: https://doc.rust-lang.org/std/primitive.str.html#method.escape_default
[`str::escape_unicode`]: https://doc.rust-lang.org/std/primitive.str.html#method.escape_unicode
[`str::split_ascii_whitespace`]: https://doc.rust-lang.org/std/primitive.str.html#method.split_ascii_whitespace
[`Instant::checked_add`]: https://doc.rust-lang.org/std/time/struct.Instant.html#method.checked_add
[`Instant::checked_sub`]: https://doc.rust-lang.org/std/time/struct.Instant.html#method.checked_sub
[`SystemTime::checked_add`]: https://doc.rust-lang.org/std/time/struct.SystemTime.html#method.checked_add
[`SystemTime::checked_sub`]: https://doc.rust-lang.org/std/time/struct.SystemTime.html#method.checked_sub
Version 1.33.0 (2019-02-28)
==========================
@ -99,6 +264,8 @@ Stabilized APIs
Cargo
-----
- [You can now publish crates that require a feature flag to compile with
`cargo publish --features` or `cargo publish --all-features`.][cargo/6453]
- [Cargo should now rebuild a crate if a file was modified during the initial
build.][cargo/6484]
@ -110,8 +277,11 @@ Compatibility Notes
methods instead.
- The `Error::cause` method has been deprecated in favor of `Error::source` which supports
downcasting.
- [Libtest no longer creates a new thread for each test when
`--test-threads=1`. It also runs the tests in deterministic order][56243]
[55982]: https://github.com/rust-lang/rust/pull/55982/
[56243]: https://github.com/rust-lang/rust/pull/56243
[56303]: https://github.com/rust-lang/rust/pull/56303/
[56351]: https://github.com/rust-lang/rust/pull/56351/
[56362]: https://github.com/rust-lang/rust/pull/56362
@ -132,6 +302,7 @@ Compatibility Notes
[57535]: https://github.com/rust-lang/rust/pull/57535/
[57566]: https://github.com/rust-lang/rust/pull/57566/
[57615]: https://github.com/rust-lang/rust/pull/57615/
[cargo/6453]: https://github.com/rust-lang/cargo/pull/6453/
[cargo/6484]: https://github.com/rust-lang/cargo/pull/6484/
[`unix::FileExt::read_exact_at`]: https://doc.rust-lang.org/std/os/unix/fs/trait.FileExt.html#method.read_exact_at
[`unix::FileExt::write_all_at`]: https://doc.rust-lang.org/std/os/unix/fs/trait.FileExt.html#method.write_all_at


@ -90,12 +90,21 @@
# with clang-cl, so this is special in that it only compiles LLVM with clang-cl
#clang-cl = '/path/to/clang-cl.exe'
# Pass extra compiler and linker flags to the LLVM CMake build.
#cflags = "-fextra-flag"
#cxxflags = "-fextra-flag"
#ldflags = "-Wl,extra-flag"
# Use libc++ when building LLVM instead of libstdc++. This is the default on
# platforms that already use libc++ as their default C++ library, but this option
# allows you to use libc++ even on platforms where it is not. You need to ensure
# that your host compiler ships with libc++.
#use-libcxx = true
# The value specified here will be passed as `-DLLVM_USE_LINKER` to CMake.
#use-linker = "lld"
# =============================================================================
# General build configuration options
# =============================================================================
@ -312,8 +321,8 @@
# Whether to always use incremental compilation when building rustc
#incremental = false
# Build rustc with experimental parallelization
#experimental-parallel-queries = false
# Build a multi-threaded rustc
#parallel-compiler = false
# The default linker that will be hard-coded into the generated compiler for
# targets that don't specify linker explicitly in their target specifications.


@ -1 +1 @@
2aa4c46cfdd726e97360c2734835aa3515e8c858
6c2484dc3c532c052f159264e970278d8b77cdc9


@ -8,7 +8,6 @@ For more information on how various parts of the compiler work, see the [rustc g
There is also useful content in the following READMEs, which are gradually being moved over to the guide:
- https://github.com/rust-lang/rust/tree/master/src/librustc/ty/query
- https://github.com/rust-lang/rust/tree/master/src/librustc/dep_graph
- https://github.com/rust-lang/rust/blob/master/src/librustc/infer/region_constraints
- https://github.com/rust-lang/rust/tree/master/src/librustc/infer/higher_ranked
- https://github.com/rust-lang/rust/tree/master/src/librustc/infer/lexical_region_resolve


@ -7,8 +7,6 @@
#![deny(warnings)]
extern crate bootstrap;
use std::env;
use bootstrap::{Config, Build};


@ -17,8 +17,6 @@
#![deny(warnings)]
extern crate bootstrap;
use std::env;
use std::ffi::OsString;
use std::io;
@ -284,8 +282,8 @@ fn main() {
}
}
if env::var_os("RUSTC_PARALLEL_QUERIES").is_some() {
cmd.arg("--cfg").arg("parallel_queries");
if env::var_os("RUSTC_PARALLEL_COMPILER").is_some() {
cmd.arg("--cfg").arg("parallel_compiler");
}
if env::var_os("RUSTC_DENY_WARNINGS").is_some() && env::var_os("RUSTC_EXTERNAL_TOOL").is_none()


@ -4,8 +4,6 @@
#![deny(warnings)]
extern crate bootstrap;
use std::env;
use std::process::Command;
use std::path::PathBuf;
@ -16,6 +14,7 @@ fn main() {
let libdir = env::var_os("RUSTDOC_LIBDIR").expect("RUSTDOC_LIBDIR was not set");
let stage = env::var("RUSTC_STAGE").expect("RUSTC_STAGE was not set");
let sysroot = env::var_os("RUSTC_SYSROOT").expect("RUSTC_SYSROOT was not set");
let mut has_unstable = false;
use std::str::FromStr;
@ -54,9 +53,33 @@ fn main() {
// it up so we can make rustdoc print this into the docs
if let Some(version) = env::var_os("RUSTDOC_CRATE_VERSION") {
// This "unstable-options" can be removed when `--crate-version` is stabilized
cmd.arg("-Z")
.arg("unstable-options")
.arg("--crate-version").arg(version);
if !has_unstable {
cmd.arg("-Z")
.arg("unstable-options");
}
cmd.arg("--crate-version").arg(version);
has_unstable = true;
}
// Needed to be able to run all rustdoc tests.
if let Some(_) = env::var_os("RUSTDOC_GENERATE_REDIRECT_PAGES") {
// This "unstable-options" can be removed when `--generate-redirect-pages` is stabilized
if !has_unstable {
cmd.arg("-Z")
.arg("unstable-options");
}
cmd.arg("--generate-redirect-pages");
has_unstable = true;
}
// Needed to be able to run all rustdoc tests.
if let Some(ref x) = env::var_os("RUSTDOC_RESOURCE_SUFFIX") {
// This "unstable-options" can be removed when `--resource-suffix` is stabilized
if !has_unstable {
cmd.arg("-Z")
.arg("unstable-options");
}
cmd.arg("--resource-suffix").arg(x);
}
if verbose > 1 {


@ -1,5 +1,3 @@
extern crate cc;
use std::env;
use std::process::{self, Command};


@ -230,6 +230,9 @@ def default_build_triple():
err = "unknown OS type: {}".format(ostype)
sys.exit(err)
if cputype == 'powerpc' and ostype == 'unknown-freebsd':
cputype = subprocess.check_output(
['uname', '-p']).strip().decode(default_encoding)
cputype_mapper = {
'BePC': 'i686',
'aarch64': 'aarch64',
@ -698,21 +701,13 @@ class RustBuild(object):
filtered_submodules = []
submodules_names = []
for module in submodules:
if module.endswith("llvm"):
if self.get_toml('llvm-config'):
if module.endswith("llvm-project"):
if self.get_toml('llvm-config') and self.get_toml('lld') != 'true':
continue
if module.endswith("llvm-emscripten"):
backends = self.get_toml('codegen-backends')
if backends is None or not 'emscripten' in backends:
continue
if module.endswith("lld"):
config = self.get_toml('lld')
if config is None or config == 'false':
continue
if module.endswith("lldb") or module.endswith("clang"):
config = self.get_toml('lldb')
if config is None or config == 'false':
continue
check = self.check_submodule(module, slow_submodules)
filtered_submodules.append((module, check))
submodules_names.append(module)


@ -21,7 +21,7 @@ use crate::install;
use crate::native;
use crate::test;
use crate::tool;
use crate::util::{add_lib_path, exe, libdir};
use crate::util::{self, add_lib_path, exe, libdir};
use crate::{Build, DocTests, Mode, GitRepo};
pub use crate::Compiler;
@ -60,23 +60,23 @@ pub trait Step: 'static + Clone + Debug + PartialEq + Eq + Hash {
/// Run this rule for all hosts without cross compiling.
const ONLY_HOSTS: bool = false;
/// Primary function to execute this rule. Can call `builder.ensure(...)`
/// Primary function to execute this rule. Can call `builder.ensure()`
/// with other steps to run those.
fn run(self, builder: &Builder) -> Self::Output;
fn run(self, builder: &Builder<'_>) -> Self::Output;
/// When bootstrap is passed a set of paths, this controls whether this rule
/// will execute. However, it does not get called in a "default" context
/// when we are not passed any paths; in that case, make_run is called
/// when we are not passed any paths; in that case, `make_run` is called
/// directly.
fn should_run(run: ShouldRun) -> ShouldRun;
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_>;
/// Build up a "root" rule, either as a default rule or from a path passed
/// Builds up a "root" rule, either as a default rule or from a path passed
/// to us.
///
/// When path is `None`, we are executing in a context where no paths were
/// passed. When `./x.py build` is run, for example, this rule could get
/// called if it is in the correct list below with a path of `None`.
fn make_run(_run: RunConfig) {
fn make_run(_run: RunConfig<'_>) {
// It is reasonable to not have an implementation of make_run for rules
// that do not want to get called from the root context. This means that
// they are likely dependencies (e.g., sysroot creation) or similar, and
@ -95,8 +95,8 @@ pub struct RunConfig<'a> {
struct StepDescription {
default: bool,
only_hosts: bool,
should_run: fn(ShouldRun) -> ShouldRun,
make_run: fn(RunConfig),
should_run: fn(ShouldRun<'_>) -> ShouldRun<'_>,
make_run: fn(RunConfig<'_>),
name: &'static str,
}
@ -124,7 +124,7 @@ impl PathSet {
}
}
fn path(&self, builder: &Builder) -> PathBuf {
fn path(&self, builder: &Builder<'_>) -> PathBuf {
match self {
PathSet::Set(set) => set
.iter()
@ -147,7 +147,7 @@ impl StepDescription {
}
}
fn maybe_run(&self, builder: &Builder, pathset: &PathSet) {
fn maybe_run(&self, builder: &Builder<'_>, pathset: &PathSet) {
if builder.config.exclude.iter().any(|e| pathset.has(e)) {
eprintln!("Skipping {:?} because it is excluded", pathset);
return;
@ -183,7 +183,7 @@ impl StepDescription {
}
}
fn run(v: &[StepDescription], builder: &Builder, paths: &[PathBuf]) {
fn run(v: &[StepDescription], builder: &Builder<'_>, paths: &[PathBuf]) {
let should_runs = v
.iter()
.map(|desc| (desc.should_run)(ShouldRun::new(builder)))
@ -245,7 +245,7 @@ pub struct ShouldRun<'a> {
}
impl<'a> ShouldRun<'a> {
fn new(builder: &'a Builder) -> ShouldRun<'a> {
fn new(builder: &'a Builder<'_>) -> ShouldRun<'a> {
ShouldRun {
builder,
paths: BTreeSet::new(),
@ -326,7 +326,7 @@ pub enum Kind {
impl<'a> Builder<'a> {
fn get_step_descriptions(kind: Kind) -> Vec<StepDescription> {
macro_rules! describe {
($($rule:ty),+ $(,)*) => {{
($($rule:ty),+ $(,)?) => {{
vec![$(StepDescription::from::<$rule>()),+]
}};
}
@ -378,14 +378,11 @@ impl<'a> Builder<'a> {
test::Debuginfo,
test::UiFullDeps,
test::RunPassFullDeps,
test::RunFailFullDeps,
test::Rustdoc,
test::Pretty,
test::RunPassPretty,
test::RunFailPretty,
test::RunPassValgrindPretty,
test::RunPassFullDepsPretty,
test::RunFailFullDepsPretty,
test::Crate,
test::CrateLibrustc,
test::CrateRustdoc,
@ -403,11 +400,13 @@ impl<'a> Builder<'a> {
test::TheBook,
test::UnstableBook,
test::RustcBook,
test::EmbeddedBook,
test::Rustfmt,
test::Miri,
test::Clippy,
test::CompiletestTest,
test::RustdocJS,
test::RustdocJSStd,
test::RustdocJSNotStd,
test::RustdocTheme,
// Run bootstrap close to the end as it's unlikely to fail
test::Bootstrap,
@ -433,6 +432,7 @@ impl<'a> Builder<'a> {
doc::RustByExample,
doc::RustcBook,
doc::CargoBook,
doc::EmbeddedBook,
doc::EditionGuide,
),
Kind::Dist => describe!(
@ -512,7 +512,7 @@ impl<'a> Builder<'a> {
Some(help)
}
pub fn new(build: &Build) -> Builder {
pub fn new(build: &Build) -> Builder<'_> {
let (kind, paths) = match build.config.cmd {
Subcommand::Build { ref paths } => (Kind::Build, &paths[..]),
Subcommand::Check { ref paths } => (Kind::Check, &paths[..]),
@ -592,11 +592,11 @@ impl<'a> Builder<'a> {
impl Step for Libdir {
type Output = Interned<PathBuf>;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.never()
}
fn run(self, builder: &Builder) -> Interned<PathBuf> {
fn run(self, builder: &Builder<'_>) -> Interned<PathBuf> {
let compiler = self.compiler;
let config = &builder.build.config;
let lib = if compiler.stage >= 1 && config.libdir_relative().is_some() {
@ -649,7 +649,7 @@ impl<'a> Builder<'a> {
add_lib_path(vec![self.rustc_libdir(compiler)], cmd);
}
/// Get a path to the compiler specified.
/// Gets a path to the compiler specified.
pub fn rustc(&self, compiler: Compiler) -> PathBuf {
if compiler.is_snapshot(self) {
self.initial_rustc.clone()
@ -660,6 +660,15 @@ impl<'a> Builder<'a> {
}
}
/// Gets the paths to all of the compiler's codegen backends.
fn codegen_backends(&self, compiler: Compiler) -> impl Iterator<Item = PathBuf> {
fs::read_dir(self.sysroot_codegen_backends(compiler))
.into_iter()
.flatten()
.filter_map(Result::ok)
.map(|entry| entry.path())
}
pub fn rustdoc(&self, host: Interned<String>) -> PathBuf {
self.ensure(tool::Rustdoc { host })
}
@ -669,10 +678,9 @@ impl<'a> Builder<'a> {
let compiler = self.compiler(self.top_stage, host);
cmd.env("RUSTC_STAGE", compiler.stage.to_string())
.env("RUSTC_SYSROOT", self.sysroot(compiler))
.env(
"RUSTDOC_LIBDIR",
self.sysroot_libdir(compiler, self.config.build),
)
// Note that this is *not* the sysroot_libdir because rustdoc must be linked
// equivalently to rustc.
.env("RUSTDOC_LIBDIR", self.rustc_libdir(compiler))
.env("CFG_RELEASE_CHANNEL", &self.config.channel)
.env("RUSTDOC_REAL", self.rustdoc(host))
.env("RUSTDOC_CRATE_VERSION", self.rust_version())
@ -750,6 +758,9 @@ impl<'a> Builder<'a> {
match mode {
Mode::Std => {
self.clear_if_dirty(&my_out, &self.rustc(compiler));
for backend in self.codegen_backends(compiler) {
self.clear_if_dirty(&my_out, &backend);
}
},
Mode::Test => {
self.clear_if_dirty(&my_out, &libstd_stamp);
@ -782,6 +793,13 @@ impl<'a> Builder<'a> {
.env("CARGO_TARGET_DIR", out_dir)
.arg(cmd);
// See comment in librustc_llvm/build.rs for why this is necessary, largely llvm-config
// needs to not accidentally link to libLLVM in stage0/lib.
cargo.env("REAL_LIBRARY_PATH_VAR", &util::dylib_path_var());
if let Some(e) = env::var_os(util::dylib_path_var()) {
cargo.env("REAL_LIBRARY_PATH", e);
}
if cmd != "install" {
cargo.arg("--target")
.arg(target);
@ -856,7 +874,7 @@ impl<'a> Builder<'a> {
} else {
&maybe_sysroot
};
let libdir = sysroot.join(libdir(&compiler.host));
let libdir = self.rustc_libdir(compiler);
// Customize the compiler we're running. Specify the compiler to cargo
// as our shim and then pass it some various options used to configure
@ -898,7 +916,7 @@ impl<'a> Builder<'a> {
cargo.env("RUSTC_ERROR_FORMAT", error_format);
}
if cmd != "build" && cmd != "check" && cmd != "rustc" && want_rustdoc {
cargo.env("RUSTDOC_LIBDIR", self.sysroot_libdir(compiler, self.config.build));
cargo.env("RUSTDOC_LIBDIR", self.rustc_libdir(compiler));
}
if mode.is_tool() {
@ -982,6 +1000,9 @@ impl<'a> Builder<'a> {
if self.config.incremental {
cargo.env("CARGO_INCREMENTAL", "1");
} else {
// Don't rely on any default setting for incr. comp. in Cargo
cargo.env("CARGO_INCREMENTAL", "0");
}
if let Some(ref on_fail) = self.config.on_fail {
@ -998,8 +1019,7 @@ impl<'a> Builder<'a> {
cargo.env("RUSTC_VERBOSE", self.verbosity.to_string());
// in std, we want to avoid denying warnings for stage 0 as that makes cfg's painful.
if self.config.deny_warnings && !(mode == Mode::Std && stage == 0) {
if self.config.deny_warnings {
cargo.env("RUSTC_DENY_WARNINGS", "1");
}
@ -1032,29 +1052,24 @@ impl<'a> Builder<'a> {
}
};
let cc = ccacheify(&self.cc(target));
cargo.env(format!("CC_{}", target), &cc).env("CC", &cc);
cargo.env(format!("CC_{}", target), &cc);
let cflags = self.cflags(target, GitRepo::Rustc).join(" ");
cargo
.env(format!("CFLAGS_{}", target), cflags.clone())
.env("CFLAGS", cflags.clone());
.env(format!("CFLAGS_{}", target), cflags.clone());
if let Some(ar) = self.ar(target) {
let ranlib = format!("{} s", ar.display());
cargo
.env(format!("AR_{}", target), ar)
.env("AR", ar)
.env(format!("RANLIB_{}", target), ranlib.clone())
.env("RANLIB", ranlib);
.env(format!("RANLIB_{}", target), ranlib);
}
if let Ok(cxx) = self.cxx(target) {
let cxx = ccacheify(&cxx);
cargo
.env(format!("CXX_{}", target), &cxx)
.env("CXX", &cxx)
.env(format!("CXXFLAGS_{}", target), cflags.clone())
.env("CXXFLAGS", cflags);
.env(format!("CXXFLAGS_{}", target), cflags);
}
}


@ -68,20 +68,20 @@ unsafe impl<T> Send for Interned<T> {}
unsafe impl<T> Sync for Interned<T> {}
impl fmt::Display for Interned<String> {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
let s: &str = &*self;
f.write_str(s)
}
}
impl fmt::Debug for Interned<String> {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
let s: &str = &*self;
f.write_fmt(format_args!("{:?}", s))
}
}
impl fmt::Debug for Interned<PathBuf> {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
let s: &Path = &*self;
f.write_fmt(format_args!("{:?}", s))
}
@ -227,10 +227,10 @@ lazy_static! {
pub static ref INTERNER: Interner = Interner::default();
}
/// This is essentially a HashMap which allows storing any type in its input and
/// This is essentially a `HashMap` which allows storing any type in its input and
/// any type in its output. It is a write-once cache; values are never evicted,
/// which means that references to the value can safely be returned from the
/// get() method.
/// `get()` method.
#[derive(Debug)]
pub struct Cache(
RefCell<HashMap<
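As an aside, the write-once, type-erased behaviour described in the `Cache` doc comment above can be sketched with `TypeId` and `Any`; this is a simplified illustration and not bootstrap's actual `Cache` implementation:
```rust
use std::any::{Any, TypeId};
use std::cell::RefCell;
use std::collections::HashMap;

// A write-once cache: the first value stored for a given type wins and is
// never evicted or overwritten (hypothetical sketch, not bootstrap's Cache).
#[derive(Default)]
struct TypeCache {
    map: RefCell<HashMap<TypeId, Box<dyn Any>>>,
}

impl TypeCache {
    fn put<T: Any>(&self, value: T) {
        let boxed: Box<dyn Any> = Box::new(value);
        self.map.borrow_mut().entry(TypeId::of::<T>()).or_insert(boxed);
    }

    fn get<T: Any + Clone>(&self) -> Option<T> {
        self.map
            .borrow()
            .get(&TypeId::of::<T>())
            .and_then(|boxed| boxed.downcast_ref::<T>())
            .cloned()
    }
}

fn main() {
    let cache = TypeCache::default();
    cache.put(42u32);
    cache.put(7u32); // ignored: the cache is write-once
    assert_eq!(cache.get::<u32>(), Some(42));
    assert_eq!(cache.get::<String>(), None);
}
```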


@ -27,7 +27,6 @@ use std::path::{Path, PathBuf};
use std::process::Command;
use build_helper::output;
use cc;
use crate::{Build, GitRepo};
use crate::config::Target;
@ -157,7 +156,7 @@ fn set_compiler(cfg: &mut cc::Build,
None => return,
};
match output[i + 3..].chars().next().unwrap() {
'0' ... '6' => {}
'0' ..= '6' => {}
_ => return,
}
let alternative = format!("e{}", gnu_compiler);


@ -14,7 +14,7 @@ use crate::Build;
use crate::config::Config;
// The version number
pub const CFG_RELEASE_NUM: &str = "1.33.0";
pub const CFG_RELEASE_NUM: &str = "1.34.2";
pub struct GitInfo {
inner: Option<Info>,


@ -17,17 +17,17 @@ impl Step for Std {
type Output = ();
const DEFAULT: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.all_krates("std")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Std {
target: run.target,
});
}
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let target = self.target;
let compiler = builder.compiler(0, builder.config.build);
@ -56,22 +56,22 @@ impl Step for Rustc {
const ONLY_HOSTS: bool = true;
const DEFAULT: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.all_krates("rustc-main")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Rustc {
target: run.target,
});
}
/// Build the compiler.
/// Builds the compiler.
///
/// This will build the compiler for a particular stage of the build using
/// the `compiler` targeting the `target` architecture. The artifacts
/// created will also be linked into the sysroot directory.
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let compiler = builder.compiler(0, builder.config.build);
let target = self.target;
@ -103,11 +103,11 @@ impl Step for CodegenBackend {
const ONLY_HOSTS: bool = true;
const DEFAULT: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.all_krates("rustc_codegen_llvm")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
let backend = run.builder.config.rust_codegen_backends.get(0);
let backend = backend.cloned().unwrap_or_else(|| {
INTERNER.intern_str("llvm")
@ -118,7 +118,7 @@ impl Step for CodegenBackend {
});
}
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let compiler = builder.compiler(0, builder.config.build);
let target = self.target;
let backend = self.backend;
@ -148,17 +148,17 @@ impl Step for Test {
type Output = ();
const DEFAULT: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.all_krates("test")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Test {
target: run.target,
});
}
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let compiler = builder.compiler(0, builder.config.build);
let target = self.target;
@ -189,17 +189,17 @@ impl Step for Rustdoc {
const ONLY_HOSTS: bool = true;
const DEFAULT: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("src/tools/rustdoc")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Rustdoc {
target: run.target,
});
}
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let compiler = builder.compiler(0, builder.config.build);
let target = self.target;
@ -229,25 +229,37 @@ impl Step for Rustdoc {
/// Cargo's output path for the standard library in a given stage, compiled
/// by a particular compiler for the specified target.
pub fn libstd_stamp(builder: &Builder, compiler: Compiler, target: Interned<String>) -> PathBuf {
pub fn libstd_stamp(
builder: &Builder<'_>,
compiler: Compiler,
target: Interned<String>,
) -> PathBuf {
builder.cargo_out(compiler, Mode::Std, target).join(".libstd-check.stamp")
}
/// Cargo's output path for libtest in a given stage, compiled by a particular
/// compiler for the specified target.
pub fn libtest_stamp(builder: &Builder, compiler: Compiler, target: Interned<String>) -> PathBuf {
pub fn libtest_stamp(
builder: &Builder<'_>,
compiler: Compiler,
target: Interned<String>,
) -> PathBuf {
builder.cargo_out(compiler, Mode::Test, target).join(".libtest-check.stamp")
}
/// Cargo's output path for librustc in a given stage, compiled by a particular
/// compiler for the specified target.
pub fn librustc_stamp(builder: &Builder, compiler: Compiler, target: Interned<String>) -> PathBuf {
pub fn librustc_stamp(
builder: &Builder<'_>,
compiler: Compiler,
target: Interned<String>,
) -> PathBuf {
builder.cargo_out(compiler, Mode::Rustc, target).join(".librustc-check.stamp")
}
/// Cargo's output path for librustc_codegen_llvm in a given stage, compiled by a particular
/// compiler for the specified target and backend.
fn codegen_backend_stamp(builder: &Builder,
fn codegen_backend_stamp(builder: &Builder<'_>,
compiler: Compiler,
target: Interned<String>,
backend: Interned<String>) -> PathBuf {
@ -257,7 +269,11 @@ fn codegen_backend_stamp(builder: &Builder,
/// Cargo's output path for rustdoc in a given stage, compiled by a particular
/// compiler for the specified target.
pub fn rustdoc_stamp(builder: &Builder, compiler: Compiler, target: Interned<String>) -> PathBuf {
pub fn rustdoc_stamp(
builder: &Builder<'_>,
compiler: Compiler,
target: Interned<String>,
) -> PathBuf {
builder.cargo_out(compiler, Mode::ToolRustc, target)
.join(".rustdoc-check.stamp")
}


@ -3,7 +3,7 @@
//! Responsible for cleaning out a build directory of all old and stale
//! artifacts to prepare for a fresh build. Currently doesn't remove the
//! `build/cache` directory (download cache) or the `build/$target/llvm`
//! directory unless the --all flag is present.
//! directory unless the `--all` flag is present.
use std::fs;
use std::io::{self, ErrorKind};


@ -37,23 +37,23 @@ impl Step for Std {
type Output = ();
const DEFAULT: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.all_krates("std")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Std {
compiler: run.builder.compiler(run.builder.top_stage, run.host),
target: run.target,
});
}
/// Build the standard library.
/// Builds the standard library.
///
/// This will build the standard library for a particular stage of the build
/// using the `compiler` targeting the `target` architecture. The artifacts
/// created will also be linked into the sysroot directory.
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let target = self.target;
let compiler = self.compiler;
@ -111,7 +111,7 @@ impl Step for Std {
}
/// Copies third party objects needed by various targets.
fn copy_third_party_objects(builder: &Builder, compiler: &Compiler, target: Interned<String>) {
fn copy_third_party_objects(builder: &Builder<'_>, compiler: &Compiler, target: Interned<String>) {
let libdir = builder.sysroot_libdir(*compiler, target);
// Copies the crt(1,i,n).o startup objects
@ -145,7 +145,7 @@ fn copy_third_party_objects(builder: &Builder, compiler: &Compiler, target: Inte
/// Configure cargo to compile the standard library, adding appropriate env vars
/// and such.
pub fn std_cargo(builder: &Builder,
pub fn std_cargo(builder: &Builder<'_>,
compiler: &Compiler,
target: Interned<String>,
cargo: &mut Command) {
@ -201,7 +201,7 @@ struct StdLink {
impl Step for StdLink {
type Output = ();
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.never()
}
@ -213,7 +213,7 @@ impl Step for StdLink {
/// Note that this assumes that `compiler` has already generated the libstd
/// libraries for `target`, and this method will find them in the relevant
/// output directory.
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let compiler = self.compiler;
let target_compiler = self.target_compiler;
let target = self.target;
@ -237,7 +237,12 @@ impl Step for StdLink {
}
}
fn copy_apple_sanitizer_dylibs(builder: &Builder, native_dir: &Path, platform: &str, into: &Path) {
fn copy_apple_sanitizer_dylibs(
builder: &Builder<'_>,
native_dir: &Path,
platform: &str,
into: &Path,
) {
for &sanitizer in &["asan", "tsan"] {
let filename = format!("lib__rustc__clang_rt.{}_{}_dynamic.dylib", sanitizer, platform);
let mut src_path = native_dir.join(sanitizer);
@ -258,24 +263,24 @@ pub struct StartupObjects {
impl Step for StartupObjects {
type Output = ();
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("src/rtstartup")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(StartupObjects {
compiler: run.builder.compiler(run.builder.top_stage, run.host),
target: run.target,
});
}
/// Build and prepare startup objects like rsbegin.o and rsend.o
/// Builds and prepares startup objects like rsbegin.o and rsend.o
///
/// These are primarily used on Windows right now for linking executables/dlls.
/// They don't require any library support as they're just plain old object
/// files, so we just use the nightly snapshot compiler to always build them (as
/// no other compilers are guaranteed to be available).
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let for_compiler = self.compiler;
let target = self.target;
if !target.contains("pc-windows-gnu") {
@ -323,23 +328,23 @@ impl Step for Test {
type Output = ();
const DEFAULT: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.all_krates("test")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Test {
compiler: run.builder.compiler(run.builder.top_stage, run.host),
target: run.target,
});
}
/// Build libtest.
/// Builds libtest.
///
/// This will build libtest and supporting libraries for a particular stage of
/// the build using the `compiler` targeting the `target` architecture. The
/// artifacts created will also be linked into the sysroot directory.
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let target = self.target;
let compiler = self.compiler;
@ -390,7 +395,7 @@ impl Step for Test {
}
/// Same as `std_cargo`, but for libtest
pub fn test_cargo(builder: &Builder,
pub fn test_cargo(builder: &Builder<'_>,
_compiler: &Compiler,
_target: Interned<String>,
cargo: &mut Command) {
@ -411,12 +416,12 @@ pub struct TestLink {
impl Step for TestLink {
type Output = ();
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.never()
}
/// Same as `std_link`, only for libtest
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let compiler = self.compiler;
let target_compiler = self.target_compiler;
let target = self.target;
@ -444,23 +449,23 @@ impl Step for Rustc {
const ONLY_HOSTS: bool = true;
const DEFAULT: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.all_krates("rustc-main")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Rustc {
compiler: run.builder.compiler(run.builder.top_stage, run.host),
target: run.target,
});
}
/// Build the compiler.
/// Builds the compiler.
///
/// This will build the compiler for a particular stage of the build using
/// the `compiler` targeting the `target` architecture. The artifacts
/// created will also be linked into the sysroot directory.
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let compiler = self.compiler;
let target = self.target;
@ -516,14 +521,14 @@ impl Step for Rustc {
}
}
pub fn rustc_cargo(builder: &Builder, cargo: &mut Command) {
pub fn rustc_cargo(builder: &Builder<'_>, cargo: &mut Command) {
cargo.arg("--features").arg(builder.rustc_features())
.arg("--manifest-path")
.arg(builder.src.join("src/rustc/Cargo.toml"));
rustc_cargo_env(builder, cargo);
}
pub fn rustc_cargo_env(builder: &Builder, cargo: &mut Command) {
pub fn rustc_cargo_env(builder: &Builder<'_>, cargo: &mut Command) {
// Set some configuration variables picked up by build scripts and
// the compiler alike
cargo.env("CFG_RELEASE", builder.rust_release())
@ -554,8 +559,8 @@ pub fn rustc_cargo_env(builder: &Builder, cargo: &mut Command) {
if let Some(ref s) = builder.config.rustc_default_linker {
cargo.env("CFG_DEFAULT_LINKER", s);
}
if builder.config.rustc_parallel_queries {
cargo.env("RUSTC_PARALLEL_QUERIES", "1");
if builder.config.rustc_parallel {
cargo.env("RUSTC_PARALLEL_COMPILER", "1");
}
if builder.config.rust_verify_llvm_ir {
cargo.env("RUSTC_VERIFY_LLVM_IR", "1");
@ -572,12 +577,12 @@ struct RustcLink {
impl Step for RustcLink {
type Output = ();
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.never()
}
/// Same as `std_link`, only for librustc
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let compiler = self.compiler;
let target_compiler = self.target_compiler;
let target = self.target;
@ -605,11 +610,11 @@ impl Step for CodegenBackend {
const ONLY_HOSTS: bool = true;
const DEFAULT: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.all_krates("rustc_codegen_llvm")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
let backend = run.builder.config.rust_codegen_backends.get(0);
let backend = backend.cloned().unwrap_or_else(|| {
INTERNER.intern_str("llvm")
@ -621,7 +626,7 @@ impl Step for CodegenBackend {
});
}
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let compiler = self.compiler;
let target = self.target;
let backend = self.backend;
@ -684,7 +689,7 @@ impl Step for CodegenBackend {
}
}
pub fn build_codegen_backend(builder: &Builder,
pub fn build_codegen_backend(builder: &Builder<'_>,
cargo: &mut Command,
compiler: &Compiler,
target: Interned<String>,
@ -712,6 +717,7 @@ pub fn build_codegen_backend(builder: &Builder,
if builder.is_rust_llvm(target) && backend != "emscripten" {
cargo.env("LLVM_RUSTLLVM", "1");
}
cargo.env("LLVM_CONFIG", &llvm_config);
if backend != "emscripten" {
let target_config = builder.config.target_config.get(&target);
@ -752,7 +758,7 @@ pub fn build_codegen_backend(builder: &Builder,
/// This will take the codegen artifacts produced by `compiler` and link them
/// into an appropriate location for `target_compiler` to be a functional
/// compiler.
fn copy_codegen_backends_to_sysroot(builder: &Builder,
fn copy_codegen_backends_to_sysroot(builder: &Builder<'_>,
compiler: Compiler,
target_compiler: Compiler) {
let target = target_compiler.host;
@ -790,7 +796,7 @@ fn copy_codegen_backends_to_sysroot(builder: &Builder,
}
}
fn copy_lld_to_sysroot(builder: &Builder,
fn copy_lld_to_sysroot(builder: &Builder<'_>,
target_compiler: Compiler,
lld_install_root: &Path) {
let target = target_compiler.host;
@ -810,25 +816,37 @@ fn copy_lld_to_sysroot(builder: &Builder,
/// Cargo's output path for the standard library in a given stage, compiled
/// by a particular compiler for the specified target.
pub fn libstd_stamp(builder: &Builder, compiler: Compiler, target: Interned<String>) -> PathBuf {
pub fn libstd_stamp(
builder: &Builder<'_>,
compiler: Compiler,
target: Interned<String>,
) -> PathBuf {
builder.cargo_out(compiler, Mode::Std, target).join(".libstd.stamp")
}
/// Cargo's output path for libtest in a given stage, compiled by a particular
/// compiler for the specified target.
pub fn libtest_stamp(builder: &Builder, compiler: Compiler, target: Interned<String>) -> PathBuf {
pub fn libtest_stamp(
builder: &Builder<'_>,
compiler: Compiler,
target: Interned<String>,
) -> PathBuf {
builder.cargo_out(compiler, Mode::Test, target).join(".libtest.stamp")
}
/// Cargo's output path for librustc in a given stage, compiled by a particular
/// compiler for the specified target.
pub fn librustc_stamp(builder: &Builder, compiler: Compiler, target: Interned<String>) -> PathBuf {
pub fn librustc_stamp(
builder: &Builder<'_>,
compiler: Compiler,
target: Interned<String>,
) -> PathBuf {
builder.cargo_out(compiler, Mode::Rustc, target).join(".librustc.stamp")
}
/// Cargo's output path for librustc_codegen_llvm in a given stage, compiled by a particular
/// compiler for the specified target and backend.
fn codegen_backend_stamp(builder: &Builder,
fn codegen_backend_stamp(builder: &Builder<'_>,
compiler: Compiler,
target: Interned<String>,
backend: Interned<String>) -> PathBuf {
@ -836,10 +854,12 @@ fn codegen_backend_stamp(builder: &Builder,
.join(format!(".librustc_codegen_llvm-{}.stamp", backend))
}
pub fn compiler_file(builder: &Builder,
compiler: &Path,
target: Interned<String>,
file: &str) -> PathBuf {
pub fn compiler_file(
builder: &Builder<'_>,
compiler: &Path,
target: Interned<String>,
file: &str,
) -> PathBuf {
let mut cmd = Command::new(compiler);
cmd.args(builder.cflags(target, GitRepo::Rustc));
cmd.arg(format!("-print-file-name={}", file));
@ -855,7 +875,7 @@ pub struct Sysroot {
impl Step for Sysroot {
type Output = Interned<PathBuf>;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.never()
}
@ -865,7 +885,7 @@ impl Step for Sysroot {
/// That is, the sysroot for the stage0 compiler is not what the compiler
/// thinks it is by default, but it's the same as the default for stages
/// 1-3.
fn run(self, builder: &Builder) -> Interned<PathBuf> {
fn run(self, builder: &Builder<'_>) -> Interned<PathBuf> {
let compiler = self.compiler;
let sysroot = if compiler.stage == 0 {
builder.out.join(&compiler.host).join("stage0-sysroot")
@ -890,7 +910,7 @@ pub struct Assemble {
impl Step for Assemble {
type Output = Compiler;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.never()
}
@ -899,7 +919,7 @@ impl Step for Assemble {
/// This will assemble a compiler in `build/$host/stage$stage`. The compiler
/// must have been previously produced by the `stage - 1` builder.build
/// compiler.
fn run(self, builder: &Builder) -> Compiler {
fn run(self, builder: &Builder<'_>) -> Compiler {
let target_compiler = self.target_compiler;
if target_compiler.stage == 0 {
@ -995,14 +1015,14 @@ impl Step for Assemble {
///
/// For a particular stage this will link the file listed in `stamp` into the
/// `sysroot_dst` provided.
pub fn add_to_sysroot(builder: &Builder, sysroot_dst: &Path, stamp: &Path) {
pub fn add_to_sysroot(builder: &Builder<'_>, sysroot_dst: &Path, stamp: &Path) {
t!(fs::create_dir_all(&sysroot_dst));
for path in builder.read_stamp_file(stamp) {
builder.copy(&path, &sysroot_dst.join(path.file_name().unwrap()));
}
}
pub fn run_cargo(builder: &Builder,
pub fn run_cargo(builder: &Builder<'_>,
cargo: &mut Command,
stamp: &Path,
is_check: bool)
@ -1149,9 +1169,9 @@ pub fn run_cargo(builder: &Builder,
}
pub fn stream_cargo(
builder: &Builder,
builder: &Builder<'_>,
cargo: &mut Command,
cb: &mut dyn FnMut(CargoMessage),
cb: &mut dyn FnMut(CargoMessage<'_>),
) -> bool {
if builder.config.dry_run {
return true;
@ -1173,7 +1193,7 @@ pub fn stream_cargo(
let stdout = BufReader::new(child.stdout.take().unwrap());
for line in stdout.lines() {
let line = t!(line);
match serde_json::from_str::<CargoMessage>(&line) {
match serde_json::from_str::<CargoMessage<'_>>(&line) {
Ok(msg) => cb(msg),
// If this was informational, just print it out and continue
Err(_) => println!("{}", line)


@ -77,11 +77,15 @@ pub struct Config {
pub llvm_experimental_targets: String,
pub llvm_link_jobs: Option<u32>,
pub llvm_version_suffix: Option<String>,
pub llvm_use_linker: Option<String>,
pub lld_enabled: bool,
pub lldb_enabled: bool,
pub llvm_tools_enabled: bool,
pub llvm_cflags: Option<String>,
pub llvm_cxxflags: Option<String>,
pub llvm_ldflags: Option<String>,
pub llvm_use_libcxx: bool,
// rust codegen options
@ -94,7 +98,7 @@ pub struct Config {
pub rust_debuginfo_only_std: bool,
pub rust_debuginfo_tools: bool,
pub rust_rpath: bool,
pub rustc_parallel_queries: bool,
pub rustc_parallel: bool,
pub rustc_default_linker: Option<String>,
pub rust_optimize_tests: bool,
pub rust_debuginfo_tests: bool,
@ -254,7 +258,11 @@ struct Llvm {
link_shared: Option<bool>,
version_suffix: Option<String>,
clang_cl: Option<String>,
cflags: Option<String>,
cxxflags: Option<String>,
ldflags: Option<String>,
use_libcxx: Option<bool>,
use_linker: Option<String>,
}
#[derive(Deserialize, Default, Clone)]
@ -292,7 +300,7 @@ struct Rust {
debuginfo_lines: Option<bool>,
debuginfo_only_std: Option<bool>,
debuginfo_tools: Option<bool>,
experimental_parallel_queries: Option<bool>,
parallel_compiler: Option<bool>,
backtrace: Option<bool>,
default_linker: Option<String>,
channel: Option<String>,
@ -516,7 +524,12 @@ impl Config {
config.llvm_link_jobs = llvm.link_jobs;
config.llvm_version_suffix = llvm.version_suffix.clone();
config.llvm_clang_cl = llvm.clang_cl.clone();
config.llvm_cflags = llvm.cflags.clone();
config.llvm_cxxflags = llvm.cxxflags.clone();
config.llvm_ldflags = llvm.ldflags.clone();
set(&mut config.llvm_use_libcxx, llvm.use_libcxx);
config.llvm_use_linker = llvm.use_linker.clone();
}
if let Some(ref rust) = toml.rust {
@ -547,7 +560,7 @@ impl Config {
set(&mut config.lld_enabled, rust.lld);
set(&mut config.lldb_enabled, rust.lldb);
set(&mut config.llvm_tools_enabled, rust.llvm_tools);
config.rustc_parallel_queries = rust.experimental_parallel_queries.unwrap_or(false);
config.rustc_parallel = rust.parallel_compiler.unwrap_or(false);
config.rustc_default_linker = rust.default_linker.clone();
config.musl_root = rust.musl_root.clone().map(PathBuf::from);
config.save_toolstates = rust.save_toolstates.clone().map(PathBuf::from);
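The hunks above extend bootstrap's `Config` parsing with `llvm_cflags`, `llvm_cxxflags`, `llvm_ldflags`, `llvm_use_libcxx` and `llvm_use_linker`, and rename the `experimental_parallel_queries`/`rustc_parallel_queries` pair to `parallel_compiler`/`rustc_parallel`. As a rough, hedged illustration of the serde/TOML pattern those fields plug into — struct and key names below are simplified stand-ins, not the real bootstrap types, and the sketch assumes the `serde` (with derive) and `toml` crates:

```rust
// Illustrative only: a stripped-down version of the deserialization pattern
// the hunks above extend; the real structs live in bootstrap's config module.
use serde::Deserialize;

#[derive(Deserialize, Default)]
#[serde(default, rename_all = "kebab-case")]
struct Llvm {
    cflags: Option<String>,
    cxxflags: Option<String>,
    ldflags: Option<String>,
    use_linker: Option<String>,
}

#[derive(Deserialize, Default)]
#[serde(default, rename_all = "kebab-case")]
struct Rust {
    parallel_compiler: Option<bool>,
}

#[derive(Deserialize, Default)]
#[serde(default)]
struct Config {
    llvm: Llvm,
    rust: Rust,
}

fn main() {
    let cfg: Config = toml::from_str(
        r#"
        [llvm]
        ldflags = "-fuse-ld=lld"
        use-linker = "lld"

        [rust]
        parallel-compiler = true
        "#,
    )
    .unwrap();

    // Keys that are absent stay `None`, matching the `Option<_>` fields above.
    assert_eq!(cfg.llvm.use_linker.as_deref(), Some("lld"));
    assert_eq!(cfg.rust.parallel_compiler, Some(true));
    assert!(cfg.llvm.cflags.is_none());
    println!("parsed new [llvm]/[rust] keys");
}
```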


@ -35,7 +35,7 @@ o("debug", "rust.debug", "enables debugging environment; does not affect optimiz
o("docs", "build.docs", "build standard library documentation")
o("compiler-docs", "build.compiler-docs", "build compiler documentation")
o("optimize-tests", "rust.optimize-tests", "build tests with optimizations")
o("experimental-parallel-queries", "rust.experimental-parallel-queries", "build rustc with experimental parallelization")
o("parallel-compiler", "rust.parallel-compiler", "build a multi-threaded rustc")
o("test-miri", "rust.test-miri", "run miri's test suite")
o("debuginfo-tests", "rust.debuginfo-tests", "build tests with debugger metadata")
o("verbose-tests", "rust.verbose-tests", "enable verbose output when running tests")
@ -64,6 +64,10 @@ o("lldb", "rust.lldb", "build lldb")
o("missing-tools", "dist.missing-tools", "allow failures when building tools")
o("use-libcxx", "llvm.use_libcxx", "build LLVM with libc++")
o("cflags", "llvm.cflags", "build LLVM with these extra compiler flags")
o("cxxflags", "llvm.cxxflags", "build LLVM with these extra compiler flags")
o("ldflags", "llvm.ldflags", "build LLVM with these extra linker flags")
# Optimization and debugging options. These may be overridden by the release
# channel, etc.
o("optimize", "rust.optimize", "build optimized rust code")


@ -23,9 +23,9 @@ use crate::builder::{Builder, RunConfig, ShouldRun, Step};
use crate::compile;
use crate::tool::{self, Tool};
use crate::cache::{INTERNER, Interned};
use time;
use time::{self, Timespec};
pub fn pkgname(builder: &Builder, component: &str) -> String {
pub fn pkgname(builder: &Builder<'_>, component: &str) -> String {
if component == "cargo" {
format!("{}-{}", component, builder.cargo_package_vers())
} else if component == "rls" {
@ -46,15 +46,15 @@ pub fn pkgname(builder: &Builder, component: &str) -> String {
}
}
fn distdir(builder: &Builder) -> PathBuf {
fn distdir(builder: &Builder<'_>) -> PathBuf {
builder.out.join("dist")
}
pub fn tmpdir(builder: &Builder) -> PathBuf {
pub fn tmpdir(builder: &Builder<'_>) -> PathBuf {
builder.out.join("tmp/dist")
}
fn rust_installer(builder: &Builder) -> Command {
fn rust_installer(builder: &Builder<'_>) -> Command {
builder.tool_cmd(Tool::RustInstaller)
}
@ -76,11 +76,11 @@ impl Step for Docs {
type Output = PathBuf;
const DEFAULT: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("src/doc")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Docs {
stage: run.builder.top_stage,
host: run.target,
@ -88,7 +88,7 @@ impl Step for Docs {
}
/// Builds the `rust-docs` installer component.
fn run(self, builder: &Builder) -> PathBuf {
fn run(self, builder: &Builder<'_>) -> PathBuf {
let host = self.host;
let name = pkgname(builder, "rust-docs");
@ -138,11 +138,11 @@ impl Step for RustcDocs {
type Output = PathBuf;
const DEFAULT: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("src/librustc")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(RustcDocs {
stage: run.builder.top_stage,
host: run.target,
@ -150,7 +150,7 @@ impl Step for RustcDocs {
}
/// Builds the `rustc-docs` installer component.
fn run(self, builder: &Builder) -> PathBuf {
fn run(self, builder: &Builder<'_>) -> PathBuf {
let host = self.host;
let name = pkgname(builder, "rustc-docs");
@ -210,7 +210,7 @@ fn find_files(files: &[&str], path: &[PathBuf]) -> Vec<PathBuf> {
}
fn make_win_dist(
rust_root: &Path, plat_root: &Path, target_triple: Interned<String>, builder: &Builder
rust_root: &Path, plat_root: &Path, target_triple: Interned<String>, builder: &Builder<'_>
) {
// Ask gcc where it keeps its stuff
let mut cmd = Command::new(builder.cc(target_triple));
@ -334,19 +334,19 @@ impl Step for Mingw {
type Output = Option<PathBuf>;
const DEFAULT: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.never()
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Mingw { host: run.target });
}
/// Build the `rust-mingw` installer component.
/// Builds the `rust-mingw` installer component.
///
/// This contains all the bits and pieces to run the MinGW Windows targets
/// without any extra installed software (e.g., we bundle gcc, libraries, etc).
fn run(self, builder: &Builder) -> Option<PathBuf> {
fn run(self, builder: &Builder<'_>) -> Option<PathBuf> {
let host = self.host;
if !host.contains("pc-windows-gnu") {
@ -392,18 +392,18 @@ impl Step for Rustc {
const DEFAULT: bool = true;
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("src/librustc")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Rustc {
compiler: run.builder.compiler(run.builder.top_stage, run.target),
});
}
/// Creates the `rustc` installer component.
fn run(self, builder: &Builder) -> PathBuf {
fn run(self, builder: &Builder<'_>) -> PathBuf {
let compiler = self.compiler;
let host = self.compiler.host;
@ -470,7 +470,7 @@ impl Step for Rustc {
return distdir(builder).join(format!("{}-{}.tar.gz", name, host));
fn prepare_image(builder: &Builder, compiler: Compiler, image: &Path) {
fn prepare_image(builder: &Builder<'_>, compiler: Compiler, image: &Path) {
let host = compiler.host;
let src = builder.sysroot(compiler);
let libdir = libdir(&host);
@ -528,7 +528,19 @@ impl Step for Rustc {
t!(fs::create_dir_all(image.join("share/man/man1")));
let man_src = builder.src.join("src/doc/man");
let man_dst = image.join("share/man/man1");
let month_year = t!(time::strftime("%B %Y", &time::now()));
// Reproducible builds: If SOURCE_DATE_EPOCH is set, use that as the time.
let time = env::var("SOURCE_DATE_EPOCH")
.map(|timestamp| {
let epoch = timestamp.parse().map_err(|err| {
format!("could not parse SOURCE_DATE_EPOCH: {}", err)
}).unwrap();
time::at(Timespec::new(epoch, 0))
})
.unwrap_or_else(|_| time::now());
let month_year = t!(time::strftime("%B %Y", &time));
// don't use our `bootstrap::util::{copy, cp_r}`, because those try
// to hardlink, and we don't want to edit the source templates
for file_entry in builder.read_dir(&man_src) {
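The `SOURCE_DATE_EPOCH` handling above follows the reproducible-builds convention: when set, the variable carries a Unix timestamp (seconds since the epoch) that replaces the current time used to stamp the generated man pages. A minimal std-only sketch of the same fallback — the diff itself goes through the `time` 0.1 crate's `Timespec`/`time::at`, and the names below are illustrative:

```rust
// Minimal sketch of the SOURCE_DATE_EPOCH fallback above, using only std.
use std::env;
use std::time::{Duration, SystemTime, UNIX_EPOCH};

fn build_timestamp() -> SystemTime {
    match env::var("SOURCE_DATE_EPOCH") {
        // The variable holds seconds since the Unix epoch, as a decimal string.
        Ok(raw) => {
            let secs: u64 = raw.parse().expect("could not parse SOURCE_DATE_EPOCH");
            UNIX_EPOCH + Duration::from_secs(secs)
        }
        // Unset: fall back to the current time, as the diff does with `time::now()`.
        Err(_) => SystemTime::now(),
    }
}

fn main() {
    println!("{:?}", build_timestamp());
}
```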
@ -568,11 +580,11 @@ pub struct DebuggerScripts {
impl Step for DebuggerScripts {
type Output = ();
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("src/lldb_batchmode.py")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(DebuggerScripts {
sysroot: run.builder.sysroot(run.builder.compiler(run.builder.top_stage, run.host)),
host: run.target,
@ -580,7 +592,7 @@ impl Step for DebuggerScripts {
}
/// Copies debugger scripts for `target` into the `sysroot` specified.
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let host = self.host;
let sysroot = self.sysroot;
let dst = sysroot.join("lib/rustlib/etc");
@ -602,6 +614,8 @@ impl Step for DebuggerScripts {
// gdb debugger scripts
builder.install(&builder.src.join("src/etc/rust-gdb"), &sysroot.join("bin"),
0o755);
builder.install(&builder.src.join("src/etc/rust-gdbgui"), &sysroot.join("bin"),
0o755);
cp_debugger_script("gdb_load_rust_pretty_printers.py");
cp_debugger_script("gdb_rust_pretty_printing.py");
@ -625,18 +639,18 @@ impl Step for Std {
type Output = PathBuf;
const DEFAULT: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("src/libstd")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Std {
compiler: run.builder.compiler(run.builder.top_stage, run.builder.config.build),
target: run.target,
});
}
fn run(self, builder: &Builder) -> PathBuf {
fn run(self, builder: &Builder<'_>) -> PathBuf {
let compiler = self.compiler;
let target = self.target;
@ -714,12 +728,12 @@ impl Step for Analysis {
type Output = PathBuf;
const DEFAULT: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
let builder = run.builder;
run.path("analysis").default_condition(builder.config.extended)
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Analysis {
compiler: run.builder.compiler(run.builder.top_stage, run.builder.config.build),
target: run.target,
@ -727,7 +741,7 @@ impl Step for Analysis {
}
/// Creates a tarball of save-analysis metadata, if available.
fn run(self, builder: &Builder) -> PathBuf {
fn run(self, builder: &Builder<'_>) -> PathBuf {
let compiler = self.compiler;
let target = self.target;
assert!(builder.config.extended);
@ -777,7 +791,7 @@ impl Step for Analysis {
}
}
fn copy_src_dirs(builder: &Builder, src_dirs: &[&str], exclude_dirs: &[&str], dst_dir: &Path) {
fn copy_src_dirs(builder: &Builder<'_>, src_dirs: &[&str], exclude_dirs: &[&str], dst_dir: &Path) {
fn filter_fn(exclude_dirs: &[&str], dir: &str, path: &Path) -> bool {
let spath = match path.to_str() {
Some(path) => path,
@ -786,7 +800,24 @@ fn copy_src_dirs(builder: &Builder, src_dirs: &[&str], exclude_dirs: &[&str], ds
if spath.ends_with("~") || spath.ends_with(".pyc") {
return false
}
if (spath.contains("llvm/test") || spath.contains("llvm\\test")) &&
const LLVM_PROJECTS: &[&str] = &[
"llvm-project/clang", "llvm-project\\clang",
"llvm-project/lld", "llvm-project\\lld",
"llvm-project/lldb", "llvm-project\\lldb",
"llvm-project/llvm", "llvm-project\\llvm",
];
if spath.contains("llvm-project") && !spath.ends_with("llvm-project")
&& !LLVM_PROJECTS.iter().any(|path| spath.contains(path))
{
return false;
}
const LLVM_TEST: &[&str] = &[
"llvm-project/llvm/test", "llvm-project\\llvm\\test",
"llvm-emscripten/test", "llvm-emscripten\\test",
];
if LLVM_TEST.iter().any(|path| spath.contains(path)) &&
(spath.ends_with(".ll") ||
spath.ends_with(".td") ||
spath.ends_with(".s")) {
@ -830,16 +861,16 @@ impl Step for Src {
const DEFAULT: bool = true;
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("src")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Src);
}
/// Creates the `rust-src` installer component
fn run(self, builder: &Builder) -> PathBuf {
fn run(self, builder: &Builder<'_>) -> PathBuf {
builder.info("Dist src");
let name = pkgname(builder, "rust-src");
@ -910,17 +941,17 @@ impl Step for PlainSourceTarball {
const DEFAULT: bool = true;
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
let builder = run.builder;
run.path("src").default_condition(builder.config.rust_dist_src)
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(PlainSourceTarball);
}
/// Creates the plain source tarball
fn run(self, builder: &Builder) -> PathBuf {
fn run(self, builder: &Builder<'_>) -> PathBuf {
builder.info("Create plain source tarball");
// Make sure that the root folder of tarball has the correct name
@ -1038,18 +1069,18 @@ impl Step for Cargo {
type Output = PathBuf;
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("cargo")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Cargo {
stage: run.builder.top_stage,
target: run.target,
});
}
fn run(self, builder: &Builder) -> PathBuf {
fn run(self, builder: &Builder<'_>) -> PathBuf {
let stage = self.stage;
let target = self.target;
@ -1124,18 +1155,18 @@ impl Step for Rls {
type Output = Option<PathBuf>;
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("rls")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Rls {
stage: run.builder.top_stage,
target: run.target,
});
}
fn run(self, builder: &Builder) -> Option<PathBuf> {
fn run(self, builder: &Builder<'_>) -> Option<PathBuf> {
let stage = self.stage;
let target = self.target;
assert!(builder.config.extended);
@ -1203,18 +1234,18 @@ impl Step for Clippy {
type Output = Option<PathBuf>;
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("clippy")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Clippy {
stage: run.builder.top_stage,
target: run.target,
});
}
fn run(self, builder: &Builder) -> Option<PathBuf> {
fn run(self, builder: &Builder<'_>) -> Option<PathBuf> {
let stage = self.stage;
let target = self.target;
assert!(builder.config.extended);
@ -1287,18 +1318,18 @@ impl Step for Miri {
type Output = Option<PathBuf>;
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("miri")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Miri {
stage: run.builder.top_stage,
target: run.target,
});
}
fn run(self, builder: &Builder) -> Option<PathBuf> {
fn run(self, builder: &Builder<'_>) -> Option<PathBuf> {
let stage = self.stage;
let target = self.target;
assert!(builder.config.extended);
@ -1371,18 +1402,18 @@ impl Step for Rustfmt {
type Output = Option<PathBuf>;
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("rustfmt")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Rustfmt {
stage: run.builder.top_stage,
target: run.target,
});
}
fn run(self, builder: &Builder) -> Option<PathBuf> {
fn run(self, builder: &Builder<'_>) -> Option<PathBuf> {
let stage = self.stage;
let target = self.target;
@ -1454,12 +1485,12 @@ impl Step for Extended {
const DEFAULT: bool = true;
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
let builder = run.builder;
run.path("extended").default_condition(builder.config.extended)
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Extended {
stage: run.builder.top_stage,
host: run.builder.config.build,
@ -1468,7 +1499,7 @@ impl Step for Extended {
}
/// Creates a combined installer for the specified target in the provided stage.
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let stage = self.stage;
let target = self.target;
@ -1918,7 +1949,7 @@ impl Step for Extended {
}
}
fn add_env(builder: &Builder, cmd: &mut Command, target: Interned<String>) {
fn add_env(builder: &Builder<'_>, cmd: &mut Command, target: Interned<String>) {
let mut parts = channel::CFG_RELEASE_NUM.split('.');
cmd.env("CFG_RELEASE_INFO", builder.rust_version())
.env("CFG_RELEASE_NUM", channel::CFG_RELEASE_NUM)
@ -1954,15 +1985,15 @@ impl Step for HashSign {
type Output = ();
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("hash-and-sign")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(HashSign);
}
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let mut cmd = builder.tool_cmd(Tool::BuildManifest);
if builder.config.dry_run {
return;
@ -2006,7 +2037,7 @@ impl Step for HashSign {
// LLVM tools are linked dynamically.
// Note: This function does not yet support Windows, but we also don't support
// linking LLVM tools dynamically on Windows yet.
pub fn maybe_install_llvm_dylib(builder: &Builder,
pub fn maybe_install_llvm_dylib(builder: &Builder<'_>,
target: Interned<String>,
sysroot: &Path) {
let src_libdir = builder
@ -2048,18 +2079,18 @@ impl Step for LlvmTools {
type Output = Option<PathBuf>;
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("llvm-tools")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(LlvmTools {
stage: run.builder.top_stage,
target: run.target,
});
}
fn run(self, builder: &Builder) -> Option<PathBuf> {
fn run(self, builder: &Builder<'_>) -> Option<PathBuf> {
let stage = self.stage;
let target = self.target;
assert!(builder.config.extended);
@ -2074,7 +2105,7 @@ impl Step for LlvmTools {
}
builder.info(&format!("Dist LlvmTools stage{} ({})", stage, target));
let src = builder.src.join("src/llvm");
let src = builder.src.join("src/llvm-project/llvm");
let name = pkgname(builder, "llvm-tools");
let tmp = tmpdir(builder);
@ -2132,17 +2163,17 @@ impl Step for Lldb {
const ONLY_HOSTS: bool = true;
const DEFAULT: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
run.path("src/tools/lldb")
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("src/llvm-project/lldb").path("src/tools/lldb")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Lldb {
target: run.target,
});
}
fn run(self, builder: &Builder) -> Option<PathBuf> {
fn run(self, builder: &Builder<'_>) -> Option<PathBuf> {
let target = self.target;
if builder.config.dry_run {
@ -2158,7 +2189,7 @@ impl Step for Lldb {
}
builder.info(&format!("Dist Lldb ({})", target));
let src = builder.src.join("src/tools/lldb");
let src = builder.src.join("src/llvm-project/lldb");
let name = pkgname(builder, "lldb");
let tmp = tmpdir(builder);


@ -23,7 +23,7 @@ use crate::cache::{INTERNER, Interned};
use crate::config::Config;
macro_rules! book {
($($name:ident, $path:expr, $book_name:expr;)+) => {
($($name:ident, $path:expr, $book_name:expr, $book_ver:expr;)+) => {
$(
#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
pub struct $name {
@ -34,21 +34,22 @@ macro_rules! book {
type Output = ();
const DEFAULT: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
let builder = run.builder;
run.path($path).default_condition(builder.config.docs)
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure($name {
target: run.target,
});
}
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
builder.ensure(Rustbook {
target: self.target,
name: INTERNER.intern_str($book_name),
version: $book_ver,
})
}
}
@ -56,19 +57,29 @@ macro_rules! book {
}
}
// NOTE: When adding a book here, make sure to ALSO build the book by
// adding a build step in `src/bootstrap/builder.rs`!
book!(
Nomicon, "src/doc/nomicon", "nomicon";
Reference, "src/doc/reference", "reference";
EditionGuide, "src/doc/edition-guide", "edition-guide";
RustdocBook, "src/doc/rustdoc", "rustdoc";
RustcBook, "src/doc/rustc", "rustc";
RustByExample, "src/doc/rust-by-example", "rust-by-example";
EditionGuide, "src/doc/edition-guide", "edition-guide", RustbookVersion::MdBook1;
EmbeddedBook, "src/doc/embedded-book", "embedded-book", RustbookVersion::MdBook2;
Nomicon, "src/doc/nomicon", "nomicon", RustbookVersion::MdBook1;
Reference, "src/doc/reference", "reference", RustbookVersion::MdBook1;
RustByExample, "src/doc/rust-by-example", "rust-by-example", RustbookVersion::MdBook1;
RustcBook, "src/doc/rustc", "rustc", RustbookVersion::MdBook1;
RustdocBook, "src/doc/rustdoc", "rustdoc", RustbookVersion::MdBook1;
);
#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
enum RustbookVersion {
MdBook1,
MdBook2,
}
#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
struct Rustbook {
target: Interned<String>,
name: Interned<String>,
version: RustbookVersion,
}
impl Step for Rustbook {
@ -76,7 +87,7 @@ impl Step for Rustbook {
// rustbook is never directly called, and only serves as a shim for the nomicon and the
// reference.
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.never()
}
@ -84,12 +95,13 @@ impl Step for Rustbook {
///
/// This will not actually generate any documentation if the documentation has
/// already been generated.
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let src = builder.src.join("src/doc");
builder.ensure(RustbookSrc {
target: self.target,
name: self.name,
src: INTERNER.intern_path(src),
version: self.version,
});
}
}
@ -103,18 +115,18 @@ impl Step for UnstableBook {
type Output = ();
const DEFAULT: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
let builder = run.builder;
run.path("src/doc/unstable-book").default_condition(builder.config.docs)
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(UnstableBook {
target: run.target,
});
}
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
builder.ensure(UnstableBookGen {
target: self.target,
});
@ -122,6 +134,7 @@ impl Step for UnstableBook {
target: self.target,
name: INTERNER.intern_str("unstable-book"),
src: builder.md_doc_out(self.target),
version: RustbookVersion::MdBook1,
})
}
}
@ -136,19 +149,19 @@ impl Step for CargoBook {
type Output = ();
const DEFAULT: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
let builder = run.builder;
run.path("src/tools/cargo/src/doc/book").default_condition(builder.config.docs)
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(CargoBook {
target: run.target,
name: INTERNER.intern_str("cargo"),
});
}
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let target = self.target;
let name = self.name;
let src = builder.src.join("src/tools/cargo/src/doc");
@ -175,12 +188,13 @@ struct RustbookSrc {
target: Interned<String>,
name: Interned<String>,
src: Interned<PathBuf>,
version: RustbookVersion,
}
impl Step for RustbookSrc {
type Output = ();
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.never()
}
@ -188,7 +202,7 @@ impl Step for RustbookSrc {
///
/// This will not actually generate any documentation if the documentation has
/// already been generated.
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let target = self.target;
let name = self.name;
let src = self.src;
@ -205,11 +219,19 @@ impl Step for RustbookSrc {
}
builder.info(&format!("Rustbook ({}) - {}", target, name));
let _ = fs::remove_dir_all(&out);
let vers = match self.version {
RustbookVersion::MdBook1 => "1",
RustbookVersion::MdBook2 => "2",
};
builder.run(rustbook_cmd
.arg("build")
.arg(&src)
.arg("-d")
.arg(out));
.arg(out)
.arg("-m")
.arg(vers));
}
}
@ -224,12 +246,12 @@ impl Step for TheBook {
type Output = ();
const DEFAULT: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
let builder = run.builder;
run.path("src/doc/book").default_condition(builder.config.docs)
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(TheBook {
compiler: run.builder.compiler(run.builder.top_stage, run.builder.config.build),
target: run.target,
@ -237,7 +259,7 @@ impl Step for TheBook {
});
}
/// Build the book and associated stuff.
/// Builds the book and associated stuff.
///
/// We need to build:
///
@ -246,7 +268,7 @@ impl Step for TheBook {
/// * Version info and CSS
/// * Index page
/// * Redirect pages
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let compiler = self.compiler;
let target = self.target;
let name = self.name;
@ -255,6 +277,7 @@ impl Step for TheBook {
builder.ensure(Rustbook {
target,
name: INTERNER.intern_string(name.to_string()),
version: RustbookVersion::MdBook1,
});
// building older edition redirects
@ -263,18 +286,21 @@ impl Step for TheBook {
builder.ensure(Rustbook {
target,
name: INTERNER.intern_string(source_name),
version: RustbookVersion::MdBook1,
});
let source_name = format!("{}/second-edition", name);
builder.ensure(Rustbook {
target,
name: INTERNER.intern_string(source_name),
version: RustbookVersion::MdBook1,
});
let source_name = format!("{}/2018-edition", name);
builder.ensure(Rustbook {
target,
name: INTERNER.intern_string(source_name),
version: RustbookVersion::MdBook1,
});
// build the version info page and CSS
@ -295,7 +321,12 @@ impl Step for TheBook {
}
}
fn invoke_rustdoc(builder: &Builder, compiler: Compiler, target: Interned<String>, markdown: &str) {
fn invoke_rustdoc(
builder: &Builder<'_>,
compiler: Compiler,
target: Interned<String>,
markdown: &str,
) {
let out = builder.doc_out(target);
let path = builder.src.join("src/doc").join(markdown);
@ -312,12 +343,9 @@ fn invoke_rustdoc(builder: &Builder, compiler: Compiler, target: Interned<String
.arg("--html-before-content").arg(&version_info)
.arg("--html-in-header").arg(&favicon)
.arg("--markdown-no-toc")
.arg("--markdown-playground-url")
.arg("https://play.rust-lang.org/")
.arg("-o").arg(&out)
.arg(&path)
.arg("--markdown-css")
.arg("../rust.css");
.arg("--markdown-playground-url").arg("https://play.rust-lang.org/")
.arg("-o").arg(&out).arg(&path)
.arg("--markdown-css").arg("../rust.css");
builder.run(&mut cmd);
}
@ -332,12 +360,12 @@ impl Step for Standalone {
type Output = ();
const DEFAULT: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
let builder = run.builder;
run.path("src/doc").default_condition(builder.config.docs)
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Standalone {
compiler: run.builder.compiler(run.builder.top_stage, run.builder.config.build),
target: run.target,
@ -352,7 +380,7 @@ impl Step for Standalone {
/// `STAMP` along with providing the various header/footer HTML we've customized.
///
/// In the end, this is just a glorified wrapper around rustdoc!
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let target = self.target;
let compiler = self.compiler;
builder.info(&format!("Documenting standalone ({})", target));
@ -400,8 +428,7 @@ impl Step for Standalone {
.arg("--html-in-header").arg(&favicon)
.arg("--markdown-no-toc")
.arg("--index-page").arg(&builder.src.join("src/doc/index.md"))
.arg("--markdown-playground-url")
.arg("https://play.rust-lang.org/")
.arg("--markdown-playground-url").arg("https://play.rust-lang.org/")
.arg("-o").arg(&out)
.arg(&path);
@ -426,12 +453,12 @@ impl Step for Std {
type Output = ();
const DEFAULT: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
let builder = run.builder;
run.all_krates("std").default_condition(builder.config.docs)
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Std {
stage: run.builder.top_stage,
target: run.target
@ -442,7 +469,7 @@ impl Step for Std {
///
/// This will generate all documentation for the standard library and its
/// dependencies. This is largely just a wrapper around `cargo doc`.
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let stage = self.stage;
let target = self.target;
builder.info(&format!("Documenting stage{} std ({})", stage, target));
@ -491,6 +518,8 @@ impl Step for Std {
cargo.arg("--")
.arg("--markdown-css").arg("rust.css")
.arg("--markdown-no-toc")
.arg("--generate-redirect-pages")
.arg("--resource-suffix").arg(crate::channel::CFG_RELEASE_NUM)
.arg("--index-page").arg(&builder.src.join("src/doc/index.md"));
builder.run(&mut cargo);
@ -512,12 +541,12 @@ impl Step for Test {
type Output = ();
const DEFAULT: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
let builder = run.builder;
run.krate("test").default_condition(builder.config.docs)
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Test {
stage: run.builder.top_stage,
target: run.target,
@ -528,7 +557,7 @@ impl Step for Test {
///
/// This will generate all documentation for libtest and its dependencies. This
/// is largely just a wrapper around `cargo doc`.
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let stage = self.stage;
let target = self.target;
builder.info(&format!("Documenting stage{} test ({})", stage, target));
@ -555,7 +584,10 @@ impl Step for Test {
let mut cargo = builder.cargo(compiler, Mode::Test, target, "doc");
compile::test_cargo(builder, &compiler, target, &mut cargo);
cargo.arg("--no-deps").arg("-p").arg("test");
cargo.arg("--no-deps")
.arg("-p").arg("test")
.env("RUSTDOC_RESOURCE_SUFFIX", crate::channel::CFG_RELEASE_NUM)
.env("RUSTDOC_GENERATE_REDIRECT_PAGES", "1");
builder.run(&mut cargo);
builder.cp_r(&my_out, &out);
@ -573,19 +605,19 @@ impl Step for WhitelistedRustc {
const DEFAULT: bool = true;
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
let builder = run.builder;
run.krate("rustc-main").default_condition(builder.config.docs)
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(WhitelistedRustc {
stage: run.builder.top_stage,
target: run.target,
});
}
/// Generate whitelisted compiler crate documentation.
/// Generates whitelisted compiler crate documentation.
///
/// This will generate all documentation for crates that are whitelisted
/// to be included in the standard documentation. This documentation is
@ -594,7 +626,7 @@ impl Step for WhitelistedRustc {
/// documentation. We don't build other compiler documentation
/// here as we want to be able to keep it separate from the standard
/// documentation. This is largely just a wrapper around `cargo doc`.
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let stage = self.stage;
let target = self.target;
builder.info(&format!("Documenting stage{} whitelisted compiler ({})", stage, target));
@ -624,9 +656,10 @@ impl Step for WhitelistedRustc {
// We don't want to build docs for internal compiler dependencies in this
// step (there is another step for that). Therefore, we whitelist the crates
// for which docs must be built.
cargo.arg("--no-deps");
for krate in &["proc_macro"] {
cargo.arg("-p").arg(krate);
cargo.arg("-p").arg(krate)
.env("RUSTDOC_RESOURCE_SUFFIX", crate::channel::CFG_RELEASE_NUM)
.env("RUSTDOC_GENERATE_REDIRECT_PAGES", "1");
}
builder.run(&mut cargo);
@ -645,25 +678,25 @@ impl Step for Rustc {
const DEFAULT: bool = true;
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
let builder = run.builder;
run.krate("rustc-main").default_condition(builder.config.docs)
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Rustc {
stage: run.builder.top_stage,
target: run.target,
});
}
/// Generate compiler documentation.
/// Generates compiler documentation.
///
/// This will generate all documentation for compiler and dependencies.
/// Compiler documentation is distributed separately, so we make sure
/// we do not merge it with the other documentation from std, test and
/// proc_macros. This is largely just a wrapper around `cargo doc`.
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let stage = self.stage;
let target = self.target;
builder.info(&format!("Documenting stage{} compiler ({})", stage, target));
@ -721,7 +754,7 @@ impl Step for Rustc {
}
fn find_compiler_crates(
builder: &Builder,
builder: &Builder<'_>,
name: &Interned<String>,
crates: &mut HashSet<Interned<String>>
) {
@ -747,24 +780,24 @@ impl Step for Rustdoc {
const DEFAULT: bool = true;
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.krate("rustdoc-tool")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Rustdoc {
stage: run.builder.top_stage,
target: run.target,
});
}
/// Generate compiler documentation.
/// Generates compiler documentation.
///
/// This will generate all documentation for compiler and dependencies.
/// Compiler documentation is distributed separately, so we make sure
/// we do not merge it with the other documentation from std, test and
/// proc_macros. This is largely just a wrapper around `cargo doc`.
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let stage = self.stage;
let target = self.target;
builder.info(&format!("Documenting stage{} rustdoc ({})", stage, target));
@ -830,12 +863,12 @@ impl Step for ErrorIndex {
const DEFAULT: bool = true;
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
let builder = run.builder;
run.path("src/tools/error_index_generator").default_condition(builder.config.docs)
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(ErrorIndex {
target: run.target,
});
@ -843,7 +876,7 @@ impl Step for ErrorIndex {
/// Generates the HTML rendered error-index by running the
/// `error_index_generator` tool.
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let target = self.target;
builder.info(&format!("Documenting error index ({})", target));
@ -852,6 +885,7 @@ impl Step for ErrorIndex {
let mut index = builder.tool_cmd(Tool::ErrorIndex);
index.arg("html");
index.arg(out.join("error-index.html"));
index.arg(crate::channel::CFG_RELEASE_NUM);
// FIXME: shouldn't have to pass this env var
index.env("CFG_BUILD", &builder.config.build)
@ -871,18 +905,18 @@ impl Step for UnstableBookGen {
const DEFAULT: bool = true;
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
let builder = run.builder;
run.path("src/tools/unstable-book-gen").default_condition(builder.config.docs)
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(UnstableBookGen {
target: run.target,
});
}
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let target = self.target;
builder.ensure(compile::Std {


@ -14,45 +14,45 @@ use crate::builder::{Builder, RunConfig, ShouldRun, Step};
use crate::cache::Interned;
use crate::config::Config;
pub fn install_docs(builder: &Builder, stage: u32, host: Interned<String>) {
pub fn install_docs(builder: &Builder<'_>, stage: u32, host: Interned<String>) {
install_sh(builder, "docs", "rust-docs", stage, Some(host));
}
pub fn install_std(builder: &Builder, stage: u32, target: Interned<String>) {
pub fn install_std(builder: &Builder<'_>, stage: u32, target: Interned<String>) {
install_sh(builder, "std", "rust-std", stage, Some(target));
}
pub fn install_cargo(builder: &Builder, stage: u32, host: Interned<String>) {
pub fn install_cargo(builder: &Builder<'_>, stage: u32, host: Interned<String>) {
install_sh(builder, "cargo", "cargo", stage, Some(host));
}
pub fn install_rls(builder: &Builder, stage: u32, host: Interned<String>) {
pub fn install_rls(builder: &Builder<'_>, stage: u32, host: Interned<String>) {
install_sh(builder, "rls", "rls", stage, Some(host));
}
pub fn install_clippy(builder: &Builder, stage: u32, host: Interned<String>) {
pub fn install_clippy(builder: &Builder<'_>, stage: u32, host: Interned<String>) {
install_sh(builder, "clippy", "clippy", stage, Some(host));
}
pub fn install_miri(builder: &Builder, stage: u32, host: Interned<String>) {
pub fn install_miri(builder: &Builder<'_>, stage: u32, host: Interned<String>) {
install_sh(builder, "miri", "miri", stage, Some(host));
}
pub fn install_rustfmt(builder: &Builder, stage: u32, host: Interned<String>) {
pub fn install_rustfmt(builder: &Builder<'_>, stage: u32, host: Interned<String>) {
install_sh(builder, "rustfmt", "rustfmt", stage, Some(host));
}
pub fn install_analysis(builder: &Builder, stage: u32, host: Interned<String>) {
pub fn install_analysis(builder: &Builder<'_>, stage: u32, host: Interned<String>) {
install_sh(builder, "analysis", "rust-analysis", stage, Some(host));
}
pub fn install_src(builder: &Builder, stage: u32) {
pub fn install_src(builder: &Builder<'_>, stage: u32) {
install_sh(builder, "src", "rust-src", stage, None);
}
pub fn install_rustc(builder: &Builder, stage: u32, host: Interned<String>) {
pub fn install_rustc(builder: &Builder<'_>, stage: u32, host: Interned<String>) {
install_sh(builder, "rustc", "rustc", stage, Some(host));
}
fn install_sh(
builder: &Builder,
builder: &Builder<'_>,
package: &str,
name: &str,
stage: u32,
@ -155,7 +155,7 @@ macro_rules! install {
}
#[allow(dead_code)]
fn should_install(builder: &Builder) -> bool {
fn should_install(builder: &Builder<'_>) -> bool {
builder.config.tools.as_ref().map_or(false, |t| t.contains($path))
}
}
@ -166,12 +166,12 @@ macro_rules! install {
const ONLY_HOSTS: bool = $only_hosts;
$(const $c: bool = true;)*
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
let $_config = &run.builder.config;
run.path($path).default_condition($default_cond)
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure($name {
stage: run.builder.top_stage,
target: run.target,
@ -179,7 +179,7 @@ macro_rules! install {
});
}
fn run($sel, $builder: &Builder) {
fn run($sel, $builder: &Builder<'_>) {
$run_item
}
})+
@ -262,20 +262,20 @@ impl Step for Src {
const DEFAULT: bool = true;
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
let config = &run.builder.config;
let cond = config.extended &&
config.tools.as_ref().map_or(true, |t| t.contains("src"));
run.path("src").default_condition(cond)
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Src {
stage: run.builder.top_stage,
});
}
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
builder.ensure(dist::Src);
install_src(builder, self.stage);
}


@ -69,7 +69,7 @@
//! ## Copying stage0 {std,test,rustc}
//!
//! This copies the build output from Cargo into
//! `build/$HOST/stage0-sysroot/lib/rustlib/$ARCH/lib`. FIXME: This step's
//! `build/$HOST/stage0-sysroot/lib/rustlib/$ARCH/lib`. FIXME: this step's
//! documentation should be expanded -- the information already here may be
//! incorrect.
//!
@ -103,7 +103,7 @@
//! More documentation can be found in each respective module below, and you can
//! also check out the `src/bootstrap/README.md` file for more information.
#![deny(bare_trait_objects)]
#![deny(rust_2018_idioms)]
#![deny(warnings)]
#![feature(core_intrinsics)]
#![feature(drain_filter)]
@ -114,28 +114,16 @@ extern crate build_helper;
extern crate serde_derive;
#[macro_use]
extern crate lazy_static;
extern crate serde_json;
extern crate cmake;
extern crate filetime;
extern crate cc;
extern crate getopts;
extern crate num_cpus;
extern crate toml;
extern crate time;
extern crate petgraph;
#[cfg(test)]
#[macro_use]
extern crate pretty_assertions;
#[cfg(unix)]
extern crate libc;
use std::cell::{RefCell, Cell};
use std::collections::{HashSet, HashMap};
use std::env;
use std::fs::{self, OpenOptions, File};
use std::io::{self, Seek, SeekFrom, Write, Read};
use std::io::{Seek, SeekFrom, Write, Read};
use std::path::{PathBuf, Path};
use std::process::{self, Command};
use std::slice;
@ -176,8 +164,6 @@ mod job;
#[cfg(all(unix, not(target_os = "haiku")))]
mod job {
use libc;
pub unsafe fn setup(build: &mut crate::Build) {
if build.config.low_priority {
libc::setpriority(libc::PRIO_PGRP as _, 0, 10);
@ -255,6 +241,8 @@ pub struct Build {
clippy_info: channel::GitInfo,
miri_info: channel::GitInfo,
rustfmt_info: channel::GitInfo,
in_tree_llvm_info: channel::GitInfo,
emscripten_llvm_info: channel::GitInfo,
local_rebuild: bool,
fail_fast: bool,
doc_tests: DocTests,
@ -377,6 +365,8 @@ impl Build {
let clippy_info = channel::GitInfo::new(&config, &src.join("src/tools/clippy"));
let miri_info = channel::GitInfo::new(&config, &src.join("src/tools/miri"));
let rustfmt_info = channel::GitInfo::new(&config, &src.join("src/tools/rustfmt"));
let in_tree_llvm_info = channel::GitInfo::new(&config, &src.join("src/llvm-project"));
let emscripten_llvm_info = channel::GitInfo::new(&config, &src.join("src/llvm-emscripten"));
let mut build = Build {
initial_rustc: config.initial_rustc.clone(),
@ -400,6 +390,8 @@ impl Build {
clippy_info,
miri_info,
rustfmt_info,
in_tree_llvm_info,
emscripten_llvm_info,
cc: HashMap::new(),
cxx: HashMap::new(),
ar: HashMap::new(),
@ -504,7 +496,7 @@ impl Build {
cleared
}
/// Get the space-separated set of activated features for the standard
/// Gets the space-separated set of activated features for the standard
/// library.
fn std_features(&self) -> String {
let mut features = "panic-unwind".to_string();
@ -521,7 +513,7 @@ impl Build {
features
}
/// Get the space-separated set of activated features for the compiler.
/// Gets the space-separated set of activated features for the compiler.
fn rustc_features(&self) -> String {
let mut features = String::new();
if self.config.jemalloc {
@ -609,7 +601,7 @@ impl Build {
self.out.join(&*target).join("crate-docs")
}
/// Returns true if no custom `llvm-config` is set for the specified target.
/// Returns `true` if no custom `llvm-config` is set for the specified target.
///
/// If no custom `llvm-config` was specified then Rust's llvm will be used.
fn is_rust_llvm(&self, target: Interned<String>) -> bool {
@ -831,6 +823,7 @@ impl Build {
!target.contains("msvc") &&
!target.contains("emscripten") &&
!target.contains("wasm32") &&
!target.contains("nvptx") &&
!target.contains("fuchsia") {
Some(self.cc(target))
} else {
@ -856,13 +849,13 @@ impl Build {
.map(|p| &**p)
}
/// Returns true if this is a no-std `target`, if defined
/// Returns `true` if this is a no-std `target`, if defined
fn no_std(&self, target: Interned<String>) -> Option<bool> {
self.config.target_config.get(&target)
.map(|t| t.no_std)
}
/// Returns whether the target will be tested using the `remote-test-client`
/// Returns `true` if the target will be tested using the `remote-test-client`
/// and `remote-test-server` binaries.
fn remote_tested(&self, target: Interned<String>) -> bool {
self.qemu_rootfs(target).is_some() || target.contains("android") ||
@ -1058,7 +1051,7 @@ impl Build {
self.rust_info.version(self, channel::CFG_RELEASE_NUM)
}
/// Return the full commit hash
/// Returns the full commit hash.
fn rust_sha(&self) -> Option<&str> {
self.rust_info.sha()
}
@ -1078,7 +1071,7 @@ impl Build {
panic!("failed to find version in {}'s Cargo.toml", package)
}
/// Returns whether unstable features should be enabled for the compiler
/// Returns `true` if unstable features should be enabled for the compiler
/// we're building.
fn unstable_features(&self) -> bool {
match &self.config.channel[..] {
@ -1263,9 +1256,15 @@ impl Build {
if !src.exists() {
panic!("Error: File \"{}\" not found!", src.display());
}
let mut s = t!(fs::File::open(&src));
let mut d = t!(fs::File::create(&dst));
io::copy(&mut s, &mut d).expect("failed to copy");
let metadata = t!(src.symlink_metadata());
if let Err(e) = fs::copy(&src, &dst) {
panic!("failed to copy `{}` to `{}`: {}", src.display(),
dst.display(), e)
}
t!(fs::set_permissions(&dst, metadata.permissions()));
let atime = FileTime::from_last_access_time(&metadata);
let mtime = FileTime::from_last_modification_time(&metadata);
t!(filetime::set_file_times(&dst, atime, mtime));
}
chmod(&dst, perms);
}
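The copy change above swaps `io::copy` for `fs::copy` and then restores the source's permission bits and access/modification times through the `filetime` crate, keeping installed trees closer to the originals. A standalone sketch of that helper, using the same `filetime` calls as the hunk but plain `?` error handling instead of bootstrap's `t!`/panic style (paths in `main` are just examples):

```rust
// Standalone sketch of the metadata-preserving copy introduced above.
use std::fs;
use std::io;
use std::path::Path;

use filetime::FileTime;

fn copy_preserving_times(src: &Path, dst: &Path) -> io::Result<()> {
    let metadata = src.symlink_metadata()?;
    fs::copy(src, dst)?;
    // Carry over the permission bits...
    fs::set_permissions(dst, metadata.permissions())?;
    // ...and the original access/modification times.
    let atime = FileTime::from_last_access_time(&metadata);
    let mtime = FileTime::from_last_modification_time(&metadata);
    filetime::set_file_times(dst, atime, mtime)?;
    Ok(())
}

fn main() -> io::Result<()> {
    copy_preserving_times(
        Path::new("config.toml.example"),
        Path::new("/tmp/config.toml.example"),
    )
}
```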
@ -1320,7 +1319,7 @@ impl<'a> Compiler {
self
}
/// Returns whether this is a snapshot compiler for `build`'s configuration
/// Returns `true` if this is a snapshot compiler for `build`'s configuration
pub fn is_snapshot(&self, build: &Build) -> bool {
self.stage == 0 && self.host == build.build
}


@ -53,7 +53,6 @@ check-aux:
src/test/run-fail/pretty \
src/test/run-pass-valgrind/pretty \
src/test/run-pass-fulldeps/pretty \
src/test/run-fail-fulldeps/pretty \
$(AUX_ARGS) \
$(BOOTSTRAP_ARGS)
check-bootstrap:


@ -18,6 +18,7 @@ use build_helper::output;
use cmake;
use cc;
use crate::channel;
use crate::util::{self, exe};
use build_helper::up_to_date;
use crate::builder::{Builder, RunConfig, ShouldRun, Step};
@ -35,11 +36,14 @@ impl Step for Llvm {
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
run.path("src/llvm").path("src/llvm-emscripten")
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("src/llvm-project")
.path("src/llvm-project/llvm")
.path("src/llvm")
.path("src/llvm-emscripten")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
let emscripten = run.path.ends_with("llvm-emscripten");
run.builder.ensure(Llvm {
target: run.target,
@ -48,7 +52,7 @@ impl Step for Llvm {
}
/// Compile LLVM for `target`.
fn run(self, builder: &Builder) -> PathBuf {
fn run(self, builder: &Builder<'_>) -> PathBuf {
let target = self.target;
let emscripten = self.emscripten;
@ -97,7 +101,7 @@ impl Step for Llvm {
t!(fs::create_dir_all(&out_dir));
// http://llvm.org/docs/CMake.html
let root = if self.emscripten { "src/llvm-emscripten" } else { "src/llvm" };
let root = if self.emscripten { "src/llvm-emscripten" } else { "src/llvm-project/llvm" };
let mut cfg = cmake::Config::new(builder.src.join(root));
let profile = match (builder.config.llvm_optimize, builder.config.llvm_release_debuginfo) {
@ -189,10 +193,10 @@ impl Step for Llvm {
}
if want_lldb {
cfg.define("LLVM_EXTERNAL_CLANG_SOURCE_DIR", builder.src.join("src/tools/clang"));
cfg.define("LLVM_EXTERNAL_LLDB_SOURCE_DIR", builder.src.join("src/tools/lldb"));
cfg.define("LLVM_ENABLE_PROJECTS", "clang;lldb");
// For the time being, disable code signing.
cfg.define("LLDB_CODESIGN_IDENTITY", "");
cfg.define("LLDB_NO_DEBUGSERVER", "ON");
} else {
// LLDB requires libxml2; but otherwise we want it to be disabled.
// See https://github.com/rust-lang/rust/pull/50104
@ -228,7 +232,30 @@ impl Step for Llvm {
}
if let Some(ref suffix) = builder.config.llvm_version_suffix {
cfg.define("LLVM_VERSION_SUFFIX", suffix);
// Allow version-suffix="" to not define a version suffix at all.
if !suffix.is_empty() {
cfg.define("LLVM_VERSION_SUFFIX", suffix);
}
} else {
let mut default_suffix = format!(
"-rust-{}-{}",
channel::CFG_RELEASE_NUM,
builder.config.channel,
);
let llvm_info = if self.emscripten {
&builder.emscripten_llvm_info
} else {
&builder.in_tree_llvm_info
};
if let Some(sha) = llvm_info.sha_short() {
default_suffix.push_str("-");
default_suffix.push_str(sha);
}
cfg.define("LLVM_VERSION_SUFFIX", default_suffix);
}
if let Some(ref linker) = builder.config.llvm_use_linker {
cfg.define("LLVM_USE_LINKER", linker);
}
if let Some(ref python) = builder.config.python {
@ -254,7 +281,7 @@ impl Step for Llvm {
}
}
fn check_llvm_version(builder: &Builder, llvm_config: &Path) {
fn check_llvm_version(builder: &Builder<'_>, llvm_config: &Path) {
if !builder.config.llvm_version_check {
return
}
@ -275,7 +302,7 @@ fn check_llvm_version(builder: &Builder, llvm_config: &Path) {
panic!("\n\nbad LLVM version: {}, need >=6.0\n\n", version)
}
fn configure_cmake(builder: &Builder,
fn configure_cmake(builder: &Builder<'_>,
target: Interned<String>,
cfg: &mut cmake::Config) {
if builder.config.ninja {
@ -358,7 +385,11 @@ fn configure_cmake(builder: &Builder,
}
cfg.build_arg("-j").build_arg(builder.jobs().to_string());
cfg.define("CMAKE_C_FLAGS", builder.cflags(target, GitRepo::Llvm).join(" "));
let mut cflags = builder.cflags(target, GitRepo::Llvm).join(" ");
if let Some(ref s) = builder.config.llvm_cxxflags {
cflags.push_str(&format!(" {}", s));
}
cfg.define("CMAKE_C_FLAGS", cflags);
let mut cxxflags = builder.cflags(target, GitRepo::Llvm).join(" ");
if builder.config.llvm_static_stdcpp &&
!target.contains("windows") &&
@ -366,6 +397,9 @@ fn configure_cmake(builder: &Builder,
{
cxxflags.push_str(" -static-libstdc++");
}
if let Some(ref s) = builder.config.llvm_cxxflags {
cxxflags.push_str(&format!(" {}", s));
}
cfg.define("CMAKE_CXX_FLAGS", cxxflags);
if let Some(ar) = builder.ar(target) {
if ar.is_absolute() {
@ -383,6 +417,12 @@ fn configure_cmake(builder: &Builder,
}
}
if let Some(ref s) = builder.config.llvm_ldflags {
cfg.define("CMAKE_SHARED_LINKER_FLAGS", s);
cfg.define("CMAKE_MODULE_LINKER_FLAGS", s);
cfg.define("CMAKE_EXE_LINKER_FLAGS", s);
}
if env::var_os("SCCACHE_ERROR_LOG").is_some() {
cfg.env("RUST_LOG", "sccache=warn");
}
@ -397,16 +437,16 @@ impl Step for Lld {
type Output = PathBuf;
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
run.path("src/tools/lld")
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("src/llvm-project/lld").path("src/tools/lld")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Lld { target: run.target });
}
/// Compile LLVM for `target`.
fn run(self, builder: &Builder) -> PathBuf {
fn run(self, builder: &Builder<'_>) -> PathBuf {
if builder.config.dry_run {
return PathBuf::from("lld-out-dir-test-gen");
}
@ -428,7 +468,7 @@ impl Step for Lld {
let _time = util::timeit(&builder);
t!(fs::create_dir_all(&out_dir));
let mut cfg = cmake::Config::new(builder.src.join("src/tools/lld"));
let mut cfg = cmake::Config::new(builder.src.join("src/llvm-project/lld"));
configure_cmake(builder, target, &mut cfg);
// This is an awful, awful hack. Discovered when we migrated to using
@ -469,17 +509,17 @@ pub struct TestHelpers {
impl Step for TestHelpers {
type Output = ();
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("src/test/auxiliary/rust_test_helpers.c")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(TestHelpers { target: run.target })
}
/// Compiles the `rust_test_helpers.c` library which we used in various
/// `run-pass` test suites for ABI testing.
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
if builder.config.dry_run {
return;
}


@ -156,7 +156,7 @@ pub fn check(build: &mut Build) {
panic!("the iOS target is only supported on macOS");
}
if target.contains("-none-") {
if target.contains("-none-") || target.contains("nvptx") {
if build.no_std(*target).is_none() {
let target = build.config.target_config.entry(target.clone())
.or_default();
@ -165,7 +165,7 @@ pub fn check(build: &mut Build) {
}
if build.no_std(*target) == Some(false) {
panic!("All the *-none-* targets are no-std targets")
panic!("All the *-none-* and nvptx* targets are no-std targets")
}
}


@ -30,9 +30,9 @@ const ADB_TEST_DIR: &str = "/data/tmp/work";
/// The two modes of the test runner; tests or benchmarks.
#[derive(Debug, PartialEq, Eq, Hash, Copy, Clone, PartialOrd, Ord)]
pub enum TestKind {
/// Run `cargo test`
/// Run `cargo test`.
Test,
/// Run `cargo bench`
/// Run `cargo bench`.
Bench,
}
@ -57,7 +57,7 @@ impl TestKind {
}
impl fmt::Display for TestKind {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
f.write_str(match *self {
TestKind::Test => "Testing",
TestKind::Bench => "Benchmarking",
@ -65,7 +65,7 @@ impl fmt::Display for TestKind {
}
}
fn try_run(builder: &Builder, cmd: &mut Command) -> bool {
fn try_run(builder: &Builder<'_>, cmd: &mut Command) -> bool {
if !builder.fail_fast {
if !builder.try_run(cmd) {
let mut failures = builder.delayed_failures.borrow_mut();
@ -78,7 +78,7 @@ fn try_run(builder: &Builder, cmd: &mut Command) -> bool {
true
}
fn try_run_quiet(builder: &Builder, cmd: &mut Command) -> bool {
fn try_run_quiet(builder: &Builder<'_>, cmd: &mut Command) -> bool {
if !builder.fail_fast {
if !builder.try_run_quiet(cmd) {
let mut failures = builder.delayed_failures.borrow_mut();
@ -105,7 +105,7 @@ impl Step for Linkcheck {
///
/// This tool in `src/tools` will verify the validity of all our links in the
/// documentation to ensure we don't have a bunch of dead ones.
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let host = self.host;
builder.info(&format!("Linkcheck ({})", host));
@ -121,13 +121,13 @@ impl Step for Linkcheck {
);
}
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
let builder = run.builder;
run.path("src/tools/linkchecker")
.default_condition(builder.config.docs)
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Linkcheck { host: run.target });
}
}
@ -142,11 +142,11 @@ impl Step for Cargotest {
type Output = ();
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("src/tools/cargotest")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Cargotest {
stage: run.builder.top_stage,
host: run.target,
@ -157,7 +157,7 @@ impl Step for Cargotest {
///
/// This tool in `src/tools` will check out a few Rust projects and run `cargo
/// test` to ensure that we don't regress the test suites there.
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let compiler = builder.compiler(self.stage, self.host);
builder.ensure(compile::Rustc {
compiler,
@ -192,11 +192,11 @@ impl Step for Cargo {
type Output = ();
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("src/tools/cargo")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Cargo {
stage: run.builder.top_stage,
host: run.target,
@ -204,7 +204,7 @@ impl Step for Cargo {
}
/// Runs `cargo test` for `cargo` packaged with Rust.
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let compiler = builder.compiler(self.stage, self.host);
builder.ensure(tool::Cargo {
@ -247,11 +247,11 @@ impl Step for Rls {
type Output = ();
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("src/tools/rls")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Rls {
stage: run.builder.top_stage,
host: run.target,
@ -259,7 +259,7 @@ impl Step for Rls {
}
/// Runs `cargo test` for the rls.
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let stage = self.stage;
let host = self.host;
let compiler = builder.compiler(stage, host);
@ -303,11 +303,11 @@ impl Step for Rustfmt {
type Output = ();
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("src/tools/rustfmt")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Rustfmt {
stage: run.builder.top_stage,
host: run.target,
@ -315,7 +315,7 @@ impl Step for Rustfmt {
}
/// Runs `cargo test` for rustfmt.
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let stage = self.stage;
let host = self.host;
let compiler = builder.compiler(stage, host);
@ -362,12 +362,12 @@ impl Step for Miri {
const ONLY_HOSTS: bool = true;
const DEFAULT: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
let test_miri = run.builder.config.test_miri;
run.path("src/tools/miri").default_condition(test_miri)
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Miri {
stage: run.builder.top_stage,
host: run.target,
@ -375,7 +375,7 @@ impl Step for Miri {
}
/// Runs `cargo test` for miri.
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let stage = self.stage;
let host = self.host;
let compiler = builder.compiler(stage, host);
@ -421,11 +421,11 @@ pub struct CompiletestTest {
impl Step for CompiletestTest {
type Output = ();
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("src/tools/compiletest")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(CompiletestTest {
stage: run.builder.top_stage,
host: run.target,
@ -433,7 +433,7 @@ impl Step for CompiletestTest {
}
/// Runs `cargo test` for compiletest.
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let stage = self.stage;
let host = self.host;
let compiler = builder.compiler(stage, host);
@ -462,11 +462,11 @@ impl Step for Clippy {
const ONLY_HOSTS: bool = true;
const DEFAULT: bool = false;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("src/tools/clippy")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Clippy {
stage: run.builder.top_stage,
host: run.target,
@ -474,7 +474,7 @@ impl Step for Clippy {
}
/// Runs `cargo test` for clippy.
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let stage = self.stage;
let host = self.host;
let compiler = builder.compiler(stage, host);
@ -516,7 +516,7 @@ impl Step for Clippy {
}
}
fn path_for_cargo(builder: &Builder, compiler: Compiler) -> OsString {
fn path_for_cargo(builder: &Builder<'_>, compiler: Compiler) -> OsString {
// Configure PATH to find the right rustc. NB. we have to use PATH
// and not RUSTC because the Cargo test suite has tests that will
// fail if rustc is not spelled `rustc`.
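The comment above explains that the Cargo test suite locates the compiler through `PATH` rather than a `RUSTC` override. Since the body of `path_for_cargo` falls outside this hunk, here is a minimal, assumed sketch of the underlying technique of prepending a stage's `bin` directory to the existing `PATH`; `stage_bin` is an illustrative placeholder, not a name from the build system:

```rust
use std::env;
use std::ffi::OsString;
use std::iter;
use std::path::PathBuf;

// Prepend `stage_bin` (e.g. ".../stageN/bin") to PATH so that a bare `rustc`
// invocation resolves to the compiler under test.
fn prepend_to_path(stage_bin: PathBuf) -> OsString {
    let old_path = env::var_os("PATH").unwrap_or_default();
    let paths = iter::once(stage_bin).chain(env::split_paths(&old_path));
    env::join_paths(paths).expect("could not build a new PATH")
}

fn main() {
    let new_path = prepend_to_path(PathBuf::from("/tmp/stage1/bin"));
    println!("{:?}", new_path);
}
```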
@ -535,17 +535,17 @@ impl Step for RustdocTheme {
const DEFAULT: bool = true;
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("src/tools/rustdoc-themes")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
let compiler = run.builder.compiler(run.builder.top_stage, run.host);
run.builder.ensure(RustdocTheme { compiler });
}
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let rustdoc = builder.out.join("bootstrap/debug/rustdoc");
let mut cmd = builder.tool_cmd(Tool::RustdocTheme);
cmd.arg(rustdoc.to_str().unwrap())
@ -574,36 +574,79 @@ impl Step for RustdocTheme {
}
#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
pub struct RustdocJS {
pub struct RustdocJSStd {
pub host: Interned<String>,
pub target: Interned<String>,
}
impl Step for RustdocJS {
impl Step for RustdocJSStd {
type Output = ();
const DEFAULT: bool = true;
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
run.path("src/test/rustdoc-js")
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("src/test/rustdoc-js-std")
}
fn make_run(run: RunConfig) {
run.builder.ensure(RustdocJS {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(RustdocJSStd {
host: run.host,
target: run.target,
});
}
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
if let Some(ref nodejs) = builder.config.nodejs {
let mut command = Command::new(nodejs);
command.args(&["src/tools/rustdoc-js/tester.js", &*self.host]);
command.args(&["src/tools/rustdoc-js-std/tester.js", &*self.host]);
builder.ensure(crate::doc::Std {
target: self.target,
stage: builder.top_stage,
});
builder.run(&mut command);
} else {
builder.info(
"No nodejs found, skipping \"src/test/rustdoc-js-std\" tests"
);
}
}
}
#[derive(Debug, Copy, Clone, Hash, PartialEq, Eq)]
pub struct RustdocJSNotStd {
pub host: Interned<String>,
pub target: Interned<String>,
pub compiler: Compiler,
}
impl Step for RustdocJSNotStd {
type Output = ();
const DEFAULT: bool = true;
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("src/test/rustdoc-js")
}
fn make_run(run: RunConfig<'_>) {
let compiler = run.builder.compiler(run.builder.top_stage, run.host);
run.builder.ensure(RustdocJSNotStd {
host: run.host,
target: run.target,
compiler,
});
}
fn run(self, builder: &Builder<'_>) {
if builder.config.nodejs.is_some() {
builder.ensure(Compiletest {
compiler: self.compiler,
target: self.target,
mode: "js-doc-test",
suite: "rustdoc-js",
path: None,
compare_mode: None,
});
} else {
builder.info(
"No nodejs found, skipping \"src/test/rustdoc-js\" tests"
@ -624,11 +667,11 @@ impl Step for RustdocUi {
const DEFAULT: bool = true;
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("src/test/rustdoc-ui")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
let compiler = run.builder.compiler(run.builder.top_stage, run.host);
run.builder.ensure(RustdocUi {
host: run.host,
@ -637,7 +680,7 @@ impl Step for RustdocUi {
});
}
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
builder.ensure(Compiletest {
compiler: self.compiler,
target: self.target,
@ -662,7 +705,7 @@ impl Step for Tidy {
/// This tool in `src/tools` checks up on various bits and pieces of style and
/// otherwise just implements a few lint-like checks that are specific to the
/// compiler itself.
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let mut cmd = builder.tool_cmd(Tool::Tidy);
cmd.arg(builder.src.join("src"));
cmd.arg(&builder.initial_cargo);
@ -678,16 +721,16 @@ impl Step for Tidy {
try_run(builder, &mut cmd);
}
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("src/tools/tidy")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Tidy);
}
}
fn testdir(builder: &Builder, host: Interned<String>) -> PathBuf {
fn testdir(builder: &Builder<'_>, host: Interned<String>) -> PathBuf {
builder.out.join(host).join("test")
}
@ -747,11 +790,11 @@ macro_rules! test_definitions {
const DEFAULT: bool = $default;
const ONLY_HOSTS: bool = $host;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.suite_path($path)
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
let compiler = run.builder.compiler(run.builder.top_stage, run.host);
run.builder.ensure($name {
@ -760,7 +803,7 @@ macro_rules! test_definitions {
});
}
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
builder.ensure(Compiletest {
compiler: self.compiler,
target: self.target,
@ -848,12 +891,6 @@ host_test!(RunPassFullDeps {
suite: "run-pass-fulldeps"
});
host_test!(RunFailFullDeps {
path: "src/test/run-fail-fulldeps",
mode: "run-fail",
suite: "run-fail-fulldeps"
});
host_test!(Rustdoc {
path: "src/test/rustdoc",
mode: "rustdoc",
@ -888,20 +925,6 @@ test!(RunPassValgrindPretty {
default: false,
host: true
});
test!(RunPassFullDepsPretty {
path: "src/test/run-pass-fulldeps/pretty",
mode: "pretty",
suite: "run-pass-fulldeps",
default: false,
host: true
});
test!(RunFailFullDepsPretty {
path: "src/test/run-fail-fulldeps/pretty",
mode: "pretty",
suite: "run-fail-fulldeps",
default: false,
host: true
});
default_test!(RunMake {
path: "src/test/run-make",
@ -928,7 +951,7 @@ struct Compiletest {
impl Step for Compiletest {
type Output = ();
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.never()
}
@ -937,7 +960,7 @@ impl Step for Compiletest {
/// Compiles all tests with `compiler` for `target` with the specified
/// compiletest `mode` and `suite` arguments. For example `mode` can be
/// "run-pass" or `suite` can be something like `debuginfo`.
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let compiler = self.compiler;
let target = self.target;
let mode = self.mode;
@ -1010,12 +1033,13 @@ impl Step for Compiletest {
.arg(builder.sysroot_libdir(compiler, target));
cmd.arg("--rustc-path").arg(builder.rustc(compiler));
let is_rustdoc_ui = suite.ends_with("rustdoc-ui");
let is_rustdoc = suite.ends_with("rustdoc-ui") || suite.ends_with("rustdoc-js");
// Avoid depending on rustdoc when we don't need it.
if mode == "rustdoc"
|| (mode == "run-make" && suite.ends_with("fulldeps"))
|| (mode == "ui" && is_rustdoc_ui)
|| (mode == "ui" && is_rustdoc)
|| mode == "js-doc-test"
{
cmd.arg("--rustdoc-path")
.arg(builder.rustdoc(compiler.host));
@ -1049,12 +1073,12 @@ impl Step for Compiletest {
cmd.arg("--nodejs").arg(nodejs);
}
let mut flags = if is_rustdoc_ui {
let mut flags = if is_rustdoc {
Vec::new()
} else {
vec!["-Crpath".to_string()]
};
if !is_rustdoc_ui {
if !is_rustdoc {
if builder.config.rust_optimize_tests {
flags.push("-O".to_string());
}
@ -1108,9 +1132,7 @@ impl Step for Compiletest {
};
let lldb_exe = if builder.config.lldb_enabled && !target.contains("emscripten") {
// Test against the lldb that was just built.
builder.llvm_out(target)
.join("bin")
.join("lldb")
builder.llvm_out(target).join("bin").join("lldb")
} else {
PathBuf::from("lldb")
};
@ -1127,6 +1149,26 @@ impl Step for Compiletest {
}
}
if let Some(var) = env::var_os("RUSTBUILD_FORCE_CLANG_BASED_TESTS") {
match &var.to_string_lossy().to_lowercase()[..] {
"1" | "yes" | "on" => {
assert!(builder.config.lldb_enabled,
"RUSTBUILD_FORCE_CLANG_BASED_TESTS needs Clang/LLDB to \
be built.");
let clang_exe = builder.llvm_out(target).join("bin").join("clang");
cmd.arg("--run-clang-based-tests-with").arg(clang_exe);
}
"0" | "no" | "off" => {
// Nothing to do.
}
other => {
// Let's make sure typos don't go unnoticed
panic!("Unrecognized option '{}' set in \
RUSTBUILD_FORCE_CLANG_BASED_TESTS", other);
}
}
}
// Get paths from cmd args
let paths = match &builder.config.cmd {
Subcommand::Test { ref paths, .. } => &paths[..],
@ -1286,16 +1328,16 @@ impl Step for DocTest {
type Output = ();
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.never()
}
/// Run `rustdoc --test` for all documentation in `src/doc`.
/// Runs `rustdoc --test` for all documentation in `src/doc`.
///
/// This will run all tests in our markdown documentation (e.g., the book)
/// located in `src/doc`. The `rustdoc` that's run is the one that sits next to
/// `compiler`.
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let compiler = self.compiler;
builder.ensure(compile::Test {
@ -1356,17 +1398,17 @@ macro_rules! test_book {
const DEFAULT: bool = $default;
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path($path)
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure($name {
compiler: run.builder.compiler(run.builder.top_stage, run.host),
});
}
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
builder.ensure(DocTest {
compiler: self.compiler,
path: $path,
@ -1385,6 +1427,7 @@ test_book!(
RustdocBook, "src/doc/rustdoc", "rustdoc", default=true;
RustcBook, "src/doc/rustc", "rustc", default=true;
RustByExample, "src/doc/rust-by-example", "rust-by-example", default=false;
EmbeddedBook, "src/doc/embedded-book", "embedded-book", default=false;
TheBook, "src/doc/book", "book", default=false;
UnstableBook, "src/doc/unstable-book", "unstable-book", default=true;
);
@ -1399,23 +1442,23 @@ impl Step for ErrorIndex {
const DEFAULT: bool = true;
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("src/tools/error_index_generator")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(ErrorIndex {
compiler: run.builder.compiler(run.builder.top_stage, run.host),
});
}
/// Run the error index generator tool to execute the tests located in the error
/// Runs the error index generator tool to execute the tests located in the error
/// index.
///
/// The `error_index_generator` tool lives in `src/tools` and is used to
/// generate a markdown file from the error indexes of the code base which is
/// then passed to `rustdoc --test`.
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let compiler = self.compiler;
builder.ensure(compile::Std {
@ -1441,7 +1484,7 @@ impl Step for ErrorIndex {
}
}
fn markdown_test(builder: &Builder, compiler: Compiler, markdown: &Path) -> bool {
fn markdown_test(builder: &Builder<'_>, compiler: Compiler, markdown: &Path) -> bool {
match fs::read_to_string(markdown) {
Ok(contents) => {
if !contents.contains("```") {
@ -1481,11 +1524,11 @@ impl Step for CrateLibrustc {
const DEFAULT: bool = true;
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.krate("rustc-main")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
let builder = run.builder;
let compiler = builder.compiler(builder.top_stage, run.host);
@ -1503,7 +1546,7 @@ impl Step for CrateLibrustc {
}
}
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
builder.ensure(Crate {
compiler: self.compiler,
target: self.target,
@ -1525,14 +1568,14 @@ pub struct CrateNotDefault {
impl Step for CrateNotDefault {
type Output = ();
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("src/librustc_asan")
.path("src/librustc_lsan")
.path("src/librustc_msan")
.path("src/librustc_tsan")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
let builder = run.builder;
let compiler = builder.compiler(builder.top_stage, run.host);
@ -1552,7 +1595,7 @@ impl Step for CrateNotDefault {
});
}
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
builder.ensure(Crate {
compiler: self.compiler,
target: self.target,
@ -1576,7 +1619,7 @@ impl Step for Crate {
type Output = ();
const DEFAULT: bool = true;
fn should_run(mut run: ShouldRun) -> ShouldRun {
fn should_run(mut run: ShouldRun<'_>) -> ShouldRun<'_> {
let builder = run.builder;
run = run.krate("test");
for krate in run.builder.in_tree_crates("std") {
@ -1587,7 +1630,7 @@ impl Step for Crate {
run
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
let builder = run.builder;
let compiler = builder.compiler(builder.top_stage, run.host);
@ -1615,7 +1658,7 @@ impl Step for Crate {
}
}
/// Run all unit tests plus documentation tests for a given crate defined
/// Runs all unit tests plus documentation tests for a given crate defined
/// by a `Cargo.toml` (single manifest)
///
/// This is what runs tests for crates like the standard library, compiler, etc.
@ -1623,7 +1666,7 @@ impl Step for Crate {
///
/// Currently this runs all tests for a DAG by passing a bunch of `-p foo`
/// arguments, and those arguments are discovered from `cargo metadata`.
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let compiler = self.compiler;
let target = self.target;
let mode = self.mode;
@ -1764,11 +1807,11 @@ impl Step for CrateRustdoc {
const DEFAULT: bool = true;
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.paths(&["src/librustdoc", "src/tools/rustdoc"])
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
let builder = run.builder;
let test_kind = builder.kind.into();
@ -1779,7 +1822,7 @@ impl Step for CrateRustdoc {
});
}
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let test_kind = self.test_kind;
let compiler = builder.compiler(builder.top_stage, self.host);
@ -1834,7 +1877,7 @@ fn envify(s: &str) -> String {
/// the standard library and such to the emulator ahead of time. This step
/// represents this and is a dependency of all test suites.
///
/// Most of the time this is a noop. For some steps such as shipping data to
/// Most of the time this is a no-op. For some steps such as shipping data to
/// QEMU we have to build our own tools so we've got conditional dependencies
/// on those programs as well. Note that the remote test client is built for
/// the build target (us) and the server is built for the target.
@ -1847,11 +1890,11 @@ pub struct RemoteCopyLibs {
impl Step for RemoteCopyLibs {
type Output = ();
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.never()
}
fn run(self, builder: &Builder) {
fn run(self, builder: &Builder<'_>) {
let compiler = self.compiler;
let target = self.target;
if !builder.remote_tested(target) {
@ -1897,16 +1940,16 @@ pub struct Distcheck;
impl Step for Distcheck {
type Output = ();
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("distcheck")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Distcheck);
}
/// Run "distcheck", a 'make check' from a tarball
fn run(self, builder: &Builder) {
/// Runs "distcheck", a 'make check' from a tarball
fn run(self, builder: &Builder<'_>) {
builder.info("Distcheck");
let dir = builder.out.join("tmp").join("distcheck");
let _ = fs::remove_dir_all(&dir);
@ -1966,8 +2009,8 @@ impl Step for Bootstrap {
const DEFAULT: bool = true;
const ONLY_HOSTS: bool = true;
/// Test the build system itself
fn run(self, builder: &Builder) {
/// Tests the build system itself.
fn run(self, builder: &Builder<'_>) {
let mut cmd = Command::new(&builder.initial_cargo);
cmd.arg("test")
.current_dir(builder.src.join("src/bootstrap"))
@ -1991,11 +2034,11 @@ impl Step for Bootstrap {
try_run(builder, &mut cmd);
}
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("src/bootstrap")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Bootstrap);
}
}

View File

@ -1,6 +1,5 @@
use std::fs;
use std::env;
use std::iter;
use std::path::PathBuf;
use std::process::{Command, exit};
use std::collections::HashSet;
@ -37,15 +36,15 @@ struct ToolBuild {
impl Step for ToolBuild {
type Output = Option<PathBuf>;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.never()
}
/// Build a tool in `src/tools`
/// Builds a tool in `src/tools`
///
/// This will build the specified tool with the specified `host` compiler in
/// `stage` into the normal cargo output directory.
fn run(self, builder: &Builder) -> Option<PathBuf> {
fn run(self, builder: &Builder<'_>) -> Option<PathBuf> {
let compiler = self.compiler;
let target = self.target;
let tool = self.tool;
@ -193,7 +192,7 @@ impl Step for ToolBuild {
}
pub fn prepare_tool_cargo(
builder: &Builder,
builder: &Builder<'_>,
compiler: Compiler,
mode: Mode,
target: Interned<String>,
@ -316,18 +315,18 @@ macro_rules! tool {
impl Step for $name {
type Output = PathBuf;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path($path)
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure($name {
compiler: run.builder.compiler(run.builder.top_stage, run.builder.config.build),
target: run.target,
});
}
fn run(self, builder: &Builder) -> PathBuf {
fn run(self, builder: &Builder<'_>) -> PathBuf {
builder.ensure(ToolBuild {
compiler: self.compiler,
target: self.target,
@ -372,18 +371,18 @@ pub struct RemoteTestServer {
impl Step for RemoteTestServer {
type Output = PathBuf;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("src/tools/remote-test-server")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(RemoteTestServer {
compiler: run.builder.compiler(run.builder.top_stage, run.builder.config.build),
target: run.target,
});
}
fn run(self, builder: &Builder) -> PathBuf {
fn run(self, builder: &Builder<'_>) -> PathBuf {
builder.ensure(ToolBuild {
compiler: self.compiler,
target: self.target,
@ -407,37 +406,37 @@ impl Step for Rustdoc {
const DEFAULT: bool = true;
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
run.path("src/tools/rustdoc")
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Rustdoc {
host: run.host,
});
}
fn run(self, builder: &Builder) -> PathBuf {
fn run(self, builder: &Builder<'_>) -> PathBuf {
let target_compiler = builder.compiler(builder.top_stage, self.host);
if target_compiler.stage == 0 {
if !target_compiler.is_snapshot(builder) {
panic!("rustdoc in stage 0 must be snapshot rustdoc");
}
return builder.initial_rustc.with_file_name(exe("rustdoc", &target_compiler.host));
}
let target = target_compiler.host;
let build_compiler = if target_compiler.stage == 0 {
builder.compiler(0, builder.config.build)
} else if target_compiler.stage >= 2 {
// Past stage 2, we consider the compiler to be ABI-compatible and hence capable of
// building rustdoc itself.
builder.compiler(target_compiler.stage, builder.config.build)
} else {
// Similar to `compile::Assemble`, build with the previous stage's compiler. Otherwise
// we'd have stageN/bin/rustc and stageN/bin/rustdoc be effectively different stage
// compilers, which isn't what we want.
builder.compiler(target_compiler.stage - 1, builder.config.build)
};
// Similar to `compile::Assemble`, build with the previous stage's compiler. Otherwise
// we'd have stageN/bin/rustc and stageN/bin/rustdoc be effectively different stage
// compilers, which isn't what we want. Rustdoc should be linked in the same way as the
// rustc compiler it's paired with, so it must be built with the previous stage compiler.
let build_compiler = builder.compiler(target_compiler.stage - 1, builder.config.build);
builder.ensure(compile::Rustc { compiler: build_compiler, target });
builder.ensure(compile::Rustc {
compiler: build_compiler,
target: builder.config.build,
});
// The presence of `target_compiler` ensures that the necessary libraries (codegen backends,
// compiler libraries, ...) are built. Rustdoc does not require the presence of any
// libraries within sysroot_libdir (i.e., rustlib), though doctests may want it (since
// they'll be linked to those libraries). As such, don't explicitly `ensure` any additional
// libraries here. The intuition is that if we've built a compiler, we should be able
// to build rustdoc.
let mut cargo = prepare_tool_cargo(
builder,
@ -491,19 +490,19 @@ impl Step for Cargo {
const DEFAULT: bool = true;
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
let builder = run.builder;
run.path("src/tools/cargo").default_condition(builder.config.extended)
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure(Cargo {
compiler: run.builder.compiler(run.builder.top_stage, run.builder.config.build),
target: run.target,
});
}
fn run(self, builder: &Builder) -> PathBuf {
fn run(self, builder: &Builder<'_>) -> PathBuf {
// Cargo depends on procedural macros, which requires a full host
// compiler to be available, so we need to depend on that.
builder.ensure(compile::Rustc {
@ -543,12 +542,12 @@ macro_rules! tool_extended {
const DEFAULT: bool = true;
const ONLY_HOSTS: bool = true;
fn should_run(run: ShouldRun) -> ShouldRun {
fn should_run(run: ShouldRun<'_>) -> ShouldRun<'_> {
let builder = run.builder;
run.path($path).default_condition(builder.config.extended)
}
fn make_run(run: RunConfig) {
fn make_run(run: RunConfig<'_>) {
run.builder.ensure($name {
compiler: run.builder.compiler(run.builder.top_stage, run.builder.config.build),
target: run.target,
@ -557,7 +556,7 @@ macro_rules! tool_extended {
}
#[allow(unused_mut)]
fn run(mut $sel, $builder: &Builder) -> Option<PathBuf> {
fn run(mut $sel, $builder: &Builder<'_>) -> Option<PathBuf> {
$extra_deps
$builder.ensure(ToolBuild {
compiler: $sel.compiler,
@ -622,7 +621,7 @@ tool_extended!((self, builder),
);
impl<'a> Builder<'a> {
/// Get a `Command` which is ready to run `tool` in `stage` built for
/// Gets a `Command` which is ready to run `tool` in `stage` built for
/// `host`.
pub fn tool_cmd(&self, tool: Tool) -> Command {
let mut cmd = Command::new(self.tool_exe(tool));
@ -666,19 +665,33 @@ impl<'a> Builder<'a> {
// Add the llvm/bin directory to PATH since it contains lots of
// useful, platform-independent tools
if tool.uses_llvm_tools() {
if tool.uses_llvm_tools() && !self.config.dry_run {
let mut additional_paths = vec![];
if let Some(llvm_bin_path) = self.llvm_bin_path() {
if host.contains("windows") {
// On Windows, PATH and the dynamic library path are the same,
// so we just add the LLVM bin path to lib_path
lib_paths.push(llvm_bin_path);
} else {
let old_path = env::var_os("PATH").unwrap_or_default();
let new_path = env::join_paths(iter::once(llvm_bin_path)
.chain(env::split_paths(&old_path)))
.expect("Could not add LLVM bin path to PATH");
cmd.env("PATH", new_path);
}
additional_paths.push(llvm_bin_path);
}
// If LLD is available, add that too.
if self.config.lld_enabled {
let lld_install_root = self.ensure(native::Lld {
target: self.config.build,
});
let lld_bin_path = lld_install_root.join("bin");
additional_paths.push(lld_bin_path);
}
if host.contains("windows") {
// On Windows, PATH and the dynamic library path are the same,
// so we just add the LLVM bin path to lib_path
lib_paths.extend(additional_paths);
} else {
let old_path = env::var_os("PATH").unwrap_or_default();
let new_path = env::join_paths(additional_paths.into_iter()
.chain(env::split_paths(&old_path)))
.expect("Could not add LLVM bin path to PATH");
cmd.env("PATH", new_path);
}
}
@ -686,7 +699,7 @@ impl<'a> Builder<'a> {
}
fn llvm_bin_path(&self) -> Option<PathBuf> {
if self.config.llvm_enabled && !self.config.dry_run {
if self.config.llvm_enabled {
let llvm_config = self.ensure(native::Llvm {
target: self.config.build,
emscripten: false,

View File

@ -33,7 +33,7 @@ pub fn exe(name: &str, target: &str) -> String {
}
}
/// Returns whether the file name given looks like a dynamic library.
/// Returns `true` if the file name given looks like a dynamic library.
pub fn is_dylib(name: &str) -> bool {
name.ends_with(".dylib") || name.ends_with(".so") || name.ends_with(".dll")
}
@ -70,7 +70,11 @@ pub fn dylib_path_var() -> &'static str {
/// Parses the `dylib_path_var()` environment variable, returning a list of
/// paths that are members of this lookup path.
pub fn dylib_path() -> Vec<PathBuf> {
env::split_paths(&env::var_os(dylib_path_var()).unwrap_or_default()).collect()
let var = match env::var_os(dylib_path_var()) {
Some(v) => v,
None => return vec![],
};
env::split_paths(&var).collect()
}
/// `push` all components to `buf`. On windows, append `.exe` to the last component.
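The body of `push_exe_path` is elided by this hunk, so the following is a minimal sketch of the behaviour the comment describes, assuming the usual `cfg!(windows)` check; the real helper may differ in details:

```rust
use std::path::PathBuf;

// Push each component onto `buf`, appending ".exe" to the last one on Windows.
fn push_exe_path(mut buf: PathBuf, components: &[&str]) -> PathBuf {
    let (last, rest) = components.split_last().expect("at least one component");
    for c in rest {
        buf.push(c);
    }
    if cfg!(windows) {
        buf.push(format!("{}.exe", last));
    } else {
        buf.push(last);
    }
    buf
}

fn main() {
    let path = push_exe_path(PathBuf::from("build"), &["stage1", "bin", "rustc"]);
    println!("{}", path.display());
}
```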
@ -91,7 +95,7 @@ pub fn push_exe_path(mut buf: PathBuf, components: &[&str]) -> PathBuf {
pub struct TimeIt(bool, Instant);
/// Returns an RAII structure that prints out how long it took to drop.
pub fn timeit(builder: &Builder) -> TimeIt {
pub fn timeit(builder: &Builder<'_>) -> TimeIt {
TimeIt(builder.config.dry_run, Instant::now())
}
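`TimeIt` relies on the elapsed time being reported in its `Drop` implementation, which is not shown in this hunk. A rough, assumed sketch of that RAII pattern (the names are illustrative, not the bootstrap types):

```rust
use std::time::Instant;

// Print the elapsed time when the guard is dropped, unless in dry-run mode.
struct Timer {
    dry_run: bool,
    start: Instant,
}

impl Drop for Timer {
    fn drop(&mut self) {
        if !self.dry_run {
            println!("finished in {:?}", self.start.elapsed());
        }
    }
}

fn main() {
    let _guard = Timer { dry_run: false, start: Instant::now() };
    // ... some work; the elapsed time prints when `_guard` goes out of scope.
}
```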

View File

@ -2,6 +2,7 @@
name = "build_helper"
version = "0.1.0"
authors = ["The Rust Project Developers"]
edition = "2018"
[lib]
name = "build_helper"

View File

@ -1,3 +1,5 @@
#![deny(rust_2018_idioms)]
use std::fs::File;
use std::path::{Path, PathBuf};
use std::process::{Command, Stdio};
@ -23,6 +25,25 @@ macro_rules! t {
};
}
// Because Cargo adds the compiler's dylib path to our library search path, llvm-config may
// break: the dylib path for the compiler, as of this writing, contains a copy of the LLVM
// shared library, which means that when our freshly built llvm-config goes to load its
// associated LLVM, it actually loads the compiler's LLVM. In particular when building the first
// compiler (i.e., in stage 0) that's a problem, as the compiler's LLVM is likely different from
// the one we want to use. As such, we restore the environment to what bootstrap saw. This isn't
// perfect -- we might actually want to see something from Cargo's added library paths -- but
// for now it works.
pub fn restore_library_path() {
println!("cargo:rerun-if-env-changed=REAL_LIBRARY_PATH_VAR");
println!("cargo:rerun-if-env-changed=REAL_LIBRARY_PATH");
let key = env::var_os("REAL_LIBRARY_PATH_VAR").expect("REAL_LIBRARY_PATH_VAR");
if let Some(env) = env::var_os("REAL_LIBRARY_PATH") {
env::set_var(&key, &env);
} else {
env::remove_var(&key);
}
}
pub fn run(cmd: &mut Command) {
println!("running: {:?}", cmd);
run_silent(cmd);
@ -142,7 +163,7 @@ pub fn mtime(path: &Path) -> SystemTime {
.unwrap_or(UNIX_EPOCH)
}
/// Returns whether `dst` is up to date given that the file or files in `src`
/// Returns `true` if `dst` is up to date given that the file or files in `src`
/// are used to generate it.
///
/// Uses last-modified time checks to verify this.
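The body of `up_to_date` is not part of this hunk, so here is a simplified, assumed illustration of the last-modified-time comparison described above; the real build_helper version also deals with directories and missing files more carefully:

```rust
use std::fs;
use std::path::Path;
use std::time::UNIX_EPOCH;

// `dst` counts as up to date when it is at least as new as `src`.
fn up_to_date_simple(src: &Path, dst: &Path) -> bool {
    let mtime = |p: &Path| {
        fs::metadata(p)
            .and_then(|m| m.modified())
            .unwrap_or(UNIX_EPOCH)
    };
    mtime(dst) >= mtime(src)
}

fn main() {
    let fresh = up_to_date_simple(Path::new("src/lib.rs"), Path::new("target/out"));
    println!("{}", fresh);
}
```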
@ -169,12 +190,12 @@ pub struct NativeLibBoilerplate {
}
impl NativeLibBoilerplate {
/// On OSX we don't want to ship the exact filename that compiler-rt builds.
/// On macOS we don't want to ship the exact filename that compiler-rt builds.
/// This conflicts with the system and ours is likely a wildly different
/// version, so they can't be substituted.
///
/// As a result, we rename it here, but we also need to use
/// `install_name_tool` on OSX to rename the commands listed inside of it to
/// `install_name_tool` on macOS to rename the commands listed inside of it to
/// ensure it's linked against correctly.
pub fn fixup_sanitizer_lib_name(&self, sanitizer_name: &str) {
if env::var("TARGET").unwrap() != "x86_64-apple-darwin" {
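The rest of `fixup_sanitizer_lib_name` is truncated here. As a hedged, compile-only sketch of the fix-up the doc comment describes (rename the dylib, then rewrite its install name with `install_name_tool -id`), with illustrative parameter names that are not taken from build_helper:

```rust
use std::fs;
use std::process::Command;

// Rename the library compiler-rt produced, then update its install name so
// that binaries link against the new file name.
pub fn fixup_sanitizer_dylib(built: &str, shipped: &str) {
    fs::rename(built, shipped).expect("rename failed");
    let status = Command::new("install_name_tool")
        .arg("-id")
        .arg(shipped)
        .arg(shipped)
        .status()
        .expect("failed to spawn install_name_tool");
    assert!(status.success(), "install_name_tool failed");
}
```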

View File

@ -131,13 +131,15 @@ $category > $option = $value -- $comment
For targets: `arm-unknown-linux-gnueabi`
- Path and misc options > Prefix directory = /x-tools/${CT\_TARGET}
- Path and misc options > Patches origin = Bundled, then local
- Path and misc options > Local patch directory = /tmp/patches
- Target options > Target Architecture = arm
- Target options > Architecture level = armv6 -- (+)
- Target options > Floating point = software (no FPU) -- (\*)
- Operating System > Target OS = linux
- Operating System > Linux kernel version = 3.2.72 -- Precise kernel
- C-library > glibc version = 2.14.1
- C compiler > gcc version = 4.9.3
- C-library > glibc version = 2.16.0
- C compiler > gcc version = 5.2.0
- C compiler > C++ = ENABLE -- to cross compile LLVM
### `arm-linux-gnueabihf.config`
@ -145,6 +147,8 @@ For targets: `arm-unknown-linux-gnueabi`
For targets: `arm-unknown-linux-gnueabihf`
- Path and misc options > Prefix directory = /x-tools/${CT\_TARGET}
- Path and misc options > Patches origin = Bundled, then local
- Path and misc options > Local patch directory = /tmp/patches
- Target options > Target Architecture = arm
- Target options > Architecture level = armv6 -- (+)
- Target options > Use specific FPU = vfp -- (+)
@ -152,8 +156,8 @@ For targets: `arm-unknown-linux-gnueabihf`
- Target options > Default instruction set mode = arm -- (+)
- Operating System > Target OS = linux
- Operating System > Linux kernel version = 3.2.72 -- Precise kernel
- C-library > glibc version = 2.14.1
- C compiler > gcc version = 4.9.3
- C-library > glibc version = 2.16.0
- C compiler > gcc version = 5.2.0
- C compiler > C++ = ENABLE -- to cross compile LLVM
### `armv7-linux-gnueabihf.config`
@ -161,6 +165,8 @@ For targets: `arm-unknown-linux-gnueabihf`
For targets: `armv7-unknown-linux-gnueabihf`
- Path and misc options > Prefix directory = /x-tools/${CT\_TARGET}
- Path and misc options > Patches origin = Bundled, then local
- Path and misc options > Local patch directory = /tmp/patches
- Target options > Target Architecture = arm
- Target options > Suffix to the arch-part = v7
- Target options > Architecture level = armv7-a -- (+)
@ -169,8 +175,8 @@ For targets: `armv7-unknown-linux-gnueabihf`
- Target options > Default instruction set mode = thumb -- (\*)
- Operating System > Target OS = linux
- Operating System > Linux kernel version = 3.2.72 -- Precise kernel
- C-library > glibc version = 2.14.1
- C compiler > gcc version = 4.9.3
- C-library > glibc version = 2.16.0
- C compiler > gcc version = 5.2.0
- C compiler > C++ = ENABLE -- to cross compile LLVM
(\*) These options have been selected to match the configuration of the arm
@ -204,7 +210,7 @@ For targets: `powerpc-unknown-linux-gnu`
- Operating System > Target OS = linux
- Operating System > Linux kernel version = 2.6.32.68 -- ~RHEL6 kernel
- C-library > glibc version = 2.12.2 -- ~RHEL6 glibc
- C compiler > gcc version = 4.9.3
- C compiler > gcc version = 5.2.0
- C compiler > C++ = ENABLE -- to cross compile LLVM
### `powerpc64-linux-gnu.config`
@ -221,7 +227,7 @@ For targets: `powerpc64-unknown-linux-gnu`
- Operating System > Target OS = linux
- Operating System > Linux kernel version = 2.6.32.68 -- ~RHEL6 kernel
- C-library > glibc version = 2.12.2 -- ~RHEL6 glibc
- C compiler > gcc version = 4.9.3
- C compiler > gcc version = 5.2.0
- C compiler > C++ = ENABLE -- to cross compile LLVM
(+) These CPU options match the configuration of the toolchains in RHEL6.
@ -232,12 +238,12 @@ For targets: `s390x-unknown-linux-gnu`
- Path and misc options > Prefix directory = /x-tools/${CT\_TARGET}
- Path and misc options > Patches origin = Bundled, then local
- Path and misc options > Local patch directory = /build/patches
- Path and misc options > Local patch directory = /tmp/patches
- Target options > Target Architecture = s390
- Target options > Bitness = 64-bit
- Operating System > Target OS = linux
- Operating System > Linux kernel version = 2.6.32.68 -- ~RHEL6 kernel
- C-library > glibc version = 2.12.2 -- ~RHEL6 glibc
- C compiler > gcc version = 4.9.3
- C compiler > gcc version = 5.2.0
- C compiler > gcc extra config = --with-arch=z10 -- LLVM's minimum support
- C compiler > C++ = ENABLE -- to cross compile LLVM

View File

@ -23,7 +23,7 @@ RUN dpkg --add-architecture i386 && \
COPY scripts/android-sdk.sh /scripts/
RUN . /scripts/android-sdk.sh && \
download_and_create_avd 4333796 armeabi-v7a 18
download_and_create_avd 4333796 armeabi-v7a 18 5264690
ENV PATH=$PATH:/android/sdk/emulator
ENV PATH=$PATH:/android/sdk/tools

View File

@ -71,7 +71,8 @@ COPY scripts/qemu-bare-bones-addentropy.c /tmp/addentropy.c
RUN arm-linux-gnueabihf-gcc addentropy.c -o rootfs/addentropy -static
# TODO: What is this?!
RUN curl -O http://ftp.nl.debian.org/debian/dists/jessie/main/installer-armhf/current/images/device-tree/vexpress-v2p-ca15-tc1.dtb
# Source of the file: https://github.com/vfdev-5/qemu-rpi2-vexpress/raw/master/vexpress-v2p-ca15-tc1.dtb
RUN curl -O https://s3-us-west-1.amazonaws.com/rust-lang-ci2/rust-ci-mirror/vexpress-v2p-ca15-tc1.dtb
COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh

View File

@ -7,8 +7,8 @@ COPY scripts/crosstool-ng.sh /scripts/
RUN sh /scripts/crosstool-ng.sh
WORKDIR /tmp
COPY cross/install-x86_64-redox.sh /tmp/
RUN ./install-x86_64-redox.sh
COPY dist-various-1/install-x86_64-redox.sh /scripts/
RUN sh /scripts/install-x86_64-redox.sh
COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh

View File

@ -16,6 +16,7 @@ RUN sh /scripts/rustbuild-setup.sh
USER rustbuild
WORKDIR /tmp
COPY dist-arm-linux/patches/ /tmp/patches/
COPY dist-arm-linux/arm-linux-gnueabi.config dist-arm-linux/build-toolchains.sh /tmp/
RUN ./build-toolchains.sh

View File

@ -3,6 +3,7 @@
# Crosstool-NG Configuration
#
CT_CONFIGURE_has_make381=y
CT_CONFIGURE_has_xz=y
CT_MODULES=y
#
@ -44,14 +45,16 @@ CT_CONNECT_TIMEOUT=10
# CT_FORCE_EXTRACT is not set
CT_OVERIDE_CONFIG_GUESS_SUB=y
# CT_ONLY_EXTRACT is not set
CT_PATCH_BUNDLED=y
# CT_PATCH_BUNDLED is not set
# CT_PATCH_LOCAL is not set
# CT_PATCH_BUNDLED_LOCAL is not set
CT_PATCH_BUNDLED_LOCAL=y
# CT_PATCH_LOCAL_BUNDLED is not set
# CT_PATCH_BUNDLED_FALLBACK_LOCAL is not set
# CT_PATCH_LOCAL_FALLBACK_BUNDLED is not set
# CT_PATCH_NONE is not set
CT_PATCH_ORDER="bundled"
CT_PATCH_ORDER="bundled,local"
CT_PATCH_USE_LOCAL=y
CT_LOCAL_PATCH_DIR="/tmp/patches"
#
# Build behavior
@ -391,8 +394,8 @@ CT_CC_CORE_PASS_1_NEEDED=y
CT_CC_CORE_PASS_2_NEEDED=y
CT_CC_gcc=y
# CT_CC_GCC_SHOW_LINARO is not set
# CT_CC_GCC_V_5_2_0 is not set
CT_CC_GCC_V_4_9_3=y
CT_CC_GCC_V_5_2_0=y
# CT_CC_GCC_V_4_9_3 is not set
# CT_CC_GCC_V_4_8_5 is not set
# CT_CC_GCC_V_4_7_4 is not set
# CT_CC_GCC_V_4_6_4 is not set
@ -407,8 +410,9 @@ CT_CC_GCC_4_5_or_later=y
CT_CC_GCC_4_6_or_later=y
CT_CC_GCC_4_7_or_later=y
CT_CC_GCC_4_8_or_later=y
CT_CC_GCC_4_9=y
CT_CC_GCC_4_9_or_later=y
CT_CC_GCC_5=y
CT_CC_GCC_5_or_later=y
CT_CC_GCC_HAS_GRAPHITE=y
CT_CC_GCC_USE_GRAPHITE=y
CT_CC_GCC_HAS_LTO=y
@ -420,7 +424,7 @@ CT_CC_GCC_USE_GMP_MPFR=y
CT_CC_GCC_USE_MPC=y
CT_CC_GCC_HAS_LIBQUADMATH=y
CT_CC_GCC_HAS_LIBSANITIZER=y
CT_CC_GCC_VERSION="4.9.3"
CT_CC_GCC_VERSION="5.2.0"
# CT_CC_LANG_FORTRAN is not set
CT_CC_GCC_ENABLE_CXX_FLAGS=""
CT_CC_GCC_CORE_EXTRA_CONFIG_ARRAY=""
@ -492,7 +496,6 @@ CT_GETTEXT_NEEDED=y
CT_GMP_NEEDED=y
CT_MPFR_NEEDED=y
CT_ISL_NEEDED=y
CT_CLOOG_NEEDED=y
CT_MPC_NEEDED=y
CT_COMPLIBS=y
CT_LIBICONV=y
@ -500,7 +503,6 @@ CT_GETTEXT=y
CT_GMP=y
CT_MPFR=y
CT_ISL=y
CT_CLOOG=y
CT_MPC=y
CT_LIBICONV_V_1_14=y
CT_LIBICONV_VERSION="1.14"
@ -526,15 +528,13 @@ CT_MPFR_V_3_1_3=y
# CT_MPFR_V_2_4_0 is not set
CT_MPFR_VERSION="3.1.3"
CT_ISL_V_0_14=y
# CT_ISL_V_0_12_2 is not set
CT_ISL_V_0_14_or_later=y
CT_ISL_V_0_12_or_later=y
CT_ISL_VERSION="0.14"
CT_CLOOG_V_0_18_4=y
# CT_CLOOG_V_0_18_4 is not set
# CT_CLOOG_V_0_18_1 is not set
# CT_CLOOG_V_0_18_0 is not set
CT_CLOOG_VERSION="0.18.4"
CT_CLOOG_0_18_4_or_later=y
CT_CLOOG_0_18_or_later=y
CT_MPC_V_1_0_3=y
# CT_MPC_V_1_0_2 is not set
# CT_MPC_V_1_0_1 is not set

View File

@ -0,0 +1,48 @@
commit bdb24c2851fd5f0ad9b82d7ea1db911d334b02d2
Author: Joseph Myers <joseph@codesourcery.com>
Date: Tue May 20 21:27:13 2014 +0000
Fix ARM build with GCC trunk.
sysdeps/unix/sysv/linux/arm/unwind-resume.c and
sysdeps/unix/sysv/linux/arm/unwind-forcedunwind.c have static
variables that are written in C code but only read from toplevel asms.
Current GCC trunk now optimizes away such apparently write-only static
variables, so causing a build failure. This patch marks those
variables with __attribute_used__ to avoid that optimization.
Tested that this fixes the build for ARM.
* sysdeps/unix/sysv/linux/arm/unwind-forcedunwind.c
(libgcc_s_resume): Use __attribute_used__.
* sysdeps/unix/sysv/linux/arm/unwind-resume.c (libgcc_s_resume):
Likewise.
diff --git a/sysdeps/unix/sysv/linux/arm/nptl/unwind-forcedunwind.c b/sysdeps/unix/sysv/linux/arm/nptl/unwind-forcedunwind.c
index 29e2c2b00b04..e848bfeffdcb 100644
--- a/ports/sysdeps/unix/sysv/linux/arm/nptl/unwind-forcedunwind.c
+++ b/ports/sysdeps/unix/sysv/linux/arm/nptl/unwind-forcedunwind.c
@@ -22,7 +22,8 @@
#include <pthreadP.h>
static void *libgcc_s_handle;
-static void (*libgcc_s_resume) (struct _Unwind_Exception *exc);
+static void (*libgcc_s_resume) (struct _Unwind_Exception *exc)
+ __attribute_used__;
static _Unwind_Reason_Code (*libgcc_s_personality)
(_Unwind_State, struct _Unwind_Exception *, struct _Unwind_Context *);
static _Unwind_Reason_Code (*libgcc_s_forcedunwind)
diff --git a/sysdeps/unix/sysv/linux/arm/nptl/unwind-resume.c b/sysdeps/unix/sysv/linux/arm/nptl/unwind-resume.c
index 285b99b5ed0d..48d00fc83641 100644
--- a/ports/sysdeps/unix/sysv/linux/arm/nptl/unwind-resume.c
+++ b/ports/sysdeps/unix/sysv/linux/arm/nptl/unwind-resume.c
@@ -20,7 +20,8 @@
#include <stdio.h>
#include <unwind.h>
-static void (*libgcc_s_resume) (struct _Unwind_Exception *exc);
+static void (*libgcc_s_resume) (struct _Unwind_Exception *exc)
+ __attribute_used__;
static _Unwind_Reason_Code (*libgcc_s_personality)
(_Unwind_State, struct _Unwind_Exception *, struct _Unwind_Context *);

View File

@ -16,6 +16,7 @@ RUN sh /scripts/rustbuild-setup.sh
USER rustbuild
WORKDIR /tmp
COPY dist-armhf-linux/patches/ /tmp/patches/
COPY dist-armhf-linux/arm-linux-gnueabihf.config dist-armhf-linux/build-toolchains.sh /tmp/
RUN ./build-toolchains.sh

View File

@ -3,6 +3,7 @@
# Crosstool-NG Configuration
#
CT_CONFIGURE_has_make381=y
CT_CONFIGURE_has_xz=y
CT_MODULES=y
#
@ -44,14 +45,16 @@ CT_CONNECT_TIMEOUT=10
# CT_FORCE_EXTRACT is not set
CT_OVERIDE_CONFIG_GUESS_SUB=y
# CT_ONLY_EXTRACT is not set
CT_PATCH_BUNDLED=y
# CT_PATCH_BUNDLED is not set
# CT_PATCH_LOCAL is not set
# CT_PATCH_BUNDLED_LOCAL is not set
CT_PATCH_BUNDLED_LOCAL=y
# CT_PATCH_LOCAL_BUNDLED is not set
# CT_PATCH_BUNDLED_FALLBACK_LOCAL is not set
# CT_PATCH_LOCAL_FALLBACK_BUNDLED is not set
# CT_PATCH_NONE is not set
CT_PATCH_ORDER="bundled"
CT_PATCH_ORDER="bundled,local"
CT_PATCH_USE_LOCAL=y
CT_LOCAL_PATCH_DIR="/tmp/patches"
#
# Build behavior
@ -392,8 +395,8 @@ CT_CC_CORE_PASS_1_NEEDED=y
CT_CC_CORE_PASS_2_NEEDED=y
CT_CC_gcc=y
# CT_CC_GCC_SHOW_LINARO is not set
# CT_CC_GCC_V_5_2_0 is not set
CT_CC_GCC_V_4_9_3=y
CT_CC_GCC_V_5_2_0=y
# CT_CC_GCC_V_4_9_3 is not set
# CT_CC_GCC_V_4_8_5 is not set
# CT_CC_GCC_V_4_7_4 is not set
# CT_CC_GCC_V_4_6_4 is not set
@ -408,8 +411,9 @@ CT_CC_GCC_4_5_or_later=y
CT_CC_GCC_4_6_or_later=y
CT_CC_GCC_4_7_or_later=y
CT_CC_GCC_4_8_or_later=y
CT_CC_GCC_4_9=y
CT_CC_GCC_4_9_or_later=y
CT_CC_GCC_5=y
CT_CC_GCC_5_or_later=y
CT_CC_GCC_HAS_GRAPHITE=y
CT_CC_GCC_USE_GRAPHITE=y
CT_CC_GCC_HAS_LTO=y
@ -421,7 +425,7 @@ CT_CC_GCC_USE_GMP_MPFR=y
CT_CC_GCC_USE_MPC=y
CT_CC_GCC_HAS_LIBQUADMATH=y
CT_CC_GCC_HAS_LIBSANITIZER=y
CT_CC_GCC_VERSION="4.9.3"
CT_CC_GCC_VERSION="5.2.0"
# CT_CC_LANG_FORTRAN is not set
CT_CC_GCC_ENABLE_CXX_FLAGS=""
CT_CC_GCC_CORE_EXTRA_CONFIG_ARRAY=""
@ -493,7 +497,6 @@ CT_GETTEXT_NEEDED=y
CT_GMP_NEEDED=y
CT_MPFR_NEEDED=y
CT_ISL_NEEDED=y
CT_CLOOG_NEEDED=y
CT_MPC_NEEDED=y
CT_COMPLIBS=y
CT_LIBICONV=y
@ -501,7 +504,6 @@ CT_GETTEXT=y
CT_GMP=y
CT_MPFR=y
CT_ISL=y
CT_CLOOG=y
CT_MPC=y
CT_LIBICONV_V_1_14=y
CT_LIBICONV_VERSION="1.14"
@ -527,15 +529,13 @@ CT_MPFR_V_3_1_3=y
# CT_MPFR_V_2_4_0 is not set
CT_MPFR_VERSION="3.1.3"
CT_ISL_V_0_14=y
# CT_ISL_V_0_12_2 is not set
CT_ISL_V_0_14_or_later=y
CT_ISL_V_0_12_or_later=y
CT_ISL_VERSION="0.14"
CT_CLOOG_V_0_18_4=y
# CT_CLOOG_V_0_18_4 is not set
# CT_CLOOG_V_0_18_1 is not set
# CT_CLOOG_V_0_18_0 is not set
CT_CLOOG_VERSION="0.18.4"
CT_CLOOG_0_18_4_or_later=y
CT_CLOOG_0_18_or_later=y
CT_MPC_V_1_0_3=y
# CT_MPC_V_1_0_2 is not set
# CT_MPC_V_1_0_1 is not set

View File

@ -0,0 +1,48 @@
commit bdb24c2851fd5f0ad9b82d7ea1db911d334b02d2
Author: Joseph Myers <joseph@codesourcery.com>
Date: Tue May 20 21:27:13 2014 +0000
Fix ARM build with GCC trunk.
sysdeps/unix/sysv/linux/arm/unwind-resume.c and
sysdeps/unix/sysv/linux/arm/unwind-forcedunwind.c have static
variables that are written in C code but only read from toplevel asms.
Current GCC trunk now optimizes away such apparently write-only static
variables, so causing a build failure. This patch marks those
variables with __attribute_used__ to avoid that optimization.
Tested that this fixes the build for ARM.
* sysdeps/unix/sysv/linux/arm/unwind-forcedunwind.c
(libgcc_s_resume): Use __attribute_used__.
* sysdeps/unix/sysv/linux/arm/unwind-resume.c (libgcc_s_resume):
Likewise.
diff --git a/sysdeps/unix/sysv/linux/arm/nptl/unwind-forcedunwind.c b/sysdeps/unix/sysv/linux/arm/nptl/unwind-forcedunwind.c
index 29e2c2b00b04..e848bfeffdcb 100644
--- a/ports/sysdeps/unix/sysv/linux/arm/nptl/unwind-forcedunwind.c
+++ b/ports/sysdeps/unix/sysv/linux/arm/nptl/unwind-forcedunwind.c
@@ -22,7 +22,8 @@
#include <pthreadP.h>
static void *libgcc_s_handle;
-static void (*libgcc_s_resume) (struct _Unwind_Exception *exc);
+static void (*libgcc_s_resume) (struct _Unwind_Exception *exc)
+ __attribute_used__;
static _Unwind_Reason_Code (*libgcc_s_personality)
(_Unwind_State, struct _Unwind_Exception *, struct _Unwind_Context *);
static _Unwind_Reason_Code (*libgcc_s_forcedunwind)
diff --git a/sysdeps/unix/sysv/linux/arm/nptl/unwind-resume.c b/sysdeps/unix/sysv/linux/arm/nptl/unwind-resume.c
index 285b99b5ed0d..48d00fc83641 100644
--- a/ports/sysdeps/unix/sysv/linux/arm/nptl/unwind-resume.c
+++ b/ports/sysdeps/unix/sysv/linux/arm/nptl/unwind-resume.c
@@ -20,7 +20,8 @@
#include <stdio.h>
#include <unwind.h>
-static void (*libgcc_s_resume) (struct _Unwind_Exception *exc);
+static void (*libgcc_s_resume) (struct _Unwind_Exception *exc)
+ __attribute_used__;
static _Unwind_Reason_Code (*libgcc_s_personality)
(_Unwind_State, struct _Unwind_Exception *, struct _Unwind_Context *);

View File

@ -16,6 +16,7 @@ RUN sh /scripts/rustbuild-setup.sh
USER rustbuild
WORKDIR /tmp
COPY dist-armv7-linux/patches/ /tmp/patches/
COPY dist-armv7-linux/build-toolchains.sh dist-armv7-linux/armv7-linux-gnueabihf.config /tmp/
RUN ./build-toolchains.sh

View File

@ -3,6 +3,7 @@
# Crosstool-NG Configuration
#
CT_CONFIGURE_has_make381=y
CT_CONFIGURE_has_xz=y
CT_MODULES=y
#
@ -44,14 +45,16 @@ CT_CONNECT_TIMEOUT=10
# CT_FORCE_EXTRACT is not set
CT_OVERIDE_CONFIG_GUESS_SUB=y
# CT_ONLY_EXTRACT is not set
CT_PATCH_BUNDLED=y
# CT_PATCH_BUNDLED is not set
# CT_PATCH_LOCAL is not set
# CT_PATCH_BUNDLED_LOCAL is not set
CT_PATCH_BUNDLED_LOCAL=y
# CT_PATCH_LOCAL_BUNDLED is not set
# CT_PATCH_BUNDLED_FALLBACK_LOCAL is not set
# CT_PATCH_LOCAL_FALLBACK_BUNDLED is not set
# CT_PATCH_NONE is not set
CT_PATCH_ORDER="bundled"
CT_PATCH_ORDER="bundled,local"
CT_PATCH_USE_LOCAL=y
CT_LOCAL_PATCH_DIR="/tmp/patches"
#
# Build behavior
@ -155,12 +158,6 @@ CT_ARCH_EXCLUSIVE_WITH_CPU=y
# CT_ARCH_FLOAT_AUTO is not set
# CT_ARCH_FLOAT_SOFTFP is not set
CT_ARCH_FLOAT="hard"
# CT_ARCH_ALPHA_EV4 is not set
# CT_ARCH_ALPHA_EV45 is not set
# CT_ARCH_ALPHA_EV5 is not set
# CT_ARCH_ALPHA_EV56 is not set
# CT_ARCH_ALPHA_EV6 is not set
# CT_ARCH_ALPHA_EV67 is not set
#
# arm other options
@ -311,8 +308,6 @@ CT_LIBC="glibc"
CT_LIBC_VERSION="2.16.0"
CT_LIBC_glibc=y
# CT_LIBC_musl is not set
# CT_LIBC_newlib is not set
# CT_LIBC_none is not set
# CT_LIBC_uClibc is not set
CT_LIBC_avr_libc_AVAILABLE=y
CT_LIBC_glibc_AVAILABLE=y
@ -400,8 +395,8 @@ CT_CC_CORE_PASS_1_NEEDED=y
CT_CC_CORE_PASS_2_NEEDED=y
CT_CC_gcc=y
# CT_CC_GCC_SHOW_LINARO is not set
# CT_CC_GCC_V_5_2_0 is not set
CT_CC_GCC_V_4_9_3=y
CT_CC_GCC_V_5_2_0=y
# CT_CC_GCC_V_4_9_3 is not set
# CT_CC_GCC_V_4_8_5 is not set
# CT_CC_GCC_V_4_7_4 is not set
# CT_CC_GCC_V_4_6_4 is not set
@ -416,8 +411,9 @@ CT_CC_GCC_4_5_or_later=y
CT_CC_GCC_4_6_or_later=y
CT_CC_GCC_4_7_or_later=y
CT_CC_GCC_4_8_or_later=y
CT_CC_GCC_4_9=y
CT_CC_GCC_4_9_or_later=y
CT_CC_GCC_5=y
CT_CC_GCC_5_or_later=y
CT_CC_GCC_HAS_GRAPHITE=y
CT_CC_GCC_USE_GRAPHITE=y
CT_CC_GCC_HAS_LTO=y
@ -429,7 +425,7 @@ CT_CC_GCC_USE_GMP_MPFR=y
CT_CC_GCC_USE_MPC=y
CT_CC_GCC_HAS_LIBQUADMATH=y
CT_CC_GCC_HAS_LIBSANITIZER=y
CT_CC_GCC_VERSION="4.9.3"
CT_CC_GCC_VERSION="5.2.0"
# CT_CC_LANG_FORTRAN is not set
CT_CC_GCC_ENABLE_CXX_FLAGS=""
CT_CC_GCC_CORE_EXTRA_CONFIG_ARRAY=""
@ -501,7 +497,6 @@ CT_GETTEXT_NEEDED=y
CT_GMP_NEEDED=y
CT_MPFR_NEEDED=y
CT_ISL_NEEDED=y
CT_CLOOG_NEEDED=y
CT_MPC_NEEDED=y
CT_COMPLIBS=y
CT_LIBICONV=y
@ -509,7 +504,6 @@ CT_GETTEXT=y
CT_GMP=y
CT_MPFR=y
CT_ISL=y
CT_CLOOG=y
CT_MPC=y
CT_LIBICONV_V_1_14=y
CT_LIBICONV_VERSION="1.14"
@ -535,15 +529,13 @@ CT_MPFR_V_3_1_3=y
# CT_MPFR_V_2_4_0 is not set
CT_MPFR_VERSION="3.1.3"
CT_ISL_V_0_14=y
# CT_ISL_V_0_12_2 is not set
CT_ISL_V_0_14_or_later=y
CT_ISL_V_0_12_or_later=y
CT_ISL_VERSION="0.14"
CT_CLOOG_V_0_18_4=y
# CT_CLOOG_V_0_18_4 is not set
# CT_CLOOG_V_0_18_1 is not set
# CT_CLOOG_V_0_18_0 is not set
CT_CLOOG_VERSION="0.18.4"
CT_CLOOG_0_18_4_or_later=y
CT_CLOOG_0_18_or_later=y
CT_MPC_V_1_0_3=y
# CT_MPC_V_1_0_2 is not set
# CT_MPC_V_1_0_1 is not set

View File

@ -0,0 +1,48 @@
commit bdb24c2851fd5f0ad9b82d7ea1db911d334b02d2
Author: Joseph Myers <joseph@codesourcery.com>
Date: Tue May 20 21:27:13 2014 +0000
Fix ARM build with GCC trunk.
sysdeps/unix/sysv/linux/arm/unwind-resume.c and
sysdeps/unix/sysv/linux/arm/unwind-forcedunwind.c have static
variables that are written in C code but only read from toplevel asms.
Current GCC trunk now optimizes away such apparently write-only static
variables, so causing a build failure. This patch marks those
variables with __attribute_used__ to avoid that optimization.
Tested that this fixes the build for ARM.
* sysdeps/unix/sysv/linux/arm/unwind-forcedunwind.c
(libgcc_s_resume): Use __attribute_used__.
* sysdeps/unix/sysv/linux/arm/unwind-resume.c (libgcc_s_resume):
Likewise.
diff --git a/sysdeps/unix/sysv/linux/arm/nptl/unwind-forcedunwind.c b/sysdeps/unix/sysv/linux/arm/nptl/unwind-forcedunwind.c
index 29e2c2b00b04..e848bfeffdcb 100644
--- a/ports/sysdeps/unix/sysv/linux/arm/nptl/unwind-forcedunwind.c
+++ b/ports/sysdeps/unix/sysv/linux/arm/nptl/unwind-forcedunwind.c
@@ -22,7 +22,8 @@
#include <pthreadP.h>
static void *libgcc_s_handle;
-static void (*libgcc_s_resume) (struct _Unwind_Exception *exc);
+static void (*libgcc_s_resume) (struct _Unwind_Exception *exc)
+ __attribute_used__;
static _Unwind_Reason_Code (*libgcc_s_personality)
(_Unwind_State, struct _Unwind_Exception *, struct _Unwind_Context *);
static _Unwind_Reason_Code (*libgcc_s_forcedunwind)
diff --git a/sysdeps/unix/sysv/linux/arm/nptl/unwind-resume.c b/sysdeps/unix/sysv/linux/arm/nptl/unwind-resume.c
index 285b99b5ed0d..48d00fc83641 100644
--- a/ports/sysdeps/unix/sysv/linux/arm/nptl/unwind-resume.c
+++ b/ports/sysdeps/unix/sysv/linux/arm/nptl/unwind-resume.c
@@ -20,7 +20,8 @@
#include <stdio.h>
#include <unwind.h>
-static void (*libgcc_s_resume) (struct _Unwind_Exception *exc);
+static void (*libgcc_s_resume) (struct _Unwind_Exception *exc)
+ __attribute_used__;
static _Unwind_Reason_Code (*libgcc_s_personality)
(_Unwind_State, struct _Unwind_Exception *, struct _Unwind_Context *);

View File

@ -0,0 +1,26 @@
diff --git a/configure b/configure
index b6752d147c6b..6089a3403410 100755
--- a/configure
+++ b/configure
@@ -5079,7 +5079,7 @@ $as_echo_n "checking version of $CC... " >&6; }
ac_prog_version=`$CC -v 2>&1 | sed -n 's/^.*version \([egcygnustpi-]*[0-9.]*\).*$/\1/p'`
case $ac_prog_version in
'') ac_prog_version="v. ?.??, bad"; ac_verc_fail=yes;;
- 3.4* | 4.[0-9]* )
+ 3.4* | [4-9].* )
ac_prog_version="$ac_prog_version, ok"; ac_verc_fail=no;;
*) ac_prog_version="$ac_prog_version, bad"; ac_verc_fail=yes;;
diff --git a/configure.in b/configure.in
index 56849dfc489a..09677eb3d0c1 100644
--- a/configure.in
+++ b/configure.in
@@ -960,7 +960,7 @@ fi
# These programs are version sensitive.
AC_CHECK_TOOL_PREFIX
AC_CHECK_PROG_VER(CC, ${ac_tool_prefix}gcc ${ac_tool_prefix}cc, -v,
- [version \([egcygnustpi-]*[0-9.]*\)], [3.4* | 4.[0-9]* ],
+ [version \([egcygnustpi-]*[0-9.]*\)], [3.4* | [4-9].* ],
critic_missing="$critic_missing gcc")
AC_CHECK_PROG_VER(MAKE, gnumake gmake make, --version,
[GNU Make[^0-9]*\([0-9][0-9.]*\)],

View File

@ -359,8 +359,8 @@ CT_CC_CORE_PASS_1_NEEDED=y
CT_CC_CORE_PASS_2_NEEDED=y
CT_CC_gcc=y
# CT_CC_GCC_SHOW_LINARO is not set
# CT_CC_GCC_V_5_2_0 is not set
CT_CC_GCC_V_4_9_3=y
CT_CC_GCC_V_5_2_0=y
# CT_CC_GCC_V_4_9_3 is not set
# CT_CC_GCC_V_4_8_5 is not set
# CT_CC_GCC_V_4_7_4 is not set
# CT_CC_GCC_V_4_6_4 is not set
@ -375,8 +375,9 @@ CT_CC_GCC_4_5_or_later=y
CT_CC_GCC_4_6_or_later=y
CT_CC_GCC_4_7_or_later=y
CT_CC_GCC_4_8_or_later=y
CT_CC_GCC_4_9=y
CT_CC_GCC_4_9_or_later=y
CT_CC_GCC_5=y
CT_CC_GCC_5_or_later=y
CT_CC_GCC_HAS_GRAPHITE=y
CT_CC_GCC_USE_GRAPHITE=y
CT_CC_GCC_HAS_LTO=y
@ -388,7 +389,7 @@ CT_CC_GCC_USE_GMP_MPFR=y
CT_CC_GCC_USE_MPC=y
CT_CC_GCC_HAS_LIBQUADMATH=y
CT_CC_GCC_HAS_LIBSANITIZER=y
CT_CC_GCC_VERSION="4.9.3"
CT_CC_GCC_VERSION="5.2.0"
# CT_CC_LANG_FORTRAN is not set
CT_CC_GCC_ENABLE_CXX_FLAGS=""
CT_CC_GCC_CORE_EXTRA_CONFIG_ARRAY=""
@ -460,7 +461,6 @@ CT_GETTEXT_NEEDED=y
CT_GMP_NEEDED=y
CT_MPFR_NEEDED=y
CT_ISL_NEEDED=y
CT_CLOOG_NEEDED=y
CT_MPC_NEEDED=y
CT_COMPLIBS=y
CT_LIBICONV=y
@ -468,7 +468,6 @@ CT_GETTEXT=y
CT_GMP=y
CT_MPFR=y
CT_ISL=y
CT_CLOOG=y
CT_MPC=y
CT_LIBICONV_V_1_14=y
CT_LIBICONV_VERSION="1.14"
@ -494,15 +493,13 @@ CT_MPFR_V_3_1_3=y
# CT_MPFR_V_2_4_0 is not set
CT_MPFR_VERSION="3.1.3"
CT_ISL_V_0_14=y
# CT_ISL_V_0_12_2 is not set
CT_ISL_V_0_14_or_later=y
CT_ISL_V_0_12_or_later=y
CT_ISL_VERSION="0.14"
CT_CLOOG_V_0_18_4=y
# CT_CLOOG_V_0_18_4 is not set
# CT_CLOOG_V_0_18_1 is not set
# CT_CLOOG_V_0_18_0 is not set
CT_CLOOG_VERSION="0.18.4"
CT_CLOOG_0_18_4_or_later=y
CT_CLOOG_0_18_or_later=y
CT_MPC_V_1_0_3=y
# CT_MPC_V_1_0_2 is not set
# CT_MPC_V_1_0_1 is not set

View File

@ -0,0 +1,26 @@
diff --git a/configure b/configure
index b6752d147c6b..6089a3403410 100755
--- a/configure
+++ b/configure
@@ -5079,7 +5079,7 @@ $as_echo_n "checking version of $CC... " >&6; }
ac_prog_version=`$CC -v 2>&1 | sed -n 's/^.*version \([egcygnustpi-]*[0-9.]*\).*$/\1/p'`
case $ac_prog_version in
'') ac_prog_version="v. ?.??, bad"; ac_verc_fail=yes;;
- 3.4* | 4.[0-9]* )
+ 3.4* | [4-9].* )
ac_prog_version="$ac_prog_version, ok"; ac_verc_fail=no;;
*) ac_prog_version="$ac_prog_version, bad"; ac_verc_fail=yes;;
diff --git a/configure.in b/configure.in
index 56849dfc489a..09677eb3d0c1 100644
--- a/configure.in
+++ b/configure.in
@@ -960,7 +960,7 @@ fi
# These programs are version sensitive.
AC_CHECK_TOOL_PREFIX
AC_CHECK_PROG_VER(CC, ${ac_tool_prefix}gcc ${ac_tool_prefix}cc, -v,
- [version \([egcygnustpi-]*[0-9.]*\)], [3.4* | 4.[0-9]* ],
+ [version \([egcygnustpi-]*[0-9.]*\)], [3.4* | [4-9].* ],
critic_missing="$critic_missing gcc")
AC_CHECK_PROG_VER(MAKE, gnumake gmake make, --version,
[GNU Make[^0-9]*\([0-9][0-9.]*\)],


@ -359,8 +359,8 @@ CT_CC_CORE_PASS_1_NEEDED=y
CT_CC_CORE_PASS_2_NEEDED=y
CT_CC_gcc=y
# CT_CC_GCC_SHOW_LINARO is not set
# CT_CC_GCC_V_5_2_0 is not set
CT_CC_GCC_V_4_9_3=y
CT_CC_GCC_V_5_2_0=y
# CT_CC_GCC_V_4_9_3 is not set
# CT_CC_GCC_V_4_8_5 is not set
# CT_CC_GCC_V_4_7_4 is not set
# CT_CC_GCC_V_4_6_4 is not set
@ -375,8 +375,9 @@ CT_CC_GCC_4_5_or_later=y
CT_CC_GCC_4_6_or_later=y
CT_CC_GCC_4_7_or_later=y
CT_CC_GCC_4_8_or_later=y
CT_CC_GCC_4_9=y
CT_CC_GCC_4_9_or_later=y
CT_CC_GCC_5=y
CT_CC_GCC_5_or_later=y
CT_CC_GCC_HAS_GRAPHITE=y
CT_CC_GCC_USE_GRAPHITE=y
CT_CC_GCC_HAS_LTO=y
@ -388,7 +389,7 @@ CT_CC_GCC_USE_GMP_MPFR=y
CT_CC_GCC_USE_MPC=y
CT_CC_GCC_HAS_LIBQUADMATH=y
CT_CC_GCC_HAS_LIBSANITIZER=y
CT_CC_GCC_VERSION="4.9.3"
CT_CC_GCC_VERSION="5.2.0"
# CT_CC_LANG_FORTRAN is not set
CT_CC_GCC_ENABLE_CXX_FLAGS=""
CT_CC_GCC_CORE_EXTRA_CONFIG_ARRAY=""
@ -460,7 +461,6 @@ CT_GETTEXT_NEEDED=y
CT_GMP_NEEDED=y
CT_MPFR_NEEDED=y
CT_ISL_NEEDED=y
CT_CLOOG_NEEDED=y
CT_MPC_NEEDED=y
CT_COMPLIBS=y
CT_LIBICONV=y
@ -468,7 +468,6 @@ CT_GETTEXT=y
CT_GMP=y
CT_MPFR=y
CT_ISL=y
CT_CLOOG=y
CT_MPC=y
CT_LIBICONV_V_1_14=y
CT_LIBICONV_VERSION="1.14"
@ -494,15 +493,10 @@ CT_MPFR_V_3_1_3=y
# CT_MPFR_V_2_4_0 is not set
CT_MPFR_VERSION="3.1.3"
CT_ISL_V_0_14=y
# CT_ISL_V_0_12_2 is not set
CT_ISL_V_0_14_or_later=y
CT_ISL_V_0_12_or_later=y
CT_ISL_VERSION="0.14"
CT_CLOOG_V_0_18_4=y
# CT_CLOOG_V_0_18_1 is not set
# CT_CLOOG_V_0_18_0 is not set
CT_CLOOG_VERSION="0.18.4"
CT_CLOOG_0_18_4_or_later=y
CT_CLOOG_0_18_or_later=y
CT_MPC_V_1_0_3=y
# CT_MPC_V_1_0_2 is not set
# CT_MPC_V_1_0_1 is not set


@ -0,0 +1,26 @@
diff --git a/configure b/configure
index b6752d147c6b..6089a3403410 100755
--- a/configure
+++ b/configure
@@ -5079,7 +5079,7 @@ $as_echo_n "checking version of $CC... " >&6; }
ac_prog_version=`$CC -v 2>&1 | sed -n 's/^.*version \([egcygnustpi-]*[0-9.]*\).*$/\1/p'`
case $ac_prog_version in
'') ac_prog_version="v. ?.??, bad"; ac_verc_fail=yes;;
- 3.4* | 4.[0-9]* )
+ 3.4* | [4-9].* )
ac_prog_version="$ac_prog_version, ok"; ac_verc_fail=no;;
*) ac_prog_version="$ac_prog_version, bad"; ac_verc_fail=yes;;
diff --git a/configure.in b/configure.in
index 56849dfc489a..09677eb3d0c1 100644
--- a/configure.in
+++ b/configure.in
@@ -960,7 +960,7 @@ fi
# These programs are version sensitive.
AC_CHECK_TOOL_PREFIX
AC_CHECK_PROG_VER(CC, ${ac_tool_prefix}gcc ${ac_tool_prefix}cc, -v,
- [version \([egcygnustpi-]*[0-9.]*\)], [3.4* | 4.[0-9]* ],
+ [version \([egcygnustpi-]*[0-9.]*\)], [3.4* | [4-9].* ],
critic_missing="$critic_missing gcc")
AC_CHECK_PROG_VER(MAKE, gnumake gmake make, --version,
[GNU Make[^0-9]*\([0-9][0-9.]*\)],


@ -339,8 +339,8 @@ CT_CC_CORE_PASS_1_NEEDED=y
CT_CC_CORE_PASS_2_NEEDED=y
CT_CC_gcc=y
# CT_CC_GCC_SHOW_LINARO is not set
# CT_CC_GCC_V_5_2_0 is not set
CT_CC_GCC_V_4_9_3=y
CT_CC_GCC_V_5_2_0=y
# CT_CC_GCC_V_4_9_3 is not set
# CT_CC_GCC_V_4_8_5 is not set
# CT_CC_GCC_V_4_7_4 is not set
# CT_CC_GCC_V_4_6_4 is not set
@ -355,8 +355,9 @@ CT_CC_GCC_4_5_or_later=y
CT_CC_GCC_4_6_or_later=y
CT_CC_GCC_4_7_or_later=y
CT_CC_GCC_4_8_or_later=y
CT_CC_GCC_4_9=y
CT_CC_GCC_4_9_or_later=y
CT_CC_GCC_5=y
CT_CC_GCC_5_or_later=y
CT_CC_GCC_HAS_GRAPHITE=y
CT_CC_GCC_USE_GRAPHITE=y
CT_CC_GCC_HAS_LTO=y
@ -368,7 +369,7 @@ CT_CC_GCC_USE_GMP_MPFR=y
CT_CC_GCC_USE_MPC=y
CT_CC_GCC_HAS_LIBQUADMATH=y
CT_CC_GCC_HAS_LIBSANITIZER=y
CT_CC_GCC_VERSION="4.9.3"
CT_CC_GCC_VERSION="5.2.0"
# CT_CC_LANG_FORTRAN is not set
CT_CC_GCC_ENABLE_CXX_FLAGS=""
CT_CC_GCC_CORE_EXTRA_CONFIG_ARRAY=""
@ -440,7 +441,6 @@ CT_GETTEXT_NEEDED=y
CT_GMP_NEEDED=y
CT_MPFR_NEEDED=y
CT_ISL_NEEDED=y
CT_CLOOG_NEEDED=y
CT_MPC_NEEDED=y
CT_COMPLIBS=y
CT_LIBICONV=y
@ -448,7 +448,6 @@ CT_GETTEXT=y
CT_GMP=y
CT_MPFR=y
CT_ISL=y
CT_CLOOG=y
CT_MPC=y
CT_LIBICONV_V_1_14=y
CT_LIBICONV_VERSION="1.14"
@ -474,15 +473,13 @@ CT_MPFR_V_3_1_3=y
# CT_MPFR_V_2_4_0 is not set
CT_MPFR_VERSION="3.1.3"
CT_ISL_V_0_14=y
# CT_ISL_V_0_12_2 is not set
CT_ISL_V_0_14_or_later=y
CT_ISL_V_0_12_or_later=y
CT_ISL_VERSION="0.14"
CT_CLOOG_V_0_18_4=y
# CT_CLOOG_V_0_18_4 is not set
# CT_CLOOG_V_0_18_1 is not set
# CT_CLOOG_V_0_18_0 is not set
CT_CLOOG_VERSION="0.18.4"
CT_CLOOG_0_18_4_or_later=y
CT_CLOOG_0_18_or_later=y
CT_MPC_V_1_0_3=y
# CT_MPC_V_1_0_2 is not set
# CT_MPC_V_1_0_1 is not set


@ -112,6 +112,8 @@ ENV TARGETS=$TARGETS,thumbv7em-none-eabihf
ENV TARGETS=$TARGETS,thumbv8m.main-none-eabi
ENV TARGETS=$TARGETS,riscv32imc-unknown-none-elf
ENV TARGETS=$TARGETS,riscv32imac-unknown-none-elf
ENV TARGETS=$TARGETS,riscv64imac-unknown-none-elf
ENV TARGETS=$TARGETS,riscv64gc-unknown-none-elf
ENV TARGETS=$TARGETS,armebv7r-none-eabi
ENV TARGETS=$TARGETS,armebv7r-none-eabihf
ENV TARGETS=$TARGETS,armv7r-none-eabi


@ -3,11 +3,5 @@
set -ex
apt-get update
apt-get install -y --no-install-recommends software-properties-common apt-transport-https
apt-key adv --batch --yes --keyserver keyserver.ubuntu.com --recv-keys AA12E97F0881517F
add-apt-repository -y 'deb https://static.redox-os.org/toolchain/apt /'
apt-get update
apt-get install -y x86-64-unknown-redox-gcc
curl https://static.redox-os.org/toolchain/x86_64-unknown-redox/relibc-install.tar.gz | \
tar --extract --gzip --directory /usr/local


@ -32,7 +32,7 @@ RUN /tmp/build-solaris-toolchain.sh sparcv9 sparcv9 solaris-sparc
COPY dist-various-2/build-x86_64-fortanix-unknown-sgx-toolchain.sh /tmp/
# We pass the commit id of the port of LLVM's libunwind to the build script.
# Any update to the commit id here, should cause the container image to be re-built from this point on.
RUN /tmp/build-x86_64-fortanix-unknown-sgx-toolchain.sh "bbe23902411be88d7388f381becefadd6e3ef819"
RUN /tmp/build-x86_64-fortanix-unknown-sgx-toolchain.sh "53b586346f2c7870e20b170decdc30729d97c42b"
COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh
@ -70,6 +70,7 @@ ENV TARGETS=$TARGETS,x86_64-sun-solaris
ENV TARGETS=$TARGETS,x86_64-unknown-linux-gnux32
ENV TARGETS=$TARGETS,x86_64-unknown-cloudabi
ENV TARGETS=$TARGETS,x86_64-fortanix-unknown-sgx
ENV TARGETS=$TARGETS,nvptx64-nvidia-cuda
ENV X86_FORTANIX_SGX_LIBS="/x86_64-fortanix-unknown-sgx/lib/"


@ -32,9 +32,8 @@ ln -s ../lib/llvm-5.0/bin/lld /usr/bin/${target}-ld
ln -s ../../${target} /usr/lib/llvm-5.0/${target}
# Install the C++ runtime libraries from CloudABI Ports.
echo deb https://nuxi.nl/distfiles/cloudabi-ports/debian/ cloudabi cloudabi > \
/etc/apt/sources.list.d/cloudabi.list
curl 'https://pgp.mit.edu/pks/lookup?op=get&search=0x0DA51B8531344B15' | \
apt-key add -
apt-key adv --batch --yes --keyserver hkp://keyserver.ubuntu.com:80 --recv-keys 0DA51B8531344B15
add-apt-repository -y 'deb https://nuxi.nl/distfiles/cloudabi-ports/debian/ cloudabi cloudabi'
apt-get update
apt-get install -y $(echo ${target} | sed -e s/_/-/g)-cxx-runtime
apt-get install -y "${target//_/-}-cxx-runtime"


@ -12,8 +12,7 @@ target="x86_64-fortanix-unknown-sgx"
url="https://github.com/fortanix/llvm-project/archive/${1}.tar.gz"
repo_name="llvm-project"
install_prereq()
{
install_prereq() {
apt-get update
apt-get install -y --no-install-recommends \
build-essential \
@ -22,36 +21,32 @@ install_prereq()
git
}
# Clone Fortanix's port of llvm-project to build libunwind that would link with this target.
# The below method to download a single commit from llvm-project is based on fetch_submodule
# from init_repo.sh
fetch_llvm_commit()
{
cached="download-${repo_name}.tar.gz"
curl -f -sSL -o ${cached} ${url}
tar -xvzf ${cached}
mkdir "./${repo_name}" && tar -xf ${cached} -C ${repo_name} --strip-components 1
}
build_unwind()
{
build_unwind() {
set -x
dir_name="${target}_temp"
rm -rf "./${dir_name}"
rm -rf ${dir_name}
mkdir -p ${dir_name}
cd ${dir_name}
pushd ${dir_name}
retry fetch_llvm_commit
# Clone Fortanix's fork of llvm-project which has a port of libunwind
fetch_github_commit_archive "$repo_name" "$url"
cd "${repo_name}/libunwind"
# Build libunwind
mkdir -p build
cd build
cmake -DCMAKE_BUILD_TYPE="RELEASE" -DRUST_SGX=1 -G "Unix Makefiles" -DLLVM_PATH=../../llvm/ ../
cmake -DCMAKE_BUILD_TYPE="RELEASE" -DRUST_SGX=1 -G "Unix Makefiles" \
-DLLVM_ENABLE_WARNINGS=1 -DLIBUNWIND_ENABLE_WERROR=1 -DLIBUNWIND_ENABLE_PEDANTIC=0 \
-DLLVM_PATH=../../llvm/ ../
make unwind_static
install -D "lib/libunwind.a" "/${target}/lib/libunwind.a"
popd
rm -rf ${dir_name}
{ set +x; } 2>/dev/null
}
set -x
hide_output install_prereq
hide_output build_unwind
build_unwind


@ -1,5 +1,5 @@
hide_output() {
set +x
{ set +x; } 2>/dev/null
on_err="
echo ERROR: An error was encountered with the build.
cat /tmp/build.log
@ -14,6 +14,7 @@ exit 1
set -x
}
# Copied from ../../shared.sh
function retry {
echo "Attempting with retry:" "$@"
local n=1
@ -31,3 +32,15 @@ function retry {
}
done
}
# Copied from ../../init_repo.sh
function fetch_github_commit_archive {
local module=$1
local cached="download-${module//\//-}.tar.gz"
retry sh -c "rm -f $cached && \
curl -f -sSL -o $cached $2"
mkdir $module
touch "$module/.git"
tar -C $module --strip-components=1 -xf $cached
rm $cached
}


@ -4,26 +4,14 @@ set -ex
source shared.sh
# Currently these commits are all tip-of-tree as of 2018-12-16, used to pick up
# a fix for rust-lang/rust#56849
LLVM=032b00a5404865765cda7db3039f39d54964d8b0
LLD=3e4aa4e8671523321af51449e0569f455ef3ad43
CLANG=a6b9739069763243020f4ea6fe586bc135fde1f9
LLVM=llvmorg-8.0.0-rc2
mkdir clang
cd clang
mkdir llvm-project
cd llvm-project
curl -L https://github.com/llvm-mirror/llvm/archive/$LLVM.tar.gz | \
curl -L https://github.com/llvm/llvm-project/archive/$LLVM.tar.gz | \
tar xzf - --strip-components=1
mkdir -p tools/clang
curl -L https://github.com/llvm-mirror/clang/archive/$CLANG.tar.gz | \
tar xzf - --strip-components=1 -C tools/clang
mkdir -p tools/lld
curl -L https://github.com/llvm-mirror/lld/archive/$LLD.tar.gz | \
tar zxf - --strip-components=1 -C tools/lld
mkdir clang-build
cd clang-build
@ -39,20 +27,21 @@ cd clang-build
#
# [1]: https://sourceware.org/ml/crossgcc/2008-11/msg00028.html
INC="/rustroot/include"
INC="$INC:/rustroot/lib/gcc/x86_64-unknown-linux-gnu/4.8.5/include-fixed"
INC="$INC:/rustroot/lib/gcc/x86_64-unknown-linux-gnu/5.5.0/include-fixed"
INC="$INC:/usr/include"
hide_output \
cmake .. \
cmake ../llvm \
-DCMAKE_C_COMPILER=/rustroot/bin/gcc \
-DCMAKE_CXX_COMPILER=/rustroot/bin/g++ \
-DCMAKE_BUILD_TYPE=Release \
-DCMAKE_INSTALL_PREFIX=/rustroot \
-DLLVM_TARGETS_TO_BUILD=X86 \
-DLLVM_ENABLE_PROJECTS="clang;lld" \
-DC_INCLUDE_DIRS="$INC"
hide_output make -j10
hide_output make install
cd ../..
rm -rf clang
rm -rf llvm-project


@ -3,9 +3,9 @@ set -ex
source shared.sh
GCC=4.8.5
GCC=5.5.0
curl https://ftp.gnu.org/gnu/gcc/gcc-$GCC/gcc-$GCC.tar.bz2 | tar xjf -
curl https://ftp.gnu.org/gnu/gcc/gcc-$GCC/gcc-$GCC.tar.xz | xzcat | tar xf -
cd gcc-$GCC
# FIXME(#49246): Remove the `sed` below.


@ -11,7 +11,8 @@ cd perl-5.28.0
# Gotta do some hackery to tell python about our custom OpenSSL build, but other
# than that fairly normal.
CC=gcc \
CFLAGS='-I /rustroot/include' LDFLAGS='-L /rustroot/lib -L /rustroot/lib64' \
CFLAGS='-I /rustroot/include -fgnu89-inline' \
LDFLAGS='-L /rustroot/lib -L /rustroot/lib64' \
hide_output ./configure.gnu
hide_output make -j10
hide_output make install


@ -20,9 +20,9 @@ travis_time_start
if [ -f "$docker_dir/$image/Dockerfile" ]; then
if [ "$CI" != "" ]; then
hash_key=/tmp/.docker-hash-key.txt
find $docker_dir/$image $docker_dir/scripts -type f | \
sort | \
xargs cat >> $hash_key
rm -f "${hash_key}"
echo $image >> $hash_key
find $docker_dir -type f | sort | xargs cat >> $hash_key
docker --version >> $hash_key
cksum=$(sha512sum $hash_key | \
awk '{print $1}')
@ -31,7 +31,7 @@ if [ -f "$docker_dir/$image/Dockerfile" ]; then
echo "Attempting to download $s3url"
rm -f /tmp/rustci_docker_cache
set +e
retry curl -f -L -C - -o /tmp/rustci_docker_cache "$url"
retry curl -y 30 -Y 10 --connect-timeout 30 -f -L -C - -o /tmp/rustci_docker_cache "$url"
loaded_images=$(docker load -i /tmp/rustci_docker_cache | sed 's/.* sha/sha/')
set -e
echo "Downloaded containers:\n$loaded_images"


@ -20,11 +20,19 @@ download_sysimage() {
# The output from sdkmanager is so noisy that it will occupy all of the 4 MB
# log extremely quickly. Thus we must silence all output.
yes | sdkmanager --licenses > /dev/null
yes | sdkmanager platform-tools emulator \
yes | sdkmanager platform-tools \
"platforms;android-$api" \
"system-images;android-$api;default;$abi" > /dev/null
}
download_emulator() {
# Download a pinned version of the emulator since upgrades can cause issues
curl -fo emulator.zip "https://dl.google.com/android/repository/emulator-linux-$1.zip"
rm -rf "${ANDROID_HOME}/emulator"
unzip -q emulator.zip -d "${ANDROID_HOME}"
rm -f emulator.zip
}
create_avd() {
abi=$1
api=$2
@ -40,11 +48,12 @@ download_and_create_avd() {
download_sdk $1
download_sysimage $2 $3
create_avd $2 $3
download_emulator $4
}
# Usage:
#
# setup_android_sdk 4333796 armeabi-v7a 18
# download_and_create_avd 4333796 armeabi-v7a 18 5264690
#
# 4333796 =>
# SDK tool version.
@ -53,3 +62,6 @@ download_and_create_avd() {
# System image ABI
# 18 =>
# Android API Level (18 = Android 4.3 = Jelly Bean MR2)
# 5264690 =>
# Android Emulator version.
# Copy from the "build_id" in the `/android/sdk/emulator/emulator -version` output


@ -1,4 +1,6 @@
#!/bin/bash
# ignore-tidy-linelength
set -eux
arch=$1
@ -55,7 +57,9 @@ for lib in c++ c_nonshared compiler_rt execinfo gcc pthread rt ssp_nonshared; do
files_to_extract=("${files_to_extract[@]}" "./usr/lib/lib${lib}.*")
done
URL=https://download.freebsd.org/ftp/releases/${freebsd_arch}/${freebsd_version}-RELEASE/base.txz
# Originally downloaded from:
# https://download.freebsd.org/ftp/releases/${freebsd_arch}/${freebsd_version}-RELEASE/base.txz
URL=https://s3-us-west-1.amazonaws.com/rust-lang-ci2/rust-ci-mirror/2019-04-04-freebsd-${freebsd_arch}-${freebsd_version}-RELEASE-base.txz
curl "$URL" | tar xJf - -C "$sysroot" --wildcards "${files_to_extract[@]}"
# Fix up absolute symlinks from the system image. This can be removed


@ -13,14 +13,16 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
gdb \
xz-utils
# FIXME: build the `ptx-linker` instead.
RUN curl -sL https://github.com/denzp/rust-ptx-linker/releases/download/v0.9.0-alpha.2/rust-ptx-linker.linux64.tar.gz | \
tar -xzvC /usr/bin
RUN curl -sL https://nodejs.org/dist/v9.2.0/node-v9.2.0-linux-x64.tar.xz | \
tar -xJ
tar -xJ
COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh
ENV TARGETS=wasm32-unknown-unknown
ENV RUST_CONFIGURE_ARGS \
--set build.nodejs=/node-v9.2.0-linux-x64/bin/node \
--set rust.lld
@ -31,11 +33,18 @@ ENV RUST_CONFIGURE_ARGS \
# other contexts as well
ENV NO_DEBUG_ASSERTIONS=1
ENV SCRIPT python2.7 /checkout/x.py test --target $TARGETS \
ENV WASM_TARGETS=wasm32-unknown-unknown
ENV WASM_SCRIPT python2.7 /checkout/x.py test --target $WASM_TARGETS \
src/test/run-make \
src/test/ui \
src/test/run-pass \
src/test/compile-fail \
src/test/mir-opt \
src/test/codegen-units \
src/libcore \
src/libcore
ENV NVPTX_TARGETS=nvptx64-nvidia-cuda
ENV NVPTX_SCRIPT python2.7 /checkout/x.py test --target $NVPTX_TARGETS \
src/test/run-make
ENV SCRIPT $WASM_SCRIPT && $NVPTX_SCRIPT


@ -1,4 +1,4 @@
FROM ubuntu:16.04
FROM ubuntu:18.10
RUN apt-get update && apt-get install -y --no-install-recommends \
g++ \
@ -7,18 +7,37 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
curl \
ca-certificates \
python2.7 \
python2.7-dev \
libxml2-dev \
libncurses-dev \
libedit-dev \
swig \
doxygen \
git \
cmake \
sudo \
gdb \
xz-utils
xz-utils \
lld \
clang
COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh
ENV RUSTBUILD_FORCE_CLANG_BASED_TESTS 1
ENV RUN_CHECK_WITH_PARALLEL_QUERIES 1
ENV RUST_CONFIGURE_ARGS \
--build=x86_64-unknown-linux-gnu \
--enable-debug \
--enable-optimize
ENV SCRIPT python2.7 ../x.py build
--enable-lld \
--enable-lldb \
--enable-optimize \
--set llvm.use-linker=lld \
--set target.x86_64-unknown-linux-gnu.linker=clang \
--set target.x86_64-unknown-linux-gnu.cc=clang \
--set target.x86_64-unknown-linux-gnu.cxx=clang++
ENV SCRIPT \
python2.7 ../x.py build && \
python2.7 ../x.py test src/test/run-make-fulldeps --test-args clang


@ -23,6 +23,7 @@ python2.7 "$X_PY" test --no-fail-fast \
src/doc/nomicon \
src/doc/reference \
src/doc/rust-by-example \
src/doc/embedded-book \
src/tools/clippy \
src/tools/rls \
src/tools/rustfmt \


@ -34,19 +34,19 @@ if grep -q RUST_RELEASE_CHANNEL=beta src/ci/run.sh; then
git fetch origin --unshallow beta master
fi
function fetch_submodule {
# Duplicated in docker/dist-various-2/shared.sh
function fetch_github_commit_archive {
local module=$1
local cached="download-${module//\//-}.tar.gz"
retry sh -c "rm -f $cached && \
curl -sSL -o $cached $2"
curl -f -sSL -o $cached $2"
mkdir $module
touch "$module/.git"
tar -C $module --strip-components=1 -xf $cached
rm $cached
}
included="src/llvm src/llvm-emscripten src/doc/book src/doc/rust-by-example"
included="$included src/tools/lld src/tools/clang src/tools/lldb"
included="src/llvm-project src/llvm-emscripten src/doc/book src/doc/rust-by-example"
modules="$(git config --file .gitmodules --get-regexp '\.path$' | cut -d' ' -f2)"
modules=($modules)
use_git=""
@ -59,7 +59,7 @@ for i in ${!modules[@]}; do
git rm $module
url=${urls[$i]}
url=${url/\.git/}
fetch_submodule $module "$url/archive/$commit.tar.gz" &
fetch_github_commit_archive $module "$url/archive/$commit.tar.gz" &
continue
else
use_git="$use_git $module"


@ -82,7 +82,7 @@ fi
SCCACHE_IDLE_TIMEOUT=10800 sccache --start-server || true
if [ "$RUN_CHECK_WITH_PARALLEL_QUERIES" != "" ]; then
$SRC/configure --enable-experimental-parallel-queries
$SRC/configure --enable-parallel-compiler
CARGO_INCREMENTAL=0 python2.7 ../x.py check
rm -f config.toml
rm -rf build


@ -5,6 +5,7 @@
# marked as an executable file in git.
# See http://unix.stackexchange.com/questions/82598
# Duplicated in docker/dist-various-2/shared.sh
function retry {
echo "Attempting with retry:" "$@"
local n=1


@ -1,22 +0,0 @@
## What to expect when you file an issue here
Thank you for caring about the quality of the book! Each edition has
different types of issues we can accept, please read on for details.
### 2018 edition
This version of the book is under development, please file issues liberally!
### Second edition
No Starch Press has brought the second edition to print. Bugs containing
factual errors will be documented as errata; bugs for wording changes or
other small corrections should be filed against the 2018 edition.
### First edition
The first edition of the book is frozen, and bugs filed against it will
be closed.
Thank you for reading, you may now delete this text!


@ -1,24 +0,0 @@
## What to expect when you open a pull request here
### 2018 Edition
The 2018 Edition is a "living" edition; it's not scheduled for in-print publication at
this time, and so is able to be updated at any time. We'd love pull requests to
fix issues with this edition, but we're not interested in extremely large
changes without discussing them first. If you'd like to make a big change,
please open an issue first! We'd hate for you to do some hard work that we
ultimately wouldn't accept.
### Second edition
No Starch Press has brought the second edition to print. Pull requests fixing
factual errors will be accepted and documented as errata; pull requests changing
wording or other small corrections should be made against the 2018 edition instead.
### First edition
The first edition is frozen, and no longer accepting changes. Pull requests
made against it will be closed.
Thank you for reading, you may now delete this text!


@ -3,7 +3,7 @@ dist: trusty
language: rust
cache: cargo
rust:
- beta # Change this to stable when Rust 1.31.0 is out
- stable
branches:
only:
- master


@ -1,13 +1,4 @@
#!/bin/bash
# Copyright 2017 The Rust Project Developers. See the COPYRIGHT
# file at the top-level directory of this distribution and at
# http://rust-lang.org/COPYRIGHT.
#
# Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
# http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
# <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
# option. This file may not be copied, modified, or distributed
# except according to those terms.
set -eu


@ -1,13 +1,4 @@
#!/bin/bash
# Copyright 2016 The Rust Project Developers. See the COPYRIGHT
# file at the top-level directory of this distribution and at
# http://rust-lang.org/COPYRIGHT.
#
# Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
# http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
# <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
# option. This file may not be copied, modified, or distributed
# except according to those terms.
set -eu
@ -16,18 +7,18 @@ cargo build --release
mkdir -p tmp
rm -rf tmp/*.md
# Get all the markdown files in the src dir,
# Get all the Markdown files in the src dir,
ls src/${1:-""}*.md | \
# except for SUMMARY.md.
# except for `SUMMARY.md`.
grep -v SUMMARY.md | \
# Extract just the filename so we can reuse it easily.
xargs -n 1 basename | \
# Remove all links followed by <!-- ignore -->, then
# Change all remaining links from markdown to italicized inline text.
# Remove all links followed by `<!-- ignore -->``, then
# Change all remaining links from Markdown to italicized inline text.
while IFS= read -r filename; do
< "src/$filename" ./target/release/remove_links \
| ./target/release/link2print \
| ./target/release/remove_markup > "tmp/$filename"
done
# Concat the files into the nostarch dir.
# Concatenate the files into the `nostarch` dir.
./target/release/concat_chapters tmp nostarch


@ -139,12 +139,12 @@ called crate roots because the contents of either of these two files form a
module named `crate` at the root of the crates module tree. So in Listing 7-2,
we have a module tree that looks like Listing 7-3:
```
```text
crate
└── sound
── instrument
└── woodwind
└── voice
└── sound
── instrument
└── woodwind
└── voice
```
Listing 7-3: The module tree for the code in Listing 7-2


@ -1,13 +1,3 @@
// Copyright 2012-2014 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
#[macro_use] extern crate lazy_static;
extern crate regex;


@ -1,13 +1,3 @@
// Copyright 2017 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
use std::io;
use std::io::{Read, Write};
@ -22,7 +12,6 @@ fn main() {
}
for line in buffer.lines() {
if line.is_empty() {
is_in_inline_code = false;
}


@ -1,20 +1,11 @@
// Copyright 2012-2014 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
// We have some long regex literals, so:
// ignore-tidy-linelength
extern crate rustc_serialize;
extern crate docopt;
use docopt::Docopt;
extern crate rustc_serialize;
extern crate walkdir;
use docopt::Docopt;
use std::{path, fs, io};
use std::io::{BufRead, Write};


@ -1,15 +1,4 @@
// Copyright 2012-2014 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
// FIXME: We have some long lines that could be refactored, but it's not a big deal.
// FIXME: we have some long lines that could be refactored, but it's not a big deal.
// ignore-tidy-linelength
extern crate regex;
@ -20,7 +9,6 @@ use std::io::{Read, Write};
use regex::{Regex, Captures};
fn main() {
write_md(parse_links(parse_references(read_md())));
}
@ -38,7 +26,7 @@ fn write_md(output: String) {
fn parse_references(buffer: String) -> (String, HashMap<String, String>) {
let mut ref_map = HashMap::new();
// FIXME: Currently doesn't handle "title" in following line
// FIXME: currently doesn't handle "title" in following line.
let re = Regex::new(r###"(?m)\n?^ {0,3}\[([^]]+)\]:[[:blank:]]*(.*)$"###).unwrap();
let output = re.replace_all(&buffer, |caps: &Captures| {
let key = caps.at(1).unwrap().to_owned().to_uppercase();
@ -52,7 +40,7 @@ fn parse_references(buffer: String) -> (String, HashMap<String, String>) {
}
fn parse_links((buffer, ref_map): (String, HashMap<String, String>)) -> String {
// FIXME: check which punctuation is allowed by spec
// FIXME: check which punctuation is allowed by spec.
let re = Regex::new(r###"(?:(?P<pre>(?:```(?:[^`]|`[^`])*`?\n```\n)|(?:[^[]`[^`\n]+[\n]?[^`\n]*`))|(?:\[(?P<name>[^]]+)\](?:(?:\([[:blank:]]*(?P<val>[^")]*[^ ])(?:[[:blank:]]*"[^"]*")?\))|(?:\[(?P<key>[^]]*)\]))?))"###).expect("could not create regex");
let error_code = Regex::new(r###"^E\d{4}$"###).expect("could not create regex");
let output = re.replace_all(&buffer, |caps: &Captures| {
@ -62,7 +50,7 @@ fn parse_links((buffer, ref_map): (String, HashMap<String, String>)) -> String {
let name = caps.name("name").expect("could not get name").to_owned();
// Really we should ignore text inside code blocks,
// this is a hack to not try to treat `#[derive()]`,
// `[profile]`, `[test]`, or `[E\d\d\d\d]` like a link
// `[profile]`, `[test]`, or `[E\d\d\d\d]` like a link.
if name.starts_with("derive(") ||
name.starts_with("profile") ||
name.starts_with("test") ||
@ -71,19 +59,19 @@ fn parse_links((buffer, ref_map): (String, HashMap<String, String>)) -> String {
}
let val = match caps.name("val") {
// [name](link)
// `[name](link)`
Some(value) => value.to_owned(),
None => {
match caps.name("key") {
Some(key) => {
match key {
// [name][]
// `[name][]`
"" => format!("{}", ref_map.get(&name.to_uppercase()).expect(&format!("could not find url for the link text `{}`", name))),
// [name][reference]
// `[name][reference]`
_ => format!("{}", ref_map.get(&key.to_uppercase()).expect(&format!("could not find url for the link text `{}`", key))),
}
}
// [name] as reference
// `[name]` as reference
None => format!("{}", ref_map.get(&name.to_uppercase()).expect(&format!("could not find url for the link text `{}`", name))),
}
}
@ -415,7 +403,4 @@ Some text to show that the reference links can follow later.
.to_string();
assert_eq!(parse(source), target);
}
}


@ -1,19 +1,9 @@
// Copyright 2012-2014 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
extern crate regex;
use std::collections::HashSet;
use std::io;
use std::io::{Read, Write};
use regex::{Regex, Captures};
use std::collections::HashSet;
fn main () {
let mut buffer = String::new();
@ -23,31 +13,31 @@ fn main () {
let mut refs = HashSet::new();
// capture all links and link references
// Capture all links and link references.
let regex = r"\[([^\]]+)\](?:(?:\[([^\]]+)\])|(?:\([^\)]+\)))(?i)<!-- ignore -->";
let link_regex = Regex::new(regex).unwrap();
let first_pass = link_regex.replace_all(&buffer, |caps: &Captures| {
// save the link reference we want to delete
// Save the link reference we want to delete.
if let Some(reference) = caps.at(2) {
refs.insert(reference.to_owned());
}
// put the link title back
// Put the link title back.
caps.at(1).unwrap().to_owned()
});
// search for the references we need to delete
// Search for the references we need to delete.
let ref_regex = Regex::new(r"\n\[([^\]]+)\]:\s.*\n").unwrap();
let out = ref_regex.replace_all(&first_pass, |caps: &Captures| {
let capture = caps.at(1).unwrap().to_owned();
// check if we've marked this reference for deletion...
// Check if we've marked this reference for deletion ...
if refs.contains(capture.as_str()) {
return "".to_string();
}
//... else we put back everything we captured
// ... else we put back everything we captured.
caps.at(0).unwrap().to_owned()
});


@ -1,14 +1,5 @@
// Copyright 2012-2014 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
extern crate regex;
use std::io;
use std::io::{Read, Write};
use regex::{Regex, Captures};
@ -31,23 +22,23 @@ fn write_md(output: String) {
fn remove_markup(input: String) -> String {
let filename_regex = Regex::new(r#"\A<span class="filename">(.*)</span>\z"#).unwrap();
// Captions sometimes take up multiple lines
// Captions sometimes take up multiple lines.
let caption_start_regex = Regex::new(r#"\A<span class="caption">(.*)\z"#).unwrap();
let caption_end_regex = Regex::new(r#"(.*)</span>\z"#).unwrap();
let regexen = vec![filename_regex, caption_start_regex, caption_end_regex];
let lines: Vec<_> = input.lines().flat_map(|line| {
// Remove our figure and caption markup
// Remove our figure and caption markup.
if line == "<figure>" ||
line == "<figcaption>" ||
line == "</figcaption>" ||
line == "</figure>"
{
None
// Remove our syntax highlighting and rustdoc markers
// Remove our syntax highlighting and rustdoc markers.
} else if line.starts_with("```") {
Some(String::from("```"))
// Remove the span around filenames and captions
// Remove the span around filenames and captions.
} else {
let result = regexen.iter().fold(line.to_string(), |result, regex| {
regex.replace_all(&result, |caps: &Captures| {


@ -10,7 +10,7 @@ repository.
## Code of Conduct
The Rust project has [a code of conduct](http://rust-lang.org/conduct.html)
The Rust project has [a code of conduct](http://rust-lang.org/policies/code-of-conduct)
that governs all sub-projects, including this one. Please respect it!
## Review
@ -33,37 +33,10 @@ enhance the book in some way!
## Translations
We'd especially love help translating the second edition of the book! See the
[Translations] label to join in efforts that are currently in progress. Open
a new issue to start working on a new language! We're waiting on [mdbook
support] for multiple languages before we merge any in, but feel free to
start! The second edition is frozen and won't see major changes, so if
you start with that, you won't have to redo work :)
We'd love help translating the book! See the [Translations] label to join in
efforts that are currently in progress. Open a new issue to start working on
a new language! We're waiting on [mdbook support] for multiple languages
before we merge any in, but feel free to start!
[Translations]: https://github.com/rust-lang/book/issues?q=is%3Aopen+is%3Aissue+label%3ATranslations
[mdbook support]: https://github.com/azerupi/mdBook/issues/5
## Edition specific details
Each edition of the book may be taking contributions, but only of certain
kinds depending on the edition. Read on to learn the details!
### Contributing to the 2018 Edition
The 2018 Edition is a "living" edition; it's not scheduled for in-print publication
at this time, and so is able to be updated at any time. We'd love pull
requests to fix issues with this edition, but we're not interested in
extremely large changes without discussing them first. If you'd like to make
a big change, please open an issue first! We'd hate for you to do some hard work
that we ultimately wouldn't accept.
## Contributing to the Second Edition
The second edition is completely frozen, and not accepting changes. It's
meant to be in sync with the print version available from No Starch
Press.
## Contributing to the First Edition
The first edition is completely frozen, and not accepting changes. It's
mostly kept around for history's sake.
[mdbook support]: https://github.com/azerupi/mdBook/issues/5


@ -3,6 +3,7 @@ name = "rust-book"
version = "0.0.1"
authors = ["Steve Klabnik <steve@steveklabnik.com>"]
description = "The Rust Book"
edition = "2018"
[[bin]]
name = "concat_chapters"


@ -2,19 +2,16 @@
[![Build Status](https://travis-ci.org/rust-lang/book.svg?branch=master)](https://travis-ci.org/rust-lang/book)
This repository contains the source of all editions of "the Rust Programming
Language".
This repository contains the source of "The Rust Programming Language" book.
The second edition will also be available in dead-tree form by No Starch
Press, available around June 2018. Check [the No Starch Page][nostarch] for
the latest information on the release date and how to order.
[The book is available in dead-tree form from No Starch Press][nostarch]
[nostarch]: https://nostarch.com/rust
You can read all editions of the book for free online! Please see the book as
shipped with the latest [stable], [beta], or [nightly] Rust releases. Be
aware that issues in those versions may have been fixed in this repository
already, as those releases are updated less frequently.
You can also read the book for free online. Please see the book as shipped with
the latest [stable], [beta], or [nightly] Rust releases. Be aware that issues
in those versions may have been fixed in this repository already, as those
releases are updated less frequently.
[stable]: https://doc.rust-lang.org/stable/book/
[beta]: https://doc.rust-lang.org/beta/book/
@ -34,9 +31,7 @@ $ cargo install mdbook --vers [version-num]
## Building
To build the book, first `cd` into the directory of the edition of the
book you'd like to build. For example, the `first-edition` or
`second-edition` directory. Then type:
To build the book, type:
```bash
$ mdbook build
@ -72,57 +67,18 @@ $ mdbook test
We'd love your help! Please see [CONTRIBUTING.md][contrib] to learn about the
kinds of contributions we're looking for.
### 2018 Edition
The "2018" Edition is in the process of being updated with the language changes
that will be available with the 2018 Edition of the Rust language. All new
contributions should be to this edition.
### Second Edition
No Starch Press has brought the second edition to print. Pull requests fixing
factual errors will be accepted and documented as errata; pull requests changing
wording or other small corrections should be made against the 2018 edition instead.
### First Edition
The first edition is frozen, and is not accepting any changes at this time.
[contrib]: https://github.com/rust-lang/book/blob/master/CONTRIBUTING.md
### Translations
We'd especially love help translating the second edition or 2018 edition of the book! See the
[Translations] label to join in efforts that are currently in progress. Open
a new issue to start working on a new language! We're waiting on [mdbook
support] for multiple languages before we merge any in, but feel free to
start! The second edition is frozen and won't see major
changes, so if you start with that, you won't have to redo work :)
We'd love help translating the book! See the [Translations] label to join in
efforts that are currently in progress. Open a new issue to start working on
a new language! We're waiting on [mdbook support] for multiple languages
before we merge any in, but feel free to start!
[Translations]: https://github.com/rust-lang/book/issues?q=is%3Aopen+is%3Aissue+label%3ATranslations
[mdbook support]: https://github.com/azerupi/mdBook/issues/5
## No Starch
As the second edition of the book will be published by No Starch, we first
iterate here, then ship the text off to No Starch. Then they do editing, and we
fold it back in.
As such, there's a directory, *nostarch*, which corresponds to the text in No
Starch's system.
When we've started working with No Starch in a word doc, we will also check
those into the repo in the *nostarch/odt* directory. To extract the text from
the word doc as markdown in order to backport changes to the online book:
1. Open the doc file in LibreOffice
1. Accept all tracked changes
1. Save as Microsoft Word 2007-2013 XML (.docx) in the *tmp* directory
1. Run `./doc-to-md.sh`
1. Inspect changes made to the markdown file in the *nostarch* directory and
copy the changes to the *src* directory as appropriate.
## Graphviz dot
We're using [Graphviz](http://graphviz.org/) for some of the diagrams in the

View File

@ -1,18 +1,10 @@
# Copyright 2016 The Rust Project Developers. See the COPYRIGHT
# file at the top-level directory of this distribution and at
# http://rust-lang.org/COPYRIGHT.
#
# Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
# http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
# <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
# option. This file may not be copied, modified, or distributed
# except according to those terms.
#!/bin/bash
set -e
export PATH=$PATH:/home/travis/.cargo/bin;
# feature check
# Feature check
cd ci/stable-check
cargo run -- ../../src


@ -21,6 +21,8 @@ args
associativity
async
atomics
attr
autocompletion
AveragedCollection
backend
backported
@ -28,7 +30,7 @@ backtrace
backtraces
BACKTRACE
Backtraces
Bazs
Baz's
benchmarking
bioinformatics
bitand
@ -49,6 +51,7 @@ Boolean
Booleans
Bors
BorrowMutError
BoxMeUp
BTreeSet
BuildHasher
Cacher
@ -62,12 +65,15 @@ charset
choo
chXX
chYY
clippy
clippy's
coercions
combinator
ConcreteType
config
Config
const
consts
constant's
copyeditor
couldn
@ -84,13 +90,16 @@ Ctrl
customizable
CustomSmartPointer
CustomSmartPointers
datas
data's
DataStruct
deallocate
deallocated
deallocating
deallocation
debuginfo
decrementing
deduplicate
deduplicating
deps
deref
Deref
@ -134,10 +143,11 @@ Enums
eprintln
Erlang
ErrorKind
Executables
executables
expr
extern
favicon
ferris
FFFD
FFFF
figcaption
@ -198,6 +208,7 @@ IndexMut
indices
init
initializer
initializers
inline
instantiation
internet
@ -207,6 +218,7 @@ invariants
ioerror
iokind
ioresult
IoResult
iostdin
IpAddr
IpAddrKind
@ -219,6 +231,7 @@ JoinHandle
Kay's
kinded
lang
LastWriteTime
latin
liballoc
libc
@ -271,7 +284,9 @@ Mutex
mutexes
Mutexes
MutexGuard
mutext
MyBox
myprogram
namespace
namespaced
namespaces
@ -306,6 +321,7 @@ other's
OutlinePrint
overloadable
overread
PanicPayload
param
parameterize
ParseIntError
@ -349,7 +365,9 @@ refactor
refactoring
refcell
RefCell
refcellt
RefMut
reformats
refutability
reimplement
RemAssign
@ -371,8 +389,10 @@ rUsT
rustc
rustdoc
Rustonomicon
rustfix
rustfmt
rustup
sampleproject
screenshot
searchstring
SecondaryColor
@ -384,8 +404,10 @@ ShlAssign
ShrAssign
shouldn
Simula
siphash
situps
sizeof
SliceIndex
Smalltalk
snuck
someproject
@ -411,6 +433,7 @@ Struct
structs
struct's
Structs
StrWrap
SubAssign
subclasses
subcommand
@ -425,6 +448,7 @@ substring
subteams
subtree
subtyping
summarizable
supertrait
supertraits
TcpListener
@ -463,6 +487,7 @@ unary
Unary
uncomment
Uncomment
uncommenting
unevaluated
Uninstalling
uninstall
@ -488,8 +513,10 @@ versa
Versioning
visualstudio
Vlissides
vscode
vtable
wasn
weakt
WeatherForecast
WebSocket
whitespace

src/doc/book/ci/spellcheck.sh Normal file → Executable file

@ -1,17 +1,8 @@
#!/bin/bash
# Copyright 2016 The Rust Project Developers. See the COPYRIGHT
# file at the top-level directory of this distribution and at
# http://rust-lang.org/COPYRIGHT.
#
# Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
# http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
# <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
# option. This file may not be copied, modified, or distributed
# except according to those terms.
aspell --version
# Checks project markdown files for spell errors
# Checks project Markdown files for spelling mistakes.
# Notes:
@ -41,18 +32,25 @@ aspell --version
shopt -s nullglob
dict_filename=./dictionary.txt
dict_filename=./ci/dictionary.txt
markdown_sources=(./src/*.md)
mode="check"
# aspell repeatedly modifies personal dictionary for some purpose,
# so we should use a copy of our dictionary
dict_path="/tmp/$dict_filename"
# aspell repeatedly modifies the personal dictionary for some reason,
# so we should use a copy of our dictionary.
dict_path="/tmp/dictionary.txt"
if [[ "$1" == "list" ]]; then
mode="list"
fi
# Error if running in list (CI) mode and there isn't a dictionary file;
# creating one in CI won't do any good :(
if [[ "$mode" == "list" && ! -f "$dict_filename" ]]; then
echo "No dictionary file found! A dictionary file is required in CI!"
exit 1
fi
if [[ ! -f "$dict_filename" ]]; then
# Pre-check mode: generates dictionary of words aspell consider typos.
# After user validates that this file contains only valid words, we can
@ -63,7 +61,7 @@ if [[ ! -f "$dict_filename" ]]; then
echo "personal_ws-1.1 en 0 utf-8" > "$dict_filename"
cat "${markdown_sources[@]}" | aspell --ignore 3 list | sort -u >> "$dict_filename"
elif [[ "$mode" == "list" ]]; then
# List (default) mode: scan all files, report errors
# List (default) mode: scan all files, report errors.
declare -i retval=0
cp "$dict_filename" "$dict_path"
@ -77,9 +75,9 @@ elif [[ "$mode" == "list" ]]; then
command=$(aspell --ignore 3 --personal="$dict_path" "$mode" < "$fname")
if [[ -n "$command" ]]; then
for error in $command; do
# FIXME: Find more correct way to get line number
# FIXME: find more correct way to get line number
# (ideally from aspell). Now it can make some false positives,
# because it is just a grep
# because it is just a grep.
grep --with-filename --line-number --color=always "$error" "$fname"
done
retval=1
@ -87,7 +85,7 @@ elif [[ "$mode" == "list" ]]; then
done
exit "$retval"
elif [[ "$mode" == "check" ]]; then
# Interactive mode: fix typos
# Interactive mode: fix typos.
cp "$dict_filename" "$dict_path"
if [ ! -f $dict_path ]; then


@ -1,13 +1,3 @@
// Copyright 2012-2014 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
use std::error::Error;
use std::env;
use std::fs;


@ -1,13 +1,4 @@
#!/bin/bash
# Copyright 2017 The Rust Project Developers. See the COPYRIGHT
# file at the top-level directory of this distribution and at
# http://rust-lang.org/COPYRIGHT.
#
# Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
# http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
# <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
# option. This file may not be copied, modified, or distributed
# except according to those terms.
set -eu


@ -1,13 +1,4 @@
#!/bin/bash
# Copyright 2016 The Rust Project Developers. See the COPYRIGHT
# file at the top-level directory of this distribution and at
# http://rust-lang.org/COPYRIGHT.
#
# Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
# http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
# <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
# option. This file may not be copied, modified, or distributed
# except according to those terms.
set -eu
@ -16,18 +7,18 @@ cargo build --release
mkdir -p tmp
rm -rf tmp/*.md
# Get all the markdown files in the src dir,
# Get all the Markdown files in the src dir,
ls src/${1:-""}*.md | \
# except for SUMMARY.md.
# except for `SUMMARY.md`.
grep -v SUMMARY.md | \
# Extract just the filename so we can reuse it easily.
xargs -n 1 basename | \
# Remove all links followed by <!-- ignore -->, then
# Change all remaining links from markdown to italicized inline text.
# Remove all links followed by `<!-- ignore -->``, then
# Change all remaining links from Markdown to italicized inline text.
while IFS= read -r filename; do
< "src/$filename" ./target/release/remove_links \
| ./target/release/link2print \
| ./target/release/remove_markup > "tmp/$filename"
done
# Concat the files into the nostarch dir.
# Concatenate the files into the `nostarch` dir.
./target/release/concat_chapters tmp nostarch


@ -0,0 +1,55 @@
This is a new section to appear at the end of Appendix A, after the "Keywords Reserved for Future Use" section.
### Raw Identifiers
*Raw identifiers* let you use keywords where they would not normally be allowed
by prefixing them with `r#`.
For example, `match` is a keyword. If you try to compile this function that
uses `match` as its name:
Filename: src/main.rs
```
fn match(needle: &str, haystack: &str) -> bool {
    haystack.contains(needle)
}
```
you'll get this error:
```
error: expected identifier, found keyword `match`
 --> src/main.rs:4:4
  |
4 | fn match(needle: &str, haystack: &str) -> bool {
  |    ^^^^^ expected identifier, found keyword
```
The error says that you can't use the keyword `match` as the function
identifier. You can use `match` as a function name by using a raw identifier:
Filename: src/main.rs
```
fn r#match(needle: &str, haystack: &str) -> bool {
    haystack.contains(needle)
}

fn main() {
    assert!(r#match("foo", "foobar"));
}
```
This code will compile without any errors. Note the `r#` prefix on both the
function name in its definition and where the function is called in `main`.
Raw identifiers allow you to use any word you choose as an identifier, even if
that word happens to be a reserved keyword. In addition, raw identifiers allow
you to use libraries written in a different Rust edition than your crate uses.
For example, `try` is not a keyword in the 2015 edition but is in the 2018
edition. If you depend on a library that is written using the 2015 edition and
has a `try` function, to call that function from your 2018 edition code, you'll
need to use the raw identifier syntax, `r#try` in this case. See Appendix
E for more information on editions.
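To make the cross-edition point concrete, here is a minimal, self-contained sketch; the `legacy` module below is a hypothetical stand-in for a 2015-edition dependency that exports a function named `try`, and is not part of the book text:
```rust
// Hypothetical stand-in: pretend `legacy` is a 2015-edition crate whose API
// includes a function literally named `try` (legal there, a keyword in 2018).
mod legacy {
    pub fn r#try(needle: &str, haystack: &str) -> bool {
        haystack.contains(needle)
    }
}

fn main() {
    // From 2018-edition code, the raw identifier syntax is how you refer
    // to that function.
    assert!(legacy::r#try("foo", "foobar"));
    println!("called legacy::r#try successfully");
}
```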


@ -0,0 +1,21 @@
Please place this text in a box after the "Integer Types" section ends and before the "Floating-Point Types" section begins on page 38.
##### Integer Overflow
Let's say that you have a variable of type `u8`, which can hold values
between 0 and 255. What happens if you try to change the variable's value to
256? This is called *integer overflow*, and Rust has some interesting rules
around this behavior. When compiling in debug mode, Rust includes checks for
integer overflow that will cause your program to *panic* at runtime if integer
overflow occurs. Panicking is the term Rust uses when a program exits with an
error; we'll discuss panics more in the "Unrecoverable Errors with `panic!`"
section of Chapter 9 on page XX.
When compiling in release mode with the `--release` flag, Rust does not
include checks for integer overflow that cause panics. Instead, if overflow
occurs, Rust will perform something called *two's complement wrapping*. In
short, values greater than the maximum value the type can hold "wrap around"
to the minimum of the values the type can hold. In the case of a `u8`, 256
becomes 0, 257 becomes 1, etc. Relying on the wrapping behavior of integer
overflow is considered an error. If you want to wrap explicitly, the standard
library has a type named `Wrapping` that provides this behavior.
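A minimal sketch of explicit alternatives to relying on overflow, using standard library methods and the `Wrapping` type mentioned above (the values are only for illustration):
```rust
use std::num::Wrapping;

fn main() {
    let x: u8 = 255;

    assert_eq!(x.wrapping_add(1), 0);      // wraps around to the type's minimum
    assert_eq!(x.checked_add(1), None);    // reports the overflow as `None`
    assert_eq!(x.saturating_add(1), 255);  // clamps at the type's maximum

    // With the `Wrapping` newtype, ordinary `+` wraps instead of panicking.
    let wrapped = Wrapping(255u8) + Wrapping(1u8);
    assert_eq!(wrapped.0, 0);

    // In a debug build, a plain `x + 1` here would panic instead.
    println!("overflow handled explicitly");
}
```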


@ -0,0 +1,25 @@
Please add this text at the end of The Array Type section, just before the Accessing Array Elements subsection starts on page 41.
Writing an array's type is done with square brackets containing the type of
each element in the array followed by a semicolon and the number of elements in
the array, like so:
```rust
let a: [i32; 5] = [1, 2, 3, 4, 5];
```
Here, `i32` is the type of each element. After the semicolon, the number `5`
indicates the array contains five elements.
The way an array's type is written looks similar to an alternative syntax for
initializing an array: if you want to create an array that contains the same
value for each element, you can specify the initial value, then a semicolon,
then the length of the array in square brackets as shown here:
```rust
let a = [3; 5];
```
The array named `a` will contain 5 elements that will all be set to the value
`3` initially. This is the same as writing `let a = [3, 3, 3, 3, 3];` but in a
more concise way.
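As a quick illustrative check of that equivalence (not part of the book text):
```rust
fn main() {
    // The repeat-initializer form and the written-out form produce the
    // same array value.
    let a = [3; 5];
    let b = [3, 3, 3, 3, 3];
    assert_eq!(a, b);

    // The explicitly annotated form from above, for comparison.
    let c: [i32; 5] = [1, 2, 3, 4, 5];
    assert_eq!(c.len(), 5);
}
```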


@ -0,0 +1,33 @@
Please insert this new section after the "Repeating Code with loop" section ends and before the "Conditional Loops with while" section starts, on page 53.
#### Returning Values From Loops
One of the uses of a `loop` is to retry an operation you know can fail, such as
checking if a thread completed its job. However, you might need to pass the
result of that operation to the rest of your code. If you add the value you
want to return after the `break` expression you use to stop the loop, it will
be returned out of the loop so you can use the value, as shown here:
```rust
fn main() {
    let mut counter = 0;

    let result = loop {
        counter += 1;

        if counter == 10 {
            break counter * 2;
        }
    };

    println!("The result is {}", result);
}
```
Before the loop, we declare a variable named `counter` and initialize it to
zero. Then we declare a variable named `result` to hold the value returned from
the loop. On every iteration of the loop, we add one to the counter variable,
and then check if the counter is equal to ten. When it is, we use the `break`
keyword with the value `counter * 2`. After the loop, we place a semicolon to
end the statement assigning the value to `result`. Finally, we print out the
value in `result`, which in this case will be twenty.


@ -0,0 +1,49 @@
Please view this file in monospace to see how the error messages should line up vertically.
Here is the new error message for page 70, I've included the whole message for clarity:
```
error[E0499]: cannot borrow `s` as mutable more than once at a time
 --> src/main.rs:5:14
  |
4 |     let r1 = &mut s;
  |              ------ first mutable borrow occurs here
5 |     let r2 = &mut s;
  |              ^^^^^^ second mutable borrow occurs here
6 |
7 |     println!("{}, {}", r1, r2);
  |                        -- first borrow later used here
```
For page 71:
```
error[E0502]: cannot borrow `s` as mutable because it is also borrowed as immutable
 --> src/main.rs:6:14
  |
4 |     let r1 = &s; // no problem
  |              -- immutable borrow occurs here
5 |     let r2 = &s; // no problem
6 |     let r3 = &mut s; // BIG PROBLEM
  |              ^^^^^^ mutable borrow occurs here
7 |
8 |     println!("{}, {}, and {}", r1, r2, r3);
  |                                -- immutable borrow later used here
```
For page 77:
```
error[E0502]: cannot borrow `s` as mutable because it is also borrowed as immutable
  --> src/main.rs:18:5
   |
16 |     let word = first_word(&s);
   |                           -- immutable borrow occurs here
17 |
18 |     s.clear(); // error!
   |     ^^^^^^^^^ mutable borrow occurs here
19 |
20 |     println!("the first word is: {}", word);
   |                                       ---- immutable borrow later used here
```


@ -0,0 +1,28 @@
Please replace the paragraphs that start with "The stack is fast" and "Data with a size unknown" in the box on page 58 with this paragraph:
---
All data stored on the stack must have a known, fixed size. Data with a size
that is unknown at compile time or a size that might change must be stored on
the heap instead. The heap is less organized: when you put data on the heap,
you ask for some amount of space. The operating system finds an empty spot
somewhere in the heap that is big enough, marks it as being in use, and
returns a *pointer*, which is the address of that location. This process is
called *allocating on the heap*, sometimes abbreviated as just “allocating.”
Pushing values onto the stack is not considered allocating. Because the
pointer is a known, fixed size, you can store the pointer on the stack, but
when you want the actual data, you have to follow the pointer.
---
Then please add this paragraph between the paragraph that starts with "Think of being seated at a restaurant" and the paragraph that starts with "Accessing data in the heap" on page 59:
---
Pushing to the stack is faster than allocating on the heap because it never
has to search for a place to put new data; that place is always at the top
of the stack. Comparatively, allocating space on the heap requires more work,
because the operating system must first find a space big enough to hold the
data and then perform bookkeeping to prepare for the next allocation.
---
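A small illustrative sketch of the distinction described above, using `Box` for an explicit heap allocation (the values are arbitrary and not part of the book text):
```rust
fn main() {
    // `x` has a known, fixed size, so its value lives on the stack.
    let x: i32 = 5;

    // `Box::new` requests space on the heap and hands back a pointer;
    // the pointer itself, being a known, fixed size, sits on the stack.
    let boxed: Box<i32> = Box::new(42);

    // Using the heap data means following that pointer.
    println!("stack value: {}, heap value: {}", x, *boxed);
}
```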


@ -0,0 +1,15 @@
Please replace the error message on page 133 with this one:
```
error[E0502]: cannot borrow `v` as mutable because it is also borrowed as immutable
 --> src/main.rs:6:5
  |
4 |     let first = &v[0];
  |                  - immutable borrow occurs here
5 |
6 |     v.push(6);
  |     ^^^^^^^^^ mutable borrow occurs here
7 |
8 |     println!("The first element is: {}", first);
  |                                          ----- immutable borrow later used here
```


@ -0,0 +1,13 @@
Please replace the code in Listing 8-5 with this code:
```
let v = vec![1, 2, 3, 4, 5];

let third: &i32 = &v[2];
println!("The third element is {}", third);

match v.get(2) {
    Some(third) => println!("The third element is {}", third),
    None => println!("There is no third element."),
}
```

Some files were not shown because too many files have changed in this diff.