New upstream version 1.19.0+dfsg1

This commit is contained in:
Ximin Luo 2017-07-24 11:55:06 +02:00
parent cc61c64bd2
commit 7cac9316f9
4365 changed files with 463496 additions and 60313 deletions


@@ -177,7 +177,7 @@ python x.py test src/test/rustdoc
python x.py build src/libcore --stage 0
```
You can explore the build system through the various `--help` pages for each
subcommand. For example, to learn more about a command you can run:
```
python x.py test --help
```


@@ -35,7 +35,7 @@ Read ["Installing Rust"] from [The Book].
3. Build and install:
```sh
$ ./x.py build && sudo ./x.py install
```
> ***Note:*** Install locations can be adjusted by copying the config file
@@ -43,7 +43,7 @@ Read ["Installing Rust"] from [The Book].
> adjusting the `prefix` option under `[install]`. Various other options are
> also supported, and are documented in the config file.
When complete, `sudo ./x.py install` will place several programs into
`/usr/local/bin`: `rustc`, the Rust compiler, and `rustdoc`, the
API-documentation tool. This install does not include [Cargo],
Rust's package manager, which you may also want to build.
@@ -96,7 +96,7 @@ build.
4. Navigate to Rust's source code (or clone it), then build it:
```sh
$ ./x.py build && ./x.py install
```
#### MSVC


@@ -1357,44 +1357,33 @@ Version 1.12.0 (2016-09-29)
Highlights
----------
* [`rustc` translates code to LLVM IR via its own "middle" IR (MIR)](https://github.com/rust-lang/rust/pull/34096).
  This translation pass is far simpler than the previous AST->LLVM pass, and
  creates opportunities to perform new optimizations directly on the MIR. It
  was previously described [on the Rust blog](https://blog.rust-lang.org/2016/04/19/MIR.html).
* [`rustc` presents a new, more readable error format, along with
  machine-readable JSON error output for use by IDEs](https://github.com/rust-lang/rust/pull/35401).
  Most common editors supporting Rust have been updated to work with it. It was
  previously described [on the Rust blog](https://blog.rust-lang.org/2016/08/10/Shape-of-errors-to-come.html).

Compiler
--------

* [`rustc` translates code to LLVM IR via its own "middle" IR (MIR)](https://github.com/rust-lang/rust/pull/34096).
  This translation pass is far simpler than the previous AST->LLVM pass, and
  creates opportunities to perform new optimizations directly on the MIR. It
  was previously described [on the Rust blog](https://blog.rust-lang.org/2016/04/19/MIR.html).
* [Print the Rust target name, not the LLVM target name, with
  `--print target-list`](https://github.com/rust-lang/rust/pull/35489)
* [The computation of `TypeId` is correct in some cases where it was previously
  producing inconsistent results](https://github.com/rust-lang/rust/pull/35267)
* [The `mips-unknown-linux-gnu` target uses hardware floating point by default](https://github.com/rust-lang/rust/pull/34910)
* [The `rustc` arguments, `--print target-cpus`, `--print target-features`,
  `--print relocation-models`, and `--print code-models` print the available
  options to the `-C target-cpu`, `-C target-feature`, `-C relocation-model` and
  `-C code-model` code generation arguments](https://github.com/rust-lang/rust/pull/34845)
* [`rustc` supports three new MUSL targets on ARM: `arm-unknown-linux-musleabi`,
  `arm-unknown-linux-musleabihf`, and `armv7-unknown-linux-musleabihf`](https://github.com/rust-lang/rust/pull/35060).
  These targets produce statically-linked binaries. There are no binary release
  builds yet though.
@@ -1402,209 +1391,134 @@ Diagnostics
-----------
* [`rustc` presents a new, more readable error format, along with
  machine-readable JSON error output for use by IDEs](https://github.com/rust-lang/rust/pull/35401).
  Most common editors supporting Rust have been updated to work with it. It was
  previously described [on the Rust blog](https://blog.rust-lang.org/2016/08/10/Shape-of-errors-to-come.html).
* [In error descriptions, references are now described in plain English,
  instead of as "&-ptr"](https://github.com/rust-lang/rust/pull/35611)
* [In error type descriptions, unknown numeric types are named `{integer}` or
  `{float}` instead of `_`](https://github.com/rust-lang/rust/pull/35080)
* [`rustc` emits a clearer error when inner attributes follow a doc comment](https://github.com/rust-lang/rust/pull/34676)

Language
--------

* [`macro_rules!` invocations can be made within `macro_rules!` invocations](https://github.com/rust-lang/rust/pull/34925)
* [`macro_rules!` meta-variables are hygienic](https://github.com/rust-lang/rust/pull/35453)
* [`macro_rules!` `tt` matchers can be reparsed correctly, making them much more
  useful](https://github.com/rust-lang/rust/pull/34908)
* [`macro_rules!` `stmt` matchers correctly consume the entire contents when
  inside non-braces invocations](https://github.com/rust-lang/rust/pull/34886)
* [Semicolons are properly required as statement delimiters inside
  `macro_rules!` invocations](https://github.com/rust-lang/rust/pull/34660)
* [`cfg_attr` works on `path` attributes](https://github.com/rust-lang/rust/pull/34546)
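As a quick illustration of the first item above (a minimal sketch, not part of the original release notes; the macro names are invented for the example):

```rust
// A macro that defines another macro: `macro_rules!` inside `macro_rules!`,
// allowed as of this release.
macro_rules! define_constant_macro {
    ($name:ident, $value:expr) => {
        macro_rules! $name {
            () => { $value };
        }
    };
}

// The outer invocation generates `answer!`.
define_constant_macro!(answer, 42);

fn main() {
    assert_eq!(answer!(), 42);
    println!("answer!() = {}", answer!());
}
```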
Stabilized APIs
---------------
* [`Cell::as_ptr`](https://doc.rust-lang.org/std/cell/struct.Cell.html#method.as_ptr)
* [`RefCell::as_ptr`](https://doc.rust-lang.org/std/cell/struct.RefCell.html#method.as_ptr)
* [`IpAddr::is_unspecified`](https://doc.rust-lang.org/std/net/enum.IpAddr.html#method.is_unspecified)
* [`IpAddr::is_loopback`](https://doc.rust-lang.org/std/net/enum.IpAddr.html#method.is_loopback)
* [`IpAddr::is_multicast`](https://doc.rust-lang.org/std/net/enum.IpAddr.html#method.is_multicast)
* [`Ipv4Addr::is_unspecified`](https://doc.rust-lang.org/std/net/struct.Ipv4Addr.html#method.is_unspecified)
* [`Ipv6Addr::octets`](https://doc.rust-lang.org/std/net/struct.Ipv6Addr.html#method.octets)
* [`LinkedList::contains`](https://doc.rust-lang.org/std/collections/linked_list/struct.LinkedList.html#method.contains)
* [`VecDeque::contains`](https://doc.rust-lang.org/std/collections/vec_deque/struct.VecDeque.html#method.contains)
* [`ExitStatusExt::from_raw`](https://doc.rust-lang.org/std/os/unix/process/trait.ExitStatusExt.html#tymethod.from_raw).
  Both on Unix and Windows.
* [`Receiver::recv_timeout`](https://doc.rust-lang.org/std/sync/mpsc/struct.Receiver.html#method.recv_timeout)
* [`RecvTimeoutError`](https://doc.rust-lang.org/std/sync/mpsc/enum.RecvTimeoutError.html)
* [`BinaryHeap::peek_mut`](https://doc.rust-lang.org/std/collections/binary_heap/struct.BinaryHeap.html#method.peek_mut)
* [`PeekMut`](https://doc.rust-lang.org/std/collections/binary_heap/struct.PeekMut.html)
* [`iter::Product`](https://doc.rust-lang.org/std/iter/trait.Product.html)
* [`iter::Sum`](https://doc.rust-lang.org/std/iter/trait.Sum.html)
* [`OccupiedEntry::remove_entry`](https://doc.rust-lang.org/std/collections/btree_map/struct.OccupiedEntry.html#method.remove_entry)
* [`VacantEntry::into_key`](https://doc.rust-lang.org/std/collections/btree_map/struct.VacantEntry.html#method.into_key)
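Two of the APIs above in action (a minimal sketch using today's stable std, not part of the original notes):

```rust
use std::collections::BinaryHeap;
use std::sync::mpsc::{channel, RecvTimeoutError};
use std::time::Duration;

fn main() {
    // BinaryHeap::peek_mut: mutate the greatest element in place;
    // the heap restores its invariant when the `PeekMut` guard drops.
    let mut heap = BinaryHeap::from(vec![1, 5, 3]);
    if let Some(mut top) = heap.peek_mut() {
        *top = 0;
    }
    assert_eq!(heap.peek(), Some(&3));

    // Receiver::recv_timeout: stop waiting after a deadline instead of
    // blocking forever. The sender is kept alive, so this is a Timeout,
    // not a Disconnected error.
    let (_tx, rx) = channel::<i32>();
    assert_eq!(
        rx.recv_timeout(Duration::from_millis(10)),
        Err(RecvTimeoutError::Timeout)
    );
}
```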
Libraries
---------
* [The `format!` macro and friends now allow a single argument to be formatted
  in multiple styles](https://github.com/rust-lang/rust/pull/33642)
* [The lifetime bounds on `[T]::binary_search_by` and
  `[T]::binary_search_by_key` have been adjusted to be more flexible](https://github.com/rust-lang/rust/pull/34762)
* [`Option` implements `From` for its contained type](https://github.com/rust-lang/rust/pull/34828)
* [`Cell`, `RefCell` and `UnsafeCell` implement `From` for their contained type](https://github.com/rust-lang/rust/pull/35392)
* [`RwLock` panics if the reader count overflows](https://github.com/rust-lang/rust/pull/35378)
* [`vec_deque::Drain`, `hash_map::Drain` and `hash_set::Drain` are covariant](https://github.com/rust-lang/rust/pull/35354)
* [`vec::Drain` and `binary_heap::Drain` are covariant](https://github.com/rust-lang/rust/pull/34951)
* [`Cow<str>` implements `FromIterator` for `char`, `&str` and `String`](https://github.com/rust-lang/rust/pull/35064)
* [Sockets on Linux are correctly closed in subprocesses via `SOCK_CLOEXEC`](https://github.com/rust-lang/rust/pull/34946)
* [`hash_map::Entry`, `hash_map::VacantEntry` and `hash_map::OccupiedEntry`
  implement `Debug`](https://github.com/rust-lang/rust/pull/34937)
* [`btree_map::Entry`, `btree_map::VacantEntry` and `btree_map::OccupiedEntry`
  implement `Debug`](https://github.com/rust-lang/rust/pull/34885)
* [`String` implements `AddAssign`](https://github.com/rust-lang/rust/pull/34890)
* [Variadic `extern fn` pointers implement the `Clone`, `PartialEq`, `Eq`,
  `PartialOrd`, `Ord`, `Hash`, `fmt::Pointer`, and `fmt::Debug` traits](https://github.com/rust-lang/rust/pull/34879)
* [`FileType` implements `Debug`](https://github.com/rust-lang/rust/pull/34757)
* [References to `Mutex` and `RwLock` are unwind-safe](https://github.com/rust-lang/rust/pull/34756)
* [`mpsc::sync_channel` `Receiver`s return any available message before
  reporting a disconnect](https://github.com/rust-lang/rust/pull/34731)
* [Unicode definitions have been updated to 9.0](https://github.com/rust-lang/rust/pull/34599)
* [`env` iterators implement `DoubleEndedIterator`](https://github.com/rust-lang/rust/pull/33312)
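A few of the library changes above, sketched (using today's stable std; not part of the original notes):

```rust
fn main() {
    // format!: one positional argument rendered in several styles at once.
    let s = format!("{0} {0:?} {0:04}", 42);
    assert_eq!(s, "42 42 0042");

    // Option implements From for its contained type.
    let opt: Option<i32> = Option::from(7);
    assert_eq!(opt, Some(7));

    // String implements AddAssign<&str>.
    let mut greeting = String::from("hello");
    greeting += ", world";
    assert_eq!(greeting, "hello, world");
}
```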
Cargo
-----
* [Support local mirrors of registries](https://github.com/rust-lang/cargo/pull/2857)
* [Add support for command aliases](https://github.com/rust-lang/cargo/pull/2679)
* [Allow `opt-level="s"` / `opt-level="z"` in profile overrides](https://github.com/rust-lang/cargo/pull/3007)
* [Make `cargo doc --open --target` work as expected](https://github.com/rust-lang/cargo/pull/2988)
* [Speed up noop registry updates](https://github.com/rust-lang/cargo/pull/2974)
* [Update OpenSSL](https://github.com/rust-lang/cargo/pull/2971)
* [Fix `--panic=abort` with plugins](https://github.com/rust-lang/cargo/pull/2954)
* [Always pass `-C metadata` to the compiler](https://github.com/rust-lang/cargo/pull/2946)
* [Fix depending on git repos with workspaces](https://github.com/rust-lang/cargo/pull/2938)
* [Add a `--lib` flag to `cargo new`](https://github.com/rust-lang/cargo/pull/2921)
* [Add `http.cainfo` for custom certs](https://github.com/rust-lang/cargo/pull/2917)
* [Indicate the compilation profile after compiling](https://github.com/rust-lang/cargo/pull/2909)
* [Allow enabling features for dependencies with `--features`](https://github.com/rust-lang/cargo/pull/2876)
* [Add `--jobs` flag to `cargo package`](https://github.com/rust-lang/cargo/pull/2867)
* [Add `--dry-run` to `cargo publish`](https://github.com/rust-lang/cargo/pull/2849)
* [Add support for `RUSTDOCFLAGS`](https://github.com/rust-lang/cargo/pull/2794)
Performance
-----------
* [`panic::catch_unwind` is more optimized](https://github.com/rust-lang/rust/pull/35444)
* [`panic::catch_unwind` no longer accesses thread-local storage on entry](https://github.com/rust-lang/rust/pull/34866)

Tooling
-------

* [Test binaries now support a `--test-threads` argument to specify the number
  of threads used to run tests, and which acts the same as the
  `RUST_TEST_THREADS` environment variable](https://github.com/rust-lang/rust/pull/35414)
* [The test runner now emits a warning when tests run over 60 seconds](https://github.com/rust-lang/rust/pull/35405)
* [rustdoc: Fix methods in search results](https://github.com/rust-lang/rust/pull/34752)
* [`rust-lldb` warns about unsupported versions of LLDB](https://github.com/rust-lang/rust/pull/34646)
* [Rust releases now come with source packages that can be installed by rustup
  via `rustup component add rust-src`](https://github.com/rust-lang/rust/pull/34366).
  The resulting source code can be used by tools and IDEs, located in the
  sysroot under `lib/rustlib/src`.

Misc
----

* [The compiler can now be built against LLVM 3.9](https://github.com/rust-lang/rust/pull/35594)
* Many minor improvements to the documentation.
* [The Rust exception handling "personality" routine is now written in Rust](https://github.com/rust-lang/rust/pull/34832)

Compatibility Notes
-------------------

* [When printing Windows `OsStr`s, unpaired surrogate codepoints are escaped
  with the lowercase format instead of the uppercase](https://github.com/rust-lang/rust/pull/35084)
* [When formatting strings, if "precision" is specified, the "fill",
  "align" and "width" specifiers are no longer ignored](https://github.com/rust-lang/rust/pull/34544)
* [The `Debug` impl for strings no longer escapes all non-ASCII characters](https://github.com/rust-lang/rust/pull/34485)
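The two formatting notes above, illustrated (a minimal sketch against today's stable behavior; not part of the original notes):

```rust
fn main() {
    // With a precision given, the fill, align and width specifiers now
    // all take effect: precision 3 gives "3.142", padded to width 8
    // with '*' on the left.
    assert_eq!(format!("{:*>8.3}", 3.14159), "***3.142");

    // The Debug impl for strings keeps printable non-ASCII characters
    // unescaped instead of emitting \u escapes.
    assert_eq!(format!("{:?}", "café"), "\"café\"");
}
```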
Version 1.11.0 (2016-08-18)
@@ -1613,142 +1527,92 @@ Version 1.11.0 (2016-08-18)
Language
--------
* [`cfg_attr` works on `path` attributes](https://github.com/rust-lang/rust/pull/34546)
* [Support nested `cfg_attr` attributes](https://github.com/rust-lang/rust/pull/34216)
* [Allow statement-generating braced macro invocations at the end of blocks](https://github.com/rust-lang/rust/pull/34436)
* [Macros can be expanded inside of trait definitions](https://github.com/rust-lang/rust/pull/34213)
* [`#[macro_use]` works properly when it is itself expanded from a macro](https://github.com/rust-lang/rust/pull/34032)
Stabilized APIs
---------------
* [`BinaryHeap::append`](https://doc.rust-lang.org/std/collections/binary_heap/struct.BinaryHeap.html#method.append)
* [`BTreeMap::append`](https://doc.rust-lang.org/std/collections/btree_map/struct.BTreeMap.html#method.append)
* [`BTreeMap::split_off`](https://doc.rust-lang.org/std/collections/btree_map/struct.BTreeMap.html#method.split_off)
* [`BTreeSet::append`](https://doc.rust-lang.org/std/collections/btree_set/struct.BTreeSet.html#method.append)
* [`BTreeSet::split_off`](https://doc.rust-lang.org/std/collections/btree_set/struct.BTreeSet.html#method.split_off)
* [`f32::to_degrees`](https://doc.rust-lang.org/std/primitive.f32.html#method.to_degrees)
  (in libcore - previously stabilized in libstd)
* [`f32::to_radians`](https://doc.rust-lang.org/std/primitive.f32.html#method.to_radians)
  (in libcore - previously stabilized in libstd)
* [`f64::to_degrees`](https://doc.rust-lang.org/std/primitive.f64.html#method.to_degrees)
  (in libcore - previously stabilized in libstd)
* [`f64::to_radians`](https://doc.rust-lang.org/std/primitive.f64.html#method.to_radians)
  (in libcore - previously stabilized in libstd)
* [`Iterator::sum`](https://doc.rust-lang.org/std/iter/trait.Iterator.html#method.sum)
* [`Iterator::product`](https://doc.rust-lang.org/std/iter/trait.Iterator.html#method.product)
* [`Cell::get_mut`](https://doc.rust-lang.org/std/cell/struct.Cell.html#method.get_mut)
* [`RefCell::get_mut`](https://doc.rust-lang.org/std/cell/struct.RefCell.html#method.get_mut)
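A few of these newly stabilized APIs in use (a minimal sketch with today's stable std; not part of the original notes):

```rust
use std::collections::BTreeMap;

fn main() {
    // Iterator::sum and Iterator::product over 1, 2, 3, 4.
    let sum: i32 = (1..5).sum();
    let product: i32 = (1..5).product();
    assert_eq!((sum, product), (10, 24));

    // BTreeMap::append moves every entry of `b` into `a`, leaving `b` empty.
    let mut a = BTreeMap::new();
    a.insert(1, "one");
    let mut b = BTreeMap::new();
    b.insert(2, "two");
    a.append(&mut b);
    assert!(b.is_empty());
    assert_eq!(a.len(), 2);

    // f32::to_degrees, now also usable from libcore.
    assert!((std::f32::consts::PI.to_degrees() - 180.0).abs() < 1e-3);
}
```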
Libraries
---------
* [The `thread_local!` macro supports multiple definitions in a single
  invocation, and can apply attributes](https://github.com/rust-lang/rust/pull/34077)
* [`Cow` implements `Default`](https://github.com/rust-lang/rust/pull/34305)
* [`Wrapping` implements binary, octal, lower-hex and upper-hex
  `Display` formatting](https://github.com/rust-lang/rust/pull/34190)
* [The range types implement `Hash`](https://github.com/rust-lang/rust/pull/34180)
* [`lookup_host` ignores unknown address types](https://github.com/rust-lang/rust/pull/34067)
* [`assert_eq!` accepts a custom error message, like `assert!` does](https://github.com/rust-lang/rust/pull/33976)
* [The main thread is now called "main" instead of "&lt;main&gt;"](https://github.com/rust-lang/rust/pull/33803)
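Sketches of three of the library changes above (using today's stable std; the static names are invented for the example):

```rust
use std::cell::Cell;
use std::num::Wrapping;

// thread_local! now accepts multiple definitions in one invocation.
thread_local! {
    static COUNTER: Cell<u32> = Cell::new(0);
    static FLAG: Cell<bool> = Cell::new(false);
}

fn main() {
    COUNTER.with(|c| c.set(c.get() + 1));
    COUNTER.with(|c| assert_eq!(c.get(), 1));
    FLAG.with(|f| assert!(!f.get()));

    // assert_eq! accepts a custom failure message, like assert!.
    let x = 2 + 2;
    assert_eq!(x, 4, "arithmetic is broken: got {}", x);

    // Wrapping supports hex (and binary/octal) formatting.
    assert_eq!(format!("{:x}", Wrapping(255u8)), "ff");
}
```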
Cargo
-----
* [Disallow specifying features of transitive deps](https://github.com/rust-lang/cargo/pull/2821)
* [Add color support for Windows consoles](https://github.com/rust-lang/cargo/pull/2804)
* [Fix `harness = false` on `[lib]` sections](https://github.com/rust-lang/cargo/pull/2795)
* [Don't panic when `links` contains a '.'](https://github.com/rust-lang/cargo/pull/2787)
* [Build scripts can emit warnings](https://github.com/rust-lang/cargo/pull/2630),
  and `-vv` prints warnings for all crates.
* [Ignore file locks on OS X NFS mounts](https://github.com/rust-lang/cargo/pull/2720)
* [Don't warn about `package.metadata` keys](https://github.com/rust-lang/cargo/pull/2668).
  This provides room for expansion by arbitrary tools.
* [Add support for cdylib crate types](https://github.com/rust-lang/cargo/pull/2741)
* [Prevent publishing crates when files are dirty](https://github.com/rust-lang/cargo/pull/2781)
* [Don't fetch all crates on clean](https://github.com/rust-lang/cargo/pull/2704)
* [Propagate --color option to rustc](https://github.com/rust-lang/cargo/pull/2779)
* [Fix `cargo doc --open` on Windows](https://github.com/rust-lang/cargo/pull/2780)
* [Improve autocompletion](https://github.com/rust-lang/cargo/pull/2772)
* [Configure colors of stderr as well as stdout](https://github.com/rust-lang/cargo/pull/2739)
Performance
-----------
* [Caching projections speeds up type check dramatically for some
  workloads](https://github.com/rust-lang/rust/pull/33816)
* [The default `HashMap` hasher is SipHash 1-3 instead of SipHash 2-4](https://github.com/rust-lang/rust/pull/33940)
  This hasher is faster, but is believed to provide sufficient
  protection from collision attacks.
* [Comparison of `Ipv4Addr` is 10x faster](https://github.com/rust-lang/rust/pull/33891)
Rustdoc
-------
* [Fix empty implementation section on some module pages](https://github.com/rust-lang/rust/pull/34536)
* [Fix inlined renamed reexports in import lists](https://github.com/rust-lang/rust/pull/34479)
* [Fix search result layout for enum variants and struct fields](https://github.com/rust-lang/rust/pull/34477)
* [Fix issues with source links to external crates](https://github.com/rust-lang/rust/pull/34387)
* [Fix redirect pages for renamed reexports](https://github.com/rust-lang/rust/pull/34245)
Tooling
-------
* [rustc is better at finding the MSVC toolchain](https://github.com/rust-lang/rust/pull/34492)
* [When emitting debug info, rustc emits frame pointers for closures,
  shims and glue, as it does for all other functions](https://github.com/rust-lang/rust/pull/33909)
* [rust-lldb warns about unsupported versions of LLDB](https://github.com/rust-lang/rust/pull/34646)
* Many more errors have been given error codes and extended
  explanations
* API documentation continues to be improved, with many new examples
@@ -1757,30 +1621,22 @@ Misc
----
* [rustc no longer hangs when dependencies recursively re-export
submodules]
(https://github.com/rust-lang/rust/pull/34542)
* [rustc requires LLVM 3.7+]
(https://github.com/rust-lang/rust/pull/34104)
submodules](https://github.com/rust-lang/rust/pull/34542)
* [rustc requires LLVM 3.7+](https://github.com/rust-lang/rust/pull/34104)
* [The 'How Safe and Unsafe Interact' chapter of The Rustonomicon was
rewritten](https://github.com/rust-lang/rust/pull/33895)
* [rustc supports 16-bit pointer sizes](https://github.com/rust-lang/rust/pull/33460).
No targets use this yet, but it works toward AVR support.
Compatibility Notes
-------------------
* [`const`s and `static`s may not have unsized types](https://github.com/rust-lang/rust/pull/34443)
* [The new follow-set rules that place restrictions on `macro_rules!`
in order to ensure syntax forward-compatibility have been enabled](https://github.com/rust-lang/rust/pull/33982)
  This was an [amendment to RFC 550](https://github.com/rust-lang/rfcs/pull/1384),
and has been a warning since 1.10.
* [`cfg` attribute process has been refactored to fix various bugs](https://github.com/rust-lang/rust/pull/33706).
This causes breakage in some corner cases.
@ -1791,21 +1647,15 @@ Language
--------
* [Allow `concat_idents!` in type positions as well as in expression
positions](https://github.com/rust-lang/rust/pull/33735).
* [`Copy` types are required to have a trivial implementation of `Clone`](https://github.com/rust-lang/rust/pull/33420).
[RFC 1521](https://github.com/rust-lang/rfcs/blob/master/text/1521-copy-clone-semantics.md).
* [Single-variant enums support the `#[repr(..)]` attribute](https://github.com/rust-lang/rust/pull/33355).
* [Fix `#[derive(RustcEncodable)]` in the presence of other `encode` methods](https://github.com/rust-lang/rust/pull/32908).
* [`panic!` can be converted to a runtime abort with the
`-C panic=abort` flag](https://github.com/rust-lang/rust/pull/32900).
[RFC 1513](https://github.com/rust-lang/rfcs/blob/master/text/1513-less-unwinding.md).
* [Add a new crate type, 'cdylib'](https://github.com/rust-lang/rust/pull/33553).
cdylibs are dynamic libraries suitable for loading by non-Rust hosts.
[RFC 1510](https://github.com/rust-lang/rfcs/blob/master/text/1510-rdylib.md).
Note that Cargo does not yet directly support cdylibs.
@ -1819,242 +1669,146 @@ Stabilized APIs
* `os::windows::fs::OpenOptionsExt::attributes`
* `os::windows::fs::OpenOptionsExt::security_qos_flags`
* `os::unix::fs::OpenOptionsExt::custom_flags`
* [`sync::Weak::new`](http://doc.rust-lang.org/alloc/arc/struct.Weak.html#method.new)
* `Default for sync::Weak`
* [`panic::set_hook`](http://doc.rust-lang.org/std/panic/fn.set_hook.html)
* [`panic::take_hook`](http://doc.rust-lang.org/std/panic/fn.take_hook.html)
* [`panic::PanicInfo`](http://doc.rust-lang.org/std/panic/struct.PanicInfo.html)
* [`panic::PanicInfo::payload`](http://doc.rust-lang.org/std/panic/struct.PanicInfo.html#method.payload)
* [`panic::PanicInfo::location`](http://doc.rust-lang.org/std/panic/struct.PanicInfo.html#method.location)
* [`panic::Location`](http://doc.rust-lang.org/std/panic/struct.Location.html)
* [`panic::Location::file`](http://doc.rust-lang.org/std/panic/struct.Location.html#method.file)
* [`panic::Location::line`](http://doc.rust-lang.org/std/panic/struct.Location.html#method.line)
* [`ffi::CStr::from_bytes_with_nul`](http://doc.rust-lang.org/std/ffi/struct.CStr.html#method.from_bytes_with_nul)
* [`ffi::CStr::from_bytes_with_nul_unchecked`](http://doc.rust-lang.org/std/ffi/struct.CStr.html#method.from_bytes_with_nul_unchecked)
* [`ffi::FromBytesWithNulError`](http://doc.rust-lang.org/std/ffi/struct.FromBytesWithNulError.html)
* [`fs::Metadata::modified`](http://doc.rust-lang.org/std/fs/struct.Metadata.html#method.modified)
* [`fs::Metadata::accessed`](http://doc.rust-lang.org/std/fs/struct.Metadata.html#method.accessed)
* [`fs::Metadata::created`](http://doc.rust-lang.org/std/fs/struct.Metadata.html#method.created)
* `sync::atomic::Atomic{Usize,Isize,Bool,Ptr}::compare_exchange`
* `sync::atomic::Atomic{Usize,Isize,Bool,Ptr}::compare_exchange_weak`
* `collections::{btree,hash}_map::{Occupied,Vacant,}Entry::key`
* `os::unix::net::{UnixStream, UnixListener, UnixDatagram, SocketAddr}`
* [`SocketAddr::is_unnamed`](http://doc.rust-lang.org/std/os/unix/net/struct.SocketAddr.html#method.is_unnamed)
* [`SocketAddr::as_pathname`](http://doc.rust-lang.org/std/os/unix/net/struct.SocketAddr.html#method.as_pathname)
* [`UnixStream::connect`](http://doc.rust-lang.org/std/os/unix/net/struct.UnixStream.html#method.connect)
* [`UnixStream::pair`](http://doc.rust-lang.org/std/os/unix/net/struct.UnixStream.html#method.pair)
* [`UnixStream::try_clone`](http://doc.rust-lang.org/std/os/unix/net/struct.UnixStream.html#method.try_clone)
* [`UnixStream::local_addr`](http://doc.rust-lang.org/std/os/unix/net/struct.UnixStream.html#method.local_addr)
* [`UnixStream::peer_addr`](http://doc.rust-lang.org/std/os/unix/net/struct.UnixStream.html#method.peer_addr)
* [`UnixStream::set_read_timeout`](http://doc.rust-lang.org/std/os/unix/net/struct.UnixStream.html#method.read_timeout)
* [`UnixStream::set_write_timeout`](http://doc.rust-lang.org/std/os/unix/net/struct.UnixStream.html#method.write_timeout)
* [`UnixStream::read_timeout`](http://doc.rust-lang.org/std/os/unix/net/struct.UnixStream.html#method.read_timeout)
* [`UnixStream::write_timeout`](http://doc.rust-lang.org/std/os/unix/net/struct.UnixStream.html#method.write_timeout)
* [`UnixStream::set_nonblocking`](http://doc.rust-lang.org/std/os/unix/net/struct.UnixStream.html#method.set_nonblocking)
* [`UnixStream::take_error`](http://doc.rust-lang.org/std/os/unix/net/struct.UnixStream.html#method.take_error)
* [`UnixStream::shutdown`](http://doc.rust-lang.org/std/os/unix/net/struct.UnixStream.html#method.shutdown)
* Read/Write/RawFd impls for `UnixStream`
* [`UnixListener::bind`](http://doc.rust-lang.org/std/os/unix/net/struct.UnixListener.html#method.bind)
* [`UnixListener::accept`](http://doc.rust-lang.org/std/os/unix/net/struct.UnixListener.html#method.accept)
* [`UnixListener::try_clone`](http://doc.rust-lang.org/std/os/unix/net/struct.UnixListener.html#method.try_clone)
* [`UnixListener::local_addr`](http://doc.rust-lang.org/std/os/unix/net/struct.UnixListener.html#method.local_addr)
* [`UnixListener::set_nonblocking`](http://doc.rust-lang.org/std/os/unix/net/struct.UnixListener.html#method.set_nonblocking)
* [`UnixListener::take_error`](http://doc.rust-lang.org/std/os/unix/net/struct.UnixListener.html#method.take_error)
* [`UnixListener::incoming`](http://doc.rust-lang.org/std/os/unix/net/struct.UnixListener.html#method.incoming)
* RawFd impls for `UnixListener`
* [`UnixDatagram::bind`](http://doc.rust-lang.org/std/os/unix/net/struct.UnixDatagram.html#method.bind)
* [`UnixDatagram::unbound`](http://doc.rust-lang.org/std/os/unix/net/struct.UnixDatagram.html#method.unbound)
* [`UnixDatagram::pair`](http://doc.rust-lang.org/std/os/unix/net/struct.UnixDatagram.html#method.pair)
* [`UnixDatagram::connect`](http://doc.rust-lang.org/std/os/unix/net/struct.UnixDatagram.html#method.connect)
* [`UnixDatagram::try_clone`](http://doc.rust-lang.org/std/os/unix/net/struct.UnixDatagram.html#method.try_clone)
* [`UnixDatagram::local_addr`](http://doc.rust-lang.org/std/os/unix/net/struct.UnixDatagram.html#method.local_addr)
* [`UnixDatagram::peer_addr`](http://doc.rust-lang.org/std/os/unix/net/struct.UnixDatagram.html#method.peer_addr)
* [`UnixDatagram::recv_from`](http://doc.rust-lang.org/std/os/unix/net/struct.UnixDatagram.html#method.recv_from)
* [`UnixDatagram::recv`](http://doc.rust-lang.org/std/os/unix/net/struct.UnixDatagram.html#method.recv)
* [`UnixDatagram::send_to`](http://doc.rust-lang.org/std/os/unix/net/struct.UnixDatagram.html#method.send_to)
* [`UnixDatagram::send`](http://doc.rust-lang.org/std/os/unix/net/struct.UnixDatagram.html#method.send)
* [`UnixDatagram::set_read_timeout`](http://doc.rust-lang.org/std/os/unix/net/struct.UnixDatagram.html#method.set_read_timeout)
* [`UnixDatagram::set_write_timeout`](http://doc.rust-lang.org/std/os/unix/net/struct.UnixDatagram.html#method.set_write_timeout)
* [`UnixDatagram::read_timeout`](http://doc.rust-lang.org/std/os/unix/net/struct.UnixDatagram.html#method.read_timeout)
* [`UnixDatagram::write_timeout`](http://doc.rust-lang.org/std/os/unix/net/struct.UnixDatagram.html#method.write_timeout)
* [`UnixDatagram::set_nonblocking`](http://doc.rust-lang.org/std/os/unix/net/struct.UnixDatagram.html#method.set_nonblocking)
* [`UnixDatagram::take_error`](http://doc.rust-lang.org/std/os/unix/net/struct.UnixDatagram.html#method.take_error)
* [`UnixDatagram::shutdown`](http://doc.rust-lang.org/std/os/unix/net/struct.UnixDatagram.html#method.shutdown)
* RawFd impls for `UnixDatagram`
* `{BTree,Hash}Map::values_mut`
* [`<[_]>::binary_search_by_key`](http://doc.rust-lang.org/beta/std/primitive.slice.html#method.binary_search_by_key)
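A few of the newly stabilized APIs above can be exercised together. This is an illustrative sketch, not an exhaustive tour; the Unix socket portion is gated to Unix platforms:

```rust
use std::collections::HashMap;
use std::ffi::CStr;
use std::panic;
use std::sync::atomic::{AtomicBool, AtomicUsize, Ordering};

static HOOK_RAN: AtomicBool = AtomicBool::new(false);

#[cfg(unix)]
fn unix_socket_demo() {
    use std::io::{Read, Write};
    use std::os::unix::net::UnixStream;
    // UnixStream::pair creates a connected pair of anonymous sockets.
    let (mut a, mut b) = UnixStream::pair().unwrap();
    a.write_all(b"ping").unwrap();
    let mut buf = [0u8; 4];
    b.read_exact(&mut buf).unwrap();
    assert_eq!(&buf, b"ping");
}

#[cfg(not(unix))]
fn unix_socket_demo() {}

fn main() {
    // panic::set_hook / take_hook: observe a panic before unwinding starts.
    panic::set_hook(Box::new(|info| {
        // panic::PanicInfo::location reports the panic site's file and line.
        if let Some(loc) = info.location() {
            let _ = (loc.file(), loc.line());
        }
        HOOK_RAN.store(true, Ordering::SeqCst);
    }));
    assert!(panic::catch_unwind(|| panic!("boom")).is_err());
    let _ = panic::take_hook(); // restore the default hook
    assert!(HOOK_RAN.load(Ordering::SeqCst));

    // CStr::from_bytes_with_nul: slice must end in exactly one nul byte.
    let c = CStr::from_bytes_with_nul(b"hello\0").unwrap();
    assert_eq!(c.to_str().unwrap(), "hello");
    assert!(CStr::from_bytes_with_nul(b"no nul").is_err());

    // compare_exchange: Ok(previous) on success, Err(actual) on failure.
    let a = AtomicUsize::new(5);
    assert_eq!(a.compare_exchange(5, 10, Ordering::SeqCst, Ordering::SeqCst), Ok(5));
    assert_eq!(a.compare_exchange(5, 99, Ordering::SeqCst, Ordering::SeqCst), Err(10));

    // Entry::key and values_mut on HashMap.
    let mut m = HashMap::new();
    m.insert("k", 1);
    assert_eq!(*m.entry("k").key(), "k");
    for v in m.values_mut() {
        *v += 1;
    }
    assert_eq!(m["k"], 2);

    // <[_]>::binary_search_by_key: search a slice sorted by a derived key.
    let pairs = [(0, "a"), (1, "c"), (2, "f")];
    assert_eq!(pairs.binary_search_by_key(&"f", |&(_, s)| s), Ok(2));
    assert_eq!(pairs.binary_search_by_key(&"b", |&(_, s)| s), Err(1));

    unix_socket_demo();
}
```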
Libraries
---------
* [The `abs_sub` method of floats is deprecated](https://github.com/rust-lang/rust/pull/33664).
The semantics of this minor method are subtle and probably not what
most people want.
* [Add implementation of Ord for Cell<T> and RefCell<T> where T: Ord](https://github.com/rust-lang/rust/pull/33306).
* [On Linux, if `HashMap`s can't be initialized with `getrandom` they
will fall back to `/dev/urandom` temporarily to avoid blocking
during early boot](https://github.com/rust-lang/rust/pull/33086).
* [Implemented negation for wrapping numerals](https://github.com/rust-lang/rust/pull/33067).
* [Implement `Clone` for `binary_heap::IntoIter`](https://github.com/rust-lang/rust/pull/33050).
* [Implement `Display` and `Hash` for `std::num::Wrapping`](https://github.com/rust-lang/rust/pull/33023).
* [Add `Default` implementation for `&CStr`, `CString`](https://github.com/rust-lang/rust/pull/32990).
* [Implement `From<Vec<T>>` and `Into<Vec<T>>` for `VecDeque<T>`](https://github.com/rust-lang/rust/pull/32866).
* [Implement `Default` for `UnsafeCell`, `fmt::Error`, `Condvar`,
`Mutex`, `RwLock`](https://github.com/rust-lang/rust/pull/32785).
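Several of the library additions above are small trait impls; a minimal sketch exercising a few of them:

```rust
use std::cell::{Cell, RefCell};
use std::collections::VecDeque;
use std::num::Wrapping;

fn main() {
    // Ord for Cell<T> and RefCell<T> where T: Ord.
    assert!(Cell::new(1) < Cell::new(2));
    let cells = [RefCell::new(3), RefCell::new(1), RefCell::new(2)];
    assert_eq!(*cells.iter().max().unwrap().borrow(), 3);

    // Negation and Display for wrapping numerals: unsigned negation wraps.
    assert_eq!(-Wrapping(1u8), Wrapping(255u8));
    assert_eq!(format!("{}", Wrapping(42i32)), "42");

    // From<Vec<T>> and Into<Vec<T>> for VecDeque<T>.
    let dq: VecDeque<i32> = vec![1, 2, 3].into();
    let back: Vec<i32> = dq.into();
    assert_eq!(back, vec![1, 2, 3]);
}
```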
Cargo
-----
* [Cargo.toml supports the `profile.*.panic` option](https://github.com/rust-lang/cargo/pull/2687).
This controls the runtime behavior of the `panic!` macro
and can be either "unwind" (the default), or "abort".
[RFC 1513](https://github.com/rust-lang/rfcs/blob/master/text/1513-less-unwinding.md).
* [Don't throw away errors with `-p` arguments](https://github.com/rust-lang/cargo/pull/2723).
* [Report status to stderr instead of stdout](https://github.com/rust-lang/cargo/pull/2693).
* [Build scripts are passed a `CARGO_MANIFEST_LINKS` environment
variable that corresponds to the `links` field of the manifest](https://github.com/rust-lang/cargo/pull/2710).
* [Ban keywords from crate names](https://github.com/rust-lang/cargo/pull/2707).
* [Canonicalize `CARGO_HOME` on Windows](https://github.com/rust-lang/cargo/pull/2604).
* [Retry network requests](https://github.com/rust-lang/cargo/pull/2396).
By default they are retried twice, which can be customized with the
`net.retry` value in `.cargo/config`.
* [Don't print extra error info for failing subcommands](https://github.com/rust-lang/cargo/pull/2674).
* [Add `--force` flag to `cargo install`](https://github.com/rust-lang/cargo/pull/2405).
* [Don't use `flock` on NFS mounts](https://github.com/rust-lang/cargo/pull/2623).
* [Prefer building `cargo install` artifacts in temporary directories](https://github.com/rust-lang/cargo/pull/2610).
Makes it possible to install multiple crates in parallel.
* [Add `cargo test --doc`](https://github.com/rust-lang/cargo/pull/2578).
* [Add `cargo --explain`](https://github.com/rust-lang/cargo/pull/2551).
* [Don't print warnings when `-q` is passed](https://github.com/rust-lang/cargo/pull/2576).
* [Add `cargo doc --lib` and `--bin`](https://github.com/rust-lang/cargo/pull/2577).
* [Don't require build script output to be UTF-8](https://github.com/rust-lang/cargo/pull/2560).
* [Correctly attempt multiple git usernames](https://github.com/rust-lang/cargo/pull/2584).
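For reference, the `profile.*.panic` option noted above is set in the manifest; a minimal sketch:

```toml
# Cargo.toml: abort on panic in release builds instead of unwinding.
[profile.release]
panic = "abort"   # or "unwind" (the default)
```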
Performance
-----------
* [rustc memory usage was reduced by refactoring the context used for
type checking](https://github.com/rust-lang/rust/pull/33425).
* [Speed up creation of `HashMap`s by caching the random keys used
to initialize the hash state](https://github.com/rust-lang/rust/pull/33318).
* [The `find` implementation for `Chain` iterators is 2x faster](https://github.com/rust-lang/rust/pull/33289).
* [Trait selection optimizations speed up type checking by 15%](https://github.com/rust-lang/rust/pull/33138).
* [Efficient trie lookup for boolean Unicode properties](https://github.com/rust-lang/rust/pull/33098).
10x faster than the previous lookup tables.
* [Special case `#[derive(Copy, Clone)]` to avoid bloat](https://github.com/rust-lang/rust/pull/31414).
Usability
---------
* Many incremental improvements to documentation and rustdoc.
* [rustdoc: List blanket trait impls](https://github.com/rust-lang/rust/pull/33514).
* [rustdoc: Clean up ABI rendering](https://github.com/rust-lang/rust/pull/33151).
* [Indexing with the wrong type produces a more informative error](https://github.com/rust-lang/rust/pull/33401).
* [Improve diagnostics for constants being used in irrefutable patterns](https://github.com/rust-lang/rust/pull/33406).
* [When many method candidates are in scope limit the suggestions to 10](https://github.com/rust-lang/rust/pull/33338).
* [Remove confusing suggestion when calling a `fn` type](https://github.com/rust-lang/rust/pull/33325).
* [Do not suggest changing `&mut self` to `&mut mut self`](https://github.com/rust-lang/rust/pull/33319).
Misc
----
* [Update i686-linux-android features to match Android ABI](https://github.com/rust-lang/rust/pull/33651).
* [Update aarch64-linux-android features to match Android ABI](https://github.com/rust-lang/rust/pull/33500).
* [`std` no longer prints backtraces on platforms where the running
module must be loaded with `env::current_exe`, which can't be relied
on](https://github.com/rust-lang/rust/pull/33554).
@ -2065,34 +1819,24 @@ Misc
* [The `rust-gdb` and `rust-lldb` scripts are distributed on all
Unix platforms](https://github.com/rust-lang/rust/pull/32835).
* [On Unix the runtime aborts by calling `libc::abort` instead of
generating an illegal instruction](https://github.com/rust-lang/rust/pull/31457).
* [Rust is now bootstrapped from the previous release of Rust,
instead of a snapshot from an arbitrary commit](https://github.com/rust-lang/rust/pull/32942).
Compatibility Notes
-------------------
* [`AtomicBool` is now bool-sized, not word-sized](https://github.com/rust-lang/rust/pull/33579).
* [`target_env` for Linux ARM targets is just `gnu`, not
`gnueabihf`, `gnueabi`, etc](https://github.com/rust-lang/rust/pull/33403).
* [Consistently panic on overflow in `Duration::new`](https://github.com/rust-lang/rust/pull/33072).
* [Change `String::truncate` to panic less](https://github.com/rust-lang/rust/pull/32977).
* [Add `:block` to the follow set for `:ty` and `:path`](https://github.com/rust-lang/rust/pull/32945).
Affects how macros are parsed.
* [Fix macro hygiene bug](https://github.com/rust-lang/rust/pull/32923).
* [Feature-gated attributes on macro-generated macro invocations are
now rejected](https://github.com/rust-lang/rust/pull/32791).
* [Suppress fallback and ambiguity errors during type inference](https://github.com/rust-lang/rust/pull/32258).
This caused some minor changes to type inference.
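The `AtomicBool` size change above is easy to observe directly:

```rust
use std::mem::size_of;
use std::sync::atomic::AtomicBool;

fn main() {
    // AtomicBool now occupies a single byte, same as bool,
    // rather than a full machine word.
    assert_eq!(size_of::<AtomicBool>(), size_of::<bool>());
    assert_eq!(size_of::<AtomicBool>(), 1);
}
```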

4
configure vendored
View File

@ -510,7 +510,6 @@ valopt default-ar "ar" "the default ar"
opt_nosave manage-submodules 1 "let the build manage the git submodules"
opt_nosave clang 0 "prefer clang to gcc for building the runtime"
opt_nosave jemalloc 1 "build liballoc with jemalloc"
opt elf-tls 1 "elf thread local storage on platforms where supported"
opt full-bootstrap 0 "build three compilers instead of two"
opt extended 0 "build an extended rust tool set"
@ -520,6 +519,7 @@ valopt_nosave host "${CFG_BUILD}" "GNUs ./configure syntax LLVM host triples"
valopt_nosave target "${CFG_HOST}" "GNUs ./configure syntax LLVM target triples"
valopt_nosave mandir "${CFG_PREFIX}/share/man" "install man pages in PATH"
valopt_nosave docdir "${CFG_PREFIX}/share/doc/rust" "install documentation in PATH"
valopt_nosave bindir "${CFG_PREFIX}/bin" "install binaries"
# On Windows this determines root of the subtree for target libraries.
# Host runtime libs always go to 'bin'.
@ -711,6 +711,7 @@ envopt LDFLAGS
CFG_PREFIX=${CFG_PREFIX%/}
CFG_MANDIR=${CFG_MANDIR%/}
CFG_DOCDIR=${CFG_DOCDIR%/}
CFG_BINDIR=${CFG_BINDIR%/}
CFG_HOST="$(echo $CFG_HOST | tr ',' ' ')"
CFG_TARGET="$(echo $CFG_TARGET | tr ',' ' ')"
@ -751,6 +752,7 @@ putvar CFG_X86_64_LINUX_ANDROID_NDK
putvar CFG_NACL_CROSS_PATH
putvar CFG_MANDIR
putvar CFG_DOCDIR
putvar CFG_BINDIR
putvar CFG_USING_LIBCPP
msg

View File

@ -1,4 +1,4 @@
.TH RUSTDOC "1" "September 2016" "rustdoc 1.13.0" "User Commands"
.TH RUSTDOC "1" "May 2017" "rustdoc 1.19.0" "User Commands"
.SH NAME
rustdoc \- generate documentation from Rust source code
.SH SYNOPSIS
@ -15,14 +15,13 @@ provides several output formats for the generated documentation.
.TP
\fB\-r\fR, \fB\-\-input\-format\fR \fIFORMAT\fR
html or json (default: inferred)
rust
.TP
\fB\-w\fR, \fB\-\-output\-format\fR \fIFORMAT\fR
html or json (default: html)
html
.TP
\fB\-o\fR, \fB\-\-output\fR \fIOUTPUT\fR
where to place the output (default: \fIdoc/\fR for html,
\fIdoc.json\fR for json)
\fB\-o\fR, \fB\-\-output\fR \fIOUTPUT\fR,
where to place the output (default: \fIdoc/\fR for html)
.TP
\fB\-\-passes\fR \fILIST\fR
space\[hy]separated list of passes to run (default: '')
@ -60,14 +59,25 @@ pass arguments to the test runner
\fB\-\-html\-in\-header\fR \fIFILE\fR
file to add to <head>
.TP
\fB\-\-html\-before\-content\fR \fIFILE\fR
file to add in <body>, before content
\fB\-\-html\-before\-content\fR \fIFILES\fR
files to include inline between <body> and the content of a rendered Markdown
file or generated documentation
.TP
\fB\-\-html\-after\-content\fR \fIFILE\fR
file to add in <body>, after content
\fB\-\-markdown\-before\-content\fR \fIFILES\fR
files to include inline between <body> and the content of a rendered
Markdown file or generated documentation
.TP
\fB\-\-markdown\-css\fR \fIFILE\fR
CSS files to include via <link> in a rendered Markdown file
\fB\-\-html\-after\-content\fR \fIFILES\fR
files to include inline between the content and </body> of a rendered
Markdown file or generated documentation
.TP
\fB\-\-markdown\-after\-content\fR \fIFILES\fR
files to include inline between the content and </body> of a rendered
Markdown file or generated documentation
.TP
\fB\-\-markdown\-css\fR \fIFILES\fR
CSS files to include via <link> in a rendered Markdown file or
generated documentation
.TP
\fB\-\-markdown\-playground\-url\fR \fIURL\fR
URL to send code snippets to
@ -75,40 +85,21 @@ URL to send code snippets to
\fB\-\-markdown\-no\-toc\fR
don't include table of contents
.TP
\fB\-h\fR, \fB\-\-help\fR
Print help
\fB\-\-extend\-css\fR \fIFILE\fR
redefine some CSS rules with a given file to generate documentation with your own theme
.TP
\fB\-V\fR, \fB\-\-version\fR
Print rustdoc's version
.SH "OUTPUT FORMATS"
The rustdoc tool can generate output in either an HTML or JSON format.
The rustdoc tool can generate output in an HTML format.
If using an HTML format, then the specified output destination will be the root
directory of an HTML structure for all the documentation.
Pages will be placed into this directory, and source files will also
possibly be rendered into it as well.
If using a JSON format, then the specified output destination will have the
rustdoc output serialized as JSON into it.
This output format exists to pre\[hy]compile documentation for crates,
and for usage in non\[hy]rustdoc tools.
The JSON output is the following hash:
{
"schema": VERSION,
"crate": ...,
"plugins": ...,
}
The schema version indicates what the structure of crate/plugins will
look like.
Within a schema version the structure will remain the same.
The \fIcrate\fR field will contain all relevant documentation for the
source being documented, and the \fIplugins\fR field will contain the
output of the plugins run over the crate.
.SH "EXAMPLES"
To generate documentation for the source in the current directory:
@ -117,11 +108,6 @@ To generate documentation for the source in the current directory:
List all available passes that rustdoc has, along with default passes:
$ rustdoc \-\-passes list
To precompile the documentation for a crate, and then use it to render html at
a later date:
$ rustdoc \-w json hello.rs
$ rustdoc doc.json
The generated HTML can be viewed with any standard web browser.
.SH "SEE ALSO"

View File

@ -1,17 +0,0 @@
language: rust
sudo: true
cache: cargo
os:
- linux
- osx
rust:
- nightly
before_install:
- if [[ "$TRAVIS_OS_NAME" == "linux" ]]; then sudo add-apt-repository ppa:kubuntu-ppa/backports -y; fi
- if [[ "$TRAVIS_OS_NAME" == "linux" ]]; then sudo apt-get update -qq; fi
- if [[ "$TRAVIS_OS_NAME" == "linux" ]]; then sudo apt-get install -qq cmake=2.8.12.2-0ubuntu1~ubuntu12.04.1~ppa2; fi
- if [[ "$TRAVIS_OS_NAME" == "osx" ]]; then brew update; fi
- if [[ "$TRAVIS_OS_NAME" == "osx" ]]; then brew upgrade cmake; fi
script:
- cargo build --verbose
- cargo test --release --verbose

View File

@ -1,40 +0,0 @@
Short version for non-lawyers:
The Rust Project is dual-licensed under Apache 2.0 and MIT
terms.
Longer version:
The Rust Project is copyright 2010, The Rust Project
Developers.
Licensed under the Apache License, Version 2.0
<LICENSE-APACHE or
http://www.apache.org/licenses/LICENSE-2.0> or the MIT
license <LICENSE-MIT or http://opensource.org/licenses/MIT>,
at your option. All files in the project carrying such
notice may not be copied, modified, or distributed except
according to those terms.
* Additional copyright may be retained by contributors other
than Mozilla, the Rust Project Developers, or the parties
enumerated in this file. Such copyright can be determined
on a case-by-case basis by examining the author of each
portion of a file in the revision-control commit records
of the project, or by consulting representative comments
claiming copyright ownership for a file.
For example, the text:
"Copyright (c) 2011 Google Inc."
appears in some files, and these files thereby denote
that their author and copyright-holder is Google Inc.
In all such cases, the absence of explicit licensing text
indicates that the contributor chose to license their work
for distribution under identical terms to those Mozilla
has chosen for the collective work, enumerated at the top
of this file. The only difference is the retention of
copyright itself, held by the contributor.

1349
rls/Cargo.lock generated

File diff suppressed because it is too large Load Diff

View File

@ -1,28 +0,0 @@
[package]
name = "rls"
version = "0.1.0"
authors = ["Jonathan Turner <jturner@mozilla.com>"]
[dependencies]
cargo = { git = "https://github.com/rust-lang/cargo" }
derive-new = "0.3"
env_logger = "0.3"
languageserver-types = { git = "https://github.com/gluon-lang/languageserver-types" }
log = "0.3"
racer = { git = "https://github.com/phildawes/racer" }
rls-analysis = { git = "https://github.com/nrc/rls-analysis" }
rls-data = "0.1"
rls-span = { version = "0.1", features = ["serialize-serde"] }
rls-vfs = { git = "https://github.com/nrc/rls-vfs", features = ["racer-impls"] }
rustc-serialize = "0.3"
rustfmt = { git = "https://github.com/rust-lang-nursery/rustfmt" }
serde = "0.9"
serde_json = "0.9"
serde_derive = "0.9"
toml = "0.3"
url = "1.1.0"
url_serde = "0.1.0"
[dependencies.hyper]
version = "0.9"
default-features = false

View File

@ -1,123 +0,0 @@
[![Build Status](https://travis-ci.org/rust-lang-nursery/rls.svg?branch=master)](https://travis-ci.org/rust-lang-nursery/rls) [![Build status](https://ci.appveyor.com/api/projects/status/cxfejvsqnnc1oygs?svg=true)](https://ci.appveyor.com/project/jonathandturner/rls-x6grn)
# Rust Language Server (RLS)
**This project is in the alpha stage of development. It is likely to be buggy in
some situations; proceed with caution.**
The RLS provides a server that runs in the background, providing IDEs,
editors, and other tools with information about Rust programs. It supports
functionality such as 'goto definition', symbol search, reformatting, and code
completion, and enables renaming and refactorings.
The RLS gets its source data from the compiler and from
[Racer](https://github.com/phildawes/racer). Where possible it uses data from
the compiler, which is precise and complete. Where that's not possible (for
example, for code completion, or where building is too slow), it uses Racer.
Since the Rust compiler does not yet support end-to-end incremental compilation,
we can't offer a perfect experience. However, by optimising our use of the
compiler and falling back to Racer, we can offer a pretty good experience for
small to medium sized crates. As the RLS and compiler evolve, we'll offer a
better experience for larger and larger crates.
The RLS is designed to be frontend-independent. We hope it will be widely
adopted by different editors and IDEs. To seed development, we provide a
[reference implementation of an RLS frontend](https://github.com/jonathandturner/rls_vscode)
for [Visual Studio Code](https://code.visualstudio.com/).
## Setup
### Step 1: Install rustup
You can install [rustup](http://rustup.rs/) on many platforms. This will help us quickly install the
RLS and its dependencies.
### Step 2: Switch to nightly
Switch to the nightly compiler:
```
rustup default nightly
rustup update nightly
```
### Step 3: Install the RLS
Once you have rustup installed, run the following commands:
```
rustup component add rls
rustup component add rust-analysis
rustup component add rust-src
```
If you've never set up Racer before, you may also need to follow the [Racer configuration
steps](https://github.com/phildawes/racer#configuration)
## Running
Though the RLS is built to work with many IDEs and editors, we currently use
VSCode to test the RLS.
To run with VSCode, you'll need a
[recent VSCode version](https://code.visualstudio.com/download) installed.
Next, you'll need to run the VSCode extension (for this step, you'll need a
recent [node](https://nodejs.org/en/) installed):
```
git clone https://github.com/jonathandturner/rls_vscode.git
cd rls_vscode
npm install
code .
```
VSCode will open into the `rls_vscode` project. From here, click the Debug
button on the left-hand side (a bug with a line through it). Next, click the
green triangle at the top. This will launch a new instance of VSCode with the
`rls_vscode` plugin enabled. From there, you can open your Rust projects using
the RLS.
You'll know it's working when you see this in the status bar at the bottom, with
a spinning indicator:
`RLS analysis: working /`
Once you see:
`RLS analysis: done`
Then you have the full set of capabilities available to you. You can goto def,
find all refs, rename, goto type, etc. Completions are also available using the
heuristics that Racer provides. As you type, your code will be checked and
error squiggles will be reported when errors occur. You can hover these
squiggles to see the text of the error.
## Configuration
The RLS can be configured on a per-project basis by adding a file called
`rls.toml` to the project root (i.e., next to Cargo.toml). Entries in this file
will affect how the RLS operates and how it builds your project.
Currently we accept the following options:
* `build_lib` (`bool`, defaults to `false`) checks the project as if you passed
the `--lib` argument to cargo.
* `cfg_test` (`bool`, defaults to `true`) checks the project as if you were
running `cargo test` rather than `cargo build`. I.e., compiles (but does not
run) test code.
* `unstable_features` (`bool`, defaults to `false`) enables unstable features.
Currently, this includes renaming and formatting.
* `sysroot` (`String`, defaults to `""`) if the given string is not empty, use
the given path as the sysroot for all rustc invocations instead of trying to
detect the sysroot automatically
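
For illustration, a project that wants its library target checked might drop an `rls.toml` like this next to its Cargo.toml (the values here are examples, not recommendations):

```toml
# rls.toml -- lives next to Cargo.toml
build_lib = true           # check as if `--lib` were passed to cargo
cfg_test = true            # compile (but don't run) test code
unstable_features = false  # keep renaming/formatting disabled
sysroot = ""               # empty: detect the sysroot automatically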
## Contributing
You can look in the [contributing.md](https://github.com/rust-lang-nursery/rls/blob/master/contributing.md)
in this repo to learn more about contributing to this project.

View File

@ -1,41 +0,0 @@
environment:
global:
RUST_TEST_THREADS: 1
PROJECT_NAME: rls
matrix:
# Nightly channel
#- TARGET: i686-pc-windows-gnu
# CHANNEL: nightly
# BITS: 32
- TARGET: i686-pc-windows-msvc
CHANNEL: nightly
BITS: 32
#- TARGET: x86_64-pc-windows-gnu
# CHANNEL: nightly
# BITS: 64
- TARGET: x86_64-pc-windows-msvc
CHANNEL: nightly
BITS: 64
install:
- set PATH=C:\msys64\mingw%BITS%\bin;C:\msys64\usr\bin;%PATH%
- curl -sSf -o rustup-init.exe https://win.rustup.rs
# Install rust, x86_64-pc-windows-msvc host
- rustup-init.exe -y --default-host x86_64-pc-windows-msvc --default-toolchain nightly-x86_64-pc-windows-msvc
# Install the target we're compiling for
- set PATH=%PATH%;C:\Users\appveyor\.cargo\bin
- set PATH=%PATH%;C:\Users\appveyor\.multirust\toolchains\nightly-x86_64-pc-windows-msvc\lib\rustlib\%TARGET%\lib
- if NOT "%TARGET%" == "x86_64-pc-windows-msvc" rustup target add %TARGET%
- rustc -Vv
- cargo -V
build: false
test_script:
- set RUST_TEST_THREADS=1
- cargo test --release --target %TARGET% --verbose
cache:
- target
- C:\Users\appveyor\.cargo\registry

View File

@ -1,300 +0,0 @@
# Contributing
This document provides information for developers who want to contribute to the
RLS or run it in a heavily customised configuration.
The RLS is open source and we'd love you to contribute to the project. Testing,
reporting issues, writing documentation, writing tests, writing code, and
implementing clients are all extremely valuable.
Here is the list of known [issues](https://github.com/rust-lang-nursery/rls/issues).
These are [good issues to start on](https://github.com/rust-lang-nursery/rls/issues?q=is%3Aopen+is%3Aissue+label%3Aeasy).
We're happy to help however we can. The best way to get help is either to
leave a comment on an issue in this repo, or to ping us (nrc or jntrnr) in #rust-tools
on IRC.
We'd love for existing and new tools to use the RLS. If that sounds interesting
please get in touch by filing an issue or on IRC.
## Building
**YOU NEED A VERY RECENT NIGHTLY COMPILER**
Otherwise the RLS will not work very well. Note that you don't need to build the
RLS yourself to use it: installing via `rustup` is currently the preferred
method. See the [readme](README.md) for more information.
### Step 1: Install build dependencies
On Linux, you will need [pkg-config](https://www.freedesktop.org/wiki/Software/pkg-config/)
and [zlib](http://zlib.net/):
- On Ubuntu run: `sudo apt-get install pkg-config zlib1g-dev`
- On Fedora run: `sudo dnf install pkgconfig zlib-devel`
### Step 2: Clone and build the RLS
Since the RLS is closely linked to the compiler and is in active development,
you'll need a recent nightly compiler to build it.
```
git clone https://github.com/rust-lang-nursery/rls.git
cd rls
cargo build --release
```
### Step 3: Connect the RLS to your compiler
If you're using recent versions of rustup, you will also need to make sure that
the compiler's dynamic libraries are available for the RLS to load. You can see
where they are using:
```
rustc --print sysroot
```
This will show you where the compiler keeps the dynamic libs. On Windows, this
will be in the `bin` directory under this path. On other platforms, it will be
in the `lib` directory.
Next, you'll make the compiler available to the RLS:
#### Windows
On Windows, make sure this path (plus `bin`) is in your PATH. For example:
```
set PATH=%PATH%;C:\Users\appveyor\.multirust\toolchains\nightly-i686-pc-windows-gnu\bin
```
#### Mac
For Mac, you need to set the DYLD_LIBRARY_PATH. For example:
```
export DYLD_LIBRARY_PATH=$(rustc --print sysroot)/lib
```
#### Linux
For Linux, this path is called LD_LIBRARY_PATH.
```
export LD_LIBRARY_PATH=$(rustc --print sysroot)/lib
```
### Step 4: Set your RLS_ROOT
Next, we'll set the RLS_ROOT environment variable to point to where we built
the RLS:
```
export RLS_ROOT=/Source/rls
```
### Step 5: Download standard library metadata
Finally, we need to get the metadata for the standard library. This lets
us get additional docs and types for all of `std`. The command is currently only
supported on the nightly compilers, though we hope to remove this restriction in
the future.
```
rustup component add rust-analysis
```
If you've never set up Racer before, you may also need to follow the [Racer configuration
steps](https://github.com/phildawes/racer#configuration)
## Running and testing
You can run the RLS by hand with:
```
cargo run
```
Though more commonly, you'll use an IDE plugin to invoke it for you.
Test using `cargo test`.
Testing is unfortunately minimal. There is support for regression tests, but not
many actual tests exist yet. There is significant [work to do](https://github.com/rust-lang-nursery/rls/issues/12)
before we have a comprehensive testing story.
## Standard library support
The way it works is that when the libraries are built, the compiler can emit all
the data that the RLS needs. This can be read by the RLS on startup and used to
provide things like type on hover without having access to the source code for
the libraries.
The compiler gives every definition an id, and the RLS matches up these ids. In
order for the RLS to work, the id of an identifier used in the IDE and the id of
its declaration in a library must match exactly. Since ids are very unstable,
the data used by the RLS for libraries must match exactly with the crate that
your source code links with.
You need a version of the above data which exactly matches the standard
libraries you will use with your project. Rustup takes care of this for you and
is the preferred (and easiest) method for installing this data. If you want to
use the RLS with a Rust compiler/libraries you have built yourself, then you'll
need to take some extra steps.
### Install with rustup
You'll need to be using [rustup](https://www.rustup.rs/) to manage your Rust
compiler toolchains. The RLS does not yet support cross-compilation - your
compiler host and target must be exactly the same.
You must be using nightly (you need to be using nightly for the RLS to work at
the moment in any case). To install a nightly toolchain use `rustup install
nightly`. To switch to using that nightly toolchain by default use `rustup
default nightly`.
Add the RLS data component using `rustup component add rust-analysis`.
Everything should now work! You may need to restart the RLS.
### Build it yourself
When you build Rust, add `-Zsave-analysis-api` to your stage 2 flags, e.g., by
setting the environment variable:
```
export RUSTFLAGS_STAGE2='-Zsave-analysis-api'
```
When the build has finished, you should have a bunch of JSON data in a directory like
`~/rust1/build/x86_64-unknown-linux-gnu/stage1-std/x86_64-unknown-linux-gnu/release/deps/save-analysis`.
You need to copy all those files (should be around 16) into a new directory:
`~/rust1/build/x86_64-unknown-linux-gnu/stage2/lib/rustlib/x86_64-unknown-linux-gnu/analysis`
(assuming you are running the stage 2 compiler you just built). You'll need to
modify the root directory (`~/rust1` here) and the host triple
(`x86_64-unknown-linux-gnu` in both places).
Finally, to run the RLS you'll need to set things up to use the newly built
compiler, something like:
```
export RUSTC="~/rust1/build/x86_64-unknown-linux-gnu/stage2/bin/rustc"
```
Either before you run the RLS, or before you run the IDE which will start the
RLS.
### Details
Rustup (or you, manually) will install the RLS data (which is a bunch of JSON
files) into `$SYSROOT/lib/rustlib/$TARGET_TRIPLE/analysis`, where `$SYSROOT` is
your Rust sysroot, which can be found using `rustc --print=sysroot`.
`$TARGET_TRIPLE` is the triple which defines the compilation target. Since the
RLS currently does not support cross-compilation, this must match your host
triple. It will look something like `x86_64-unknown-linux-gnu`.
For example, on my system RLS data is installed at:
```
/home/ncameron/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/analysis
```
This data is only for the standard libraries, project-specific data is stored
inside your project's target directory.
## Implementation overview
The goal of the RLS project is to provide an awesome IDE experience *now*. That
means not waiting for incremental compilation support in the compiler. However,
Rust is a somewhat complex language to analyse and providing precise and
complete information about programs requires using the compiler.
The RLS has two data sources - the compiler and Racer. The compiler is always
right, and always precise, but it can sometimes be too slow for IDEs. Racer is
nearly always fast, but can't handle some constructs (e.g., macros) or can only
handle them with limited precision (e.g., complex generic types).
The RLS tries to provide data using the compiler. It sets a time budget and
queries both the compiler and Racer. If the compiler completes within the time
budget, we use that data. If not, we use Racer's data.
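
The time-budget fallback described above can be sketched as a toy model: a spawned thread stands in for the compiler query, and `recv_timeout` enforces the budget. This is an illustration of the pattern, not the RLS's actual code; the names and string payloads are made up.

```rust
use std::sync::mpsc;
use std::thread;
use std::time::Duration;

/// Where an answer came from.
#[derive(Debug, PartialEq)]
enum Answer {
    Compiler(String),
    Racer(String),
}

/// Query the slow-but-precise source with a time budget; if it doesn't
/// answer in time, fall back to the fast heuristic source.
/// `compiler_delay` stands in for however long the real compiler takes.
fn query_with_budget(budget: Duration, compiler_delay: Duration) -> Answer {
    let (tx, rx) = mpsc::channel();
    thread::spawn(move || {
        thread::sleep(compiler_delay); // pretend to do compiler work
        let _ = tx.send("precise answer".to_string());
    });
    match rx.recv_timeout(budget) {
        Ok(data) => Answer::Compiler(data),
        Err(_) => Answer::Racer("heuristic answer".to_string()),
    }
}

fn main() {
    // A fast compiler answer is used; a slow one triggers the fallback.
    println!("{:?}", query_with_budget(Duration::from_millis(500), Duration::from_millis(10)));
    println!("{:?}", query_with_budget(Duration::from_millis(50), Duration::from_millis(500)));
}
```

Note that the worker thread is simply abandoned when the budget expires; the real system would still want to collect the compiler's data when it eventually arrives.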
We link both Racer and the compiler into the RLS, so we don't need to shell out
to either (though see notes on the build process below). We also customise our
use of the compiler (via standard APIs) so that we can read modified files
directly from memory without saving them to disk.
### Building
The RLS tracks changes to files, and keeps the changed file in memory (i.e., the
RLS does not need the IDE to save a file before providing data). These changed
files are tracked by the 'Virtual File System' (which is a bit of a grandiose
name for a pretty simple file cache at the moment, but I expect this area to
grow significantly in the future). The VFS is in a [separate
crate](https://github.com/nrc/rls-vfs).
We want to start building before the user needs information (it would be too
slow to start a build when data is requested). However, we don't want to start a
build on every keystroke (this would be too heavy on user resources). Nor is
there any point starting multiple builds when we would throw away the data from
some of them. We therefore try to queue up and coalesce builds. This is further
documented in [src/build.rs](src/build.rs).
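
A minimal sketch of the coalescing idea (this is not the real `BuildQueue`, which also handles priorities and timing): while edits keep arriving, only the newest pending request survives, and the ones it replaced are "squashed".

```rust
/// A toy coalescing queue: at most one build request is pending at a time;
/// a newer request replaces (squashes) an older one that has not started.
struct BuildQueue {
    pending: Option<u64>, // id of the request that has not started yet
}

impl BuildQueue {
    fn new() -> BuildQueue {
        BuildQueue { pending: None }
    }

    /// Register a request; returns the id of the request it squashed, if any.
    fn request(&mut self, id: u64) -> Option<u64> {
        self.pending.replace(id)
    }

    /// Start the pending build, if there is one.
    fn run(&mut self) -> Option<u64> {
        self.pending.take()
    }
}

fn main() {
    let mut q = BuildQueue::new();
    q.request(1);
    q.request(2); // squashes 1
    q.request(3); // squashes 2
    println!("build that actually runs: {:?}", q.run()); // Some(3)
}
```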
When we do start a build, we may also need to build dependent crates. We
therefore do a full `cargo build`. However, we do not compile the last crate
(the one the user is editing in the IDE). We only run Cargo to get a command
line to build that crate. Furthermore, we cache that command line, so for most
builds (where we don't need to build dependent crates, and where we can be
reasonably sure they haven't changed since a previous build) we don't run Cargo
at all.
The command line we got from Cargo, we chop up and feed to the in-process
compiler. We then collect error messages and analysis data in JSON format
(although this is inefficient and [should
change](https://github.com/rust-lang-nursery/rls/issues/25)).
### Analysis data
From the compiler, we get a serialised dump of its analysis data (from name
resolution and type checking). We merge data from all crates and the standard
libraries into an index for the whole project. We cross-reference and store this
data in HashMaps and use it to look up data for the IDE.
Reading, processing, and storing the analysis data is handled by the
[rls-analysis crate](https://github.com/nrc/rls-analysis).
### Communicating with IDEs
The RLS communicates with IDEs via
the [Language Server protocol](https://github.com/Microsoft/language-server-protocol/blob/master/protocol.md).
The LS protocol uses JSON sent over stdin/stdout. The JSON is rather dynamic;
we can't easily write structs that map onto many of the protocol objects. The client
sends commands and notifications to the RLS. Commands must get a reply,
notifications do not. Usually the structure of the reply is dictated by the
protocol spec. The RLS can also send notifications to the client. So for a long
running task (such as a build), the RLS will reply quickly to acknowledge the
request, then send a message later with the result of the task.
Associating requests with replies is done using an id which must be handled by
the RLS.
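
That id bookkeeping can be sketched as a small table of in-flight requests. This is a simplified stand-in for what an LSP endpoint must track, not the RLS's actual implementation; the method names are just examples.

```rust
use std::collections::HashMap;

/// A toy table of in-flight requests: each outgoing request records its id,
/// and a later reply is matched back to its request via that id.
struct PendingRequests {
    methods: HashMap<u64, String>,
}

impl PendingRequests {
    fn new() -> PendingRequests {
        PendingRequests { methods: HashMap::new() }
    }

    /// Record an outgoing request so its reply can be recognised later.
    fn sent(&mut self, id: u64, method: &str) {
        self.methods.insert(id, method.to_string());
    }

    /// A reply arrived: look up (and forget) the request it answers.
    fn reply(&mut self, id: u64) -> Option<String> {
        self.methods.remove(&id)
    }
}

fn main() {
    let mut pending = PendingRequests::new();
    pending.sent(1, "textDocument/definition");
    pending.sent(2, "textDocument/hover");
    // Replies can arrive out of order; the id ties each back to its request.
    println!("{:?}", pending.reply(2));
    println!("{:?}", pending.reply(1));
}
```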
### Extensions to the Language Server Protocol
The RLS uses some custom extensions to the Language Server Protocol. Currently
these are all sent from the RLS to an LSP client and are only used to improve
the user experience by showing progress indicators.
* `rustDocument/diagnosticsBegin`: notification, no arguments. Sent before a
build starts and before any diagnostics from a build are sent.
* `rustDocument/diagnosticsEnd`: notification, no arguments. Sent when a build
is complete (successfully or not, or even skipped) and all post-build analysis
by the RLS is complete.
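
Assuming the usual JSON-RPC 2.0 framing that LSP messages use, such a no-argument notification is just a method name on the wire. A minimal sketch of constructing one (body only; the real transport also prepends `Content-Length` headers):

```rust
/// Build the wire text for one of the RLS's no-argument progress
/// notifications, assuming standard JSON-RPC 2.0 framing of LSP messages.
fn progress_notification(method: &str) -> String {
    format!(r#"{{"jsonrpc":"2.0","method":"{}"}}"#, method)
}

fn main() {
    println!("{}", progress_notification("rustDocument/diagnosticsBegin"));
    println!("{}", progress_notification("rustDocument/diagnosticsEnd"));
}
```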

View File

@ -1,5 +0,0 @@
# jntrnr's test
#curl -v -H "Content-Type: application/json" -X POST -d '{{"pos": {"filepath":"sample_project/src/main.rs","line":22,"col":5}, "span":{"file_name":"sample_project/src/main.rs","line_start":22,"column_start":5,"line_end":22,"column_end":6}}}' 127.0.0.1:9000/goto_def
# nrc's test
curl -v -H "Content-Type: application/json" -X POST -d '{{"pos": {"filepath":"sample_project_2/src/main.rs","line":18,"col":15}, "span":{"file_name":"src/main.rs","line_start":18,"column_start":13,"line_end":18,"column_end":16}}}' 127.0.0.1:9000/goto_def

View File

@ -1,6 +0,0 @@
export PATH="$PWD/target/debug:$PATH"
#export RUSTC="/home/ncameron/rust/x86_64-unknown-linux-gnu/stage2/bin/rustc"
#export SYS_ROOT="/home/ncameron/rust/x86_64-unknown-linux-gnu/stage2"
#export SYS_ROOT="/usr/local"
export RUST_BACKTRACE=1
cargo build && code

View File

@ -1,126 +0,0 @@
// Copyright 2017 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
use std::path::PathBuf;
use ls_types::{DiagnosticSeverity, NumberOrString};
use serde_json;
use span::compiler::DiagnosticSpan;
use span;
use actions::lsp_extensions::{RustDiagnostic, LabelledRange};
use lsp_data::ls_util;
#[derive(Debug, Deserialize)]
struct CompilerMessageCode {
code: String
}
#[derive(Debug, Deserialize)]
struct CompilerMessage {
message: String,
code: Option<CompilerMessageCode>,
level: String,
spans: Vec<DiagnosticSpan>,
children: Vec<CompilerMessage>,
}
#[derive(Debug)]
pub struct FileDiagnostic {
pub file_path: PathBuf,
pub diagnostic: RustDiagnostic,
}
#[derive(Debug)]
pub enum ParseError {
JsonError(serde_json::Error),
NoSpans,
}
impl From<serde_json::Error> for ParseError {
fn from(error: serde_json::Error) -> Self {
ParseError::JsonError(error)
}
}
pub fn parse(message: &str) -> Result<FileDiagnostic, ParseError> {
let message = serde_json::from_str::<CompilerMessage>(message)?;
if message.spans.is_empty() {
return Err(ParseError::NoSpans);
}
let message_text = compose_message(&message);
let primary = message.spans.iter()
.filter(|x| x.is_primary)
.collect::<Vec<&span::compiler::DiagnosticSpan>>()[0].clone();
let primary_span = primary.rls_span().zero_indexed();
let primary_range = ls_util::rls_to_range(primary_span.range);
// build up the secondary spans
let secondary_labels: Vec<LabelledRange> = message.spans.iter()
.filter(|x| !x.is_primary)
.map(|x| {
let secondary_range = ls_util::rls_to_range(x.rls_span().zero_indexed().range);
LabelledRange {
start: secondary_range.start,
end: secondary_range.end,
label: x.label.clone(),
}
}).collect();
let diagnostic = RustDiagnostic {
range: LabelledRange {
start: primary_range.start,
end: primary_range.end,
label: primary.label.clone(),
},
secondaryRanges: secondary_labels,
severity: Some(if message.level == "error" {
DiagnosticSeverity::Error
} else {
DiagnosticSeverity::Warning
}),
code: Some(NumberOrString::String(match message.code {
Some(c) => c.code.clone(),
None => String::new(),
})),
source: Some("rustc".into()),
message: message_text,
};
Ok(FileDiagnostic {
file_path: primary_span.file.clone(),
diagnostic: diagnostic
})
}
/// Builds a more sophisticated error message
fn compose_message(compiler_message: &CompilerMessage) -> String {
let mut message = compiler_message.message.clone();
for sp in &compiler_message.spans {
if !sp.is_primary {
continue;
}
if let Some(ref label) = sp.label {
message.push_str("\n");
message.push_str(label);
}
}
if !compiler_message.children.is_empty() {
message.push_str("\n");
for child in &compiler_message.children {
message.push_str(&format!("\n{}: {}", child.level, child.message));
}
}
message
}

View File

@ -1,62 +0,0 @@
// Copyright 2017 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
use url_serde;
use lsp_data::*;
use url::Url;
#[derive(Debug, PartialEq, Deserialize, Serialize)]
pub struct PublishRustDiagnosticsParams {
/// The URI for which diagnostic information is reported.
#[serde(deserialize_with = "url_serde::deserialize", serialize_with = "url_serde::serialize")]
pub uri: Url,
/// An array of diagnostic information items.
pub diagnostics: Vec<RustDiagnostic>,
}
/// A range in a text document expressed as (zero-based) start and end positions.
/// A range is comparable to a selection in an editor. Therefore the end position is exclusive.
#[derive(Debug, PartialEq, Clone, Default, Deserialize, Serialize)]
pub struct LabelledRange {
/// The range's start position.
pub start: Position,
/// The range's end position.
pub end: Position,
/// The optional label.
pub label: Option<String>,
}
/// Represents a diagnostic, such as a compiler error or warning.
/// Diagnostic objects are only valid in the scope of a resource.
#[allow(non_snake_case)]
#[derive(Debug, PartialEq, Clone, Default, Deserialize, Serialize)]
pub struct RustDiagnostic {
/// The primary range at which the message applies.
pub range: LabelledRange,
/// The secondary ranges that apply to the message
pub secondaryRanges: Vec<LabelledRange>,
/// The diagnostic's severity. Can be omitted. If omitted it is up to the
/// client to interpret diagnostics as error, warning, info or hint.
pub severity: Option<DiagnosticSeverity>,
/// The diagnostic's code. Can be omitted.
pub code: Option<NumberOrString>,
/// A human-readable string describing the source of this
/// diagnostic, e.g. 'typescript' or 'super lint'.
pub source: Option<String>,
/// The diagnostic's message.
pub message: String,
}

View File

@ -1,532 +0,0 @@
// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
mod compiler_message_parsing;
mod lsp_extensions;
use analysis::{AnalysisHost};
use hyper::Url;
use vfs::{Vfs, Change};
use racer;
use rustfmt::{Input as FmtInput, format_input};
use rustfmt::config::{self, WriteMode};
use serde_json;
use span;
use Span;
use build::*;
use lsp_data::*;
use server::{ResponseData, Output};
use std::collections::HashMap;
use std::panic;
use std::path::{Path, PathBuf};
use std::sync::{Arc, Mutex};
use std::thread;
use std::time::Duration;
use self::lsp_extensions::{PublishRustDiagnosticsParams, RustDiagnostic};
use self::compiler_message_parsing::{FileDiagnostic, ParseError};
type BuildResults = HashMap<PathBuf, Vec<RustDiagnostic>>;
pub struct ActionHandler {
analysis: Arc<AnalysisHost>,
vfs: Arc<Vfs>,
build_queue: Arc<BuildQueue>,
current_project: Mutex<Option<PathBuf>>,
previous_build_results: Mutex<BuildResults>,
}
impl ActionHandler {
pub fn new(analysis: Arc<AnalysisHost>,
vfs: Arc<Vfs>,
build_queue: Arc<BuildQueue>) -> ActionHandler {
ActionHandler {
analysis: analysis,
vfs: vfs,
build_queue: build_queue,
current_project: Mutex::new(None),
previous_build_results: Mutex::new(HashMap::new()),
}
}
pub fn init(&self, root_path: PathBuf, out: &Output) {
{
let mut results = self.previous_build_results.lock().unwrap();
results.clear();
}
{
let mut current_project = self.current_project.lock().unwrap();
*current_project = Some(root_path.clone());
}
self.build(&root_path, BuildPriority::Immediate, out);
}
pub fn build(&self, project_path: &Path, priority: BuildPriority, out: &Output) {
fn clear_build_results(results: &mut BuildResults) {
// We must not clear the hashmap, just the values in each list.
// This allows us to reuse the previously allocated memory.
for v in &mut results.values_mut() {
v.clear();
}
}
fn parse_compiler_messages(messages: &[String], results: &mut BuildResults) {
for msg in messages {
match compiler_message_parsing::parse(msg) {
Ok(FileDiagnostic { file_path, diagnostic }) => {
results.entry(file_path).or_insert_with(Vec::new).push(diagnostic);
}
Err(ParseError::JsonError(e)) => {
debug!("build error {:?}", e);
debug!("from {}", msg);
}
Err(ParseError::NoSpans) => {}
}
}
}
fn convert_build_results_to_notifications(build_results: &BuildResults,
project_path: &Path)
-> Vec<NotificationMessage<PublishRustDiagnosticsParams>>
{
let cwd = ::std::env::current_dir().unwrap();
build_results
.iter()
.map(|(path, diagnostics)| {
let method = "textDocument/publishDiagnostics".to_string();
let params = PublishRustDiagnosticsParams {
uri: Url::from_file_path(cwd.join(path)).unwrap(),
diagnostics: diagnostics.clone(),
};
NotificationMessage::new(method, params)
})
.collect()
}
// We use the `rustDocument` prefix here since these notifications are
// custom to the RLS and not part of the LS protocol.
out.notify("rustDocument/diagnosticsBegin");
debug!("build {:?}", project_path);
let result = self.build_queue.request_build(project_path, priority);
match result {
BuildResult::Success(x, analysis) | BuildResult::Failure(x, analysis) => {
debug!("build - Success");
// These notifications will include empty sets of errors for files
// which had errors, but now don't. This instructs the IDE to clear
// errors for those files.
let notifications = {
let mut results = self.previous_build_results.lock().unwrap();
clear_build_results(&mut results);
parse_compiler_messages(&x, &mut results);
convert_build_results_to_notifications(&results, project_path)
};
// TODO we don't send an OK notification if there were no errors
for notification in notifications {
// FIXME(43) factor out the notification mechanism.
let output = serde_json::to_string(&notification).unwrap();
out.response(output);
}
trace!("reload analysis: {:?}", project_path);
let cwd = ::std::env::current_dir().unwrap();
if let Some(analysis) = analysis {
self.analysis.reload_from_analysis(analysis, project_path, &cwd, false).unwrap();
} else {
self.analysis.reload(project_path, &cwd, false).unwrap();
}
out.notify("rustDocument/diagnosticsEnd");
}
BuildResult::Squashed => {
trace!("build - Squashed");
out.notify("rustDocument/diagnosticsEnd");
},
BuildResult::Err => {
trace!("build - Error");
out.notify("rustDocument/diagnosticsEnd");
},
}
}
pub fn on_open(&self, open: DidOpenTextDocumentParams, out: &Output) {
let fname = parse_file_path(&open.text_document.uri).unwrap();
self.vfs.set_file(fname.as_path(), &open.text_document.text);
trace!("on_open: {:?}", fname);
self.build_current_project(BuildPriority::Normal, out);
}
pub fn on_change(&self, change: DidChangeTextDocumentParams, out: &Output) {
let fname = parse_file_path(&change.text_document.uri).unwrap();
let changes: Vec<Change> = change.content_changes.iter().map(move |i| {
if let Some(range) = i.range {
let range = ls_util::range_to_rls(range);
Change::ReplaceText {
span: Span::from_range(range, fname.clone()),
text: i.text.clone()
}
} else {
Change::AddFile {
file: fname.clone(),
text: i.text.clone(),
}
}
}).collect();
self.vfs.on_changes(&changes).unwrap();
trace!("on_change: {:?}", changes);
self.build_current_project(BuildPriority::Normal, out);
}
pub fn on_save(&self, save: DidSaveTextDocumentParams, out: &Output) {
let fname = parse_file_path(&save.text_document.uri).unwrap();
self.vfs.file_saved(&fname).unwrap();
self.build_current_project(BuildPriority::Immediate, out);
}
fn build_current_project(&self, priority: BuildPriority, out: &Output) {
let current_project = {
let current_project = self.current_project.lock().unwrap();
current_project.clone()
};
match current_project {
Some(ref current_project) => self.build(current_project, priority, out),
None => debug!("build_current_project - no project path"),
}
}
pub fn symbols(&self, id: usize, doc: DocumentSymbolParams, out: &Output) {
let t = thread::current();
let analysis = self.analysis.clone();
let rustw_handle = thread::spawn(move || {
let file_name = parse_file_path(&doc.text_document.uri).unwrap();
let symbols = analysis.symbols(&file_name).unwrap_or_else(|_| vec![]);
t.unpark();
symbols.into_iter().map(|s| {
SymbolInformation {
name: s.name,
kind: source_kind_from_def_kind(s.kind),
location: ls_util::rls_to_location(&s.span),
container_name: None // FIXME: more info could be added here
}
}).collect()
});
thread::park_timeout(Duration::from_millis(::COMPILER_TIMEOUT));
let result = rustw_handle.join().unwrap_or_else(|_| vec![]);
out.success(id, ResponseData::SymbolInfo(result));
}
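The handlers above all use the same `unpark`/`park_timeout` dance: a worker thread computes the answer and wakes the requester, which waits at most `::COMPILER_TIMEOUT` milliseconds before giving up. A minimal sketch of the pattern (the 1500 ms timeout, the `run_with_timeout` name, and the dummy computation are illustrative, not part of the RLS):

```rust
use std::thread;
use std::time::Duration;

// Spawn a worker, park with a timeout so a hung worker cannot block us
// forever, then collect the result (with a fallback if the worker panicked).
fn run_with_timeout() -> i32 {
    let requester = thread::current();
    let worker = thread::spawn(move || {
        let result = 2 + 2; // stands in for an analysis query
        requester.unpark(); // wake the requester as soon as we are done
        result
    });
    // Returns early when the worker unparks us, or after the timeout.
    thread::park_timeout(Duration::from_millis(1500));
    worker.join().unwrap_or(0)
}

fn main() {
    assert_eq!(run_with_timeout(), 4);
    println!("worker finished in time");
}
```

Note that `park_timeout` can also return spuriously, which is harmless here: the subsequent `join` still synchronizes on the worker.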
pub fn complete(&self, id: usize, params: TextDocumentPositionParams, out: &Output) {
let result: Vec<CompletionItem> = panic::catch_unwind(move || {
let file_path = &parse_file_path(&params.text_document.uri).unwrap();
let cache = racer::FileCache::new(self.vfs.clone());
let session = racer::Session::new(&cache);
let location = pos_to_racer_location(params.position);
let results = racer::complete_from_file(file_path, location, &session);
results.map(|comp| completion_item_from_racer_match(comp)).collect()
}).unwrap_or_else(|_| vec![]);
out.success(id, ResponseData::CompletionItems(result));
}
pub fn rename(&self, id: usize, params: RenameParams, out: &Output) {
let t = thread::current();
let span = self.convert_pos_to_span(&params.text_document, params.position);
let analysis = self.analysis.clone();
let rustw_handle = thread::spawn(move || {
let result = analysis.find_all_refs(&span, true);
t.unpark();
result
});
thread::park_timeout(Duration::from_millis(::COMPILER_TIMEOUT));
let result = rustw_handle.join().ok().and_then(|t| t.ok()).unwrap_or_else(Vec::new);
let mut edits: HashMap<Url, Vec<TextEdit>> = HashMap::new();
for item in result.iter() {
let loc = ls_util::rls_to_location(item);
edits.entry(loc.uri).or_insert_with(Vec::new).push(TextEdit {
range: loc.range,
new_text: params.new_name.clone(),
});
}
out.success(id, ResponseData::WorkspaceEdit(WorkspaceEdit { changes: edits }));
}
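The interesting step in `rename` is grouping one `TextEdit` per reference into per-file lists with the `HashMap` entry API before wrapping them in a `WorkspaceEdit`. A self-contained sketch of that grouping, with plain strings standing in for `Url` and `TextEdit`:

```rust
use std::collections::HashMap;

// Group (file, new_text) pairs into per-file edit lists, as `rename` does
// with `edits.entry(loc.uri).or_insert_with(Vec::new).push(...)`.
fn group_edits(refs: &[(&str, &str)]) -> HashMap<String, Vec<String>> {
    let mut edits: HashMap<String, Vec<String>> = HashMap::new();
    for &(file, text) in refs {
        edits.entry(file.to_owned())
            .or_insert_with(Vec::new)
            .push(text.to_owned());
    }
    edits
}

fn main() {
    let refs = [("a.rs", "new_name"), ("b.rs", "new_name"), ("a.rs", "new_name")];
    let edits = group_edits(&refs);
    assert_eq!(edits["a.rs"].len(), 2); // two occurrences in a.rs
    assert_eq!(edits["b.rs"].len(), 1);
}
```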
pub fn highlight(&self, id: usize, params: TextDocumentPositionParams, out: &Output) {
let t = thread::current();
let span = self.convert_pos_to_span(&params.text_document, params.position);
let analysis = self.analysis.clone();
let handle = thread::spawn(move || {
let result = analysis.find_all_refs(&span, true);
t.unpark();
result
});
thread::park_timeout(Duration::from_millis(::COMPILER_TIMEOUT));
let result = handle.join().ok().and_then(|t| t.ok()).unwrap_or_else(Vec::new);
let refs: Vec<_> = result.iter().map(|span| DocumentHighlight {
range: ls_util::rls_to_range(span.range),
kind: Some(DocumentHighlightKind::Text),
}).collect();
out.success(id, ResponseData::Highlights(refs));
}
pub fn find_all_refs(&self, id: usize, params: ReferenceParams, out: &Output) {
let t = thread::current();
let span = self.convert_pos_to_span(&params.text_document, params.position);
let analysis = self.analysis.clone();
let handle = thread::spawn(move || {
let result = analysis.find_all_refs(&span, params.context.include_declaration);
t.unpark();
result
});
thread::park_timeout(Duration::from_millis(::COMPILER_TIMEOUT));
let result = handle.join().ok().and_then(|t| t.ok()).unwrap_or_else(Vec::new);
let refs: Vec<_> = result.iter().map(|item| ls_util::rls_to_location(item)).collect();
out.success(id, ResponseData::Locations(refs));
}
pub fn goto_def(&self, id: usize, params: TextDocumentPositionParams, out: &Output) {
// Save-analysis thread.
let t = thread::current();
let span = self.convert_pos_to_span(&params.text_document, params.position);
let analysis = self.analysis.clone();
let vfs = self.vfs.clone();
let compiler_handle = thread::spawn(move || {
let result = analysis.goto_def(&span);
t.unpark();
result
});
// Racer thread.
let racer_handle = thread::spawn(move || {
let file_path = &parse_file_path(&params.text_document.uri).unwrap();
let cache = racer::FileCache::new(vfs);
let session = racer::Session::new(&cache);
let location = pos_to_racer_location(params.position);
racer::find_definition(file_path, location, &session)
.and_then(location_from_racer_match)
});
thread::park_timeout(Duration::from_millis(::COMPILER_TIMEOUT));
let compiler_result = compiler_handle.join();
match compiler_result {
Ok(Ok(r)) => {
let result = vec![ls_util::rls_to_location(&r)];
trace!("goto_def TO: {:?}", result);
out.success(id, ResponseData::Locations(result));
}
_ => {
info!("goto_def - falling back to Racer");
match racer_handle.join() {
Ok(Some(r)) => {
trace!("goto_def: {:?}", r);
out.success(id, ResponseData::Locations(vec![r]));
}
_ => {
debug!("Error in Racer");
out.failure(id, "GotoDef failed to complete successfully");
}
}
}
}
}
pub fn hover(&self, id: usize, params: TextDocumentPositionParams, out: &Output) {
let t = thread::current();
let span = self.convert_pos_to_span(&params.text_document, params.position);
trace!("hover: {:?}", span);
let analysis = self.analysis.clone();
let rustw_handle = thread::spawn(move || {
let ty = analysis.show_type(&span).unwrap_or_else(|_| String::new());
let docs = analysis.docs(&span).unwrap_or_else(|_| String::new());
let doc_url = analysis.doc_url(&span).unwrap_or_else(|_| String::new());
t.unpark();
let mut contents = vec![];
if !docs.is_empty() {
contents.push(MarkedString::from_markdown(docs.into()));
}
if !doc_url.is_empty() {
contents.push(MarkedString::from_markdown(doc_url.into()));
}
if !ty.is_empty() {
contents.push(MarkedString::from_language_code("rust".into(), ty.into()));
}
Hover {
contents: contents,
range: None, // TODO: maybe add?
}
});
thread::park_timeout(Duration::from_millis(::COMPILER_TIMEOUT));
let result = rustw_handle.join();
match result {
Ok(r) => {
out.success(id, ResponseData::HoverSuccess(r));
}
Err(_) => {
out.failure(id, "Hover failed to complete successfully");
}
}
}
pub fn reformat(&self, id: usize, doc: TextDocumentIdentifier, out: &Output) {
trace!("Reformat: {} {:?}", id, doc);
let path = &parse_file_path(&doc.uri).unwrap();
let input = match self.vfs.load_file(path) {
Ok(s) => FmtInput::Text(s),
Err(e) => {
debug!("Reformat failed: {:?}", e);
out.failure(id, "Reformat failed to complete successfully");
return;
}
};
let mut config = config::Config::default();
config.skip_children = true;
config.write_mode = WriteMode::Plain;
let mut buf = Vec::<u8>::new();
match format_input(input, &config, Some(&mut buf)) {
Ok((summary, ..)) => {
// format_input returns Ok even if there are errors, e.g., parsing errors.
if summary.has_no_errors() {
// Note that we don't need to keep the VFS up to date, the client
// echoes back the change to us.
let range = ls_util::range_from_vfs_file(&self.vfs, path);
let text = String::from_utf8(buf).unwrap();
let result = [TextEdit {
range: range,
new_text: text,
}];
out.success(id, ResponseData::TextEdit(result))
} else {
debug!("reformat: format_input failed: has errors, summary = {:?}", summary);
out.failure(id, "Reformat failed to complete successfully")
}
}
Err(e) => {
debug!("Reformat failed: {:?}", e);
out.failure(id, "Reformat failed to complete successfully")
}
}
}
fn convert_pos_to_span(&self, doc: &TextDocumentIdentifier, pos: Position) -> Span {
let fname = parse_file_path(&doc.uri).unwrap();
trace!("convert_pos_to_span: {:?} {:?}", fname, pos);
let pos = ls_util::position_to_rls(pos);
let line = self.vfs.load_line(&fname, pos.row).unwrap();
trace!("line: `{}`", line);
let start_pos = {
let mut col = 0;
for (i, c) in line.chars().enumerate() {
if !(c.is_alphanumeric() || c == '_') {
col = i + 1;
}
if i == pos.col.0 as usize {
break;
}
}
trace!("start: {}", col);
span::Position::new(pos.row, span::Column::new_zero_indexed(col as u32))
};
let end_pos = {
let mut col = pos.col.0 as usize;
for c in line.chars().skip(col) {
if !(c.is_alphanumeric() || c == '_') {
break;
}
col += 1;
}
trace!("end: {}", col);
span::Position::new(pos.row, span::Column::new_zero_indexed(col as u32))
};
Span::from_positions(start_pos,
end_pos,
fname.to_owned())
}
}
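The boundary scan in `convert_pos_to_span` can be hard to follow at a glance: it walks the line once to find the last non-identifier character before the cursor, then walks forward from the cursor until the identifier ends. A standalone sketch of the same scan on a plain `&str` (the `word_span` name is illustrative):

```rust
// Expand a cursor column to the span of the identifier under it, mirroring
// the boundary scan in `convert_pos_to_span`. Returns (start, end) columns,
// 0-indexed, with `end` exclusive.
fn word_span(line: &str, cursor: usize) -> (usize, usize) {
    let is_ident = |c: char| c.is_alphanumeric() || c == '_';
    let mut start = 0;
    for (i, c) in line.chars().enumerate() {
        if !is_ident(c) {
            start = i + 1; // last boundary seen so far
        }
        if i == cursor {
            break;
        }
    }
    let mut end = cursor;
    for c in line.chars().skip(cursor) {
        if !is_ident(c) {
            break;
        }
        end += 1;
    }
    (start, end)
}

fn main() {
    // Cursor on the second `o` of `foo`: the span covers columns 4..7.
    assert_eq!(word_span("let foo = 1;", 5), (4, 7));
}
```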
fn racer_coord(line: span::Row<span::OneIndexed>,
column: span::Column<span::ZeroIndexed>)
-> racer::Coordinate {
racer::Coordinate {
line: line.0 as usize,
column: column.0 as usize,
}
}
fn from_racer_coord(coord: racer::Coordinate) -> (span::Row<span::OneIndexed>,span::Column<span::ZeroIndexed>) {
(span::Row::new_one_indexed(coord.line as u32), span::Column::new_zero_indexed(coord.column as u32))
}
fn pos_to_racer_location(pos: Position) -> racer::Location {
let pos = ls_util::position_to_rls(pos);
racer::Location::Coords(racer_coord(pos.row.one_indexed(), pos.col))
}
fn location_from_racer_match(mtch: racer::Match) -> Option<Location> {
let source_path = &mtch.filepath;
mtch.coords.map(|coord| {
let (row, col) = from_racer_coord(coord);
let loc = span::Location::new(row.zero_indexed(), col, source_path);
ls_util::rls_location_to_location(&loc)
})
}
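These helpers exist purely to bridge indexing conventions: Racer expects 1-indexed lines, while the LSP and the RLS span types use 0-indexed rows (columns are 0-indexed on both sides). A sketch of the round trip with bare integers, standing in for the `Row<OneIndexed>`/`Row<ZeroIndexed>` wrappers from the span crate:

```rust
// Convert between the 0-indexed rows used by the LSP/RLS and the
// 1-indexed lines expected by Racer.
fn to_one_indexed(row: u32) -> u32 { row + 1 }
fn to_zero_indexed(line: u32) -> u32 { line - 1 }

fn main() {
    let lsp_row = 0; // first line of the file
    let racer_line = to_one_indexed(lsp_row);
    assert_eq!(racer_line, 1);
    assert_eq!(to_zero_indexed(racer_line), lsp_row);
}
```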

// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
extern crate getopts;
extern crate rustc;
extern crate rustc_driver;
extern crate rustc_errors as errors;
extern crate rustc_resolve;
extern crate rustc_save_analysis;
extern crate syntax;
use cargo::core::{PackageId, MultiShell, Workspace};
use cargo::ops::{compile_with_exec, Executor, Context, CompileOptions, CompileMode, CompileFilter};
use cargo::util::{Config as CargoConfig, ProcessBuilder, ProcessError, homedir, ConfigValue};
use data::Analysis;
use vfs::Vfs;
use self::rustc::session::Session;
use self::rustc::session::config::{self, Input, ErrorOutputType};
use self::rustc_driver::{RustcDefaultCalls, run_compiler, run, Compilation, CompilerCalls};
use self::rustc_driver::driver::CompileController;
use self::rustc_save_analysis as save;
use self::rustc_save_analysis::CallbackHandler;
use self::syntax::ast;
use self::syntax::codemap::{FileLoader, RealFileLoader};
use config::Config;
use std::collections::HashMap;
use std::env;
use std::ffi::OsString;
use std::fs::{read_dir, remove_file};
use std::io::{self, Write};
use std::mem;
use std::path::{Path, PathBuf};
use std::process::Command;
use std::sync::{Arc, Mutex};
use std::sync::atomic::{AtomicBool, Ordering};
use std::sync::mpsc::{channel, Sender};
use std::thread;
use std::time::Duration;
/// Manages builds.
///
/// The IDE will request builds quickly (possibly on every keystroke), so there
/// is no point running every one. We also avoid running more than one build at once.
/// We cannot cancel builds. It might be worth running builds in parallel or
/// cancelling a started build.
///
/// `BuildPriority::Immediate` builds are started straightaway. Normal builds are
/// started after a timeout. A new build request cancels any pending build requests.
///
/// From the client's point of view, a build request is not guaranteed to cause
/// a build. However, a build is guaranteed to happen and that build will begin
/// after the build request is received (no guarantee on how long after), and
/// that build is guaranteed to have finished before the build request returns.
///
/// There is no way for the client to specify that an individual request will
/// result in a build. However, you can tell from the result - if a build
/// was run, the build result will contain any errors or warnings and an indication
/// of success or failure. If the build was not run, the result indicates that
/// it was squashed.
pub struct BuildQueue {
build_dir: Mutex<Option<PathBuf>>,
cmd_line_args: Arc<Mutex<Vec<String>>>,
cmd_line_envs: Arc<Mutex<HashMap<String, Option<OsString>>>>,
// True if a build is running.
// Note I have been conservative with Ordering when accessing this atomic,
// we might be able to do better.
running: AtomicBool,
// A vec of channels to pending build threads.
pending: Mutex<Vec<Sender<Signal>>>,
vfs: Arc<Vfs>,
config: Mutex<Config>,
}
#[derive(Debug)]
pub enum BuildResult {
// Build was successful; argument is warnings.
Success(Vec<String>, Option<Analysis>),
// Build finished with errors, argument is errors and warnings.
Failure(Vec<String>, Option<Analysis>),
// Build was coalesced with another build.
Squashed,
// There was an error attempting to build.
Err,
}
/// Priority for a build request.
#[derive(Clone, Copy, Debug, Eq, PartialEq)]
pub enum BuildPriority {
/// Run this build as soon as possible (e.g., on save or explicit build request).
Immediate,
/// A regular build request (e.g., on a minor edit).
Normal,
}
// Minimum time to wait before starting a `BuildPriority::Normal` build.
const WAIT_TO_BUILD: u64 = 500;
#[derive(Clone, Copy, Debug, Eq, PartialEq)]
enum Signal {
Build,
Skip,
}
impl BuildQueue {
pub fn new(vfs: Arc<Vfs>) -> BuildQueue {
BuildQueue {
build_dir: Mutex::new(None),
cmd_line_args: Arc::new(Mutex::new(vec![])),
cmd_line_envs: Arc::new(Mutex::new(HashMap::new())),
running: AtomicBool::new(false),
pending: Mutex::new(vec![]),
vfs: vfs,
config: Mutex::new(Config::default()),
}
}
pub fn request_build(&self, build_dir: &Path, priority: BuildPriority) -> BuildResult {
// println!("request_build, {:?} {:?}", build_dir, priority);
// If there is a change in the project directory, then we can forget any
// pending build and start straight with this new build.
{
let mut prev_build_dir = self.build_dir.lock().unwrap();
if prev_build_dir.as_ref().map_or(true, |dir| dir != build_dir) {
*prev_build_dir = Some(build_dir.to_owned());
self.cancel_pending();
let mut config = self.config.lock().unwrap();
*config = Config::from_path(build_dir);
let mut cmd_line_args = self.cmd_line_args.lock().unwrap();
*cmd_line_args = vec![];
}
}
self.cancel_pending();
match priority {
BuildPriority::Immediate => {
// There is a build running, wait for it to finish, then run.
if self.running.load(Ordering::SeqCst) {
let (tx, rx) = channel();
self.pending.lock().unwrap().push(tx);
// Blocks.
// println!("blocked on build");
let signal = rx.recv().unwrap_or(Signal::Build);
if signal == Signal::Skip {
return BuildResult::Squashed;
}
}
}
BuildPriority::Normal => {
let (tx, rx) = channel();
self.pending.lock().unwrap().push(tx);
thread::sleep(Duration::from_millis(WAIT_TO_BUILD));
if self.running.load(Ordering::SeqCst) {
// Blocks
// println!("blocked until wake up");
let signal = rx.recv().unwrap_or(Signal::Build);
if signal == Signal::Skip {
return BuildResult::Squashed;
}
} else if rx.try_recv().unwrap_or(Signal::Build) == Signal::Skip {
// Doesn't block.
return BuildResult::Squashed;
}
}
}
// If another build has started already, we don't need to build
// ourselves (it must have arrived after this request, so we don't add
// to the pending list). But we do need to wait for that build to
// finish.
if self.running.swap(true, Ordering::SeqCst) {
let mut wait = 100;
while self.running.load(Ordering::SeqCst) && wait < 50000 {
// println!("loop of death");
thread::sleep(Duration::from_millis(wait));
wait *= 2;
}
return BuildResult::Squashed;
}
let result = self.build();
self.running.store(false, Ordering::SeqCst);
// If there is a pending build, run it now.
let mut pending = self.pending.lock().unwrap();
let pending = mem::replace(&mut *pending, vec![]);
if !pending.is_empty() {
// Kick off one build, then skip the rest.
let mut pending = pending.iter();
while let Some(next) = pending.next() {
if next.send(Signal::Build).is_ok() {
break;
}
}
for t in pending {
let _ = t.send(Signal::Skip);
}
}
result
}
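The end of `request_build` implements the squashing contract described in the `BuildQueue` docs: after a build finishes, exactly one pending requester is woken with `Signal::Build` and the rest are told to skip. A minimal, self-contained sketch of that draining step:

```rust
use std::sync::mpsc::{channel, Receiver, Sender};

#[derive(Clone, Copy, Debug, PartialEq)]
enum Signal { Build, Skip }

// Wake exactly one pending requester with `Build` and squash the rest,
// as `request_build` does after a build completes.
fn drain_pending(pending: Vec<Sender<Signal>>) {
    let mut iter = pending.iter();
    while let Some(next) = iter.next() {
        if next.send(Signal::Build).is_ok() {
            break; // first live waiter gets to run the next build
        }
    }
    for t in iter {
        let _ = t.send(Signal::Skip); // everyone else is squashed
    }
}

fn main() {
    let (tx1, rx1): (Sender<Signal>, Receiver<Signal>) = channel();
    let (tx2, rx2) = channel();
    drain_pending(vec![tx1, tx2]);
    assert_eq!(rx1.recv().unwrap(), Signal::Build);
    assert_eq!(rx2.recv().unwrap(), Signal::Skip);
}
```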
// Cancels all pending builds without running any of them.
fn cancel_pending(&self) {
let mut pending = self.pending.lock().unwrap();
let pending = mem::replace(&mut *pending, vec![]);
for t in pending {
let _ = t.send(Signal::Skip);
}
}
// Build the project.
fn build(&self) -> BuildResult {
// When we change build directory (presumably because the IDE is
// changing project), we must do a cargo build of the whole project.
// Otherwise we just use rustc directly.
//
// The 'full cargo build' is a `cargo check` customised and run
// in-process. Cargo will shell out to call rustc (this means the
// compiler available at runtime must match the compiler linked to
// the RLS). All but the last crate are built as normal; we intercept
// the call to the last crate and do our own rustc build. We cache the
// command line args and environment so we can avoid running Cargo in
// the future.
//
// Our 'short' rustc build runs rustc directly and in-process (we must
// do this so we can load changed code from the VFS, rather than from
// disk). We get the data we need by building with `-Zsave-analysis`.
let needs_to_run_cargo = {
let cmd_line_args = self.cmd_line_args.lock().unwrap();
cmd_line_args.is_empty()
};
let build_dir = &self.build_dir.lock().unwrap();
let build_dir = build_dir.as_ref().unwrap();
if needs_to_run_cargo {
if let BuildResult::Err = self.cargo(build_dir.clone()) {
return BuildResult::Err;
}
}
let cmd_line_args = self.cmd_line_args.lock().unwrap();
assert!(!cmd_line_args.is_empty());
let cmd_line_envs = self.cmd_line_envs.lock().unwrap();
self.rustc(&*cmd_line_args, &*cmd_line_envs, build_dir)
}
// Runs an in-process instance of Cargo.
fn cargo(&self, build_dir: PathBuf) -> BuildResult {
struct RlsExecutor {
cmd_line_args: Arc<Mutex<Vec<String>>>,
cmd_line_envs: Arc<Mutex<HashMap<String, Option<OsString>>>>,
cur_package_id: Mutex<Option<PackageId>>,
config: Config,
}
impl RlsExecutor {
fn new(cmd_line_args: Arc<Mutex<Vec<String>>>,
cmd_line_envs: Arc<Mutex<HashMap<String, Option<OsString>>>>,
config: Config) -> RlsExecutor {
RlsExecutor {
cmd_line_args: cmd_line_args,
cmd_line_envs: cmd_line_envs,
cur_package_id: Mutex::new(None),
config: config,
}
}
}
impl Executor for RlsExecutor {
fn init(&self, cx: &Context) {
let mut cur_package_id = self.cur_package_id.lock().unwrap();
*cur_package_id = Some(cx.ws
.current_opt()
.expect("No current package in Cargo")
.package_id()
.clone());
}
fn exec(&self, cmd: ProcessBuilder, id: &PackageId) -> Result<(), ProcessError> {
// Delete any stale data. We try and remove any json files with
// the same crate name as Cargo would emit. This includes files
// with the same crate name but different hashes, e.g., those
// made with a different compiler.
let args = cmd.get_args();
let crate_name = parse_arg(args, "--crate-name").expect("no crate-name in rustc command line");
let out_dir = parse_arg(args, "--out-dir").expect("no out-dir in rustc command line");
let analysis_dir = Path::new(&out_dir).join("save-analysis");
if let Ok(dir_contents) = read_dir(&analysis_dir) {
for entry in dir_contents {
let entry = entry.expect("unexpected error reading save-analysis directory");
let name = entry.file_name();
let name = name.to_str().unwrap();
if name.starts_with(&crate_name) && name.ends_with(".json") {
debug!("removing: `{:?}`", name);
remove_file(entry.path()).expect("could not remove file");
}
}
}
let is_primary_crate = {
let cur_package_id = self.cur_package_id.lock().unwrap();
id == cur_package_id.as_ref().expect("Executor has not been initialised")
};
if is_primary_crate {
let mut args: Vec<_> =
cmd.get_args().iter().map(|a| a.clone().into_string().unwrap()).collect();
// We end up taking this code path for build scripts, we don't
// want to do that, so we check here if the crate is actually
// being linked (cf. emit=metadata) and if so just call the
// usual rustc. This is clearly a bit fragile (if the emit
// string changes, we get screwed).
if args.contains(&"--emit=dep-info,link".to_owned()) {
trace!("rustc not intercepted (link)");
return cmd.exec();
}
trace!("intercepted rustc, args: {:?}", args);
// FIXME here and below should check $RUSTC before using rustc.
{
// Cargo is going to expect to get dep-info for this crate, so we shell out
// to rustc to get that. This is not really ideal, because we are going to
// compute this info anyway when we run rustc ourselves, but we don't do
// that before we return to Cargo.
// FIXME Don't do this. Instead either persuade Cargo that it doesn't need
// this info at all, or start our build here rather than on another thread
// so the dep-info is ready by the time we return from this callback.
let mut cmd_dep_info = Command::new("rustc");
for a in &args {
if a.starts_with("--emit") {
cmd_dep_info.arg("--emit=dep-info");
} else {
cmd_dep_info.arg(a);
}
}
if let Some(cwd) = cmd.get_cwd() {
cmd_dep_info.current_dir(cwd);
}
cmd_dep_info.status().expect("Couldn't execute rustc");
}
args.insert(0, "rustc".to_owned());
if self.config.cfg_test {
args.push("--test".to_owned());
}
if self.config.sysroot.is_empty() {
args.push("--sysroot".to_owned());
let home = option_env!("RUSTUP_HOME").or(option_env!("MULTIRUST_HOME"));
let toolchain = option_env!("RUSTUP_TOOLCHAIN").or(option_env!("MULTIRUST_TOOLCHAIN"));
let sys_root = if let (Some(home), Some(toolchain)) = (home, toolchain) {
format!("{}/toolchains/{}", home, toolchain)
} else {
option_env!("SYSROOT")
.map(|s| s.to_owned())
.or_else(|| Command::new("rustc")
.arg("--print")
.arg("sysroot")
.output()
.ok()
.and_then(|out| String::from_utf8(out.stdout).ok())
.map(|s| s.trim().to_owned()))
.expect("need to specify SYSROOT env var, \
or use rustup or multirust")
};
args.push(sys_root.to_owned());
}
let envs = cmd.get_envs();
trace!("envs: {:?}", envs);
{
let mut queue_args = self.cmd_line_args.lock().unwrap();
*queue_args = args.clone();
}
{
let mut queue_envs = self.cmd_line_envs.lock().unwrap();
*queue_envs = envs.clone();
}
Ok(())
} else {
trace!("rustc not intercepted");
cmd.exec()
}
}
}
let rls_config = {
let rls_config = self.config.lock().unwrap();
rls_config.clone()
};
trace!("cargo - `{:?}`", build_dir);
let exec = RlsExecutor::new(self.cmd_line_args.clone(),
self.cmd_line_envs.clone(),
rls_config.clone());
let out = Arc::new(Mutex::new(vec![]));
let err = Arc::new(Mutex::new(vec![]));
let out_clone = out.clone();
let err_clone = err.clone();
// Cargo may or may not spawn threads to run the various builds. Since
// we may be in a separate thread, we need to block and wait on our thread.
// However, if Cargo doesn't run a separate thread, then we'll just wait
// forever. Therefore, we spawn an extra thread here to be safe.
let handle = thread::spawn(move || {
let hardcoded = "-Zunstable-options -Zsave-analysis --error-format=json \
-Zcontinue-parse-after-error";
if rls_config.sysroot.is_empty() {
env::set_var("RUSTFLAGS", hardcoded);
} else {
env::set_var("RUSTFLAGS", &format!("--sysroot {} {}", rls_config.sysroot, hardcoded));
}
let shell = MultiShell::from_write(Box::new(BufWriter(out.clone())),
Box::new(BufWriter(err.clone())));
let config = make_cargo_config(&build_dir, shell);
let mut manifest_path = build_dir.clone();
manifest_path.push("Cargo.toml");
trace!("manifest_path: {:?}", manifest_path);
let ws = Workspace::new(&manifest_path, &config).expect("could not create cargo workspace");
let mut opts = CompileOptions::default(&config, CompileMode::Check);
if rls_config.build_lib {
opts.filter = CompileFilter::new(true, &[], &[], &[], &[]);
}
compile_with_exec(&ws, &opts, Arc::new(exec)).expect("could not run cargo");
});
match handle.join() {
Ok(_) => BuildResult::Success(vec![], None),
Err(_) => {
info!("cargo stdout {}", String::from_utf8(out_clone.lock().unwrap().to_owned()).unwrap());
info!("cargo stderr {}", String::from_utf8(err_clone.lock().unwrap().to_owned()).unwrap());
BuildResult::Err
}
}
}
// Runs a single instance of rustc. Runs in-process.
fn rustc(&self, args: &[String], envs: &HashMap<String, Option<OsString>>, build_dir: &Path) -> BuildResult {
trace!("rustc - args: `{:?}`, envs: {:?}, build dir: {:?}", args, envs, build_dir);
let changed = self.vfs.get_cached_files();
let _restore_env = Environment::push(envs);
let buf = Arc::new(Mutex::new(vec![]));
let err_buf = buf.clone();
let args = args.to_owned();
let analysis = Arc::new(Mutex::new(None));
let mut controller = RlsRustcCalls::new(analysis.clone());
let exit_code = ::std::panic::catch_unwind(|| {
run(move || {
// Replace stderr so we catch most errors.
run_compiler(&args,
&mut controller,
Some(Box::new(ReplacedFileLoader::new(changed))),
Some(Box::new(BufWriter(buf))))
})
});
// FIXME(#25) given that we are running the compiler directly, there is no need
// to serialise either the error messages or save-analysis - we should pass
// them both in memory, without using save-analysis.
let stderr_json_msg = convert_message_to_json_strings(Arc::try_unwrap(err_buf)
.unwrap()
.into_inner()
.unwrap());
return match exit_code {
Ok(0) => BuildResult::Success(stderr_json_msg, analysis.lock().unwrap().clone()),
_ => BuildResult::Failure(stderr_json_msg, analysis.lock().unwrap().clone()),
};
// Our compiler controller. We mostly delegate to the default rustc
// controller, but use our own callback for save-analysis.
#[derive(Clone)]
struct RlsRustcCalls {
default_calls: RustcDefaultCalls,
analysis: Arc<Mutex<Option<Analysis>>>,
}
impl RlsRustcCalls {
fn new(analysis: Arc<Mutex<Option<Analysis>>>) -> RlsRustcCalls {
RlsRustcCalls {
default_calls: RustcDefaultCalls,
analysis: analysis,
}
}
}
impl<'a> CompilerCalls<'a> for RlsRustcCalls {
fn early_callback(&mut self,
matches: &getopts::Matches,
sopts: &config::Options,
cfg: &ast::CrateConfig,
descriptions: &errors::registry::Registry,
output: ErrorOutputType)
-> Compilation {
self.default_calls.early_callback(matches, sopts, cfg, descriptions, output)
}
fn no_input(&mut self,
matches: &getopts::Matches,
sopts: &config::Options,
cfg: &ast::CrateConfig,
odir: &Option<PathBuf>,
ofile: &Option<PathBuf>,
descriptions: &errors::registry::Registry)
-> Option<(Input, Option<PathBuf>)> {
self.default_calls.no_input(matches, sopts, cfg, odir, ofile, descriptions)
}
fn late_callback(&mut self,
matches: &getopts::Matches,
sess: &Session,
input: &Input,
odir: &Option<PathBuf>,
ofile: &Option<PathBuf>)
-> Compilation {
self.default_calls.late_callback(matches, sess, input, odir, ofile)
}
fn build_controller(&mut self,
sess: &Session,
matches: &getopts::Matches)
-> CompileController<'a> {
let mut result = self.default_calls.build_controller(sess, matches);
let analysis = self.analysis.clone();
result.after_analysis.callback = Box::new(move |state| {
save::process_crate(state.tcx.unwrap(),
state.expanded_crate.unwrap(),
state.analysis.unwrap(),
state.crate_name.unwrap(),
CallbackHandler { callback: &mut |a| {
let mut analysis = analysis.lock().unwrap();
*analysis = Some(unsafe { ::std::mem::transmute(a.clone()) } );
} });
});
result.after_analysis.run_callback_on_error = true;
result.make_glob_map = rustc_resolve::MakeGlobMap::Yes;
result
}
}
}
}
fn make_cargo_config(build_dir: &Path, shell: MultiShell) -> CargoConfig {
let config = CargoConfig::new(shell,
// This is Cargo's cwd. We are using the actual cwd, but perhaps
// we should use build_dir or something else?
env::current_dir().unwrap(),
homedir(&build_dir).unwrap());
// Cargo is expecting the config to come from a config file and keeps
// track of the path to that file. We'll make one up, it shouldn't be
// used for much. Cargo does use it for finding a root path. Since
// we pass an absolute path for the build directory, that doesn't
// matter too much. However, Cargo still takes the grandparent of this
// path, so we need to have at least two path elements.
let config_path = build_dir.join("config").join("rls-config.toml");
let mut config_value_map = config.load_values().unwrap();
{
let build_value = config_value_map.entry("build".to_owned()).or_insert(ConfigValue::Table(HashMap::new(), config_path.clone()));
let target_dir = build_dir.join("target").join("rls").to_str().unwrap().to_owned();
let td_value = ConfigValue::String(target_dir, config_path);
if let &mut ConfigValue::Table(ref mut build_table, _) = build_value {
build_table.insert("target-dir".to_owned(), td_value);
} else {
unreachable!();
}
}
config.set_values(config_value_map).unwrap();
config
}
fn parse_arg(args: &[OsString], arg: &str) -> Option<String> {
for (i, a) in args.iter().enumerate() {
if a == arg {
return Some(args[i + 1].clone().into_string().unwrap());
}
}
None
}
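`parse_arg` pulls the value that follows a flag out of a rustc command line (e.g. the value after `--crate-name`). A slightly hardened sketch that uses safe indexing instead of unwrapping, plus a usage example:

```rust
use std::ffi::OsString;

// Return the argument following `arg`, if any, as `parse_arg` does for
// `--crate-name` and `--out-dir` in the intercepted rustc command line.
fn parse_arg(args: &[OsString], arg: &str) -> Option<String> {
    for (i, a) in args.iter().enumerate() {
        if a == arg {
            return args.get(i + 1).and_then(|v| v.clone().into_string().ok());
        }
    }
    None
}

fn main() {
    let args: Vec<OsString> = ["--crate-name", "foo", "--out-dir", "target"]
        .iter().map(OsString::from).collect();
    assert_eq!(parse_arg(&args, "--crate-name"), Some("foo".to_owned()));
    assert_eq!(parse_arg(&args, "--edition"), None);
}
```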
// A threadsafe buffer for writing.
struct BufWriter(Arc<Mutex<Vec<u8>>>);
impl Write for BufWriter {
fn write(&mut self, buf: &[u8]) -> io::Result<usize> {
self.0.lock().unwrap().write(buf)
}
fn flush(&mut self) -> io::Result<()> {
self.0.lock().unwrap().flush()
}
}
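`BufWriter` lets several owners (here, Cargo's shell and the build thread) append to one byte buffer through a mutex. A self-contained demonstration of the same type capturing output written from another thread:

```rust
use std::io::Write;
use std::sync::{Arc, Mutex};
use std::thread;

// A thread-safe `Write` target: every clone of the Arc appends to the same
// underlying Vec<u8>, which is how the RLS captures compiler/Cargo output.
struct BufWriter(Arc<Mutex<Vec<u8>>>);

impl Write for BufWriter {
    fn write(&mut self, buf: &[u8]) -> std::io::Result<usize> {
        self.0.lock().unwrap().write(buf)
    }
    fn flush(&mut self) -> std::io::Result<()> {
        self.0.lock().unwrap().flush()
    }
}

fn main() {
    let shared = Arc::new(Mutex::new(Vec::new()));
    let mut writer = BufWriter(shared.clone());
    let handle = thread::spawn(move || {
        writeln!(writer, "from the worker").unwrap();
    });
    handle.join().unwrap();
    let captured = String::from_utf8(shared.lock().unwrap().clone()).unwrap();
    assert_eq!(captured, "from the worker\n");
}
```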
// An RAII helper to set and reset the current working directory and env vars.
struct Environment {
old_vars: HashMap<String, Option<OsString>>,
}
impl Environment {
fn push(envs: &HashMap<String, Option<OsString>>) -> Environment {
let mut result = Environment {
old_vars: HashMap::new(),
};
for (k, v) in envs {
result.old_vars.insert(k.to_owned(), env::var_os(k));
match *v {
Some(ref v) => env::set_var(k, v),
None => env::remove_var(k),
}
}
result
}
}
impl Drop for Environment {
fn drop(&mut self) {
for (k, v) in &self.old_vars {
match *v {
Some(ref v) => env::set_var(k, v),
None => env::remove_var(k),
}
}
}
}
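The `Environment` guard is plain RAII: `push` records the old values while applying the new ones, and `Drop` restores them even on early return or panic. A self-contained demonstration of the same type (the `RLS_DEMO` variable name is made up for the example):

```rust
use std::collections::HashMap;
use std::env;
use std::ffi::OsString;

// RAII guard that sets env vars and restores the previous values on drop.
struct Environment {
    old_vars: HashMap<String, Option<OsString>>,
}

impl Environment {
    fn push(envs: &HashMap<String, Option<OsString>>) -> Environment {
        let mut result = Environment { old_vars: HashMap::new() };
        for (k, v) in envs {
            result.old_vars.insert(k.to_owned(), env::var_os(k));
            match *v {
                Some(ref v) => env::set_var(k, v),
                None => env::remove_var(k),
            }
        }
        result
    }
}

impl Drop for Environment {
    fn drop(&mut self) {
        for (k, v) in &self.old_vars {
            match *v {
                Some(ref v) => env::set_var(k, v),
                None => env::remove_var(k),
            }
        }
    }
}

fn main() {
    env::set_var("RLS_DEMO", "before");
    {
        let mut envs = HashMap::new();
        envs.insert("RLS_DEMO".to_owned(), Some(OsString::from("during")));
        let _guard = Environment::push(&envs);
        assert_eq!(env::var("RLS_DEMO").unwrap(), "during");
    } // _guard dropped here, restoring the old value
    assert_eq!(env::var("RLS_DEMO").unwrap(), "before");
}
```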
fn convert_message_to_json_strings(input: Vec<u8>) -> Vec<String> {
let mut output = vec![];
// FIXME: this is *so gross*. Trying to work around Cargo not supporting JSON messages.
let it = input.into_iter();
let mut read_iter = it.skip_while(|&x| x != b'{');
let mut _msg = String::new();
loop {
match read_iter.next() {
Some(b'\n') => {
output.push(_msg);
_msg = String::new();
while let Some(res) = read_iter.next() {
if res == b'{' {
_msg.push('{');
break;
}
}
}
Some(x) => {
_msg.push(x as char);
}
None => {
break;
}
}
}
output
}
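Behind the byte-by-byte loop, `convert_message_to_json_strings` does something simple: for each line of captured stderr, it keeps the text from the first `{` to the end of the line and drops everything else. A simpler, line-oriented sketch of the same idea:

```rust
// A simpler equivalent of `convert_message_to_json_strings`: for each line
// of captured compiler stderr, keep the part from the first `{` onward and
// drop lines that contain no JSON at all.
fn extract_json_lines(input: &str) -> Vec<String> {
    input.lines()
        .filter_map(|line| line.find('{').map(|i| line[i..].to_owned()))
        .collect()
}

fn main() {
    let stderr = "warning: noise{\"message\":\"unused\"}\n{\"message\":\"ok\"}\n";
    let msgs = extract_json_lines(stderr);
    assert_eq!(msgs, vec!["{\"message\":\"unused\"}", "{\"message\":\"ok\"}"]);
}
```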
/// Tries to read a file from a list of replacements, and if the file is not
/// there, then reads it from disk, by delegating to `RealFileLoader`.
pub struct ReplacedFileLoader {
replacements: HashMap<PathBuf, String>,
real_file_loader: RealFileLoader,
}
impl ReplacedFileLoader {
pub fn new(replacements: HashMap<PathBuf, String>) -> ReplacedFileLoader {
ReplacedFileLoader {
replacements: replacements,
real_file_loader: RealFileLoader,
}
}
}
impl FileLoader for ReplacedFileLoader {
fn file_exists(&self, path: &Path) -> bool {
self.real_file_loader.file_exists(path)
}
fn abs_path(&self, path: &Path) -> Option<PathBuf> {
self.real_file_loader.abs_path(path)
}
fn read_file(&self, path: &Path) -> io::Result<String> {
if let Some(abs_path) = self.abs_path(path) {
if self.replacements.contains_key(&abs_path) {
return Ok(self.replacements[&abs_path].clone());
}
}
self.real_file_loader.read_file(path)
}
}
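The essence of `ReplacedFileLoader` is a two-level lookup: consult the in-memory replacement map first (unsaved editor buffers), then fall back to disk. A free-function sketch of that read path, with `fs::read_to_string` standing in for `RealFileLoader`:

```rust
use std::collections::HashMap;
use std::fs;
use std::io;
use std::path::{Path, PathBuf};

// Read a file from an in-memory replacement map first, falling back to the
// real filesystem, as `ReplacedFileLoader::read_file` does.
fn read_file(replacements: &HashMap<PathBuf, String>, path: &Path) -> io::Result<String> {
    if let Some(contents) = replacements.get(path) {
        return Ok(contents.clone());
    }
    fs::read_to_string(path)
}

fn main() {
    let mut replacements = HashMap::new();
    replacements.insert(PathBuf::from("/virtual/main.rs"), "fn main() {}".to_owned());
    let text = read_file(&replacements, Path::new("/virtual/main.rs")).unwrap();
    assert_eq!(text, "fn main() {}");
}
```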

// Copyright 2017 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
use toml;
use std::fs::File;
use std::io::Read;
use std::path::Path;
macro_rules! impl_enum_decodable {
( $e:ident, $( $x:ident ),* ) => {
impl ::serde::Deserialize for $e {
fn decode<D: ::serde::Deserializer>(d: &mut D) -> Result<Self, D::Error> {
use std::ascii::AsciiExt;
let s = try!(d.read_str());
$(
if stringify!($x).eq_ignore_ascii_case(&s) {
return Ok($e::$x);
}
)*
Err(d.error("Bad variant"))
}
}
impl ::std::str::FromStr for $e {
type Err = &'static str;
fn from_str(s: &str) -> Result<Self, Self::Err> {
use std::ascii::AsciiExt;
$(
if stringify!($x).eq_ignore_ascii_case(s) {
return Ok($e::$x);
}
)*
Err("Bad variant")
}
}
impl ::config::ConfigType for $e {
fn get_variant_names() -> String {
let mut variants = Vec::new();
$(
variants.push(stringify!($x));
)*
format!("[{}]", variants.join("|"))
}
}
};
}
macro_rules! configuration_option_enum {
($e:ident: $( $x:ident ),+ $(,)*) => {
#[derive(Copy, Clone, Eq, PartialEq, Debug)]
pub enum $e {
$( $x ),+
}
impl_enum_decodable!($e, $( $x ),+);
}
}
// This trait and the following impl blocks are there so that we can use
// UFCS inside the get_docs() function on types for configs.
pub trait ConfigType {
fn get_variant_names() -> String;
}
impl ConfigType for bool {
fn get_variant_names() -> String {
String::from("<boolean>")
}
}
impl ConfigType for usize {
fn get_variant_names() -> String {
String::from("<unsigned integer>")
}
}
impl ConfigType for String {
fn get_variant_names() -> String {
String::from("<string>")
}
}
macro_rules! create_config {
($($i:ident: $ty:ty, $def:expr, $unstable:expr, $( $dstring:expr ),+ );+ $(;)*) => (
#[derive(RustcDecodable, Clone)]
pub struct Config {
$(pub $i: $ty),+
}
// Just like the Config struct, but with each property wrapped
// as Option<T>. This is used to parse a rustfmt.toml that doesn't
// specify all properties of `Config`.
// We first parse into `ParsedConfig`, then create a default `Config`
// and overwrite its properties with the corresponding values from `ParsedConfig`.
#[derive(RustcDecodable, Clone, Deserialize)]
pub struct ParsedConfig {
$(pub $i: Option<$ty>),+
}
impl Config {
fn fill_from_parsed_config(mut self, parsed: ParsedConfig) -> Config {
$(
if let Some(val) = parsed.$i {
self.$i = val;
// TODO error out if unstable
}
)+
self
}
pub fn from_toml(toml: &str) -> Config {
let parsed_config: ParsedConfig = match toml::from_str(toml) {
Ok(decoded) => decoded,
Err(e) => {
debug!("Decoding config file failed.");
debug!("Error: {}", e);
debug!("Config:\n{}", toml);
let parsed: toml::Value = toml.parse().expect("Could not parse TOML");
debug!("\n\nParsed:\n{:?}", parsed);
panic!();
}
};
Config::default().fill_from_parsed_config(parsed_config)
}
#[allow(dead_code)]
pub fn print_docs() {
use std::cmp;
let max = 0;
$( let max = cmp::max(max, stringify!($i).len()+1); )+
let mut space_str = String::with_capacity(max);
for _ in 0..max {
space_str.push(' ');
}
println!("Configuration Options:");
$(
if !$unstable {
let name_raw = stringify!($i);
let mut name_out = String::with_capacity(max);
for _ in name_raw.len()..max-1 {
name_out.push(' ')
}
name_out.push_str(name_raw);
name_out.push(' ');
println!("{}{} Default: {:?}",
name_out,
<$ty>::get_variant_names(),
$def);
$(
println!("{}{}", space_str, $dstring);
)+
println!("");
}
)+
}
/// Attempts to read a config from rls.toml in the given path; failing that, falls back to defaults.
pub fn from_path(path: &Path) -> Config {
let config_path = path.to_owned().join("rls.toml");
let config_file = File::open(config_path);
let mut toml = String::new();
if let Ok(mut f) = config_file {
f.read_to_string(&mut toml).unwrap();
}
Config::from_toml(&toml)
}
}
// Template for the default configuration
impl Default for Config {
fn default() -> Config {
Config {
$(
$i: $def,
)+
}
}
}
)
}
create_config! {
sysroot: String, String::new(), false, "--sysroot";
build_lib: bool, false, false, "cargo check --lib";
cfg_test: bool, true, false, "build cfg(test) code";
unstable_features: bool, false, false, "enable unstable features";
}

View File

@ -1,179 +0,0 @@
// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
use std::fmt::Debug;
use std::path::PathBuf;
use std::error::Error;
use analysis::raw;
use hyper::Url;
use serde::Serialize;
use span;
use racer;
pub use ls_types::*;
macro_rules! impl_file_name {
($ty_name: ty) => {
impl $ty_name {
pub fn file_name(&self) -> PathBuf {
uri_string_to_file_name(&self.uri)
}
}
}
}
pub fn parse_file_path(uri: &Url) -> Result<PathBuf, Box<Error>> {
if uri.scheme() != "file" {
Err("URI scheme is not `file`".into())
} else {
uri.to_file_path().map_err(|_err| "Invalid file path in URI".into())
}
}
pub mod ls_util {
use super::*;
use Span;
use std::path::Path;
use hyper::Url;
use vfs::Vfs;
pub fn range_to_rls(r: Range) -> span::Range<span::ZeroIndexed> {
span::Range::from_positions(position_to_rls(r.start), position_to_rls(r.end))
}
pub fn position_to_rls(p: Position) -> span::Position<span::ZeroIndexed> {
span::Position::new(span::Row::new_zero_indexed(p.line as u32),
span::Column::new_zero_indexed(p.character as u32))
}
// An RLS span has the same info as an LSP Location
pub fn rls_to_location(span: &Span) -> Location {
Location {
uri: Url::from_file_path(&span.file).unwrap(),
range: rls_to_range(span.range),
}
}
pub fn rls_location_to_location(l: &span::Location<span::ZeroIndexed>) -> Location {
Location {
uri: Url::from_file_path(&l.file).unwrap(),
range: rls_to_range(span::Range::from_positions(l.position, l.position)),
}
}
pub fn rls_to_range(r: span::Range<span::ZeroIndexed>) -> Range {
Range {
start: rls_to_position(r.start()),
end: rls_to_position(r.end()),
}
}
pub fn rls_to_position(p: span::Position<span::ZeroIndexed>) -> Position {
Position {
line: p.row.0 as u64,
character: p.col.0 as u64,
}
}
/// Creates a `Range` spanning the whole file as currently known by `Vfs`
///
/// Panics if `Vfs` cannot load the file.
pub fn range_from_vfs_file(vfs: &Vfs, fname: &Path) -> Range {
let content = vfs.load_file(fname).unwrap();
if content.is_empty() {
Range {start: Position::new(0, 0), end: Position::new(0, 0)}
} else {
// range is zero-based and the end position is exclusive
Range {
start: Position::new(0, 0),
end: Position::new(content.lines().count() as u64 - 1,
content.lines().last().expect("String is not empty.").chars().count() as u64)
}
}
}
}
pub fn source_kind_from_def_kind(k: raw::DefKind) -> SymbolKind {
match k {
raw::DefKind::Enum => SymbolKind::Enum,
raw::DefKind::Tuple => SymbolKind::Array,
raw::DefKind::Struct => SymbolKind::Class,
raw::DefKind::Union => SymbolKind::Class,
raw::DefKind::Trait => SymbolKind::Interface,
raw::DefKind::Function |
raw::DefKind::Method |
raw::DefKind::Macro => SymbolKind::Function,
raw::DefKind::Mod => SymbolKind::Module,
raw::DefKind::Type => SymbolKind::Interface,
raw::DefKind::Local |
raw::DefKind::Static |
raw::DefKind::Const |
raw::DefKind::Field => SymbolKind::Variable,
}
}
pub fn completion_kind_from_match_type(m: racer::MatchType) -> CompletionItemKind {
match m {
racer::MatchType::Crate |
racer::MatchType::Module => CompletionItemKind::Module,
racer::MatchType::Struct => CompletionItemKind::Class,
racer::MatchType::Enum => CompletionItemKind::Enum,
racer::MatchType::StructField |
racer::MatchType::EnumVariant => CompletionItemKind::Field,
racer::MatchType::Macro |
racer::MatchType::Function |
racer::MatchType::FnArg |
racer::MatchType::Impl => CompletionItemKind::Function,
racer::MatchType::Type |
racer::MatchType::Trait |
racer::MatchType::TraitImpl => CompletionItemKind::Interface,
racer::MatchType::Let |
racer::MatchType::IfLet |
racer::MatchType::WhileLet |
racer::MatchType::For |
racer::MatchType::MatchArm |
racer::MatchType::Const |
racer::MatchType::Static => CompletionItemKind::Variable,
racer::MatchType::Builtin => CompletionItemKind::Keyword,
}
}
pub fn completion_item_from_racer_match(m: racer::Match) -> CompletionItem {
let mut item = CompletionItem::new_simple(m.matchstr.clone(), m.contextstr.clone());
item.kind = Some(completion_kind_from_match_type(m.mtype));
item
}
/* ----------------- JSON-RPC protocol types ----------------- */
/// An event-like (no response needed) notification message.
#[derive(Debug, Serialize)]
pub struct NotificationMessage<T>
where T: Debug + Serialize
{
jsonrpc: &'static str,
pub method: String,
pub params: T,
}
impl<T> NotificationMessage<T> where T: Debug + Serialize {
pub fn new(method: String, params: T) -> Self {
NotificationMessage {
jsonrpc: "2.0",
method: method,
params: params
}
}
}

View File

@ -1,60 +0,0 @@
// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
#![feature(rustc_private)]
extern crate cargo;
#[macro_use]
extern crate derive_new;
extern crate env_logger;
extern crate hyper;
extern crate languageserver_types as ls_types;
#[macro_use]
extern crate log;
extern crate racer;
extern crate rls_analysis as analysis;
extern crate rls_vfs as vfs;
extern crate rls_span as span;
extern crate rls_data as data;
extern crate rustc_serialize;
extern crate rustfmt;
extern crate serde;
#[macro_use]
extern crate serde_derive;
extern crate serde_json;
extern crate toml;
extern crate url;
extern crate url_serde;
use std::sync::Arc;
mod build;
mod server;
mod actions;
mod lsp_data;
mod config;
#[cfg(test)]
mod test;
// Timeout = 1.5s (totally arbitrary).
const COMPILER_TIMEOUT: u64 = 1500;
type Span = span::Span<span::ZeroIndexed>;
pub fn main() {
env_logger::init().unwrap();
let analysis = Arc::new(analysis::AnalysisHost::new(analysis::Target::Debug));
let vfs = Arc::new(vfs::Vfs::new());
let build_queue = Arc::new(build::BuildQueue::new(vfs.clone()));
server::run_server(analysis, vfs, build_queue);
}

View File

@ -1,515 +0,0 @@
// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
use analysis::AnalysisHost;
use vfs::Vfs;
use serde_json;
use build::*;
use lsp_data::*;
use actions::ActionHandler;
use std::fmt;
use std::io::{self, Read, Write, ErrorKind};
use std::sync::Arc;
use std::sync::atomic::{AtomicBool, Ordering};
use std::thread;
use std::path::PathBuf;
use config::Config;
#[derive(Debug, Serialize)]
pub struct Ack {}
#[derive(Debug, new)]
struct ParseError {
kind: ErrorKind,
message: &'static str,
id: Option<usize>,
}
#[derive(Debug)]
enum ServerMessage {
Request(Request),
Notification(Notification)
}
#[derive(Debug)]
struct Request {
id: usize,
method: Method
}
#[derive(Debug)]
enum Notification {
Exit,
CancelRequest(CancelParams),
Change(DidChangeTextDocumentParams),
Open(DidOpenTextDocumentParams),
Save(DidSaveTextDocumentParams),
}
/// Creates a public enum whose variants all contain a single serializable payload,
/// with an automatic JSON `to_string` implementation.
macro_rules! serializable_enum {
($enum_name:ident, $($variant_name:ident($variant_type:ty)),*) => (
pub enum $enum_name {
$(
$variant_name($variant_type),
)*
}
impl fmt::Display for $enum_name {
fn fmt(&self, f: &mut fmt::Formatter) -> Result<(), fmt::Error> {
let value = match *self {
$(
$enum_name::$variant_name(ref value) => serde_json::to_string(value),
)*
}.unwrap();
write!(f, "{}", value)
}
}
)
}
serializable_enum!(ResponseData,
Init(InitializeResult),
SymbolInfo(Vec<SymbolInformation>),
CompletionItems(Vec<CompletionItem>),
WorkspaceEdit(WorkspaceEdit),
TextEdit([TextEdit; 1]),
Locations(Vec<Location>),
Highlights(Vec<DocumentHighlight>),
HoverSuccess(Hover),
Ack(Ack)
);
// Generates the Method enum and parse_message function.
macro_rules! messages {
(
methods {
// $method_arg is really a 0-1 repetition
$($method_str: pat => $method_name: ident $(($method_arg: ty))*;)*
}
notifications {
$($notif_str: pat => $notif_name: ident $(($notif_arg: ty))*;)*
}
$($other_str: pat => $other_expr: expr;)*
) => {
#[derive(Debug)]
enum Method {
$($method_name$(($method_arg))*,)*
}
fn parse_message(input: &str) -> Result<ServerMessage, ParseError> {
let ls_command: serde_json::Value = serde_json::from_str(input).unwrap();
let params = ls_command.get("params");
macro_rules! params_as {
($ty: ty) => ({
let method: $ty =
serde_json::from_value(params.unwrap().to_owned()).unwrap();
method
});
}
macro_rules! id {
() => ((ls_command.get("id").map(|id| id.as_u64().unwrap() as usize)));
}
if let Some(v) = ls_command.get("method") {
if let Some(name) = v.as_str() {
match name {
$(
$method_str => {
let id = ls_command.get("id").unwrap().as_u64().unwrap() as usize;
Ok(ServerMessage::Request(Request{id: id, method: Method::$method_name$((params_as!($method_arg)))* }))
}
)*
$(
$notif_str => {
Ok(ServerMessage::Notification(Notification::$notif_name$((params_as!($notif_arg)))*))
}
)*
$(
$other_str => $other_expr,
)*
}
} else {
Err(ParseError::new(ErrorKind::InvalidData, "Method is not a string", id!()))
}
} else {
Err(ParseError::new(ErrorKind::InvalidData, "Method not found", id!()))
}
}
};
}
messages! {
methods {
"shutdown" => Shutdown;
"initialize" => Initialize(InitializeParams);
"textDocument/hover" => Hover(TextDocumentPositionParams);
"textDocument/definition" => GotoDef(TextDocumentPositionParams);
"textDocument/references" => FindAllRef(ReferenceParams);
"textDocument/completion" => Complete(TextDocumentPositionParams);
"textDocument/documentHighlight" => Highlight(TextDocumentPositionParams);
// Currently, we safely ignore this as a pass-through since we fully handle
// textDocument/completion. In the future, we may want to use this method as a
// way to fill out completion information more lazily.
"completionItem/resolve" => CompleteResolve(CompletionItem);
"textDocument/documentSymbol" => Symbols(DocumentSymbolParams);
"textDocument/rename" => Rename(RenameParams);
"textDocument/formatting" => Reformat(DocumentFormattingParams);
"textDocument/rangeFormatting" => ReformatRange(DocumentRangeFormattingParams);
}
notifications {
"exit" => Exit;
"textDocument/didChange" => Change(DidChangeTextDocumentParams);
"textDocument/didOpen" => Open(DidOpenTextDocumentParams);
"textDocument/didSave" => Save(DidSaveTextDocumentParams);
"$/cancelRequest" => CancelRequest(CancelParams);
}
// TODO handle me
"$/setTraceNotification" => Err(ParseError::new(ErrorKind::InvalidData, "setTraceNotification", None));
// TODO handle me
"workspace/didChangeConfiguration" => Err(ParseError::new(ErrorKind::InvalidData, "didChangeConfiguration", None));
_ => Err(ParseError::new(ErrorKind::InvalidData, "Unknown command", id!()));
}
pub struct LsService {
shut_down: AtomicBool,
msg_reader: Box<MessageReader + Sync + Send>,
output: Box<Output + Sync + Send>,
handler: ActionHandler,
}
#[derive(Eq, PartialEq, Debug, Clone, Copy)]
pub enum ServerStateChange {
Continue,
Break,
}
impl LsService {
pub fn new(analysis: Arc<AnalysisHost>,
vfs: Arc<Vfs>,
build_queue: Arc<BuildQueue>,
reader: Box<MessageReader + Send + Sync>,
output: Box<Output + Send + Sync>)
-> Arc<LsService> {
Arc::new(LsService {
shut_down: AtomicBool::new(false),
msg_reader: reader,
output: output,
handler: ActionHandler::new(analysis, vfs, build_queue),
})
}
pub fn run(this: Arc<Self>) {
while LsService::handle_message(this.clone()) == ServerStateChange::Continue {}
}
fn init(&self, id: usize, init: InitializeParams) {
let root_path = init.root_path.map(PathBuf::from);
let unstable_features = if let Some(ref root_path) = root_path {
let config = Config::from_path(&root_path);
config.unstable_features
} else {
false
};
let result = InitializeResult {
capabilities: ServerCapabilities {
text_document_sync: Some(TextDocumentSyncKind::Incremental),
hover_provider: Some(true),
completion_provider: Some(CompletionOptions {
resolve_provider: Some(true),
trigger_characters: vec![".".to_string(), ":".to_string()],
}),
// TODO
signature_help_provider: Some(SignatureHelpOptions {
trigger_characters: Some(vec![]),
}),
definition_provider: Some(true),
references_provider: Some(true),
document_highlight_provider: Some(true),
document_symbol_provider: Some(true),
workspace_symbol_provider: Some(true),
code_action_provider: Some(false),
// TODO maybe?
code_lens_provider: None,
document_formatting_provider: Some(unstable_features),
document_range_formatting_provider: Some(unstable_features),
document_on_type_formatting_provider: None, // TODO: review this, maybe add?
rename_provider: Some(unstable_features),
}
};
self.output.success(id, ResponseData::Init(result));
if let Some(root_path) = root_path {
self.handler.init(root_path, &*self.output);
}
}
pub fn handle_message(this: Arc<Self>) -> ServerStateChange {
let c = match this.msg_reader.read_message() {
Some(c) => c,
None => {
this.output.parse_error();
return ServerStateChange::Break
},
};
let this = this.clone();
thread::spawn(move || {
// FIXME(45) refactor to generate this match.
let message = parse_message(&c);
{
let shut_down = this.shut_down.load(Ordering::SeqCst);
if shut_down {
if let Ok(ServerMessage::Notification(Notification::Exit)) = message {
} else {
// We're shut down, so ignore any messages other than 'exit'. This is not
// actually in the spec; I'm not sure we should do this, but it kinda makes sense.
return;
}
}
}
match message {
Ok(ServerMessage::Notification(method)) => {
match method {
Notification::Exit => {
trace!("exiting...");
let shut_down = this.shut_down.load(Ordering::SeqCst);
::std::process::exit(if shut_down { 0 } else { 1 });
}
Notification::CancelRequest(params) => {
trace!("request to cancel {:?}", params.id);
}
Notification::Change(change) => {
trace!("notification(change): {:?}", change);
this.handler.on_change(change, &*this.output);
}
Notification::Open(open) => {
trace!("notification(open): {:?}", open);
this.handler.on_open(open, &*this.output);
}
Notification::Save(save) => {
trace!("notification(save): {:?}", save);
this.handler.on_save(save, &*this.output);
}
}
}
Ok(ServerMessage::Request(Request{id, method})) => {
match method {
Method::Initialize(init) => {
trace!("command(init): {:?}", init);
this.init(id, init);
}
Method::Shutdown => {
trace!("shutting down...");
this.shut_down.store(true, Ordering::SeqCst);
let out = &*this.output;
out.success(id, ResponseData::Ack(Ack {}));
}
Method::Hover(params) => {
trace!("command(hover): {:?}", params);
this.handler.hover(id, params, &*this.output);
}
Method::GotoDef(params) => {
trace!("command(goto): {:?}", params);
this.handler.goto_def(id, params, &*this.output);
}
Method::Complete(params) => {
trace!("command(complete): {:?}", params);
this.handler.complete(id, params, &*this.output);
}
Method::CompleteResolve(params) => {
trace!("command(complete): {:?}", params);
this.output.success(id, ResponseData::CompletionItems(vec![params]))
}
Method::Highlight(params) => {
trace!("command(highlight): {:?}", params);
this.handler.highlight(id, params, &*this.output);
}
Method::Symbols(params) => {
trace!("command(goto): {:?}", params);
this.handler.symbols(id, params, &*this.output);
}
Method::FindAllRef(params) => {
trace!("command(find_all_refs): {:?}", params);
this.handler.find_all_refs(id, params, &*this.output);
}
Method::Rename(params) => {
trace!("command(rename): {:?}", params);
this.handler.rename(id, params, &*this.output);
}
Method::Reformat(params) => {
// FIXME take account of options.
trace!("command(reformat): {:?}", params);
this.handler.reformat(id, params.text_document, &*this.output);
}
Method::ReformatRange(params) => {
// FIXME reformats the whole file, not just a range.
// FIXME take account of options.
trace!("command(reformat range): {:?}", params);
this.handler.reformat(id, params.text_document, &*this.output);
}
}
}
Err(e) => {
trace!("parsing invalid message: {:?}", e);
if let Some(id) = e.id {
this.output.failure(id, "Unsupported message");
}
},
}
});
ServerStateChange::Continue
}
}
pub trait MessageReader {
fn read_message(&self) -> Option<String>;
}
struct StdioMsgReader;
impl MessageReader for StdioMsgReader {
fn read_message(&self) -> Option<String> {
macro_rules! handle_err {
($e: expr, $s: expr) => {
match $e {
Ok(x) => x,
Err(_) => {
debug!($s);
return None;
}
}
}
}
// Read in the "Content-length: xx" part
let mut buffer = String::new();
handle_err!(io::stdin().read_line(&mut buffer), "Could not read from stdin");
if buffer.is_empty() {
info!("Header is empty");
return None;
}
let res: Vec<&str> = buffer.split(' ').collect();
// Make sure we see the correct header
if res.len() != 2 {
info!("Header is malformed");
return None;
}
if res[0].to_lowercase() != "content-length:" {
info!("Header is missing 'content-length'");
return None;
}
let size = handle_err!(usize::from_str_radix(res[1].trim(), 10), "Couldn't read size");
trace!("reading: {} bytes", size);
// Skip the new lines
let mut tmp = String::new();
handle_err!(io::stdin().read_line(&mut tmp), "Could not read from stdin");
let mut content = vec![0; size];
handle_err!(io::stdin().read_exact(&mut content), "Could not read from stdin");
let content = handle_err!(String::from_utf8(content), "Non-utf8 input");
Some(content)
}
}
pub trait Output {
fn response(&self, output: String);
fn parse_error(&self) {
self.response(r#"{"jsonrpc": "2.0", "error": {"code": -32700, "message": "Parse error"}, "id": null}"#.to_owned());
}
fn failure(&self, id: usize, message: &str) {
// For now this is a catch-all for any error back to the consumer of the RLS
const METHOD_NOT_FOUND: i64 = -32601;
#[derive(Serialize)]
struct ResponseError {
code: i64,
message: String
}
#[derive(Serialize)]
struct ResponseFailure {
jsonrpc: &'static str,
id: usize,
error: ResponseError,
}
let rf = ResponseFailure {
jsonrpc: "2.0",
id: id,
error: ResponseError {
code: METHOD_NOT_FOUND,
message: message.to_owned(),
},
};
let output = serde_json::to_string(&rf).unwrap();
self.response(output);
}
fn success(&self, id: usize, data: ResponseData) {
// {
// jsonrpc: String,
// id: usize,
// result: String,
// }
let output = format!("{{\"jsonrpc\":\"2.0\",\"id\":{},\"result\":{}}}", id, data);
self.response(output);
}
fn notify(&self, message: &str) {
let output = serde_json::to_string(
&NotificationMessage::new(message.to_owned(), ())
).unwrap();
self.response(output);
}
}
struct StdioOutput;
impl Output for StdioOutput {
fn response(&self, output: String) {
let o = format!("Content-Length: {}\r\n\r\n{}", output.len(), output);
debug!("response: {:?}", o);
print!("{}", o);
io::stdout().flush().unwrap();
}
}
pub fn run_server(analysis: Arc<AnalysisHost>, vfs: Arc<Vfs>, build_queue: Arc<BuildQueue>) {
debug!("Language Server Starting up");
let service = LsService::new(analysis,
vfs,
build_queue,
Box::new(StdioMsgReader),
Box::new(StdioOutput));
LsService::run(service);
debug!("Server shutting down");
}

View File

@ -1,567 +0,0 @@
// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
// Utilities and infrastructure for testing. Tests in this module test the
// testing infrastructure, *not* the RLS itself.
mod types;
use std::sync::{Arc, Mutex};
use std::thread;
use std::time::{Duration, SystemTime};
use env_logger;
use analysis;
use build;
use server as ls_server;
use vfs;
use self::types::src;
use hyper::Url;
use serde_json;
use std::path::{Path, PathBuf};
const TEST_TIMEOUT_IN_SEC: u64 = 10;
#[test]
fn test_goto_def() {
let (mut cache, _tc) = init_env("goto_def");
let source_file_path = Path::new("src").join("main.rs");
let root_path = format!("{}", serde_json::to_string(&cache.abs_path(Path::new(".")))
.expect("couldn't convert path to JSON"));
let url = Url::from_file_path(cache.abs_path(&source_file_path)).expect("couldn't convert file path to URL");
let text_doc = format!("{{\"uri\":{}}}", serde_json::to_string(&url.as_str().to_owned())
.expect("couldn't convert path to JSON"));
let messages = vec![Message::new("initialize", vec![("processId", "0".to_owned()),
("capabilities", "null".to_owned()),
("rootPath", root_path)]),
Message::new("textDocument/definition",
vec![("textDocument", text_doc),
("position", cache.mk_ls_position(src(&source_file_path, 22, "world")))])];
let (server, results) = mock_server(messages);
// Initialise and build.
assert_eq!(ls_server::LsService::handle_message(server.clone()),
ls_server::ServerStateChange::Continue);
expect_messages(results.clone(), &[ExpectedMessage::new(Some(42)).expect_contains("capabilities"),
ExpectedMessage::new(None).expect_contains("diagnosticsBegin"),
ExpectedMessage::new(None).expect_contains("diagnosticsEnd")]);
assert_eq!(ls_server::LsService::handle_message(server.clone()),
ls_server::ServerStateChange::Continue);
// TODO structural checking of result, rather than looking for a string - src(&source_file_path, 12, "world")
expect_messages(results.clone(), &[ExpectedMessage::new(Some(42)).expect_contains("\"start\":{\"line\":20,\"character\":8}")]);
}
#[test]
fn test_hover() {
let (mut cache, _tc) = init_env("hover");
let source_file_path = Path::new("src").join("main.rs");
let root_path = format!("{}", serde_json::to_string(&cache.abs_path(Path::new(".")))
.expect("couldn't convert path to JSON"));
let url = Url::from_file_path(cache.abs_path(&source_file_path)).expect("couldn't convert file path to URL");
let text_doc = format!("{{\"uri\":{}}}", serde_json::to_string(&url.as_str().to_owned())
.expect("couldn't convert path to JSON"));
let messages = vec![Message::new("initialize", vec![("processId", "0".to_owned()),
("capabilities", "null".to_owned()),
("rootPath", root_path)]),
Message::new("textDocument/hover",
vec![("textDocument", text_doc),
("position", cache.mk_ls_position(src(&source_file_path, 22, "world")))])];
let (server, results) = mock_server(messages);
// Initialise and build.
assert_eq!(ls_server::LsService::handle_message(server.clone()),
ls_server::ServerStateChange::Continue);
expect_messages(results.clone(), &[ExpectedMessage::new(Some(42)).expect_contains("capabilities"),
ExpectedMessage::new(None).expect_contains("diagnosticsBegin"),
ExpectedMessage::new(None).expect_contains("diagnosticsEnd")]);
assert_eq!(ls_server::LsService::handle_message(server.clone()),
ls_server::ServerStateChange::Continue);
expect_messages(results.clone(), &[ExpectedMessage::new(Some(42)).expect_contains("[{\"language\":\"rust\",\"value\":\"&str\"}]")]);
}
#[test]
fn test_find_all_refs() {
let (mut cache, _tc) = init_env("find_all_refs");
let source_file_path = Path::new("src").join("main.rs");
let root_path = format!("{}", serde_json::to_string(&cache.abs_path(Path::new(".")))
.expect("couldn't convert path to JSON"));
let url = Url::from_file_path(cache.abs_path(&source_file_path)).expect("couldn't convert file path to URL");
let text_doc = format!("{{\"uri\":{}}}", serde_json::to_string(&url.as_str().to_owned())
.expect("couldn't convert path to JSON"));
let messages = vec![format!(r#"{{
"jsonrpc": "2.0",
"method": "initialize",
"id": 0,
"params": {{
"processId": "0",
"capabilities": null,
"rootPath": {}
}}
}}"#, root_path), format!(r#"{{
"jsonrpc": "2.0",
"method": "textDocument/references",
"id": 42,
"params": {{
"textDocument": {},
"position": {},
"context": {{
"includeDeclaration": true
}}
}}
}}"#, text_doc, cache.mk_ls_position(src(&source_file_path, 10, "Bar")))];
let (server, results) = mock_raw_server(messages);
// Initialise and build.
assert_eq!(ls_server::LsService::handle_message(server.clone()),
ls_server::ServerStateChange::Continue);
expect_messages(results.clone(), &[ExpectedMessage::new(Some(0)).expect_contains("capabilities"),
ExpectedMessage::new(None).expect_contains("diagnosticsBegin"),
ExpectedMessage::new(None).expect_contains("diagnosticsEnd")]);
assert_eq!(ls_server::LsService::handle_message(server.clone()),
ls_server::ServerStateChange::Continue);
expect_messages(results.clone(), &[ExpectedMessage::new(Some(42)).expect_contains(r#"{"start":{"line":9,"character":7},"end":{"line":9,"character":10}}"#)
.expect_contains(r#"{"start":{"line":15,"character":14},"end":{"line":15,"character":17}}"#)
.expect_contains(r#"{"start":{"line":23,"character":15},"end":{"line":23,"character":18}}"#)]);
}
#[test]
fn test_find_all_refs_no_cfg_test() {
let (mut cache, _tc) = init_env("find_all_refs_no_cfg_test");
let source_file_path = Path::new("src").join("main.rs");
let root_path = format!("{}", serde_json::to_string(&cache.abs_path(Path::new(".")))
.expect("couldn't convert path to JSON"));
let url = Url::from_file_path(cache.abs_path(&source_file_path)).expect("couldn't convert file path to URL");
let text_doc = format!("{{\"uri\":{}}}", serde_json::to_string(&url.as_str().to_owned())
.expect("couldn't convert path to JSON"));
let messages = vec![format!(r#"{{
"jsonrpc": "2.0",
"method": "initialize",
"id": 0,
"params": {{
"processId": "0",
"capabilities": null,
"rootPath": {}
}}
}}"#, root_path), format!(r#"{{
"jsonrpc": "2.0",
"method": "textDocument/references",
"id": 42,
"params": {{
"textDocument": {},
"position": {},
"context": {{
"includeDeclaration": true
}}
}}
}}"#, text_doc, cache.mk_ls_position(src(&source_file_path, 10, "Bar")))];
let (server, results) = mock_raw_server(messages);
// Initialise and build.
assert_eq!(ls_server::LsService::handle_message(server.clone()),
ls_server::ServerStateChange::Continue);
expect_messages(results.clone(), &[ExpectedMessage::new(Some(0)).expect_contains("capabilities"),
ExpectedMessage::new(None).expect_contains("diagnosticsBegin"),
ExpectedMessage::new(None).expect_contains("diagnosticsEnd")]);
assert_eq!(ls_server::LsService::handle_message(server.clone()),
ls_server::ServerStateChange::Continue);
expect_messages(results.clone(), &[ExpectedMessage::new(Some(42)).expect_contains(r#"{"start":{"line":9,"character":7},"end":{"line":9,"character":10}}"#)
.expect_contains(r#"{"start":{"line":23,"character":15},"end":{"line":23,"character":18}}"#)]);
}
#[test]
fn test_borrow_error() {
let (cache, _tc) = init_env("borrow_error");
let root_path = format!("{}", serde_json::to_string(&cache.abs_path(Path::new(".")))
.expect("couldn't convert path to JSON"));
let messages = vec![format!(r#"{{
"jsonrpc": "2.0",
"method": "initialize",
"id": 0,
"params": {{
"processId": "0",
"capabilities": null,
"rootPath": {}
}}
}}"#, root_path)];
let (server, results) = mock_raw_server(messages);
// Initialise and build.
assert_eq!(ls_server::LsService::handle_message(server.clone()),
ls_server::ServerStateChange::Continue);
expect_messages(results.clone(), &[ExpectedMessage::new(Some(0)).expect_contains("capabilities"),
ExpectedMessage::new(None).expect_contains("diagnosticsBegin"),
ExpectedMessage::new(None).expect_contains("\"secondaryRanges\":[{\"start\":{\"line\":2,\"character\":17},\"end\":{\"line\":2,\"character\":18},\"label\":\"first mutable borrow occurs here\"}"),
ExpectedMessage::new(None).expect_contains("diagnosticsEnd")]);
}
#[test]
fn test_highlight() {
let (mut cache, _tc) = init_env("highlight");
let source_file_path = Path::new("src").join("main.rs");
let root_path = format!("{}", serde_json::to_string(&cache.abs_path(Path::new(".")))
.expect("couldn't convert path to JSON"));
let url = Url::from_file_path(cache.abs_path(&source_file_path)).expect("couldn't convert file path to URL");
let text_doc = format!("{{\"uri\":{}}}", serde_json::to_string(&url.as_str().to_owned())
.expect("couldn't convert path to JSON"));
let messages = vec![format!(r#"{{
"jsonrpc": "2.0",
"method": "initialize",
"id": 0,
"params": {{
"processId": "0",
"capabilities": null,
"rootPath": {}
}}
}}"#, root_path), format!(r#"{{
"jsonrpc": "2.0",
"method": "textDocument/documentHighlight",
"id": 42,
"params": {{
"textDocument": {},
"position": {}
}}
}}"#, text_doc, cache.mk_ls_position(src(&source_file_path, 22, "world")))];
let (server, results) = mock_raw_server(messages);
// Initialise and build.
assert_eq!(ls_server::LsService::handle_message(server.clone()),
ls_server::ServerStateChange::Continue);
expect_messages(results.clone(), &[ExpectedMessage::new(Some(0)).expect_contains("capabilities"),
ExpectedMessage::new(None).expect_contains("diagnosticsBegin"),
ExpectedMessage::new(None).expect_contains("diagnosticsEnd")]);
assert_eq!(ls_server::LsService::handle_message(server.clone()),
ls_server::ServerStateChange::Continue);
expect_messages(results.clone(), &[ExpectedMessage::new(Some(42)).expect_contains(r#"{"start":{"line":20,"character":8},"end":{"line":20,"character":13}}"#)
.expect_contains(r#"{"start":{"line":21,"character":27},"end":{"line":21,"character":32}}"#),]);
}
#[test]
fn test_rename() {
let (mut cache, _tc) = init_env("rename");
let source_file_path = Path::new("src").join("main.rs");
let root_path = format!("{}", serde_json::to_string(&cache.abs_path(Path::new(".")))
.expect("couldn't convert path to JSON"));
let url = Url::from_file_path(cache.abs_path(&source_file_path)).expect("couldn't convert file path to URL");
let text_doc = format!("{{\"uri\":{}}}", serde_json::to_string(&url.as_str().to_owned())
.expect("couldn't convert path to JSON"));
let messages = vec![format!(r#"{{
"jsonrpc": "2.0",
"method": "initialize",
"id": 0,
"params": {{
"processId": "0",
"capabilities": null,
"rootPath": {}
}}
}}"#, root_path), format!(r#"{{
"jsonrpc": "2.0",
"method": "textDocument/rename",
"id": 42,
"params": {{
"textDocument": {},
"position": {},
"newName": "foo"
}}
}}"#, text_doc, cache.mk_ls_position(src(&source_file_path, 22, "world")))];
let (server, results) = mock_raw_server(messages);
// Initialise and build.
assert_eq!(ls_server::LsService::handle_message(server.clone()),
ls_server::ServerStateChange::Continue);
expect_messages(results.clone(), &[ExpectedMessage::new(Some(0)).expect_contains("capabilities"),
ExpectedMessage::new(None).expect_contains("diagnosticsBegin"),
ExpectedMessage::new(None).expect_contains("diagnosticsEnd")]);
assert_eq!(ls_server::LsService::handle_message(server.clone()),
ls_server::ServerStateChange::Continue);
expect_messages(results.clone(), &[ExpectedMessage::new(Some(42)).expect_contains(r#"{"start":{"line":20,"character":8},"end":{"line":20,"character":13}}"#)
.expect_contains(r#"{"start":{"line":21,"character":27},"end":{"line":21,"character":32}}"#)
.expect_contains(r#"{"changes""#),]);
}
#[test]
fn test_completion() {
let (mut cache, _tc) = init_env("completion");
let source_file_path = Path::new("src").join("main.rs");
let root_path = format!("{}", serde_json::to_string(&cache.abs_path(Path::new(".")))
.expect("couldn't convert path to JSON"));
let url = Url::from_file_path(cache.abs_path(&source_file_path)).expect("couldn't convert file path to URL");
let text_doc = format!("{{\"uri\":{}}}", serde_json::to_string(&url.as_str().to_owned())
.expect("couldn't convert path to JSON"));
let messages = vec![Message::new("initialize", vec![("processId", "0".to_owned()),
("capabilities", "null".to_owned()),
("rootPath", root_path)]),
Message::new("textDocument/completion",
vec![("textDocument", text_doc.to_owned()),
("position", cache.mk_ls_position(src(&source_file_path, 22, "rld")))]),
Message::new("textDocument/completion",
vec![("textDocument", text_doc.to_owned()),
("position", cache.mk_ls_position(src(&source_file_path, 25, "x)")))])];
let (server, results) = mock_server(messages);
// Initialise and build.
assert_eq!(ls_server::LsService::handle_message(server.clone()),
ls_server::ServerStateChange::Continue);
expect_messages(results.clone(), &[ExpectedMessage::new(Some(42)).expect_contains("capabilities"),
ExpectedMessage::new(None).expect_contains("diagnosticsBegin"),
ExpectedMessage::new(None).expect_contains("diagnosticsEnd")]);
assert_eq!(ls_server::LsService::handle_message(server.clone()),
ls_server::ServerStateChange::Continue);
expect_messages(results.clone(), &[ExpectedMessage::new(Some(42)).expect_contains("[{\"label\":\"world\",\"kind\":6,\"detail\":\"let world = \\\"world\\\";\"}]")]);
assert_eq!(ls_server::LsService::handle_message(server.clone()),
ls_server::ServerStateChange::Continue);
expect_messages(results.clone(), &[ExpectedMessage::new(Some(42)).expect_contains("[{\"label\":\"x\",\"kind\":5,\"detail\":\"u64\"}]")]);
}
#[test]
fn test_parse_error_on_malformed_input() {
let _ = env_logger::init();
struct NoneMsgReader;
impl ls_server::MessageReader for NoneMsgReader {
fn read_message(&self) -> Option<String> { None }
}
let analysis = Arc::new(analysis::AnalysisHost::new(analysis::Target::Debug));
let vfs = Arc::new(vfs::Vfs::new());
let build_queue = Arc::new(build::BuildQueue::new(vfs.clone()));
let reader = Box::new(NoneMsgReader);
let output = Box::new(RecordOutput::new());
let results = output.output.clone();
let server = ls_server::LsService::new(analysis, vfs, build_queue, reader, output);
assert_eq!(ls_server::LsService::handle_message(server.clone()),
ls_server::ServerStateChange::Break);
let error = results.lock().unwrap()
.pop().expect("no error response");
assert!(error.contains(r#""code": -32700"#))
}
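The parse-error test above asserts on code `-32700`, which JSON-RPC 2.0 reserves for "Parse error" (the server received text that is not valid JSON). A minimal Python sketch of the response shape the server is expected to emit; the helper name is hypothetical:

```python
import json

# JSON-RPC 2.0 reserves -32700 for "Parse error": the input was not
# valid JSON, so no request id can be recovered from it.
PARSE_ERROR_CODE = -32700

def parse_error_response():
    # The id is null because the malformed input never yielded one.
    return {
        "jsonrpc": "2.0",
        "id": None,
        "error": {"code": PARSE_ERROR_CODE, "message": "Parse error"},
    }

wire = json.dumps(parse_error_response())
```

The test's `error.contains(r#""code": -32700"#)` check matches the serialized form of exactly this field.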
// Initialise and run the internals of an LS protocol RLS server.
fn mock_server(messages: Vec<Message>) -> (Arc<ls_server::LsService>, LsResultList)
{
let analysis = Arc::new(analysis::AnalysisHost::new(analysis::Target::Debug));
let vfs = Arc::new(vfs::Vfs::new());
let build_queue = Arc::new(build::BuildQueue::new(vfs.clone()));
let reader = Box::new(MockMsgReader::new(messages));
let output = Box::new(RecordOutput::new());
let results = output.output.clone();
(ls_server::LsService::new(analysis, vfs, build_queue, reader, output), results)
}
// Initialise and run the internals of an LS protocol RLS server.
fn mock_raw_server(messages: Vec<String>) -> (Arc<ls_server::LsService>, LsResultList)
{
let analysis = Arc::new(analysis::AnalysisHost::new(analysis::Target::Debug));
let vfs = Arc::new(vfs::Vfs::new());
let build_queue = Arc::new(build::BuildQueue::new(vfs.clone()));
let reader = Box::new(MockRawMsgReader::new(messages));
let output = Box::new(RecordOutput::new());
let results = output.output.clone();
(ls_server::LsService::new(analysis, vfs, build_queue, reader, output), results)
}
struct MockMsgReader {
messages: Vec<Message>,
cur: Mutex<usize>,
}
impl MockMsgReader {
fn new(messages: Vec<Message>) -> MockMsgReader {
MockMsgReader {
messages: messages,
cur: Mutex::new(0),
}
}
}
struct MockRawMsgReader {
messages: Vec<String>,
cur: Mutex<usize>,
}
impl MockRawMsgReader {
fn new(messages: Vec<String>) -> MockRawMsgReader {
MockRawMsgReader {
messages: messages,
cur: Mutex::new(0),
}
}
}
// TODO should have a structural way of making params, rather than taking Strings
struct Message {
method: &'static str,
params: Vec<(&'static str, String)>,
}
impl Message {
fn new(method: &'static str, params: Vec<(&'static str, String)>) -> Message {
Message {
method: method,
params: params,
}
}
}
impl ls_server::MessageReader for MockMsgReader {
fn read_message(&self) -> Option<String> {
// Note that we hold this lock until the end of the function, so we must
// finish processing one message before starting on the next.
let mut cur = self.cur.lock().unwrap();
let index = *cur;
*cur += 1;
if index >= self.messages.len() {
return None;
}
let message = &self.messages[index];
let params = message.params.iter().map(|&(k, ref v)| format!("\"{}\":{}", k, v)).collect::<Vec<String>>().join(",");
// TODO don't hardcode the id, we should use fresh ids and use them to look up responses
let result = format!("{{\"method\":\"{}\",\"id\":42,\"params\":{{{}}}}}", message.method, params);
// println!("read_message: `{}`", result);
Some(result)
}
}
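`MockMsgReader::read_message` above assembles a raw JSON-RPC request by joining `"key":value` pairs (the values are already raw JSON text) and splicing them into an envelope with a hardcoded id. The same assembly, sketched in Python with a hypothetical function name:

```python
def build_request(method, params, request_id=42):
    """Assemble a raw JSON-RPC request string the way MockMsgReader does:
    params arrive as (key, raw-json-value) pairs, and the id is fixed."""
    body = ",".join('"{}":{}'.format(k, v) for k, v in params)
    return '{{"method":"{}","id":{},"params":{{{}}}}}'.format(
        method, request_id, body)
```

For example, `build_request("initialize", [("processId", "0"), ("capabilities", "null")])` yields the same envelope the mock reader produces for the initialize message.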
impl ls_server::MessageReader for MockRawMsgReader {
fn read_message(&self) -> Option<String> {
// Note that we hold this lock until the end of the function, so we must
// finish processing one message before starting on the next.
let mut cur = self.cur.lock().unwrap();
let index = *cur;
*cur += 1;
if index >= self.messages.len() {
return None;
}
let message = &self.messages[index];
Some(message.clone())
}
}
type LsResultList = Arc<Mutex<Vec<String>>>;
struct RecordOutput {
output: LsResultList,
}
impl RecordOutput {
fn new() -> RecordOutput {
RecordOutput {
output: Arc::new(Mutex::new(vec![])),
}
}
}
impl ls_server::Output for RecordOutput {
fn response(&self, output: String) {
let mut records = self.output.lock().unwrap();
records.push(output);
}
}
// Initialise the environment for a test.
fn init_env(project_dir: &str) -> (types::Cache, TestCleanup) {
let _ = env_logger::init();
let path = &Path::new("test_data").join(project_dir);
let tc = TestCleanup { path: path.to_owned() };
(types::Cache::new(path), tc)
}
#[derive(Clone, Debug)]
struct ExpectedMessage {
id: Option<u64>,
contains: Vec<String>,
}
impl ExpectedMessage {
fn new(id: Option<u64>) -> ExpectedMessage {
ExpectedMessage {
id: id,
contains: vec![],
}
}
fn expect_contains(&mut self, s: &str) -> &mut ExpectedMessage {
self.contains.push(s.to_owned());
self
}
}
fn expect_messages(results: LsResultList, expected: &[&ExpectedMessage]) {
let start_clock = SystemTime::now();
let mut results_count = results.lock().unwrap().len();
while (results_count != expected.len()) && (start_clock.elapsed().unwrap().as_secs() < TEST_TIMEOUT_IN_SEC) {
thread::sleep(Duration::from_millis(100));
results_count = results.lock().unwrap().len();
}
let mut results = results.lock().unwrap();
println!("expect_messages: results: {:?},\nexpected: {:?}", *results, expected);
assert_eq!(results.len(), expected.len());
for (found, expected) in results.iter().zip(expected.iter()) {
let values: serde_json::Value = serde_json::from_str(found).unwrap();
assert!(values.get("jsonrpc").expect("Missing jsonrpc field").as_str().unwrap() == "2.0", "Bad jsonrpc field");
if let Some(id) = expected.id {
assert_eq!(values.get("id").expect("Missing id field").as_u64().unwrap(), id, "Unexpected id");
}
for c in expected.contains.iter() {
found.find(c).expect(&format!("Could not find `{}` in `{}`", c, found));
}
}
*results = vec![];
}
struct TestCleanup {
path: PathBuf
}
impl Drop for TestCleanup {
fn drop(&mut self) {
use std::fs;
let target_path = self.path.join("target");
if fs::metadata(&target_path).is_ok() {
fs::remove_dir_all(target_path).expect("failed to tidy up");
}
}
}
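`expect_messages` above polls the shared results list until the expected number of responses has arrived or `TEST_TIMEOUT_IN_SEC` elapses, then asserts on the contents. That wait-with-timeout pattern can be sketched in Python (names hypothetical):

```python
import threading
import time

def wait_for_results(results, lock, expected_count, timeout=5.0, poll=0.01):
    """Poll a shared list until it holds expected_count items or the
    timeout elapses, mirroring the expect_messages loop above."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        with lock:
            if len(results) >= expected_count:
                return list(results)
        time.sleep(poll)
    with lock:  # timed out; return whatever arrived
        return list(results)

# Simulate a server thread appending responses asynchronously.
results, lock = [], threading.Lock()

def fake_server():
    for msg in ('{"id": 0}', '{"id": 42}'):
        time.sleep(0.02)
        with lock:
            results.append(msg)

t = threading.Thread(target=fake_server)
t.start()
got = wait_for_results(results, lock, expected_count=2)
t.join()
```

Polling with a deadline keeps the test from hanging forever when the server produces fewer messages than expected; the assertion on length then fails with the partial results visible.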

View File

@ -1,92 +0,0 @@
// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
use std::collections::HashMap;
use std::env;
use std::fs::File;
use std::path::{Path, PathBuf};
use std::io::{BufRead, BufReader};
#[derive(Clone, Copy, Debug)]
pub struct Src<'a, 'b> {
pub file_name: &'a Path,
// 1 indexed
pub line: usize,
pub name: &'b str,
}
pub fn src<'a, 'b>(file_name: &'a Path, line: usize, name: &'b str) -> Src<'a, 'b> {
Src {
file_name: file_name,
line: line,
name: name,
}
}
pub struct Cache {
base_path: PathBuf,
files: HashMap<PathBuf, Vec<String>>,
}
impl Cache {
pub fn new(base_path: &Path) -> Cache {
let mut root_path = env::current_dir().expect("Could not find current working directory");
root_path.push(base_path);
Cache {
base_path: root_path,
files: HashMap::new(),
}
}
pub fn mk_ls_position(&mut self, src: Src) -> String {
let line = self.get_line(src);
let col = line.find(src.name).expect(&format!("Line does not contain name {}", src.name));
format!("{{\"line\":\"{}\",\"character\":\"{}\"}}", src.line - 1, char_of_byte_index(&line, col))
}
pub fn abs_path(&self, file_name: &Path) -> PathBuf {
let result = self.base_path.join(file_name).canonicalize().expect("Couldn't canonicalise path");
let result = if cfg!(windows) {
// FIXME: If the \\?\ prefix is not stripped from the canonical path, the HTTP server tests fail. Why?
let result_string = result.to_str().expect("Path contains non-utf8 characters.");
PathBuf::from(&result_string[r"\\?\".len()..])
} else {
result
};
result
}
fn get_line(&mut self, src: Src) -> String {
let base_path = &self.base_path;
let lines = self.files.entry(src.file_name.to_owned()).or_insert_with(|| {
let file_name = &base_path.join(src.file_name);
let file = File::open(file_name).expect(&format!("Couldn't find file: {:?}", file_name));
let lines = BufReader::new(file).lines();
lines.collect::<Result<Vec<_>, _>>().unwrap()
});
if src.line - 1 >= lines.len() {
panic!("Line {} not in file, found {} lines", src.line, lines.len());
}
lines[src.line - 1].to_owned()
}
}
fn char_of_byte_index(s: &str, byte: usize) -> usize {
for (c, (b, _)) in s.char_indices().enumerate() {
if b == byte {
return c;
}
}
panic!("Couldn't find byte {} in {:?}", byte, s);
}
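LSP positions are character-based while `str::find` returns byte offsets, which is why `mk_ls_position` converts through `char_of_byte_index`. The same conversion in Python, walking the characters and tracking their UTF-8 byte widths:

```python
def char_of_byte_index(s, byte):
    """Return the index of the character whose UTF-8 encoding starts at
    byte offset `byte`, mirroring the Rust helper above."""
    offset = 0
    for char_index, ch in enumerate(s):
        if offset == byte:
            return char_index
        offset += len(ch.encode("utf-8"))
    # Like the Rust version, an offset past the end or inside a
    # multi-byte character is an error.
    raise ValueError("byte {} is not a char boundary in {!r}".format(byte, s))
```

In `"héllo"`, for instance, byte offset 3 maps to character index 2, because `é` occupies two bytes.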

View File

@ -1,4 +0,0 @@
[root]
name = "borrow_error"
version = "0.1.0"

View File

@ -1,6 +0,0 @@
[package]
name = "borrow_error"
version = "0.1.0"
authors = ["Jonathan Turner <jturner@mozilla.com>"]
[dependencies]

View File

@ -1,5 +0,0 @@
fn main() {
let mut x = 3;
let y = &mut x;
let z = &mut x;
}

View File

@ -1,4 +0,0 @@
[root]
name = "completion"
version = "0.1.0"

View File

@ -1,6 +0,0 @@
[package]
name = "completion"
version = "0.1.0"
authors = ["Nick Cameron <ncameron@mozilla.com>"]
[dependencies]

View File

@ -1,4 +0,0 @@
[root]
name = "find_all_refs"
version = "0.1.0"

View File

@ -1,6 +0,0 @@
[package]
name = "find_all_refs"
version = "0.1.0"
authors = ["Nick Cameron <ncameron@mozilla.com>"]
[dependencies]

View File

@ -1,4 +0,0 @@
[root]
name = "find_all_refs_no_cfg_test"
version = "0.1.0"

View File

@ -1,6 +0,0 @@
[package]
name = "find_all_refs_no_cfg_test"
version = "0.1.0"
authors = ["Nick Cameron <ncameron@mozilla.com>"]
[dependencies]

View File

@ -1 +0,0 @@
cfg_test = false

View File

@ -1,4 +0,0 @@
[root]
name = "goto_def"
version = "0.1.0"

View File

@ -1,6 +0,0 @@
[package]
name = "goto_def"
version = "0.1.0"
authors = ["Nick Cameron <ncameron@mozilla.com>"]
[dependencies]

View File

@ -1,4 +0,0 @@
[root]
name = "highlight"
version = "0.1.0"

View File

@ -1,6 +0,0 @@
[package]
name = "highlight"
version = "0.1.0"
authors = ["Nick Cameron <ncameron@mozilla.com>"]
[dependencies]

View File

@ -1,26 +0,0 @@
// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
struct Bar {
x: u64,
}
#[test]
pub fn test_fn() {
let bar = Bar { x: 4 };
println!("bar: {}", bar.x);
}
pub fn main() {
let world = "world";
println!("Hello, {}!", world);
let bar2 = Bar { x: 5 };
println!("bar2: {}", bar2.x);
}

View File

@ -1,4 +0,0 @@
[root]
name = "hover"
version = "0.1.0"

View File

@ -1,6 +0,0 @@
[package]
name = "hover"
version = "0.1.0"
authors = ["Nick Cameron <ncameron@mozilla.com>"]
[dependencies]

View File

@ -1,26 +0,0 @@
// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
struct Bar {
x: u64,
}
#[test]
pub fn test_fn() {
let bar = Bar { x: 4 };
println!("bar: {}", bar.x);
}
pub fn main() {
let world = "world";
println!("Hello, {}!", world);
let bar2 = Bar { x: 5 };
println!("bar2: {}", bar2.x);
}

View File

@ -1,4 +0,0 @@
[root]
name = "rename"
version = "0.1.0"

View File

@ -1,6 +0,0 @@
[package]
name = "rename"
version = "0.1.0"
authors = ["Nick Cameron <ncameron@mozilla.com>"]
[dependencies]

View File

@ -1 +0,0 @@
unstable_features = true

View File

@ -1,26 +0,0 @@
// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
struct Bar {
x: u64,
}
#[test]
pub fn test_fn() {
let bar = Bar { x: 4 };
println!("bar: {}", bar.x);
}
pub fn main() {
let world = "world";
println!("Hello, {}!", world);
let bar2 = Bar { x: 5 };
println!("bar2: {}", bar2.x);
}

1070
src/Cargo.lock generated

File diff suppressed because it is too large

View File

@ -11,8 +11,10 @@ members = [
"tools/rustbook",
"tools/tidy",
"tools/build-manifest",
"tools/qemu-test-client",
"tools/qemu-test-server",
"tools/remote-test-client",
"tools/remote-test-server",
"tools/rust-installer",
"tools/cargo",
]
# Curiously, compiletest will segfault if compiled with opt-level=3 on 64-bit

View File

@ -23,13 +23,18 @@ name = "rustdoc"
path = "bin/rustdoc.rs"
test = false
[[bin]]
name = "sccache-plus-cl"
path = "bin/sccache-plus-cl.rs"
test = false
[dependencies]
build_helper = { path = "../build_helper" }
cmake = "0.1.17"
cmake = "0.1.23"
filetime = "0.1"
num_cpus = "0.2"
num_cpus = "1.0"
toml = "0.1"
getopts = "0.2"
rustc-serialize = "0.3"
gcc = "0.3.38"
gcc = "0.3.50"
libc = "0.2"

View File

@ -26,12 +26,6 @@ use bootstrap::{Flags, Config, Build};
fn main() {
let args = env::args().skip(1).collect::<Vec<_>>();
let flags = Flags::parse(&args);
let mut config = Config::parse(&flags.build, flags.config.clone());
// compat with `./configure` while we're still using that
if std::fs::metadata("config.mk").is_ok() {
config.update_with_config_mk();
}
let config = Config::parse(&flags.build, flags.config.clone());
Build::new(flags, config).build();
}

View File

@ -38,7 +38,31 @@ use std::path::PathBuf;
use std::process::{Command, ExitStatus};
fn main() {
let args = env::args_os().skip(1).collect::<Vec<_>>();
let mut args = env::args_os().skip(1).collect::<Vec<_>>();
// Append metadata suffix for internal crates. See the corresponding entry
// in bootstrap/lib.rs for details.
if let Ok(s) = env::var("RUSTC_METADATA_SUFFIX") {
for i in 1..args.len() {
// Dirty code for borrowing issues
let mut new = None;
if let Some(current_as_str) = args[i].to_str() {
if (&*args[i - 1] == "-C" && current_as_str.starts_with("metadata")) ||
current_as_str.starts_with("-Cmetadata") {
new = Some(format!("{}-{}", current_as_str, s));
}
}
if let Some(new) = new { args[i] = new.into(); }
}
}
// Drop `--error-format json` because despite our desire for json messages
// from Cargo we don't want any from rustc itself.
if let Some(n) = args.iter().position(|n| n == "--error-format") {
args.remove(n);
args.remove(n);
}
// Detect whether or not we're a build script depending on whether --target
// is passed (a bit janky...)
let target = args.windows(2)
@ -194,6 +218,8 @@ fn main() {
// do that we pass a weird flag to the compiler to get it to do
// so. Note that this is definitely a hack, and we should likely
// flesh out rpath support more fully in the future.
//
// FIXME: remove condition after next stage0
if stage != "0" {
cmd.arg("-Z").arg("osx-rpath-install-name");
}
@ -206,18 +232,23 @@ fn main() {
if let Some(rpath) = rpath {
cmd.arg("-C").arg(format!("link-args={}", rpath));
}
if let Ok(s) = env::var("RUSTFLAGS") {
for flag in s.split_whitespace() {
cmd.arg(flag);
}
}
}
if target.contains("pc-windows-msvc") {
cmd.arg("-Z").arg("unstable-options");
cmd.arg("-C").arg("target-feature=+crt-static");
}
// Force all crates compiled by this compiler to (a) be unstable and (b)
// allow the `rustc_private` feature to link to other unstable crates
// also in the sysroot.
//
// FIXME: remove condition after next stage0
if env::var_os("RUSTC_FORCE_UNSTABLE").is_some() {
if stage != "0" {
cmd.arg("-Z").arg("force-unstable-if-unmarked");
}
}
}
if verbose > 1 {

View File

@ -0,0 +1,43 @@
// Copyright 2017 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
extern crate gcc;
use std::env;
use std::process::{self, Command};
fn main() {
let target = env::var("SCCACHE_TARGET").unwrap();
// Locate the actual compiler that we're invoking
env::remove_var("CC");
env::remove_var("CXX");
let mut cfg = gcc::Config::new();
cfg.cargo_metadata(false)
.out_dir("/")
.target(&target)
.host(&target)
.opt_level(0)
.debug(false);
let compiler = cfg.get_compiler();
// Invoke sccache with said compiler
let sccache_path = env::var_os("SCCACHE_PATH").unwrap();
let mut cmd = Command::new(&sccache_path);
cmd.arg(compiler.path());
for &(ref k, ref v) in compiler.env() {
cmd.env(k, v);
}
for arg in env::args().skip(1) {
cmd.arg(arg);
}
let status = cmd.status().expect("failed to spawn");
process::exit(status.code().unwrap_or(2))
}

View File

@ -14,6 +14,7 @@ import contextlib
import datetime
import hashlib
import os
import re
import shutil
import subprocess
import sys
@ -39,7 +40,8 @@ def get(url, path, verbose=False):
return
else:
if verbose:
print("ignoring already-downloaded file " + path + " due to failed verification")
print("ignoring already-downloaded file " +
path + " due to failed verification")
os.unlink(path)
download(temp_path, url, True, verbose)
if not verify(temp_path, sha_path, verbose):
@ -126,13 +128,13 @@ def unpack(tarball, dst, verbose=False, match=None):
shutil.move(tp, fp)
shutil.rmtree(os.path.join(dst, fname))
def run(args, verbose=False, exception=False):
def run(args, verbose=False, exception=False, **kwargs):
if verbose:
print("running: " + ' '.join(args))
sys.stdout.flush()
# Use Popen here instead of call() as it apparently allows powershell on
# Windows to not lock up waiting for input presumably.
ret = subprocess.Popen(args)
ret = subprocess.Popen(args, **kwargs)
code = ret.wait()
if code != 0:
err = "failed to run: " + ' '.join(args)
@ -140,6 +142,7 @@ def run(args, verbose=False, exception=False):
raise RuntimeError(err)
sys.exit(err)
def stage0_data(rust_root):
nightlies = os.path.join(rust_root, "src/stage0.txt")
data = {}
@ -152,71 +155,84 @@ def stage0_data(rust_root):
data[a] = b
return data
def format_build_time(duration):
return str(datetime.timedelta(seconds=int(duration)))
class RustBuild(object):
def download_stage0(self):
cache_dst = os.path.join(self.build_dir, "cache")
rustc_cache = os.path.join(cache_dst, self.stage0_rustc_date())
rustc_cache = os.path.join(cache_dst, self.stage0_date())
if not os.path.exists(rustc_cache):
os.makedirs(rustc_cache)
channel = self.stage0_rustc_channel()
rustc_channel = self.stage0_rustc_channel()
cargo_channel = self.stage0_cargo_channel()
if self.rustc().startswith(self.bin_root()) and \
(not os.path.exists(self.rustc()) or self.rustc_out_of_date()):
self.print_what_it_means_to_bootstrap()
if os.path.exists(self.bin_root()):
shutil.rmtree(self.bin_root())
filename = "rust-std-{}-{}.tar.gz".format(channel, self.build)
url = "https://static.rust-lang.org/dist/" + self.stage0_rustc_date()
filename = "rust-std-{}-{}.tar.gz".format(
rustc_channel, self.build)
url = self._download_url + "/dist/" + self.stage0_date()
tarball = os.path.join(rustc_cache, filename)
if not os.path.exists(tarball):
get("{}/{}".format(url, filename), tarball, verbose=self.verbose)
get("{}/{}".format(url, filename),
tarball, verbose=self.verbose)
unpack(tarball, self.bin_root(),
match="rust-std-" + self.build,
verbose=self.verbose)
filename = "rustc-{}-{}.tar.gz".format(channel, self.build)
url = "https://static.rust-lang.org/dist/" + self.stage0_rustc_date()
filename = "rustc-{}-{}.tar.gz".format(rustc_channel, self.build)
url = self._download_url + "/dist/" + self.stage0_date()
tarball = os.path.join(rustc_cache, filename)
if not os.path.exists(tarball):
get("{}/{}".format(url, filename), tarball, verbose=self.verbose)
unpack(tarball, self.bin_root(), match="rustc", verbose=self.verbose)
get("{}/{}".format(url, filename),
tarball, verbose=self.verbose)
unpack(tarball, self.bin_root(),
match="rustc", verbose=self.verbose)
self.fix_executable(self.bin_root() + "/bin/rustc")
self.fix_executable(self.bin_root() + "/bin/rustdoc")
with open(self.rustc_stamp(), 'w') as f:
f.write(self.stage0_rustc_date())
f.write(self.stage0_date())
if "pc-windows-gnu" in self.build:
filename = "rust-mingw-{}-{}.tar.gz".format(channel, self.build)
url = "https://static.rust-lang.org/dist/" + self.stage0_rustc_date()
filename = "rust-mingw-{}-{}.tar.gz".format(
rustc_channel, self.build)
url = self._download_url + "/dist/" + self.stage0_date()
tarball = os.path.join(rustc_cache, filename)
if not os.path.exists(tarball):
get("{}/{}".format(url, filename), tarball, verbose=self.verbose)
unpack(tarball, self.bin_root(), match="rust-mingw", verbose=self.verbose)
get("{}/{}".format(url, filename),
tarball, verbose=self.verbose)
unpack(tarball, self.bin_root(),
match="rust-mingw", verbose=self.verbose)
if self.cargo().startswith(self.bin_root()) and \
(not os.path.exists(self.cargo()) or self.cargo_out_of_date()):
self.print_what_it_means_to_bootstrap()
filename = "cargo-{}-{}.tar.gz".format('0.18.0', self.build)
url = "https://static.rust-lang.org/dist/" + self.stage0_rustc_date()
filename = "cargo-{}-{}.tar.gz".format(cargo_channel, self.build)
url = self._download_url + "/dist/" + self.stage0_date()
tarball = os.path.join(rustc_cache, filename)
if not os.path.exists(tarball):
get("{}/{}".format(url, filename), tarball, verbose=self.verbose)
unpack(tarball, self.bin_root(), match="cargo", verbose=self.verbose)
get("{}/{}".format(url, filename),
tarball, verbose=self.verbose)
unpack(tarball, self.bin_root(),
match="cargo", verbose=self.verbose)
self.fix_executable(self.bin_root() + "/bin/cargo")
with open(self.cargo_stamp(), 'w') as f:
f.write(self.stage0_rustc_date())
f.write(self.stage0_date())
def fix_executable(self, fname):
# If we're on NixOS we need to change the path to the dynamic loader
default_encoding = sys.getdefaultencoding()
try:
ostype = subprocess.check_output(['uname', '-s']).strip().decode(default_encoding)
ostype = subprocess.check_output(
['uname', '-s']).strip().decode(default_encoding)
except (subprocess.CalledProcessError, WindowsError):
return
@ -232,7 +248,8 @@ class RustBuild(object):
print("info: you seem to be running NixOS. Attempting to patch " + fname)
try:
interpreter = subprocess.check_output(["patchelf", "--print-interpreter", fname])
interpreter = subprocess.check_output(
["patchelf", "--print-interpreter", fname])
interpreter = interpreter.strip().decode(default_encoding)
except subprocess.CalledProcessError as e:
print("warning: failed to call patchelf: %s" % e)
@ -241,7 +258,8 @@ class RustBuild(object):
loader = interpreter.split("/")[-1]
try:
ldd_output = subprocess.check_output(['ldd', '/run/current-system/sw/bin/sh'])
ldd_output = subprocess.check_output(
['ldd', '/run/current-system/sw/bin/sh'])
ldd_output = ldd_output.strip().decode(default_encoding)
except subprocess.CalledProcessError as e:
print("warning: unable to call ldd: %s" % e)
@ -259,17 +277,21 @@ class RustBuild(object):
correct_interpreter = loader_path + loader
try:
subprocess.check_output(["patchelf", "--set-interpreter", correct_interpreter, fname])
subprocess.check_output(
["patchelf", "--set-interpreter", correct_interpreter, fname])
except subprocess.CalledProcessError as e:
print("warning: failed to call patchelf: %s" % e)
return
def stage0_rustc_date(self):
return self._rustc_date
def stage0_date(self):
return self._date
def stage0_rustc_channel(self):
return self._rustc_channel
def stage0_cargo_channel(self):
return self._cargo_channel
def rustc_stamp(self):
return os.path.join(self.bin_root(), '.rustc-stamp')
@ -280,21 +302,23 @@ class RustBuild(object):
if not os.path.exists(self.rustc_stamp()) or self.clean:
return True
with open(self.rustc_stamp(), 'r') as f:
return self.stage0_rustc_date() != f.read()
return self.stage0_date() != f.read()
def cargo_out_of_date(self):
if not os.path.exists(self.cargo_stamp()) or self.clean:
return True
with open(self.cargo_stamp(), 'r') as f:
return self.stage0_rustc_date() != f.read()
return self.stage0_date() != f.read()
def bin_root(self):
return os.path.join(self.build_dir, self.build, "stage0")
def get_toml(self, key):
for line in self.config_toml.splitlines():
if line.startswith(key + ' ='):
return self.get_string(line)
match = re.match(r'^{}\s*=(.*)$'.format(key), line)
if match is not None:
value = match.group(1)
return self.get_string(value) or value.strip()
return None
def get_mk(self, key):
@ -325,6 +349,8 @@ class RustBuild(object):
def get_string(self, line):
start = line.find('"')
if start == -1:
return None
end = start + 1 + line[start + 1:].find('"')
return line[start + 1:end]
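The rewritten `get_toml` above switches from `line.startswith(key + ' =')` to a regex so that keys followed by flexible whitespace still match, and it falls back to the stripped raw value when `get_string` finds no quotes. A standalone sketch of that lookup (with `re.escape` added defensively here; upstream interpolates the key into the pattern directly):

```python
import re

def get_string(line):
    """Extract the first double-quoted string from a line, or None."""
    start = line.find('"')
    if start == -1:
        return None
    end = start + 1 + line[start + 1:].find('"')
    return line[start + 1:end]

def get_toml(config_toml, key):
    """Find `key = value` in a flat TOML-ish config, returning the quoted
    string if present, else the raw value stripped of whitespace."""
    for line in config_toml.splitlines():
        match = re.match(r'^{}\s*=(.*)$'.format(re.escape(key)), line)
        if match is not None:
            value = match.group(1)
            return get_string(value) or value.strip()
    return None
```

The `or value.strip()` fallback is what lets unquoted values such as `submodules = false` come back as the string `'false'` rather than `None`.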
@ -367,23 +393,24 @@ class RustBuild(object):
env["DYLD_LIBRARY_PATH"] = os.path.join(self.bin_root(), "lib") + \
(os.pathsep + env["DYLD_LIBRARY_PATH"]) \
if "DYLD_LIBRARY_PATH" in env else ""
env["LIBRARY_PATH"] = os.path.join(self.bin_root(), "lib") + \
(os.pathsep + env["LIBRARY_PATH"]) \
if "LIBRARY_PATH" in env else ""
env["PATH"] = os.path.join(self.bin_root(), "bin") + \
os.pathsep + env["PATH"]
if not os.path.isfile(self.cargo()):
raise Exception("no cargo executable found at `%s`" % self.cargo())
args = [self.cargo(), "build", "--manifest-path",
os.path.join(self.rust_root, "src/bootstrap/Cargo.toml")]
if self.verbose:
args.append("--verbose")
if self.verbose > 1:
args.append("--verbose")
if self.use_locked_deps:
args.append("--locked")
if self.use_vendored_sources:
args.append("--frozen")
self.run(args, env)
def run(self, args, env):
proc = subprocess.Popen(args, env=env)
ret = proc.wait()
if ret != 0:
sys.exit(ret)
run(args, env=env, verbose=self.verbose)
def build_triple(self):
default_encoding = sys.getdefaultencoding()
@ -394,8 +421,10 @@ class RustBuild(object):
if config:
return config
try:
ostype = subprocess.check_output(['uname', '-s']).strip().decode(default_encoding)
cputype = subprocess.check_output(['uname', '-m']).strip().decode(default_encoding)
ostype = subprocess.check_output(
['uname', '-s']).strip().decode(default_encoding)
cputype = subprocess.check_output(
['uname', '-m']).strip().decode(default_encoding)
except (subprocess.CalledProcessError, OSError):
if sys.platform == 'win32':
return 'x86_64-pc-windows-msvc'
@ -407,6 +436,11 @@ class RustBuild(object):
# The goal here is to come up with the same triple as LLVM would,
# at least for the subset of platforms we're willing to target.
if ostype == 'Linux':
os_from_sp = subprocess.check_output(
['uname', '-o']).strip().decode(default_encoding)
if os_from_sp == 'Android':
ostype = 'linux-android'
else:
ostype = 'unknown-linux-gnu'
elif ostype == 'FreeBSD':
ostype = 'unknown-freebsd'
@ -464,15 +498,21 @@ class RustBuild(object):
cputype = 'i686'
elif cputype in {'xscale', 'arm'}:
cputype = 'arm'
if ostype == 'linux-android':
ostype = 'linux-androideabi'
elif cputype == 'armv6l':
cputype = 'arm'
if ostype == 'linux-android':
ostype = 'linux-androideabi'
else:
ostype += 'eabihf'
elif cputype in {'armv7l', 'armv8l'}:
cputype = 'armv7'
if ostype == 'linux-android':
ostype = 'linux-androideabi'
else:
ostype += 'eabihf'
elif cputype == 'aarch64':
cputype = 'aarch64'
elif cputype == 'arm64':
elif cputype in {'aarch64', 'arm64'}:
cputype = 'aarch64'
elif cputype == 'mips':
if sys.byteorder == 'big':
@ -512,6 +552,32 @@ class RustBuild(object):
return "{}-{}".format(cputype, ostype)
def update_submodules(self):
if (not os.path.exists(os.path.join(self.rust_root, ".git"))) or \
self.get_toml('submodules') == "false" or \
self.get_mk('CFG_DISABLE_MANAGE_SUBMODULES') == "1":
return
print('Updating submodules')
default_encoding = sys.getdefaultencoding()
run(["git", "submodule", "-q", "sync"], cwd=self.rust_root)
submodules = [s.split(' ', 1)[1] for s in subprocess.check_output(
["git", "config", "--file", os.path.join(self.rust_root, ".gitmodules"),
"--get-regexp", "path"]
).decode(default_encoding).splitlines()]
submodules = [module for module in submodules
if not ((module.endswith("llvm") and
(self.get_toml('llvm-config') or self.get_mk('CFG_LLVM_ROOT'))) or
(module.endswith("jemalloc") and
(self.get_toml('jemalloc') or self.get_mk('CFG_JEMALLOC_ROOT'))))
]
run(["git", "submodule", "update",
"--init"] + submodules, cwd=self.rust_root, verbose=self.verbose)
run(["git", "submodule", "-q", "foreach", "git",
"reset", "-q", "--hard"], cwd=self.rust_root, verbose=self.verbose)
run(["git", "submodule", "-q", "foreach", "git",
"clean", "-qdfx"], cwd=self.rust_root, verbose=self.verbose)
def bootstrap():
parser = argparse.ArgumentParser(description='Build rust')
parser.add_argument('--config')
@ -540,6 +606,11 @@ def bootstrap():
except:
pass
if '\nverbose = 2' in rb.config_toml:
rb.verbose = 2
elif '\nverbose = 1' in rb.config_toml:
rb.verbose = 1
rb.use_vendored_sources = '\nvendor = true' in rb.config_toml or \
'CFG_ENABLE_VENDOR' in rb.config_mk
@ -558,7 +629,7 @@ def bootstrap():
if rb.use_vendored_sources:
if not os.path.exists('.cargo'):
os.makedirs('.cargo')
with open('.cargo/config','w') as f:
with open('.cargo/config', 'w') as f:
f.write("""
[source.crates-io]
replace-with = 'vendored-sources'
@ -572,7 +643,15 @@ def bootstrap():
shutil.rmtree('.cargo')
data = stage0_data(rb.rust_root)
rb._rustc_channel, rb._rustc_date = data['rustc'].split('-', 1)
rb._date = data['date']
rb._rustc_channel = data['rustc']
rb._cargo_channel = data['cargo']
if 'dev' in data:
rb._download_url = 'https://dev-static.rust-lang.org'
else:
rb._download_url = 'https://static.rust-lang.org'
rb.update_submodules()
# Fetch/build the bootstrap
rb.build = rb.build_triple()
@ -588,15 +667,18 @@ def bootstrap():
env["BUILD"] = rb.build
env["SRC"] = rb.rust_root
env["BOOTSTRAP_PARENT_ID"] = str(os.getpid())
rb.run(args, env)
run(args, env=env, verbose=rb.verbose)
def main():
start_time = time()
help_triggered = ('-h' in sys.argv) or ('--help' in sys.argv) or (len(sys.argv) == 1)
help_triggered = (
'-h' in sys.argv) or ('--help' in sys.argv) or (len(sys.argv) == 1)
try:
bootstrap()
if not help_triggered:
print("Build completed successfully in %s" % format_build_time(time() - start_time))
print("Build completed successfully in %s" %
format_build_time(time() - start_time))
except (SystemExit, KeyboardInterrupt) as e:
if hasattr(e, 'code') and isinstance(e.code, int):
exit_code = e.code
@ -604,7 +686,8 @@ def main():
exit_code = 1
print(e)
if not help_triggered:
print("Build completed unsuccessfully in %s" % format_build_time(time() - start_time))
print("Build completed unsuccessfully in %s" %
format_build_time(time() - start_time))
sys.exit(exit_code)
if __name__ == '__main__':

View File

@ -23,7 +23,7 @@ use build_helper::output;
use Build;
// The version number
pub const CFG_RELEASE_NUM: &'static str = "1.18.0";
pub const CFG_RELEASE_NUM: &'static str = "1.19.0";
// An optional number to put after the label, e.g. '.2' -> '-beta.2'
// Be sure this starts with a dot to conform to semver pre-release

View File

@ -28,7 +28,7 @@ use {Build, Compiler, Mode};
use dist;
use util::{self, dylib_path, dylib_path_var, exe};
const ADB_TEST_DIR: &'static str = "/data/tmp";
const ADB_TEST_DIR: &'static str = "/data/tmp/work";
/// The two modes of the test runner; tests or benchmarks.
#[derive(Copy, Clone)]
@ -58,6 +58,28 @@ impl fmt::Display for TestKind {
}
}
fn try_run(build: &Build, cmd: &mut Command) {
if build.flags.cmd.no_fail_fast() {
if !build.try_run(cmd) {
let failures = build.delayed_failures.get();
build.delayed_failures.set(failures + 1);
}
} else {
build.run(cmd);
}
}
fn try_run_quiet(build: &Build, cmd: &mut Command) {
if build.flags.cmd.no_fail_fast() {
if !build.try_run_quiet(cmd) {
let failures = build.delayed_failures.get();
build.delayed_failures.set(failures + 1);
}
} else {
build.run_quiet(cmd);
}
}
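The `try_run` helpers above implement a "no fail fast" mode: with the flag set, a failing command bumps a delayed-failure counter instead of aborting, so later suites still run. A minimal Python sketch of the same pattern (the Rust version spawns real `Command`s; here a boolean stands in for the command's exit status):

```python
class Runner:
    def __init__(self, no_fail_fast):
        self.no_fail_fast = no_fail_fast
        self.delayed_failures = 0

    def try_run(self, cmd_ok):
        # cmd_ok stands in for actually spawning the command and
        # checking its exit status.
        if self.no_fail_fast:
            if not cmd_ok:
                self.delayed_failures += 1
        elif not cmd_ok:
            raise RuntimeError('command failed')

r = Runner(no_fail_fast=True)
for ok in [True, False, True, False]:
    r.try_run(ok)
print(r.delayed_failures)
```

The build can then report the accumulated failure count at the very end instead of stopping at the first broken suite.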
/// Runs the `linkchecker` tool as compiled in `stage` by the `host` compiler.
///
/// This tool in `src/tools` will verify the validity of all our links in the
@ -67,7 +89,7 @@ pub fn linkcheck(build: &Build, host: &str) {
let compiler = Compiler::new(0, host);
let _time = util::timeit();
build.run(build.tool_cmd(&compiler, "linkchecker")
try_run(build, build.tool_cmd(&compiler, "linkchecker")
.arg(build.out.join(host).join("doc")));
}
@ -78,14 +100,6 @@ pub fn linkcheck(build: &Build, host: &str) {
pub fn cargotest(build: &Build, stage: u32, host: &str) {
let ref compiler = Compiler::new(stage, host);
// Configure PATH to find the right rustc. NB. we have to use PATH
// and not RUSTC because the Cargo test suite has tests that will
// fail if rustc is not spelled `rustc`.
let path = build.sysroot(compiler).join("bin");
let old_path = ::std::env::var("PATH").expect("");
let sep = if cfg!(windows) { ";" } else {":" };
let ref newpath = format!("{}{}{}", path.display(), sep, old_path);
// Note that this is a short, cryptic, and not scoped directory name. This
// is currently to minimize the length of path on Windows where we otherwise
// quickly run into path name limit constraints.
@ -95,9 +109,38 @@ pub fn cargotest(build: &Build, stage: u32, host: &str) {
let _time = util::timeit();
let mut cmd = Command::new(build.tool(&Compiler::new(0, host), "cargotest"));
build.prepare_tool_cmd(compiler, &mut cmd);
build.run(cmd.env("PATH", newpath)
.arg(&build.cargo)
.arg(&out_dir));
try_run(build, cmd.arg(&build.cargo)
.arg(&out_dir)
.env("RUSTC", build.compiler_path(compiler))
.env("RUSTDOC", build.rustdoc(compiler)));
}
/// Runs `cargo test` for `cargo` packaged with Rust.
pub fn cargo(build: &Build, stage: u32, host: &str) {
let ref compiler = Compiler::new(stage, host);
// Configure PATH to find the right rustc. NB. we have to use PATH
// and not RUSTC because the Cargo test suite has tests that will
// fail if rustc is not spelled `rustc`.
let path = build.sysroot(compiler).join("bin");
let old_path = ::std::env::var("PATH").expect("");
    let sep = if cfg!(windows) { ";" } else { ":" };
let ref newpath = format!("{}{}{}", path.display(), sep, old_path);
let mut cargo = build.cargo(compiler, Mode::Tool, host, "test");
cargo.arg("--manifest-path").arg(build.src.join("src/tools/cargo/Cargo.toml"));
if build.flags.cmd.no_fail_fast() {
cargo.arg("--no-fail-fast");
}
// Don't build tests dynamically, just a pain to work with
cargo.env("RUSTC_NO_PREFER_DYNAMIC", "1");
// Don't run cross-compile tests, we may not have cross-compiled libstd libs
// available.
cargo.env("CFG_DISABLE_CROSS_TESTS", "1");
try_run(build, cargo.env("PATH", newpath));
}
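The PATH juggling above (prepend the sysroot's `bin` so the suite finds a binary literally named `rustc`) has a one-line Python equivalent, since `os.pathsep` already encodes the `;`-on-Windows / `:`-elsewhere distinction that the `cfg!(windows)` check handles in Rust:

```python
import os

def prepend_path(bin_dir, old_path):
    # Put the freshly built toolchain's bin directory first on PATH so
    # the Cargo test suite resolves `rustc` to the staged compiler.
    return bin_dir + os.pathsep + old_path

print(prepend_path('/sysroot/bin', '/usr/bin'))
```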
/// Runs the `tidy` tool as compiled in `stage` by the `host` compiler.
@ -106,6 +149,7 @@ pub fn cargotest(build: &Build, stage: u32, host: &str) {
/// otherwise just implements a few lint-like checks that are specific to the
/// compiler itself.
pub fn tidy(build: &Build, host: &str) {
let _folder = build.fold_output(|| "tidy");
println!("tidy check ({})", host);
let compiler = Compiler::new(0, host);
let mut cmd = build.tool_cmd(&compiler, "tidy");
@ -113,7 +157,10 @@ pub fn tidy(build: &Build, host: &str) {
if !build.config.vendor {
cmd.arg("--no-vendor");
}
build.run(&mut cmd);
if build.config.quiet_tests {
cmd.arg("--quiet");
}
try_run(build, &mut cmd);
}
fn testdir(build: &Build, host: &str) -> PathBuf {
@ -130,6 +177,7 @@ pub fn compiletest(build: &Build,
target: &str,
mode: &str,
suite: &str) {
let _folder = build.fold_output(|| format!("test_{}", suite));
println!("Check compiletest suite={} mode={} ({} -> {})",
suite, mode, compiler.host, target);
let mut cmd = Command::new(build.tool(&Compiler::new(0, compiler.host),
@ -225,10 +273,10 @@ pub fn compiletest(build: &Build,
.arg("--llvm-cxxflags").arg("");
}
if build.qemu_rootfs(target).is_some() {
cmd.arg("--qemu-test-client")
if build.remote_tested(target) {
cmd.arg("--remote-test-client")
.arg(build.tool(&Compiler::new(0, &build.config.build),
"qemu-test-client"));
"remote-test-client"));
}
// Running a C compiler on MSVC requires a few env vars to be set, to be
@ -260,8 +308,10 @@ pub fn compiletest(build: &Build,
cmd.arg("--android-cross-path").arg("");
}
build.ci_env.force_coloring_in_ci(&mut cmd);
let _time = util::timeit();
build.run(&mut cmd);
try_run(build, &mut cmd);
}
/// Run `rustdoc --test` for all documentation in `src/doc`.
@ -274,6 +324,7 @@ pub fn docs(build: &Build, compiler: &Compiler) {
// tests for all files that end in `*.md`
let mut stack = vec![build.src.join("src/doc")];
let _time = util::timeit();
let _folder = build.fold_output(|| "test_docs");
while let Some(p) = stack.pop() {
if p.is_dir() {
@ -307,6 +358,7 @@ pub fn docs(build: &Build, compiler: &Compiler) {
/// generate a markdown file from the error indexes of the code base which is
/// then passed to `rustdoc --test`.
pub fn error_index(build: &Build, compiler: &Compiler) {
let _folder = build.fold_output(|| "test_error_index");
println!("Testing error-index stage{}", compiler.stage);
let dir = testdir(build, compiler.host);
@ -331,13 +383,14 @@ fn markdown_test(build: &Build, compiler: &Compiler, markdown: &Path) {
cmd.arg(markdown);
cmd.env("RUSTC_BOOTSTRAP", "1");
let mut test_args = build.flags.cmd.test_args().join(" ");
if build.config.quiet_tests {
test_args.push_str(" --quiet");
}
let test_args = build.flags.cmd.test_args().join(" ");
cmd.arg("--test-args").arg(test_args);
build.run(&mut cmd);
if build.config.quiet_tests {
try_run_quiet(build, &mut cmd);
} else {
try_run(build, &mut cmd);
}
}
/// Run all unit tests plus documentation tests for an entire crate DAG defined
@ -366,6 +419,9 @@ pub fn krate(build: &Build,
}
_ => panic!("can only test libraries"),
};
let _folder = build.fold_output(|| {
format!("{}_stage{}-{}", test_kind.subcommand(), compiler.stage, name)
});
println!("{} {} stage{} ({} -> {})", test_kind, name, compiler.stage,
compiler.host, target);
@ -388,6 +444,9 @@ pub fn krate(build: &Build,
cargo.arg("--manifest-path")
.arg(build.src.join(path).join("Cargo.toml"))
.arg("--features").arg(features);
if test_kind.subcommand() == "test" && build.flags.cmd.no_fail_fast() {
cargo.arg("--no-fail-fast");
}
match krate {
Some(krate) => {
@ -427,9 +486,7 @@ pub fn krate(build: &Build,
dylib_path.insert(0, build.sysroot_libdir(&compiler, target));
cargo.env(dylib_path_var(), env::join_paths(&dylib_path).unwrap());
if target.contains("android") ||
target.contains("emscripten") ||
build.qemu_rootfs(target).is_some() {
if target.contains("emscripten") || build.remote_tested(target) {
cargo.arg("--no-run");
}
@ -441,65 +498,15 @@ pub fn krate(build: &Build,
let _time = util::timeit();
if target.contains("android") {
build.run(&mut cargo);
krate_android(build, &compiler, target, mode);
} else if target.contains("emscripten") {
if target.contains("emscripten") {
build.run(&mut cargo);
krate_emscripten(build, &compiler, target, mode);
} else if build.qemu_rootfs(target).is_some() {
} else if build.remote_tested(target) {
build.run(&mut cargo);
krate_qemu(build, &compiler, target, mode);
krate_remote(build, &compiler, target, mode);
} else {
cargo.args(&build.flags.cmd.test_args());
build.run(&mut cargo);
}
}
fn krate_android(build: &Build,
compiler: &Compiler,
target: &str,
mode: Mode) {
let mut tests = Vec::new();
let out_dir = build.cargo_out(compiler, mode, target);
find_tests(&out_dir, target, &mut tests);
find_tests(&out_dir.join("deps"), target, &mut tests);
for test in tests {
build.run(Command::new("adb").arg("push").arg(&test).arg(ADB_TEST_DIR));
let test_file_name = test.file_name().unwrap().to_string_lossy();
let log = format!("{}/check-stage{}-T-{}-H-{}-{}.log",
ADB_TEST_DIR,
compiler.stage,
target,
compiler.host,
test_file_name);
let quiet = if build.config.quiet_tests { "--quiet" } else { "" };
let program = format!("(cd {dir}; \
LD_LIBRARY_PATH=./{target} ./{test} \
--logfile {log} \
{quiet} \
{args})",
dir = ADB_TEST_DIR,
target = target,
test = test_file_name,
log = log,
quiet = quiet,
args = build.flags.cmd.test_args().join(" "));
let output = output(Command::new("adb").arg("shell").arg(&program));
println!("{}", output);
t!(fs::create_dir_all(build.out.join("tmp")));
build.run(Command::new("adb")
.arg("pull")
.arg(&log)
.arg(build.out.join("tmp")));
build.run(Command::new("adb").arg("shell").arg("rm").arg(&log));
if !output.contains("result: ok") {
panic!("some tests failed");
}
try_run(build, &mut cargo);
}
}
@ -509,7 +516,6 @@ fn krate_emscripten(build: &Build,
mode: Mode) {
let mut tests = Vec::new();
let out_dir = build.cargo_out(compiler, mode, target);
find_tests(&out_dir, target, &mut tests);
find_tests(&out_dir.join("deps"), target, &mut tests);
for test in tests {
@ -521,21 +527,20 @@ fn krate_emscripten(build: &Build,
if build.config.quiet_tests {
cmd.arg("--quiet");
}
build.run(&mut cmd);
try_run(build, &mut cmd);
}
}
fn krate_qemu(build: &Build,
fn krate_remote(build: &Build,
compiler: &Compiler,
target: &str,
mode: Mode) {
let mut tests = Vec::new();
let out_dir = build.cargo_out(compiler, mode, target);
find_tests(&out_dir, target, &mut tests);
find_tests(&out_dir.join("deps"), target, &mut tests);
let tool = build.tool(&Compiler::new(0, &build.config.build),
"qemu-test-client");
"remote-test-client");
for test in tests {
let mut cmd = Command::new(&tool);
cmd.arg("run")
@ -544,11 +549,10 @@ fn krate_qemu(build: &Build,
cmd.arg("--quiet");
}
cmd.args(&build.flags.cmd.test_args());
build.run(&mut cmd);
try_run(build, &mut cmd);
}
}
fn find_tests(dir: &Path,
target: &str,
dst: &mut Vec<PathBuf>) {
@ -566,60 +570,29 @@ fn find_tests(dir: &Path,
}
}
pub fn emulator_copy_libs(build: &Build, compiler: &Compiler, target: &str) {
if target.contains("android") {
android_copy_libs(build, compiler, target)
} else if let Some(s) = build.qemu_rootfs(target) {
qemu_copy_libs(build, compiler, target, s)
pub fn remote_copy_libs(build: &Build, compiler: &Compiler, target: &str) {
if !build.remote_tested(target) {
return
}
}
fn android_copy_libs(build: &Build, compiler: &Compiler, target: &str) {
println!("Android copy libs to emulator ({})", target);
build.run(Command::new("adb").arg("wait-for-device"));
build.run(Command::new("adb").arg("remount"));
build.run(Command::new("adb").args(&["shell", "rm", "-r", ADB_TEST_DIR]));
build.run(Command::new("adb").args(&["shell", "mkdir", ADB_TEST_DIR]));
build.run(Command::new("adb")
.arg("push")
.arg(build.src.join("src/etc/adb_run_wrapper.sh"))
.arg(ADB_TEST_DIR));
let target_dir = format!("{}/{}", ADB_TEST_DIR, target);
build.run(Command::new("adb").args(&["shell", "mkdir", &target_dir]));
for f in t!(build.sysroot_libdir(compiler, target).read_dir()) {
let f = t!(f);
let name = f.file_name().into_string().unwrap();
if util::is_dylib(&name) {
build.run(Command::new("adb")
.arg("push")
.arg(f.path())
.arg(&target_dir));
}
}
}
fn qemu_copy_libs(build: &Build,
compiler: &Compiler,
target: &str,
rootfs: &Path) {
println!("QEMU copy libs to emulator ({})", target);
assert!(target.starts_with("arm"), "only works with arm for now");
println!("REMOTE copy libs to emulator ({})", target);
t!(fs::create_dir_all(build.out.join("tmp")));
// Copy our freshly compiled test server over to the rootfs
let server = build.cargo_out(compiler, Mode::Tool, target)
.join(exe("qemu-test-server", target));
t!(fs::copy(&server, rootfs.join("testd")));
.join(exe("remote-test-server", target));
// Spawn the emulator and wait for it to come online
let tool = build.tool(&Compiler::new(0, &build.config.build),
"qemu-test-client");
build.run(Command::new(&tool)
.arg("spawn-emulator")
.arg(rootfs)
.arg(build.out.join("tmp")));
"remote-test-client");
let mut cmd = Command::new(&tool);
cmd.arg("spawn-emulator")
.arg(target)
.arg(&server)
.arg(build.out.join("tmp"));
if let Some(rootfs) = build.qemu_rootfs(target) {
cmd.arg(rootfs);
}
build.run(&mut cmd);
// Push all our dylibs to the emulator
for f in t!(build.sysroot_libdir(compiler, target).read_dir()) {
@ -645,6 +618,7 @@ pub fn distcheck(build: &Build) {
return
}
println!("Distcheck");
let dir = build.out.join("tmp").join("distcheck");
let _ = fs::remove_dir_all(&dir);
t!(fs::create_dir_all(&dir));
@ -662,6 +636,26 @@ pub fn distcheck(build: &Build) {
build.run(Command::new(build_helper::make(&build.config.build))
.arg("check")
.current_dir(&dir));
// Now make sure that rust-src has all of libstd's dependencies
println!("Distcheck rust-src");
let dir = build.out.join("tmp").join("distcheck-src");
let _ = fs::remove_dir_all(&dir);
t!(fs::create_dir_all(&dir));
let mut cmd = Command::new("tar");
cmd.arg("-xzf")
.arg(dist::rust_src_installer(build))
.arg("--strip-components=1")
.current_dir(&dir);
build.run(&mut cmd);
let toml = dir.join("rust-src/lib/rustlib/src/rust/src/libstd/Cargo.toml");
build.run(Command::new(&build.cargo)
.arg("generate-lockfile")
.arg("--manifest-path")
.arg(&toml)
.current_dir(&dir));
}
/// Test the build system itself
@ -671,6 +665,9 @@ pub fn bootstrap(build: &Build) {
.current_dir(build.src.join("src/bootstrap"))
.env("CARGO_TARGET_DIR", build.out.join("bootstrap"))
.env("RUSTC", &build.rustc);
if build.flags.cmd.no_fail_fast() {
cmd.arg("--no-fail-fast");
}
cmd.arg("--").args(&build.flags.cmd.test_args());
build.run(&mut cmd);
try_run(build, &mut cmd);
}

View File

@ -16,14 +16,17 @@
//! compiler. This module is also responsible for assembling the sysroot as it
//! goes along from the output of the previous stage.
use std::collections::HashMap;
use std::fs::{self, File};
use std::path::{Path, PathBuf};
use std::process::Command;
use std::env;
use std::fs::{self, File};
use std::io::BufReader;
use std::io::prelude::*;
use std::path::{Path, PathBuf};
use std::process::{Command, Stdio};
use std::str;
use build_helper::{output, mtime, up_to_date};
use filetime::FileTime;
use rustc_serialize::json;
use channel::GitInfo;
use util::{exe, libdir, is_dylib, copy};
@ -38,6 +41,7 @@ pub fn std(build: &Build, target: &str, compiler: &Compiler) {
let libdir = build.sysroot_libdir(compiler, target);
t!(fs::create_dir_all(&libdir));
let _folder = build.fold_output(|| format!("stage{}-std", compiler.stage));
println!("Building stage{} std artifacts ({} -> {})", compiler.stage,
compiler.host, target);
@ -84,8 +88,9 @@ pub fn std(build: &Build, target: &str, compiler: &Compiler) {
}
}
build.run(&mut cargo);
update_mtime(build, &libstd_stamp(build, &compiler, target));
run_cargo(build,
&mut cargo,
&libstd_stamp(build, &compiler, target));
}
/// Link all libstd rlibs/dylibs into the sysroot location.
@ -106,15 +111,19 @@ pub fn std_link(build: &Build,
compiler.host,
target_compiler.host,
target);
let libdir = build.sysroot_libdir(&target_compiler, target);
let out_dir = build.cargo_out(&compiler, Mode::Libstd, target);
t!(fs::create_dir_all(&libdir));
add_to_sysroot(&out_dir, &libdir);
let libdir = build.sysroot_libdir(target_compiler, target);
add_to_sysroot(&libdir, &libstd_stamp(build, compiler, target));
if target.contains("musl") && !target.contains("mips") {
copy_musl_third_party_objects(build, target, &libdir);
}
if build.config.sanitizers && compiler.stage != 0 && target == "x86_64-apple-darwin" {
// The sanitizers are only built in stage1 or above, so the dylibs will
// be missing in stage0 and causes panic. See the `std()` function above
// for reason why the sanitizers are not built in stage0.
copy_apple_sanitizer_dylibs(&build.native_dir(target), "osx", &libdir);
}
}
/// Copies the crt(1,i,n).o startup objects
@ -126,6 +135,18 @@ fn copy_musl_third_party_objects(build: &Build, target: &str, into: &Path) {
}
}
fn copy_apple_sanitizer_dylibs(native_dir: &Path, platform: &str, into: &Path) {
for &sanitizer in &["asan", "tsan"] {
let filename = format!("libclang_rt.{}_{}_dynamic.dylib", sanitizer, platform);
let mut src_path = native_dir.join(sanitizer);
src_path.push("build");
src_path.push("lib");
src_path.push("darwin");
src_path.push(&filename);
copy(&src_path, &into.join(filename));
}
}
/// Build and prepare startup objects like rsbegin.o and rsend.o
///
/// These are primarily used on Windows right now for linking executables/dlls.
@ -172,6 +193,7 @@ pub fn build_startup_objects(build: &Build, for_compiler: &Compiler, target: &st
/// the build using the `compiler` targeting the `target` architecture. The
/// artifacts created will also be linked into the sysroot directory.
pub fn test(build: &Build, target: &str, compiler: &Compiler) {
let _folder = build.fold_output(|| format!("stage{}-test", compiler.stage));
println!("Building stage{} test artifacts ({} -> {})", compiler.stage,
compiler.host, target);
let out_dir = build.cargo_out(compiler, Mode::Libtest, target);
@ -182,8 +204,9 @@ pub fn test(build: &Build, target: &str, compiler: &Compiler) {
}
cargo.arg("--manifest-path")
.arg(build.src.join("src/libtest/Cargo.toml"));
build.run(&mut cargo);
update_mtime(build, &libtest_stamp(build, compiler, target));
run_cargo(build,
&mut cargo,
&libtest_stamp(build, compiler, target));
}
/// Same as `std_link`, only for libtest
@ -197,9 +220,8 @@ pub fn test_link(build: &Build,
compiler.host,
target_compiler.host,
target);
let libdir = build.sysroot_libdir(&target_compiler, target);
let out_dir = build.cargo_out(&compiler, Mode::Libtest, target);
add_to_sysroot(&out_dir, &libdir);
add_to_sysroot(&build.sysroot_libdir(target_compiler, target),
&libtest_stamp(build, compiler, target));
}
/// Build the compiler.
@ -208,6 +230,7 @@ pub fn test_link(build: &Build,
/// the `compiler` targeting the `target` architecture. The artifacts
/// created will also be linked into the sysroot directory.
pub fn rustc(build: &Build, target: &str, compiler: &Compiler) {
let _folder = build.fold_output(|| format!("stage{}-rustc", compiler.stage));
println!("Building stage{} compiler artifacts ({} -> {})",
compiler.stage, compiler.host, target);
@ -275,8 +298,9 @@ pub fn rustc(build: &Build, target: &str, compiler: &Compiler) {
if let Some(ref s) = build.config.rustc_default_ar {
cargo.env("CFG_DEFAULT_AR", s);
}
build.run(&mut cargo);
update_mtime(build, &librustc_stamp(build, compiler, target));
run_cargo(build,
&mut cargo,
&librustc_stamp(build, compiler, target));
}
/// Same as `std_link`, only for librustc
@ -290,9 +314,8 @@ pub fn rustc_link(build: &Build,
compiler.host,
target_compiler.host,
target);
let libdir = build.sysroot_libdir(&target_compiler, target);
let out_dir = build.cargo_out(&compiler, Mode::Librustc, target);
add_to_sysroot(&out_dir, &libdir);
add_to_sysroot(&build.sysroot_libdir(target_compiler, target),
&librustc_stamp(build, compiler, target));
}
/// Cargo's output path for the standard library in a given stage, compiled
@ -378,39 +401,17 @@ pub fn assemble_rustc(build: &Build, stage: u32, host: &str) {
/// Link some files into a rustc sysroot.
///
/// For a particular stage this will link all of the contents of `out_dir`
/// into the sysroot of the `host` compiler, assuming the artifacts are
/// compiled for the specified `target`.
fn add_to_sysroot(out_dir: &Path, sysroot_dst: &Path) {
// Collect the set of all files in the dependencies directory, keyed
// off the name of the library. We assume everything is of the form
// `foo-<hash>.{rlib,so,...}`, and there could be multiple different
// `<hash>` values for the same name (of old builds).
let mut map = HashMap::new();
for file in t!(fs::read_dir(out_dir.join("deps"))).map(|f| t!(f)) {
let filename = file.file_name().into_string().unwrap();
// We're only interested in linking rlibs + dylibs, other things like
// unit tests don't get linked in
if !filename.ends_with(".rlib") &&
!filename.ends_with(".lib") &&
!is_dylib(&filename) {
/// For a particular stage this will link the files listed in `stamp` into the
/// `sysroot_dst` provided.
fn add_to_sysroot(sysroot_dst: &Path, stamp: &Path) {
t!(fs::create_dir_all(&sysroot_dst));
let mut contents = Vec::new();
t!(t!(File::open(stamp)).read_to_end(&mut contents));
for part in contents.split(|b| *b == 0) {
if part.is_empty() {
continue
}
let file = file.path();
let dash = filename.find("-").unwrap();
let key = (filename[..dash].to_string(),
file.extension().unwrap().to_owned());
map.entry(key).or_insert(Vec::new())
.push(file.clone());
}
// For all hash values found, pick the most recent one to move into the
// sysroot, that should be the one we just built.
for (_, paths) in map {
let (_, path) = paths.iter().map(|path| {
(mtime(&path).seconds(), path)
}).max().unwrap();
let path = Path::new(t!(str::from_utf8(part)));
copy(&path, &sysroot_dst.join(path.file_name().unwrap()));
}
}
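The new `add_to_sysroot` no longer scans a directory; it reads a stamp file whose format is simply NUL-separated artifact paths. A sketch of that parse in Python:

```python
def read_stamp(contents):
    # A stamp file holds NUL-separated paths of the artifacts Cargo
    # produced; splitting can yield a trailing empty part, which is skipped.
    return [part.decode() for part in contents.split(b'\0') if part]

stamp = b'/build/deps/libstd-abc123.rlib\0/build/deps/libcore-def456.rlib\0'
print(read_stamp(stamp))
```

Each recovered path is then copied into the sysroot under its own file name, which is exactly what the `copy(&path, ...)` call above does.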
@ -437,15 +438,13 @@ pub fn maybe_clean_tools(build: &Build, stage: u32, target: &str, mode: Mode) {
/// This will build the specified tool with the specified `host` compiler in
/// `stage` into the normal cargo output directory.
pub fn tool(build: &Build, stage: u32, target: &str, tool: &str) {
let _folder = build.fold_output(|| format!("stage{}-{}", stage, tool));
println!("Building stage{} tool {} ({})", stage, tool, target);
let compiler = Compiler::new(stage, &build.config.build);
let mut cargo = build.cargo(&compiler, Mode::Tool, target, "build");
let mut dir = build.src.join(tool);
if !dir.exists() {
dir = build.src.join("src/tools").join(tool);
}
let dir = build.src.join("src/tools").join(tool);
cargo.arg("--manifest-path").arg(dir.join("Cargo.toml"));
// We don't want to build tools dynamically as they'll be running across
@ -474,40 +473,148 @@ pub fn tool(build: &Build, stage: u32, target: &str, tool: &str) {
build.run(&mut cargo);
}
/// Updates the mtime of a stamp file if necessary, only changing it if it's
/// older than some other library file in the same directory.
///
/// We don't know what file Cargo is going to output (because there's a hash in
/// the file name) but we know where it's going to put it. We use this helper to
/// detect changes to that output file by looking at the modification time for
/// all files in a directory and updating the stamp if any are newer.
///
/// Note that we only consider Rust libraries as that's what we're interested in
/// propagating changes from. Files like executables are tracked elsewhere.
fn update_mtime(build: &Build, path: &Path) {
let entries = match path.parent().unwrap().join("deps").read_dir() {
Ok(entries) => entries,
Err(_) => return,
};
let files = entries.map(|e| t!(e)).filter(|e| t!(e.file_type()).is_file());
let files = files.filter(|e| {
let filename = e.file_name();
let filename = filename.to_str().unwrap();
filename.ends_with(".rlib") ||
filename.ends_with(".lib") ||
is_dylib(&filename)
});
let max = files.max_by_key(|entry| {
let meta = t!(entry.metadata());
FileTime::from_last_modification_time(&meta)
});
let max = match max {
Some(max) => max,
None => return,
fn run_cargo(build: &Build, cargo: &mut Command, stamp: &Path) {
// Instruct Cargo to give us json messages on stdout, critically leaving
// stderr as piped so we can get those pretty colors.
cargo.arg("--message-format").arg("json")
.stdout(Stdio::piped());
build.verbose(&format!("running: {:?}", cargo));
let mut child = match cargo.spawn() {
Ok(child) => child,
Err(e) => panic!("failed to execute command: {:?}\nerror: {}", cargo, e),
};
if mtime(&max.path()) > mtime(path) {
build.verbose(&format!("updating {:?} as {:?} changed", path, max.path()));
t!(File::create(path));
// `target_root_dir` looks like $dir/$target/release
let target_root_dir = stamp.parent().unwrap();
// `target_deps_dir` looks like $dir/$target/release/deps
let target_deps_dir = target_root_dir.join("deps");
// `host_root_dir` looks like $dir/release
let host_root_dir = target_root_dir.parent().unwrap() // chop off `release`
.parent().unwrap() // chop off `$target`
.join(target_root_dir.file_name().unwrap());
// Spawn Cargo slurping up its JSON output. We'll start building up the
// `deps` array of all files it generated along with a `toplevel` array of
// files we need to probe for later.
let mut deps = Vec::new();
let mut toplevel = Vec::new();
let stdout = BufReader::new(child.stdout.take().unwrap());
for line in stdout.lines() {
let line = t!(line);
let json = if line.starts_with("{") {
t!(line.parse::<json::Json>())
} else {
// If this was informational, just print it out and continue
println!("{}", line);
continue
};
if json.find("reason").and_then(|j| j.as_string()) != Some("compiler-artifact") {
continue
}
for filename in json["filenames"].as_array().unwrap() {
let filename = filename.as_string().unwrap();
// Skip files like executables
if !filename.ends_with(".rlib") &&
!filename.ends_with(".lib") &&
!is_dylib(&filename) {
continue
}
let filename = Path::new(filename);
// If this was an output file in the "host dir" we don't actually
// worry about it, it's not relevant for us.
if filename.starts_with(&host_root_dir) {
continue
// If this was output in the `deps` dir then this is a precise file
// name (hash included) so we start tracking it.
} else if filename.starts_with(&target_deps_dir) {
deps.push(filename.to_path_buf());
// Otherwise this was a "top level artifact" which right now doesn't
// have a hash in the name, but there's a version of this file in
// the `deps` folder which *does* have a hash in the name. That's
// the one we want, so we'll probe for it later.
} else {
toplevel.push((filename.file_stem().unwrap()
.to_str().unwrap().to_string(),
filename.extension().unwrap().to_owned()
.to_str().unwrap().to_string()));
}
}
}
// Make sure Cargo actually succeeded after we read all of its stdout.
let status = t!(child.wait());
if !status.success() {
panic!("command did not execute successfully: {:?}\n\
expected success, got: {}",
cargo,
status);
}
// Ok now we need to actually find all the files listed in `toplevel`. We've
// got a list of prefix/extensions and we basically just need to find the
// most recent file in the `deps` folder corresponding to each one.
let contents = t!(target_deps_dir.read_dir())
.map(|e| t!(e))
.map(|e| (e.path(), e.file_name().into_string().unwrap(), t!(e.metadata())))
.collect::<Vec<_>>();
for (prefix, extension) in toplevel {
let candidates = contents.iter().filter(|&&(_, ref filename, _)| {
filename.starts_with(&prefix[..]) &&
filename[prefix.len()..].starts_with("-") &&
filename.ends_with(&extension[..])
});
let max = candidates.max_by_key(|&&(_, _, ref metadata)| {
FileTime::from_last_modification_time(metadata)
});
let path_to_add = match max {
Some(triple) => triple.0.to_str().unwrap(),
None => panic!("no output generated for {:?} {:?}", prefix, extension),
};
if is_dylib(path_to_add) {
let candidate = format!("{}.lib", path_to_add);
let candidate = PathBuf::from(candidate);
if candidate.exists() {
deps.push(candidate);
}
}
deps.push(path_to_add.into());
}
// Now we want to update the contents of the stamp file, if necessary. First
// we read off the previous contents along with its mtime. If our new
// contents (the list of files to copy) is different or if any dep's mtime
// is newer then we rewrite the stamp file.
deps.sort();
let mut stamp_contents = Vec::new();
if let Ok(mut f) = File::open(stamp) {
t!(f.read_to_end(&mut stamp_contents));
}
let stamp_mtime = mtime(&stamp);
let mut new_contents = Vec::new();
let mut max = None;
let mut max_path = None;
for dep in deps {
let mtime = mtime(&dep);
if Some(mtime) > max {
max = Some(mtime);
max_path = Some(dep.clone());
}
new_contents.extend(dep.to_str().unwrap().as_bytes());
new_contents.extend(b"\0");
}
let max = max.unwrap();
let max_path = max_path.unwrap();
if stamp_contents == new_contents && max <= stamp_mtime {
return
}
if max > stamp_mtime {
build.verbose(&format!("updating {:?} as {:?} changed", stamp, max_path));
} else {
build.verbose(&format!("updating {:?} as deps changed", stamp));
}
t!(t!(File::create(stamp)).write_all(&new_contents));
}
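The stamp logic above rewrites the file only when the recorded contents differ or some dependency is newer than the stamp. A minimal standalone sketch of that freshness check (`stamp_is_stale` is a hypothetical helper, using only the standard library rather than the `filetime` crate used above):

```rust
use std::fs;
use std::path::Path;
use std::time::SystemTime;

// A stamp is stale when its recorded contents differ from what we would
// write now, or when some dependency was modified after the stamp itself.
fn stamp_is_stale(stamp: &Path, new_contents: &[u8], newest_dep: SystemTime) -> bool {
    let old = fs::read(stamp).unwrap_or_default();
    let stamp_mtime = fs::metadata(stamp)
        .and_then(|m| m.modified())
        .unwrap_or(SystemTime::UNIX_EPOCH);
    old != new_contents || newest_dep > stamp_mtime
}
```

Either condition forces a rewrite, which is why the function above logs two different "updating" messages.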


@ -15,7 +15,7 @@
use std::collections::HashMap;
use std::env;
use std::fs::File;
use std::fs::{self, File};
use std::io::prelude::*;
use std::path::PathBuf;
use std::process;
@ -94,12 +94,15 @@ pub struct Config {
pub backtrace: bool, // support for RUST_BACKTRACE
// misc
pub low_priority: bool,
pub channel: String,
pub quiet_tests: bool,
// Fallback musl-root for all targets
pub musl_root: Option<PathBuf>,
pub prefix: Option<PathBuf>,
pub sysconfdir: Option<PathBuf>,
pub docdir: Option<PathBuf>,
pub bindir: Option<PathBuf>,
pub libdir: Option<PathBuf>,
pub libdir_relative: Option<PathBuf>,
pub mandir: Option<PathBuf>,
@ -146,6 +149,7 @@ struct Build {
target: Vec<String>,
cargo: Option<String>,
rustc: Option<String>,
low_priority: Option<bool>,
compiler_docs: Option<bool>,
docs: Option<bool>,
submodules: Option<bool>,
@ -165,9 +169,11 @@ struct Build {
#[derive(RustcDecodable, Default, Clone)]
struct Install {
prefix: Option<String>,
mandir: Option<String>,
sysconfdir: Option<String>,
docdir: Option<String>,
bindir: Option<String>,
libdir: Option<String>,
mandir: Option<String>,
}
/// TOML representation of how the LLVM build is configured.
@ -264,7 +270,7 @@ impl Config {
let table = match p.parse() {
Some(table) => table,
None => {
println!("failed to parse TOML configuration:");
println!("failed to parse TOML configuration '{}':", file.to_str().unwrap());
for err in p.errors.iter() {
let (loline, locol) = p.to_linecol(err.lo);
let (hiline, hicol) = p.to_linecol(err.hi);
@ -302,6 +308,7 @@ impl Config {
config.nodejs = build.nodejs.map(PathBuf::from);
config.gdb = build.gdb.map(PathBuf::from);
config.python = build.python.map(PathBuf::from);
set(&mut config.low_priority, build.low_priority);
set(&mut config.compiler_docs, build.compiler_docs);
set(&mut config.docs, build.docs);
set(&mut config.submodules, build.submodules);
@ -315,9 +322,11 @@ impl Config {
if let Some(ref install) = toml.install {
config.prefix = install.prefix.clone().map(PathBuf::from);
config.mandir = install.mandir.clone().map(PathBuf::from);
config.sysconfdir = install.sysconfdir.clone().map(PathBuf::from);
config.docdir = install.docdir.clone().map(PathBuf::from);
config.bindir = install.bindir.clone().map(PathBuf::from);
config.libdir = install.libdir.clone().map(PathBuf::from);
config.mandir = install.mandir.clone().map(PathBuf::from);
}
if let Some(ref llvm) = toml.llvm {
@ -395,6 +404,12 @@ impl Config {
set(&mut config.rust_dist_src, t.src_tarball);
}
// compat with `./configure` while we're still using that
if fs::metadata("config.mk").is_ok() {
config.update_with_config_mk();
}
return config
}
@ -403,7 +418,7 @@ impl Config {
/// While we still have `./configure` this implements the ability to decode
/// that configuration into this. This isn't exactly a full-blown makefile
/// parser, but hey it gets the job done!
pub fn update_with_config_mk(&mut self) {
fn update_with_config_mk(&mut self) {
let mut config = String::new();
File::open("config.mk").unwrap().read_to_string(&mut config).unwrap();
for line in config.lines() {
@ -523,9 +538,15 @@ impl Config {
"CFG_PREFIX" => {
self.prefix = Some(PathBuf::from(value));
}
"CFG_SYSCONFDIR" => {
self.sysconfdir = Some(PathBuf::from(value));
}
"CFG_DOCDIR" => {
self.docdir = Some(PathBuf::from(value));
}
"CFG_BINDIR" => {
self.bindir = Some(PathBuf::from(value));
}
"CFG_LIBDIR" => {
self.libdir = Some(PathBuf::from(value));
}
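The `CFG_*` arms above come from a simple line-oriented decode of `config.mk`. A hedged sketch of that kind of parsing (`parse_mk_line` is illustrative, and the `KEY := value` line shape is an assumption about the makefile format):

```rust
// Split a makefile-style assignment into its key and value,
// e.g. "CFG_PREFIX := /usr/local" becomes ("CFG_PREFIX", "/usr/local").
fn parse_mk_line(line: &str) -> Option<(&str, &str)> {
    let mut parts = line.splitn(2, ":=");
    let key = parts.next()?.trim();
    let value = parts.next()?.trim();
    Some((key, value))
}
```

Lines without an assignment simply yield `None`, so non-matching lines can be skipped.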


@ -51,7 +51,7 @@
# support. You'll need to write a target specification at least, and most
# likely, teach rustc about the C ABI of the target. Get in touch with the
# Rust team and file an issue if you need assistance in porting!
#targets = "X86;ARM;AArch64;Mips;PowerPC;SystemZ;JSBackend;MSP430;Sparc;NVPTX"
#targets = "X86;ARM;AArch64;Mips;PowerPC;SystemZ;JSBackend;MSP430;Sparc;NVPTX;Hexagon"
# Cap the number of parallel linker invocations when compiling LLVM.
# This can be useful when building LLVM with debug info, which significantly
@ -152,6 +152,10 @@
# known-good version of OpenSSL, compile it, and link it to Cargo.
#openssl-static = false
# Run the build with low priority, by setting the process group's "nice" value
# to +10 on Unix platforms, and by using a "low priority" job object on Windows.
#low-priority = false
# =============================================================================
# General install configuration options
# =============================================================================
@ -160,21 +164,31 @@
# Instead of installing to /usr/local, install to this path instead.
#prefix = "/usr/local"
# Where to install system configuration files
# If this is a relative path, it will get installed in `prefix` above
#sysconfdir = "/etc"
# Where to install documentation in `prefix` above
#docdir = "share/doc/rust"
# Where to install binaries in `prefix` above
#bindir = "bin"
# Where to install libraries in `prefix` above
#libdir = "lib"
# Where to install man pages in `prefix` above
#mandir = "share/man"
# Where to install documentation in `prefix` above
#docdir = "share/doc/rust"
# =============================================================================
# Options for compiling Rust code itself
# =============================================================================
[rust]
# Whether or not to optimize the compiler and standard library
# Note: the slowness of the non optimized compiler compiling itself usually
# outweighs the time gains in not doing optimizations, therefore a
# full bootstrap takes much more time with optimize set to false.
#optimize = true
# Number of codegen units to use for each compiler invocation. A value of 0
@ -300,3 +314,9 @@
# Note that this address should not contain a trailing slash as file names will
# be appended to it.
#upload-addr = "https://example.com/folder"
# Whether to build a plain source tarball to upload
# We disable this on Windows so as not to override the one already uploaded to S3,
# as the one built on Windows contains backslashes in paths, causing problems
# on Linux
#src-tarball = true


@ -26,21 +26,15 @@ use std::process::{Command, Stdio};
use build_helper::output;
#[cfg(not(target_os = "solaris"))]
const SH_CMD: &'static str = "sh";
// On Solaris, sh is the historical bourne shell, not a POSIX shell, or bash.
#[cfg(target_os = "solaris")]
const SH_CMD: &'static str = "bash";
use {Build, Compiler, Mode};
use channel;
use util::{cp_r, libdir, is_dylib, cp_filtered, copy, exe};
fn pkgname(build: &Build, component: &str) -> String {
pub fn pkgname(build: &Build, component: &str) -> String {
if component == "cargo" {
format!("{}-{}", component, build.cargo_package_vers())
} else if component == "rls" {
format!("{}-{}", component, build.package_vers(&build.release_num("rls")))
format!("{}-{}", component, build.rls_package_vers())
} else {
assert!(component.starts_with("rust"));
format!("{}-{}", component, build.rust_package_vers())
@ -55,6 +49,10 @@ pub fn tmpdir(build: &Build) -> PathBuf {
build.out.join("tmp/dist")
}
fn rust_installer(build: &Build) -> Command {
build.tool_cmd(&Compiler::new(0, &build.config.build), "rust-installer")
}
/// Builds the `rust-docs` installer component.
///
/// Slurps up documentation from the `stage`'s `host`.
@ -74,14 +72,14 @@ pub fn docs(build: &Build, stage: u32, host: &str) {
let src = build.out.join(host).join("doc");
cp_r(&src, &dst);
let mut cmd = Command::new(SH_CMD);
cmd.arg(sanitize_sh(&build.src.join("src/rust-installer/gen-installer.sh")))
let mut cmd = rust_installer(build);
cmd.arg("generate")
.arg("--product-name=Rust-Documentation")
.arg("--rel-manifest-dir=rustlib")
.arg("--success-message=Rust-documentation-is-installed.")
.arg(format!("--image-dir={}", sanitize_sh(&image)))
.arg(format!("--work-dir={}", sanitize_sh(&tmpdir(build))))
.arg(format!("--output-dir={}", sanitize_sh(&distdir(build))))
.arg("--image-dir").arg(&image)
.arg("--work-dir").arg(&tmpdir(build))
.arg("--output-dir").arg(&distdir(build))
.arg(format!("--package-name={}-{}", name, host))
.arg("--component-name=rust-docs")
.arg("--legacy-manifest-dirs=rustlib,cargo")
@ -98,6 +96,140 @@ pub fn docs(build: &Build, stage: u32, host: &str) {
}
}
fn find_files(files: &[&str], path: &[PathBuf]) -> Vec<PathBuf> {
let mut found = Vec::new();
for file in files {
let file_path =
path.iter()
.map(|dir| dir.join(file))
.find(|p| p.exists());
if let Some(file_path) = file_path {
found.push(file_path);
} else {
panic!("Could not find '{}' in {:?}", file, path);
}
}
found
}
fn make_win_dist(rust_root: &Path, plat_root: &Path, target_triple: &str, build: &Build) {
//Ask gcc where it keeps its stuff
let mut cmd = Command::new(build.cc(target_triple));
cmd.arg("-print-search-dirs");
build.run_quiet(&mut cmd);
let gcc_out =
String::from_utf8(
cmd
.output()
.expect("failed to execute gcc")
.stdout).expect("gcc.exe output was not utf8");
let mut bin_path: Vec<_> =
env::split_paths(&env::var_os("PATH").unwrap_or_default())
.collect();
let mut lib_path = Vec::new();
for line in gcc_out.lines() {
let idx = line.find(':').unwrap();
let key = &line[..idx];
let trim_chars: &[_] = &[' ', '='];
let value =
line[(idx + 1)..]
.trim_left_matches(trim_chars)
.split(';')
.map(|s| PathBuf::from(s));
if key == "programs" {
bin_path.extend(value);
} else if key == "libraries" {
lib_path.extend(value);
}
}
let target_tools = vec!["gcc.exe", "ld.exe", "ar.exe", "dlltool.exe", "libwinpthread-1.dll"];
let mut rustc_dlls = vec!["libstdc++-6.dll", "libwinpthread-1.dll"];
if target_triple.starts_with("i686-") {
rustc_dlls.push("libgcc_s_dw2-1.dll");
} else {
rustc_dlls.push("libgcc_s_seh-1.dll");
}
let target_libs = vec![ //MinGW libs
"libgcc.a",
"libgcc_eh.a",
"libgcc_s.a",
"libm.a",
"libmingw32.a",
"libmingwex.a",
"libstdc++.a",
"libiconv.a",
"libmoldname.a",
"libpthread.a",
//Windows import libs
"libadvapi32.a",
"libbcrypt.a",
"libcomctl32.a",
"libcomdlg32.a",
"libcrypt32.a",
"libgdi32.a",
"libimagehlp.a",
"libiphlpapi.a",
"libkernel32.a",
"libmsvcrt.a",
"libodbc32.a",
"libole32.a",
"liboleaut32.a",
"libopengl32.a",
"libpsapi.a",
"librpcrt4.a",
"libsetupapi.a",
"libshell32.a",
"libuser32.a",
"libuserenv.a",
"libuuid.a",
"libwinhttp.a",
"libwinmm.a",
"libwinspool.a",
"libws2_32.a",
"libwsock32.a",
];
//Find mingw artifacts we want to bundle
let target_tools = find_files(&target_tools, &bin_path);
let rustc_dlls = find_files(&rustc_dlls, &bin_path);
let target_libs = find_files(&target_libs, &lib_path);
fn copy_to_folder(src: &Path, dest_folder: &Path) {
let file_name = src.file_name().unwrap().to_os_string();
let dest = dest_folder.join(file_name);
copy(src, &dest);
}
//Copy runtime dlls next to rustc.exe
let dist_bin_dir = rust_root.join("bin/");
fs::create_dir_all(&dist_bin_dir).expect("creating dist_bin_dir failed");
for src in rustc_dlls {
copy_to_folder(&src, &dist_bin_dir);
}
//Copy platform tools to platform-specific bin directory
let target_bin_dir = plat_root.join("lib").join("rustlib").join(target_triple).join("bin");
fs::create_dir_all(&target_bin_dir).expect("creating target_bin_dir failed");
for src in target_tools {
copy_to_folder(&src, &target_bin_dir);
}
//Copy platform libs to platform-specific lib directory
let target_lib_dir = plat_root.join("lib").join("rustlib").join(target_triple).join("lib");
fs::create_dir_all(&target_lib_dir).expect("creating target_lib_dir failed");
for src in target_libs {
copy_to_folder(&src, &target_lib_dir);
}
}
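The `-print-search-dirs` handling at the top of `make_win_dist` can be isolated into a small helper. This sketch assumes the same `key: =path;path` line shape the loop above handles (semicolon separators as on Windows gcc); `parse_search_line` is an illustrative name:

```rust
// Split one line of `gcc -print-search-dirs` output into its key and
// the semicolon-separated path list, trimming the leading " =".
fn parse_search_line(line: &str) -> Option<(&str, Vec<&str>)> {
    let idx = line.find(':')?;
    let paths = line[idx + 1..]
        .trim_start_matches(|c| c == ' ' || c == '=')
        .split(';')
        .collect();
    Some((&line[..idx], paths))
}
```

The caller then routes the `programs` entries into the binary search path and the `libraries` entries into the library search path, as the loop above does.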
/// Build the `rust-mingw` installer component.
///
/// This contains all the bits and pieces to run the MinGW Windows targets
@ -111,27 +243,20 @@ pub fn mingw(build: &Build, host: &str) {
let _ = fs::remove_dir_all(&image);
t!(fs::create_dir_all(&image));
// The first argument to the script is a "temporary directory" which is just
// The first argument is a "temporary directory" which is just
// thrown away (this contains the runtime DLLs included in the rustc package
// above) and the second argument is where to place all the MinGW components
// (which is what we want).
//
// FIXME: this script should be rewritten into Rust
let mut cmd = Command::new(build.python());
cmd.arg(build.src.join("src/etc/make-win-dist.py"))
.arg(tmpdir(build))
.arg(&image)
.arg(host);
build.run(&mut cmd);
make_win_dist(&tmpdir(build), &image, host, &build);
let mut cmd = Command::new(SH_CMD);
cmd.arg(sanitize_sh(&build.src.join("src/rust-installer/gen-installer.sh")))
let mut cmd = rust_installer(build);
cmd.arg("generate")
.arg("--product-name=Rust-MinGW")
.arg("--rel-manifest-dir=rustlib")
.arg("--success-message=Rust-MinGW-is-installed.")
.arg(format!("--image-dir={}", sanitize_sh(&image)))
.arg(format!("--work-dir={}", sanitize_sh(&tmpdir(build))))
.arg(format!("--output-dir={}", sanitize_sh(&distdir(build))))
.arg("--image-dir").arg(&image)
.arg("--work-dir").arg(&tmpdir(build))
.arg("--output-dir").arg(&distdir(build))
.arg(format!("--package-name={}-{}", name, host))
.arg("--component-name=rust-mingw")
.arg("--legacy-manifest-dirs=rustlib,cargo");
@ -174,15 +299,8 @@ pub fn rustc(build: &Build, stage: u32, host: &str) {
// anything requiring us to distribute a license, but it's likely the
// install will *also* include the rust-mingw package, which also needs
// licenses, so to be safe we just include it here in all MinGW packages.
//
// FIXME: this script should be rewritten into Rust
if host.contains("pc-windows-gnu") {
let mut cmd = Command::new(build.python());
cmd.arg(build.src.join("src/etc/make-win-dist.py"))
.arg(&image)
.arg(tmpdir(build))
.arg(host);
build.run(&mut cmd);
make_win_dist(&image, &tmpdir(build), host, build);
let dst = image.join("share/doc");
t!(fs::create_dir_all(&dst));
@ -190,15 +308,15 @@ pub fn rustc(build: &Build, stage: u32, host: &str) {
}
// Finally, wrap everything up in a nice tarball!
let mut cmd = Command::new(SH_CMD);
cmd.arg(sanitize_sh(&build.src.join("src/rust-installer/gen-installer.sh")))
let mut cmd = rust_installer(build);
cmd.arg("generate")
.arg("--product-name=Rust")
.arg("--rel-manifest-dir=rustlib")
.arg("--success-message=Rust-is-ready-to-roll.")
.arg(format!("--image-dir={}", sanitize_sh(&image)))
.arg(format!("--work-dir={}", sanitize_sh(&tmpdir(build))))
.arg(format!("--output-dir={}", sanitize_sh(&distdir(build))))
.arg(format!("--non-installed-overlay={}", sanitize_sh(&overlay)))
.arg("--image-dir").arg(&image)
.arg("--work-dir").arg(&tmpdir(build))
.arg("--output-dir").arg(&distdir(build))
.arg("--non-installed-overlay").arg(&overlay)
.arg(format!("--package-name={}-{}", name, host))
.arg("--component-name=rustc")
.arg("--legacy-manifest-dirs=rustlib,cargo");
@ -254,7 +372,12 @@ pub fn debugger_scripts(build: &Build,
install(&build.src.join("src/etc/").join(file), &dst, 0o644);
};
if host.contains("windows-msvc") {
// no debugger scripts
// windbg debugger scripts
install(&build.src.join("src/etc/rust-windbg.cmd"), &sysroot.join("bin"),
0o755);
cp_debugger_script("natvis/libcore.natvis");
cp_debugger_script("natvis/libcollections.natvis");
} else {
cp_debugger_script("debugger_pretty_printers_common.py");
@ -295,14 +418,14 @@ pub fn std(build: &Build, compiler: &Compiler, target: &str) {
let src = build.sysroot(compiler).join("lib/rustlib");
cp_r(&src.join(target), &dst);
let mut cmd = Command::new(SH_CMD);
cmd.arg(sanitize_sh(&build.src.join("src/rust-installer/gen-installer.sh")))
let mut cmd = rust_installer(build);
cmd.arg("generate")
.arg("--product-name=Rust")
.arg("--rel-manifest-dir=rustlib")
.arg("--success-message=std-is-standing-at-the-ready.")
.arg(format!("--image-dir={}", sanitize_sh(&image)))
.arg(format!("--work-dir={}", sanitize_sh(&tmpdir(build))))
.arg(format!("--output-dir={}", sanitize_sh(&distdir(build))))
.arg("--image-dir").arg(&image)
.arg("--work-dir").arg(&tmpdir(build))
.arg("--output-dir").arg(&distdir(build))
.arg(format!("--package-name={}-{}", name, target))
.arg(format!("--component-name=rust-std-{}", target))
.arg("--legacy-manifest-dirs=rustlib,cargo");
@ -310,11 +433,18 @@ pub fn std(build: &Build, compiler: &Compiler, target: &str) {
t!(fs::remove_dir_all(&image));
}
/// The path to the complete rustc-src tarball
pub fn rust_src_location(build: &Build) -> PathBuf {
let plain_name = format!("rustc-{}-src", build.rust_package_vers());
distdir(build).join(&format!("{}.tar.gz", plain_name))
}
/// The path to the rust-src component installer
pub fn rust_src_installer(build: &Build) -> PathBuf {
let name = pkgname(build, "rust-src");
distdir(build).join(&format!("{}.tar.gz", name))
}
/// Creates a tarball of save-analysis metadata, if available.
pub fn analysis(build: &Build, compiler: &Compiler, target: &str) {
assert!(build.config.extended);
@ -344,14 +474,14 @@ pub fn analysis(build: &Build, compiler: &Compiler, target: &str) {
println!("image_src: {:?}, dst: {:?}", image_src, dst);
cp_r(&image_src, &dst);
let mut cmd = Command::new(SH_CMD);
cmd.arg(sanitize_sh(&build.src.join("src/rust-installer/gen-installer.sh")))
let mut cmd = rust_installer(build);
cmd.arg("generate")
.arg("--product-name=Rust")
.arg("--rel-manifest-dir=rustlib")
.arg("--success-message=save-analysis-saved.")
.arg(format!("--image-dir={}", sanitize_sh(&image)))
.arg(format!("--work-dir={}", sanitize_sh(&tmpdir(build))))
.arg(format!("--output-dir={}", sanitize_sh(&distdir(build))))
.arg("--image-dir").arg(&image)
.arg("--work-dir").arg(&tmpdir(build))
.arg("--output-dir").arg(&distdir(build))
.arg(format!("--package-name={}-{}", name, target))
.arg(format!("--component-name=rust-analysis-{}", target))
.arg("--legacy-manifest-dirs=rustlib,cargo");
@ -359,43 +489,8 @@ pub fn analysis(build: &Build, compiler: &Compiler, target: &str) {
t!(fs::remove_dir_all(&image));
}
const CARGO_VENDOR_VERSION: &'static str = "0.1.4";
/// Creates the `rust-src` installer component and the plain source tarball
pub fn rust_src(build: &Build) {
if !build.config.rust_dist_src {
return
}
println!("Dist src");
let name = pkgname(build, "rust-src");
let image = tmpdir(build).join(format!("{}-image", name));
let _ = fs::remove_dir_all(&image);
let dst = image.join("lib/rustlib/src");
let dst_src = dst.join("rust");
t!(fs::create_dir_all(&dst_src));
// This is the set of root paths which will become part of the source package
let src_files = [
"COPYRIGHT",
"LICENSE-APACHE",
"LICENSE-MIT",
"CONTRIBUTING.md",
"README.md",
"RELEASES.md",
"configure",
"x.py",
];
let src_dirs = [
"man",
"src",
"cargo",
"rls",
];
let filter_fn = move |path: &Path| {
fn copy_src_dirs(build: &Build, src_dirs: &[&str], exclude_dirs: &[&str], dst_dir: &Path) {
fn filter_fn(exclude_dirs: &[&str], dir: &str, path: &Path) -> bool {
let spath = match path.to_str() {
Some(path) => path,
None => return false,
@ -411,6 +506,11 @@ pub fn rust_src(build: &Build) {
}
}
let full_path = Path::new(dir).join(path);
if exclude_dirs.iter().any(|excl| full_path == Path::new(excl)) {
return false;
}
let excludes = [
"CVS", "RCS", "SCCS", ".git", ".gitignore", ".gitmodules",
".gitattributes", ".cvsignore", ".svn", ".arch-ids", "{arch}",
@ -420,19 +520,119 @@ pub fn rust_src(build: &Build) {
!path.iter()
.map(|s| s.to_str().unwrap())
.any(|s| excludes.contains(&s))
};
}
// Copy the directories using our filter
for item in &src_dirs {
let dst = &dst_src.join(item);
t!(fs::create_dir(dst));
cp_filtered(&build.src.join(item), dst, &filter_fn);
for item in src_dirs {
let dst = &dst_dir.join(item);
t!(fs::create_dir_all(dst));
cp_filtered(&build.src.join(item), dst, &|path| filter_fn(exclude_dirs, item, path));
}
}
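The component-wise exclusion check inside `filter_fn` above amounts to this predicate (a standalone sketch; `is_excluded` is a hypothetical name):

```rust
use std::path::Path;

// Reject a path when any of its components matches one of the
// VCS/backup directory names being filtered out of the tarball.
fn is_excluded(path: &Path, excludes: &[&str]) -> bool {
    path.iter()
        .filter_map(|c| c.to_str())
        .any(|c| excludes.contains(&c))
}
```

Matching whole components (rather than substrings) is what lets a name like `.git` be excluded without also rejecting files that merely contain it.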
/// Creates the `rust-src` installer component
pub fn rust_src(build: &Build) {
println!("Dist src");
let name = pkgname(build, "rust-src");
let image = tmpdir(build).join(format!("{}-image", name));
let _ = fs::remove_dir_all(&image);
let dst = image.join("lib/rustlib/src");
let dst_src = dst.join("rust");
t!(fs::create_dir_all(&dst_src));
// This is the reduced set of paths which will become the rust-src component
// (essentially libstd and all of its path dependencies)
let std_src_dirs = [
"src/build_helper",
"src/liballoc",
"src/liballoc_jemalloc",
"src/liballoc_system",
"src/libbacktrace",
"src/libcollections",
"src/libcompiler_builtins",
"src/libcore",
"src/liblibc",
"src/libpanic_abort",
"src/libpanic_unwind",
"src/librand",
"src/librustc_asan",
"src/librustc_lsan",
"src/librustc_msan",
"src/librustc_tsan",
"src/libstd",
"src/libstd_unicode",
"src/libunwind",
"src/rustc/libc_shim",
"src/libtest",
"src/libterm",
"src/libgetopts",
"src/compiler-rt",
"src/jemalloc",
];
let std_src_dirs_exclude = [
"src/compiler-rt/test",
"src/jemalloc/test/unit",
];
copy_src_dirs(build, &std_src_dirs[..], &std_src_dirs_exclude[..], &dst_src);
// Create source tarball in rust-installer format
let mut cmd = rust_installer(build);
cmd.arg("generate")
.arg("--product-name=Rust")
.arg("--rel-manifest-dir=rustlib")
.arg("--success-message=Awesome-Source.")
.arg("--image-dir").arg(&image)
.arg("--work-dir").arg(&tmpdir(build))
.arg("--output-dir").arg(&distdir(build))
.arg(format!("--package-name={}", name))
.arg("--component-name=rust-src")
.arg("--legacy-manifest-dirs=rustlib,cargo");
build.run(&mut cmd);
t!(fs::remove_dir_all(&image));
}
const CARGO_VENDOR_VERSION: &'static str = "0.1.4";
/// Creates the plain source tarball
pub fn plain_source_tarball(build: &Build) {
println!("Create plain source tarball");
// Make sure that the root folder of the tarball has the correct name
let plain_name = format!("{}-src", pkgname(build, "rustc"));
let plain_dst_src = tmpdir(build).join(&plain_name);
let _ = fs::remove_dir_all(&plain_dst_src);
t!(fs::create_dir_all(&plain_dst_src));
// This is the set of root paths which will become part of the source package
let src_files = [
"COPYRIGHT",
"LICENSE-APACHE",
"LICENSE-MIT",
"CONTRIBUTING.md",
"README.md",
"RELEASES.md",
"configure",
"x.py",
];
let src_dirs = [
"man",
"src",
];
copy_src_dirs(build, &src_dirs[..], &[], &plain_dst_src);
// Copy the files normally
for item in &src_files {
copy(&build.src.join(item), &dst_src.join(item));
copy(&build.src.join(item), &plain_dst_src.join(item));
}
// Create the version file
write_file(&plain_dst_src.join("version"), build.rust_version().as_bytes());
// If we're building from git sources, we need to vendor a complete distribution.
if build.src_is_git {
// Get cargo-vendor installed, if it isn't already.
@ -455,43 +655,24 @@ pub fn rust_src(build: &Build) {
// Vendor all Cargo dependencies
let mut cmd = Command::new(&build.cargo);
cmd.arg("vendor")
.current_dir(&dst_src.join("src"));
.current_dir(&plain_dst_src.join("src"));
build.run(&mut cmd);
}
// Create source tarball in rust-installer format
let mut cmd = Command::new(SH_CMD);
cmd.arg(sanitize_sh(&build.src.join("src/rust-installer/gen-installer.sh")))
.arg("--product-name=Rust")
.arg("--rel-manifest-dir=rustlib")
.arg("--success-message=Awesome-Source.")
.arg(format!("--image-dir={}", sanitize_sh(&image)))
.arg(format!("--work-dir={}", sanitize_sh(&tmpdir(build))))
.arg(format!("--output-dir={}", sanitize_sh(&distdir(build))))
.arg(format!("--package-name={}", name))
.arg("--component-name=rust-src")
.arg("--legacy-manifest-dirs=rustlib,cargo");
build.run(&mut cmd);
// Rename directory, so that root folder of tarball has the correct name
let plain_name = format!("rustc-{}-src", build.rust_package_vers());
let plain_dst_src = tmpdir(build).join(&plain_name);
let _ = fs::remove_dir_all(&plain_dst_src);
t!(fs::create_dir_all(&plain_dst_src));
cp_r(&dst_src, &plain_dst_src);
// Create the version file
write_file(&plain_dst_src.join("version"), build.rust_version().as_bytes());
// Create plain source tarball
let mut cmd = Command::new("tar");
cmd.arg("-czf").arg(sanitize_sh(&rust_src_location(build)))
.arg(&plain_name)
let mut tarball = rust_src_location(build);
tarball.set_extension(""); // strip .gz
tarball.set_extension(""); // strip .tar
if let Some(dir) = tarball.parent() {
t!(fs::create_dir_all(dir));
}
let mut cmd = rust_installer(build);
cmd.arg("tarball")
.arg("--input").arg(&plain_name)
.arg("--output").arg(&tarball)
.arg("--work-dir=.")
.current_dir(tmpdir(build));
build.run(&mut cmd);
t!(fs::remove_dir_all(&image));
t!(fs::remove_dir_all(&plain_dst_src));
}
fn install(src: &Path, dstdir: &Path, perms: u32) {
@ -537,7 +718,7 @@ pub fn cargo(build: &Build, stage: u32, target: &str) {
println!("Dist cargo stage{} ({})", stage, target);
let compiler = Compiler::new(stage, &build.config.build);
let src = build.src.join("cargo");
let src = build.src.join("src/tools/cargo");
let etc = src.join("src/etc");
let release_num = build.release_num("cargo");
let name = pkgname(build, "cargo");
@ -550,7 +731,7 @@ pub fn cargo(build: &Build, stage: u32, target: &str) {
// Prepare the image directory
t!(fs::create_dir_all(image.join("share/zsh/site-functions")));
t!(fs::create_dir_all(image.join("etc/bash_completions.d")));
t!(fs::create_dir_all(image.join("etc/bash_completion.d")));
let cargo = build.cargo_out(&compiler, Mode::Tool, target)
.join(exe("cargo", target));
install(&cargo, &image.join("bin"), 0o755);
@ -560,7 +741,7 @@ pub fn cargo(build: &Build, stage: u32, target: &str) {
}
install(&etc.join("_cargo"), &image.join("share/zsh/site-functions"), 0o644);
copy(&etc.join("cargo.bashcomp.sh"),
&image.join("etc/bash_completions.d/cargo"));
&image.join("etc/bash_completion.d/cargo"));
let doc = image.join("share/doc/cargo");
install(&src.join("README.md"), &doc, 0o644);
install(&src.join("LICENSE-MIT"), &doc, 0o644);
@ -578,15 +759,15 @@ pub fn cargo(build: &Build, stage: u32, target: &str) {
t!(t!(File::create(overlay.join("version"))).write_all(version.as_bytes()));
// Generate the installer tarball
let mut cmd = Command::new("sh");
cmd.arg(sanitize_sh(&build.src.join("src/rust-installer/gen-installer.sh")))
let mut cmd = rust_installer(build);
cmd.arg("generate")
.arg("--product-name=Rust")
.arg("--rel-manifest-dir=rustlib")
.arg("--success-message=Rust-is-ready-to-roll.")
.arg(format!("--image-dir={}", sanitize_sh(&image)))
.arg(format!("--work-dir={}", sanitize_sh(&tmpdir(build))))
.arg(format!("--output-dir={}", sanitize_sh(&distdir(build))))
.arg(format!("--non-installed-overlay={}", sanitize_sh(&overlay)))
.arg("--image-dir").arg(&image)
.arg("--work-dir").arg(&tmpdir(build))
.arg("--output-dir").arg(&distdir(build))
.arg("--non-installed-overlay").arg(&overlay)
.arg(format!("--package-name={}-{}", name, target))
.arg("--component-name=cargo")
.arg("--legacy-manifest-dirs=rustlib,cargo");
@ -598,7 +779,7 @@ pub fn rls(build: &Build, stage: u32, target: &str) {
println!("Dist RLS stage{} ({})", stage, target);
let compiler = Compiler::new(stage, &build.config.build);
let src = build.src.join("rls");
let src = build.src.join("src/tools/rls");
let release_num = build.release_num("rls");
let name = pkgname(build, "rls");
let version = build.rls_info.version(build, &release_num);
@ -627,15 +808,15 @@ pub fn rls(build: &Build, stage: u32, target: &str) {
t!(t!(File::create(overlay.join("version"))).write_all(version.as_bytes()));
// Generate the installer tarball
let mut cmd = Command::new("sh");
cmd.arg(sanitize_sh(&build.src.join("src/rust-installer/gen-installer.sh")))
let mut cmd = rust_installer(build);
cmd.arg("generate")
.arg("--product-name=Rust")
.arg("--rel-manifest-dir=rustlib")
.arg("--success-message=RLS-ready-to-serve.")
.arg(format!("--image-dir={}", sanitize_sh(&image)))
.arg(format!("--work-dir={}", sanitize_sh(&tmpdir(build))))
.arg(format!("--output-dir={}", sanitize_sh(&distdir(build))))
.arg(format!("--non-installed-overlay={}", sanitize_sh(&overlay)))
.arg("--image-dir").arg(&image)
.arg("--work-dir").arg(&tmpdir(build))
.arg("--output-dir").arg(&distdir(build))
.arg("--non-installed-overlay").arg(&overlay)
.arg(format!("--package-name={}-{}", name, target))
.arg("--component-name=rls")
.arg("--legacy-manifest-dirs=rustlib,cargo");
@ -653,9 +834,6 @@ pub fn extended(build: &Build, stage: u32, target: &str) {
let cargo_installer = dist.join(format!("{}-{}.tar.gz",
pkgname(build, "cargo"),
target));
let rls_installer = dist.join(format!("{}-{}.tar.gz",
pkgname(build, "rls"),
target));
let analysis_installer = dist.join(format!("{}-{}.tar.gz",
pkgname(build, "rust-analysis"),
target));
@ -686,29 +864,28 @@ pub fn extended(build: &Build, stage: u32, target: &str) {
// upgrades rustc was upgraded before rust-std. To avoid rustc clobbering
// the std files during uninstall. To do this ensure that rustc comes
// before rust-std in the list below.
let mut input_tarballs = format!("{},{},{},{},{},{}",
sanitize_sh(&rustc_installer),
sanitize_sh(&cargo_installer),
sanitize_sh(&rls_installer),
sanitize_sh(&analysis_installer),
sanitize_sh(&docs_installer),
sanitize_sh(&std_installer));
let mut tarballs = vec![rustc_installer, cargo_installer,
analysis_installer, docs_installer, std_installer];
if target.contains("pc-windows-gnu") {
input_tarballs.push_str(",");
input_tarballs.push_str(&sanitize_sh(&mingw_installer));
tarballs.push(mingw_installer);
}
let mut input_tarballs = tarballs[0].as_os_str().to_owned();
for tarball in &tarballs[1..] {
input_tarballs.push(",");
input_tarballs.push(tarball);
}
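The loop above builds the comma-separated `--input-tarballs` value as an `OsString`, avoiding a lossy round-trip through UTF-8. As a standalone sketch (hypothetical helper; it assumes a non-empty list, as the caller always has at least one tarball):

```rust
use std::ffi::OsString;
use std::path::PathBuf;

// Join a non-empty list of paths into one comma-separated OsString,
// preserving path bytes that a plain String could not represent.
fn join_with_commas(paths: &[PathBuf]) -> OsString {
    let mut out = paths[0].as_os_str().to_owned();
    for path in &paths[1..] {
        out.push(",");
        out.push(path.as_os_str());
    }
    out
}
```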
let mut cmd = Command::new(SH_CMD);
cmd.arg(sanitize_sh(&build.src.join("src/rust-installer/combine-installers.sh")))
let mut cmd = rust_installer(build);
cmd.arg("combine")
.arg("--product-name=Rust")
.arg("--rel-manifest-dir=rustlib")
.arg("--success-message=Rust-is-ready-to-roll.")
.arg(format!("--work-dir={}", sanitize_sh(&work)))
.arg(format!("--output-dir={}", sanitize_sh(&distdir(build))))
.arg("--work-dir").arg(&work)
.arg("--output-dir").arg(&distdir(build))
.arg(format!("--package-name={}-{}", pkgname(build, "rust"), target))
.arg("--legacy-manifest-dirs=rustlib,cargo")
.arg(format!("--input-tarballs={}", input_tarballs))
.arg(format!("--non-installed-overlay={}", sanitize_sh(&overlay)));
.arg("--input-tarballs").arg(input_tarballs)
.arg("--non-installed-overlay").arg(&overlay);
build.run(&mut cmd);
let mut license = String::new();
@ -1004,7 +1181,6 @@ pub fn hash_and_sign(build: &Build) {
cmd.arg(today.trim());
cmd.arg(build.rust_package_vers());
cmd.arg(build.package_vers(&build.release_num("cargo")));
cmd.arg(build.package_vers(&build.release_num("rls")));
cmd.arg(addr);
t!(fs::create_dir_all(distdir(build)));


@ -61,6 +61,7 @@ pub enum Subcommand {
Test {
paths: Vec<PathBuf>,
test_args: Vec<String>,
no_fail_fast: bool,
},
Bench {
paths: Vec<PathBuf>,
@ -69,7 +70,9 @@ pub enum Subcommand {
Clean,
Dist {
paths: Vec<PathBuf>,
install: bool,
},
Install {
paths: Vec<PathBuf>,
},
}
@ -85,7 +88,8 @@ Subcommands:
bench Build and run some benchmarks
doc Build documentation
clean Clean out build directories
dist Build and/or install distribution artifacts
dist Build distribution artifacts
install Install distribution artifacts
To learn more about a subcommand, run `./x.py <subcommand> -h`");
@ -125,7 +129,8 @@ To learn more about a subcommand, run `./x.py <subcommand> -h`");
|| (s == "bench")
|| (s == "doc")
|| (s == "clean")
|| (s == "dist"));
|| (s == "dist")
|| (s == "install"));
let subcommand = match possible_subcommands.first() {
Some(s) => s,
None => {
@ -137,9 +142,11 @@ To learn more about a subcommand, run `./x.py <subcommand> -h`");
// Some subcommands get extra options
match subcommand.as_str() {
"test" => { opts.optmulti("", "test-args", "extra arguments", "ARGS"); },
"test" => {
opts.optflag("", "no-fail-fast", "Run all tests regardless of failure");
opts.optmulti("", "test-args", "extra arguments", "ARGS");
},
"bench" => { opts.optmulti("", "test-args", "extra arguments", "ARGS"); },
"dist" => { opts.optflag("", "install", "run installer as well"); },
_ => { },
};
@ -230,11 +237,18 @@ Arguments:
let cwd = t!(env::current_dir());
let paths = matches.free[1..].iter().map(|p| cwd.join(p)).collect::<Vec<_>>();
let cfg_file = matches.opt_str("config").map(PathBuf::from).or_else(|| {
if fs::metadata("config.toml").is_ok() {
Some(PathBuf::from("config.toml"))
} else {
None
}
});
// All subcommands can have an optional "Available paths" section
if matches.opt_present("verbose") {
let flags = Flags::parse(&["build".to_string()]);
let mut config = Config::default();
let mut config = Config::parse(&flags.build, cfg_file.clone());
config.build = flags.build.clone();
let mut build = Build::new(flags, config);
metadata::build(&mut build);
@ -260,6 +274,7 @@ Arguments:
Subcommand::Test {
paths: paths,
test_args: matches.opt_strs("test-args"),
no_fail_fast: matches.opt_present("no-fail-fast"),
}
}
"bench" => {
@ -281,7 +296,11 @@ Arguments:
"dist" => {
Subcommand::Dist {
paths: paths,
install: matches.opt_present("install"),
}
}
"install" => {
Subcommand::Install {
paths: paths,
}
}
_ => {
@ -290,14 +309,6 @@ Arguments:
};
let cfg_file = matches.opt_str("config").map(PathBuf::from).or_else(|| {
if fs::metadata("config.toml").is_ok() {
Some(PathBuf::from("config.toml"))
} else {
None
}
});
let mut stage = matches.opt_str("stage").map(|j| j.parse().unwrap());
if matches.opt_present("incremental") {
@ -335,6 +346,13 @@ impl Subcommand {
_ => Vec::new(),
}
}
pub fn no_fail_fast(&self) -> bool {
match *self {
Subcommand::Test { no_fail_fast, .. } => no_fail_fast,
_ => false,
}
}
}
fn split(s: Vec<String>) -> Vec<String> {

View File

@ -19,61 +19,120 @@ use std::path::{Path, PathBuf, Component};
use std::process::Command;
use Build;
use dist::{sanitize_sh, tmpdir};
use dist::{pkgname, sanitize_sh, tmpdir};
/// Installs everything.
pub fn install(build: &Build, stage: u32, host: &str) {
pub struct Installer<'a> {
build: &'a Build,
prefix: PathBuf,
sysconfdir: PathBuf,
docdir: PathBuf,
bindir: PathBuf,
libdir: PathBuf,
mandir: PathBuf,
empty_dir: PathBuf,
}
impl<'a> Drop for Installer<'a> {
fn drop(&mut self) {
t!(fs::remove_dir_all(&self.empty_dir));
}
}
impl<'a> Installer<'a> {
pub fn new(build: &'a Build) -> Installer<'a> {
let prefix_default = PathBuf::from("/usr/local");
let sysconfdir_default = PathBuf::from("/etc");
let docdir_default = PathBuf::from("share/doc/rust");
let mandir_default = PathBuf::from("share/man");
let bindir_default = PathBuf::from("bin");
let libdir_default = PathBuf::from("lib");
let mandir_default = PathBuf::from("share/man");
let prefix = build.config.prefix.as_ref().unwrap_or(&prefix_default);
let sysconfdir = build.config.sysconfdir.as_ref().unwrap_or(&sysconfdir_default);
let docdir = build.config.docdir.as_ref().unwrap_or(&docdir_default);
let bindir = build.config.bindir.as_ref().unwrap_or(&bindir_default);
let libdir = build.config.libdir.as_ref().unwrap_or(&libdir_default);
let mandir = build.config.mandir.as_ref().unwrap_or(&mandir_default);
let sysconfdir = prefix.join(sysconfdir);
let docdir = prefix.join(docdir);
let bindir = prefix.join(bindir);
let libdir = prefix.join(libdir);
let mandir = prefix.join(mandir);
let destdir = env::var_os("DESTDIR").map(PathBuf::from);
let prefix = add_destdir(&prefix, &destdir);
let sysconfdir = add_destdir(&sysconfdir, &destdir);
let docdir = add_destdir(&docdir, &destdir);
let bindir = add_destdir(&bindir, &destdir);
let libdir = add_destdir(&libdir, &destdir);
let mandir = add_destdir(&mandir, &destdir);
let empty_dir = build.out.join("tmp/empty_dir");
t!(fs::create_dir_all(&empty_dir));
if build.config.docs {
install_sh(&build, "docs", "rust-docs", stage, host, &prefix,
&docdir, &libdir, &mandir, &empty_dir);
Installer {
build,
prefix,
sysconfdir,
docdir,
bindir,
libdir,
mandir,
empty_dir,
}
}
for target in build.config.target.iter() {
install_sh(&build, "std", "rust-std", stage, target, &prefix,
&docdir, &libdir, &mandir, &empty_dir);
pub fn install_docs(&self, stage: u32, host: &str) {
self.install_sh("docs", "rust-docs", stage, Some(host));
}
install_sh(&build, "rustc", "rustc", stage, host, &prefix,
&docdir, &libdir, &mandir, &empty_dir);
t!(fs::remove_dir_all(&empty_dir));
}
pub fn install_std(&self, stage: u32) {
for target in self.build.config.target.iter() {
self.install_sh("std", "rust-std", stage, Some(target));
}
}
fn install_sh(build: &Build, package: &str, name: &str, stage: u32, host: &str,
prefix: &Path, docdir: &Path, libdir: &Path, mandir: &Path, empty_dir: &Path) {
println!("Install {} stage{} ({})", package, stage, host);
let package_name = format!("{}-{}-{}", name, build.rust_package_vers(), host);
pub fn install_cargo(&self, stage: u32, host: &str) {
self.install_sh("cargo", "cargo", stage, Some(host));
}
pub fn install_rls(&self, stage: u32, host: &str) {
self.install_sh("rls", "rls", stage, Some(host));
}
pub fn install_analysis(&self, stage: u32, host: &str) {
self.install_sh("analysis", "rust-analysis", stage, Some(host));
}
pub fn install_src(&self, stage: u32) {
self.install_sh("src", "rust-src", stage, None);
}
pub fn install_rustc(&self, stage: u32, host: &str) {
self.install_sh("rustc", "rustc", stage, Some(host));
}
fn install_sh(&self, package: &str, name: &str, stage: u32, host: Option<&str>) {
println!("Install {} stage{} ({:?})", package, stage, host);
let package_name = if let Some(host) = host {
format!("{}-{}", pkgname(self.build, name), host)
} else {
pkgname(self.build, name)
};
let mut cmd = Command::new("sh");
cmd.current_dir(empty_dir)
.arg(sanitize_sh(&tmpdir(build).join(&package_name).join("install.sh")))
.arg(format!("--prefix={}", sanitize_sh(prefix)))
.arg(format!("--docdir={}", sanitize_sh(docdir)))
.arg(format!("--libdir={}", sanitize_sh(libdir)))
.arg(format!("--mandir={}", sanitize_sh(mandir)))
cmd.current_dir(&self.empty_dir)
.arg(sanitize_sh(&tmpdir(self.build).join(&package_name).join("install.sh")))
.arg(format!("--prefix={}", sanitize_sh(&self.prefix)))
.arg(format!("--sysconfdir={}", sanitize_sh(&self.sysconfdir)))
.arg(format!("--docdir={}", sanitize_sh(&self.docdir)))
.arg(format!("--bindir={}", sanitize_sh(&self.bindir)))
.arg(format!("--libdir={}", sanitize_sh(&self.libdir)))
.arg(format!("--mandir={}", sanitize_sh(&self.mandir)))
.arg("--disable-ldconfig");
build.run(&mut cmd);
self.build.run(&mut cmd);
}
}
fn add_destdir(path: &Path, destdir: &Option<PathBuf>) -> PathBuf {

View File

@ -42,6 +42,7 @@
use std::env;
use std::io;
use std::mem;
use Build;
type HANDLE = *mut u8;
type BOOL = i32;
@ -60,8 +61,10 @@ const DUPLICATE_SAME_ACCESS: DWORD = 0x2;
const PROCESS_DUP_HANDLE: DWORD = 0x40;
const JobObjectExtendedLimitInformation: JOBOBJECTINFOCLASS = 9;
const JOB_OBJECT_LIMIT_KILL_ON_JOB_CLOSE: DWORD = 0x2000;
const JOB_OBJECT_LIMIT_PRIORITY_CLASS: DWORD = 0x00000020;
const SEM_FAILCRITICALERRORS: UINT = 0x0001;
const SEM_NOGPFAULTERRORBOX: UINT = 0x0002;
const BELOW_NORMAL_PRIORITY_CLASS: DWORD = 0x00004000;
extern "system" {
fn CreateJobObjectW(lpJobAttributes: *mut u8, lpName: *const u8) -> HANDLE;
@ -118,7 +121,7 @@ struct JOBOBJECT_BASIC_LIMIT_INFORMATION {
SchedulingClass: DWORD,
}
pub unsafe fn setup() {
pub unsafe fn setup(build: &mut Build) {
// Tell Windows to not show any UI on errors (such as not finding a required dll
// during startup or terminating abnormally). This is important for running tests,
// since some of them use abnormal termination by design.
@ -136,6 +139,10 @@ pub unsafe fn setup() {
// children will reside in the job by default.
let mut info = mem::zeroed::<JOBOBJECT_EXTENDED_LIMIT_INFORMATION>();
info.BasicLimitInformation.LimitFlags = JOB_OBJECT_LIMIT_KILL_ON_JOB_CLOSE;
if build.config.low_priority {
info.BasicLimitInformation.LimitFlags |= JOB_OBJECT_LIMIT_PRIORITY_CLASS;
info.BasicLimitInformation.PriorityClass = BELOW_NORMAL_PRIORITY_CLASS;
}
let r = SetInformationJobObject(job,
JobObjectExtendedLimitInformation,
&mut info as *mut _ as LPVOID,

View File

@ -76,18 +76,22 @@ extern crate num_cpus;
extern crate rustc_serialize;
extern crate toml;
#[cfg(unix)]
extern crate libc;
use std::cell::Cell;
use std::cmp;
use std::collections::HashMap;
use std::env;
use std::ffi::OsString;
use std::fs::{self, File};
use std::io::Read;
use std::path::{Component, PathBuf, Path};
use std::path::{PathBuf, Path};
use std::process::Command;
use build_helper::{run_silent, run_suppressed, output, mtime};
use build_helper::{run_silent, run_suppressed, try_run_silent, try_run_suppressed, output, mtime};
use util::{exe, libdir, add_lib_path};
use util::{exe, libdir, add_lib_path, OutputFolder, CiEnv};
mod cc;
mod channel;
@ -108,9 +112,21 @@ pub mod util;
#[cfg(windows)]
mod job;
#[cfg(not(windows))]
#[cfg(unix)]
mod job {
pub unsafe fn setup() {}
use libc;
pub unsafe fn setup(build: &mut ::Build) {
if build.config.low_priority {
libc::setpriority(libc::PRIO_PGRP as _, 0, 10);
}
}
}
#[cfg(not(any(unix, windows)))]
mod job {
pub unsafe fn setup(_build: &mut ::Build) {
}
}
pub use config::Config;
@ -164,6 +180,8 @@ pub struct Build {
crates: HashMap<String, Crate>,
is_sudo: bool,
src_is_git: bool,
ci_env: CiEnv,
delayed_failures: Cell<usize>,
}
#[derive(Debug)]
@ -196,7 +214,7 @@ pub enum Mode {
/// output in the "stageN-rustc" directory.
Librustc,
/// This cargo is going to some build tool, placing output in the
/// This cargo is going to build some tool, placing output in the
/// "stageN-tools" directory.
Tool,
}
@ -234,8 +252,8 @@ impl Build {
None => false,
};
let rust_info = channel::GitInfo::new(&src);
let cargo_info = channel::GitInfo::new(&src.join("cargo"));
let rls_info = channel::GitInfo::new(&src.join("rls"));
let cargo_info = channel::GitInfo::new(&src.join("src/tools/cargo"));
let rls_info = channel::GitInfo::new(&src.join("src/tools/rls"));
let src_is_git = src.join(".git").exists();
Build {
@ -257,13 +275,15 @@ impl Build {
lldb_python_dir: None,
is_sudo: is_sudo,
src_is_git: src_is_git,
ci_env: CiEnv::current(),
delayed_failures: Cell::new(0),
}
}
/// Executes the entire build, as configured by the flags and configuration.
pub fn build(&mut self) {
unsafe {
job::setup();
job::setup(self);
}
if let Subcommand::Clean = self.flags.cmd {
@ -285,129 +305,12 @@ impl Build {
self.verbose(&format!("auto-detected local-rebuild {}", local_release));
self.local_rebuild = true;
}
self.verbose("updating submodules");
self.update_submodules();
self.verbose("learning about cargo");
metadata::build(self);
step::run(self);
}
/// Updates all git submodules that we have.
///
/// This will detect if any submodules are out of date and run the necessary
/// commands to sync them all with upstream.
fn update_submodules(&self) {
struct Submodule<'a> {
path: &'a Path,
state: State,
}
enum State {
// The submodule may have staged/unstaged changes
MaybeDirty,
// Or could be initialized but never updated
NotInitialized,
// The submodule, itself, has extra commits but those changes haven't been committed to
// the (outer) git repository
OutOfSync,
}
if !self.src_is_git || !self.config.submodules {
return
}
let git = || {
let mut cmd = Command::new("git");
cmd.current_dir(&self.src);
return cmd
};
let git_submodule = || {
let mut cmd = Command::new("git");
cmd.current_dir(&self.src).arg("submodule");
return cmd
};
// FIXME: this takes a seriously long time to execute on Windows and a
// nontrivial amount of time on Unix, we should have a better way
// of detecting whether we need to run all the submodule commands
// below.
let out = output(git_submodule().arg("status"));
let mut submodules = vec![];
for line in out.lines() {
// NOTE `git submodule status` output looks like this:
//
// -5066b7dcab7e700844b0e2ba71b8af9dc627a59b src/liblibc
// +b37ef24aa82d2be3a3cc0fe89bf82292f4ca181c src/compiler-rt (remotes/origin/..)
// e058ca661692a8d01f8cf9d35939dfe3105ce968 src/jemalloc (3.6.0-533-ge058ca6)
//
// The first character can be '-', '+' or ' ' and denotes the `State` of the submodule
// Right next to this character is the SHA-1 of the submodule HEAD
// And after that comes the path to the submodule
let path = Path::new(line[1..].split(' ').skip(1).next().unwrap());
let state = if line.starts_with('-') {
State::NotInitialized
} else if line.starts_with('+') {
State::OutOfSync
} else if line.starts_with(' ') {
State::MaybeDirty
} else {
panic!("unexpected git submodule state: {:?}", line.chars().next());
};
submodules.push(Submodule { path: path, state: state })
}
self.run(git_submodule().arg("sync"));
for submodule in submodules {
// If using llvm-root then don't touch the llvm submodule.
if submodule.path.components().any(|c| c == Component::Normal("llvm".as_ref())) &&
self.config.target_config.get(&self.config.build)
.and_then(|c| c.llvm_config.as_ref()).is_some()
{
continue
}
if submodule.path.components().any(|c| c == Component::Normal("jemalloc".as_ref())) &&
!self.config.use_jemalloc
{
continue
}
// `submodule.path` is the relative path to a submodule (from the repository root)
// `submodule_path` is the path to a submodule from the cwd
// use `submodule.path` when e.g. executing a submodule specific command from the
// repository root
// use `submodule_path` when e.g. executing a normal git command for the submodule
// (set via `current_dir`)
let submodule_path = self.src.join(submodule.path);
match submodule.state {
State::MaybeDirty => {
// drop staged changes
self.run(git().current_dir(&submodule_path)
.args(&["reset", "--hard"]));
// drops unstaged changes
self.run(git().current_dir(&submodule_path)
.args(&["clean", "-fdx"]));
},
State::NotInitialized => {
self.run(git_submodule().arg("init").arg(submodule.path));
self.run(git_submodule().arg("update").arg(submodule.path));
},
State::OutOfSync => {
// drops submodule commits that weren't reported to the (outer) git repository
self.run(git_submodule().arg("update").arg(submodule.path));
self.run(git().current_dir(&submodule_path)
.args(&["reset", "--hard"]));
self.run(git().current_dir(&submodule_path)
.args(&["clean", "-fdx"]));
},
}
}
}
/// Clear out `dir` if `input` is newer.
///
/// After this executes, it will also ensure that `dir` exists.
@ -444,7 +347,7 @@ impl Build {
// FIXME: Temporary fix for https://github.com/rust-lang/cargo/issues/3005
// Force cargo to output binaries with disambiguating hashes in the name
cargo.env("__CARGO_DEFAULT_LIB_METADATA", "1");
cargo.env("__CARGO_DEFAULT_LIB_METADATA", &self.config.channel);
let stage;
if compiler.stage == 0 && self.local_rebuild {
@ -475,11 +378,30 @@ impl Build {
.env("RUSTDOC_REAL", self.rustdoc(compiler))
.env("RUSTC_FLAGS", self.rustc_flags(target).join(" "));
// Tools don't get debuginfo right now, e.g. cargo and rls don't get
// compiled with debuginfo.
if mode != Mode::Tool {
// Tools don't get debuginfo right now, e.g. cargo and rls don't
// get compiled with debuginfo.
cargo.env("RUSTC_DEBUGINFO", self.config.rust_debuginfo.to_string())
.env("RUSTC_DEBUGINFO_LINES", self.config.rust_debuginfo_lines.to_string());
.env("RUSTC_DEBUGINFO_LINES", self.config.rust_debuginfo_lines.to_string())
.env("RUSTC_FORCE_UNSTABLE", "1");
// Currently the compiler depends on crates from crates.io, and
// then other crates can depend on the compiler (e.g. proc-macro
// crates). Let's say, for example that rustc itself depends on the
// bitflags crate. If an external crate then depends on the
// bitflags crate as well, we need to make sure they don't
// conflict, even if they pick the same version of bitflags. We'll
// want to make sure that e.g. a plugin and rustc each get their
// own copy of bitflags.
// Cargo ensures that this works in general through the -C metadata
// flag. This flag will frob the symbols in the binary to make sure
// they're different, even though the source code is the exact
// same. To solve this problem for the compiler we extend Cargo's
// already-passed -C metadata flag with our own. Our rustc.rs
// wrapper around the actual rustc will detect -C metadata being
// passed and frob it with this extra string we're passing in.
cargo.env("RUSTC_METADATA_SUFFIX", "rustc");
}
// Enable usage of unstable features
@ -507,7 +429,7 @@ impl Build {
.env("RUSTC_SNAPSHOT_LIBDIR", self.rustc_libdir(compiler));
}
// There are two invariants we try must maintain:
// There are two invariants we must maintain:
// * stable crates cannot depend on unstable crates (general Rust rule),
// * crates that end up in the sysroot must be unstable (rustbuild rule).
//
@ -521,15 +443,17 @@ impl Build {
// feature and opt-in to `rustc_private`.
//
// We can't always pass `rustbuild` because crates which are outside of
// the comipiler, libs, and tests are stable and we don't want to make
// the compiler, libs, and tests are stable and we don't want to make
// their deps unstable (since this would break the first invariant
// above).
if mode != Mode::Tool {
//
// FIXME: remove this after next stage0
if mode != Mode::Tool && stage == 0 {
cargo.env("RUSTBUILD_UNSTABLE", "1");
}
// Ignore incremental modes except for stage0, since we're
// not guaranteeing correctness acros builds if the compiler
// not guaranteeing correctness across builds if the compiler
// is changing under your feet.
if self.flags.incremental && compiler.stage == 0 {
let incr_dir = self.incremental_dir(compiler);
@ -557,7 +481,20 @@ impl Build {
cargo.env("RUSTC_SAVE_ANALYSIS", "api".to_string());
}
// Environment variables *required* needed throughout the build
// When being built Cargo will at some point call `nmake.exe` on Windows
// MSVC. Unfortunately `nmake` will read these two environment variables
// below and try to interpret them. We're likely being run, however, from
// MSYS `make` which uses the same variables.
//
// As a result, to prevent confusion and errors, we remove these
// variables from our environment to prevent passing MSYS make flags to
// nmake, causing it to blow up.
if cfg!(target_env = "msvc") {
cargo.env_remove("MAKE");
cargo.env_remove("MAKEFLAGS");
}
// Environment variables *required* throughout the build
//
// FIXME: should update code to not require this env var
cargo.env("CFG_COMPILER_HOST_TRIPLE", target);
@ -575,6 +512,9 @@ impl Build {
if self.config.vendor || self.is_sudo {
cargo.arg("--frozen");
}
self.ci_env.force_coloring_in_ci(&mut cargo);
return cargo
}
@ -715,7 +655,7 @@ impl Build {
}
/// Returns the root output directory for all Cargo output in a given stage,
/// running a particular comipler, wehther or not we're building the
/// running a particular compiler, whether or not we're building the
/// standard library, and targeting the specified architecture.
fn cargo_out(&self,
compiler: &Compiler,
@ -847,6 +787,22 @@ impl Build {
run_suppressed(cmd)
}
/// Runs a command, printing out nice contextual information if it fails.
/// Exits if the command failed to execute at all, otherwise returns its
/// `status.success()`.
fn try_run(&self, cmd: &mut Command) -> bool {
self.verbose(&format!("running: {:?}", cmd));
try_run_silent(cmd)
}
/// Runs a command, printing out nice contextual information if it fails.
/// Exits if the command failed to execute at all, otherwise returns its
/// `status.success()`.
fn try_run_quiet(&self, cmd: &mut Command) -> bool {
self.verbose(&format!("running: {:?}", cmd));
try_run_suppressed(cmd)
}
/// Prints a message if this build is configured in verbose mode.
fn verbose(&self, msg: &str) {
if self.flags.verbose() || self.config.verbose() {
@ -882,6 +838,13 @@ impl Build {
if target.contains("apple-darwin") {
base.push("-stdlib=libc++".into());
}
// Work around an apparently bad MinGW / GCC optimization,
// See: http://lists.llvm.org/pipermail/cfe-dev/2016-December/051980.html
// See: https://gcc.gnu.org/bugzilla/show_bug.cgi?id=78936
if target == "i686-pc-windows-gnu" {
base.push("-fno-omit-frame-pointer".into());
}
return base
}
@ -925,6 +888,12 @@ impl Build {
.map(|p| &**p)
}
/// Returns whether the target will be tested using the `remote-test-client`
/// and `remote-test-server` binaries.
fn remote_tested(&self, target: &str) -> bool {
self.qemu_rootfs(target).is_some() || target.contains("android")
}
/// Returns the root of the "rootfs" image that this target will be using,
/// if one was configured.
///
@ -1028,6 +997,11 @@ impl Build {
self.package_vers(&self.release_num("cargo"))
}
/// Returns the value of `package_vers` above for rls
fn rls_package_vers(&self) -> String {
self.package_vers(&self.release_num("rls"))
}
/// Returns the `version` string associated with this compiler for Rust
/// itself.
///
@ -1040,7 +1014,7 @@ impl Build {
/// Returns the `a.b.c` version that the given package is at.
fn release_num(&self, package: &str) -> String {
let mut toml = String::new();
let toml_file_name = self.src.join(&format!("{}/Cargo.toml", package));
let toml_file_name = self.src.join(&format!("src/tools/{}/Cargo.toml", package));
t!(t!(File::open(toml_file_name)).read_to_string(&mut toml));
for line in toml.lines() {
let prefix = "version = \"";
@ -1061,6 +1035,19 @@ impl Build {
"nightly" | _ => true,
}
}
/// Fold the output of the commands after this method into a group. The fold
/// ends when the returned object is dropped. Folding can only be used in
/// the Travis CI environment.
pub fn fold_output<D, F>(&self, name: F) -> Option<OutputFolder>
where D: Into<String>, F: FnOnce() -> D
{
if self.ci_env == CiEnv::Travis {
Some(OutputFolder::new(name().into()))
} else {
None
}
}
}
impl<'a> Compiler<'a> {

View File

@ -58,6 +58,7 @@ fn build_krate(build: &mut Build, krate: &str) {
// the dependency graph and what `-p` arguments there are.
let mut cargo = Command::new(&build.cargo);
cargo.arg("metadata")
.arg("--format-version").arg("1")
.arg("--manifest-path").arg(build.src.join(krate).join("Cargo.toml"));
let output = output(&mut cargo);
let output: Output = json::decode(&output).unwrap();

View File

@ -55,6 +55,7 @@ check:
check-aux:
$(Q)$(BOOTSTRAP) test \
src/tools/cargotest \
cargo \
src/test/pretty \
src/test/run-pass/pretty \
src/test/run-fail/pretty \
@ -68,7 +69,7 @@ distcheck:
$(Q)$(BOOTSTRAP) dist $(BOOTSTRAP_ARGS)
$(Q)$(BOOTSTRAP) test distcheck $(BOOTSTRAP_ARGS)
install:
$(Q)$(BOOTSTRAP) dist --install $(BOOTSTRAP_ARGS)
$(Q)$(BOOTSTRAP) install $(BOOTSTRAP_ARGS)
tidy:
$(Q)$(BOOTSTRAP) test src/tools/tidy $(BOOTSTRAP_ARGS)
prepare:

View File

@ -19,6 +19,7 @@
//! ensure that they're always in place if needed.
use std::env;
use std::ffi::OsString;
use std::fs::{self, File};
use std::io::{Read, Write};
use std::path::Path;
@ -62,6 +63,7 @@ pub fn llvm(build: &Build, target: &str) {
drop(fs::remove_dir_all(&out_dir));
}
let _folder = build.fold_output(|| "llvm");
println!("Building LLVM for {}", target);
let _time = util::timeit();
t!(fs::create_dir_all(&out_dir));
@ -81,7 +83,7 @@ pub fn llvm(build: &Build, target: &str) {
// NOTE: remember to also update `config.toml.example` when changing the defaults!
let llvm_targets = match build.config.llvm_targets {
Some(ref s) => s,
None => "X86;ARM;AArch64;Mips;PowerPC;SystemZ;JSBackend;MSP430;Sparc;NVPTX",
None => "X86;ARM;AArch64;Mips;PowerPC;SystemZ;JSBackend;MSP430;Sparc;NVPTX;Hexagon",
};
let assertions = if build.config.llvm_assertions {"ON"} else {"OFF"};
@ -107,6 +109,7 @@ pub fn llvm(build: &Build, target: &str) {
cfg.define("LLVM_USE_CRT_DEBUG", "MT");
cfg.define("LLVM_USE_CRT_RELEASE", "MT");
cfg.define("LLVM_USE_CRT_RELWITHDEBINFO", "MT");
cfg.static_crt(true);
}
if target.starts_with("i686") {
@ -129,25 +132,59 @@ pub fn llvm(build: &Build, target: &str) {
.define("LLVM_TABLEGEN", &host);
}
// MSVC handles compiler business itself
if !target.contains("msvc") {
if let Some(ref ccache) = build.config.ccache {
cfg.define("CMAKE_C_COMPILER", ccache)
.define("CMAKE_C_COMPILER_ARG1", build.cc(target))
.define("CMAKE_CXX_COMPILER", ccache)
.define("CMAKE_CXX_COMPILER_ARG1", build.cxx(target));
let sanitize_cc = |cc: &Path| {
if target.contains("msvc") {
OsString::from(cc.to_str().unwrap().replace("\\", "/"))
} else {
cfg.define("CMAKE_C_COMPILER", build.cc(target))
.define("CMAKE_CXX_COMPILER", build.cxx(target));
cc.as_os_str().to_owned()
}
cfg.build_arg("-j").build_arg(build.jobs().to_string());
};
let configure_compilers = |cfg: &mut cmake::Config| {
// MSVC with CMake uses msbuild by default which doesn't respect these
// vars that we'd otherwise configure. In that case we just skip this
// entirely.
if target.contains("msvc") && !build.config.ninja {
return
}
let cc = build.cc(target);
let cxx = build.cxx(target);
// Handle msvc + ninja + ccache specially (this is what the bots use)
if target.contains("msvc") &&
build.config.ninja &&
build.config.ccache.is_some() {
let mut cc = env::current_exe().expect("failed to get cwd");
cc.set_file_name("sccache-plus-cl.exe");
cfg.define("CMAKE_C_COMPILER", sanitize_cc(&cc))
.define("CMAKE_CXX_COMPILER", sanitize_cc(&cc));
cfg.env("SCCACHE_PATH",
build.config.ccache.as_ref().unwrap())
.env("SCCACHE_TARGET", target);
// If ccache is configured we inform the build a little differently how
// to invoke ccache while also invoking our compilers.
} else if let Some(ref ccache) = build.config.ccache {
cfg.define("CMAKE_C_COMPILER", ccache)
.define("CMAKE_C_COMPILER_ARG1", sanitize_cc(cc))
.define("CMAKE_CXX_COMPILER", ccache)
.define("CMAKE_CXX_COMPILER_ARG1", sanitize_cc(cxx));
} else {
cfg.define("CMAKE_C_COMPILER", sanitize_cc(cc))
.define("CMAKE_CXX_COMPILER", sanitize_cc(cxx));
}
cfg.build_arg("-j").build_arg(build.jobs().to_string());
cfg.define("CMAKE_C_FLAGS", build.cflags(target).join(" "));
cfg.define("CMAKE_CXX_FLAGS", build.cflags(target).join(" "));
}
};
configure_compilers(&mut cfg);
if env::var_os("SCCACHE_ERROR_LOG").is_some() {
cfg.env("RUST_LOG", "sccache=info");
cfg.env("RUST_LOG", "sccache=warn");
}
// FIXME: we don't actually need to build all LLVM tools and all LLVM
@ -182,6 +219,7 @@ pub fn test_helpers(build: &Build, target: &str) {
return
}
let _folder = build.fold_output(|| "build_test_helpers");
println!("Building test helpers");
t!(fs::create_dir_all(&dst));
let mut cfg = gcc::Config::new();
@ -274,11 +312,15 @@ pub fn openssl(build: &Build, target: &str) {
configure.arg("no-ssl3");
let os = match target {
"aarch64-linux-android" => "linux-aarch64",
"aarch64-unknown-linux-gnu" => "linux-aarch64",
"arm-linux-androideabi" => "android",
"arm-unknown-linux-gnueabi" => "linux-armv4",
"arm-unknown-linux-gnueabihf" => "linux-armv4",
"armv7-linux-androideabi" => "android-armv7",
"armv7-unknown-linux-gnueabihf" => "linux-armv4",
"i686-apple-darwin" => "darwin-i386-cc",
"i686-linux-android" => "android-x86",
"i686-unknown-freebsd" => "BSD-x86-elf",
"i686-unknown-linux-gnu" => "linux-elf",
"i686-unknown-linux-musl" => "linux-elf",
@ -291,6 +333,7 @@ pub fn openssl(build: &Build, target: &str) {
"powerpc64le-unknown-linux-gnu" => "linux-ppc64le",
"s390x-unknown-linux-gnu" => "linux64-s390x",
"x86_64-apple-darwin" => "darwin64-x86_64-cc",
"x86_64-linux-android" => "linux-x86_64",
"x86_64-unknown-freebsd" => "BSD-x86_64",
"x86_64-unknown-linux-gnu" => "linux-x86_64",
"x86_64-unknown-linux-musl" => "linux-x86_64",
@ -302,6 +345,18 @@ pub fn openssl(build: &Build, target: &str) {
for flag in build.cflags(target) {
configure.arg(flag);
}
// There is no specific os target for android aarch64 or x86_64,
// so we need to pass some extra cflags
if target == "aarch64-linux-android" || target == "x86_64-linux-android" {
configure.arg("-mandroid");
configure.arg("-fomit-frame-pointer");
}
// Make PIE binaries
// Non-PIE linker support was removed in Lollipop
// https://source.android.com/security/enhancements/enhancements50
if target == "i686-linux-android" {
configure.arg("no-asm");
}
configure.current_dir(&obj);
println!("Configuring openssl for {}", target);
build.run_quiet(&mut configure);

View File

@ -69,23 +69,22 @@ pub fn check(build: &mut Build) {
need_cmd("git".as_ref());
}
// We need cmake, but only if we're actually building LLVM
for host in build.config.host.iter() {
if let Some(config) = build.config.target_config.get(host) {
if config.llvm_config.is_some() {
continue
}
}
// We need cmake, but only if we're actually building LLVM or sanitizers.
let building_llvm = build.config.host.iter()
.filter_map(|host| build.config.target_config.get(host))
.any(|config| config.llvm_config.is_none());
if building_llvm || build.config.sanitizers {
need_cmd("cmake".as_ref());
if build.config.ninja {
}
// Ninja is currently only used for LLVM itself.
if building_llvm && build.config.ninja {
// Some Linux distros rename `ninja` to `ninja-build`.
// CMake can work with either binary name.
if have_cmd("ninja-build".as_ref()).is_none() {
need_cmd("ninja".as_ref());
}
}
break
}
if build.config.python.is_none() {
build.config.python = have_cmd("python2.7".as_ref());

View File

@ -28,6 +28,7 @@
use std::collections::{BTreeMap, HashSet, HashMap};
use std::mem;
use std::process;
use check::{self, TestKind};
use compile;
@ -307,7 +308,7 @@ pub fn build_rules<'a>(build: &'a Build) -> Rules {
.dep(|s| s.name("libtest"))
.dep(|s| s.name("tool-compiletest").target(s.host).stage(0))
.dep(|s| s.name("test-helpers"))
.dep(|s| s.name("emulator-copy-libs"))
.dep(|s| s.name("remote-copy-libs"))
.default(mode != "pretty") // pretty tests don't run everywhere
.run(move |s| {
check::compiletest(build, &s.compiler(), s.target, mode, dir)
@ -346,7 +347,7 @@ pub fn build_rules<'a>(build: &'a Build) -> Rules {
.dep(|s| s.name("tool-compiletest").target(s.host).stage(0))
.dep(|s| s.name("test-helpers"))
.dep(|s| s.name("debugger-scripts"))
.dep(|s| s.name("emulator-copy-libs"))
.dep(|s| s.name("remote-copy-libs"))
.run(move |s| check::compiletest(build, &s.compiler(), s.target,
"debuginfo-gdb", "debuginfo"));
let mut rule = rules.test("check-debuginfo", "src/test/debuginfo");
@ -400,14 +401,14 @@ pub fn build_rules<'a>(build: &'a Build) -> Rules {
for (krate, path, _default) in krates("std") {
rules.test(&krate.test_step, path)
.dep(|s| s.name("libtest"))
.dep(|s| s.name("emulator-copy-libs"))
.dep(|s| s.name("remote-copy-libs"))
.run(move |s| check::krate(build, &s.compiler(), s.target,
Mode::Libstd, TestKind::Test,
Some(&krate.name)));
}
rules.test("check-std-all", "path/to/nowhere")
.dep(|s| s.name("libtest"))
.dep(|s| s.name("emulator-copy-libs"))
.dep(|s| s.name("remote-copy-libs"))
.default(true)
.run(move |s| check::krate(build, &s.compiler(), s.target,
Mode::Libstd, TestKind::Test, None));
@ -416,14 +417,14 @@ pub fn build_rules<'a>(build: &'a Build) -> Rules {
for (krate, path, _default) in krates("std") {
rules.bench(&krate.bench_step, path)
.dep(|s| s.name("libtest"))
.dep(|s| s.name("emulator-copy-libs"))
.dep(|s| s.name("remote-copy-libs"))
.run(move |s| check::krate(build, &s.compiler(), s.target,
Mode::Libstd, TestKind::Bench,
Some(&krate.name)));
}
rules.bench("bench-std-all", "path/to/nowhere")
.dep(|s| s.name("libtest"))
.dep(|s| s.name("emulator-copy-libs"))
.dep(|s| s.name("remote-copy-libs"))
.default(true)
.run(move |s| check::krate(build, &s.compiler(), s.target,
Mode::Libstd, TestKind::Bench, None));
@ -431,21 +432,21 @@ pub fn build_rules<'a>(build: &'a Build) -> Rules {
for (krate, path, _default) in krates("test") {
rules.test(&krate.test_step, path)
.dep(|s| s.name("libtest"))
.dep(|s| s.name("emulator-copy-libs"))
.dep(|s| s.name("remote-copy-libs"))
.run(move |s| check::krate(build, &s.compiler(), s.target,
Mode::Libtest, TestKind::Test,
Some(&krate.name)));
}
rules.test("check-test-all", "path/to/nowhere")
.dep(|s| s.name("libtest"))
.dep(|s| s.name("emulator-copy-libs"))
.dep(|s| s.name("remote-copy-libs"))
.default(true)
.run(move |s| check::krate(build, &s.compiler(), s.target,
Mode::Libtest, TestKind::Test, None));
for (krate, path, _default) in krates("rustc-main") {
rules.test(&krate.test_step, path)
.dep(|s| s.name("librustc"))
.dep(|s| s.name("emulator-copy-libs"))
.dep(|s| s.name("remote-copy-libs"))
.host(true)
.run(move |s| check::krate(build, &s.compiler(), s.target,
Mode::Librustc, TestKind::Test,
@ -453,7 +454,7 @@ pub fn build_rules<'a>(build: &'a Build) -> Rules {
}
rules.test("check-rustc-all", "path/to/nowhere")
.dep(|s| s.name("librustc"))
.dep(|s| s.name("emulator-copy-libs"))
.dep(|s| s.name("remote-copy-libs"))
.default(true)
.host(true)
.run(move |s| check::krate(build, &s.compiler(), s.target,
@ -470,6 +471,10 @@ pub fn build_rules<'a>(build: &'a Build) -> Rules {
.dep(|s| s.name("librustc"))
.host(true)
.run(move |s| check::cargotest(build, s.stage, s.target));
rules.test("check-cargo", "cargo")
.dep(|s| s.name("tool-cargo"))
.host(true)
.run(move |s| check::cargo(build, s.stage, s.target));
rules.test("check-tidy", "src/tools/tidy")
.dep(|s| s.name("tool-tidy").stage(0))
.default(true)
@ -488,6 +493,7 @@ pub fn build_rules<'a>(build: &'a Build) -> Rules {
.host(true)
.run(move |s| check::docs(build, &s.compiler()));
rules.test("check-distcheck", "distcheck")
.dep(|s| s.name("dist-plain-source-tarball"))
.dep(|s| s.name("dist-src"))
.run(move |_| check::distcheck(build));
@ -496,33 +502,33 @@ pub fn build_rules<'a>(build: &'a Build) -> Rules {
rules.build("openssl", "path/to/nowhere")
.run(move |s| native::openssl(build, s.target));
// Some test suites are run inside emulators, and most of our test binaries
// are linked dynamically which means we need to ship the standard library
// and such to the emulator ahead of time. This step represents this and is
// a dependency of all test suites.
// Some test suites are run inside emulators or on remote devices, and most
// of our test binaries are linked dynamically which means we need to ship
// the standard library and such to the emulator ahead of time. This step
// represents this and is a dependency of all test suites.
//
// Most of the time this step is a noop (the `check::emulator_copy_libs`
// only does work if necessary). For some steps such as shipping data to
// QEMU we have to build our own tools so we've got conditional dependencies
// on those programs as well. Note that the QEMU client is built for the
// build target (us) and the server is built for the target.
rules.test("emulator-copy-libs", "path/to/nowhere")
// on those programs as well. Note that the remote test client is built for
// the build target (us) and the server is built for the target.
rules.test("remote-copy-libs", "path/to/nowhere")
.dep(|s| s.name("libtest"))
.dep(move |s| {
if build.qemu_rootfs(s.target).is_some() {
s.name("tool-qemu-test-client").target(s.host).stage(0)
if build.remote_tested(s.target) {
s.name("tool-remote-test-client").target(s.host).stage(0)
} else {
Step::noop()
}
})
.dep(move |s| {
if build.qemu_rootfs(s.target).is_some() {
s.name("tool-qemu-test-server")
if build.remote_tested(s.target) {
s.name("tool-remote-test-server")
} else {
Step::noop()
}
})
.run(move |s| check::emulator_copy_libs(build, &s.compiler(), s.target));
.run(move |s| check::remote_copy_libs(build, &s.compiler(), s.target));
rules.test("check-bootstrap", "src/bootstrap")
.default(true)
@ -562,15 +568,21 @@ pub fn build_rules<'a>(build: &'a Build) -> Rules {
.dep(|s| s.name("maybe-clean-tools"))
.dep(|s| s.name("libstd-tool"))
.run(move |s| compile::tool(build, s.stage, s.target, "build-manifest"));
rules.build("tool-qemu-test-server", "src/tools/qemu-test-server")
rules.build("tool-remote-test-server", "src/tools/remote-test-server")
.dep(|s| s.name("maybe-clean-tools"))
.dep(|s| s.name("libstd-tool"))
.run(move |s| compile::tool(build, s.stage, s.target, "qemu-test-server"));
rules.build("tool-qemu-test-client", "src/tools/qemu-test-client")
.run(move |s| compile::tool(build, s.stage, s.target, "remote-test-server"));
rules.build("tool-remote-test-client", "src/tools/remote-test-client")
.dep(|s| s.name("maybe-clean-tools"))
.dep(|s| s.name("libstd-tool"))
.run(move |s| compile::tool(build, s.stage, s.target, "qemu-test-client"));
rules.build("tool-cargo", "cargo")
.run(move |s| compile::tool(build, s.stage, s.target, "remote-test-client"));
rules.build("tool-rust-installer", "src/tools/rust-installer")
.dep(|s| s.name("maybe-clean-tools"))
.dep(|s| s.name("libstd-tool"))
.run(move |s| compile::tool(build, s.stage, s.target, "rust-installer"));
rules.build("tool-cargo", "src/tools/cargo")
.host(true)
.default(build.config.extended)
.dep(|s| s.name("maybe-clean-tools"))
.dep(|s| s.name("libstd-tool"))
.dep(|s| s.stage(0).host(s.target).name("openssl"))
@ -582,7 +594,7 @@ pub fn build_rules<'a>(build: &'a Build) -> Rules {
.host(&build.config.build)
})
.run(move |s| compile::tool(build, s.stage, s.target, "cargo"));
rules.build("tool-rls", "rls")
rules.build("tool-rls", "src/tools/rls")
.host(true)
.dep(|s| s.name("librustc-tool"))
.dep(|s| s.stage(0).host(s.target).name("openssl"))
@ -697,6 +709,7 @@ pub fn build_rules<'a>(build: &'a Build) -> Rules {
.host(true)
.only_host_build(true)
.default(true)
.dep(move |s| tool_rust_installer(build, s))
.run(move |s| dist::rustc(build, s.stage, s.target));
rules.dist("dist-std", "src/libstd")
.dep(move |s| {
@ -711,43 +724,54 @@ pub fn build_rules<'a>(build: &'a Build) -> Rules {
})
.default(true)
.only_host_build(true)
.dep(move |s| tool_rust_installer(build, s))
.run(move |s| dist::std(build, &s.compiler(), s.target));
rules.dist("dist-mingw", "path/to/nowhere")
.default(true)
.only_host_build(true)
.dep(move |s| tool_rust_installer(build, s))
.run(move |s| {
if s.target.contains("pc-windows-gnu") {
dist::mingw(build, s.target)
}
});
rules.dist("dist-plain-source-tarball", "src")
.default(build.config.rust_dist_src)
.host(true)
.only_build(true)
.only_host_build(true)
.dep(move |s| tool_rust_installer(build, s))
.run(move |_| dist::plain_source_tarball(build));
rules.dist("dist-src", "src")
.default(true)
.host(true)
.only_build(true)
.only_host_build(true)
.dep(move |s| tool_rust_installer(build, s))
.run(move |_| dist::rust_src(build));
rules.dist("dist-docs", "src/doc")
.default(true)
.only_host_build(true)
.dep(|s| s.name("default:doc"))
.dep(move |s| tool_rust_installer(build, s))
.run(move |s| dist::docs(build, s.stage, s.target));
rules.dist("dist-analysis", "analysis")
.default(build.config.extended)
.dep(|s| s.name("dist-std"))
.only_host_build(true)
.dep(move |s| tool_rust_installer(build, s))
.run(move |s| dist::analysis(build, &s.compiler(), s.target));
rules.dist("dist-rls", "rls")
.host(true)
.only_host_build(true)
.dep(|s| s.name("tool-rls"))
.dep(move |s| tool_rust_installer(build, s))
.run(move |s| dist::rls(build, s.stage, s.target));
rules.dist("install", "path/to/nowhere")
.dep(|s| s.name("default:dist"))
.run(move |s| install::install(build, s.stage, s.target));
rules.dist("dist-cargo", "cargo")
.host(true)
.only_host_build(true)
.dep(|s| s.name("tool-cargo"))
.dep(move |s| tool_rust_installer(build, s))
.run(move |s| dist::cargo(build, s.stage, s.target));
rules.dist("dist-extended", "extended")
.default(build.config.extended)
@ -758,8 +782,8 @@ pub fn build_rules<'a>(build: &'a Build) -> Rules {
.dep(|d| d.name("dist-mingw"))
.dep(|d| d.name("dist-docs"))
.dep(|d| d.name("dist-cargo"))
.dep(|d| d.name("dist-rls"))
.dep(|d| d.name("dist-analysis"))
.dep(move |s| tool_rust_installer(build, s))
.run(move |s| dist::extended(build, s.stage, s.target));
rules.dist("dist-sign", "hash-and-sign")
@ -769,8 +793,56 @@ pub fn build_rules<'a>(build: &'a Build) -> Rules {
.dep(move |s| s.name("tool-build-manifest").target(&build.config.build).stage(0))
.run(move |_| dist::hash_and_sign(build));
rules.install("install-docs", "src/doc")
.default(build.config.docs)
.only_host_build(true)
.dep(|s| s.name("dist-docs"))
.run(move |s| install::Installer::new(build).install_docs(s.stage, s.target));
rules.install("install-std", "src/libstd")
.default(true)
.only_host_build(true)
.dep(|s| s.name("dist-std"))
.run(move |s| install::Installer::new(build).install_std(s.stage));
rules.install("install-cargo", "cargo")
.default(build.config.extended)
.host(true)
.only_host_build(true)
.dep(|s| s.name("dist-cargo"))
.run(move |s| install::Installer::new(build).install_cargo(s.stage, s.target));
rules.install("install-rls", "rls")
.host(true)
.only_host_build(true)
.dep(|s| s.name("dist-rls"))
.run(move |s| install::Installer::new(build).install_rls(s.stage, s.target));
rules.install("install-analysis", "analysis")
.default(build.config.extended)
.only_host_build(true)
.dep(|s| s.name("dist-analysis"))
.run(move |s| install::Installer::new(build).install_analysis(s.stage, s.target));
rules.install("install-src", "src")
.default(build.config.extended)
.host(true)
.only_build(true)
.only_host_build(true)
.dep(|s| s.name("dist-src"))
.run(move |s| install::Installer::new(build).install_src(s.stage));
rules.install("install-rustc", "src/librustc")
.default(true)
.host(true)
.only_host_build(true)
.dep(|s| s.name("dist-rustc"))
.run(move |s| install::Installer::new(build).install_rustc(s.stage, s.target));
rules.verify();
return rules;
/// Helper to depend on a stage0 build-only rust-installer tool.
fn tool_rust_installer<'a>(build: &'a Build, step: &Step<'a>) -> Step<'a> {
step.name("tool-rust-installer")
.host(&build.config.build)
.target(&build.config.build)
.stage(0)
}
}
#[derive(PartialEq, Eq, Hash, Clone, Debug)]
@ -874,6 +946,7 @@ enum Kind {
Bench,
Dist,
Doc,
Install,
}
impl<'a> Rule<'a> {
@ -1005,6 +1078,12 @@ impl<'a> Rules<'a> {
self.rule(name, path, Kind::Dist)
}
/// Same as `build`, but for `Kind::Install`.
fn install<'b>(&'b mut self, name: &'a str, path: &'a str)
-> RuleBuilder<'a, 'b> {
self.rule(name, path, Kind::Install)
}
fn rule<'b>(&'b mut self,
name: &'a str,
path: &'a str,
@ -1045,6 +1124,7 @@ invalid rule dependency graph detected, was a rule added and maybe typo'd?
"test" => Kind::Test,
"bench" => Kind::Bench,
"dist" => Kind::Dist,
"install" => Kind::Install,
_ => return None,
};
let rules = self.rules.values().filter(|r| r.kind == kind);
@ -1092,15 +1172,10 @@ invalid rule dependency graph detected, was a rule added and maybe typo'd?
let (kind, paths) = match self.build.flags.cmd {
Subcommand::Build { ref paths } => (Kind::Build, &paths[..]),
Subcommand::Doc { ref paths } => (Kind::Doc, &paths[..]),
Subcommand::Test { ref paths, test_args: _ } => (Kind::Test, &paths[..]),
Subcommand::Bench { ref paths, test_args: _ } => (Kind::Bench, &paths[..]),
Subcommand::Dist { ref paths, install } => {
if install {
return vec![self.sbuild.name("install")]
} else {
(Kind::Dist, &paths[..])
}
}
Subcommand::Test { ref paths, .. } => (Kind::Test, &paths[..]),
Subcommand::Bench { ref paths, .. } => (Kind::Bench, &paths[..]),
Subcommand::Dist { ref paths } => (Kind::Dist, &paths[..]),
Subcommand::Install { ref paths } => (Kind::Install, &paths[..]),
Subcommand::Clean => panic!(),
};
@ -1191,6 +1266,13 @@ invalid rule dependency graph detected, was a rule added and maybe typo'd?
self.build.verbose(&format!("executing step {:?}", step));
(self.rules[step.name].run)(step);
}
// Check for postponed failures from `test --no-fail-fast`.
let failures = self.build.delayed_failures.get();
if failures > 0 {
println!("\n{} command(s) did not execute successfully.\n", failures);
process::exit(1);
}
}
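The postponed-failure check above relies on a shared counter that test steps bump instead of exiting immediately. A minimal sketch of that accounting (the struct shape and `record_failure` helper are assumed for illustration; in the source the `delayed_failures` field lives on the full `Build` type):

```rust
use std::cell::Cell;

// Stand-in for the `delayed_failures` counter on `Build` (shape assumed).
struct Build {
    delayed_failures: Cell<usize>,
}

impl Build {
    // A failing step records the failure instead of aborting, so later
    // steps still run under `test --no-fail-fast` (hypothetical helper).
    fn record_failure(&self) {
        self.delayed_failures.set(self.delayed_failures.get() + 1);
    }
}

fn main() {
    let build = Build { delayed_failures: Cell::new(0) };
    build.record_failure();
    build.record_failure();
    let failures = build.delayed_failures.get();
    assert_eq!(failures, 2);
    if failures > 0 {
        println!("\n{} command(s) did not execute successfully.\n", failures);
        // the real code calls process::exit(1) at this point
    }
}
```

`Cell` gives interior mutability without locking, which fits a single-threaded build driver holding the counter behind a shared reference.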
/// From the top level targets `steps` generate a topological ordering of
@ -1319,10 +1401,6 @@ mod tests {
use config::Config;
use flags::Flags;
macro_rules! a {
($($a:expr),*) => (vec![$($a.to_string()),*])
}
fn build(args: &[&str],
extra_host: &[&str],
extra_target: &[&str]) -> Build {


@ -16,10 +16,10 @@
use std::env;
use std::ffi::OsString;
use std::fs;
use std::io;
use std::io::{self, Write};
use std::path::{Path, PathBuf};
use std::process::Command;
use std::time::Instant;
use std::time::{SystemTime, Instant};
use filetime::{self, FileTime};
@ -139,6 +139,8 @@ pub fn dylib_path_var() -> &'static str {
"PATH"
} else if cfg!(target_os = "macos") {
"DYLD_LIBRARY_PATH"
} else if cfg!(target_os = "haiku") {
"LIBRARY_PATH"
} else {
"LD_LIBRARY_PATH"
}
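The `cfg!` chain above picks the dynamic-library search-path variable per OS. It can be restated as a plain mapping; `dylib_path_var_for` is a hypothetical helper for illustration, not part of rustbuild:

```rust
// Maps an OS name to its dynamic-library search-path environment variable,
// mirroring the cfg! chain above (illustrative helper, not in the source).
fn dylib_path_var_for(os: &str) -> &'static str {
    match os {
        "windows" => "PATH",
        "macos" => "DYLD_LIBRARY_PATH",
        "haiku" => "LIBRARY_PATH",
        _ => "LD_LIBRARY_PATH", // Linux and other Unix-likes
    }
}

fn main() {
    assert_eq!(dylib_path_var_for("haiku"), "LIBRARY_PATH");
    assert_eq!(dylib_path_var_for("linux"), "LD_LIBRARY_PATH");
}
```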
@ -322,3 +324,102 @@ pub fn symlink_dir(src: &Path, dest: &Path) -> io::Result<()> {
}
}
}
/// An RAII structure that indicates all output until this instance is dropped
/// is part of the same group.
///
/// On Travis CI, this output will be folded by default, together with the
/// elapsed time for the block. This reduces noise from unnecessary logs,
/// allowing developers to quickly identify the error.
///
/// Travis CI supports folding by printing `travis_fold:start:<name>` and
/// `travis_fold:end:<name>` around the block. Time elapsed is recognized
/// similarly with `travis_time:[start|end]:<name>`. These are undocumented, but
/// can easily be deduced from source code of the [Travis build commands].
///
/// [Travis build commands]:
/// https://github.com/travis-ci/travis-build/blob/f603c0089/lib/travis/build/templates/header.sh
pub struct OutputFolder {
name: String,
start_time: SystemTime, // we need SystemTime to get the UNIX timestamp.
}
impl OutputFolder {
/// Creates a new output folder with the given group name.
pub fn new(name: String) -> OutputFolder {
// "\r" moves the cursor to the beginning of the line, and "\x1b[0K" is
// the ANSI escape code to clear from the cursor to end of line.
// Travis seems to have trouble when _not_ using "\r\x1b[0K"; without it,
// lines are randomly moved to the top of the webpage.
print!("travis_fold:start:{0}\r\x1b[0Ktravis_time:start:{0}\r\x1b[0K", name);
OutputFolder {
name,
start_time: SystemTime::now(),
}
}
}
impl Drop for OutputFolder {
fn drop(&mut self) {
use std::time::*;
use std::u64;
fn to_nanos(duration: Result<Duration, SystemTimeError>) -> u64 {
match duration {
Ok(d) => d.as_secs() * 1_000_000_000 + d.subsec_nanos() as u64,
Err(_) => u64::MAX,
}
}
let end_time = SystemTime::now();
let duration = end_time.duration_since(self.start_time);
let start = self.start_time.duration_since(UNIX_EPOCH);
let finish = end_time.duration_since(UNIX_EPOCH);
println!(
"travis_fold:end:{0}\r\x1b[0K\n\
travis_time:end:{0}:start={1},finish={2},duration={3}\r\x1b[0K",
self.name,
to_nanos(start),
to_nanos(finish),
to_nanos(duration)
);
io::stdout().flush().unwrap();
}
}
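The timestamp arithmetic in `Drop` above converts `SystemTime` differences to nanoseconds, collapsing clock errors to `u64::MAX`. The same conversion, standalone so it can be checked directly (logic copied from `to_nanos` above; the `Result` error type is simplified for illustration):

```rust
use std::time::Duration;

// Same nanosecond conversion as `to_nanos` above; a failed clock reading
// degrades to u64::MAX rather than panicking (error type simplified).
fn to_nanos(duration: Result<Duration, ()>) -> u64 {
    match duration {
        Ok(d) => d.as_secs() * 1_000_000_000 + d.subsec_nanos() as u64,
        Err(_) => std::u64::MAX,
    }
}

fn main() {
    assert_eq!(to_nanos(Ok(Duration::new(2, 500))), 2_000_000_500);
    assert_eq!(to_nanos(Err(())), std::u64::MAX);
}
```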
/// The CI environment rustbuild is running in. This mainly affects how the logs
/// are printed.
#[derive(Copy, Clone, PartialEq, Eq, Debug)]
pub enum CiEnv {
/// Not a CI environment.
None,
/// The Travis CI environment, for Linux (including Docker) and macOS builds.
Travis,
/// The AppVeyor environment, for Windows builds.
AppVeyor,
}
impl CiEnv {
/// Obtains the current CI environment.
pub fn current() -> CiEnv {
if env::var("TRAVIS").ok().map_or(false, |e| &*e == "true") {
CiEnv::Travis
} else if env::var("APPVEYOR").ok().map_or(false, |e| &*e == "True") {
CiEnv::AppVeyor
} else {
CiEnv::None
}
}
/// If in a CI environment, forces the command to run with colors.
pub fn force_coloring_in_ci(self, cmd: &mut Command) {
if self != CiEnv::None {
// Due to use of stamp/docker, the output stream of rustbuild is not
// a TTY in CI, so coloring is by-default turned off.
// The explicit `TERM=xterm` environment is needed for
// `--color always` to actually work. This env var was lost when
// compiling through the Makefile. Very strange.
cmd.env("TERM", "xterm").args(&["--color", "always"]);
}
}
}
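`CiEnv::current` above distinguishes the two CI providers by the exact casing of their marker variables (`TRAVIS=true` versus `APPVEYOR=True`). A pure sketch of that detection, taking the values as arguments so the logic can be exercised without mutating process-wide environment state (`detect` is a hypothetical helper):

```rust
#[derive(Copy, Clone, PartialEq, Eq, Debug)]
enum CiEnv {
    None,
    Travis,
    AppVeyor,
}

// Pure variant of `CiEnv::current` above (hypothetical helper): same
// comparisons, but on explicit arguments instead of std::env::var.
fn detect(travis: Option<&str>, appveyor: Option<&str>) -> CiEnv {
    if travis == Some("true") {
        CiEnv::Travis
    } else if appveyor == Some("True") { // note AppVeyor capitalizes "True"
        CiEnv::AppVeyor
    } else {
        CiEnv::None
    }
}

fn main() {
    assert_eq!(detect(Some("true"), None), CiEnv::Travis);
    assert_eq!(detect(None, Some("True")), CiEnv::AppVeyor);
    assert_eq!(detect(None, None), CiEnv::None);
}
```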


@ -42,35 +42,49 @@ pub fn run(cmd: &mut Command) {
}
pub fn run_silent(cmd: &mut Command) {
if !try_run_silent(cmd) {
std::process::exit(1);
}
}
pub fn try_run_silent(cmd: &mut Command) -> bool {
let status = match cmd.status() {
Ok(status) => status,
Err(e) => fail(&format!("failed to execute command: {:?}\nerror: {}",
cmd, e)),
};
if !status.success() {
fail(&format!("command did not execute successfully: {:?}\n\
expected success, got: {}",
println!("\n\ncommand did not execute successfully: {:?}\n\
expected success, got: {}\n\n",
cmd,
status));
status);
}
status.success()
}
pub fn run_suppressed(cmd: &mut Command) {
if !try_run_suppressed(cmd) {
std::process::exit(1);
}
}
pub fn try_run_suppressed(cmd: &mut Command) -> bool {
let output = match cmd.output() {
Ok(status) => status,
Err(e) => fail(&format!("failed to execute command: {:?}\nerror: {}",
cmd, e)),
};
if !output.status.success() {
fail(&format!("command did not execute successfully: {:?}\n\
println!("\n\ncommand did not execute successfully: {:?}\n\
expected success, got: {}\n\n\
stdout ----\n{}\n\
stderr ----\n{}\n",
stderr ----\n{}\n\n",
cmd,
output.status,
String::from_utf8_lossy(&output.stdout),
String::from_utf8_lossy(&output.stderr)));
String::from_utf8_lossy(&output.stderr));
}
output.status.success()
}
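The refactor above splits "run and abort" (`run_silent`) from "run and report" (`try_run_silent`), which is what lets `--no-fail-fast` keep going after a failure. A minimal sketch of the pattern (`try_run` is an assumed name; error messages follow the source's wording):

```rust
use std::process::Command;

// Runs a command and reports failure without aborting, returning success
// as a bool so the caller decides whether to stop (illustrative sketch).
fn try_run(cmd: &mut Command) -> bool {
    match cmd.status() {
        Ok(status) => {
            if !status.success() {
                println!("command did not execute successfully: {:?}\n\
                          expected success, got: {}", cmd, status);
            }
            status.success()
        }
        Err(e) => {
            println!("failed to execute command: {:?}\nerror: {}", cmd, e);
            false
        }
    }
}

fn main() {
    assert!(try_run(Command::new("sh").args(&["-c", "exit 0"])));
    assert!(!try_run(Command::new("sh").args(&["-c", "exit 1"])));
}
```

A wrapper like `run_silent` then just calls `try_run` and exits on `false`, preserving the old fail-fast behavior.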
pub fn gnu_target(target: &str) -> String {
@ -198,7 +212,11 @@ pub fn native_lib_boilerplate(src_name: &str,
let out_dir = env::var_os("RUSTBUILD_NATIVE_DIR").unwrap_or(env::var_os("OUT_DIR").unwrap());
let out_dir = PathBuf::from(out_dir).join(out_name);
t!(create_dir_racy(&out_dir));
if link_name.contains('=') {
println!("cargo:rustc-link-lib={}", link_name);
} else {
println!("cargo:rustc-link-lib=static={}", link_name);
}
println!("cargo:rustc-link-search=native={}", out_dir.join(search_subdir).display());
let timestamp = out_dir.join("rustbuild.timestamp");
@ -209,6 +227,21 @@ pub fn native_lib_boilerplate(src_name: &str,
}
}
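The `link_name.contains('=')` branch above lets a caller supply an explicit linkage kind (e.g. `dylib=...`, as the sanitizer code below does for macOS) while defaulting to static linking. A small sketch of that decision (`link_directive` is a hypothetical helper that returns the directive instead of printing it):

```rust
// Builds the cargo link directive, defaulting to `static=` unless the
// caller already embedded a kind prefix such as `dylib=` (illustrative).
fn link_directive(link_name: &str) -> String {
    if link_name.contains('=') {
        format!("cargo:rustc-link-lib={}", link_name)
    } else {
        format!("cargo:rustc-link-lib=static={}", link_name)
    }
}

fn main() {
    assert_eq!(link_directive("backtrace"),
               "cargo:rustc-link-lib=static=backtrace");
    assert_eq!(link_directive("dylib=clang_rt.asan_osx_dynamic"),
               "cargo:rustc-link-lib=dylib=clang_rt.asan_osx_dynamic");
}
```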
pub fn sanitizer_lib_boilerplate(sanitizer_name: &str) -> Result<NativeLibBoilerplate, ()> {
let (link_name, search_path) = match &*env::var("TARGET").unwrap() {
"x86_64-unknown-linux-gnu" => (
format!("clang_rt.{}-x86_64", sanitizer_name),
"build/lib/linux",
),
"x86_64-apple-darwin" => (
format!("dylib=clang_rt.{}_osx_dynamic", sanitizer_name),
"build/lib/darwin",
),
_ => return Err(()),
};
native_lib_boilerplate("compiler-rt", sanitizer_name, &link_name, search_path)
}
fn dir_up_to_date(src: &Path, threshold: &FileTime) -> bool {
t!(fs::read_dir(src)).map(|e| t!(e)).all(|e| {
let meta = t!(e.metadata());


@ -16,6 +16,12 @@ for example:
Images will output artifacts in an `obj` dir at the root of a repository.
## Filesystem layout
- Each directory, excluding `scripts` and `disabled`, corresponds to a docker image
- `scripts` contains files shared by docker images
- `disabled` contains images that are not built on travis
## Cross toolchains
A number of these images take quite a long time to compile as they're building

View File

@ -1,46 +1,60 @@
FROM ubuntu:16.04
RUN apt-get update && \
apt-get install -y --no-install-recommends \
ca-certificates \
cmake \
curl \
file \
g++ \
git \
libssl-dev \
make \
pkg-config \
python2.7 \
sudo \
unzip \
xz-utils
# dumb-init
COPY scripts/dumb-init.sh /scripts/
RUN sh /scripts/dumb-init.sh
# ndk
COPY scripts/android-ndk.sh /scripts/
RUN . /scripts/android-ndk.sh && \
download_and_make_toolchain android-ndk-r13b-linux-x86_64.zip arm 9
# sdk
RUN dpkg --add-architecture i386 && \
apt-get update && \
apt-get install -y --no-install-recommends \
g++ \
make \
file \
curl \
ca-certificates \
python2.7 \
git \
cmake \
unzip \
expect \
openjdk-9-jre-headless \
sudo \
libgl1-mesa-glx \
libpulse0 \
libstdc++6:i386 \
xz-utils \
libssl-dev \
pkg-config
openjdk-9-jre-headless \
tzdata
WORKDIR /android/
ENV PATH=$PATH:/android/ndk-arm-9/bin:/android/sdk/tools:/android/sdk/platform-tools
COPY scripts/android-sdk.sh /scripts/
RUN . /scripts/android-sdk.sh && \
download_and_create_avd tools_r25.2.5-linux.zip armeabi-v7a 18
COPY install-ndk.sh install-sdk.sh accept-licenses.sh /android/
RUN sh /android/install-ndk.sh
RUN sh /android/install-sdk.sh
# env
ENV PATH=$PATH:/android/sdk/tools
ENV PATH=$PATH:/android/sdk/platform-tools
RUN curl -OL https://github.com/Yelp/dumb-init/releases/download/v1.2.0/dumb-init_1.2.0_amd64.deb && \
dpkg -i dumb-init_*.deb && \
rm dumb-init_*.deb
COPY start-emulator.sh /android/
ENTRYPOINT ["/usr/bin/dumb-init", "--", "/android/start-emulator.sh"]
RUN curl -o /usr/local/bin/sccache \
https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-02-24-sccache-x86_64-unknown-linux-gnu && \
chmod +x /usr/local/bin/sccache
ENV TARGETS=arm-linux-androideabi
ENV RUST_CONFIGURE_ARGS \
--target=arm-linux-androideabi \
--arm-linux-androideabi-ndk=/android/ndk-arm-9
--target=$TARGETS \
--arm-linux-androideabi-ndk=/android/ndk/arm-9
ENV SCRIPT python2.7 ../x.py test --target arm-linux-androideabi
ENV SCRIPT python2.7 ../x.py test --target $TARGETS
# sccache
COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh
# init
COPY scripts/android-start-emulator.sh /scripts/
ENTRYPOINT ["/usr/bin/dumb-init", "--", "/scripts/android-start-emulator.sh"]


@ -1,15 +0,0 @@
#!/usr/bin/expect -f
# ignore-license
set timeout 1800
set cmd [lindex $argv 0]
set licenses [lindex $argv 1]
spawn {*}$cmd
expect {
"Do you accept the license '*'*" {
exp_send "y\r"
exp_continue
}
eof
}


@ -1,33 +0,0 @@
#!/bin/sh
# Copyright 2016 The Rust Project Developers. See the COPYRIGHT
# file at the top-level directory of this distribution and at
# http://rust-lang.org/COPYRIGHT.
#
# Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
# http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
# <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
# option. This file may not be copied, modified, or distributed
# except according to those terms.
set -ex
cpgdb() {
cp android-ndk-r11c/prebuilt/linux-x86_64/bin/gdb /android/$1/bin/$2-gdb
cp android-ndk-r11c/prebuilt/linux-x86_64/bin/gdb-orig /android/$1/bin/gdb-orig
cp -r android-ndk-r11c/prebuilt/linux-x86_64/share /android/$1/share
}
# Prep the Android NDK
#
# See https://github.com/servo/servo/wiki/Building-for-Android
curl -O https://dl.google.com/android/repository/android-ndk-r11c-linux-x86_64.zip
unzip -q android-ndk-r11c-linux-x86_64.zip
bash android-ndk-r11c/build/tools/make-standalone-toolchain.sh \
--platform=android-9 \
--toolchain=arm-linux-androideabi-4.9 \
--install-dir=/android/ndk-arm-9 \
--ndk-dir=/android/android-ndk-r11c \
--arch=arm
cpgdb ndk-arm-9 arm-linux-androideabi
rm -rf ./android-ndk-r11c-linux-x86_64.zip ./android-ndk-r11c


@ -1,33 +0,0 @@
#!/bin/sh
# Copyright 2016 The Rust Project Developers. See the COPYRIGHT
# file at the top-level directory of this distribution and at
# http://rust-lang.org/COPYRIGHT.
#
# Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
# http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
# <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
# option. This file may not be copied, modified, or distributed
# except according to those terms.
set -ex
# Prep the SDK and emulator
#
# Note that the update process requires that we accept a bunch of licenses, and
# we can't just pipe `yes` into it for some reason, so we take the same strategy
# located in https://github.com/appunite/docker by just wrapping it in a script
# which apparently magically accepts the licenses.
mkdir sdk
curl https://dl.google.com/android/android-sdk_r24.4-linux.tgz | \
tar xzf - -C sdk --strip-components=1
filter="platform-tools,android-18"
filter="$filter,sys-img-armeabi-v7a-android-18"
./accept-licenses.sh "android - update sdk -a --no-ui --filter $filter"
echo "no" | android create avd \
--name arm-18 \
--target android-18 \
--abi armeabi-v7a


@ -31,7 +31,7 @@ WORKDIR /build
# The `vexpress_config` config file was a previously generated config file for
# the kernel. This file was generated by running `make vexpress_defconfig`
# followed by `make menuconfig` and then enabling the IPv6 protocol page.
COPY vexpress_config /build/.config
COPY armhf-gnu/vexpress_config /build/.config
RUN curl https://cdn.kernel.org/pub/linux/kernel/v4.x/linux-4.4.42.tar.xz | \
tar xJf - && \
cd /build/linux-4.4.42 && \
@ -63,18 +63,18 @@ RUN curl http://cdimage.ubuntu.com/ubuntu-base/releases/16.04/release/ubuntu-bas
# Copy over our init script, which starts up our test server and also a few
# other misc tasks.
COPY rcS rootfs/etc/init.d/rcS
COPY armhf-gnu/rcS rootfs/etc/init.d/rcS
RUN chmod +x rootfs/etc/init.d/rcS
# Helper to quickly fill the entropy pool in the kernel.
COPY addentropy.c /tmp/
COPY armhf-gnu/addentropy.c /tmp/
RUN arm-linux-gnueabihf-gcc addentropy.c -o rootfs/addentropy -static
# TODO: What is this?!
RUN curl -O http://ftp.nl.debian.org/debian/dists/jessie/main/installer-armhf/current/images/device-tree/vexpress-v2p-ca15-tc1.dtb
RUN curl -o /usr/local/bin/sccache \
https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-04-04-sccache-x86_64-unknown-linux-musl && \
https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-05-12-sccache-x86_64-unknown-linux-musl && \
chmod +x /usr/local/bin/sccache
RUN curl -OL https://github.com/Yelp/dumb-init/releases/download/v1.2.0/dumb-init_1.2.0_amd64.deb && \


@ -21,40 +21,23 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
libssl-dev \
pkg-config
RUN curl -o /usr/local/bin/sccache \
https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-04-04-sccache-x86_64-unknown-linux-musl && \
chmod +x /usr/local/bin/sccache
RUN curl -OL https://github.com/Yelp/dumb-init/releases/download/v1.2.0/dumb-init_1.2.0_amd64.deb && \
dpkg -i dumb-init_*.deb && \
rm dumb-init_*.deb
ENTRYPOINT ["/usr/bin/dumb-init", "--"]
# dumb-init
COPY scripts/dumb-init.sh /scripts/
RUN sh /scripts/dumb-init.sh
WORKDIR /tmp
COPY build-rumprun.sh /tmp/
COPY cross/build-rumprun.sh /tmp/
RUN ./build-rumprun.sh
COPY build-arm-musl.sh /tmp/
COPY cross/build-arm-musl.sh /tmp/
RUN ./build-arm-musl.sh
# originally from
# https://downloads.openwrt.org/snapshots/trunk/ar71xx/generic/OpenWrt-Toolchain-ar71xx-generic_gcc-5.3.0_musl-1.1.16.Linux-x86_64.tar.bz2
RUN mkdir /usr/local/mips-linux-musl
RUN curl -L https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/OpenWrt-Toolchain-ar71xx-generic_gcc-5.3.0_musl-1.1.16.Linux-x86_64.tar.bz2 | \
tar xjf - -C /usr/local/mips-linux-musl --strip-components=2
RUN for file in /usr/local/mips-linux-musl/bin/mips-openwrt-linux-*; do \
ln -s $file /usr/local/bin/`basename $file`; \
done
COPY cross/install-mips-musl.sh /tmp/
RUN ./install-mips-musl.sh
# Note that this originally came from:
# https://downloads.openwrt.org/snapshots/trunk/malta/generic/OpenWrt-Toolchain-malta-le_gcc-5.3.0_musl-1.1.15.Linux-x86_64.tar.bz2
RUN mkdir /usr/local/mipsel-linux-musl
RUN curl -L https://s3.amazonaws.com/rust-lang-ci/libc/OpenWrt-Toolchain-malta-le_gcc-5.3.0_musl-1.1.15.Linux-x86_64.tar.bz2 | \
tar xjf - -C /usr/local/mipsel-linux-musl --strip-components=2
RUN for file in /usr/local/mipsel-linux-musl/bin/mipsel-openwrt-linux-*; do \
ln -s $file /usr/local/bin/`basename $file`; \
done
COPY cross/install-mipsel-musl.sh /tmp/
RUN ./install-mipsel-musl.sh
ENV TARGETS=asmjs-unknown-emscripten
ENV TARGETS=$TARGETS,wasm32-unknown-emscripten
@ -80,3 +63,10 @@ ENV RUST_CONFIGURE_ARGS \
--musl-root-armhf=/usr/local/arm-linux-musleabihf \
--musl-root-armv7=/usr/local/armv7-linux-musleabihf
ENV SCRIPT python2.7 ../x.py dist --target $TARGETS
# sccache
COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh
# init
ENTRYPOINT ["/usr/bin/dumb-init", "--"]


@ -0,0 +1,24 @@
# Copyright 2017 The Rust Project Developers. See the COPYRIGHT
# file at the top-level directory of this distribution and at
# http://rust-lang.org/COPYRIGHT.
#
# Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
# http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
# <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
# option. This file may not be copied, modified, or distributed
# except according to those terms.
set -ex
mkdir /usr/local/mips-linux-musl
# originally from
# https://downloads.openwrt.org/snapshots/trunk/ar71xx/generic/
# OpenWrt-Toolchain-ar71xx-generic_gcc-5.3.0_musl-1.1.16.Linux-x86_64.tar.bz2
URL="https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror"
FILE="OpenWrt-Toolchain-ar71xx-generic_gcc-5.3.0_musl-1.1.16.Linux-x86_64.tar.bz2"
curl -L "$URL/$FILE" | tar xjf - -C /usr/local/mips-linux-musl --strip-components=2
for file in /usr/local/mips-linux-musl/bin/mips-openwrt-linux-*; do
ln -s $file /usr/local/bin/`basename $file`
done


@ -0,0 +1,24 @@
# Copyright 2017 The Rust Project Developers. See the COPYRIGHT
# file at the top-level directory of this distribution and at
# http://rust-lang.org/COPYRIGHT.
#
# Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
# http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
# <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
# option. This file may not be copied, modified, or distributed
# except according to those terms.
set -ex
mkdir /usr/local/mipsel-linux-musl
# Note that this originally came from:
# https://downloads.openwrt.org/snapshots/trunk/malta/generic/
# OpenWrt-Toolchain-malta-le_gcc-5.3.0_musl-1.1.15.Linux-x86_64.tar.bz2
URL="https://s3.amazonaws.com/rust-lang-ci/libc"
FILE="OpenWrt-Toolchain-malta-le_gcc-5.3.0_musl-1.1.15.Linux-x86_64.tar.bz2"
curl -L "$URL/$FILE" | tar xjf - -C /usr/local/mipsel-linux-musl --strip-components=2
for file in /usr/local/mipsel-linux-musl/bin/mipsel-openwrt-linux-*; do
ln -s $file /usr/local/bin/`basename $file`
done


@ -0,0 +1,50 @@
FROM ubuntu:16.04
RUN apt-get update && \
apt-get install -y --no-install-recommends \
ca-certificates \
cmake \
curl \
file \
g++ \
git \
libssl-dev \
make \
pkg-config \
python2.7 \
sudo \
unzip \
xz-utils
# dumb-init
COPY scripts/dumb-init.sh /scripts/
RUN sh /scripts/dumb-init.sh
# ndk
COPY scripts/android-ndk.sh /scripts/
RUN . /scripts/android-ndk.sh && \
download_and_make_toolchain android-ndk-r13b-linux-x86_64.zip arm64 21
# env
ENV PATH=$PATH:/android/ndk/arm64-21/bin
ENV DEP_Z_ROOT=/android/ndk/arm64-21/sysroot/usr/
ENV HOSTS=aarch64-linux-android
ENV RUST_CONFIGURE_ARGS \
--host=$HOSTS \
--target=$HOSTS \
--aarch64-linux-android-ndk=/android/ndk/arm64-21 \
--disable-rpath \
--enable-extended \
--enable-cargo-openssl-static
ENV SCRIPT python2.7 ../x.py dist --target $HOSTS --host $HOSTS
# sccache
COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh
# init
ENTRYPOINT ["/usr/bin/dumb-init", "--"]


@ -0,0 +1,68 @@
FROM ubuntu:16.04
RUN apt-get update && \
apt-get install -y --no-install-recommends \
ca-certificates \
cmake \
curl \
file \
g++ \
git \
libssl-dev \
make \
pkg-config \
python2.7 \
sudo \
unzip \
xz-utils
# dumb-init
COPY scripts/dumb-init.sh /scripts/
RUN sh /scripts/dumb-init.sh
# ndk
COPY scripts/android-ndk.sh /scripts/
RUN . /scripts/android-ndk.sh && \
download_ndk android-ndk-r13b-linux-x86_64.zip && \
make_standalone_toolchain arm 9 && \
make_standalone_toolchain arm 21 && \
remove_ndk
RUN chmod 777 /android/ndk && \
ln -s /android/ndk/arm-21 /android/ndk/arm
# env
ENV PATH=$PATH:/android/ndk/arm-9/bin
ENV DEP_Z_ROOT=/android/ndk/arm-9/sysroot/usr/
ENV HOSTS=armv7-linux-androideabi
ENV RUST_CONFIGURE_ARGS \
--host=$HOSTS \
--target=$HOSTS \
--armv7-linux-androideabi-ndk=/android/ndk/arm \
--disable-rpath \
--enable-extended \
--enable-cargo-openssl-static
# We support api level 9, but api level 21 is required to build llvm. To
# overcome this problem we use an NDK with api level 21 to build llvm, then
# switch to an NDK with api level 9 to complete the build. When the linker is
# invoked there are missing symbols (like sigsetempty, not available at api
# level 9); the default linker behavior is to generate an error, so to allow
# the build to finish we pass --warn-unresolved-symbols. Note that the missing
# symbols do not affect std, only the compiler (llvm) and cargo (openssl).
python2.7 ../x.py build src/llvm --host $HOSTS --target $HOSTS && \
(export RUSTFLAGS="\"-C link-arg=-Wl,--warn-unresolved-symbols\""; \
rm /android/ndk/arm && \
ln -s /android/ndk/arm-9 /android/ndk/arm && \
python2.7 ../x.py dist --host $HOSTS --target $HOSTS)
# sccache
COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh
# init
ENTRYPOINT ["/usr/bin/dumb-init", "--"]
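The symlink swap described in the comment above can be sketched in isolation. This is a hedged sketch only: the `/tmp/ndk-swap-demo` paths are placeholders standing in for the real `/android/ndk` layout, not part of the actual build.

```shell
# Sketch of the api-level symlink swap: point a stable path at the api-21
# toolchain first (needed to build llvm), then repoint the same path at
# api 9 for the rest of the build. Paths are hypothetical placeholders.
set -e
root=/tmp/ndk-swap-demo
rm -rf "$root"
mkdir -p "$root/arm-9" "$root/arm-21"

ln -s "$root/arm-21" "$root/arm"   # api 21 toolchain: required for llvm
readlink "$root/arm"               # prints /tmp/ndk-swap-demo/arm-21

rm "$root/arm"                     # swap the link, as the SCRIPT above does
ln -s "$root/arm-9" "$root/arm"    # api 9 toolchain: remaining stages
readlink "$root/arm"               # prints /tmp/ndk-swap-demo/arm-9
```

The point of the indirection is that the build configuration keeps referring to the one stable path (`--armv7-linux-androideabi-ndk=/android/ndk/arm` above); only the link target changes between the two phases.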


@@ -0,0 +1,68 @@
FROM ubuntu:16.04
RUN apt-get update && \
apt-get install -y --no-install-recommends \
ca-certificates \
cmake \
curl \
file \
g++ \
git \
libssl-dev \
make \
pkg-config \
python2.7 \
sudo \
unzip \
xz-utils
# dumb-init
COPY scripts/dumb-init.sh /scripts/
RUN sh /scripts/dumb-init.sh
# ndk
COPY scripts/android-ndk.sh /scripts/
RUN . /scripts/android-ndk.sh && \
download_ndk android-ndk-r13b-linux-x86_64.zip && \
make_standalone_toolchain x86 9 && \
make_standalone_toolchain x86 21 && \
remove_ndk
RUN chmod 777 /android/ndk && \
ln -s /android/ndk/x86-21 /android/ndk/x86
# env
ENV PATH=$PATH:/android/ndk/x86-9/bin
ENV DEP_Z_ROOT=/android/ndk/x86-9/sysroot/usr/
ENV HOSTS=i686-linux-android
ENV RUST_CONFIGURE_ARGS \
--host=$HOSTS \
--target=$HOSTS \
--i686-linux-android-ndk=/android/ndk/x86 \
--disable-rpath \
--enable-extended \
--enable-cargo-openssl-static
# We support api level 9, but api level 21 is required to build llvm. To
# overcome this we use an NDK with api level 21 to build llvm and then
# switch to an NDK with api level 9 to complete the build. When the linker is
# invoked there are missing symbols (like sigemptyset, not available with api
# level 9); the default linker behavior is to generate an error, so to allow
# the build to finish we pass --warn-unresolved-symbols. Note that the missing
# symbols do not affect std, only the compiler (llvm) and cargo (openssl).
ENV SCRIPT \
python2.7 ../x.py build src/llvm --host $HOSTS --target $HOSTS && \
(export RUSTFLAGS="\"-C link-arg=-Wl,--warn-unresolved-symbols\""; \
rm /android/ndk/x86 && \
ln -s /android/ndk/x86-9 /android/ndk/x86 && \
python2.7 ../x.py dist --host $HOSTS --target $HOSTS)
# sccache
COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh
# init
ENTRYPOINT ["/usr/bin/dumb-init", "--"]


@@ -0,0 +1,50 @@
FROM ubuntu:16.04
RUN apt-get update && \
apt-get install -y --no-install-recommends \
ca-certificates \
cmake \
curl \
file \
g++ \
git \
libssl-dev \
make \
pkg-config \
python2.7 \
sudo \
unzip \
xz-utils
# dumb-init
COPY scripts/dumb-init.sh /scripts/
RUN sh /scripts/dumb-init.sh
# ndk
COPY scripts/android-ndk.sh /scripts/
RUN . /scripts/android-ndk.sh && \
download_and_make_toolchain android-ndk-r13b-linux-x86_64.zip x86_64 21
# env
ENV PATH=$PATH:/android/ndk/x86_64-21/bin
ENV DEP_Z_ROOT=/android/ndk/x86_64-21/sysroot/usr/
ENV HOSTS=x86_64-linux-android
ENV RUST_CONFIGURE_ARGS \
--host=$HOSTS \
--target=$HOSTS \
--x86_64-linux-android-ndk=/android/ndk/x86_64-21 \
--disable-rpath \
--enable-extended \
--enable-cargo-openssl-static
ENV SCRIPT python2.7 ../x.py dist --target $HOSTS --host $HOSTS
# sccache
COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh
# init
ENTRYPOINT ["/usr/bin/dumb-init", "--"]


@@ -56,13 +56,13 @@ RUN mkdir /x-tools && chown rustbuild:rustbuild /x-tools
USER rustbuild
WORKDIR /tmp
COPY aarch64-linux-gnu.config build-toolchains.sh /tmp/
COPY dist-aarch64-linux/aarch64-linux-gnu.config dist-aarch64-linux/build-toolchains.sh /tmp/
RUN ./build-toolchains.sh
USER root
RUN curl -o /usr/local/bin/sccache \
https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-04-04-sccache-x86_64-unknown-linux-musl && \
https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-05-12-sccache-x86_64-unknown-linux-musl && \
chmod +x /usr/local/bin/sccache
ENV PATH=$PATH:/x-tools/aarch64-unknown-linux-gnueabi/bin


@@ -1,40 +1,36 @@
FROM ubuntu:16.04
RUN dpkg --add-architecture i386 && \
apt-get update && \
RUN apt-get update && \
apt-get install -y --no-install-recommends \
g++ \
make \
file \
curl \
ca-certificates \
python2.7 \
git \
cmake \
unzip \
expect \
openjdk-9-jre \
sudo \
libstdc++6:i386 \
xz-utils \
curl \
file \
g++ \
git \
libssl-dev \
pkg-config
make \
pkg-config \
python2.7 \
sudo \
unzip \
xz-utils
WORKDIR /android/
ENV PATH=$PATH:/android/ndk-arm-9/bin:/android/sdk/tools:/android/sdk/platform-tools
# dumb-init
COPY scripts/dumb-init.sh /scripts/
RUN sh /scripts/dumb-init.sh
COPY install-ndk.sh /android/
RUN sh /android/install-ndk.sh
RUN curl -OL https://github.com/Yelp/dumb-init/releases/download/v1.2.0/dumb-init_1.2.0_amd64.deb && \
dpkg -i dumb-init_*.deb && \
rm dumb-init_*.deb
ENTRYPOINT ["/usr/bin/dumb-init", "--"]
RUN curl -o /usr/local/bin/sccache \
https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-04-04-sccache-x86_64-unknown-linux-musl && \
chmod +x /usr/local/bin/sccache
# ndk
COPY scripts/android-ndk.sh /scripts/
RUN . /scripts/android-ndk.sh && \
download_ndk android-ndk-r13b-linux-x86_64.zip && \
make_standalone_toolchain arm 9 && \
make_standalone_toolchain x86 9 && \
make_standalone_toolchain arm64 21 && \
make_standalone_toolchain x86_64 21 && \
remove_ndk
# env
ENV TARGETS=arm-linux-androideabi
ENV TARGETS=$TARGETS,armv7-linux-androideabi
ENV TARGETS=$TARGETS,i686-linux-android
@@ -44,10 +40,17 @@ ENV TARGETS=$TARGETS,x86_64-linux-android
ENV RUST_CONFIGURE_ARGS \
--target=$TARGETS \
--enable-extended \
--arm-linux-androideabi-ndk=/android/ndk-arm-9 \
--armv7-linux-androideabi-ndk=/android/ndk-arm-9 \
--i686-linux-android-ndk=/android/ndk-x86-9 \
--aarch64-linux-android-ndk=/android/ndk-arm64-21 \
--x86_64-linux-android-ndk=/android/ndk-x86_64-21
--arm-linux-androideabi-ndk=/android/ndk/arm-9 \
--armv7-linux-androideabi-ndk=/android/ndk/arm-9 \
--i686-linux-android-ndk=/android/ndk/x86-9 \
--aarch64-linux-android-ndk=/android/ndk/arm64-21 \
--x86_64-linux-android-ndk=/android/ndk/x86_64-21
ENV SCRIPT python2.7 ../x.py dist --target $TARGETS
# cache
COPY scripts/sccache.sh /scripts/
RUN sh /scripts/sccache.sh
# init
ENTRYPOINT ["/usr/bin/dumb-init", "--"]


@@ -1,44 +0,0 @@
#!/bin/sh
# Copyright 2016 The Rust Project Developers. See the COPYRIGHT
# file at the top-level directory of this distribution and at
# http://rust-lang.org/COPYRIGHT.
#
# Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
# http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
# <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
# option. This file may not be copied, modified, or distributed
# except according to those terms.
set -ex
# Prep the Android NDK
#
# See https://github.com/servo/servo/wiki/Building-for-Android
curl -O https://dl.google.com/android/repository/android-ndk-r11c-linux-x86_64.zip
unzip -q android-ndk-r11c-linux-x86_64.zip
bash android-ndk-r11c/build/tools/make-standalone-toolchain.sh \
--platform=android-9 \
--toolchain=arm-linux-androideabi-4.9 \
--install-dir=/android/ndk-arm-9 \
--ndk-dir=/android/android-ndk-r11c \
--arch=arm
bash android-ndk-r11c/build/tools/make-standalone-toolchain.sh \
--platform=android-21 \
--toolchain=aarch64-linux-android-4.9 \
--install-dir=/android/ndk-arm64-21 \
--ndk-dir=/android/android-ndk-r11c \
--arch=arm64
bash android-ndk-r11c/build/tools/make-standalone-toolchain.sh \
--platform=android-9 \
--toolchain=x86-4.9 \
--install-dir=/android/ndk-x86-9 \
--ndk-dir=/android/android-ndk-r11c \
--arch=x86
bash android-ndk-r11c/build/tools/make-standalone-toolchain.sh \
--platform=android-21 \
--toolchain=x86_64-4.9 \
--install-dir=/android/ndk-x86_64-21 \
--ndk-dir=/android/android-ndk-r11c \
--arch=x86_64
rm -rf ./android-ndk-r11c-linux-x86_64.zip ./android-ndk-r11c


@@ -56,13 +56,13 @@ RUN mkdir /x-tools && chown rustbuild:rustbuild /x-tools
USER rustbuild
WORKDIR /tmp
COPY arm-linux-gnueabi.config build-toolchains.sh /tmp/
COPY dist-arm-linux/arm-linux-gnueabi.config dist-arm-linux/build-toolchains.sh /tmp/
RUN ./build-toolchains.sh
USER root
RUN curl -o /usr/local/bin/sccache \
https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-04-04-sccache-x86_64-unknown-linux-musl && \
https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-05-12-sccache-x86_64-unknown-linux-musl && \
chmod +x /usr/local/bin/sccache
ENV PATH=$PATH:/x-tools/arm-unknown-linux-gnueabi/bin


@@ -56,13 +56,13 @@ RUN mkdir /x-tools && chown rustbuild:rustbuild /x-tools
USER rustbuild
WORKDIR /tmp
COPY arm-linux-gnueabihf.config build-toolchains.sh /tmp/
COPY dist-armhf-linux/arm-linux-gnueabihf.config dist-armhf-linux/build-toolchains.sh /tmp/
RUN ./build-toolchains.sh
USER root
RUN curl -o /usr/local/bin/sccache \
https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-04-04-sccache-x86_64-unknown-linux-musl && \
https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-05-12-sccache-x86_64-unknown-linux-musl && \
chmod +x /usr/local/bin/sccache
ENV PATH=$PATH:/x-tools/arm-unknown-linux-gnueabihf/bin


@@ -56,13 +56,13 @@ RUN mkdir /x-tools && chown rustbuild:rustbuild /x-tools
USER rustbuild
WORKDIR /tmp
COPY build-toolchains.sh armv7-linux-gnueabihf.config /tmp/
COPY dist-armv7-linux/build-toolchains.sh dist-armv7-linux/armv7-linux-gnueabihf.config /tmp/
RUN ./build-toolchains.sh
USER root
RUN curl -o /usr/local/bin/sccache \
https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-04-04-sccache-x86_64-unknown-linux-musl && \
https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-05-12-sccache-x86_64-unknown-linux-musl && \
chmod +x /usr/local/bin/sccache
ENV PATH=$PATH:/x-tools/armv7-unknown-linux-gnueabihf/bin


@@ -21,7 +21,7 @@ RUN curl -L https://cmake.org/files/v3.8/cmake-3.8.0-rc1-Linux-x86_64.tar.gz | \
tar xzf - -C /usr/local --strip-components=1
WORKDIR /tmp
COPY shared.sh build-toolchain.sh compiler-rt-dso-handle.patch /tmp/
COPY dist-fuchsia/shared.sh dist-fuchsia/build-toolchain.sh dist-fuchsia/compiler-rt-dso-handle.patch /tmp/
RUN /tmp/build-toolchain.sh
RUN curl -OL https://github.com/Yelp/dumb-init/releases/download/v1.2.0/dumb-init_1.2.0_amd64.deb && \
@@ -30,7 +30,7 @@ RUN curl -OL https://github.com/Yelp/dumb-init/releases/download/v1.2.0/dumb-ini
ENTRYPOINT ["/usr/bin/dumb-init", "--"]
RUN curl -o /usr/local/bin/sccache \
https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-04-04-sccache-x86_64-unknown-linux-musl && \
https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-05-12-sccache-x86_64-unknown-linux-musl && \
chmod +x /usr/local/bin/sccache
ENV \


@@ -17,7 +17,7 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
pkg-config
WORKDIR /build/
COPY musl-libunwind-patch.patch build-musl.sh /build/
COPY dist-i586-gnu-i686-musl/musl-libunwind-patch.patch dist-i586-gnu-i686-musl/build-musl.sh /build/
RUN sh /build/build-musl.sh && rm -rf /build
RUN curl -OL https://github.com/Yelp/dumb-init/releases/download/v1.2.0/dumb-init_1.2.0_amd64.deb && \
@@ -26,7 +26,7 @@ RUN curl -OL https://github.com/Yelp/dumb-init/releases/download/v1.2.0/dumb-ini
ENTRYPOINT ["/usr/bin/dumb-init", "--"]
RUN curl -o /usr/local/bin/sccache \
https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-04-04-sccache-x86_64-unknown-linux-musl && \
https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-05-12-sccache-x86_64-unknown-linux-musl && \
chmod +x /usr/local/bin/sccache
ENV RUST_CONFIGURE_ARGS \


@@ -16,7 +16,7 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
libssl-dev \
pkg-config
COPY build-toolchain.sh /tmp/
COPY dist-i686-freebsd/build-toolchain.sh /tmp/
RUN /tmp/build-toolchain.sh i686
RUN curl -OL https://github.com/Yelp/dumb-init/releases/download/v1.2.0/dumb-init_1.2.0_amd64.deb && \
@@ -25,7 +25,7 @@ RUN curl -OL https://github.com/Yelp/dumb-init/releases/download/v1.2.0/dumb-ini
ENTRYPOINT ["/usr/bin/dumb-init", "--"]
RUN curl -o /usr/local/bin/sccache \
https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-04-04-sccache-x86_64-unknown-linux-musl && \
https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-05-12-sccache-x86_64-unknown-linux-musl && \
chmod +x /usr/local/bin/sccache
ENV \


@@ -29,13 +29,13 @@ ENV PATH=/rustroot/bin:$PATH
ENV LD_LIBRARY_PATH=/rustroot/lib64:/rustroot/lib
ENV PKG_CONFIG_PATH=/rustroot/lib/pkgconfig
WORKDIR /tmp
COPY shared.sh build-binutils.sh /tmp/
COPY dist-i686-linux/shared.sh dist-i686-linux/build-binutils.sh /tmp/
# We need a build of openssl which supports SNI to download artifacts from
# static.rust-lang.org. This'll be used to link into libcurl below (and used
# later as well), so build a copy of OpenSSL with dynamic libraries into our
# generic root.
COPY build-openssl.sh /tmp/
COPY dist-i686-linux/build-openssl.sh /tmp/
RUN ./build-openssl.sh
# The `curl` binary on CentOS doesn't support SNI, which is needed for fetching
@@ -44,7 +44,7 @@ RUN ./build-openssl.sh
#
# Note that we also disable a bunch of optional features of curl that we don't
# really need.
COPY build-curl.sh /tmp/
COPY dist-i686-linux/build-curl.sh /tmp/
RUN ./build-curl.sh
# binutils < 2.22 has a bug where the 32-bit executables it generates
@@ -54,26 +54,26 @@ RUN ./build-curl.sh
RUN ./build-binutils.sh
# Need a newer version of gcc than centos has to compile LLVM nowadays
COPY build-gcc.sh /tmp/
COPY dist-i686-linux/build-gcc.sh /tmp/
RUN ./build-gcc.sh
# CentOS 5.5 has Python 2.4 by default, but LLVM needs 2.7+
COPY build-python.sh /tmp/
COPY dist-i686-linux/build-python.sh /tmp/
RUN ./build-python.sh
# Apparently CentOS 5.5 doesn't have `git` in yum, but we'll need it for
# cloning, so download and build it here.
COPY build-git.sh /tmp/
COPY dist-i686-linux/build-git.sh /tmp/
RUN ./build-git.sh
# libssh2 (a dependency of Cargo) requires cmake 2.8.11 or higher but CentOS
# only has 2.6.4, so build our own
COPY build-cmake.sh /tmp/
COPY dist-i686-linux/build-cmake.sh /tmp/
RUN ./build-cmake.sh
# for sanitizers, we need kernel header files newer than the ones CentOS ships
# with, so we install newer ones here
COPY build-headers.sh /tmp/
COPY dist-i686-linux/build-headers.sh /tmp/
RUN ./build-headers.sh
RUN curl -Lo /rustroot/dumb-init \
@@ -82,7 +82,7 @@ RUN curl -Lo /rustroot/dumb-init \
ENTRYPOINT ["/rustroot/dumb-init", "--"]
RUN curl -o /usr/local/bin/sccache \
https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-04-04-sccache-x86_64-unknown-linux-musl && \
https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-05-12-sccache-x86_64-unknown-linux-musl && \
chmod +x /usr/local/bin/sccache
ENV HOSTS=i686-unknown-linux-gnu


@@ -17,7 +17,7 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
pkg-config
RUN curl -o /usr/local/bin/sccache \
https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-04-04-sccache-x86_64-unknown-linux-musl && \
https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-05-12-sccache-x86_64-unknown-linux-musl && \
chmod +x /usr/local/bin/sccache
RUN curl -OL https://github.com/Yelp/dumb-init/releases/download/v1.2.0/dumb-init_1.2.0_amd64.deb && \


@@ -17,7 +17,7 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
pkg-config
RUN curl -o /usr/local/bin/sccache \
https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-04-04-sccache-x86_64-unknown-linux-musl && \
https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-05-12-sccache-x86_64-unknown-linux-musl && \
chmod +x /usr/local/bin/sccache
RUN curl -OL https://github.com/Yelp/dumb-init/releases/download/v1.2.0/dumb-init_1.2.0_amd64.deb && \


@@ -17,7 +17,7 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
pkg-config
RUN curl -o /usr/local/bin/sccache \
https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-04-04-sccache-x86_64-unknown-linux-musl && \
https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-05-12-sccache-x86_64-unknown-linux-musl && \
chmod +x /usr/local/bin/sccache
RUN curl -OL https://github.com/Yelp/dumb-init/releases/download/v1.2.0/dumb-init_1.2.0_amd64.deb && \


@@ -17,7 +17,7 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
pkg-config
RUN curl -o /usr/local/bin/sccache \
https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-04-04-sccache-x86_64-unknown-linux-musl && \
https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-05-12-sccache-x86_64-unknown-linux-musl && \
chmod +x /usr/local/bin/sccache
RUN curl -OL https://github.com/Yelp/dumb-init/releases/download/v1.2.0/dumb-init_1.2.0_amd64.deb && \


@@ -56,14 +56,14 @@ RUN mkdir /x-tools && chown rustbuild:rustbuild /x-tools
USER rustbuild
WORKDIR /tmp
COPY patches/ /tmp/patches/
COPY powerpc-linux-gnu.config build-powerpc-toolchain.sh /tmp/
COPY dist-powerpc-linux/patches/ /tmp/patches/
COPY dist-powerpc-linux/powerpc-linux-gnu.config dist-powerpc-linux/build-powerpc-toolchain.sh /tmp/
RUN ./build-powerpc-toolchain.sh
USER root
RUN curl -o /usr/local/bin/sccache \
https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-04-04-sccache-x86_64-unknown-linux-musl && \
https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-05-12-sccache-x86_64-unknown-linux-musl && \
chmod +x /usr/local/bin/sccache
ENV PATH=$PATH:/x-tools/powerpc-unknown-linux-gnu/bin


@@ -56,14 +56,14 @@ RUN mkdir /x-tools && chown rustbuild:rustbuild /x-tools
USER rustbuild
WORKDIR /tmp
COPY patches/ /tmp/patches/
COPY shared.sh powerpc64-linux-gnu.config build-powerpc64-toolchain.sh /tmp/
COPY dist-powerpc64-linux/patches/ /tmp/patches/
COPY dist-powerpc64-linux/shared.sh dist-powerpc64-linux/powerpc64-linux-gnu.config dist-powerpc64-linux/build-powerpc64-toolchain.sh /tmp/
RUN ./build-powerpc64-toolchain.sh
USER root
RUN curl -o /usr/local/bin/sccache \
https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-04-04-sccache-x86_64-unknown-linux-musl && \
https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-05-12-sccache-x86_64-unknown-linux-musl && \
chmod +x /usr/local/bin/sccache
ENV PATH=$PATH:/x-tools/powerpc64-unknown-linux-gnu/bin


@@ -59,11 +59,11 @@ WORKDIR /tmp
USER root
RUN apt-get install -y --no-install-recommends rpm2cpio cpio
COPY shared.sh build-powerpc64le-toolchain.sh /tmp/
COPY dist-powerpc64le-linux/shared.sh dist-powerpc64le-linux/build-powerpc64le-toolchain.sh /tmp/
RUN ./build-powerpc64le-toolchain.sh
RUN curl -o /usr/local/bin/sccache \
https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-04-04-sccache-x86_64-unknown-linux-musl && \
https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-05-12-sccache-x86_64-unknown-linux-musl && \
chmod +x /usr/local/bin/sccache
ENV \

Some files were not shown because too many files have changed in this diff.