New upstream version 1.18.0+dfsg1

Ximin Luo 2017-06-21 00:27:43 +02:00
parent f8e3f18f33
commit cc61c64bd2
2135 changed files with 141237 additions and 35303 deletions


@ -311,9 +311,13 @@ To save @bors some work, and to get small changes through more quickly, when
the other rollup-eligible patches too, and they'll get tested and merged at
the same time.
To find documentation-related issues, sort by the [T-doc label][tdoc].
[tdoc]: https://github.com/rust-lang/rust/issues?q=is%3Aopen%20is%3Aissue%20label%3AT-doc
You can find documentation style guidelines in [RFC 1574][rfc1574].
[rfc1574]: https://github.com/rust-lang/rfcs/blob/master/text/1574-more-api-documentation-conventions.md#appendix-a-full-conventions-text
In many cases, you don't need a full `./x.py doc`. You can use `rustdoc` directly
to check small fixes. For example, `rustdoc src/doc/reference.md` will render


@ -16,7 +16,7 @@ Read ["Installing Rust"] from [The Book].
1. Make sure you have installed the dependencies:
   * `g++` 4.7 or later or `clang++` 3.x or later
   * `python` 2.7 (but not 3.x)
   * GNU `make` 3.81 or later
   * `cmake` 3.4.3 or later
@ -161,8 +161,9 @@ If youd like to build the documentation, its almost the same:
$ ./x.py doc
```
The generated documentation will appear under `doc` in the `build` directory for
the ABI used. I.e., if the ABI was `x86_64-pc-windows-msvc`, the directory will be
`build\x86_64-pc-windows-msvc\doc`.
## Notes
@ -197,8 +198,8 @@ The Rust community congregates in a few places:
* [users.rust-lang.org] - General discussion and broader questions.
* [/r/rust] - News and general discussion.
[Stack Overflow]: https://stackoverflow.com/questions/tagged/rust
[/r/rust]: https://reddit.com/r/rust
[users.rust-lang.org]: https://users.rust-lang.org/
## Contributing


@ -1,10 +1,417 @@
Version 1.18.0 (2017-06-08)
===========================
Language
--------
- [Stabilize pub(restricted)][40556] `pub` can now accept a module path to
make the item visible to just that module tree. Also accepts the keyword
`crate` to make something public to the whole crate but not users of the
library. Example: `pub(crate) mod utils;`. [RFC 1422].
- [Stabilize `#![windows_subsystem]` attribute][40870] conservative exposure of the
`/SUBSYSTEM` linker flag on Windows platforms. [RFC 1665].
- [Refactor of trait object type parsing][40043] Now `ty` in macros can accept
  types like `Write + Send`, a trailing `+` is now supported in trait objects,
  and error reporting for trait objects starting with `?Sized` has improved.
- [0e+10 is now a valid floating point literal][40589]
- [Now warns if you bind a lifetime parameter to 'static][40734]
- [Tuples, Enum variant fields, and structs with no `repr` attribute or with
`#[repr(Rust)]` are reordered to minimize padding and produce a smaller
representation in some cases.][40377]
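The newly stabilized `pub(restricted)` forms can be sketched in a minimal, self-contained example (module and function names here are invented for illustration):

```rust
mod outer {
    // Visible anywhere in this crate, but not to downstream users of the library.
    pub(crate) fn crate_visible() -> u32 {
        inner::helper()
    }

    mod inner {
        // Visible only to the parent module (`outer`).
        pub(super) fn helper() -> u32 {
            41
        }
    }
}

fn main() {
    // `crate_visible` is reachable crate-wide; `outer::inner::helper` is not.
    assert_eq!(outer::crate_visible() + 1, 42);
}
```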
Compiler
--------
- [rustc can now emit mir with `--emit mir`][39891]
- [Improved LLVM IR for trivial functions][40367]
- [Added explanation for E0090(Wrong number of lifetimes are supplied)][40723]
- [rustc compilation is now 15%-20% faster][41469] thanks to optimisation
  opportunities found through profiling.
- [Improved backtrace formatting when panicking][38165]
Libraries
---------
- [Specialized `Vec::from_iter` being passed `vec::IntoIter`][40731] If the
  iterator hasn't been advanced, the original `Vec` is reassembled with no
  actual iteration or reallocation.
- [Simplified HashMap Bucket interface][40561] provides performance
improvements for iterating and cloning.
- [Specialize Vec::from_elem to use calloc][40409]
- [Fixed Race condition in fs::create_dir_all][39799]
- [No longer caching stdio on Windows][40516]
- [Optimized insertion sort in slice][40807] Insertion sort is now roughly
  2.5% faster in some cases and 12.5% faster in one case.
- [Optimized `AtomicBool::fetch_nand`][41143]
Stabilized APIs
---------------
- [`Child::try_wait`]
- [`HashMap::retain`]
- [`HashSet::retain`]
- [`PeekMut::pop`]
- [`TcpStream::peek`]
- [`UdpSocket::peek`]
- [`UdpSocket::peek_from`]
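A few of the newly stabilized collection APIs can be exercised together; this is a minimal sketch with invented data, not code from the release itself:

```rust
use std::collections::binary_heap::PeekMut;
use std::collections::{BinaryHeap, HashMap};

// `HashMap::retain` keeps only entries matching the predicate;
// return the new length for convenience.
fn retain_even(m: &mut HashMap<i32, i32>) -> usize {
    m.retain(|&k, _| k % 2 == 0);
    m.len()
}

// `PeekMut::pop` removes the peeked (maximum) element through the handle.
fn pop_max(h: &mut BinaryHeap<i32>) -> Option<i32> {
    h.peek_mut().map(PeekMut::pop)
}

fn main() {
    let mut m: HashMap<i32, i32> = (0..8).map(|i| (i, i * i)).collect();
    assert_eq!(retain_even(&mut m), 4); // keys 0, 2, 4, 6 remain

    let mut h: BinaryHeap<i32> = vec![3, 1, 4].into_iter().collect();
    assert_eq!(pop_max(&mut h), Some(4));
    assert_eq!(h.len(), 2);
}
```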
Cargo
-----
- [Added partial Pijul support][cargo/3842] Pijul is a version control system in Rust.
You can now create new cargo projects with Pijul using `cargo new --vcs pijul`
- [Now always emits build script warnings for crates that fail to build][cargo/3847]
- [Added Android build support][cargo/3885]
- [Added `--bins` and `--tests` flags][cargo/3901] Now you can build all
  programs of a certain type; for example, `cargo build --bins` will build all
  binaries.
- [Added support for haiku][cargo/3952]
Misc
----
- [rustdoc can now use pulldown-cmark with the `--enable-commonmark` flag][40338]
- [Added rust-windbg script for better debugging on Windows][39983]
- [Rust now uses the official cross compiler for NetBSD][40612]
- [rustdoc now accepts `#` at the start of files][40828]
- [Fixed jemalloc support for musl][41168]
Compatibility Notes
-------------------
- [Changes to how the `0` flag works in format!][40241] Padding zeroes are now
always placed after the sign if it exists and before the digits. With the `#`
flag the zeroes are placed after the prefix and before the digits.
- [Due to the struct field optimisation][40377], using `transmute` on structs
that have no `repr` attribute or `#[repr(Rust)]` will no longer work. This has
always been undefined behavior, but is now more likely to break in practice.
- [The refactor of trait object type parsing][40043] fixed a bug where `+` was
receiving the wrong priority parsing things like `&for<'a> Tr<'a> + Send` as
`&(for<'a> Tr<'a> + Send)` instead of `(&for<'a> Tr<'a>) + Send`
- [Overlapping inherent `impl`s are now a hard error][40728]
- [`PartialOrd` and `Ord` must agree on the ordering.][41270]
- [`rustc main.rs -o out --emit=asm,llvm-ir`][41085] Now will output
`out.asm` and `out.ll` instead of only one of the filetypes.
- [Calling a function that returns `Self` will no longer work][41805] when
  the size of `Self` cannot be statically determined.
- [rustc now builds with a "pthreads" flavour of MinGW for Windows GNU][40805]
  This has caused a few regressions, namely:
  - Changed the link order of local static/dynamic libraries (respecting the
    order given rather than having the compiler reorder it).
- Changed how MinGW is linked, native code linked to dynamic libraries
may require manually linking to the gcc support library (for the native
code itself)
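The `0`-flag change in `format!` described above can be sketched with a few formatting calls (the literals are invented for illustration; this shows the behaviour as of 1.18):

```rust
fn main() {
    // Padding zeroes now go after the sign, if one exists...
    assert_eq!(format!("{:+06}", 42), "+00042");
    assert_eq!(format!("{:06}", -42), "-00042");
    // ...and, with the `#` flag, after the prefix and before the digits.
    assert_eq!(format!("{:#06x}", 255), "0x00ff");
}
```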
[38165]: https://github.com/rust-lang/rust/pull/38165
[39799]: https://github.com/rust-lang/rust/pull/39799
[39891]: https://github.com/rust-lang/rust/pull/39891
[39983]: https://github.com/rust-lang/rust/pull/39983
[40043]: https://github.com/rust-lang/rust/pull/40043
[40241]: https://github.com/rust-lang/rust/pull/40241
[40338]: https://github.com/rust-lang/rust/pull/40338
[40367]: https://github.com/rust-lang/rust/pull/40367
[40377]: https://github.com/rust-lang/rust/pull/40377
[40409]: https://github.com/rust-lang/rust/pull/40409
[40516]: https://github.com/rust-lang/rust/pull/40516
[40556]: https://github.com/rust-lang/rust/pull/40556
[40561]: https://github.com/rust-lang/rust/pull/40561
[40589]: https://github.com/rust-lang/rust/pull/40589
[40612]: https://github.com/rust-lang/rust/pull/40612
[40723]: https://github.com/rust-lang/rust/pull/40723
[40728]: https://github.com/rust-lang/rust/pull/40728
[40731]: https://github.com/rust-lang/rust/pull/40731
[40734]: https://github.com/rust-lang/rust/pull/40734
[40805]: https://github.com/rust-lang/rust/pull/40805
[40807]: https://github.com/rust-lang/rust/pull/40807
[40828]: https://github.com/rust-lang/rust/pull/40828
[40870]: https://github.com/rust-lang/rust/pull/40870
[41085]: https://github.com/rust-lang/rust/pull/41085
[41143]: https://github.com/rust-lang/rust/pull/41143
[41168]: https://github.com/rust-lang/rust/pull/41168
[41270]: https://github.com/rust-lang/rust/issues/41270
[41469]: https://github.com/rust-lang/rust/pull/41469
[41805]: https://github.com/rust-lang/rust/issues/41805
[RFC 1422]: https://github.com/rust-lang/rfcs/blob/master/text/1422-pub-restricted.md
[RFC 1665]: https://github.com/rust-lang/rfcs/blob/master/text/1665-windows-subsystem.md
[`Child::try_wait`]: https://doc.rust-lang.org/std/process/struct.Child.html#method.try_wait
[`HashMap::retain`]: https://doc.rust-lang.org/std/collections/struct.HashMap.html#method.retain
[`HashSet::retain`]: https://doc.rust-lang.org/std/collections/struct.HashSet.html#method.retain
[`PeekMut::pop`]: https://doc.rust-lang.org/std/collections/binary_heap/struct.PeekMut.html#method.pop
[`TcpStream::peek`]: https://doc.rust-lang.org/std/net/struct.TcpStream.html#method.peek
[`UdpSocket::peek_from`]: https://doc.rust-lang.org/std/net/struct.UdpSocket.html#method.peek_from
[`UdpSocket::peek`]: https://doc.rust-lang.org/std/net/struct.UdpSocket.html#method.peek
[cargo/3842]: https://github.com/rust-lang/cargo/pull/3842
[cargo/3847]: https://github.com/rust-lang/cargo/pull/3847
[cargo/3885]: https://github.com/rust-lang/cargo/pull/3885
[cargo/3901]: https://github.com/rust-lang/cargo/pull/3901
[cargo/3952]: https://github.com/rust-lang/cargo/pull/3952
Version 1.17.0 (2017-04-27)
===========================
Language
--------
* [The lifetime of statics and consts defaults to `'static`][39265]. [RFC 1623]
* [Fields of structs may be initialized without duplicating the field/variable
names][39761]. [RFC 1682]
* [`Self` may be included in the `where` clause of `impls`][38864]. [RFC 1647]
* [When coercing to an unsized type lifetimes must be equal][40319]. That is,
there is no subtyping between `T` and `U` when `T: Unsize<U>`. For example,
coercing `&mut [&'a X; N]` to `&mut [&'b X]` requires `'a` be equal to
`'b`. Soundness fix.
* [Values passed to the indexing operator, `[]`, automatically coerce][40166]
* [Static variables may contain references to other statics][40027]
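The field-init shorthand from RFC 1682 can be sketched as follows (the struct and bindings are invented for illustration):

```rust
#[derive(Debug, PartialEq)]
struct Point {
    x: i32,
    y: i32,
}

fn main() {
    let x = 3;
    let y = 4;
    // Shorthand: `x` and `y` stand in for `x: x` and `y: y`.
    let p = Point { x, y };
    assert_eq!(p, Point { x: 3, y: 4 });
}
```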
Compiler
--------
* [Exit quickly on only `--emit dep-info`][40336]
* [Make `-C relocation-model` more correctly determine whether the linker
creates a position-independent executable][40245]
* [Add `-C overflow-checks` to directly control whether integer overflow
panics][40037]
* [The rustc type checker now checks items on demand instead of in a single
in-order pass][40008]. This is mostly an internal refactoring in support of
future work, including incremental type checking, but also resolves [RFC
1647], allowing `Self` to appear in `impl` `where` clauses.
* [Optimize vtable loads][39995]
* [Turn off vectorization for Emscripten targets][39990]
* [Provide suggestions for unknown macros imported with `use`][39953]
* [Fix ICEs in path resolution][39939]
* [Strip exception handling code on Emscripten when `panic=abort`][39193]
* [Add clearer error message using `&str + &str`][39116]
Stabilized APIs
---------------
* [`Arc::into_raw`]
* [`Arc::from_raw`]
* [`Arc::ptr_eq`]
* [`Rc::into_raw`]
* [`Rc::from_raw`]
* [`Rc::ptr_eq`]
* [`Ordering::then`]
* [`Ordering::then_with`]
* [`BTreeMap::range`]
* [`BTreeMap::range_mut`]
* [`collections::Bound`]
* [`process::abort`]
* [`ptr::read_unaligned`]
* [`ptr::write_unaligned`]
* [`Result::expect_err`]
* [`Cell::swap`]
* [`Cell::replace`]
* [`Cell::into_inner`]
* [`Cell::take`]
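Some of the newly stabilized `Ordering` and `Cell` methods can be sketched together (a minimal example; the comparison function and data are invented):

```rust
use std::cell::Cell;
use std::cmp::Ordering;

// `then_with` consults the second comparison only when the first is Equal.
fn by_len_then_alpha(a: &str, b: &str) -> Ordering {
    a.len().cmp(&b.len()).then_with(|| a.cmp(b))
}

fn main() {
    assert_eq!(by_len_then_alpha("bb", "aa"), Ordering::Greater);
    assert_eq!(by_len_then_alpha("a", "bb"), Ordering::Less);

    // `Cell` now also works with non-Copy types (RFC 1651),
    // which makes `replace` and `take` useful for owned values.
    let cell = Cell::new(vec![1, 2, 3]);
    let old = cell.replace(vec![9]);
    assert_eq!(old, vec![1, 2, 3]);
    assert_eq!(cell.take(), vec![9]); // `take` leaves Default::default()
}
```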
Libraries
---------
* [`BTreeMap` and `BTreeSet` can iterate over ranges][27787]
* [`Cell` can store non-`Copy` types][39793]. [RFC 1651]
* [`String` implements `FromIterator<&char>`][40028]
* `Box` [implements][40009] a number of new conversions:
`From<Box<str>> for String`,
`From<Box<[T]>> for Vec<T>`,
`From<Box<CStr>> for CString`,
`From<Box<OsStr>> for OsString`,
`From<Box<Path>> for PathBuf`,
`Into<Box<str>> for String`,
`Into<Box<[T]>> for Vec<T>`,
`Into<Box<CStr>> for CString`,
`Into<Box<OsStr>> for OsString`,
`Into<Box<Path>> for PathBuf`,
`Default for Box<str>`,
`Default for Box<CStr>`,
`Default for Box<OsStr>`,
`From<&CStr> for Box<CStr>`,
`From<&OsStr> for Box<OsStr>`,
`From<&Path> for Box<Path>`
* [`ffi::FromBytesWithNulError` implements `Error` and `Display`][39960]
* [Specialize `PartialOrd<A> for [A] where A: Ord`][39642]
* [Slightly optimize `slice::sort`][39538]
* [Add `ToString` trait specialization for `Cow<'a, str>` and `String`][39440]
* [`Box<[T]>` implements `From<&[T]> where T: Copy`,
`Box<str>` implements `From<&str>`][39438]
* [`IpAddr` implements `From` for various arrays. `SocketAddr` implements
`From<(I, u16)> where I: Into<IpAddr>`][39372]
* [`format!` estimates the needed capacity before writing a string][39356]
* [Support unprivileged symlink creation in Windows][38921]
* [`PathBuf` implements `Default`][38764]
* [Implement `PartialEq<[A]>` for `VecDeque<A>`][38661]
* [`HashMap` resizes adaptively][38368] to guard against DOS attacks
and poor hash functions.
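A few of the new conversion impls listed above can be exercised in a short sketch (addresses and values are invented):

```rust
use std::net::{IpAddr, Ipv4Addr, SocketAddr};

fn main() {
    // IpAddr now has From impls for address arrays...
    let ip: IpAddr = [127, 0, 0, 1].into();
    assert_eq!(ip, IpAddr::V4(Ipv4Addr::new(127, 0, 0, 1)));

    // ...and SocketAddr implements From<(I, u16)> where I: Into<IpAddr>.
    let addr = SocketAddr::from(([127, 0, 0, 1], 8080));
    assert_eq!(addr.port(), 8080);

    // Box<str> now implements From<&str>.
    let boxed: Box<str> = Box::from("hello");
    assert_eq!(&*boxed, "hello");
}
```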
Cargo
-----
* [Add `cargo check --all`][cargo/3731]
* [Add an option to ignore SSL revocation checking][cargo/3699]
* [Add `cargo run --package`][cargo/3691]
* [Add `required_features`][cargo/3667]
* [Assume `build.rs` is a build script][cargo/3664]
* [Find workspace via `workspace_root` link in containing member][cargo/3562]
Misc
----
* [Documentation is rendered with mdbook instead of the obsolete, in-tree
`rustbook`][39633]
* [The "Unstable Book" documents nightly-only features][ubook]
* [Improve the style of the sidebar in rustdoc output][40265]
* [Configure build correctly on 64-bit CPU's with the armhf ABI][40261]
* [Fix MSP430 breakage due to `i128`][40257]
* [Preliminary Solaris/SPARCv9 support][39903]
* [`rustc` is linked statically on Windows MSVC targets][39837], allowing it to
run without installing the MSVC runtime.
* [`rustdoc --test` includes file names in test names][39788]
* This release includes builds of `std` for `sparc64-unknown-linux-gnu`,
`aarch64-unknown-linux-fuchsia`, and `x86_64-unknown-linux-fuchsia`.
* [Initial support for `aarch64-unknown-freebsd`][39491]
* [Initial support for `i686-unknown-netbsd`][39426]
* [This release no longer includes the old makefile build system][39431]. Rust
is built with a custom build system, written in Rust, and with Cargo.
* [Add Debug implementations for libcollection structs][39002]
* [`TypeId` implements `PartialOrd` and `Ord`][38981]
* [`--test-threads=0` produces an error][38945]
* [`rustup` installs documentation by default][40526]
* [The Rust source includes NatVis visualizations][39843]. These can be used by
WinDbg and Visual Studio to improve the debugging experience.
Compatibility Notes
-------------------
* [Rust 1.17 does not correctly detect the MSVC 2017 linker][38584]. As a
workaround, either use MSVC 2015 or run vcvars.bat.
* [When coercing to an unsized type lifetimes must be equal][40319]. That is,
disallow subtyping between `T` and `U` when `T: Unsize<U>`, e.g. coercing
`&mut [&'a X; N]` to `&mut [&'b X]` requires `'a` be equal to `'b`. Soundness
fix.
* [`format!` and `Display::to_string` panic if an underlying formatting
implementation returns an error][40117]. Previously the error was silently
ignored. It is incorrect for `write_fmt` to return an error when writing
to a string.
* [In-tree crates are verified to be unstable][39851]. Previously, some minor
crates were marked stable and could be accessed from the stable toolchain.
* [Rust git source no longer includes vendored crates][39728]. Those that need
to build with vendored crates should build from release tarballs.
* [Fix inert attributes from `proc_macro_derives`][39572]
* [During crate resolution, rustc prefers a crate in the sysroot if two crates
are otherwise identical][39518]. Unlikely to be encountered outside the Rust
build system.
* [Fixed bugs around how type inference interacts with dead-code][39485]. The
existing code generally ignores the type of dead-code unless a type-hint is
provided; this can cause surprising inference interactions particularly around
defaulting. The new code uniformly ignores the result type of dead-code.
* [Tuple-struct constructors with private fields are no longer visible][38932]
* [Lifetime parameters that do not appear in the arguments are now considered
early-bound][38897], resolving a soundness bug (#[32330]). The
`hr_lifetime_in_assoc_type` future-compatibility lint has been in effect since
April of 2016.
* [rustdoc: fix doctests with non-feature crate attributes][38161]
* [Make transmuting from fn item types to pointer-sized types a hard
error][34198]
[27787]: https://github.com/rust-lang/rust/issues/27787
[32330]: https://github.com/rust-lang/rust/issues/32330
[34198]: https://github.com/rust-lang/rust/pull/34198
[38161]: https://github.com/rust-lang/rust/pull/38161
[38368]: https://github.com/rust-lang/rust/pull/38368
[38584]: https://github.com/rust-lang/rust/issues/38584
[38661]: https://github.com/rust-lang/rust/pull/38661
[38764]: https://github.com/rust-lang/rust/pull/38764
[38864]: https://github.com/rust-lang/rust/issues/38864
[38897]: https://github.com/rust-lang/rust/pull/38897
[38921]: https://github.com/rust-lang/rust/pull/38921
[38932]: https://github.com/rust-lang/rust/pull/38932
[38945]: https://github.com/rust-lang/rust/pull/38945
[38981]: https://github.com/rust-lang/rust/pull/38981
[39002]: https://github.com/rust-lang/rust/pull/39002
[39116]: https://github.com/rust-lang/rust/pull/39116
[39193]: https://github.com/rust-lang/rust/pull/39193
[39265]: https://github.com/rust-lang/rust/pull/39265
[39356]: https://github.com/rust-lang/rust/pull/39356
[39372]: https://github.com/rust-lang/rust/pull/39372
[39426]: https://github.com/rust-lang/rust/pull/39426
[39431]: https://github.com/rust-lang/rust/pull/39431
[39438]: https://github.com/rust-lang/rust/pull/39438
[39440]: https://github.com/rust-lang/rust/pull/39440
[39485]: https://github.com/rust-lang/rust/pull/39485
[39491]: https://github.com/rust-lang/rust/pull/39491
[39518]: https://github.com/rust-lang/rust/pull/39518
[39538]: https://github.com/rust-lang/rust/pull/39538
[39572]: https://github.com/rust-lang/rust/pull/39572
[39633]: https://github.com/rust-lang/rust/pull/39633
[39642]: https://github.com/rust-lang/rust/pull/39642
[39728]: https://github.com/rust-lang/rust/pull/39728
[39761]: https://github.com/rust-lang/rust/pull/39761
[39788]: https://github.com/rust-lang/rust/pull/39788
[39793]: https://github.com/rust-lang/rust/pull/39793
[39837]: https://github.com/rust-lang/rust/pull/39837
[39843]: https://github.com/rust-lang/rust/pull/39843
[39851]: https://github.com/rust-lang/rust/pull/39851
[39903]: https://github.com/rust-lang/rust/pull/39903
[39939]: https://github.com/rust-lang/rust/pull/39939
[39953]: https://github.com/rust-lang/rust/pull/39953
[39960]: https://github.com/rust-lang/rust/pull/39960
[39990]: https://github.com/rust-lang/rust/pull/39990
[39995]: https://github.com/rust-lang/rust/pull/39995
[40008]: https://github.com/rust-lang/rust/pull/40008
[40009]: https://github.com/rust-lang/rust/pull/40009
[40027]: https://github.com/rust-lang/rust/pull/40027
[40028]: https://github.com/rust-lang/rust/pull/40028
[40037]: https://github.com/rust-lang/rust/pull/40037
[40117]: https://github.com/rust-lang/rust/pull/40117
[40166]: https://github.com/rust-lang/rust/pull/40166
[40245]: https://github.com/rust-lang/rust/pull/40245
[40257]: https://github.com/rust-lang/rust/pull/40257
[40261]: https://github.com/rust-lang/rust/pull/40261
[40265]: https://github.com/rust-lang/rust/pull/40265
[40319]: https://github.com/rust-lang/rust/pull/40319
[40336]: https://github.com/rust-lang/rust/pull/40336
[40526]: https://github.com/rust-lang/rust/pull/40526
[RFC 1623]: https://github.com/rust-lang/rfcs/blob/master/text/1623-static.md
[RFC 1647]: https://github.com/rust-lang/rfcs/blob/master/text/1647-allow-self-in-where-clauses.md
[RFC 1651]: https://github.com/rust-lang/rfcs/blob/master/text/1651-movecell.md
[RFC 1682]: https://github.com/rust-lang/rfcs/blob/master/text/1682-field-init-shorthand.md
[`Arc::from_raw`]: https://doc.rust-lang.org/std/sync/struct.Arc.html#method.from_raw
[`Arc::into_raw`]: https://doc.rust-lang.org/std/sync/struct.Arc.html#method.into_raw
[`Arc::ptr_eq`]: https://doc.rust-lang.org/std/sync/struct.Arc.html#method.ptr_eq
[`BTreeMap::range_mut`]: https://doc.rust-lang.org/std/collections/btree_map/struct.BTreeMap.html#method.range_mut
[`BTreeMap::range`]: https://doc.rust-lang.org/std/collections/btree_map/struct.BTreeMap.html#method.range
[`Cell::into_inner`]: https://doc.rust-lang.org/std/cell/struct.Cell.html#method.into_inner
[`Cell::replace`]: https://doc.rust-lang.org/std/cell/struct.Cell.html#method.replace
[`Cell::swap`]: https://doc.rust-lang.org/std/cell/struct.Cell.html#method.swap
[`Cell::take`]: https://doc.rust-lang.org/std/cell/struct.Cell.html#method.take
[`Ordering::then_with`]: https://doc.rust-lang.org/std/cmp/enum.Ordering.html#method.then_with
[`Ordering::then`]: https://doc.rust-lang.org/std/cmp/enum.Ordering.html#method.then
[`Rc::from_raw`]: https://doc.rust-lang.org/std/rc/struct.Rc.html#method.from_raw
[`Rc::into_raw`]: https://doc.rust-lang.org/std/rc/struct.Rc.html#method.into_raw
[`Rc::ptr_eq`]: https://doc.rust-lang.org/std/rc/struct.Rc.html#method.ptr_eq
[`Result::expect_err`]: https://doc.rust-lang.org/std/result/enum.Result.html#method.expect_err
[`collections::Bound`]: https://doc.rust-lang.org/std/collections/enum.Bound.html
[`process::abort`]: https://doc.rust-lang.org/std/process/fn.abort.html
[`ptr::read_unaligned`]: https://doc.rust-lang.org/std/ptr/fn.read_unaligned.html
[`ptr::write_unaligned`]: https://doc.rust-lang.org/std/ptr/fn.write_unaligned.html
[cargo/3562]: https://github.com/rust-lang/cargo/pull/3562
[cargo/3664]: https://github.com/rust-lang/cargo/pull/3664
[cargo/3667]: https://github.com/rust-lang/cargo/pull/3667
[cargo/3691]: https://github.com/rust-lang/cargo/pull/3691
[cargo/3699]: https://github.com/rust-lang/cargo/pull/3699
[cargo/3731]: https://github.com/rust-lang/cargo/pull/3731
[mdbook]: https://crates.io/crates/mdbook
[ubook]: https://doc.rust-lang.org/unstable-book/
Version 1.16.0 (2017-03-16)
===========================
Language
--------
* Lifetimes in statics and consts default to `'static`. [RFC 1623]
* [The compiler's `dead_code` lint now accounts for type aliases][38051].
* [Uninhabitable enums (those without any variants) no longer permit wildcard
match patterns][38069]
@ -5056,7 +5463,7 @@ Version 0.1 (2012-01-20)
* Compiler works with the following configurations:
* Linux: x86 and x86_64 hosts and targets
* macOS: x86 and x86_64 hosts and targets
* Windows: x86 hosts and targets
* Cross compilation / multi-target configuration supported.

configure vendored

@ -479,6 +479,7 @@ valopt i686-linux-android-ndk "" "i686-linux-android NDK standalone path"
valopt arm-linux-androideabi-ndk "" "arm-linux-androideabi NDK standalone path"
valopt armv7-linux-androideabi-ndk "" "armv7-linux-androideabi NDK standalone path"
valopt aarch64-linux-android-ndk "" "aarch64-linux-android NDK standalone path"
valopt x86_64-linux-android-ndk "" "x86_64-linux-android NDK standalone path"
valopt nacl-cross-path "" "NaCl SDK path (Pepper Canary is recommended). Must be absolute!"
valopt musl-root "/usr/local" "MUSL root installation directory (deprecated)"
valopt musl-root-x86_64 "" "x86_64-unknown-linux-musl install directory"
@ -746,6 +747,7 @@ putvar CFG_AARCH64_LINUX_ANDROID_NDK
putvar CFG_ARM_LINUX_ANDROIDEABI_NDK
putvar CFG_ARMV7_LINUX_ANDROIDEABI_NDK
putvar CFG_I686_LINUX_ANDROID_NDK
putvar CFG_X86_64_LINUX_ANDROID_NDK
putvar CFG_NACL_CROSS_PATH
putvar CFG_MANDIR
putvar CFG_DOCDIR


@ -50,7 +50,7 @@ Comma separated list of types of crates for the compiler to emit.
\fB\-\-crate\-name\fR \fINAME\fR
Specify the name of the crate being built.
.TP
\fB\-\-emit\fR [asm|llvm\-bc|llvm\-ir|obj|link|dep\-info|mir][=\fIPATH\fR]
Configure the output that \fBrustc\fR will produce. Each emission may also have
an optional explicit output \fIPATH\fR specified for that particular emission
kind. This path takes precedence over the \fB-o\fR option.

rls/.travis.yml Normal file

@ -0,0 +1,17 @@
language: rust
sudo: true
cache: cargo
os:
- linux
- osx
rust:
- nightly
before_install:
- if [[ "$TRAVIS_OS_NAME" == "linux" ]]; then sudo add-apt-repository ppa:kubuntu-ppa/backports -y; fi
- if [[ "$TRAVIS_OS_NAME" == "linux" ]]; then sudo apt-get update -qq; fi
- if [[ "$TRAVIS_OS_NAME" == "linux" ]]; then sudo apt-get install -qq cmake=2.8.12.2-0ubuntu1~ubuntu12.04.1~ppa2; fi
- if [[ "$TRAVIS_OS_NAME" == "osx" ]]; then brew update; fi
- if [[ "$TRAVIS_OS_NAME" == "osx" ]]; then brew upgrade cmake; fi
script:
- cargo build --verbose
- cargo test --release --verbose

rls/COPYRIGHT Normal file

@ -0,0 +1,40 @@
Short version for non-lawyers:
The Rust Project is dual-licensed under Apache 2.0 and MIT
terms.
Longer version:
The Rust Project is copyright 2010, The Rust Project
Developers.
Licensed under the Apache License, Version 2.0
<LICENSE-APACHE or
http://www.apache.org/licenses/LICENSE-2.0> or the MIT
license <LICENSE-MIT or http://opensource.org/licenses/MIT>,
at your option. All files in the project carrying such
notice may not be copied, modified, or distributed except
according to those terms.
* Additional copyright may be retained by contributors other
than Mozilla, the Rust Project Developers, or the parties
enumerated in this file. Such copyright can be determined
on a case-by-case basis by examining the author of each
portion of a file in the revision-control commit records
of the project, or by consulting representative comments
claiming copyright ownership for a file.
For example, the text:
"Copyright (c) 2011 Google Inc."
appears in some files, and these files thereby denote
that their author and copyright-holder is Google Inc.
In all such cases, the absence of explicit licensing text
indicates that the contributor chose to license their work
for distribution under identical terms to those Mozilla
has chosen for the collective work, enumerated at the top
of this file. The only difference is the retention of
copyright itself, held by the contributor.

rls/Cargo.lock generated Normal file

File diff suppressed because it is too large.

rls/Cargo.toml Normal file

@ -0,0 +1,28 @@
[package]
name = "rls"
version = "0.1.0"
authors = ["Jonathan Turner <jturner@mozilla.com>"]
[dependencies]
cargo = { git = "https://github.com/rust-lang/cargo" }
derive-new = "0.3"
env_logger = "0.3"
languageserver-types = { git = "https://github.com/gluon-lang/languageserver-types" }
log = "0.3"
racer = { git = "https://github.com/phildawes/racer" }
rls-analysis = { git = "https://github.com/nrc/rls-analysis" }
rls-data = "0.1"
rls-span = { version = "0.1", features = ["serialize-serde"] }
rls-vfs = { git = "https://github.com/nrc/rls-vfs", features = ["racer-impls"] }
rustc-serialize = "0.3"
rustfmt = { git = "https://github.com/rust-lang-nursery/rustfmt" }
serde = "0.9"
serde_json = "0.9"
serde_derive = "0.9"
toml = "0.3"
url = "1.1.0"
url_serde = "0.1.0"
[dependencies.hyper]
version = "0.9"
default-features = false

rls/LICENSE-APACHE Normal file

@ -0,0 +1,201 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [yyyy] [name of copyright owner]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

rls/LICENSE-MIT Normal file
@@ -0,0 +1,25 @@
Copyright (c) 2016 The Rust Project Developers
Permission is hereby granted, free of charge, to any
person obtaining a copy of this software and associated
documentation files (the "Software"), to deal in the
Software without restriction, including without
limitation the rights to use, copy, modify, merge,
publish, distribute, sublicense, and/or sell copies of
the Software, and to permit persons to whom the Software
is furnished to do so, subject to the following
conditions:
The above copyright notice and this permission notice
shall be included in all copies or substantial portions
of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF
ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED
TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A
PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT
SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR
IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
DEALINGS IN THE SOFTWARE.
rls/README.md Normal file
@@ -0,0 +1,123 @@
[![Build Status](https://travis-ci.org/rust-lang-nursery/rls.svg?branch=master)](https://travis-ci.org/rust-lang-nursery/rls) [![Build status](https://ci.appveyor.com/api/projects/status/cxfejvsqnnc1oygs?svg=true)](https://ci.appveyor.com/project/jonathandturner/rls-x6grn)
# Rust Language Server (RLS)
**This project is in the alpha stage of development. It is likely to be buggy in
some situations; proceed with caution.**
The RLS provides a server that runs in the background, providing IDEs,
editors, and other tools with information about Rust programs. It supports
functionality such as 'goto definition', symbol search, reformatting, and code
completion, and enables renaming and refactorings.
The RLS gets its source data from the compiler and from
[Racer](https://github.com/phildawes/racer). Where possible it uses data from
the compiler, which is precise and complete. Where that's not possible (for
example, for code completion, or where building is too slow), it uses Racer.
Since the Rust compiler does not yet support end-to-end incremental compilation,
we can't offer a perfect experience. However, by optimising our use of the
compiler and falling back to Racer, we can offer a pretty good experience for
small to medium sized crates. As the RLS and compiler evolve, we'll offer a
better experience for larger and larger crates.
The RLS is designed to be frontend-independent. We hope it will be widely
adopted by different editors and IDEs. To seed development, we provide a
[reference implementation of an RLS frontend](https://github.com/jonathandturner/rls_vscode)
for [Visual Studio Code](https://code.visualstudio.com/).
## Setup
### Step 1: Install rustup
You can install [rustup](http://rustup.rs/) on many platforms. This will help us quickly install the
RLS and its dependencies.
### Step 2: Switch to nightly
Switch to the nightly compiler:
```
rustup default nightly
rustup update nightly
```
### Step 3: Install the RLS
Once you have rustup installed, run the following commands:
```
rustup component add rls
rustup component add rust-analysis
rustup component add rust-src
```
If you've never set up Racer before, you may also need to follow the [Racer
configuration steps](https://github.com/phildawes/racer#configuration).
## Running
Though the RLS is built to work with many IDEs and editors, we currently use
VSCode to test the RLS.
To run with VSCode, you'll need a
[recent VSCode version](https://code.visualstudio.com/download) installed.
Next, you'll need to run the VSCode extension (for this step, you'll need a
recent [node](https://nodejs.org/en/) installed):
```
git clone https://github.com/jonathandturner/rls_vscode.git
cd rls_vscode
npm install
code .
```
VSCode will open into the `rls_vscode` project. From here, click the Debug
button on the left-hand side (a bug with a line through it). Next, click the
green triangle at the top. This will launch a new instance of VSCode with the
`rls_vscode` plugin enabled. From there, you can open your Rust projects using
the RLS.
You'll know it's working when you see this in the status bar at the bottom, with
a spinning indicator:
`RLS analysis: working /`
Once you see:
`RLS analysis: done`
Then you have the full set of capabilities available to you. You can goto def,
find all refs, rename, goto type, etc. Completions are also available using the
heuristics that Racer provides. As you type, your code will be checked and
error squiggles will be reported when errors occur. You can hover these
squiggles to see the text of the error.
## Configuration
The RLS can be configured on a per-project basis by adding a file called
`rls.toml` to the project root (i.e., next to `Cargo.toml`). Entries in this file
will affect how the RLS operates and how it builds your project.
Currently we accept the following options:
* `build_lib` (`bool`, defaults to `false`) checks the project as if you passed
the `--lib` argument to cargo.
* `cfg_test` (`bool`, defaults to `true`) checks the project as if you were
running `cargo test` rather than `cargo build`. I.e., compiles (but does not
run) test code.
* `unstable_features` (`bool`, defaults to `false`) enables unstable features.
Currently, this includes renaming and formatting.
* `sysroot` (`String`, defaults to `""`) if the given string is not empty, use
  the given path as the sysroot for all rustc invocations instead of trying to
  detect the sysroot automatically.
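Putting these options together, a project that wants library-style checking with unstable features enabled might use an `rls.toml` like this (the values here are illustrative, not defaults):

```toml
# Check the project as if `--lib` were passed to cargo.
build_lib = true

# Type-check test code as well, as `cargo test` would (but without running it).
cfg_test = true

# Opt in to unstable features such as renaming and formatting.
unstable_features = true

# Leave empty to let the RLS detect the sysroot automatically.
sysroot = ""
```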
## Contributing
You can look in the [contributing.md](https://github.com/rust-lang-nursery/rls/blob/master/contributing.md)
in this repo to learn more about contributing to this project.
rls/appveyor.yml Normal file
@@ -0,0 +1,41 @@
environment:
  global:
    RUST_TEST_THREADS: 1
    PROJECT_NAME: rls
  matrix:
    # Nightly channel
    #- TARGET: i686-pc-windows-gnu
    #  CHANNEL: nightly
    #  BITS: 32
    - TARGET: i686-pc-windows-msvc
      CHANNEL: nightly
      BITS: 32
    #- TARGET: x86_64-pc-windows-gnu
    #  CHANNEL: nightly
    #  BITS: 64
    - TARGET: x86_64-pc-windows-msvc
      CHANNEL: nightly
      BITS: 64

install:
  - set PATH=C:\msys64\mingw%BITS%\bin;C:\msys64\usr\bin;%PATH%
  - curl -sSf -o rustup-init.exe https://win.rustup.rs
  # Install rust, x86_64-pc-windows-msvc host
  - rustup-init.exe -y --default-host x86_64-pc-windows-msvc --default-toolchain nightly-x86_64-pc-windows-msvc
  # Install the target we're compiling for
  - set PATH=%PATH%;C:\Users\appveyor\.cargo\bin
  - set PATH=%PATH%;C:\Users\appveyor\.multirust\toolchains\nightly-x86_64-pc-windows-msvc\lib\rustlib\%TARGET%\lib
  - if NOT "%TARGET%" == "x86_64-pc-windows-msvc" rustup target add %TARGET%
  - rustc -Vv
  - cargo -V

build: false

test_script:
  - set RUST_TEST_THREADS=1
  - cargo test --release --target %TARGET% --verbose

cache:
  - target
  - C:\Users\appveyor\.cargo\registry
rls/contributing.md Normal file
@@ -0,0 +1,300 @@
# Contributing
This document provides information for developers who want to contribute to the
RLS or run it in a heavily customised configuration.
The RLS is open source and we'd love you to contribute to the project. Testing,
reporting issues, writing documentation, writing tests, writing code, and
implementing clients are all extremely valuable.
Here is the list of known [issues](https://github.com/rust-lang-nursery/rls/issues).
These are [good issues to start on](https://github.com/rust-lang-nursery/rls/issues?q=is%3Aopen+is%3Aissue+label%3Aeasy).
We're happy to help however we can. The best way to get help is either to
leave a comment on an issue in this repo, or to ping us (nrc or jntrnr) in #rust-tools
on IRC.
We'd love for existing and new tools to use the RLS. If that sounds interesting
please get in touch by filing an issue or on IRC.
## Building
**YOU NEED A VERY RECENT NIGHTLY COMPILER**

Without one, the RLS will not work very well. Note that you don't need to build
the RLS yourself in order to use it; instead, you can install it via `rustup`,
which is the currently preferred method. See the [readme](README.md) for more
information.
### Step 1: Install build dependencies
On Linux, you will need [pkg-config](https://www.freedesktop.org/wiki/Software/pkg-config/)
and [zlib](http://zlib.net/):
- On Ubuntu run: `sudo apt-get install pkg-config zlib1g-dev`
- On Fedora run: `sudo dnf install pkgconfig zlib-devel`
### Step 2: Clone and build the RLS
Since the RLS is closely linked to the compiler and is in active development,
you'll need a recent nightly compiler to build it.
```
git clone https://github.com/rust-lang-nursery/rls.git
cd rls
cargo build --release
```
### Step 3: Connect the RLS to your compiler
If you're using recent versions of rustup, you will also need to make sure that
the compiler's dynamic libraries are available for the RLS to load. You can see
where they are using:
```
rustc --print sysroot
```
This will show you where the compiler keeps the dynamic libs. On Windows, this
will be in the `bin` directory under this path. On other platforms, it will be
in the `lib` directory.
Next, you'll make the compiler available to the RLS:
#### Windows
On Windows, make sure this path (plus `bin`) is in your PATH. For example:
```
set PATH=%PATH%;C:\Users\appveyor\.multirust\toolchains\nightly-i686-pc-windows-gnu\bin
```
#### Mac
For Mac, you need to set the DYLD_LIBRARY_PATH. For example:
```
export DYLD_LIBRARY_PATH=$(rustc --print sysroot)/lib
```
#### Linux
For Linux, this path is called LD_LIBRARY_PATH.
```
export LD_LIBRARY_PATH=$(rustc --print sysroot)/lib
```
### Step 4: Set your RLS_ROOT
Next, we'll set the RLS_ROOT environment variable to point to where we built
the RLS:
```
export RLS_ROOT=/Source/rls
```
### Step 5: Download standard library metadata
Finally, we need to get the metadata for the standard library. This lets
us get additional docs and types for all of `std`. The command is currently only
supported on the nightly compilers, though we hope to remove this restriction in
the future.
```
rustup component add rust-analysis
```
If you've never set up Racer before, you may also need to follow the [Racer
configuration steps](https://github.com/phildawes/racer#configuration).
## Running and testing
You can run the rls by hand with:
```
cargo run
```
Though more commonly, you'll use an IDE plugin to invoke it for you.
Test using `cargo test`.
Testing is unfortunately minimal. There is support for regression tests, but not
many actual tests exist yet. There is significant [work to do](https://github.com/rust-lang-nursery/rls/issues/12)
before we have a comprehensive testing story.
## Standard library support
The way it works is that when the libraries are built, the compiler can emit all
the data that the RLS needs. This can be read by the RLS on startup and used to
provide things like type on hover without having access to the source code for
the libraries.
The compiler gives every definition an id, and the RLS matches up these ids. In
order for the RLS to work, the id of an identifier used in the IDE and the id of
its declaration in a library must match exactly. Since ids are very unstable,
the data used by the RLS for libraries must match exactly with the crate that
your source code links with.
You need a version of the above data which exactly matches the standard
libraries you will use with your project. Rustup takes care of this for you and
is the preferred (and easiest) method for installing this data. If you want to
use the RLS with a Rust compiler/libraries you have built yourself, then you'll
need to take some extra steps.
### Install with rustup
You'll need to be using [rustup](https://www.rustup.rs/) to manage your Rust
compiler toolchains. The RLS does not yet support cross-compilation - your
compiler host and target must be exactly the same.
You must be using nightly (you need to be using nightly for the RLS to work at
the moment in any case). To install a nightly toolchain use `rustup install
nightly`. To switch to using that nightly toolchain by default use `rustup
default nightly`.
Add the RLS data component using `rustup component add rust-analysis`.
Everything should now work! You may need to restart the RLS.
### Build it yourself
When you build Rust, add `-Zsave-analysis-api` to your stage 2 flags, e.g., by
setting the environment variable:
```
export RUSTFLAGS_STAGE2='-Zsave-analysis-api'
```
When the build has finished, you should have a bunch of JSON data in a directory like
`~/rust1/build/x86_64-unknown-linux-gnu/stage1-std/x86_64-unknown-linux-gnu/release/deps/save-analysis`.
You need to copy all those files (should be around 16) into a new directory:
`~/rust1/build/x86_64-unknown-linux-gnu/stage2/lib/rustlib/x86_64-unknown-linux-gnu/analysis`
(assuming you are running the stage 2 compiler you just built; adjust the root
directory (`~/rust1` here) and the host triple (`x86_64-unknown-linux-gnu`, in
both places) to match your setup).
Finally, to run the RLS you'll need to set things up to use the newly built
compiler, something like:
```
export RUSTC="~/rust1/build/x86_64-unknown-linux-gnu/stage2/bin/rustc"
```
Either before you run the RLS, or before you run the IDE which will start the
RLS.
### Details
Rustup (or you, manually) will install the RLS data (a collection of JSON
files) into `$SYSROOT/lib/rustlib/$TARGET_TRIPLE/analysis`, where `$SYSROOT` is
your Rust sysroot, which can be found using `rustc --print=sysroot`.
`$TARGET_TRIPLE` is the triple which defines the compilation target. Since the
RLS currently does not support cross-compilation, this must match your host
triple. It will look something like `x86_64-unknown-linux-gnu`.
For example, on my system RLS data is installed at:
```
/home/ncameron/.rustup/toolchains/nightly-x86_64-unknown-linux-gnu/lib/rustlib/x86_64-unknown-linux-gnu/analysis
```
This data is only for the standard libraries; project-specific data is stored
inside your project's target directory.
## Implementation overview
The goal of the RLS project is to provide an awesome IDE experience *now*. That
means not waiting for incremental compilation support in the compiler. However,
Rust is a somewhat complex language to analyse and providing precise and
complete information about programs requires using the compiler.
The RLS has two data sources: the compiler and Racer. The compiler is always
right and always precise, but it can sometimes be too slow for IDEs. Racer is
nearly always fast, but it can't handle some constructs (e.g., macros), or can
only handle them with limited precision (e.g., complex generic types).
The RLS tries to provide data using the compiler. It sets a time budget and
queries both the compiler and Racer. If the compiler completes within the time
budget, we use that data. If not, we use Racer's data.
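The time-budget strategy above can be sketched as follows. This is an illustrative Rust sketch, not the RLS's actual implementation: the channel, the timings, and the answer strings are invented for the example.

```rust
use std::sync::mpsc;
use std::thread;
use std::time::Duration;

// Query a slow-but-precise source (the compiler) under a time budget,
// falling back to a fast-but-approximate source (Racer) on timeout.
fn lookup_with_budget(budget: Duration) -> &'static str {
    let (tx, rx) = mpsc::channel();
    thread::spawn(move || {
        // Simulate a compiler query that takes 50ms to answer.
        thread::sleep(Duration::from_millis(50));
        let _ = tx.send("compiler answer");
    });
    match rx.recv_timeout(budget) {
        Ok(answer) => answer,        // the compiler finished in time
        Err(_) => "racer answer",    // budget exceeded: use the fallback
    }
}

fn main() {
    // A tight budget forces the fallback; a generous one uses the compiler.
    println!("{}", lookup_with_budget(Duration::from_millis(5)));
    println!("{}", lookup_with_budget(Duration::from_millis(500)));
}
```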
We link both Racer and the compiler into the RLS, so we don't need to shell out
to either (though see notes on the build process below). We also customise our
use of the compiler (via standard APIs) so that we can read modified files
directly from memory without saving them to disk.
### Building
The RLS tracks changes to files, and keeps the changed file in memory (i.e., the
RLS does not need the IDE to save a file before providing data). These changed
files are tracked by the 'Virtual File System' (which is a bit of a grandiose
name for a pretty simple file cache at the moment, but I expect this area to
grow significantly in the future). The VFS is in a [separate
crate](https://github.com/nrc/rls-vfs).
We want to start building before the user needs information (it would be too
slow to start a build when data is requested). However, we don't want to start a
build on every keystroke (this would be too heavy on user resources). Nor is
there any point starting multiple builds when we would throw away the data from
some of them. We therefore try to queue up and coalesce builds. This is further
documented in [src/build.rs](src/build.rs).
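The coalescing idea can be illustrated with a minimal sketch. This is not the RLS's actual build queue, just the core idea that a later request supersedes any pending ones that have not started yet:

```rust
// Illustrative sketch of build coalescing: a burst of build requests
// collapses so that only the most recent revision is actually built.
struct BuildQueue {
    // Revision of the most recently requested (still pending) build.
    pending: Option<u64>,
}

impl BuildQueue {
    fn new() -> BuildQueue {
        BuildQueue { pending: None }
    }

    // Each keystroke enqueues a build request; later requests supersede
    // earlier ones that have not started yet.
    fn enqueue(&mut self, revision: u64) {
        self.pending = Some(revision);
    }

    // When the builder is free, it takes whatever is pending, discarding
    // all superseded requests in one step.
    fn take(&mut self) -> Option<u64> {
        self.pending.take()
    }
}

fn main() {
    let mut q = BuildQueue::new();
    q.enqueue(1);
    q.enqueue(2);
    q.enqueue(3); // revisions 1 and 2 are coalesced away
    println!("{:?}", q.take()); // the builder only runs revision 3
    println!("{:?}", q.take()); // nothing left to build
}
```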
When we do start a build, we may also need to build dependent crates. We
therefore do a full `cargo build`. However, we do not compile the last crate
(the one the user is editing in the IDE). We only run Cargo to get a command
line to build that crate. Furthermore, we cache that command line, so for most
builds (where we don't need to build dependent crates, and where we can be
reasonably sure they haven't changed since a previous build) we don't run Cargo
at all.
The command line we got from Cargo, we chop up and feed to the in-process
compiler. We then collect error messages and analysis data in JSON format
(although this is inefficient and [should
change](https://github.com/rust-lang-nursery/rls/issues/25)).
### Analysis data
From the compiler, we get a serialised dump of its analysis data (from name
resolution and type checking). We combine data from all crates and the standard
libraries into an index for the whole project. We cross-reference and store this
data in HashMaps and use it to look up data for the IDE.
Reading, processing, and storing the analysis data is handled by the
[rls-analysis crate](https://github.com/nrc/rls-analysis).
### Communicating with IDEs
The RLS communicates with IDEs via
the [Language Server protocol](https://github.com/Microsoft/language-server-protocol/blob/master/protocol.md).
The LS protocol uses JSON sent over stdin/stdout. The JSON is rather dynamic:
we can't easily make structs that map to many of the protocol objects. The client
sends commands and notifications to the RLS. Commands must get a reply,
notifications do not. Usually the structure of the reply is dictated by the
protocol spec. The RLS can also send notifications to the client. So for a long
running task (such as a build), the RLS will reply quickly to acknowledge the
request, then send a message later with the result of the task.
Associating requests with replies is done using an id which must be handled by
the RLS.
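On the wire, each Language Server Protocol message is a JSON body preceded by a `Content-Length` header. The framing can be sketched as follows; the encoder and the example body are illustrative, not RLS code:

```rust
// Minimal sketch of LSP base-protocol framing: each message is a JSON
// body preceded by a Content-Length header and a blank line.
fn frame(body: &str) -> String {
    format!("Content-Length: {}\r\n\r\n{}", body.len(), body)
}

fn main() {
    // A trimmed-down request body (a real request carries params too).
    let body = r#"{"jsonrpc":"2.0","id":1,"method":"shutdown"}"#;
    println!("{}", frame(body));
}
```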
### Extensions to the Language Server Protocol
The RLS uses some custom extensions to the Language Server Protocol. Currently
these are all sent from the RLS to an LSP client and are only used to improve
the user experience by showing progress indicators.
* `rustDocument/diagnosticsBegin`: notification, no arguments. Sent before a
build starts and before any diagnostics from a build are sent.
* `rustDocument/diagnosticsEnd`: notification, no arguments. Sent when a build
is complete (successfully or not, or even skipped) and all post-build analysis
by the RLS is complete.
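On the wire, such a notification is a small JSON-RPC message with no parameters; a sketch of what `rustDocument/diagnosticsBegin` might look like (the exact serialisation shown here is illustrative):

```json
{
    "jsonrpc": "2.0",
    "method": "rustDocument/diagnosticsBegin"
}
```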
rls/goto_def.sh Executable file
@@ -0,0 +1,5 @@
#!/bin/sh
# jntrnr's test
#curl -v -H "Content-Type: application/json" -X POST -d '{{"pos": {"filepath":"sample_project/src/main.rs","line":22,"col":5}, "span":{"file_name":"sample_project/src/main.rs","line_start":22,"column_start":5,"line_end":22,"column_end":6}}}' 127.0.0.1:9000/goto_def
# nrc's test
curl -v -H "Content-Type: application/json" -X POST -d '{{"pos": {"filepath":"sample_project_2/src/main.rs","line":18,"col":15}, "span":{"file_name":"src/main.rs","line_start":18,"column_start":13,"line_end":18,"column_end":16}}}' 127.0.0.1:9000/goto_def
rls/ls.sh Executable file
@@ -0,0 +1,6 @@
#!/bin/sh
export PATH="$PWD/target/debug:$PATH"
#export RUSTC="/home/ncameron/rust/x86_64-unknown-linux-gnu/stage2/bin/rustc"
#export SYS_ROOT="/home/ncameron/rust/x86_64-unknown-linux-gnu/stage2"
#export SYS_ROOT="/usr/local"
export RUST_BACKTRACE=1
cargo build && code
@@ -0,0 +1,126 @@
// Copyright 2017 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
use std::path::PathBuf;
use ls_types::{DiagnosticSeverity, NumberOrString};
use serde_json;
use span::compiler::DiagnosticSpan;
use span;
use actions::lsp_extensions::{RustDiagnostic, LabelledRange};
use lsp_data::ls_util;
#[derive(Debug, Deserialize)]
struct CompilerMessageCode {
code: String
}
#[derive(Debug, Deserialize)]
struct CompilerMessage {
message: String,
code: Option<CompilerMessageCode>,
level: String,
spans: Vec<DiagnosticSpan>,
children: Vec<CompilerMessage>,
}
#[derive(Debug)]
pub struct FileDiagnostic {
pub file_path: PathBuf,
pub diagnostic: RustDiagnostic,
}
#[derive(Debug)]
pub enum ParseError {
JsonError(serde_json::Error),
NoSpans,
}
impl From<serde_json::Error> for ParseError {
fn from(error: serde_json::Error) -> Self {
ParseError::JsonError(error)
}
}
pub fn parse(message: &str) -> Result<FileDiagnostic, ParseError> {
let message = serde_json::from_str::<CompilerMessage>(message)?;
if message.spans.is_empty() {
return Err(ParseError::NoSpans);
}
let message_text = compose_message(&message);
let primary = message.spans.iter()
.filter(|x| x.is_primary)
.collect::<Vec<&span::compiler::DiagnosticSpan>>()[0].clone();
let primary_span = primary.rls_span().zero_indexed();
let primary_range = ls_util::rls_to_range(primary_span.range);
// build up the secondary spans
let secondary_labels: Vec<LabelledRange> = message.spans.iter()
.filter(|x| !x.is_primary)
.map(|x| {
let secondary_range = ls_util::rls_to_range(x.rls_span().zero_indexed().range);
LabelledRange {
start: secondary_range.start,
end: secondary_range.end,
label: x.label.clone(),
}
}).collect();
let diagnostic = RustDiagnostic {
range: LabelledRange {
start: primary_range.start,
end: primary_range.end,
label: primary.label.clone(),
},
secondaryRanges: secondary_labels,
severity: Some(if message.level == "error" {
DiagnosticSeverity::Error
} else {
DiagnosticSeverity::Warning
}),
code: Some(NumberOrString::String(match message.code {
Some(c) => c.code.clone(),
None => String::new(),
})),
source: Some("rustc".into()),
message: message_text,
};
Ok(FileDiagnostic {
file_path: primary_span.file.clone(),
diagnostic: diagnostic
})
}
/// Builds a more sophisticated error message
fn compose_message(compiler_message: &CompilerMessage) -> String {
let mut message = compiler_message.message.clone();
for sp in &compiler_message.spans {
if !sp.is_primary {
continue;
}
if let Some(ref label) = sp.label {
message.push_str("\n");
message.push_str(label);
}
}
if !compiler_message.children.is_empty() {
message.push_str("\n");
for child in &compiler_message.children {
message.push_str(&format!("\n{}: {}", child.level, child.message));
}
}
message
}
@@ -0,0 +1,62 @@
// Copyright 2017 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
use url_serde;
use lsp_data::*;
use url::Url;
#[derive(Debug, PartialEq, Deserialize, Serialize)]
pub struct PublishRustDiagnosticsParams {
/// The URI for which diagnostic information is reported.
#[serde(deserialize_with = "url_serde::deserialize", serialize_with = "url_serde::serialize")]
pub uri: Url,
/// An array of diagnostic information items.
pub diagnostics: Vec<RustDiagnostic>,
}
/// A range in a text document expressed as (zero-based) start and end positions.
/// A range is comparable to a selection in an editor. Therefore the end position is exclusive.
#[derive(Debug, PartialEq, Clone, Default, Deserialize, Serialize)]
pub struct LabelledRange {
/// The range's start position.
pub start: Position,
/// The range's end position.
pub end: Position,
/// The optional label.
pub label: Option<String>,
}
/// Represents a diagnostic, such as a compiler error or warning.
/// Diagnostic objects are only valid in the scope of a resource.
#[allow(non_snake_case)]
#[derive(Debug, PartialEq, Clone, Default, Deserialize, Serialize)]
pub struct RustDiagnostic {
/// The primary range at which the message applies.
pub range: LabelledRange,
/// The secondary ranges that apply to the message
pub secondaryRanges: Vec<LabelledRange>,
/// The diagnostic's severity. Can be omitted. If omitted it is up to the
/// client to interpret diagnostics as error, warning, info or hint.
pub severity: Option<DiagnosticSeverity>,
/// The diagnostic's code. Can be omitted.
pub code: Option<NumberOrString>,
/// A human-readable string describing the source of this
/// diagnostic, e.g. 'typescript' or 'super lint'.
pub source: Option<String>,
/// The diagnostic's message.
pub message: String,
}

rls/src/actions/mod.rs Normal file (532 lines)

@ -0,0 +1,532 @@
// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
mod compiler_message_parsing;
mod lsp_extensions;
use analysis::{AnalysisHost};
use hyper::Url;
use vfs::{Vfs, Change};
use racer;
use rustfmt::{Input as FmtInput, format_input};
use rustfmt::config::{self, WriteMode};
use serde_json;
use span;
use Span;
use build::*;
use lsp_data::*;
use server::{ResponseData, Output};
use std::collections::HashMap;
use std::panic;
use std::path::{Path, PathBuf};
use std::sync::{Arc, Mutex};
use std::thread;
use std::time::Duration;
use self::lsp_extensions::{PublishRustDiagnosticsParams, RustDiagnostic};
use self::compiler_message_parsing::{FileDiagnostic, ParseError};
type BuildResults = HashMap<PathBuf, Vec<RustDiagnostic>>;
pub struct ActionHandler {
analysis: Arc<AnalysisHost>,
vfs: Arc<Vfs>,
build_queue: Arc<BuildQueue>,
current_project: Mutex<Option<PathBuf>>,
previous_build_results: Mutex<BuildResults>,
}
impl ActionHandler {
pub fn new(analysis: Arc<AnalysisHost>,
vfs: Arc<Vfs>,
build_queue: Arc<BuildQueue>) -> ActionHandler {
ActionHandler {
analysis: analysis,
vfs: vfs,
build_queue: build_queue,
current_project: Mutex::new(None),
previous_build_results: Mutex::new(HashMap::new()),
}
}
pub fn init(&self, root_path: PathBuf, out: &Output) {
{
let mut results = self.previous_build_results.lock().unwrap();
results.clear();
}
{
let mut current_project = self.current_project.lock().unwrap();
*current_project = Some(root_path.clone());
}
self.build(&root_path, BuildPriority::Immediate, out);
}
pub fn build(&self, project_path: &Path, priority: BuildPriority, out: &Output) {
fn clear_build_results(results: &mut BuildResults) {
// We must not clear the hashmap, just the values in each list.
// This allows us to reuse the previously allocated memory.
for v in &mut results.values_mut() {
v.clear();
}
}
fn parse_compiler_messages(messages: &[String], results: &mut BuildResults) {
for msg in messages {
match compiler_message_parsing::parse(msg) {
Ok(FileDiagnostic { file_path, diagnostic }) => {
results.entry(file_path).or_insert_with(Vec::new).push(diagnostic);
}
Err(ParseError::JsonError(e)) => {
debug!("build error {:?}", e);
debug!("from {}", msg);
}
Err(ParseError::NoSpans) => {}
}
}
}
fn convert_build_results_to_notifications(build_results: &BuildResults,
project_path: &Path)
-> Vec<NotificationMessage<PublishRustDiagnosticsParams>>
{
let cwd = ::std::env::current_dir().unwrap();
build_results
.iter()
.map(|(path, diagnostics)| {
let method = "textDocument/publishDiagnostics".to_string();
let params = PublishRustDiagnosticsParams {
uri: Url::from_file_path(cwd.join(path)).unwrap(),
diagnostics: diagnostics.clone(),
};
NotificationMessage::new(method, params)
})
.collect()
}
// We use `rustDocument` here since these notifications are
// custom to the RLS and not part of the LS protocol.
out.notify("rustDocument/diagnosticsBegin");
debug!("build {:?}", project_path);
let result = self.build_queue.request_build(project_path, priority);
match result {
BuildResult::Success(x, analysis) | BuildResult::Failure(x, analysis) => {
debug!("build - Success");
// These notifications will include empty sets of errors for files
// which had errors, but now don't. This instructs the IDE to clear
// errors for those files.
let notifications = {
let mut results = self.previous_build_results.lock().unwrap();
clear_build_results(&mut results);
parse_compiler_messages(&x, &mut results);
convert_build_results_to_notifications(&results, project_path)
};
// TODO we don't send an OK notification if there were no errors
for notification in notifications {
// FIXME(43) factor out the notification mechanism.
let output = serde_json::to_string(&notification).unwrap();
out.response(output);
}
trace!("reload analysis: {:?}", project_path);
let cwd = ::std::env::current_dir().unwrap();
if let Some(analysis) = analysis {
self.analysis.reload_from_analysis(analysis, project_path, &cwd, false).unwrap();
} else {
self.analysis.reload(project_path, &cwd, false).unwrap();
}
out.notify("rustDocument/diagnosticsEnd");
}
BuildResult::Squashed => {
trace!("build - Squashed");
out.notify("rustDocument/diagnosticsEnd");
},
BuildResult::Err => {
trace!("build - Error");
out.notify("rustDocument/diagnosticsEnd");
},
}
}
pub fn on_open(&self, open: DidOpenTextDocumentParams, out: &Output) {
let fname = parse_file_path(&open.text_document.uri).unwrap();
self.vfs.set_file(fname.as_path(), &open.text_document.text);
trace!("on_open: {:?}", fname);
self.build_current_project(BuildPriority::Normal, out);
}
pub fn on_change(&self, change: DidChangeTextDocumentParams, out: &Output) {
let fname = parse_file_path(&change.text_document.uri).unwrap();
let changes: Vec<Change> = change.content_changes.iter().map(move |i| {
if let Some(range) = i.range {
let range = ls_util::range_to_rls(range);
Change::ReplaceText {
span: Span::from_range(range, fname.clone()),
text: i.text.clone()
}
} else {
Change::AddFile {
file: fname.clone(),
text: i.text.clone(),
}
}
}).collect();
self.vfs.on_changes(&changes).unwrap();
trace!("on_change: {:?}", changes);
self.build_current_project(BuildPriority::Normal, out);
}
pub fn on_save(&self, save: DidSaveTextDocumentParams, out: &Output) {
let fname = parse_file_path(&save.text_document.uri).unwrap();
self.vfs.file_saved(&fname).unwrap();
self.build_current_project(BuildPriority::Immediate, out);
}
fn build_current_project(&self, priority: BuildPriority, out: &Output) {
let current_project = {
let current_project = self.current_project.lock().unwrap();
current_project.clone()
};
match current_project {
Some(ref current_project) => self.build(current_project, priority, out),
None => debug!("build_current_project - no project path"),
}
}
pub fn symbols(&self, id: usize, doc: DocumentSymbolParams, out: &Output) {
let t = thread::current();
let analysis = self.analysis.clone();
let rustw_handle = thread::spawn(move || {
let file_name = parse_file_path(&doc.text_document.uri).unwrap();
let symbols = analysis.symbols(&file_name).unwrap_or_else(|_| vec![]);
t.unpark();
symbols.into_iter().map(|s| {
SymbolInformation {
name: s.name,
kind: source_kind_from_def_kind(s.kind),
location: ls_util::rls_to_location(&s.span),
container_name: None // FIXME: more info could be added here
}
}).collect()
});
thread::park_timeout(Duration::from_millis(::COMPILER_TIMEOUT));
let result = rustw_handle.join().unwrap_or_else(|_| vec![]);
out.success(id, ResponseData::SymbolInfo(result));
}
pub fn complete(&self, id: usize, params: TextDocumentPositionParams, out: &Output) {
let result: Vec<CompletionItem> = panic::catch_unwind(move || {
let file_path = &parse_file_path(&params.text_document.uri).unwrap();
let cache = racer::FileCache::new(self.vfs.clone());
let session = racer::Session::new(&cache);
let location = pos_to_racer_location(params.position);
let results = racer::complete_from_file(file_path, location, &session);
results.map(|comp| completion_item_from_racer_match(comp)).collect()
}).unwrap_or_else(|_| vec![]);
out.success(id, ResponseData::CompletionItems(result));
}
pub fn rename(&self, id: usize, params: RenameParams, out: &Output) {
let t = thread::current();
let span = self.convert_pos_to_span(&params.text_document, params.position);
let analysis = self.analysis.clone();
let rustw_handle = thread::spawn(move || {
let result = analysis.find_all_refs(&span, true);
t.unpark();
result
});
thread::park_timeout(Duration::from_millis(::COMPILER_TIMEOUT));
let result = rustw_handle.join().ok().and_then(|t| t.ok()).unwrap_or_else(Vec::new);
let mut edits: HashMap<Url, Vec<TextEdit>> = HashMap::new();
for item in result.iter() {
let loc = ls_util::rls_to_location(item);
edits.entry(loc.uri).or_insert_with(Vec::new).push(TextEdit {
range: loc.range,
new_text: params.new_name.clone(),
});
}
out.success(id, ResponseData::WorkspaceEdit(WorkspaceEdit { changes: edits }));
}
pub fn highlight(&self, id: usize, params: TextDocumentPositionParams, out: &Output) {
let t = thread::current();
let span = self.convert_pos_to_span(&params.text_document, params.position);
let analysis = self.analysis.clone();
let handle = thread::spawn(move || {
let result = analysis.find_all_refs(&span, true);
t.unpark();
result
});
thread::park_timeout(Duration::from_millis(::COMPILER_TIMEOUT));
let result = handle.join().ok().and_then(|t| t.ok()).unwrap_or_else(Vec::new);
let refs: Vec<_> = result.iter().map(|span| DocumentHighlight {
range: ls_util::rls_to_range(span.range),
kind: Some(DocumentHighlightKind::Text),
}).collect();
out.success(id, ResponseData::Highlights(refs));
}
pub fn find_all_refs(&self, id: usize, params: ReferenceParams, out: &Output) {
let t = thread::current();
let span = self.convert_pos_to_span(&params.text_document, params.position);
let analysis = self.analysis.clone();
let handle = thread::spawn(move || {
let result = analysis.find_all_refs(&span, params.context.include_declaration);
t.unpark();
result
});
thread::park_timeout(Duration::from_millis(::COMPILER_TIMEOUT));
let result = handle.join().ok().and_then(|t| t.ok()).unwrap_or_else(Vec::new);
let refs: Vec<_> = result.iter().map(|item| ls_util::rls_to_location(item)).collect();
out.success(id, ResponseData::Locations(refs));
}
pub fn goto_def(&self, id: usize, params: TextDocumentPositionParams, out: &Output) {
// Save-analysis thread.
let t = thread::current();
let span = self.convert_pos_to_span(&params.text_document, params.position);
let analysis = self.analysis.clone();
let vfs = self.vfs.clone();
let compiler_handle = thread::spawn(move || {
let result = analysis.goto_def(&span);
t.unpark();
result
});
// Racer thread.
let racer_handle = thread::spawn(move || {
let file_path = &parse_file_path(&params.text_document.uri).unwrap();
let cache = racer::FileCache::new(vfs);
let session = racer::Session::new(&cache);
let location = pos_to_racer_location(params.position);
racer::find_definition(file_path, location, &session)
.and_then(location_from_racer_match)
});
thread::park_timeout(Duration::from_millis(::COMPILER_TIMEOUT));
let compiler_result = compiler_handle.join();
match compiler_result {
Ok(Ok(r)) => {
let result = vec![ls_util::rls_to_location(&r)];
trace!("goto_def TO: {:?}", result);
out.success(id, ResponseData::Locations(result));
}
_ => {
info!("goto_def - falling back to Racer");
match racer_handle.join() {
Ok(Some(r)) => {
trace!("goto_def: {:?}", r);
out.success(id, ResponseData::Locations(vec![r]));
}
_ => {
debug!("Error in Racer");
out.failure(id, "GotoDef failed to complete successfully");
}
}
}
}
}
pub fn hover(&self, id: usize, params: TextDocumentPositionParams, out: &Output) {
let t = thread::current();
let span = self.convert_pos_to_span(&params.text_document, params.position);
trace!("hover: {:?}", span);
let analysis = self.analysis.clone();
let rustw_handle = thread::spawn(move || {
let ty = analysis.show_type(&span).unwrap_or_else(|_| String::new());
let docs = analysis.docs(&span).unwrap_or_else(|_| String::new());
let doc_url = analysis.doc_url(&span).unwrap_or_else(|_| String::new());
t.unpark();
let mut contents = vec![];
if !docs.is_empty() {
contents.push(MarkedString::from_markdown(docs.into()));
}
if !doc_url.is_empty() {
contents.push(MarkedString::from_markdown(doc_url.into()));
}
if !ty.is_empty() {
contents.push(MarkedString::from_language_code("rust".into(), ty.into()));
}
Hover {
contents: contents,
range: None, // TODO: maybe add?
}
});
thread::park_timeout(Duration::from_millis(::COMPILER_TIMEOUT));
let result = rustw_handle.join();
match result {
Ok(r) => {
out.success(id, ResponseData::HoverSuccess(r));
}
Err(_) => {
out.failure(id, "Hover failed to complete successfully");
}
}
}
pub fn reformat(&self, id: usize, doc: TextDocumentIdentifier, out: &Output) {
trace!("Reformat: {} {:?}", id, doc);
let path = &parse_file_path(&doc.uri).unwrap();
let input = match self.vfs.load_file(path) {
Ok(s) => FmtInput::Text(s),
Err(e) => {
debug!("Reformat failed: {:?}", e);
out.failure(id, "Reformat failed to complete successfully");
return;
}
};
let mut config = config::Config::default();
config.skip_children = true;
config.write_mode = WriteMode::Plain;
let mut buf = Vec::<u8>::new();
match format_input(input, &config, Some(&mut buf)) {
Ok((summary, ..)) => {
// format_input returns Ok even if there are any errors, i.e., parsing errors.
if summary.has_no_errors() {
// Note that we don't need to keep the VFS up to date, the client
// echos back the change to us.
let range = ls_util::range_from_vfs_file(&self.vfs, path);
let text = String::from_utf8(buf).unwrap();
let result = [TextEdit {
range: range,
new_text: text,
}];
out.success(id, ResponseData::TextEdit(result))
} else {
debug!("reformat: format_input failed: has errors, summary = {:?}", summary);
out.failure(id, "Reformat failed to complete successfully")
}
}
Err(e) => {
debug!("Reformat failed: {:?}", e);
out.failure(id, "Reformat failed to complete successfully")
}
}
}
fn convert_pos_to_span(&self, doc: &TextDocumentIdentifier, pos: Position) -> Span {
let fname = parse_file_path(&doc.uri).unwrap();
trace!("convert_pos_to_span: {:?} {:?}", fname, pos);
let pos = ls_util::position_to_rls(pos);
let line = self.vfs.load_line(&fname, pos.row).unwrap();
trace!("line: `{}`", line);
let start_pos = {
let mut col = 0;
for (i, c) in line.chars().enumerate() {
if !(c.is_alphanumeric() || c == '_') {
col = i + 1;
}
if i == pos.col.0 as usize {
break;
}
}
trace!("start: {}", col);
span::Position::new(pos.row, span::Column::new_zero_indexed(col as u32))
};
let end_pos = {
let mut col = pos.col.0 as usize;
for c in line.chars().skip(col) {
if !(c.is_alphanumeric() || c == '_') {
break;
}
col += 1;
}
trace!("end: {}", col);
span::Position::new(pos.row, span::Column::new_zero_indexed(col as u32))
};
Span::from_positions(start_pos,
end_pos,
fname.to_owned())
}
}
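The column-scanning logic in `convert_pos_to_span` above can be pulled out into a standalone sketch (a hypothetical helper, not part of the RLS source; columns are zero-indexed): scan left-to-right remembering the last non-identifier character before the cursor, then scan right from the cursor until the identifier ends.

```rust
// Returns the [start, end) byte-column span of the identifier under `cursor`,
// using the same "alphanumeric or underscore" test as convert_pos_to_span.
fn identifier_span(line: &str, cursor: usize) -> (usize, usize) {
    let is_ident = |c: char| c.is_alphanumeric() || c == '_';

    // Walk forward to the cursor, tracking where the current word began.
    let mut start = 0;
    for (i, c) in line.chars().enumerate() {
        if !is_ident(c) {
            start = i + 1;
        }
        if i == cursor {
            break;
        }
    }

    // Walk right from the cursor until the identifier ends.
    let mut end = cursor;
    for c in line.chars().skip(cursor) {
        if !is_ident(c) {
            break;
        }
        end += 1;
    }
    (start, end)
}

fn main() {
    let line = "let foo_bar = baz();";
    // Cursor on the `f` of `foo_bar` (column 4).
    assert_eq!(identifier_span(line, 4), (4, 11));
    // Cursor inside `baz` (column 15).
    assert_eq!(identifier_span(line, 15), (14, 17));
}
```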
fn racer_coord(line: span::Row<span::OneIndexed>,
column: span::Column<span::ZeroIndexed>)
-> racer::Coordinate {
racer::Coordinate {
line: line.0 as usize,
column: column.0 as usize,
}
}
fn from_racer_coord(coord: racer::Coordinate) -> (span::Row<span::OneIndexed>,span::Column<span::ZeroIndexed>) {
(span::Row::new_one_indexed(coord.line as u32), span::Column::new_zero_indexed(coord.column as u32))
}
fn pos_to_racer_location(pos: Position) -> racer::Location {
let pos = ls_util::position_to_rls(pos);
racer::Location::Coords(racer_coord(pos.row.one_indexed(), pos.col))
}
fn location_from_racer_match(mtch: racer::Match) -> Option<Location> {
let source_path = &mtch.filepath;
mtch.coords.map(|coord| {
let (row, col) = from_racer_coord(coord);
let loc = span::Location::new(row.zero_indexed(), col, source_path);
ls_util::rls_location_to_location(&loc)
})
}

rls/src/build.rs Normal file (726 lines)

@ -0,0 +1,726 @@
// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
extern crate getopts;
extern crate rustc;
extern crate rustc_driver;
extern crate rustc_errors as errors;
extern crate rustc_resolve;
extern crate rustc_save_analysis;
extern crate syntax;
use cargo::core::{PackageId, MultiShell, Workspace};
use cargo::ops::{compile_with_exec, Executor, Context, CompileOptions, CompileMode, CompileFilter};
use cargo::util::{Config as CargoConfig, ProcessBuilder, ProcessError, homedir, ConfigValue};
use data::Analysis;
use vfs::Vfs;
use self::rustc::session::Session;
use self::rustc::session::config::{self, Input, ErrorOutputType};
use self::rustc_driver::{RustcDefaultCalls, run_compiler, run, Compilation, CompilerCalls};
use self::rustc_driver::driver::CompileController;
use self::rustc_save_analysis as save;
use self::rustc_save_analysis::CallbackHandler;
use self::syntax::ast;
use self::syntax::codemap::{FileLoader, RealFileLoader};
use config::Config;
use std::collections::HashMap;
use std::env;
use std::ffi::OsString;
use std::fs::{read_dir, remove_file};
use std::io::{self, Write};
use std::mem;
use std::path::{Path, PathBuf};
use std::process::Command;
use std::sync::{Arc, Mutex};
use std::sync::atomic::{AtomicBool, Ordering};
use std::sync::mpsc::{channel, Sender};
use std::thread;
use std::time::Duration;
/// Manages builds.
///
/// The IDE will request builds quickly (possibly on every keystroke), so there
/// is no point running every one. We also avoid running more than one build at
/// once. We cannot cancel builds. It might be worth running builds in parallel
/// or cancelling a started build.
///
/// `BuildPriority::Immediate` builds are started straightaway. Normal builds are
/// started after a timeout. A new build request cancels any pending build requests.
///
/// From the client's point of view, a build request is not guaranteed to cause
/// a build. However, a build is guaranteed to happen and that build will begin
/// after the build request is received (no guarantee on how long after), and
/// that build is guaranteed to have finished before the build request returns.
///
/// There is no way for the client to specify that an individual request will
/// result in a build. However, you can tell from the result - if a build
/// was run, the build result will contain any errors or warnings and an indication
/// of success or failure. If the build was not run, the result indicates that
/// it was squashed.
pub struct BuildQueue {
build_dir: Mutex<Option<PathBuf>>,
cmd_line_args: Arc<Mutex<Vec<String>>>,
cmd_line_envs: Arc<Mutex<HashMap<String, Option<OsString>>>>,
// True if a build is running.
// Note I have been conservative with Ordering when accessing this atomic,
// we might be able to do better.
running: AtomicBool,
// A vec of channels to pending build threads.
pending: Mutex<Vec<Sender<Signal>>>,
vfs: Arc<Vfs>,
config: Mutex<Config>,
}
#[derive(Debug)]
pub enum BuildResult {
// Build was successful, argument is warnings.
Success(Vec<String>, Option<Analysis>),
// Build finished with errors, argument is errors and warnings.
Failure(Vec<String>, Option<Analysis>),
// Build was coalesced with another build.
Squashed,
// There was an error attempting to build.
Err,
}
/// Priority for a build request.
#[derive(Clone, Copy, Debug, Eq, PartialEq)]
pub enum BuildPriority {
/// Run this build as soon as possible (e.g., on save or explicit build request).
Immediate,
/// A regular build request (e.g., on a minor edit).
Normal,
}
// Minimum time to wait before starting a `BuildPriority::Normal` build.
const WAIT_TO_BUILD: u64 = 500;
#[derive(Clone, Copy, Debug, Eq, PartialEq)]
enum Signal {
Build,
Skip,
}
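A minimal, self-contained sketch of how `Signal` is used to squash pending builds (a hypothetical simplification, not the RLS code itself; the real `request_build` below additionally debounces `Normal` requests with `WAIT_TO_BUILD`): a pending request parks on a channel, and a newer request cancels it by sending `Skip`, as `cancel_pending` does.

```rust
use std::sync::mpsc::{channel, Sender};
use std::sync::{Arc, Mutex};
use std::thread;

#[derive(Clone, Copy, Debug, PartialEq)]
enum Signal { Build, Skip }

fn main() {
    let pending: Arc<Mutex<Vec<Sender<Signal>>>> = Arc::new(Mutex::new(Vec::new()));

    // First request: registers itself on the pending list and parks,
    // waiting to be told whether to build or to skip.
    let p = pending.clone();
    let first = thread::spawn(move || {
        let (tx, rx) = channel();
        p.lock().unwrap().push(tx);
        rx.recv().unwrap_or(Signal::Build)
    });

    // Wait until the first request has registered itself.
    while pending.lock().unwrap().is_empty() {
        thread::yield_now();
    }

    // A newer request arrives and cancels everything pending.
    for tx in pending.lock().unwrap().drain(..) {
        let _ = tx.send(Signal::Skip);
    }

    // The parked request observes that it was squashed.
    assert_eq!(first.join().unwrap(), Signal::Skip);
}
```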
impl BuildQueue {
pub fn new(vfs: Arc<Vfs>) -> BuildQueue {
BuildQueue {
build_dir: Mutex::new(None),
cmd_line_args: Arc::new(Mutex::new(vec![])),
cmd_line_envs: Arc::new(Mutex::new(HashMap::new())),
running: AtomicBool::new(false),
pending: Mutex::new(vec![]),
vfs: vfs,
config: Mutex::new(Config::default()),
}
}
pub fn request_build(&self, build_dir: &Path, priority: BuildPriority) -> BuildResult {
// println!("request_build, {:?} {:?}", build_dir, priority);
// If there is a change in the project directory, then we can forget any
// pending build and start straight away on this new build.
{
let mut prev_build_dir = self.build_dir.lock().unwrap();
if prev_build_dir.as_ref().map_or(true, |dir| dir != build_dir) {
*prev_build_dir = Some(build_dir.to_owned());
self.cancel_pending();
let mut config = self.config.lock().unwrap();
*config = Config::from_path(build_dir);
let mut cmd_line_args = self.cmd_line_args.lock().unwrap();
*cmd_line_args = vec![];
}
}
self.cancel_pending();
match priority {
BuildPriority::Immediate => {
// There is a build running, wait for it to finish, then run.
if self.running.load(Ordering::SeqCst) {
let (tx, rx) = channel();
self.pending.lock().unwrap().push(tx);
// Blocks.
// println!("blocked on build");
let signal = rx.recv().unwrap_or(Signal::Build);
if signal == Signal::Skip {
return BuildResult::Squashed;
}
}
}
BuildPriority::Normal => {
let (tx, rx) = channel();
self.pending.lock().unwrap().push(tx);
thread::sleep(Duration::from_millis(WAIT_TO_BUILD));
if self.running.load(Ordering::SeqCst) {
// Blocks
// println!("blocked until wake up");
let signal = rx.recv().unwrap_or(Signal::Build);
if signal == Signal::Skip {
return BuildResult::Squashed;
}
} else if rx.try_recv().unwrap_or(Signal::Build) == Signal::Skip {
// Doesn't block.
return BuildResult::Squashed;
}
}
}
// If another build has started already, we don't need to build
// ourselves (it must have arrived after this request, so we don't add
// to the pending list). But we do need to wait for that build to
// finish.
if self.running.swap(true, Ordering::SeqCst) {
let mut wait = 100;
while self.running.load(Ordering::SeqCst) && wait < 50000 {
// println!("loop of death");
thread::sleep(Duration::from_millis(wait));
wait *= 2;
}
return BuildResult::Squashed;
}
let result = self.build();
self.running.store(false, Ordering::SeqCst);
// If there is a pending build, run it now.
let mut pending = self.pending.lock().unwrap();
let pending = mem::replace(&mut *pending, vec![]);
if !pending.is_empty() {
// Kick off one build, then skip the rest.
let mut pending = pending.iter();
while let Some(next) = pending.next() {
if next.send(Signal::Build).is_ok() {
break;
}
}
for t in pending {
let _ = t.send(Signal::Skip);
}
}
result
}
// Cancels all pending builds without running any of them.
fn cancel_pending(&self) {
let mut pending = self.pending.lock().unwrap();
let pending = mem::replace(&mut *pending, vec![]);
for t in pending {
let _ = t.send(Signal::Skip);
}
}
// Build the project.
fn build(&self) -> BuildResult {
// When we change build directory (presumably because the IDE is
// changing project), we must do a cargo build of the whole project.
// Otherwise we just use rustc directly.
//
// The 'full cargo build' is a `cargo check` customised and run
// in-process. Cargo will shell out to call rustc (this means the
// compiler available at runtime must match the compiler linked to
// the RLS). All but the last crate are built as normal, we intercept
// the call to the last crate and do our own rustc build. We cache the
// command line args and environment so we can avoid running Cargo in
// the future.
//
// Our 'short' rustc build runs rustc directly and in-process (we must
// do this so we can load changed code from the VFS, rather than from
// disk). We get the data we need by building with `-Zsave-analysis`.
let needs_to_run_cargo = {
let cmd_line_args = self.cmd_line_args.lock().unwrap();
cmd_line_args.is_empty()
};
let build_dir = &self.build_dir.lock().unwrap();
let build_dir = build_dir.as_ref().unwrap();
if needs_to_run_cargo {
if let BuildResult::Err = self.cargo(build_dir.clone()) {
return BuildResult::Err;
}
}
let cmd_line_args = self.cmd_line_args.lock().unwrap();
assert!(!cmd_line_args.is_empty());
let cmd_line_envs = self.cmd_line_envs.lock().unwrap();
self.rustc(&*cmd_line_args, &*cmd_line_envs, build_dir)
}
// Runs an in-process instance of Cargo.
fn cargo(&self, build_dir: PathBuf) -> BuildResult {
struct RlsExecutor {
cmd_line_args: Arc<Mutex<Vec<String>>>,
cmd_line_envs: Arc<Mutex<HashMap<String, Option<OsString>>>>,
cur_package_id: Mutex<Option<PackageId>>,
config: Config,
}
impl RlsExecutor {
fn new(cmd_line_args: Arc<Mutex<Vec<String>>>,
cmd_line_envs: Arc<Mutex<HashMap<String, Option<OsString>>>>,
config: Config) -> RlsExecutor {
RlsExecutor {
cmd_line_args: cmd_line_args,
cmd_line_envs: cmd_line_envs,
cur_package_id: Mutex::new(None),
config: config,
}
}
}
impl Executor for RlsExecutor {
fn init(&self, cx: &Context) {
let mut cur_package_id = self.cur_package_id.lock().unwrap();
*cur_package_id = Some(cx.ws
.current_opt()
.expect("No current package in Cargo")
.package_id()
.clone());
}
fn exec(&self, cmd: ProcessBuilder, id: &PackageId) -> Result<(), ProcessError> {
// Delete any stale data. We try to remove any JSON files with
// the same crate name as Cargo would emit. This includes files
// with the same crate name but different hashes, e.g., those
// made with a different compiler.
let args = cmd.get_args();
let crate_name = parse_arg(args, "--crate-name").expect("no crate-name in rustc command line");
let out_dir = parse_arg(args, "--out-dir").expect("no out-dir in rustc command line");
let analysis_dir = Path::new(&out_dir).join("save-analysis");
if let Ok(dir_contents) = read_dir(&analysis_dir) {
for entry in dir_contents {
let entry = entry.expect("unexpected error reading save-analysis directory");
let name = entry.file_name();
let name = name.to_str().unwrap();
if name.starts_with(&crate_name) && name.ends_with(".json") {
debug!("removing: `{:?}`", name);
remove_file(entry.path()).expect("could not remove file");
}
}
}
let is_primary_crate = {
let cur_package_id = self.cur_package_id.lock().unwrap();
id == cur_package_id.as_ref().expect("Executor has not been initialised")
};
if is_primary_crate {
let mut args: Vec<_> =
cmd.get_args().iter().map(|a| a.clone().into_string().unwrap()).collect();
// We end up taking this code path for build scripts; we don't
// want to do that, so we check here whether the crate is actually
// being linked (cf. emit=metadata) and, if it is, just call the
// usual rustc. This is clearly a bit fragile (if the emit
// string changes, we get screwed).
if args.contains(&"--emit=dep-info,link".to_owned()) {
trace!("rustc not intercepted (link)");
return cmd.exec();
}
trace!("intercepted rustc, args: {:?}", args);
// FIXME here and below should check $RUSTC before using rustc.
{
// Cargo is going to expect to get dep-info for this crate, so we shell out
// to rustc to get that. This is not really ideal, because we are going to
// compute this info anyway when we run rustc ourselves, but we don't do
// that before we return to Cargo.
// FIXME Don't do this. Instead either persuade Cargo that it doesn't need
// this info at all, or start our build here rather than on another thread
// so the dep-info is ready by the time we return from this callback.
let mut cmd_dep_info = Command::new("rustc");
for a in &args {
if a.starts_with("--emit") {
cmd_dep_info.arg("--emit=dep-info");
} else {
cmd_dep_info.arg(a);
}
}
if let Some(cwd) = cmd.get_cwd() {
cmd_dep_info.current_dir(cwd);
}
cmd_dep_info.status().expect("Couldn't execute rustc");
}
args.insert(0, "rustc".to_owned());
if self.config.cfg_test {
args.push("--test".to_owned());
}
if self.config.sysroot.is_empty() {
args.push("--sysroot".to_owned());
let home = option_env!("RUSTUP_HOME").or(option_env!("MULTIRUST_HOME"));
let toolchain = option_env!("RUSTUP_TOOLCHAIN").or(option_env!("MULTIRUST_TOOLCHAIN"));
let sys_root = if let (Some(home), Some(toolchain)) = (home, toolchain) {
format!("{}/toolchains/{}", home, toolchain)
} else {
option_env!("SYSROOT")
.map(|s| s.to_owned())
.or_else(|| Command::new("rustc")
.arg("--print")
.arg("sysroot")
.output()
.ok()
.and_then(|out| String::from_utf8(out.stdout).ok())
.map(|s| s.trim().to_owned()))
.expect("need to specify SYSROOT env var, \
or use rustup or multirust")
};
args.push(sys_root.to_owned());
}
let envs = cmd.get_envs();
trace!("envs: {:?}", envs);
{
let mut queue_args = self.cmd_line_args.lock().unwrap();
*queue_args = args.clone();
}
{
let mut queue_envs = self.cmd_line_envs.lock().unwrap();
*queue_envs = envs.clone();
}
Ok(())
} else {
trace!("rustc not intercepted");
cmd.exec()
}
}
}
let rls_config = {
let rls_config = self.config.lock().unwrap();
rls_config.clone()
};
trace!("cargo - `{:?}`", build_dir);
let exec = RlsExecutor::new(self.cmd_line_args.clone(),
self.cmd_line_envs.clone(),
rls_config.clone());
let out = Arc::new(Mutex::new(vec![]));
let err = Arc::new(Mutex::new(vec![]));
let out_clone = out.clone();
let err_clone = err.clone();
// Cargo may or may not spawn threads to run the various builds; since
// we may be in a separate thread, we need to block and wait for our
// thread. However, if Cargo doesn't run a separate thread, then we'll
// just wait forever. Therefore, we spawn an extra thread here to be safe.
let handle = thread::spawn(move || {
let hardcoded = "-Zunstable-options -Zsave-analysis --error-format=json \
-Zcontinue-parse-after-error";
if rls_config.sysroot.is_empty() {
env::set_var("RUSTFLAGS", hardcoded);
} else {
env::set_var("RUSTFLAGS", &format!("--sysroot {} {}", rls_config.sysroot, hardcoded));
}
let shell = MultiShell::from_write(Box::new(BufWriter(out.clone())),
Box::new(BufWriter(err.clone())));
let config = make_cargo_config(&build_dir, shell);
let mut manifest_path = build_dir.clone();
manifest_path.push("Cargo.toml");
trace!("manifest_path: {:?}", manifest_path);
let ws = Workspace::new(&manifest_path, &config).expect("could not create cargo workspace");
let mut opts = CompileOptions::default(&config, CompileMode::Check);
if rls_config.build_lib {
opts.filter = CompileFilter::new(true, &[], &[], &[], &[]);
}
compile_with_exec(&ws, &opts, Arc::new(exec)).expect("could not run cargo");
});
match handle.join() {
Ok(_) => BuildResult::Success(vec![], None),
Err(_) => {
info!("cargo stdout {}", String::from_utf8(out_clone.lock().unwrap().to_owned()).unwrap());
info!("cargo stderr {}", String::from_utf8(err_clone.lock().unwrap().to_owned()).unwrap());
BuildResult::Err
}
}
}
// Runs a single instance of rustc. Runs in-process.
fn rustc(&self, args: &[String], envs: &HashMap<String, Option<OsString>>, build_dir: &Path) -> BuildResult {
trace!("rustc - args: `{:?}`, envs: {:?}, build dir: {:?}", args, envs, build_dir);
let changed = self.vfs.get_cached_files();
let _restore_env = Environment::push(envs);
let buf = Arc::new(Mutex::new(vec![]));
let err_buf = buf.clone();
let args = args.to_owned();
let analysis = Arc::new(Mutex::new(None));
let mut controller = RlsRustcCalls::new(analysis.clone());
let exit_code = ::std::panic::catch_unwind(|| {
run(move || {
// Replace stderr so we catch most errors.
run_compiler(&args,
&mut controller,
Some(Box::new(ReplacedFileLoader::new(changed))),
Some(Box::new(BufWriter(buf))))
})
});
// FIXME(#25) given that we are running the compiler directly, there is no need
// to serialise either the error messages or save-analysis - we should pass
// them both in memory, without using save-analysis.
let stderr_json_msg = convert_message_to_json_strings(Arc::try_unwrap(err_buf)
.unwrap()
.into_inner()
.unwrap());
return match exit_code {
Ok(0) => BuildResult::Success(stderr_json_msg, analysis.lock().unwrap().clone()),
_ => BuildResult::Failure(stderr_json_msg, analysis.lock().unwrap().clone()),
};
// Our compiler controller. We mostly delegate to the default rustc
// controller, but use our own callback for save-analysis.
#[derive(Clone)]
struct RlsRustcCalls {
default_calls: RustcDefaultCalls,
analysis: Arc<Mutex<Option<Analysis>>>,
}
impl RlsRustcCalls {
fn new(analysis: Arc<Mutex<Option<Analysis>>>) -> RlsRustcCalls {
RlsRustcCalls {
default_calls: RustcDefaultCalls,
analysis: analysis,
}
}
}
impl<'a> CompilerCalls<'a> for RlsRustcCalls {
fn early_callback(&mut self,
matches: &getopts::Matches,
sopts: &config::Options,
cfg: &ast::CrateConfig,
descriptions: &errors::registry::Registry,
output: ErrorOutputType)
-> Compilation {
self.default_calls.early_callback(matches, sopts, cfg, descriptions, output)
}
fn no_input(&mut self,
matches: &getopts::Matches,
sopts: &config::Options,
cfg: &ast::CrateConfig,
odir: &Option<PathBuf>,
ofile: &Option<PathBuf>,
descriptions: &errors::registry::Registry)
-> Option<(Input, Option<PathBuf>)> {
self.default_calls.no_input(matches, sopts, cfg, odir, ofile, descriptions)
}
fn late_callback(&mut self,
matches: &getopts::Matches,
sess: &Session,
input: &Input,
odir: &Option<PathBuf>,
ofile: &Option<PathBuf>)
-> Compilation {
self.default_calls.late_callback(matches, sess, input, odir, ofile)
}
fn build_controller(&mut self,
sess: &Session,
matches: &getopts::Matches)
-> CompileController<'a> {
let mut result = self.default_calls.build_controller(sess, matches);
let analysis = self.analysis.clone();
result.after_analysis.callback = Box::new(move |state| {
save::process_crate(state.tcx.unwrap(),
state.expanded_crate.unwrap(),
state.analysis.unwrap(),
state.crate_name.unwrap(),
CallbackHandler { callback: &mut |a| {
let mut analysis = analysis.lock().unwrap();
*analysis = Some(unsafe { ::std::mem::transmute(a.clone()) } );
} });
});
result.after_analysis.run_callback_on_error = true;
result.make_glob_map = rustc_resolve::MakeGlobMap::Yes;
result
}
}
}
}
fn make_cargo_config(build_dir: &Path, shell: MultiShell) -> CargoConfig {
let config = CargoConfig::new(shell,
// This is Cargo's cwd. We are using the actual cwd, but perhaps
// we should use build_dir or something else?
env::current_dir().unwrap(),
homedir(&build_dir).unwrap());
// Cargo is expecting the config to come from a config file and keeps
// track of the path to that file. We'll make one up, it shouldn't be
// used for much. Cargo does use it for finding a root path. Since
// we pass an absolute path for the build directory, that doesn't
// matter too much. However, Cargo still takes the grandparent of this
// path, so we need to have at least two path elements.
let config_path = build_dir.join("config").join("rls-config.toml");
let mut config_value_map = config.load_values().unwrap();
{
let build_value = config_value_map.entry("build".to_owned()).or_insert(ConfigValue::Table(HashMap::new(), config_path.clone()));
let target_dir = build_dir.join("target").join("rls").to_str().unwrap().to_owned();
let td_value = ConfigValue::String(target_dir, config_path);
if let &mut ConfigValue::Table(ref mut build_table, _) = build_value {
build_table.insert("target-dir".to_owned(), td_value);
} else {
unreachable!();
}
}
config.set_values(config_value_map).unwrap();
config
}
fn parse_arg(args: &[OsString], arg: &str) -> Option<String> {
for (i, a) in args.iter().enumerate() {
if a == arg {
return Some(args[i + 1].clone().into_string().unwrap());
}
}
None
}
// A thread-safe buffer for writing.
struct BufWriter(Arc<Mutex<Vec<u8>>>);
impl Write for BufWriter {
fn write(&mut self, buf: &[u8]) -> io::Result<usize> {
self.0.lock().unwrap().write(buf)
}
fn flush(&mut self) -> io::Result<()> {
self.0.lock().unwrap().flush()
}
}
// An RAII helper to set and reset the current working directory and env vars.
struct Environment {
old_vars: HashMap<String, Option<OsString>>,
}
impl Environment {
fn push(envs: &HashMap<String, Option<OsString>>) -> Environment {
let mut result = Environment {
old_vars: HashMap::new(),
};
for (k, v) in envs {
result.old_vars.insert(k.to_owned(), env::var_os(k));
match *v {
Some(ref v) => env::set_var(k, v),
None => env::remove_var(k),
}
}
result
}
}
impl Drop for Environment {
fn drop(&mut self) {
for (k, v) in &self.old_vars {
match *v {
Some(ref v) => env::set_var(k, v),
None => env::remove_var(k),
}
}
}
}
fn convert_message_to_json_strings(input: Vec<u8>) -> Vec<String> {
let mut output = vec![];
// FIXME: this is *so gross*. Trying to work around cargo not supporting JSON messages
let it = input.into_iter();
let mut read_iter = it.skip_while(|&x| x != b'{');
let mut _msg = String::new();
loop {
match read_iter.next() {
Some(b'\n') => {
output.push(_msg);
_msg = String::new();
while let Some(res) = read_iter.next() {
if res == b'{' {
_msg.push('{');
break;
}
}
}
Some(x) => {
_msg.push(x as char);
}
None => {
break;
}
}
}
output
}
/// Tries to read a file from a list of replacements, and if the file is not
/// there, then reads it from disk, by delegating to `RealFileLoader`.
pub struct ReplacedFileLoader {
replacements: HashMap<PathBuf, String>,
real_file_loader: RealFileLoader,
}
impl ReplacedFileLoader {
pub fn new(replacements: HashMap<PathBuf, String>) -> ReplacedFileLoader {
ReplacedFileLoader {
replacements: replacements,
real_file_loader: RealFileLoader,
}
}
}
impl FileLoader for ReplacedFileLoader {
fn file_exists(&self, path: &Path) -> bool {
self.real_file_loader.file_exists(path)
}
fn abs_path(&self, path: &Path) -> Option<PathBuf> {
self.real_file_loader.abs_path(path)
}
fn read_file(&self, path: &Path) -> io::Result<String> {
if let Some(abs_path) = self.abs_path(path) {
if self.replacements.contains_key(&abs_path) {
return Ok(self.replacements[&abs_path].clone());
}
}
self.real_file_loader.read_file(path)
}
}
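The byte-walking in `convert_message_to_json_strings` above boils down to: for each line of mixed cargo/rustc output, keep only the part starting at the first `{`. A minimal stand-alone sketch of that idea (hypothetical helper name and slightly simplified semantics, not the RLS function itself):

```rust
// Hypothetical simplification: for each line of mixed compiler output,
// keep only the suffix starting at the first '{' (assumed to be JSON).
fn extract_json_strings(input: &[u8]) -> Vec<String> {
    String::from_utf8_lossy(input)
        .lines()
        .filter_map(|line| line.find('{').map(|i| line[i..].to_string()))
        .collect()
}

fn main() {
    let raw = b"warning: noise\n{\"message\":\"ok\"}\ncargo: {\"level\":\"error\"}\n";
    assert_eq!(extract_json_strings(raw),
               vec!["{\"message\":\"ok\"}".to_string(),
                    "{\"level\":\"error\"}".to_string()]);
}
```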

rls/src/config.rs (new file, 199 lines)
@@ -0,0 +1,199 @@
// Copyright 2017 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
use toml;
use std::fs::File;
use std::io::Read;
use std::path::Path;
macro_rules! impl_enum_decodable {
( $e:ident, $( $x:ident ),* ) => {
impl ::rustc_serialize::Decodable for $e {
fn decode<D: ::rustc_serialize::Decoder>(d: &mut D) -> Result<Self, D::Error> {
use std::ascii::AsciiExt;
let s = try!(d.read_str());
$(
if stringify!($x).eq_ignore_ascii_case(&s) {
return Ok($e::$x);
}
)*
Err(d.error("Bad variant"))
}
}
impl ::std::str::FromStr for $e {
type Err = &'static str;
fn from_str(s: &str) -> Result<Self, Self::Err> {
use std::ascii::AsciiExt;
$(
if stringify!($x).eq_ignore_ascii_case(s) {
return Ok($e::$x);
}
)*
Err("Bad variant")
}
}
impl ::config::ConfigType for $e {
fn get_variant_names() -> String {
let mut variants = Vec::new();
$(
variants.push(stringify!($x));
)*
format!("[{}]", variants.join("|"))
}
}
};
}
macro_rules! configuration_option_enum {
($e:ident: $( $x:ident ),+ $(,)*) => {
#[derive(Copy, Clone, Eq, PartialEq, Debug)]
pub enum $e {
$( $x ),+
}
impl_enum_decodable!($e, $( $x ),+);
}
}
// This trait and the following impl blocks are there so that we can use
// UFCS inside the get_docs() function on types for configs.
pub trait ConfigType {
fn get_variant_names() -> String;
}
impl ConfigType for bool {
fn get_variant_names() -> String {
String::from("<boolean>")
}
}
impl ConfigType for usize {
fn get_variant_names() -> String {
String::from("<unsigned integer>")
}
}
impl ConfigType for String {
fn get_variant_names() -> String {
String::from("<string>")
}
}
macro_rules! create_config {
($($i:ident: $ty:ty, $def:expr, $unstable:expr, $( $dstring:expr ),+ );+ $(;)*) => (
#[derive(RustcDecodable, Clone)]
pub struct Config {
$(pub $i: $ty),+
}
// Just like the Config struct but with each property wrapped
// as Option<T>. This is used to parse a rustfmt.toml that doesn't
// specify all properties of `Config`.
// We first parse into `ParsedConfig`, then create a default `Config`
// and overwrite the properties with corresponding values from `ParsedConfig`
#[derive(RustcDecodable, Clone, Deserialize)]
pub struct ParsedConfig {
$(pub $i: Option<$ty>),+
}
impl Config {
fn fill_from_parsed_config(mut self, parsed: ParsedConfig) -> Config {
$(
if let Some(val) = parsed.$i {
self.$i = val;
// TODO error out if unstable
}
)+
self
}
pub fn from_toml(toml: &str) -> Config {
let parsed_config: ParsedConfig = match toml::from_str(toml) {
Ok(decoded) => decoded,
Err(e) => {
debug!("Decoding config file failed.");
debug!("Error: {}", e);
debug!("Config:\n{}", toml);
let parsed: toml::Value = toml.parse().expect("Could not parse TOML");
debug!("\n\nParsed:\n{:?}", parsed);
panic!();
}
};
Config::default().fill_from_parsed_config(parsed_config)
}
#[allow(dead_code)]
pub fn print_docs() {
use std::cmp;
let max = 0;
$( let max = cmp::max(max, stringify!($i).len()+1); )+
let mut space_str = String::with_capacity(max);
for _ in 0..max {
space_str.push(' ');
}
println!("Configuration Options:");
$(
if !$unstable {
let name_raw = stringify!($i);
let mut name_out = String::with_capacity(max);
for _ in name_raw.len()..max-1 {
name_out.push(' ')
}
name_out.push_str(name_raw);
name_out.push(' ');
println!("{}{} Default: {:?}",
name_out,
<$ty>::get_variant_names(),
$def);
$(
println!("{}{}", space_str, $dstring);
)+
println!("");
}
)+
}
/// Attempts to read a config from rls.toml in `path`; failing that, uses defaults.
pub fn from_path(path: &Path) -> Config {
let config_path = path.to_owned().join("rls.toml");
let config_file = File::open(config_path);
let mut toml = String::new();
if let Ok(mut f) = config_file {
f.read_to_string(&mut toml).unwrap();
}
Config::from_toml(&toml)
}
}
// Template for the default configuration
impl Default for Config {
fn default() -> Config {
Config {
$(
$i: $def,
)+
}
}
}
)
}
create_config! {
sysroot: String, String::new(), false, "--sysroot";
build_lib: bool, false, false, "cargo check --lib";
cfg_test: bool, true, false, "build cfg(test) code";
unstable_features: bool, false, false, "enable unstable features";
}
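The layering that `create_config!` generates above (defaults overwritten only by fields the TOML actually specified) can be sketched without the macro. This is an illustrative miniature using two of the real option names, not the generated code itself:

```rust
// Miniature of the Config/ParsedConfig pair the macro expands to.
#[derive(Debug, PartialEq)]
struct Config { build_lib: bool, cfg_test: bool }

// Every field optional, so a partial rls.toml still parses.
struct ParsedConfig { build_lib: Option<bool>, cfg_test: Option<bool> }

impl Config {
    fn default() -> Config { Config { build_lib: false, cfg_test: true } }
    // Mirrors `fill_from_parsed_config`: only fields the TOML specified win.
    fn fill(mut self, parsed: ParsedConfig) -> Config {
        if let Some(v) = parsed.build_lib { self.build_lib = v; }
        if let Some(v) = parsed.cfg_test { self.cfg_test = v; }
        self
    }
}

fn main() {
    // rls.toml only said `build_lib = true`; `cfg_test` keeps its default.
    let c = Config::default().fill(ParsedConfig { build_lib: Some(true), cfg_test: None });
    assert_eq!(c, Config { build_lib: true, cfg_test: true });
}
```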

rls/src/lsp_data.rs (new file, 179 lines)
@@ -0,0 +1,179 @@
// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
use std::fmt::Debug;
use std::path::PathBuf;
use std::error::Error;
use analysis::raw;
use hyper::Url;
use serde::Serialize;
use span;
use racer;
pub use ls_types::*;
macro_rules! impl_file_name {
($ty_name: ty) => {
impl $ty_name {
pub fn file_name(&self) -> PathBuf {
uri_string_to_file_name(&self.uri)
}
}
}
}
pub fn parse_file_path(uri: &Url) -> Result<PathBuf, Box<Error>> {
if uri.scheme() != "file" {
Err("URI scheme is not `file`".into())
} else {
uri.to_file_path().map_err(|_err| "Invalid file path in URI".into())
}
}
pub mod ls_util {
use super::*;
use Span;
use std::path::Path;
use hyper::Url;
use vfs::Vfs;
pub fn range_to_rls(r: Range) -> span::Range<span::ZeroIndexed> {
span::Range::from_positions(position_to_rls(r.start), position_to_rls(r.end))
}
pub fn position_to_rls(p: Position) -> span::Position<span::ZeroIndexed> {
span::Position::new(span::Row::new_zero_indexed(p.line as u32),
span::Column::new_zero_indexed(p.character as u32))
}
// An RLS span has the same info as an LSP Location
pub fn rls_to_location(span: &Span) -> Location {
Location {
uri: Url::from_file_path(&span.file).unwrap(),
range: rls_to_range(span.range),
}
}
pub fn rls_location_to_location(l: &span::Location<span::ZeroIndexed>) -> Location {
Location {
uri: Url::from_file_path(&l.file).unwrap(),
range: rls_to_range(span::Range::from_positions(l.position, l.position)),
}
}
pub fn rls_to_range(r: span::Range<span::ZeroIndexed>) -> Range {
Range {
start: rls_to_position(r.start()),
end: rls_to_position(r.end()),
}
}
pub fn rls_to_position(p: span::Position<span::ZeroIndexed>) -> Position {
Position {
line: p.row.0 as u64,
character: p.col.0 as u64,
}
}
/// Creates a `Range` spanning the whole file as currently known by `Vfs`
///
/// Panics if `Vfs` cannot load the file.
pub fn range_from_vfs_file(vfs: &Vfs, fname: &Path) -> Range {
let content = vfs.load_file(fname).unwrap();
if content.is_empty() {
Range {start: Position::new(0, 0), end: Position::new(0, 0)}
} else {
// range is zero-based and the end position is exclusive
Range {
start: Position::new(0, 0),
end: Position::new(content.lines().count() as u64 - 1,
content.lines().last().expect("String is not empty.").chars().count() as u64)
}
}
}
}
pub fn source_kind_from_def_kind(k: raw::DefKind) -> SymbolKind {
match k {
raw::DefKind::Enum => SymbolKind::Enum,
raw::DefKind::Tuple => SymbolKind::Array,
raw::DefKind::Struct => SymbolKind::Class,
raw::DefKind::Union => SymbolKind::Class,
raw::DefKind::Trait => SymbolKind::Interface,
raw::DefKind::Function |
raw::DefKind::Method |
raw::DefKind::Macro => SymbolKind::Function,
raw::DefKind::Mod => SymbolKind::Module,
raw::DefKind::Type => SymbolKind::Interface,
raw::DefKind::Local |
raw::DefKind::Static |
raw::DefKind::Const |
raw::DefKind::Field => SymbolKind::Variable,
}
}
pub fn completion_kind_from_match_type(m : racer::MatchType) -> CompletionItemKind {
match m {
racer::MatchType::Crate |
racer::MatchType::Module => CompletionItemKind::Module,
racer::MatchType::Struct => CompletionItemKind::Class,
racer::MatchType::Enum => CompletionItemKind::Enum,
racer::MatchType::StructField |
racer::MatchType::EnumVariant => CompletionItemKind::Field,
racer::MatchType::Macro |
racer::MatchType::Function |
racer::MatchType::FnArg |
racer::MatchType::Impl => CompletionItemKind::Function,
racer::MatchType::Type |
racer::MatchType::Trait |
racer::MatchType::TraitImpl => CompletionItemKind::Interface,
racer::MatchType::Let |
racer::MatchType::IfLet |
racer::MatchType::WhileLet |
racer::MatchType::For |
racer::MatchType::MatchArm |
racer::MatchType::Const |
racer::MatchType::Static => CompletionItemKind::Variable,
racer::MatchType::Builtin => CompletionItemKind::Keyword,
}
}
pub fn completion_item_from_racer_match(m : racer::Match) -> CompletionItem {
let mut item = CompletionItem::new_simple(m.matchstr.clone(), m.contextstr.clone());
item.kind = Some(completion_kind_from_match_type(m.mtype));
item
}
/* ----------------- JSON-RPC protocol types ----------------- */
/// An event-like (no response needed) notification message.
#[derive(Debug, Serialize)]
pub struct NotificationMessage<T>
where T: Debug + Serialize
{
jsonrpc: &'static str,
pub method: String,
pub params: T,
}
impl <T> NotificationMessage<T> where T: Debug + Serialize {
pub fn new(method: String, params: T) -> Self {
NotificationMessage {
jsonrpc: "2.0",
method: method,
params: params
}
}
}
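The end-of-file computation in `range_from_vfs_file` above reduces to: for non-empty content, the end is (number of lines minus one, character count of the last line), both zero-based. A sketch of just that arithmetic, with a hypothetical helper name:

```rust
// Hypothetical helper isolating the arithmetic: the zero-based (line, character)
// of the end of `content`, matching how `range_from_vfs_file` computes it.
fn end_position(content: &str) -> (u64, u64) {
    if content.is_empty() {
        (0, 0)
    } else {
        (content.lines().count() as u64 - 1,
         content.lines().last().unwrap().chars().count() as u64)
    }
}

fn main() {
    assert_eq!(end_position(""), (0, 0));
    assert_eq!(end_position("fn main() {\n}"), (1, 1));
    // Note: `lines()` ignores a trailing '\n', so "ab\n" still ends at (0, 2).
    assert_eq!(end_position("ab\n"), (0, 2));
}
```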

rls/src/main.rs (new file, 60 lines)
@@ -0,0 +1,60 @@
// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
#![feature(rustc_private)]
extern crate cargo;
#[macro_use]
extern crate derive_new;
extern crate env_logger;
extern crate hyper;
extern crate languageserver_types as ls_types;
#[macro_use]
extern crate log;
extern crate racer;
extern crate rls_analysis as analysis;
extern crate rls_vfs as vfs;
extern crate rls_span as span;
extern crate rls_data as data;
extern crate rustc_serialize;
extern crate rustfmt;
extern crate serde;
#[macro_use]
extern crate serde_derive;
extern crate serde_json;
extern crate toml;
extern crate url;
extern crate url_serde;
use std::sync::Arc;
mod build;
mod server;
mod actions;
mod lsp_data;
mod config;
#[cfg(test)]
mod test;
// Timeout = 1.5s (totally arbitrary).
const COMPILER_TIMEOUT: u64 = 1500;
type Span = span::Span<span::ZeroIndexed>;
pub fn main() {
env_logger::init().unwrap();
let analysis = Arc::new(analysis::AnalysisHost::new(analysis::Target::Debug));
let vfs = Arc::new(vfs::Vfs::new());
let build_queue = Arc::new(build::BuildQueue::new(vfs.clone()));
server::run_server(analysis, vfs, build_queue);
}
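`main` above wires the long-lived components together behind `Arc` so the server, build queue, and action handler can share them across threads without copying. A tiny sketch of that sharing pattern, with an illustrative struct rather than the RLS types:

```rust
use std::sync::Arc;
use std::thread;

// Illustrative stand-in for a shared component such as the Vfs.
struct Component { name: &'static str }

// Cloning the Arc is a reference-count bump, not a deep copy, so the
// worker thread observes the same component the caller holds.
fn shared_len(name: &'static str) -> usize {
    let shared = Arc::new(Component { name: name });
    let for_worker = shared.clone();
    thread::spawn(move || for_worker.name.len()).join().unwrap()
}

fn main() {
    assert_eq!(shared_len("vfs"), 3);
}
```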

rls/src/server.rs (new file, 515 lines)
@@ -0,0 +1,515 @@
// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
use analysis::AnalysisHost;
use vfs::Vfs;
use serde_json;
use build::*;
use lsp_data::*;
use actions::ActionHandler;
use std::fmt;
use std::io::{self, Read, Write, ErrorKind};
use std::sync::Arc;
use std::sync::atomic::{AtomicBool, Ordering};
use std::thread;
use std::path::PathBuf;
use config::Config;
#[derive(Debug, Serialize)]
pub struct Ack {}
#[derive(Debug, new)]
struct ParseError {
kind: ErrorKind,
message: &'static str,
id: Option<usize>,
}
#[derive(Debug)]
enum ServerMessage {
Request(Request),
Notification(Notification)
}
#[derive(Debug)]
struct Request {
id: usize,
method: Method
}
#[derive(Debug)]
enum Notification {
Exit,
CancelRequest(CancelParams),
Change(DidChangeTextDocumentParams),
Open(DidOpenTextDocumentParams),
Save(DidSaveTextDocumentParams),
}
/// Creates a public enum whose variants all contain a single serializable payload,
/// with an automatic JSON `to_string` implementation.
macro_rules! serializable_enum {
($enum_name:ident, $($variant_name:ident($variant_type:ty)),*) => (
pub enum $enum_name {
$(
$variant_name($variant_type),
)*
}
impl fmt::Display for $enum_name {
fn fmt(&self, f: &mut fmt::Formatter) -> Result<(), fmt::Error> {
let value = match *self {
$(
$enum_name::$variant_name(ref value) => serde_json::to_string(value),
)*
}.unwrap();
write!(f, "{}", value)
}
}
)
}
serializable_enum!(ResponseData,
Init(InitializeResult),
SymbolInfo(Vec<SymbolInformation>),
CompletionItems(Vec<CompletionItem>),
WorkspaceEdit(WorkspaceEdit),
TextEdit([TextEdit; 1]),
Locations(Vec<Location>),
Highlights(Vec<DocumentHighlight>),
HoverSuccess(Hover),
Ack(Ack)
);
// Generates the Method enum and parse_message function.
macro_rules! messages {
(
methods {
// $method_arg is really a 0-1 repetition
$($method_str: pat => $method_name: ident $(($method_arg: ty))*;)*
}
notifications {
$($notif_str: pat => $notif_name: ident $(($notif_arg: ty))*;)*
}
$($other_str: pat => $other_expr: expr;)*
) => {
#[derive(Debug)]
enum Method {
$($method_name$(($method_arg))*,)*
}
fn parse_message(input: &str) -> Result<ServerMessage, ParseError> {
let ls_command: serde_json::Value = serde_json::from_str(input).unwrap();
let params = ls_command.get("params");
macro_rules! params_as {
($ty: ty) => ({
let method: $ty =
serde_json::from_value(params.unwrap().to_owned()).unwrap();
method
});
}
macro_rules! id {
() => ((ls_command.get("id").map(|id| id.as_u64().unwrap() as usize)));
}
if let Some(v) = ls_command.get("method") {
if let Some(name) = v.as_str() {
match name {
$(
$method_str => {
let id = ls_command.get("id").unwrap().as_u64().unwrap() as usize;
Ok(ServerMessage::Request(Request{id: id, method: Method::$method_name$((params_as!($method_arg)))* }))
}
)*
$(
$notif_str => {
Ok(ServerMessage::Notification(Notification::$notif_name$((params_as!($notif_arg)))*))
}
)*
$(
$other_str => $other_expr,
)*
}
} else {
Err(ParseError::new(ErrorKind::InvalidData, "Method is not a string", id!()))
}
} else {
Err(ParseError::new(ErrorKind::InvalidData, "Method not found", id!()))
}
}
};
}
messages! {
methods {
"shutdown" => Shutdown;
"initialize" => Initialize(InitializeParams);
"textDocument/hover" => Hover(TextDocumentPositionParams);
"textDocument/definition" => GotoDef(TextDocumentPositionParams);
"textDocument/references" => FindAllRef(ReferenceParams);
"textDocument/completion" => Complete(TextDocumentPositionParams);
"textDocument/documentHighlight" => Highlight(TextDocumentPositionParams);
// currently, we safely ignore this as a pass-through since we fully handle
// textDocument/completion. In the future, we may want to use this method as a
// way to more lazily fill out completion information
"completionItem/resolve" => CompleteResolve(CompletionItem);
"textDocument/documentSymbol" => Symbols(DocumentSymbolParams);
"textDocument/rename" => Rename(RenameParams);
"textDocument/formatting" => Reformat(DocumentFormattingParams);
"textDocument/rangeFormatting" => ReformatRange(DocumentRangeFormattingParams);
}
notifications {
"exit" => Exit;
"textDocument/didChange" => Change(DidChangeTextDocumentParams);
"textDocument/didOpen" => Open(DidOpenTextDocumentParams);
"textDocument/didSave" => Save(DidSaveTextDocumentParams);
"$/cancelRequest" => CancelRequest(CancelParams);
}
// TODO handle me
"$/setTraceNotification" => Err(ParseError::new(ErrorKind::InvalidData, "setTraceNotification", None));
// TODO handle me
"workspace/didChangeConfiguration" => Err(ParseError::new(ErrorKind::InvalidData, "didChangeConfiguration", None));
_ => Err(ParseError::new(ErrorKind::InvalidData, "Unknown command", id!()));
}
pub struct LsService {
shut_down: AtomicBool,
msg_reader: Box<MessageReader + Sync + Send>,
output: Box<Output + Sync + Send>,
handler: ActionHandler,
}
#[derive(Eq, PartialEq, Debug, Clone, Copy)]
pub enum ServerStateChange {
Continue,
Break,
}
impl LsService {
pub fn new(analysis: Arc<AnalysisHost>,
vfs: Arc<Vfs>,
build_queue: Arc<BuildQueue>,
reader: Box<MessageReader + Send + Sync>,
output: Box<Output + Send + Sync>)
-> Arc<LsService> {
Arc::new(LsService {
shut_down: AtomicBool::new(false),
msg_reader: reader,
output: output,
handler: ActionHandler::new(analysis, vfs, build_queue),
})
}
pub fn run(this: Arc<Self>) {
while LsService::handle_message(this.clone()) == ServerStateChange::Continue {}
}
fn init(&self, id: usize, init: InitializeParams) {
let root_path = init.root_path.map(PathBuf::from);
let unstable_features = if let Some(ref root_path) = root_path {
let config = Config::from_path(&root_path);
config.unstable_features
} else {
false
};
let result = InitializeResult {
capabilities: ServerCapabilities {
text_document_sync: Some(TextDocumentSyncKind::Incremental),
hover_provider: Some(true),
completion_provider: Some(CompletionOptions {
resolve_provider: Some(true),
trigger_characters: vec![".".to_string(), ":".to_string()],
}),
// TODO
signature_help_provider: Some(SignatureHelpOptions {
trigger_characters: Some(vec![]),
}),
definition_provider: Some(true),
references_provider: Some(true),
document_highlight_provider: Some(true),
document_symbol_provider: Some(true),
workspace_symbol_provider: Some(true),
code_action_provider: Some(false),
// TODO maybe?
code_lens_provider: None,
document_formatting_provider: Some(unstable_features),
document_range_formatting_provider: Some(unstable_features),
document_on_type_formatting_provider: None, // TODO: review this, maybe add?
rename_provider: Some(unstable_features),
}
};
self.output.success(id, ResponseData::Init(result));
if let Some(root_path) = root_path {
self.handler.init(root_path, &*self.output);
}
}
pub fn handle_message(this: Arc<Self>) -> ServerStateChange {
let c = match this.msg_reader.read_message() {
Some(c) => c,
None => {
this.output.parse_error();
return ServerStateChange::Break
},
};
let this = this.clone();
thread::spawn(move || {
// FIXME(45) refactor to generate this match.
let message = parse_message(&c);
{
let shut_down = this.shut_down.load(Ordering::SeqCst);
if shut_down {
if let Ok(ServerMessage::Notification(Notification::Exit)) = message {
} else {
// We've shut down; ignore any messages other than 'exit'. This is not actually
// in the spec; I'm not sure we should do this, but it kinda makes sense.
return;
}
}
}
match message {
Ok(ServerMessage::Notification(method)) => {
match method {
Notification::Exit => {
trace!("exiting...");
let shut_down = this.shut_down.load(Ordering::SeqCst);
::std::process::exit(if shut_down { 0 } else { 1 });
}
Notification::CancelRequest(params) => {
trace!("request to cancel {:?}", params.id);
}
Notification::Change(change) => {
trace!("notification(change): {:?}", change);
this.handler.on_change(change, &*this.output);
}
Notification::Open(open) => {
trace!("notification(open): {:?}", open);
this.handler.on_open(open, &*this.output);
}
Notification::Save(save) => {
trace!("notification(save): {:?}", save);
this.handler.on_save(save, &*this.output);
}
}
}
Ok(ServerMessage::Request(Request{id, method})) => {
match method {
Method::Initialize(init) => {
trace!("command(init): {:?}", init);
this.init(id, init);
}
Method::Shutdown => {
trace!("shutting down...");
this.shut_down.store(true, Ordering::SeqCst);
let out = &*this.output;
out.success(id, ResponseData::Ack(Ack {}));
}
Method::Hover(params) => {
trace!("command(hover): {:?}", params);
this.handler.hover(id, params, &*this.output);
}
Method::GotoDef(params) => {
trace!("command(goto): {:?}", params);
this.handler.goto_def(id, params, &*this.output);
}
Method::Complete(params) => {
trace!("command(complete): {:?}", params);
this.handler.complete(id, params, &*this.output);
}
Method::CompleteResolve(params) => {
trace!("command(complete): {:?}", params);
this.output.success(id, ResponseData::CompletionItems(vec![params]))
}
Method::Highlight(params) => {
trace!("command(highlight): {:?}", params);
this.handler.highlight(id, params, &*this.output);
}
Method::Symbols(params) => {
trace!("command(goto): {:?}", params);
this.handler.symbols(id, params, &*this.output);
}
Method::FindAllRef(params) => {
trace!("command(find_all_refs): {:?}", params);
this.handler.find_all_refs(id, params, &*this.output);
}
Method::Rename(params) => {
trace!("command(rename): {:?}", params);
this.handler.rename(id, params, &*this.output);
}
Method::Reformat(params) => {
// FIXME take account of options.
trace!("command(reformat): {:?}", params);
this.handler.reformat(id, params.text_document, &*this.output);
}
Method::ReformatRange(params) => {
// FIXME reformats the whole file, not just a range.
// FIXME take account of options.
trace!("command(reformat range): {:?}", params);
this.handler.reformat(id, params.text_document, &*this.output);
}
}
}
Err(e) => {
trace!("parsing invalid message: {:?}", e);
if let Some(id) = e.id {
this.output.failure(id, "Unsupported message");
}
},
}
});
ServerStateChange::Continue
}
}
pub trait MessageReader {
fn read_message(&self) -> Option<String>;
}
struct StdioMsgReader;
impl MessageReader for StdioMsgReader {
fn read_message(&self) -> Option<String> {
macro_rules! handle_err {
($e: expr, $s: expr) => {
match $e {
Ok(x) => x,
Err(_) => {
debug!($s);
return None;
}
}
}
}
// Read in the "Content-length: xx" part
let mut buffer = String::new();
handle_err!(io::stdin().read_line(&mut buffer), "Could not read from stdin");
if buffer.is_empty() {
info!("Header is empty");
return None;
}
let res: Vec<&str> = buffer.split(' ').collect();
// Make sure we see the correct header
if res.len() != 2 {
info!("Header is malformed");
return None;
}
if res[0].to_lowercase() != "content-length:" {
info!("Header is missing 'content-length'");
return None;
}
let size = handle_err!(usize::from_str_radix(&res[1].trim(), 10), "Couldn't read size");
trace!("reading: {} bytes", size);
// Skip the new lines
let mut tmp = String::new();
handle_err!(io::stdin().read_line(&mut tmp), "Could not read from stdin");
let mut content = vec![0; size];
handle_err!(io::stdin().read_exact(&mut content), "Could not read from stdin");
let content = handle_err!(String::from_utf8(content), "Non-utf8 input");
Some(content)
}
}
pub trait Output {
fn response(&self, output: String);
fn parse_error(&self) {
self.response(r#"{"jsonrpc": "2.0", "error": {"code": -32700, "message": "Parse error"}, "id": null}"#.to_owned());
}
fn failure(&self, id: usize, message: &str) {
// For now this is a catch-all for any error back to the consumer of the RLS
const METHOD_NOT_FOUND: i64 = -32601;
#[derive(Serialize)]
struct ResponseError {
code: i64,
message: String
}
#[derive(Serialize)]
struct ResponseFailure {
jsonrpc: &'static str,
id: usize,
error: ResponseError,
}
let rf = ResponseFailure {
jsonrpc: "2.0",
id: id,
error: ResponseError {
code: METHOD_NOT_FOUND,
message: message.to_owned(),
},
};
let output = serde_json::to_string(&rf).unwrap();
self.response(output);
}
fn success(&self, id: usize, data: ResponseData) {
// {
// jsonrpc: String,
// id: usize,
// result: String,
// }
let output = format!("{{\"jsonrpc\":\"2.0\",\"id\":{},\"result\":{}}}", id, data);
self.response(output);
}
fn notify(&self, message: &str) {
let output = serde_json::to_string(
&NotificationMessage::new(message.to_owned(), ())
).unwrap();
self.response(output);
}
}
struct StdioOutput;
impl Output for StdioOutput {
fn response(&self, output: String) {
let o = format!("Content-Length: {}\r\n\r\n{}", output.len(), output);
debug!("response: {:?}", o);
print!("{}", o);
io::stdout().flush().unwrap();
}
}
pub fn run_server(analysis: Arc<AnalysisHost>, vfs: Arc<Vfs>, build_queue: Arc<BuildQueue>) {
debug!("Language Server Starting up");
let service = LsService::new(analysis,
vfs,
build_queue,
Box::new(StdioMsgReader),
Box::new(StdioOutput));
LsService::run(service);
debug!("Server shutting down");
}
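`StdioOutput` and `StdioMsgReader` above implement the LSP base-protocol framing: a `Content-Length` header, a blank line, then exactly that many bytes of JSON. A minimal sketch of both directions, using hypothetical helpers rather than the RLS API:

```rust
// Writes a body the way `StdioOutput::response` does.
fn frame(body: &str) -> String {
    format!("Content-Length: {}\r\n\r\n{}", body.len(), body)
}

// Parses the header line the way `StdioMsgReader` does: split on a space,
// check the field name case-insensitively, then parse the byte count.
fn parse_len(header_line: &str) -> Option<usize> {
    let parts: Vec<&str> = header_line.split(' ').collect();
    if parts.len() != 2 || parts[0].to_lowercase() != "content-length:" {
        return None;
    }
    parts[1].trim().parse().ok()
}

fn main() {
    let framed = frame("{\"jsonrpc\":\"2.0\"}");
    // `lines()` strips the trailing "\r\n", leaving just the header field.
    let header = framed.lines().next().unwrap();
    assert_eq!(parse_len(header), Some(17));
    assert_eq!(parse_len("X-Custom: 5"), None);
}
```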

rls/src/test/mod.rs (new file, 567 lines)
@@ -0,0 +1,567 @@
// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
// Utilities and infrastructure for testing. Tests in this module test the
// testing infrastructure *not* the RLS.
mod types;
use std::sync::{Arc, Mutex};
use std::thread;
use std::time::{Duration, SystemTime};
use env_logger;
use analysis;
use build;
use server as ls_server;
use vfs;
use self::types::src;
use hyper::Url;
use serde_json;
use std::path::{Path, PathBuf};
const TEST_TIMEOUT_IN_SEC: u64 = 10;
#[test]
fn test_goto_def() {
let (mut cache, _tc) = init_env("goto_def");
let source_file_path = Path::new("src").join("main.rs");
let root_path = format!("{}", serde_json::to_string(&cache.abs_path(Path::new(".")))
.expect("couldn't convert path to JSON"));
let url = Url::from_file_path(cache.abs_path(&source_file_path)).expect("couldn't convert file path to URL");
let text_doc = format!("{{\"uri\":{}}}", serde_json::to_string(&url.as_str().to_owned())
.expect("couldn't convert path to JSON"));
let messages = vec![Message::new("initialize", vec![("processId", "0".to_owned()),
("capabilities", "null".to_owned()),
("rootPath", root_path)]),
Message::new("textDocument/definition",
vec![("textDocument", text_doc),
("position", cache.mk_ls_position(src(&source_file_path, 22, "world")))])];
let (server, results) = mock_server(messages);
// Initialise and build.
assert_eq!(ls_server::LsService::handle_message(server.clone()),
ls_server::ServerStateChange::Continue);
expect_messages(results.clone(), &[ExpectedMessage::new(Some(42)).expect_contains("capabilities"),
ExpectedMessage::new(None).expect_contains("diagnosticsBegin"),
ExpectedMessage::new(None).expect_contains("diagnosticsEnd")]);
assert_eq!(ls_server::LsService::handle_message(server.clone()),
ls_server::ServerStateChange::Continue);
// TODO structural checking of result, rather than looking for a string - src(&source_file_path, 12, "world")
expect_messages(results.clone(), &[ExpectedMessage::new(Some(42)).expect_contains("\"start\":{\"line\":20,\"character\":8}")]);
}
#[test]
fn test_hover() {
let (mut cache, _tc) = init_env("hover");
let source_file_path = Path::new("src").join("main.rs");
let root_path = format!("{}", serde_json::to_string(&cache.abs_path(Path::new(".")))
.expect("couldn't convert path to JSON"));
let url = Url::from_file_path(cache.abs_path(&source_file_path)).expect("couldn't convert file path to URL");
let text_doc = format!("{{\"uri\":{}}}", serde_json::to_string(&url.as_str().to_owned())
.expect("couldn't convert path to JSON"));
let messages = vec![Message::new("initialize", vec![("processId", "0".to_owned()),
("capabilities", "null".to_owned()),
("rootPath", root_path)]),
Message::new("textDocument/hover",
vec![("textDocument", text_doc),
("position", cache.mk_ls_position(src(&source_file_path, 22, "world")))])];
let (server, results) = mock_server(messages);
// Initialise and build.
assert_eq!(ls_server::LsService::handle_message(server.clone()),
ls_server::ServerStateChange::Continue);
expect_messages(results.clone(), &[ExpectedMessage::new(Some(42)).expect_contains("capabilities"),
ExpectedMessage::new(None).expect_contains("diagnosticsBegin"),
ExpectedMessage::new(None).expect_contains("diagnosticsEnd")]);
assert_eq!(ls_server::LsService::handle_message(server.clone()),
ls_server::ServerStateChange::Continue);
expect_messages(results.clone(), &[ExpectedMessage::new(Some(42)).expect_contains("[{\"language\":\"rust\",\"value\":\"&str\"}]")]);
}
#[test]
fn test_find_all_refs() {
let (mut cache, _tc) = init_env("find_all_refs");
let source_file_path = Path::new("src").join("main.rs");
let root_path = format!("{}", serde_json::to_string(&cache.abs_path(Path::new(".")))
.expect("couldn't convert path to JSON"));
let url = Url::from_file_path(cache.abs_path(&source_file_path)).expect("couldn't convert file path to URL");
let text_doc = format!("{{\"uri\":{}}}", serde_json::to_string(&url.as_str().to_owned())
.expect("couldn't convert path to JSON"));
let messages = vec![format!(r#"{{
"jsonrpc": "2.0",
"method": "initialize",
"id": 0,
"params": {{
"processId": "0",
"capabilities": null,
"rootPath": {}
}}
}}"#, root_path), format!(r#"{{
"jsonrpc": "2.0",
"method": "textDocument/references",
"id": 42,
"params": {{
"textDocument": {},
"position": {},
"context": {{
"includeDeclaration": true
}}
}}
}}"#, text_doc, cache.mk_ls_position(src(&source_file_path, 10, "Bar")))];
let (server, results) = mock_raw_server(messages);
// Initialise and build.
assert_eq!(ls_server::LsService::handle_message(server.clone()),
ls_server::ServerStateChange::Continue);
expect_messages(results.clone(), &[ExpectedMessage::new(Some(0)).expect_contains("capabilities"),
ExpectedMessage::new(None).expect_contains("diagnosticsBegin"),
ExpectedMessage::new(None).expect_contains("diagnosticsEnd")]);
assert_eq!(ls_server::LsService::handle_message(server.clone()),
ls_server::ServerStateChange::Continue);
expect_messages(results.clone(), &[ExpectedMessage::new(Some(42)).expect_contains(r#"{"start":{"line":9,"character":7},"end":{"line":9,"character":10}}"#)
.expect_contains(r#"{"start":{"line":15,"character":14},"end":{"line":15,"character":17}}"#)
.expect_contains(r#"{"start":{"line":23,"character":15},"end":{"line":23,"character":18}}"#)]);
}
#[test]
fn test_find_all_refs_no_cfg_test() {
let (mut cache, _tc) = init_env("find_all_refs_no_cfg_test");
let source_file_path = Path::new("src").join("main.rs");
let root_path = format!("{}", serde_json::to_string(&cache.abs_path(Path::new(".")))
.expect("couldn't convert path to JSON"));
let url = Url::from_file_path(cache.abs_path(&source_file_path)).expect("couldn't convert file path to URL");
let text_doc = format!("{{\"uri\":{}}}", serde_json::to_string(&url.as_str().to_owned())
.expect("couldn't convert path to JSON"));
let messages = vec![format!(r#"{{
"jsonrpc": "2.0",
"method": "initialize",
"id": 0,
"params": {{
"processId": "0",
"capabilities": null,
"rootPath": {}
}}
}}"#, root_path), format!(r#"{{
"jsonrpc": "2.0",
"method": "textDocument/references",
"id": 42,
"params": {{
"textDocument": {},
"position": {},
"context": {{
"includeDeclaration": true
}}
}}
}}"#, text_doc, cache.mk_ls_position(src(&source_file_path, 10, "Bar")))];
let (server, results) = mock_raw_server(messages);
// Initialise and build.
assert_eq!(ls_server::LsService::handle_message(server.clone()),
ls_server::ServerStateChange::Continue);
expect_messages(results.clone(), &[ExpectedMessage::new(Some(0)).expect_contains("capabilities"),
ExpectedMessage::new(None).expect_contains("diagnosticsBegin"),
ExpectedMessage::new(None).expect_contains("diagnosticsEnd")]);
assert_eq!(ls_server::LsService::handle_message(server.clone()),
ls_server::ServerStateChange::Continue);
expect_messages(results.clone(), &[ExpectedMessage::new(Some(42)).expect_contains(r#"{"start":{"line":9,"character":7},"end":{"line":9,"character":10}}"#)
.expect_contains(r#"{"start":{"line":23,"character":15},"end":{"line":23,"character":18}}"#)]);
}
#[test]
fn test_borrow_error() {
let (cache, _tc) = init_env("borrow_error");
let root_path = format!("{}", serde_json::to_string(&cache.abs_path(Path::new(".")))
.expect("couldn't convert path to JSON"));
let messages = vec![format!(r#"{{
"jsonrpc": "2.0",
"method": "initialize",
"id": 0,
"params": {{
"processId": "0",
"capabilities": null,
"rootPath": {}
}}
}}"#, root_path)];
let (server, results) = mock_raw_server(messages);
// Initialise and build.
assert_eq!(ls_server::LsService::handle_message(server.clone()),
ls_server::ServerStateChange::Continue);
expect_messages(results.clone(), &[ExpectedMessage::new(Some(0)).expect_contains("capabilities"),
ExpectedMessage::new(None).expect_contains("diagnosticsBegin"),
ExpectedMessage::new(None).expect_contains("\"secondaryRanges\":[{\"start\":{\"line\":2,\"character\":17},\"end\":{\"line\":2,\"character\":18},\"label\":\"first mutable borrow occurs here\"}"),
ExpectedMessage::new(None).expect_contains("diagnosticsEnd")]);
}
#[test]
fn test_highlight() {
let (mut cache, _tc) = init_env("highlight");
let source_file_path = Path::new("src").join("main.rs");
let root_path = format!("{}", serde_json::to_string(&cache.abs_path(Path::new(".")))
.expect("couldn't convert path to JSON"));
let url = Url::from_file_path(cache.abs_path(&source_file_path)).expect("couldn't convert file path to URL");
let text_doc = format!("{{\"uri\":{}}}", serde_json::to_string(&url.as_str().to_owned())
.expect("couldn't convert path to JSON"));
let messages = vec![format!(r#"{{
"jsonrpc": "2.0",
"method": "initialize",
"id": 0,
"params": {{
"processId": "0",
"capabilities": null,
"rootPath": {}
}}
}}"#, root_path), format!(r#"{{
"jsonrpc": "2.0",
"method": "textDocument/documentHighlight",
"id": 42,
"params": {{
"textDocument": {},
"position": {}
}}
}}"#, text_doc, cache.mk_ls_position(src(&source_file_path, 22, "world")))];
let (server, results) = mock_raw_server(messages);
// Initialise and build.
assert_eq!(ls_server::LsService::handle_message(server.clone()),
ls_server::ServerStateChange::Continue);
expect_messages(results.clone(), &[ExpectedMessage::new(Some(0)).expect_contains("capabilities"),
ExpectedMessage::new(None).expect_contains("diagnosticsBegin"),
ExpectedMessage::new(None).expect_contains("diagnosticsEnd")]);
assert_eq!(ls_server::LsService::handle_message(server.clone()),
ls_server::ServerStateChange::Continue);
expect_messages(results.clone(), &[ExpectedMessage::new(Some(42)).expect_contains(r#"{"start":{"line":20,"character":8},"end":{"line":20,"character":13}}"#)
.expect_contains(r#"{"start":{"line":21,"character":27},"end":{"line":21,"character":32}}"#),]);
}
#[test]
fn test_rename() {
let (mut cache, _tc) = init_env("rename");
let source_file_path = Path::new("src").join("main.rs");
let root_path = format!("{}", serde_json::to_string(&cache.abs_path(Path::new(".")))
.expect("couldn't convert path to JSON"));
let url = Url::from_file_path(cache.abs_path(&source_file_path)).expect("couldn't convert file path to URL");
let text_doc = format!("{{\"uri\":{}}}", serde_json::to_string(&url.as_str().to_owned())
.expect("couldn't convert path to JSON"));
let messages = vec![format!(r#"{{
"jsonrpc": "2.0",
"method": "initialize",
"id": 0,
"params": {{
"processId": "0",
"capabilities": null,
"rootPath": {}
}}
}}"#, root_path), format!(r#"{{
"jsonrpc": "2.0",
"method": "textDocument/rename",
"id": 42,
"params": {{
"textDocument": {},
"position": {},
"newName": "foo"
}}
}}"#, text_doc, cache.mk_ls_position(src(&source_file_path, 22, "world")))];
let (server, results) = mock_raw_server(messages);
// Initialise and build.
assert_eq!(ls_server::LsService::handle_message(server.clone()),
ls_server::ServerStateChange::Continue);
expect_messages(results.clone(), &[ExpectedMessage::new(Some(0)).expect_contains("capabilities"),
ExpectedMessage::new(None).expect_contains("diagnosticsBegin"),
ExpectedMessage::new(None).expect_contains("diagnosticsEnd")]);
assert_eq!(ls_server::LsService::handle_message(server.clone()),
ls_server::ServerStateChange::Continue);
expect_messages(results.clone(), &[ExpectedMessage::new(Some(42)).expect_contains(r#"{"start":{"line":20,"character":8},"end":{"line":20,"character":13}}"#)
.expect_contains(r#"{"start":{"line":21,"character":27},"end":{"line":21,"character":32}}"#)
.expect_contains(r#"{"changes""#),]);
}
#[test]
fn test_completion() {
let (mut cache, _tc) = init_env("completion");
let source_file_path = Path::new("src").join("main.rs");
let root_path = format!("{}", serde_json::to_string(&cache.abs_path(Path::new(".")))
.expect("couldn't convert path to JSON"));
let url = Url::from_file_path(cache.abs_path(&source_file_path)).expect("couldn't convert file path to URL");
let text_doc = format!("{{\"uri\":{}}}", serde_json::to_string(&url.as_str().to_owned())
.expect("couldn't convert path to JSON"));
let messages = vec![Message::new("initialize", vec![("processId", "0".to_owned()),
("capabilities", "null".to_owned()),
("rootPath", root_path)]),
Message::new("textDocument/completion",
vec![("textDocument", text_doc.to_owned()),
("position", cache.mk_ls_position(src(&source_file_path, 22, "rld")))]),
Message::new("textDocument/completion",
vec![("textDocument", text_doc.to_owned()),
("position", cache.mk_ls_position(src(&source_file_path, 25, "x)")))])];
let (server, results) = mock_server(messages);
// Initialise and build.
assert_eq!(ls_server::LsService::handle_message(server.clone()),
ls_server::ServerStateChange::Continue);
expect_messages(results.clone(), &[ExpectedMessage::new(Some(42)).expect_contains("capabilities"),
ExpectedMessage::new(None).expect_contains("diagnosticsBegin"),
ExpectedMessage::new(None).expect_contains("diagnosticsEnd")]);
assert_eq!(ls_server::LsService::handle_message(server.clone()),
ls_server::ServerStateChange::Continue);
expect_messages(results.clone(), &[ExpectedMessage::new(Some(42)).expect_contains("[{\"label\":\"world\",\"kind\":6,\"detail\":\"let world = \\\"world\\\";\"}]")]);
assert_eq!(ls_server::LsService::handle_message(server.clone()),
ls_server::ServerStateChange::Continue);
expect_messages(results.clone(), &[ExpectedMessage::new(Some(42)).expect_contains("[{\"label\":\"x\",\"kind\":5,\"detail\":\"u64\"}]")]);
}
#[test]
fn test_parse_error_on_malformed_input() {
let _ = env_logger::init();
struct NoneMsgReader;
impl ls_server::MessageReader for NoneMsgReader {
fn read_message(&self) -> Option<String> { None }
}
let analysis = Arc::new(analysis::AnalysisHost::new(analysis::Target::Debug));
let vfs = Arc::new(vfs::Vfs::new());
let build_queue = Arc::new(build::BuildQueue::new(vfs.clone()));
let reader = Box::new(NoneMsgReader);
let output = Box::new(RecordOutput::new());
let results = output.output.clone();
let server = ls_server::LsService::new(analysis, vfs, build_queue, reader, output);
assert_eq!(ls_server::LsService::handle_message(server.clone()),
ls_server::ServerStateChange::Break);
let error = results.lock().unwrap()
.pop().expect("no error response");
assert!(error.contains(r#""code": -32700"#))
}
// Initialise and run the internals of an LS protocol RLS server.
fn mock_server(messages: Vec<Message>) -> (Arc<ls_server::LsService>, LsResultList)
{
let analysis = Arc::new(analysis::AnalysisHost::new(analysis::Target::Debug));
let vfs = Arc::new(vfs::Vfs::new());
let build_queue = Arc::new(build::BuildQueue::new(vfs.clone()));
let reader = Box::new(MockMsgReader::new(messages));
let output = Box::new(RecordOutput::new());
let results = output.output.clone();
(ls_server::LsService::new(analysis, vfs, build_queue, reader, output), results)
}
// Initialise and run the internals of an LS protocol RLS server.
fn mock_raw_server(messages: Vec<String>) -> (Arc<ls_server::LsService>, LsResultList)
{
let analysis = Arc::new(analysis::AnalysisHost::new(analysis::Target::Debug));
let vfs = Arc::new(vfs::Vfs::new());
let build_queue = Arc::new(build::BuildQueue::new(vfs.clone()));
let reader = Box::new(MockRawMsgReader::new(messages));
let output = Box::new(RecordOutput::new());
let results = output.output.clone();
(ls_server::LsService::new(analysis, vfs, build_queue, reader, output), results)
}
struct MockMsgReader {
messages: Vec<Message>,
cur: Mutex<usize>,
}
impl MockMsgReader {
fn new(messages: Vec<Message>) -> MockMsgReader {
MockMsgReader {
messages: messages,
cur: Mutex::new(0),
}
}
}
struct MockRawMsgReader {
messages: Vec<String>,
cur: Mutex<usize>,
}
impl MockRawMsgReader {
fn new(messages: Vec<String>) -> MockRawMsgReader {
MockRawMsgReader {
messages: messages,
cur: Mutex::new(0),
}
}
}
// TODO should have a structural way of making params, rather than taking Strings
struct Message {
method: &'static str,
params: Vec<(&'static str, String)>,
}
impl Message {
fn new(method: &'static str, params: Vec<(&'static str, String)>) -> Message {
Message {
method: method,
params: params,
}
}
}
impl ls_server::MessageReader for MockMsgReader {
fn read_message(&self) -> Option<String> {
// Note that we hold this lock until the end of the function, thus meaning
// that we must finish processing one message before processing the next.
let mut cur = self.cur.lock().unwrap();
let index = *cur;
*cur += 1;
if index >= self.messages.len() {
return None;
}
let message = &self.messages[index];
let params = message.params.iter().map(|&(k, ref v)| format!("\"{}\":{}", k, v)).collect::<Vec<String>>().join(",");
// TODO don't hardcode the id, we should use fresh ids and use them to look up responses
let result = format!("{{\"method\":\"{}\",\"id\":42,\"params\":{{{}}}}}", message.method, params);
// println!("read_message: `{}`", result);
Some(result)
}
}
impl ls_server::MessageReader for MockRawMsgReader {
fn read_message(&self) -> Option<String> {
// Note that we hold this lock until the end of the function, thus meaning
// that we must finish processing one message before processing the next.
let mut cur = self.cur.lock().unwrap();
let index = *cur;
*cur += 1;
if index >= self.messages.len() {
return None;
}
let message = &self.messages[index];
Some(message.clone())
}
}
type LsResultList = Arc<Mutex<Vec<String>>>;
struct RecordOutput {
output: LsResultList,
}
impl RecordOutput {
fn new() -> RecordOutput {
RecordOutput {
output: Arc::new(Mutex::new(vec![])),
}
}
}
impl ls_server::Output for RecordOutput {
fn response(&self, output: String) {
let mut records = self.output.lock().unwrap();
records.push(output);
}
}
// Initialise the environment for a test.
fn init_env(project_dir: &str) -> (types::Cache, TestCleanup) {
let _ = env_logger::init();
let path = &Path::new("test_data").join(project_dir);
let tc = TestCleanup { path: path.to_owned() };
(types::Cache::new(path), tc)
}
#[derive(Clone, Debug)]
struct ExpectedMessage {
id: Option<u64>,
contains: Vec<String>,
}
impl ExpectedMessage {
fn new(id: Option<u64>) -> ExpectedMessage {
ExpectedMessage {
id: id,
contains: vec![],
}
}
fn expect_contains(&mut self, s: &str) -> &mut ExpectedMessage {
self.contains.push(s.to_owned());
self
}
}
fn expect_messages(results: LsResultList, expected: &[&ExpectedMessage]) {
let start_clock = SystemTime::now();
let mut results_count = results.lock().unwrap().len();
while (results_count != expected.len()) && (start_clock.elapsed().unwrap().as_secs() < TEST_TIMEOUT_IN_SEC) {
thread::sleep(Duration::from_millis(100));
results_count = results.lock().unwrap().len();
}
let mut results = results.lock().unwrap();
println!("expect_messages: results: {:?},\nexpected: {:?}", *results, expected);
assert_eq!(results.len(), expected.len());
for (found, expected) in results.iter().zip(expected.iter()) {
let values: serde_json::Value = serde_json::from_str(found).unwrap();
assert!(values.get("jsonrpc").expect("Missing jsonrpc field").as_str().unwrap() == "2.0", "Bad jsonrpc field");
if let Some(id) = expected.id {
assert_eq!(values.get("id").expect("Missing id field").as_u64().unwrap(), id, "Unexpected id");
}
for c in expected.contains.iter() {
found.find(c).expect(&format!("Could not find `{}` in `{}`", c, found));
}
}
*results = vec![];
}
struct TestCleanup {
path: PathBuf
}
impl Drop for TestCleanup {
fn drop(&mut self) {
use std::fs;
let target_path = self.path.join("target");
if fs::metadata(&target_path).is_ok() {
fs::remove_dir_all(target_path).expect("failed to tidy up");
}
}
}
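The mock reader above frames each queued `Message` as a JSON-RPC request with a hardcoded id of 42 (as the TODO in `read_message` notes). A minimal, self-contained sketch of that framing logic — the `frame` helper name is illustrative, not part of the RLS:

```rust
// Sketch of the JSON-RPC framing performed by MockMsgReader::read_message.
// The id is hardcoded to 42, matching the ExpectedMessage ids in the tests.
fn frame(method: &str, params: &[(&str, String)]) -> String {
    // Join the key/value pairs into a JSON object body.
    let params = params
        .iter()
        .map(|&(k, ref v)| format!("\"{}\":{}", k, v))
        .collect::<Vec<String>>()
        .join(",");
    format!("{{\"method\":\"{}\",\"id\":42,\"params\":{{{}}}}}", method, params)
}

fn main() {
    // Values are raw JSON fragments, so "null" embeds as a JSON null.
    let msg = frame("initialize", &[("capabilities", "null".to_owned())]);
    assert_eq!(
        msg,
        r#"{"method":"initialize","id":42,"params":{"capabilities":null}}"#
    );
    println!("{}", msg);
}
```

Because the parameter values are spliced in verbatim, callers must pass pre-serialised JSON (as the tests do via `serde_json::to_string`).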

rls/src/test/types.rs Normal file
@@ -0,0 +1,92 @@
// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
use std::collections::HashMap;
use std::env;
use std::fs::File;
use std::path::{Path, PathBuf};
use std::io::{BufRead, BufReader};
#[derive(Clone, Copy, Debug)]
pub struct Src<'a, 'b> {
pub file_name: &'a Path,
// 1 indexed
pub line: usize,
pub name: &'b str,
}
pub fn src<'a, 'b>(file_name: &'a Path, line: usize, name: &'b str) -> Src<'a, 'b> {
Src {
file_name: file_name,
line: line,
name: name,
}
}
pub struct Cache {
base_path: PathBuf,
files: HashMap<PathBuf, Vec<String>>,
}
impl Cache {
pub fn new(base_path: &Path) -> Cache {
let mut root_path = env::current_dir().expect("Could not find current working directory");
root_path.push(base_path);
Cache {
base_path: root_path,
files: HashMap::new(),
}
}
pub fn mk_ls_position(&mut self, src: Src) -> String {
let line = self.get_line(src);
let col = line.find(src.name).expect(&format!("Line does not contain name {}", src.name));
format!("{{\"line\":\"{}\",\"character\":\"{}\"}}", src.line - 1, char_of_byte_index(&line, col))
}
pub fn abs_path(&self, file_name: &Path) -> PathBuf {
let result = self.base_path.join(file_name).canonicalize().expect("Couldn't canonicalise path");
let result = if cfg!(windows) {
// FIXME: If the \\?\ prefix is not stripped from the canonical path, the HTTP server tests fail. Why?
let result_string = result.to_str().expect("Path contains non-utf8 characters.");
PathBuf::from(&result_string[r"\\?\".len()..])
} else {
result
};
result
}
fn get_line(&mut self, src: Src) -> String {
let base_path = &self.base_path;
let lines = self.files.entry(src.file_name.to_owned()).or_insert_with(|| {
let file_name = &base_path.join(src.file_name);
let file = File::open(file_name).expect(&format!("Couldn't find file: {:?}", file_name));
let lines = BufReader::new(file).lines();
lines.collect::<Result<Vec<_>, _>>().unwrap()
});
if src.line - 1 >= lines.len() {
panic!("Line {} not in file, found {} lines", src.line, lines.len());
}
lines[src.line - 1].to_owned()
}
}
fn char_of_byte_index(s: &str, byte: usize) -> usize {
for (c, (b, _)) in s.char_indices().enumerate() {
if b == byte {
return c;
}
}
panic!("Couldn't find byte {} in {:?}", byte, s);
}
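The `char_of_byte_index` helper converts a byte offset (as returned by `str::find`) into a character index, which the LS positions built by `mk_ls_position` need. A small self-contained check of that behaviour (illustrative only, not part of the test suite):

```rust
// Same conversion as char_of_byte_index in types.rs: map a byte offset
// into a character index by walking char_indices.
fn char_of_byte_index(s: &str, byte: usize) -> usize {
    for (c, (b, _)) in s.char_indices().enumerate() {
        if b == byte {
            return c;
        }
    }
    panic!("Couldn't find byte {} in {:?}", byte, s);
}

fn main() {
    // On a pure-ASCII line, byte and character indices coincide.
    let ascii = "let world = 1;";
    assert_eq!(char_of_byte_index(ascii, ascii.find("world").unwrap()), 4);

    // 'é' occupies two bytes in UTF-8, so the indices diverge after it.
    let non_ascii = "let héllo = 1;";
    assert_eq!(non_ascii.find("= 1").unwrap(), 11); // byte offset
    assert_eq!(char_of_byte_index(non_ascii, 11), 10); // character index
}
```

This is why the cache goes through the helper instead of using the byte offset from `find` directly.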

rls/test_data/borrow_error/Cargo.lock generated Normal file
@@ -0,0 +1,4 @@
[root]
name = "borrow_error"
version = "0.1.0"

@@ -0,0 +1,6 @@
[package]
name = "borrow_error"
version = "0.1.0"
authors = ["Jonathan Turner <jturner@mozilla.com>"]
[dependencies]

@@ -0,0 +1,5 @@
fn main() {
let mut x = 3;
let y = &mut x;
let z = &mut x;
}

rls/test_data/completion/Cargo.lock generated Normal file
@@ -0,0 +1,4 @@
[root]
name = "completion"
version = "0.1.0"

@@ -0,0 +1,6 @@
[package]
name = "completion"
version = "0.1.0"
authors = ["Nick Cameron <ncameron@mozilla.com>"]
[dependencies]

@@ -0,0 +1,26 @@
// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
struct Bar {
x: u64,
}
#[test]
pub fn test_fn() {
let bar = Bar { x: 4 };
println!("bar: {}", bar.x);
}
pub fn main() {
let world = "world";
println!("Hello, {}!", world);
let bar2 = Bar { x: 5 };
println!("bar2: {}", bar2.x);
}

rls/test_data/find_all_refs/Cargo.lock generated Normal file
@@ -0,0 +1,4 @@
[root]
name = "find_all_refs"
version = "0.1.0"

@@ -0,0 +1,6 @@
[package]
name = "find_all_refs"
version = "0.1.0"
authors = ["Nick Cameron <ncameron@mozilla.com>"]
[dependencies]

@@ -0,0 +1,26 @@
// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
struct Bar {
x: u64,
}
#[test]
pub fn test_fn() {
let bar = Bar { x: 4 };
println!("bar: {}", bar.x);
}
pub fn main() {
let world = "world";
println!("Hello, {}!", world);
let bar2 = Bar { x: 5 };
println!("bar2: {}", bar2.x);
}

@@ -0,0 +1,4 @@
[root]
name = "find_all_refs_no_cfg_test"
version = "0.1.0"

@@ -0,0 +1,6 @@
[package]
name = "find_all_refs_no_cfg_test"
version = "0.1.0"
authors = ["Nick Cameron <ncameron@mozilla.com>"]
[dependencies]

@@ -0,0 +1 @@
cfg_test = false

@@ -0,0 +1,26 @@
// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
struct Bar {
x: u64,
}
#[test]
pub fn test_fn() {
let bar = Bar { x: 4 };
println!("bar: {}", bar.x);
}
pub fn main() {
let world = "world";
println!("Hello, {}!", world);
let bar2 = Bar { x: 5 };
println!("bar2: {}", bar2.x);
}

rls/test_data/goto_def/Cargo.lock generated Normal file
@@ -0,0 +1,4 @@
[root]
name = "goto_def"
version = "0.1.0"

@@ -0,0 +1,6 @@
[package]
name = "goto_def"
version = "0.1.0"
authors = ["Nick Cameron <ncameron@mozilla.com>"]
[dependencies]

@@ -0,0 +1,26 @@
// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
struct Bar {
x: u64,
}
#[test]
pub fn test_fn() {
let bar = Bar { x: 4 };
println!("bar: {}", bar.x);
}
pub fn main() {
let world = "world";
println!("Hello, {}!", world);
let bar2 = Bar { x: 5 };
println!("bar2: {}", bar2.x);
}

rls/test_data/highlight/Cargo.lock generated Normal file
@@ -0,0 +1,4 @@
[root]
name = "highlight"
version = "0.1.0"

@@ -0,0 +1,6 @@
[package]
name = "highlight"
version = "0.1.0"
authors = ["Nick Cameron <ncameron@mozilla.com>"]
[dependencies]

@@ -0,0 +1,26 @@
// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
struct Bar {
x: u64,
}
#[test]
pub fn test_fn() {
let bar = Bar { x: 4 };
println!("bar: {}", bar.x);
}
pub fn main() {
let world = "world";
println!("Hello, {}!", world);
let bar2 = Bar { x: 5 };
println!("bar2: {}", bar2.x);
}

rls/test_data/hover/Cargo.lock generated Normal file
@@ -0,0 +1,4 @@
[root]
name = "hover"
version = "0.1.0"

@@ -0,0 +1,6 @@
[package]
name = "hover"
version = "0.1.0"
authors = ["Nick Cameron <ncameron@mozilla.com>"]
[dependencies]

@@ -0,0 +1,26 @@
// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
struct Bar {
x: u64,
}
#[test]
pub fn test_fn() {
let bar = Bar { x: 4 };
println!("bar: {}", bar.x);
}
pub fn main() {
let world = "world";
println!("Hello, {}!", world);
let bar2 = Bar { x: 5 };
println!("bar2: {}", bar2.x);
}

rls/test_data/rename/Cargo.lock generated Normal file
@@ -0,0 +1,4 @@
[root]
name = "rename"
version = "0.1.0"

@@ -0,0 +1,6 @@
[package]
name = "rename"
version = "0.1.0"
authors = ["Nick Cameron <ncameron@mozilla.com>"]
[dependencies]

@@ -0,0 +1 @@
unstable_features = true

@@ -0,0 +1,26 @@
// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
struct Bar {
x: u64,
}
#[test]
pub fn test_fn() {
let bar = Bar { x: 4 };
println!("bar: {}", bar.x);
}
pub fn main() {
let world = "world";
println!("Hello, {}!", world);
let bar2 = Bar { x: 5 };
println!("bar2: {}", bar2.x);
}

src/Cargo.lock generated
@@ -8,7 +8,7 @@ dependencies = [
 [[package]]
 name = "aho-corasick"
-version = "0.6.2"
+version = "0.6.3"
 source = "registry+https://github.com/rust-lang/crates.io-index"
 dependencies = [
  "memchr 1.0.1 (registry+https://github.com/rust-lang/crates.io-index)",
@@ -27,7 +27,7 @@ version = "0.0.0"
 dependencies = [
  "build_helper 0.1.0",
  "core 0.0.0",
- "gcc 0.3.43 (registry+https://github.com/rust-lang/crates.io-index)",
+ "gcc 0.3.45 (registry+https://github.com/rust-lang/crates.io-index)",
  "libc 0.0.0",
 ]
@@ -48,6 +48,16 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
 name = "arena"
 version = "0.0.0"
+[[package]]
+name = "atty"
+version = "0.2.2"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+dependencies = [
+ "kernel32-sys 0.2.2 (registry+https://github.com/rust-lang/crates.io-index)",
+ "libc 0.2.21 (registry+https://github.com/rust-lang/crates.io-index)",
+ "winapi 0.2.8 (registry+https://github.com/rust-lang/crates.io-index)",
+]
 [[package]]
 name = "bitflags"
 version = "0.5.0"
@@ -55,7 +65,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
 [[package]]
 name = "bitflags"
-version = "0.7.0"
+version = "0.8.2"
 source = "registry+https://github.com/rust-lang/crates.io-index"
 [[package]]
@@ -63,13 +73,13 @@ name = "bootstrap"
 version = "0.0.0"
 dependencies = [
  "build_helper 0.1.0",
- "cmake 0.1.21 (registry+https://github.com/rust-lang/crates.io-index)",
+ "cmake 0.1.22 (registry+https://github.com/rust-lang/crates.io-index)",
  "filetime 0.1.10 (registry+https://github.com/rust-lang/crates.io-index)",
- "gcc 0.3.43 (registry+https://github.com/rust-lang/crates.io-index)",
+ "gcc 0.3.45 (registry+https://github.com/rust-lang/crates.io-index)",
  "getopts 0.2.14 (registry+https://github.com/rust-lang/crates.io-index)",
  "libc 0.2.21 (registry+https://github.com/rust-lang/crates.io-index)",
  "num_cpus 0.2.13 (registry+https://github.com/rust-lang/crates.io-index)",
- "rustc-serialize 0.3.22 (registry+https://github.com/rust-lang/crates.io-index)",
+ "rustc-serialize 0.3.23 (registry+https://github.com/rust-lang/crates.io-index)",
  "toml 0.1.30 (registry+https://github.com/rust-lang/crates.io-index)",
 ]
@@ -77,7 +87,7 @@ dependencies = [
 name = "build-manifest"
 version = "0.1.0"
 dependencies = [
- "rustc-serialize 0.3.22 (registry+https://github.com/rust-lang/crates.io-index)",
+ "rustc-serialize 0.3.23 (registry+https://github.com/rust-lang/crates.io-index)",
  "toml 0.1.30 (registry+https://github.com/rust-lang/crates.io-index)",
 ]
@@ -94,25 +104,25 @@ version = "0.1.0"
 [[package]]
 name = "clap"
-version = "2.20.5"
+version = "2.22.1"
 source = "registry+https://github.com/rust-lang/crates.io-index"
 dependencies = [
  "ansi_term 0.9.0 (registry+https://github.com/rust-lang/crates.io-index)",
- "bitflags 0.7.0 (registry+https://github.com/rust-lang/crates.io-index)",
- "libc 0.2.21 (registry+https://github.com/rust-lang/crates.io-index)",
+ "atty 0.2.2 (registry+https://github.com/rust-lang/crates.io-index)",
+ "bitflags 0.8.2 (registry+https://github.com/rust-lang/crates.io-index)",
  "strsim 0.6.0 (registry+https://github.com/rust-lang/crates.io-index)",
  "term_size 0.2.3 (registry+https://github.com/rust-lang/crates.io-index)",
  "unicode-segmentation 1.1.0 (registry+https://github.com/rust-lang/crates.io-index)",
  "unicode-width 0.1.4 (registry+https://github.com/rust-lang/crates.io-index)",
- "vec_map 0.6.0 (registry+https://github.com/rust-lang/crates.io-index)",
+ "vec_map 0.7.0 (registry+https://github.com/rust-lang/crates.io-index)",
 ]
 [[package]]
 name = "cmake"
-version = "0.1.21"
+version = "0.1.22"
 source = "registry+https://github.com/rust-lang/crates.io-index"
 dependencies = [
- "gcc 0.3.43 (registry+https://github.com/rust-lang/crates.io-index)",
+ "gcc 0.3.45 (registry+https://github.com/rust-lang/crates.io-index)",
 ]
 [[package]]
@@ -130,17 +140,17 @@ version = "0.0.0"
 dependencies = [
  "build_helper 0.1.0",
  "core 0.0.0",
- "gcc 0.3.43 (registry+https://github.com/rust-lang/crates.io-index)",
+ "gcc 0.3.45 (registry+https://github.com/rust-lang/crates.io-index)",
 ]
 [[package]]
 name = "compiletest"
 version = "0.0.0"
 dependencies = [
- "env_logger 0.3.5 (registry+https://github.com/rust-lang/crates.io-index)",
+ "env_logger 0.4.2 (registry+https://github.com/rust-lang/crates.io-index)",
  "filetime 0.1.10 (registry+https://github.com/rust-lang/crates.io-index)",
  "log 0.3.7 (registry+https://github.com/rust-lang/crates.io-index)",
- "rustc-serialize 0.3.22 (registry+https://github.com/rust-lang/crates.io-index)",
+ "rustc-serialize 0.3.23 (registry+https://github.com/rust-lang/crates.io-index)",
 ]
@@ -152,14 +162,6 @@ name = "dtoa"
 version = "0.4.1"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-[[package]]
-name = "env_logger"
-version = "0.3.5"
-source = "registry+https://github.com/rust-lang/crates.io-index"
-dependencies = [
- "log 0.3.7 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]] [[package]]
name = "env_logger" name = "env_logger"
version = "0.4.2" version = "0.4.2"
@ -186,7 +188,7 @@ name = "flate"
version = "0.0.0" version = "0.0.0"
dependencies = [ dependencies = [
"build_helper 0.1.0", "build_helper 0.1.0",
"gcc 0.3.43 (registry+https://github.com/rust-lang/crates.io-index)", "gcc 0.3.45 (registry+https://github.com/rust-lang/crates.io-index)",
] ]
[[package]] [[package]]
@ -195,7 +197,7 @@ version = "0.0.0"
[[package]] [[package]]
name = "gcc" name = "gcc"
version = "0.3.43" version = "0.3.45"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]] [[package]]
@ -213,15 +215,15 @@ version = "0.0.0"
[[package]] [[package]]
name = "handlebars" name = "handlebars"
version = "0.25.1" version = "0.25.2"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [ dependencies = [
"lazy_static 0.2.4 (registry+https://github.com/rust-lang/crates.io-index)", "lazy_static 0.2.5 (registry+https://github.com/rust-lang/crates.io-index)",
"log 0.3.7 (registry+https://github.com/rust-lang/crates.io-index)", "log 0.3.7 (registry+https://github.com/rust-lang/crates.io-index)",
"pest 0.3.3 (registry+https://github.com/rust-lang/crates.io-index)", "pest 0.3.3 (registry+https://github.com/rust-lang/crates.io-index)",
"quick-error 1.1.0 (registry+https://github.com/rust-lang/crates.io-index)", "quick-error 1.1.0 (registry+https://github.com/rust-lang/crates.io-index)",
"regex 0.2.1 (registry+https://github.com/rust-lang/crates.io-index)", "regex 0.2.1 (registry+https://github.com/rust-lang/crates.io-index)",
"rustc-serialize 0.3.22 (registry+https://github.com/rust-lang/crates.io-index)", "rustc-serialize 0.3.23 (registry+https://github.com/rust-lang/crates.io-index)",
"serde_json 0.9.9 (registry+https://github.com/rust-lang/crates.io-index)", "serde_json 0.9.9 (registry+https://github.com/rust-lang/crates.io-index)",
] ]
@ -241,7 +243,7 @@ dependencies = [
[[package]] [[package]]
name = "lazy_static" name = "lazy_static"
version = "0.2.4" version = "0.2.5"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]] [[package]]
@ -260,10 +262,6 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
name = "linkchecker" name = "linkchecker"
version = "0.1.0" version = "0.1.0"
[[package]]
name = "log"
version = "0.0.0"
[[package]] [[package]]
name = "log" name = "log"
version = "0.3.7" version = "0.3.7"
@ -271,12 +269,12 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]] [[package]]
name = "mdbook" name = "mdbook"
version = "0.0.18" version = "0.0.19"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [ dependencies = [
"clap 2.20.5 (registry+https://github.com/rust-lang/crates.io-index)", "clap 2.22.1 (registry+https://github.com/rust-lang/crates.io-index)",
"env_logger 0.4.2 (registry+https://github.com/rust-lang/crates.io-index)", "env_logger 0.4.2 (registry+https://github.com/rust-lang/crates.io-index)",
"handlebars 0.25.1 (registry+https://github.com/rust-lang/crates.io-index)", "handlebars 0.25.2 (registry+https://github.com/rust-lang/crates.io-index)",
"log 0.3.7 (registry+https://github.com/rust-lang/crates.io-index)", "log 0.3.7 (registry+https://github.com/rust-lang/crates.io-index)",
"open 1.2.0 (registry+https://github.com/rust-lang/crates.io-index)", "open 1.2.0 (registry+https://github.com/rust-lang/crates.io-index)",
"pulldown-cmark 0.0.8 (registry+https://github.com/rust-lang/crates.io-index)", "pulldown-cmark 0.0.8 (registry+https://github.com/rust-lang/crates.io-index)",
@ -360,6 +358,14 @@ dependencies = [
"getopts 0.2.14 (registry+https://github.com/rust-lang/crates.io-index)", "getopts 0.2.14 (registry+https://github.com/rust-lang/crates.io-index)",
] ]
[[package]]
name = "pulldown-cmark"
version = "0.0.14"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"bitflags 0.8.2 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]] [[package]]
name = "qemu-test-client" name = "qemu-test-client"
version = "0.1.0" version = "0.1.0"
@ -385,7 +391,7 @@ name = "regex"
version = "0.2.1" version = "0.2.1"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [ dependencies = [
"aho-corasick 0.6.2 (registry+https://github.com/rust-lang/crates.io-index)", "aho-corasick 0.6.3 (registry+https://github.com/rust-lang/crates.io-index)",
"memchr 1.0.1 (registry+https://github.com/rust-lang/crates.io-index)", "memchr 1.0.1 (registry+https://github.com/rust-lang/crates.io-index)",
"regex-syntax 0.4.0 (registry+https://github.com/rust-lang/crates.io-index)", "regex-syntax 0.4.0 (registry+https://github.com/rust-lang/crates.io-index)",
"thread_local 0.3.3 (registry+https://github.com/rust-lang/crates.io-index)", "thread_local 0.3.3 (registry+https://github.com/rust-lang/crates.io-index)",
@ -397,12 +403,29 @@ name = "regex-syntax"
version = "0.4.0" version = "0.4.0"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "rls-data"
version = "0.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"rls-span 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)",
"rustc-serialize 0.3.23 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]]
name = "rls-span"
version = "0.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"rustc-serialize 0.3.23 (registry+https://github.com/rust-lang/crates.io-index)",
]
[[package]] [[package]]
name = "rustbook" name = "rustbook"
version = "0.1.0" version = "0.1.0"
dependencies = [ dependencies = [
"clap 2.20.5 (registry+https://github.com/rust-lang/crates.io-index)", "clap 2.22.1 (registry+https://github.com/rust-lang/crates.io-index)",
"mdbook 0.0.18 (registry+https://github.com/rust-lang/crates.io-index)", "mdbook 0.0.19 (registry+https://github.com/rust-lang/crates.io-index)",
] ]
[[package]] [[package]]
@ -412,7 +435,7 @@ dependencies = [
"arena 0.0.0", "arena 0.0.0",
"fmt_macros 0.0.0", "fmt_macros 0.0.0",
"graphviz 0.0.0", "graphviz 0.0.0",
"log 0.0.0", "log 0.3.7 (registry+https://github.com/rust-lang/crates.io-index)",
"rustc_back 0.0.0", "rustc_back 0.0.0",
"rustc_bitflags 0.0.0", "rustc_bitflags 0.0.0",
"rustc_const_math 0.0.0", "rustc_const_math 0.0.0",
@ -435,7 +458,7 @@ dependencies = [
[[package]] [[package]]
name = "rustc-serialize" name = "rustc-serialize"
version = "0.3.22" version = "0.3.23"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]] [[package]]
@ -444,7 +467,7 @@ version = "0.0.0"
dependencies = [ dependencies = [
"alloc_system 0.0.0", "alloc_system 0.0.0",
"build_helper 0.1.0", "build_helper 0.1.0",
"cmake 0.1.21 (registry+https://github.com/rust-lang/crates.io-index)", "cmake 0.1.22 (registry+https://github.com/rust-lang/crates.io-index)",
"core 0.0.0", "core 0.0.0",
] ]
@ -452,7 +475,7 @@ dependencies = [
name = "rustc_back" name = "rustc_back"
version = "0.0.0" version = "0.0.0"
dependencies = [ dependencies = [
"log 0.0.0", "log 0.3.7 (registry+https://github.com/rust-lang/crates.io-index)",
"serialize 0.0.0", "serialize 0.0.0",
"syntax 0.0.0", "syntax 0.0.0",
] ]
@ -466,7 +489,7 @@ name = "rustc_borrowck"
version = "0.0.0" version = "0.0.0"
dependencies = [ dependencies = [
"graphviz 0.0.0", "graphviz 0.0.0",
"log 0.0.0", "log 0.3.7 (registry+https://github.com/rust-lang/crates.io-index)",
"rustc 0.0.0", "rustc 0.0.0",
"rustc_data_structures 0.0.0", "rustc_data_structures 0.0.0",
"rustc_errors 0.0.0", "rustc_errors 0.0.0",
@ -480,8 +503,7 @@ name = "rustc_const_eval"
version = "0.0.0" version = "0.0.0"
dependencies = [ dependencies = [
"arena 0.0.0", "arena 0.0.0",
"graphviz 0.0.0", "log 0.3.7 (registry+https://github.com/rust-lang/crates.io-index)",
"log 0.0.0",
"rustc 0.0.0", "rustc 0.0.0",
"rustc_back 0.0.0", "rustc_back 0.0.0",
"rustc_const_math 0.0.0", "rustc_const_math 0.0.0",
@ -503,7 +525,7 @@ dependencies = [
name = "rustc_data_structures" name = "rustc_data_structures"
version = "0.0.0" version = "0.0.0"
dependencies = [ dependencies = [
"log 0.0.0", "log 0.3.7 (registry+https://github.com/rust-lang/crates.io-index)",
"serialize 0.0.0", "serialize 0.0.0",
] ]
@ -512,8 +534,9 @@ name = "rustc_driver"
version = "0.0.0" version = "0.0.0"
dependencies = [ dependencies = [
"arena 0.0.0", "arena 0.0.0",
"env_logger 0.4.2 (registry+https://github.com/rust-lang/crates.io-index)",
"graphviz 0.0.0", "graphviz 0.0.0",
"log 0.0.0", "log 0.3.7 (registry+https://github.com/rust-lang/crates.io-index)",
"proc_macro_plugin 0.0.0", "proc_macro_plugin 0.0.0",
"rustc 0.0.0", "rustc 0.0.0",
"rustc_back 0.0.0", "rustc_back 0.0.0",
@ -552,7 +575,7 @@ name = "rustc_incremental"
version = "0.0.0" version = "0.0.0"
dependencies = [ dependencies = [
"graphviz 0.0.0", "graphviz 0.0.0",
"log 0.0.0", "log 0.3.7 (registry+https://github.com/rust-lang/crates.io-index)",
"rustc 0.0.0", "rustc 0.0.0",
"rustc_data_structures 0.0.0", "rustc_data_structures 0.0.0",
"serialize 0.0.0", "serialize 0.0.0",
@ -564,7 +587,7 @@ dependencies = [
name = "rustc_lint" name = "rustc_lint"
version = "0.0.0" version = "0.0.0"
dependencies = [ dependencies = [
"log 0.0.0", "log 0.3.7 (registry+https://github.com/rust-lang/crates.io-index)",
"rustc 0.0.0", "rustc 0.0.0",
"rustc_back 0.0.0", "rustc_back 0.0.0",
"rustc_const_eval 0.0.0", "rustc_const_eval 0.0.0",
@ -577,7 +600,7 @@ name = "rustc_llvm"
version = "0.0.0" version = "0.0.0"
dependencies = [ dependencies = [
"build_helper 0.1.0", "build_helper 0.1.0",
"gcc 0.3.43 (registry+https://github.com/rust-lang/crates.io-index)", "gcc 0.3.45 (registry+https://github.com/rust-lang/crates.io-index)",
"rustc_bitflags 0.0.0", "rustc_bitflags 0.0.0",
] ]
@ -587,7 +610,7 @@ version = "0.0.0"
dependencies = [ dependencies = [
"alloc_system 0.0.0", "alloc_system 0.0.0",
"build_helper 0.1.0", "build_helper 0.1.0",
"cmake 0.1.21 (registry+https://github.com/rust-lang/crates.io-index)", "cmake 0.1.22 (registry+https://github.com/rust-lang/crates.io-index)",
"core 0.0.0", "core 0.0.0",
] ]
@ -596,7 +619,7 @@ name = "rustc_metadata"
version = "0.0.0" version = "0.0.0"
dependencies = [ dependencies = [
"flate 0.0.0", "flate 0.0.0",
"log 0.0.0", "log 0.3.7 (registry+https://github.com/rust-lang/crates.io-index)",
"proc_macro 0.0.0", "proc_macro 0.0.0",
"rustc 0.0.0", "rustc 0.0.0",
"rustc_back 0.0.0", "rustc_back 0.0.0",
@ -615,7 +638,7 @@ name = "rustc_mir"
version = "0.0.0" version = "0.0.0"
dependencies = [ dependencies = [
"graphviz 0.0.0", "graphviz 0.0.0",
"log 0.0.0", "log 0.3.7 (registry+https://github.com/rust-lang/crates.io-index)",
"rustc 0.0.0", "rustc 0.0.0",
"rustc_bitflags 0.0.0", "rustc_bitflags 0.0.0",
"rustc_const_eval 0.0.0", "rustc_const_eval 0.0.0",
@ -631,7 +654,7 @@ version = "0.0.0"
dependencies = [ dependencies = [
"alloc_system 0.0.0", "alloc_system 0.0.0",
"build_helper 0.1.0", "build_helper 0.1.0",
"cmake 0.1.21 (registry+https://github.com/rust-lang/crates.io-index)", "cmake 0.1.22 (registry+https://github.com/rust-lang/crates.io-index)",
"core 0.0.0", "core 0.0.0",
] ]
@ -639,7 +662,7 @@ dependencies = [
name = "rustc_passes" name = "rustc_passes"
version = "0.0.0" version = "0.0.0"
dependencies = [ dependencies = [
"log 0.0.0", "log 0.3.7 (registry+https://github.com/rust-lang/crates.io-index)",
"rustc 0.0.0", "rustc 0.0.0",
"rustc_const_eval 0.0.0", "rustc_const_eval 0.0.0",
"rustc_const_math 0.0.0", "rustc_const_math 0.0.0",
@ -678,7 +701,7 @@ name = "rustc_resolve"
version = "0.0.0" version = "0.0.0"
dependencies = [ dependencies = [
"arena 0.0.0", "arena 0.0.0",
"log 0.0.0", "log 0.3.7 (registry+https://github.com/rust-lang/crates.io-index)",
"rustc 0.0.0", "rustc 0.0.0",
"rustc_errors 0.0.0", "rustc_errors 0.0.0",
"syntax 0.0.0", "syntax 0.0.0",
@ -689,9 +712,11 @@ dependencies = [
name = "rustc_save_analysis" name = "rustc_save_analysis"
version = "0.0.0" version = "0.0.0"
dependencies = [ dependencies = [
"log 0.0.0", "log 0.3.7 (registry+https://github.com/rust-lang/crates.io-index)",
"rls-data 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)",
"rls-span 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)",
"rustc 0.0.0", "rustc 0.0.0",
"serialize 0.0.0", "rustc-serialize 0.3.23 (registry+https://github.com/rust-lang/crates.io-index)",
"syntax 0.0.0", "syntax 0.0.0",
"syntax_pos 0.0.0", "syntax_pos 0.0.0",
] ]
@ -701,11 +726,10 @@ name = "rustc_trans"
version = "0.0.0" version = "0.0.0"
dependencies = [ dependencies = [
"flate 0.0.0", "flate 0.0.0",
"log 0.0.0", "log 0.3.7 (registry+https://github.com/rust-lang/crates.io-index)",
"rustc 0.0.0", "rustc 0.0.0",
"rustc_back 0.0.0", "rustc_back 0.0.0",
"rustc_bitflags 0.0.0", "rustc_bitflags 0.0.0",
"rustc_const_eval 0.0.0",
"rustc_const_math 0.0.0", "rustc_const_math 0.0.0",
"rustc_data_structures 0.0.0", "rustc_data_structures 0.0.0",
"rustc_errors 0.0.0", "rustc_errors 0.0.0",
@ -723,7 +747,7 @@ version = "0.0.0"
dependencies = [ dependencies = [
"alloc_system 0.0.0", "alloc_system 0.0.0",
"build_helper 0.1.0", "build_helper 0.1.0",
"cmake 0.1.21 (registry+https://github.com/rust-lang/crates.io-index)", "cmake 0.1.22 (registry+https://github.com/rust-lang/crates.io-index)",
"core 0.0.0", "core 0.0.0",
] ]
@ -733,10 +757,9 @@ version = "0.0.0"
dependencies = [ dependencies = [
"arena 0.0.0", "arena 0.0.0",
"fmt_macros 0.0.0", "fmt_macros 0.0.0",
"log 0.0.0", "log 0.3.7 (registry+https://github.com/rust-lang/crates.io-index)",
"rustc 0.0.0", "rustc 0.0.0",
"rustc_back 0.0.0", "rustc_back 0.0.0",
"rustc_const_eval 0.0.0",
"rustc_const_math 0.0.0", "rustc_const_math 0.0.0",
"rustc_data_structures 0.0.0", "rustc_data_structures 0.0.0",
"rustc_errors 0.0.0", "rustc_errors 0.0.0",
@ -751,11 +774,12 @@ version = "0.0.0"
dependencies = [ dependencies = [
"arena 0.0.0", "arena 0.0.0",
"build_helper 0.1.0", "build_helper 0.1.0",
"gcc 0.3.43 (registry+https://github.com/rust-lang/crates.io-index)", "env_logger 0.4.2 (registry+https://github.com/rust-lang/crates.io-index)",
"log 0.0.0", "gcc 0.3.45 (registry+https://github.com/rust-lang/crates.io-index)",
"log 0.3.7 (registry+https://github.com/rust-lang/crates.io-index)",
"pulldown-cmark 0.0.14 (registry+https://github.com/rust-lang/crates.io-index)",
"rustc 0.0.0", "rustc 0.0.0",
"rustc_back 0.0.0", "rustc_back 0.0.0",
"rustc_const_eval 0.0.0",
"rustc_data_structures 0.0.0", "rustc_data_structures 0.0.0",
"rustc_driver 0.0.0", "rustc_driver 0.0.0",
"rustc_errors 0.0.0", "rustc_errors 0.0.0",
@ -799,7 +823,7 @@ dependencies = [
"collections 0.0.0", "collections 0.0.0",
"compiler_builtins 0.0.0", "compiler_builtins 0.0.0",
"core 0.0.0", "core 0.0.0",
"gcc 0.3.43 (registry+https://github.com/rust-lang/crates.io-index)", "gcc 0.3.45 (registry+https://github.com/rust-lang/crates.io-index)",
"libc 0.0.0", "libc 0.0.0",
"panic_abort 0.0.0", "panic_abort 0.0.0",
"panic_unwind 0.0.0", "panic_unwind 0.0.0",
@ -828,7 +852,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
name = "syntax" name = "syntax"
version = "0.0.0" version = "0.0.0"
dependencies = [ dependencies = [
"log 0.0.0", "log 0.3.7 (registry+https://github.com/rust-lang/crates.io-index)",
"rustc_bitflags 0.0.0", "rustc_bitflags 0.0.0",
"rustc_data_structures 0.0.0", "rustc_data_structures 0.0.0",
"rustc_errors 0.0.0", "rustc_errors 0.0.0",
@ -841,7 +865,7 @@ name = "syntax_ext"
version = "0.0.0" version = "0.0.0"
dependencies = [ dependencies = [
"fmt_macros 0.0.0", "fmt_macros 0.0.0",
"log 0.0.0", "log 0.3.7 (registry+https://github.com/rust-lang/crates.io-index)",
"proc_macro 0.0.0", "proc_macro 0.0.0",
"rustc_errors 0.0.0", "rustc_errors 0.0.0",
"syntax 0.0.0", "syntax 0.0.0",
@ -904,7 +928,7 @@ name = "toml"
version = "0.1.30" version = "0.1.30"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [ dependencies = [
"rustc-serialize 0.3.22 (registry+https://github.com/rust-lang/crates.io-index)", "rustc-serialize 0.3.23 (registry+https://github.com/rust-lang/crates.io-index)",
] ]
[[package]] [[package]]
@ -940,7 +964,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]] [[package]]
name = "vec_map" name = "vec_map"
version = "0.6.0" version = "0.7.0"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]] [[package]]
@ -959,35 +983,38 @@ version = "0.1.1"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
[metadata] [metadata]
"checksum aho-corasick 0.6.2 (registry+https://github.com/rust-lang/crates.io-index)" = "0638fd549427caa90c499814196d1b9e3725eb4d15d7339d6de073a680ed0ca2" "checksum aho-corasick 0.6.3 (registry+https://github.com/rust-lang/crates.io-index)" = "500909c4f87a9e52355b26626d890833e9e1d53ac566db76c36faa984b889699"
"checksum ansi_term 0.9.0 (registry+https://github.com/rust-lang/crates.io-index)" = "23ac7c30002a5accbf7e8987d0632fa6de155b7c3d39d0067317a391e00a2ef6" "checksum ansi_term 0.9.0 (registry+https://github.com/rust-lang/crates.io-index)" = "23ac7c30002a5accbf7e8987d0632fa6de155b7c3d39d0067317a391e00a2ef6"
"checksum atty 0.2.2 (registry+https://github.com/rust-lang/crates.io-index)" = "d912da0db7fa85514874458ca3651fe2cddace8d0b0505571dbdcd41ab490159"
"checksum bitflags 0.5.0 (registry+https://github.com/rust-lang/crates.io-index)" = "4f67931368edf3a9a51d29886d245f1c3db2f1ef0dcc9e35ff70341b78c10d23" "checksum bitflags 0.5.0 (registry+https://github.com/rust-lang/crates.io-index)" = "4f67931368edf3a9a51d29886d245f1c3db2f1ef0dcc9e35ff70341b78c10d23"
"checksum bitflags 0.7.0 (registry+https://github.com/rust-lang/crates.io-index)" = "aad18937a628ec6abcd26d1489012cc0e18c21798210f491af69ded9b881106d" "checksum bitflags 0.8.2 (registry+https://github.com/rust-lang/crates.io-index)" = "1370e9fc2a6ae53aea8b7a5110edbd08836ed87c88736dfabccade1c2b44bff4"
"checksum clap 2.20.5 (registry+https://github.com/rust-lang/crates.io-index)" = "7db281b0520e97fbd15cd615dcd8f8bcad0c26f5f7d5effe705f090f39e9a758" "checksum clap 2.22.1 (registry+https://github.com/rust-lang/crates.io-index)" = "e17a4a72ffea176f77d6e2db609c6c919ef221f23862c9915e687fb54d833485"
"checksum cmake 0.1.21 (registry+https://github.com/rust-lang/crates.io-index)" = "e1acc68a3f714627af38f9f5d09706a28584ba60dfe2cca68f40bf779f941b25" "checksum cmake 0.1.22 (registry+https://github.com/rust-lang/crates.io-index)" = "d18d68987ed4c516dcc3e7913659bfa4076f5182eea4a7e0038bb060953e76ac"
"checksum dtoa 0.4.1 (registry+https://github.com/rust-lang/crates.io-index)" = "80c8b71fd71146990a9742fc06dcbbde19161a267e0ad4e572c35162f4578c90" "checksum dtoa 0.4.1 (registry+https://github.com/rust-lang/crates.io-index)" = "80c8b71fd71146990a9742fc06dcbbde19161a267e0ad4e572c35162f4578c90"
"checksum env_logger 0.3.5 (registry+https://github.com/rust-lang/crates.io-index)" = "15abd780e45b3ea4f76b4e9a26ff4843258dd8a3eed2775a0e7368c2e7936c2f"
"checksum env_logger 0.4.2 (registry+https://github.com/rust-lang/crates.io-index)" = "e3856f1697098606fc6cb97a93de88ca3f3bc35bb878c725920e6e82ecf05e83" "checksum env_logger 0.4.2 (registry+https://github.com/rust-lang/crates.io-index)" = "e3856f1697098606fc6cb97a93de88ca3f3bc35bb878c725920e6e82ecf05e83"
"checksum filetime 0.1.10 (registry+https://github.com/rust-lang/crates.io-index)" = "5363ab8e4139b8568a6237db5248646e5a8a2f89bd5ccb02092182b11fd3e922" "checksum filetime 0.1.10 (registry+https://github.com/rust-lang/crates.io-index)" = "5363ab8e4139b8568a6237db5248646e5a8a2f89bd5ccb02092182b11fd3e922"
"checksum gcc 0.3.43 (registry+https://github.com/rust-lang/crates.io-index)" = "c07c758b972368e703a562686adb39125707cc1ef3399da8c019fc6c2498a75d" "checksum gcc 0.3.45 (registry+https://github.com/rust-lang/crates.io-index)" = "40899336fb50db0c78710f53e87afc54d8c7266fb76262fecc78ca1a7f09deae"
"checksum getopts 0.2.14 (registry+https://github.com/rust-lang/crates.io-index)" = "d9047cfbd08a437050b363d35ef160452c5fe8ea5187ae0a624708c91581d685" "checksum getopts 0.2.14 (registry+https://github.com/rust-lang/crates.io-index)" = "d9047cfbd08a437050b363d35ef160452c5fe8ea5187ae0a624708c91581d685"
"checksum handlebars 0.25.1 (registry+https://github.com/rust-lang/crates.io-index)" = "b2249f6f0dc5a3bb2b3b1a8f797dfccbc4b053344d773d654ad565e51427d335" "checksum handlebars 0.25.2 (registry+https://github.com/rust-lang/crates.io-index)" = "663e1728d8037fb0d4e13bcd1b1909fb5d913690a9929eb385922df157c2ff8f"
"checksum itoa 0.3.1 (registry+https://github.com/rust-lang/crates.io-index)" = "eb2f404fbc66fd9aac13e998248505e7ecb2ad8e44ab6388684c5fb11c6c251c" "checksum itoa 0.3.1 (registry+https://github.com/rust-lang/crates.io-index)" = "eb2f404fbc66fd9aac13e998248505e7ecb2ad8e44ab6388684c5fb11c6c251c"
"checksum kernel32-sys 0.2.2 (registry+https://github.com/rust-lang/crates.io-index)" = "7507624b29483431c0ba2d82aece8ca6cdba9382bff4ddd0f7490560c056098d" "checksum kernel32-sys 0.2.2 (registry+https://github.com/rust-lang/crates.io-index)" = "7507624b29483431c0ba2d82aece8ca6cdba9382bff4ddd0f7490560c056098d"
"checksum lazy_static 0.2.4 (registry+https://github.com/rust-lang/crates.io-index)" = "7291b1dd97d331f752620b02dfdbc231df7fc01bf282a00769e1cdb963c460dc" "checksum lazy_static 0.2.5 (registry+https://github.com/rust-lang/crates.io-index)" = "4732c563b9a21a406565c4747daa7b46742f082911ae4753f390dc9ec7ee1a97"
"checksum libc 0.2.21 (registry+https://github.com/rust-lang/crates.io-index)" = "88ee81885f9f04bff991e306fea7c1c60a5f0f9e409e99f6b40e3311a3363135" "checksum libc 0.2.21 (registry+https://github.com/rust-lang/crates.io-index)" = "88ee81885f9f04bff991e306fea7c1c60a5f0f9e409e99f6b40e3311a3363135"
"checksum log 0.3.7 (registry+https://github.com/rust-lang/crates.io-index)" = "5141eca02775a762cc6cd564d8d2c50f67c0ea3a372cbf1c51592b3e029e10ad" "checksum log 0.3.7 (registry+https://github.com/rust-lang/crates.io-index)" = "5141eca02775a762cc6cd564d8d2c50f67c0ea3a372cbf1c51592b3e029e10ad"
"checksum mdbook 0.0.18 (registry+https://github.com/rust-lang/crates.io-index)" = "06a68e8738e42b38a02755d3ce5fa12d559e17acb238e4326cbc3cc056e65280" "checksum mdbook 0.0.19 (registry+https://github.com/rust-lang/crates.io-index)" = "2598843aeda0c5bb2e8e4d714564f1c3fc40f7844157e34563bf96ae3866b56e"
"checksum memchr 1.0.1 (registry+https://github.com/rust-lang/crates.io-index)" = "1dbccc0e46f1ea47b9f17e6d67c5a96bd27030519c519c9c91327e31275a47b4" "checksum memchr 1.0.1 (registry+https://github.com/rust-lang/crates.io-index)" = "1dbccc0e46f1ea47b9f17e6d67c5a96bd27030519c519c9c91327e31275a47b4"
"checksum num-traits 0.1.37 (registry+https://github.com/rust-lang/crates.io-index)" = "e1cbfa3781f3fe73dc05321bed52a06d2d491eaa764c52335cf4399f046ece99" "checksum num-traits 0.1.37 (registry+https://github.com/rust-lang/crates.io-index)" = "e1cbfa3781f3fe73dc05321bed52a06d2d491eaa764c52335cf4399f046ece99"
"checksum num_cpus 0.2.13 (registry+https://github.com/rust-lang/crates.io-index)" = "cee7e88156f3f9e19bdd598f8d6c9db7bf4078f99f8381f43a55b09648d1a6e3" "checksum num_cpus 0.2.13 (registry+https://github.com/rust-lang/crates.io-index)" = "cee7e88156f3f9e19bdd598f8d6c9db7bf4078f99f8381f43a55b09648d1a6e3"
"checksum open 1.2.0 (registry+https://github.com/rust-lang/crates.io-index)" = "3478ed1686bd1300c8a981a940abc92b06fac9cbef747f4c668d4e032ff7b842" "checksum open 1.2.0 (registry+https://github.com/rust-lang/crates.io-index)" = "3478ed1686bd1300c8a981a940abc92b06fac9cbef747f4c668d4e032ff7b842"
"checksum pest 0.3.3 (registry+https://github.com/rust-lang/crates.io-index)" = "0a6dda33d67c26f0aac90d324ab2eb7239c819fc7b2552fe9faa4fe88441edc8" "checksum pest 0.3.3 (registry+https://github.com/rust-lang/crates.io-index)" = "0a6dda33d67c26f0aac90d324ab2eb7239c819fc7b2552fe9faa4fe88441edc8"
"checksum pulldown-cmark 0.0.14 (registry+https://github.com/rust-lang/crates.io-index)" = "d9ab1e588ef8efd702c7ed9d2bd774db5e6f4d878bb5a1a9f371828fbdff6973"
"checksum pulldown-cmark 0.0.8 (registry+https://github.com/rust-lang/crates.io-index)" = "1058d7bb927ca067656537eec4e02c2b4b70eaaa129664c5b90c111e20326f41" "checksum pulldown-cmark 0.0.8 (registry+https://github.com/rust-lang/crates.io-index)" = "1058d7bb927ca067656537eec4e02c2b4b70eaaa129664c5b90c111e20326f41"
"checksum quick-error 1.1.0 (registry+https://github.com/rust-lang/crates.io-index)" = "0aad603e8d7fb67da22dbdf1f4b826ce8829e406124109e73cf1b2454b93a71c" "checksum quick-error 1.1.0 (registry+https://github.com/rust-lang/crates.io-index)" = "0aad603e8d7fb67da22dbdf1f4b826ce8829e406124109e73cf1b2454b93a71c"
"checksum regex 0.2.1 (registry+https://github.com/rust-lang/crates.io-index)" = "4278c17d0f6d62dfef0ab00028feb45bd7d2102843f80763474eeb1be8a10c01" "checksum regex 0.2.1 (registry+https://github.com/rust-lang/crates.io-index)" = "4278c17d0f6d62dfef0ab00028feb45bd7d2102843f80763474eeb1be8a10c01"
"checksum regex-syntax 0.4.0 (registry+https://github.com/rust-lang/crates.io-index)" = "2f9191b1f57603095f105d317e375d19b1c9c5c3185ea9633a99a6dcbed04457" "checksum regex-syntax 0.4.0 (registry+https://github.com/rust-lang/crates.io-index)" = "2f9191b1f57603095f105d317e375d19b1c9c5c3185ea9633a99a6dcbed04457"
"checksum rustc-serialize 0.3.22 (registry+https://github.com/rust-lang/crates.io-index)" = "237546c689f20bb44980270c73c3b9edd0891c1be49cc1274406134a66d3957b" "checksum rls-data 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)" = "af1dfff00189fd7b78edb9af131b0de703676c04fa8126aed77fd2c586775a4d"
"checksum rls-span 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)" = "8656f7b850ac85fb204ef94318c641bbb15a32766e12f9a589a23e4c0fbc38db"
"checksum rustc-serialize 0.3.23 (registry+https://github.com/rust-lang/crates.io-index)" = "684ce48436d6465300c9ea783b6b14c4361d6b8dcbb1375b486a69cc19e2dfb0"
"checksum serde 0.9.11 (registry+https://github.com/rust-lang/crates.io-index)" = "a702319c807c016e51f672e5c77d6f0b46afddd744b5e437d6b8436b888b458f" "checksum serde 0.9.11 (registry+https://github.com/rust-lang/crates.io-index)" = "a702319c807c016e51f672e5c77d6f0b46afddd744b5e437d6b8436b888b458f"
"checksum serde_json 0.9.9 (registry+https://github.com/rust-lang/crates.io-index)" = "dbc45439552eb8fb86907a2c41c1fd0ef97458efb87ff7f878db466eb581824e" "checksum serde_json 0.9.9 (registry+https://github.com/rust-lang/crates.io-index)" = "dbc45439552eb8fb86907a2c41c1fd0ef97458efb87ff7f878db466eb581824e"
"checksum strsim 0.6.0 (registry+https://github.com/rust-lang/crates.io-index)" = "b4d15c810519a91cf877e7e36e63fe068815c678181439f2f29e2562147c3694" "checksum strsim 0.6.0 (registry+https://github.com/rust-lang/crates.io-index)" = "b4d15c810519a91cf877e7e36e63fe068815c678181439f2f29e2562147c3694"
@ -1000,7 +1027,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
"checksum unicode-width 0.1.4 (registry+https://github.com/rust-lang/crates.io-index)" = "bf3a113775714a22dcb774d8ea3655c53a32debae63a063acc00a91cc586245f" "checksum unicode-width 0.1.4 (registry+https://github.com/rust-lang/crates.io-index)" = "bf3a113775714a22dcb774d8ea3655c53a32debae63a063acc00a91cc586245f"
"checksum unreachable 0.1.1 (registry+https://github.com/rust-lang/crates.io-index)" = "1f2ae5ddb18e1c92664717616dd9549dde73f539f01bd7b77c2edb2446bdff91" "checksum unreachable 0.1.1 (registry+https://github.com/rust-lang/crates.io-index)" = "1f2ae5ddb18e1c92664717616dd9549dde73f539f01bd7b77c2edb2446bdff91"
"checksum utf8-ranges 1.0.0 (registry+https://github.com/rust-lang/crates.io-index)" = "662fab6525a98beff2921d7f61a39e7d59e0b425ebc7d0d9e66d316e55124122" "checksum utf8-ranges 1.0.0 (registry+https://github.com/rust-lang/crates.io-index)" = "662fab6525a98beff2921d7f61a39e7d59e0b425ebc7d0d9e66d316e55124122"
"checksum vec_map 0.6.0 (registry+https://github.com/rust-lang/crates.io-index)" = "cac5efe5cb0fa14ec2f84f83c701c562ee63f6dcc680861b21d65c682adfb05f" "checksum vec_map 0.7.0 (registry+https://github.com/rust-lang/crates.io-index)" = "f8cdc8b93bd0198ed872357fb2e667f7125646b1762f16d60b2c96350d361897"
"checksum void 1.0.2 (registry+https://github.com/rust-lang/crates.io-index)" = "6a02e4885ed3bc0f2de90ea6dd45ebcbb66dacffe03547fadbb0eeae2770887d" "checksum void 1.0.2 (registry+https://github.com/rust-lang/crates.io-index)" = "6a02e4885ed3bc0f2de90ea6dd45ebcbb66dacffe03547fadbb0eeae2770887d"
"checksum winapi 0.2.8 (registry+https://github.com/rust-lang/crates.io-index)" = "167dc9d6949a9b857f3451275e911c3f44255842c1f7a76f33c55103a909087a" "checksum winapi 0.2.8 (registry+https://github.com/rust-lang/crates.io-index)" = "167dc9d6949a9b857f3451275e911c3f44255842c1f7a76f33c55103a909087a"
"checksum winapi-build 0.1.1 (registry+https://github.com/rust-lang/crates.io-index)" = "2d315eee3b34aca4797b2da6b13ed88266e6d612562a0c46390af8299fc699bc" "checksum winapi-build 0.1.1 (registry+https://github.com/rust-lang/crates.io-index)" = "2d315eee3b34aca4797b2da6b13ed88266e6d612562a0c46390af8299fc699bc"
@@ -4,10 +4,6 @@ This is an in-progress README which is targeted at helping to explain how Rust
 is bootstrapped and in general some of the technical details of the build
 system.
 
-> **Note**: This build system is currently under active development and is not
-> intended to be the primarily used one just yet. The makefiles are currently
-> the ones that are still "guaranteed to work" as much as possible at least.
-
 ## Using rustbuild
 
 The rustbuild build system has a primary entry point, a top level `x.py` script:


@@ -94,6 +94,13 @@ fn main() {
         cmd.arg("-Cprefer-dynamic");
     }
 
+    // Pass the `rustbuild` feature flag to crates which rustbuild is
+    // building. See the comment in bootstrap/lib.rs where this env var is
+    // set for more details.
+    if env::var_os("RUSTBUILD_UNSTABLE").is_some() {
+        cmd.arg("--cfg").arg("rustbuild");
+    }
+
     // Help the libc crate compile by assisting it in finding the MUSL
     // native libraries.
     if let Some(s) = env::var_os("MUSL_ROOT") {
@@ -182,7 +189,7 @@ fn main() {
     if env::var("RUSTC_RPATH") == Ok("true".to_string()) {
         let rpath = if target.contains("apple") {
 
-            // Note that we need to take one extra step on OSX to also pass
+            // Note that we need to take one extra step on macOS to also pass
             // `-Wl,-instal_name,@rpath/...` to get things to work right. To
             // do that we pass a weird flag to the compiler to get it to do
             // so. Note that this is definitely a hack, and we should likely


@@ -40,6 +40,14 @@ fn main() {
        .arg(sysroot)
        .env(bootstrap::util::dylib_path_var(),
             env::join_paths(&dylib_path).unwrap());
 
+    // Pass the `rustbuild` feature flag to crates which rustbuild is
+    // building. See the comment in bootstrap/lib.rs where this env var is
+    // set for more details.
+    if env::var_os("RUSTBUILD_UNSTABLE").is_some() {
+        cmd.arg("--cfg").arg("rustbuild");
+    }
+
     std::process::exit(match cmd.status() {
         Ok(s) => s.code().unwrap_or(1),
         Err(e) => panic!("\n\nfailed to run {:?}: {}\n\n", cmd, e),


@@ -160,18 +160,16 @@ class RustBuild(object):
     def download_stage0(self):
         cache_dst = os.path.join(self.build_dir, "cache")
         rustc_cache = os.path.join(cache_dst, self.stage0_rustc_date())
-        cargo_cache = os.path.join(cache_dst, self.stage0_cargo_rev())
         if not os.path.exists(rustc_cache):
             os.makedirs(rustc_cache)
-        if not os.path.exists(cargo_cache):
-            os.makedirs(cargo_cache)
+
+        channel = self.stage0_rustc_channel()
 
         if self.rustc().startswith(self.bin_root()) and \
                 (not os.path.exists(self.rustc()) or self.rustc_out_of_date()):
             self.print_what_it_means_to_bootstrap()
             if os.path.exists(self.bin_root()):
                 shutil.rmtree(self.bin_root())
-            channel = self.stage0_rustc_channel()
             filename = "rust-std-{}-{}.tar.gz".format(channel, self.build)
             url = "https://static.rust-lang.org/dist/" + self.stage0_rustc_date()
             tarball = os.path.join(rustc_cache, filename)
@@ -192,18 +190,26 @@ class RustBuild(object):
             with open(self.rustc_stamp(), 'w') as f:
                 f.write(self.stage0_rustc_date())
 
+            if "pc-windows-gnu" in self.build:
+                filename = "rust-mingw-{}-{}.tar.gz".format(channel, self.build)
+                url = "https://static.rust-lang.org/dist/" + self.stage0_rustc_date()
+                tarball = os.path.join(rustc_cache, filename)
+                if not os.path.exists(tarball):
+                    get("{}/{}".format(url, filename), tarball, verbose=self.verbose)
+                unpack(tarball, self.bin_root(), match="rust-mingw", verbose=self.verbose)
+
         if self.cargo().startswith(self.bin_root()) and \
                 (not os.path.exists(self.cargo()) or self.cargo_out_of_date()):
             self.print_what_it_means_to_bootstrap()
-            filename = "cargo-nightly-{}.tar.gz".format(self.build)
-            url = "https://s3.amazonaws.com/rust-lang-ci/cargo-builds/" + self.stage0_cargo_rev()
-            tarball = os.path.join(cargo_cache, filename)
+            filename = "cargo-{}-{}.tar.gz".format('0.18.0', self.build)
+            url = "https://static.rust-lang.org/dist/" + self.stage0_rustc_date()
+            tarball = os.path.join(rustc_cache, filename)
             if not os.path.exists(tarball):
                 get("{}/{}".format(url, filename), tarball, verbose=self.verbose)
             unpack(tarball, self.bin_root(), match="cargo", verbose=self.verbose)
             self.fix_executable(self.bin_root() + "/bin/cargo")
             with open(self.cargo_stamp(), 'w') as f:
-                f.write(self.stage0_cargo_rev())
+                f.write(self.stage0_rustc_date())
 
     def fix_executable(self, fname):
         # If we're on NixOS we need to change the path to the dynamic loader
@@ -258,9 +264,6 @@ class RustBuild(object):
             print("warning: failed to call patchelf: %s" % e)
             return
 
-    def stage0_cargo_rev(self):
-        return self._cargo_rev
-
     def stage0_rustc_date(self):
         return self._rustc_date
@@ -283,7 +286,7 @@ class RustBuild(object):
         if not os.path.exists(self.cargo_stamp()) or self.clean:
             return True
         with open(self.cargo_stamp(), 'r') as f:
-            return self.stage0_cargo_rev() != f.read()
+            return self.stage0_rustc_date() != f.read()
 
     def bin_root(self):
         return os.path.join(self.build_dir, self.build, "stage0")
@@ -401,14 +404,6 @@ class RustBuild(object):
                 raise Exception(err)
             sys.exit(err)
 
-        # Darwin's `uname -s` lies and always returns i386. We have to use
-        # sysctl instead.
-        if ostype == 'Darwin' and cputype == 'i686':
-            args = ['sysctl', 'hw.optional.x86_64']
-            sysctl = subprocess.check_output(args).decode(default_encoding)
-            if ': 1' in sysctl:
-                cputype = 'x86_64'
-
         # The goal here is to come up with the same triple as LLVM would,
         # at least for the subset of platforms we're willing to target.
         if ostype == 'Linux':
@@ -469,10 +464,10 @@ class RustBuild(object):
             cputype = 'i686'
         elif cputype in {'xscale', 'arm'}:
             cputype = 'arm'
-        elif cputype in {'armv6l', 'armv7l', 'armv8l'}:
+        elif cputype == 'armv6l':
             cputype = 'arm'
             ostype += 'eabihf'
-        elif cputype == 'armv7l':
+        elif cputype in {'armv7l', 'armv8l'}:
             cputype = 'armv7'
             ostype += 'eabihf'
         elif cputype == 'aarch64':
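The hunk above changes which ARM sub-architectures collapse to `arm` versus `armv7` when building the target triple. Under the new rules it can be sketched as a small standalone function (the helper name is hypothetical, not part of the patch):

```python
def normalize_arm(cputype, ostype):
    # Per the updated logic: armv6l stays plain 'arm', while armv7l and
    # armv8l are now treated as 'armv7'; all three are hard-float (eabihf).
    if cputype == 'armv6l':
        return 'arm', ostype + 'eabihf'
    if cputype in {'armv7l', 'armv8l'}:
        return 'armv7', ostype + 'eabihf'
    return cputype, ostype
```

The practical effect is that `armv8l` hosts (32-bit userland on a 64-bit ARM kernel) now get an `armv7` triple instead of the baseline `arm` one.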
@@ -578,7 +573,6 @@ def bootstrap():
     data = stage0_data(rb.rust_root)
     rb._rustc_channel, rb._rustc_date = data['rustc'].split('-', 1)
-    rb._cargo_rev = data['cargo']
 
     # Fetch/build the bootstrap
     rb.build = rb.build_triple()
@@ -598,8 +592,10 @@
 def main():
     start_time = time()
+    help_triggered = ('-h' in sys.argv) or ('--help' in sys.argv) or (len(sys.argv) == 1)
     try:
         bootstrap()
-        print("Build completed successfully in %s" % format_build_time(time() - start_time))
+        if not help_triggered:
+            print("Build completed successfully in %s" % format_build_time(time() - start_time))
     except (SystemExit, KeyboardInterrupt) as e:
         if hasattr(e, 'code') and isinstance(e.code, int):
@@ -607,6 +603,7 @@ def main():
         else:
             exit_code = 1
             print(e)
-        print("Build completed unsuccessfully in %s" % format_build_time(time() - start_time))
+        if not help_triggered:
+            print("Build completed unsuccessfully in %s" % format_build_time(time() - start_time))
         sys.exit(exit_code)
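The bootstrap.py changes above rework the stage0 download to key both the rustc and cargo stamps off the same `stage0_rustc_date()` string. The underlying stamp-file pattern (write the revision after unpacking, compare it on the next run) boils down to a read-and-compare; a minimal standalone sketch, with hypothetical function names:

```python
import os

def out_of_date(stamp_path, current_rev):
    # Stale when the stamp file is missing or records a different
    # revision string than the one we now want installed.
    if not os.path.exists(stamp_path):
        return True
    with open(stamp_path) as f:
        return f.read() != current_rev

def write_stamp(stamp_path, current_rev):
    # Written only after a successful download/unpack, so a partial
    # install is retried on the next run.
    with open(stamp_path, 'w') as f:
        f.write(current_rev)
```

After this patch a bump of the stage0 date in src/stage0.txt invalidates both stamps at once, which is what lets the separate cargo revision bookkeeping (`stage0_cargo_rev`) be deleted.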


@@ -23,12 +23,12 @@ use build_helper::output;
 
 use Build;
 
 // The version number
-pub const CFG_RELEASE_NUM: &'static str = "1.17.0";
+pub const CFG_RELEASE_NUM: &'static str = "1.18.0";
 
 // An optional number to put after the label, e.g. '.2' -> '-beta.2'
 // Be sure to make this starts with a dot to conform to semver pre-release
 // versions (section 9)
-pub const CFG_PRERELEASE_VERSION: &'static str = ".3";
+pub const CFG_PRERELEASE_VERSION: &'static str = ".4";
 
 pub struct GitInfo {
     inner: Option<Info>,


@@ -176,7 +176,7 @@ pub fn compiletest(build: &Build,
     cmd.arg("--docck-python").arg(build.python());
 
     if build.config.build.ends_with("apple-darwin") {
-        // Force /usr/bin/python on OSX for LLDB tests because we're loading the
+        // Force /usr/bin/python on macOS for LLDB tests because we're loading the
         // LLDB plugin's compiled module which only works with the system python
         // (namely not Homebrew-installed python)
         cmd.arg("--lldb-python").arg("/usr/bin/python");
@@ -285,6 +285,16 @@ pub fn docs(build: &Build, compiler: &Compiler) {
             continue
         }
 
+        // The nostarch directory in the book is for no starch, and so isn't guaranteed to build.
+        // we don't care if it doesn't build, so skip it.
+        use std::ffi::OsStr;
+        let path: &OsStr = p.as_ref();
+        if let Some(path) = path.to_str() {
+            if path.contains("nostarch") {
+                continue;
+            }
+        }
+
         println!("doc tests for: {}", p.display());
         markdown_test(build, compiler, &p);
     }
@@ -576,7 +586,7 @@ fn android_copy_libs(build: &Build, compiler: &Compiler, target: &str) {
                     .arg(ADB_TEST_DIR));
 
     let target_dir = format!("{}/{}", ADB_TEST_DIR, target);
-    build.run(Command::new("adb").args(&["shell", "mkdir", &target_dir[..]]));
+    build.run(Command::new("adb").args(&["shell", "mkdir", &target_dir]));
 
     for f in t!(build.sysroot_libdir(compiler, target).read_dir()) {
         let f = t!(f);


@@ -22,9 +22,9 @@ use std::path::Path;
 use Build;
 
 pub fn clean(build: &Build) {
-    rm_rf(build, "tmp".as_ref());
-    rm_rf(build, &build.out.join("tmp"));
-    rm_rf(build, &build.out.join("dist"));
+    rm_rf("tmp".as_ref());
+    rm_rf(&build.out.join("tmp"));
+    rm_rf(&build.out.join("dist"));
 
     for host in build.config.host.iter() {
         let entries = match build.out.join(host).read_dir() {
@@ -38,32 +38,31 @@ pub fn clean(build: &Build) {
                 continue
             }
             let path = t!(entry.path().canonicalize());
-            rm_rf(build, &path);
+            rm_rf(&path);
         }
     }
 }
 
-fn rm_rf(build: &Build, path: &Path) {
-    if !path.exists() {
-        return
-    }
-
-    if path.is_file() {
-        return do_op(path, "remove file", |p| fs::remove_file(p));
-    }
-
-    for file in t!(fs::read_dir(path)) {
-        let file = t!(file).path();
-        if file.is_dir() {
-            rm_rf(build, &file);
-        } else {
-            // On windows we can't remove a readonly file, and git will
-            // often clone files as readonly. As a result, we have some
-            // special logic to remove readonly files on windows.
-            do_op(&file, "remove file", |p| fs::remove_file(p));
-        }
-    }
-    do_op(path, "remove dir", |p| fs::remove_dir(p));
+fn rm_rf(path: &Path) {
+    match path.symlink_metadata() {
+        Err(e) => {
+            if e.kind() == ErrorKind::NotFound {
+                return;
+            }
+            panic!("failed to get metadata for file {}: {}", path.display(), e);
+        },
+        Ok(metadata) => {
+            if metadata.file_type().is_file() || metadata.file_type().is_symlink() {
+                do_op(path, "remove file", |p| fs::remove_file(p));
+                return;
+            }
+
+            for file in t!(fs::read_dir(path)) {
+                rm_rf(&t!(file).path());
+            }
+            do_op(path, "remove dir", |p| fs::remove_dir(p));
+        },
+    };
 }
@@ -71,9 +70,12 @@ fn do_op<F>(path: &Path, desc: &str, mut f: F)
 {
     match f(path) {
         Ok(()) => {}
+        // On windows we can't remove a readonly file, and git will often clone files as readonly.
+        // As a result, we have some special logic to remove readonly files on windows.
+        // This is also the reason that we can't use things like fs::remove_dir_all().
         Err(ref e) if cfg!(windows) &&
                       e.kind() == ErrorKind::PermissionDenied => {
-            let mut p = t!(path.metadata()).permissions();
+            let mut p = t!(path.symlink_metadata()).permissions();
            p.set_readonly(false);
            t!(fs::set_permissions(path, p));
            f(path).unwrap_or_else(|e| {
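The new `rm_rf`/`do_op` pair above deletes a tree while tolerating missing paths, treating symlinks as files, and clearing the read-only bit before retrying a failed removal (the Windows/git case the comments call out). A rough Python equivalent of the same strategy, for illustration only:

```python
import errno
import os
import stat

def rm_rf(path):
    # Missing paths are a no-op, mirroring the ErrorKind::NotFound branch.
    try:
        st = os.lstat(path)
    except OSError as e:
        if e.errno == errno.ENOENT:
            return
        raise
    # Symlinks and plain files are removed directly (never followed).
    if stat.S_ISLNK(st.st_mode) or not stat.S_ISDIR(st.st_mode):
        _do_op(path, os.remove)
        return
    for name in os.listdir(path):
        rm_rf(os.path.join(path, name))
    _do_op(path, os.rmdir)

def _do_op(path, op):
    try:
        op(path)
    except PermissionError:
        # Clear the read-only bit (git often checks files out read-only
        # on Windows), then retry the operation once.
        os.chmod(path, stat.S_IWRITE)
        op(path)
```

This is also why the Rust code cannot simply call `fs::remove_dir_all`: the retry-after-chmod step has to happen per entry.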


@@ -151,6 +151,7 @@ pub fn build_startup_objects(build: &Build, for_compiler: &Compiler, target: &st
         if !up_to_date(src_file, dst_file) {
             let mut cmd = Command::new(&compiler_path);
             build.run(cmd.env("RUSTC_BOOTSTRAP", "1")
+                        .arg("--cfg").arg(format!("stage{}", compiler.stage))
                         .arg("--target").arg(target)
                         .arg("--emit=obj")
                         .arg("--out-dir").arg(dst_dir)
@@ -258,7 +259,7 @@ pub fn rustc(build: &Build, target: &str, compiler: &Compiler) {
         cargo.env("CFG_LLVM_ROOT", s);
     }
     // Building with a static libstdc++ is only supported on linux right now,
-    // not for MSVC or OSX
+    // not for MSVC or macOS
     if build.config.llvm_static_stdcpp &&
        !target.contains("windows") &&
        !target.contains("apple") {
@@ -275,6 +276,7 @@ pub fn rustc(build: &Build, target: &str, compiler: &Compiler) {
         cargo.env("CFG_DEFAULT_AR", s);
     }
     build.run(&mut cargo);
+    update_mtime(build, &librustc_stamp(build, compiler, target));
 }
 
 /// Same as `std_link`, only for librustc
@@ -305,6 +307,12 @@ fn libtest_stamp(build: &Build, compiler: &Compiler, target: &str) -> PathBuf {
     build.cargo_out(compiler, Mode::Libtest, target).join(".libtest.stamp")
 }
 
+/// Cargo's output path for librustc in a given stage, compiled by a particular
+/// compiler for the specified target.
+fn librustc_stamp(build: &Build, compiler: &Compiler, target: &str) -> PathBuf {
+    build.cargo_out(compiler, Mode::Librustc, target).join(".librustc.stamp")
+}
+
 fn compiler_file(compiler: &Path, file: &str) -> PathBuf {
     let out = output(Command::new(compiler)
                         .arg(format!("-print-file-name={}", file)));
@@ -407,6 +415,23 @@ fn add_to_sysroot(out_dir: &Path, sysroot_dst: &Path) {
     }
 }
 
+/// Build a tool in `src/tools`
+///
+/// This will build the specified tool with the specified `host` compiler in
+/// `stage` into the normal cargo output directory.
+pub fn maybe_clean_tools(build: &Build, stage: u32, target: &str, mode: Mode) {
+    let compiler = Compiler::new(stage, &build.config.build);
+
+    let stamp = match mode {
+        Mode::Libstd => libstd_stamp(build, &compiler, target),
+        Mode::Libtest => libtest_stamp(build, &compiler, target),
+        Mode::Librustc => librustc_stamp(build, &compiler, target),
+        _ => panic!(),
+    };
+    let out_dir = build.cargo_out(&compiler, Mode::Tool, target);
+    build.clear_if_dirty(&out_dir, &stamp);
+}
+
 /// Build a tool in `src/tools`
 ///
 /// This will build the specified tool with the specified `host` compiler in
@@ -416,15 +441,6 @@ pub fn tool(build: &Build, stage: u32, target: &str, tool: &str) {
     let compiler = Compiler::new(stage, &build.config.build);
 
-    // FIXME: need to clear out previous tool and ideally deps, may require
-    // isolating output directories or require a pseudo shim step to
-    // clear out all the info.
-    //
-    // Maybe when libstd is compiled it should clear out the rustc of the
-    // corresponding stage?
-    // let out_dir = build.cargo_out(stage, &host, Mode::Librustc, target);
-    // build.clear_if_dirty(&out_dir, &libstd_stamp(build, stage, &host, target));
-
     let mut cargo = build.cargo(&compiler, Mode::Tool, target, "build");
     let mut dir = build.src.join(tool);
     if !dir.exists() {


@@ -23,7 +23,7 @@ use std::process;
 use num_cpus;
 use rustc_serialize::Decodable;
 use toml::{Parser, Decoder, Value};
-use util::push_exe_path;
+use util::{exe, push_exe_path};
 
 /// Global configuration for the entire build and/or bootstrap.
 ///
@@ -570,6 +570,12 @@ impl Config {
                         .or_insert(Target::default());
                     target.ndk = Some(parse_configure_path(value));
                 }
+                "CFG_X86_64_LINUX_ANDROID_NDK" if value.len() > 0 => {
+                    let target = "x86_64-linux-android".to_string();
+                    let target = self.target_config.entry(target)
+                                     .or_insert(Target::default());
+                    target.ndk = Some(parse_configure_path(value));
+                }
                 "CFG_LOCAL_RUST_ROOT" if value.len() > 0 => {
                     let path = parse_configure_path(value);
                     self.rustc = Some(push_exe_path(path.clone(), &["bin", "rustc"]));
@@ -580,10 +586,10 @@ impl Config {
                     self.python = Some(path);
                 }
                 "CFG_ENABLE_CCACHE" if value == "1" => {
-                    self.ccache = Some("ccache".to_string());
+                    self.ccache = Some(exe("ccache", &self.build));
                 }
                 "CFG_ENABLE_SCCACHE" if value == "1" => {
-                    self.ccache = Some("sccache".to_string());
+                    self.ccache = Some(exe("sccache", &self.build));
                 }
                 "CFG_CONFIGURE_ARGS" if value.len() > 0 => {
                     self.configure_args = value.split_whitespace()


@@ -88,11 +88,11 @@
 # for each target triple.
 #target = ["x86_64-unknown-linux-gnu"] # defaults to just the build triple
 
-# Instead of downloading the src/nightlies.txt version of Cargo specified, use
+# Instead of downloading the src/stage0.txt version of Cargo specified, use
 # this Cargo binary instead to build all Rust code
 #cargo = "/path/to/bin/cargo"
 
-# Instead of downloading the src/nightlies.txt version of the compiler
+# Instead of downloading the src/stage0.txt version of the compiler
 # specified, use this rustc binary instead as the stage0 snapshot compiler.
 #rustc = "/path/to/bin/rustc"


@@ -39,6 +39,8 @@ use util::{cp_r, libdir, is_dylib, cp_filtered, copy, exe};
 fn pkgname(build: &Build, component: &str) -> String {
     if component == "cargo" {
         format!("{}-{}", component, build.cargo_package_vers())
+    } else if component == "rls" {
+        format!("{}-{}", component, build.package_vers(&build.release_num("rls")))
     } else {
         assert!(component.starts_with("rust"));
         format!("{}-{}", component, build.rust_package_vers())
@@ -315,19 +317,12 @@ pub fn rust_src_location(build: &Build) -> PathBuf {
 
 /// Creates a tarball of save-analysis metadata, if available.
 pub fn analysis(build: &Build, compiler: &Compiler, target: &str) {
+    assert!(build.config.extended);
     println!("Dist analysis");
 
-    if build.config.channel != "nightly" {
-        println!("\tskipping - not on nightly channel");
-        return;
-    }
     if compiler.host != build.config.build {
-        println!("\tskipping - not a build host");
-        return
-    }
-    if compiler.stage != 2 {
-        println!("\tskipping - not stage2");
-        return
+        println!("\tskipping, not a build host");
+        return;
     }
 
     // Package save-analysis from stage1 if not doing a full bootstrap, as the
@@ -397,6 +392,7 @@ pub fn rust_src(build: &Build) {
         "man",
         "src",
         "cargo",
+        "rls",
     ];
 
     let filter_fn = move |path: &Path| {
@@ -543,7 +539,7 @@ pub fn cargo(build: &Build, stage: u32, target: &str) {
     let src = build.src.join("cargo");
     let etc = src.join("src/etc");
-    let release_num = build.cargo_release_num();
+    let release_num = build.release_num("cargo");
     let name = pkgname(build, "cargo");
     let version = build.cargo_info.version(build, &release_num);
@@ -597,6 +593,55 @@ pub fn cargo(build: &Build, stage: u32, target: &str) {
     build.run(&mut cmd);
 }
 
+pub fn rls(build: &Build, stage: u32, target: &str) {
+    assert!(build.config.extended);
+    println!("Dist RLS stage{} ({})", stage, target);
+    let compiler = Compiler::new(stage, &build.config.build);
+
+    let src = build.src.join("rls");
+    let release_num = build.release_num("rls");
+    let name = pkgname(build, "rls");
+    let version = build.rls_info.version(build, &release_num);
+
+    let tmp = tmpdir(build);
+    let image = tmp.join("rls-image");
+    drop(fs::remove_dir_all(&image));
+    t!(fs::create_dir_all(&image));
+
+    // Prepare the image directory
+    let rls = build.cargo_out(&compiler, Mode::Tool, target)
+                   .join(exe("rls", target));
+    install(&rls, &image.join("bin"), 0o755);
+    let doc = image.join("share/doc/rls");
+    install(&src.join("README.md"), &doc, 0o644);
+    install(&src.join("LICENSE-MIT"), &doc, 0o644);
+    install(&src.join("LICENSE-APACHE"), &doc, 0o644);
+
+    // Prepare the overlay
+    let overlay = tmp.join("rls-overlay");
+    drop(fs::remove_dir_all(&overlay));
+    t!(fs::create_dir_all(&overlay));
+    install(&src.join("README.md"), &overlay, 0o644);
+    install(&src.join("LICENSE-MIT"), &overlay, 0o644);
+    install(&src.join("LICENSE-APACHE"), &overlay, 0o644);
+    t!(t!(File::create(overlay.join("version"))).write_all(version.as_bytes()));
+
+    // Generate the installer tarball
+    let mut cmd = Command::new("sh");
+    cmd.arg(sanitize_sh(&build.src.join("src/rust-installer/gen-installer.sh")))
+       .arg("--product-name=Rust")
+       .arg("--rel-manifest-dir=rustlib")
+       .arg("--success-message=RLS-ready-to-serve.")
+       .arg(format!("--image-dir={}", sanitize_sh(&image)))
+       .arg(format!("--work-dir={}", sanitize_sh(&tmpdir(build))))
+       .arg(format!("--output-dir={}", sanitize_sh(&distdir(build))))
+       .arg(format!("--non-installed-overlay={}", sanitize_sh(&overlay)))
+       .arg(format!("--package-name={}-{}", name, target))
+       .arg("--component-name=rls")
+       .arg("--legacy-manifest-dirs=rustlib,cargo");
+    build.run(&mut cmd);
+}
+
 /// Creates a combined installer for the specified target in the provided stage.
 pub fn extended(build: &Build, stage: u32, target: &str) {
     println!("Dist extended stage{} ({})", stage, target);
@@ -608,6 +653,12 @@ pub fn extended(build: &Build, stage: u32, target: &str) {
     let cargo_installer = dist.join(format!("{}-{}.tar.gz",
                                             pkgname(build, "cargo"),
                                             target));
+    let rls_installer = dist.join(format!("{}-{}.tar.gz",
+                                          pkgname(build, "rls"),
+                                          target));
+    let analysis_installer = dist.join(format!("{}-{}.tar.gz",
+                                               pkgname(build, "rust-analysis"),
+                                               target));
     let docs_installer = dist.join(format!("{}-{}.tar.gz",
                                            pkgname(build, "rust-docs"),
                                            target));
@@ -635,9 +686,11 @@ pub fn extended(build: &Build, stage: u32, target: &str) {
     // upgrades rustc was upgraded before rust-std. To avoid rustc clobbering
     // the std files during uninstall. To do this ensure that rustc comes
     // before rust-std in the list below.
-    let mut input_tarballs = format!("{},{},{},{}",
+    let mut input_tarballs = format!("{},{},{},{},{},{}",
                                     sanitize_sh(&rustc_installer),
                                     sanitize_sh(&cargo_installer),
+                                    sanitize_sh(&rls_installer),
+                                    sanitize_sh(&analysis_installer),
                                     sanitize_sh(&docs_installer),
                                     sanitize_sh(&std_installer));
     if target.contains("pc-windows-gnu") {
@@ -950,7 +1003,8 @@ pub fn hash_and_sign(build: &Build) {
     cmd.arg(distdir(build));
     cmd.arg(today.trim());
     cmd.arg(build.rust_package_vers());
-    cmd.arg(build.package_vers(&build.cargo_release_num()));
+    cmd.arg(build.package_vers(&build.release_num("cargo")));
+    cmd.arg(build.package_vers(&build.release_num("rls")));
     cmd.arg(addr);
 
     t!(fs::create_dir_all(distdir(build)));


@@ -53,6 +53,82 @@ pub fn rustbook(build: &Build, target: &str, name: &str) {
        .arg(out));
 }
 
+/// Build the book and associated stuff.
+///
+/// We need to build:
+///
+/// * Book (first edition)
+/// * Book (second edition)
+/// * Index page
+/// * Redirect pages
+pub fn book(build: &Build, target: &str, name: &str) {
+    // build book first edition
+    rustbook(build, target, &format!("{}/first-edition", name));
+
+    // build book second edition
+    rustbook(build, target, &format!("{}/second-edition", name));
+
+    // build the index page
+    let index = format!("{}/index.md", name);
+    println!("Documenting book index ({})", target);
+    invoke_rustdoc(build, target, &index);
+
+    // build the redirect pages
+    println!("Documenting book redirect pages ({})", target);
+    for file in t!(fs::read_dir(build.src.join("src/doc/book/redirects"))) {
+        let file = t!(file);
+        let path = file.path();
+        let path = path.to_str().unwrap();
+
+        invoke_rustdoc(build, target, path);
+    }
+}
+
+fn invoke_rustdoc(build: &Build, target: &str, markdown: &str) {
+    let out = build.doc_out(target);
+
+    let compiler = Compiler::new(0, &build.config.build);
+
+    let path = build.src.join("src/doc").join(markdown);
+
+    let rustdoc = build.rustdoc(&compiler);
+
+    let favicon = build.src.join("src/doc/favicon.inc");
+    let footer = build.src.join("src/doc/footer.inc");
+
+    let version_input = build.src.join("src/doc/version_info.html.template");
+    let version_info = out.join("version_info.html");
+
+    if !up_to_date(&version_input, &version_info) {
+        let mut info = String::new();
+        t!(t!(File::open(&version_input)).read_to_string(&mut info));
+        let info = info.replace("VERSION", &build.rust_release())
+                       .replace("SHORT_HASH", build.rust_info.sha_short().unwrap_or(""))
+                       .replace("STAMP", build.rust_info.sha().unwrap_or(""));
+        t!(t!(File::create(&version_info)).write_all(info.as_bytes()));
+    }
+
+    let mut cmd = Command::new(&rustdoc);
+
+    build.add_rustc_lib_path(&compiler, &mut cmd);
+
+    let out = out.join("book");
+
+    t!(fs::copy(build.src.join("src/doc/rust.css"), out.join("rust.css")));
+
+    cmd.arg("--html-after-content").arg(&footer)
+       .arg("--html-before-content").arg(&version_info)
+       .arg("--html-in-header").arg(&favicon)
+       .arg("--markdown-playground-url")
+       .arg("https://play.rust-lang.org/")
+       .arg("-o").arg(&out)
+       .arg(&path)
+       .arg("--markdown-css")
+       .arg("rust.css");
+
+    build.run(&mut cmd);
+}
+
 /// Generates all standalone documentation as compiled by the rustdoc in `stage`
 /// for the `target` into `out`.
 ///
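The `invoke_rustdoc` helper added above fills `version_info.html` by plain token substitution on a template file. The same transformation, extracted into a standalone Python sketch (the function name is illustrative, not part of the patch):

```python
def render_version_info(template, release, short_hash, full_hash):
    # Mirrors the chained .replace() calls in invoke_rustdoc: three
    # placeholder tokens are substituted with release metadata.
    return (template.replace("VERSION", release)
                    .replace("SHORT_HASH", short_hash)
                    .replace("STAMP", full_hash))
```

Because the substitution is purely textual, the helper only rewrites the output file when the template is newer than the rendered page (`up_to_date` in the Rust code), keeping repeated doc builds cheap.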


@@ -18,7 +18,7 @@ use std::fs;
 use std::path::PathBuf;
 use std::process;
 
-use getopts::{Matches, Options};
+use getopts::Options;
 
 use Build;
 use config::Config;
@@ -75,7 +75,22 @@ pub enum Subcommand {
 impl Flags {
     pub fn parse(args: &[String]) -> Flags {
+        let mut extra_help = String::new();
+        let mut subcommand_help = format!("\
+Usage: x.py <subcommand> [options] [<paths>...]
+
+Subcommands:
+    build       Compile either the compiler or libraries
+    test        Build and run some test suites
+    bench       Build and run some benchmarks
+    doc         Build documentation
+    clean       Clean out build directories
+    dist        Build and/or install distribution artifacts
+
+To learn more about a subcommand, run `./x.py <subcommand> -h`");
+
         let mut opts = Options::new();
+        // Options common to all subcommands
         opts.optflagmulti("v", "verbose", "use verbose output (-vv for very verbose)");
         opts.optflag("i", "incremental", "use incremental compilation");
         opts.optopt("", "config", "TOML configuration file for build", "FILE");
@ -89,21 +104,83 @@ impl Flags {
opts.optopt("j", "jobs", "number of jobs to run in parallel", "JOBS"); opts.optopt("j", "jobs", "number of jobs to run in parallel", "JOBS");
opts.optflag("h", "help", "print this help message"); opts.optflag("h", "help", "print this help message");
let usage = |n, opts: &Options| -> ! { // fn usage()
let command = args.get(0).map(|s| &**s); let usage = |exit_code: i32, opts: &Options, subcommand_help: &str, extra_help: &str| -> ! {
let brief = format!("Usage: x.py {} [options] [<args>...]", println!("{}", opts.usage(subcommand_help));
command.unwrap_or("<command>")); if !extra_help.is_empty() {
println!("{}", extra_help);
}
process::exit(exit_code);
};
println!("{}", opts.usage(&brief)); // We can't use getopt to parse the options until we have completed specifying which
match command { // options are valid, but under the current implementation, some options are conditional on
Some("build") => { // the subcommand. Therefore we must manually identify the subcommand first, so that we can
println!("\ // complete the definition of the options. Then we can use the getopt::Matches object from
// there on out.
let mut possible_subcommands = args.iter().collect::<Vec<_>>();
possible_subcommands.retain(|&s|
(s == "build")
|| (s == "test")
|| (s == "bench")
|| (s == "doc")
|| (s == "clean")
|| (s == "dist"));
let subcommand = match possible_subcommands.first() {
Some(s) => s,
None => {
// No subcommand -- show the general usage and subcommand help
println!("{}\n", subcommand_help);
process::exit(0);
}
};
// Some subcommands get extra options
match subcommand.as_str() {
"test" => { opts.optmulti("", "test-args", "extra arguments", "ARGS"); },
"bench" => { opts.optmulti("", "test-args", "extra arguments", "ARGS"); },
"dist" => { opts.optflag("", "install", "run installer as well"); },
_ => { },
};
// Done specifying what options are possible, so do the getopts parsing
let matches = opts.parse(&args[..]).unwrap_or_else(|e| {
// Invalid argument/option format
println!("\n{}\n", e);
usage(1, &opts, &subcommand_help, &extra_help);
});
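The two-phase idea in the comment above can be illustrated with a tiny standalone helper. This is a hypothetical sketch, not the real getopts flow; it only shows the first phase of picking a known subcommand out of the raw arguments:

```rust
// Hypothetical helper mirroring the retain() scan above: find the first
// argument that names a known subcommand, ignoring everything else.
// (Like the real scan, an option *value* that happens to match a
// subcommand name would fool it, which is why the sanity check exists.)
fn find_subcommand(args: &[String]) -> Option<&str> {
    const KNOWN: [&str; 6] = ["build", "test", "bench", "doc", "clean", "dist"];
    args.iter()
        .map(|s| s.as_str())
        .find(|s| KNOWN.contains(s))
}

fn main() {
    let args: Vec<String> = vec!["-v".into(), "test".into(), "--test-args".into()];
    assert_eq!(find_subcommand(&args), Some("test"));
    assert_eq!(find_subcommand(&["-h".to_string()]), None);
    println!("ok");
}
```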
// Extra sanity check to make sure we didn't hit this crazy corner case:
//
// ./x.py --frobulate clean build
// ^-- option ^ ^- actual subcommand
// \_ arg to option could be mistaken as subcommand
let mut pass_sanity_check = true;
match matches.free.get(0) {
Some(check_subcommand) => {
if &check_subcommand != subcommand {
pass_sanity_check = false;
}
},
None => {
pass_sanity_check = false;
}
}
if !pass_sanity_check {
println!("{}\n", subcommand_help);
println!("Sorry, I couldn't figure out which subcommand you were trying to specify.\n\
You may need to move some options to after the subcommand.\n");
process::exit(1);
}
// Extra help text for some commands
match subcommand.as_str() {
"build" => {
subcommand_help.push_str("\n
Arguments: Arguments:
This subcommand accepts a number of positional arguments of directories to This subcommand accepts a number of paths to directories to the crates
the crates and/or artifacts to compile. For example: and/or artifacts to compile. For example:
./x.py build src/libcore ./x.py build src/libcore
./x.py build src/libproc_macro ./x.py build src/libcore src/libproc_macro
./x.py build src/libstd --stage 1 ./x.py build src/libstd --stage 1
If no arguments are passed then the complete artifacts for that stage are If no arguments are passed then the complete artifacts for that stage are
@ -114,15 +191,13 @@ Arguments:
For a quick build with a usable compile, you can pass: For a quick build with a usable compile, you can pass:
./x.py build --stage 1 src/libtest ./x.py build --stage 1 src/libtest");
");
} }
"test" => {
Some("test") => { subcommand_help.push_str("\n
println!("\
Arguments: Arguments:
This subcommand accepts a number of positional arguments of directories to This subcommand accepts a number of paths to directories to tests that
tests that should be compiled and run. For example: should be compiled and run. For example:
./x.py test src/test/run-pass ./x.py test src/test/run-pass
./x.py test src/libstd --test-args hash_map ./x.py test src/libstd --test-args hash_map
@ -132,139 +207,90 @@ Arguments:
compiled and tested. compiled and tested.
./x.py test ./x.py test
./x.py test --stage 1 ./x.py test --stage 1");
");
} }
"doc" => {
Some("doc") => { subcommand_help.push_str("\n
println!("\
Arguments: Arguments:
This subcommand accepts a number of positional arguments of directories of This subcommand accepts a number of paths to directories of documentation
documentation to build. For example: to build. For example:
./x.py doc src/doc/book ./x.py doc src/doc/book
./x.py doc src/doc/nomicon ./x.py doc src/doc/nomicon
./x.py doc src/libstd ./x.py doc src/doc/book src/libstd
If no arguments are passed then everything is documented: If no arguments are passed then everything is documented:
./x.py doc ./x.py doc
./x.py doc --stage 1 ./x.py doc --stage 1");
");
} }
_ => { }
};
// Get any optional paths which occur after the subcommand
let cwd = t!(env::current_dir());
let paths = matches.free[1..].iter().map(|p| cwd.join(p)).collect::<Vec<_>>();
_ => {}
}
if let Some(command) = command { // All subcommands can have an optional "Available paths" section
if command == "build" || if matches.opt_present("verbose") {
command == "dist" ||
command == "doc" ||
command == "test" ||
command == "bench" ||
command == "clean" {
println!("Available invocations:");
if args.iter().any(|a| a == "-v") {
let flags = Flags::parse(&["build".to_string()]); let flags = Flags::parse(&["build".to_string()]);
let mut config = Config::default(); let mut config = Config::default();
config.build = flags.build.clone(); config.build = flags.build.clone();
let mut build = Build::new(flags, config); let mut build = Build::new(flags, config);
metadata::build(&mut build); metadata::build(&mut build);
step::build_rules(&build).print_help(command); let maybe_rules_help = step::build_rules(&build).get_help(subcommand);
if maybe_rules_help.is_some() {
extra_help.push_str(maybe_rules_help.unwrap().as_str());
}
} else { } else {
println!(" ... elided, run `./x.py {} -h -v` to see", extra_help.push_str(format!("Run `./x.py {} -h -v` to see a list of available paths.",
command); subcommand).as_str());
} }
println!(""); // User passed in -h/--help?
} if matches.opt_present("help") {
usage(0, &opts, &subcommand_help, &extra_help);
} }
println!("\ let cmd = match subcommand.as_str() {
Subcommands:
build Compile either the compiler or libraries
test Build and run some test suites
bench Build and run some benchmarks
doc Build documentation
clean Clean out build directories
dist Build and/or install distribution artifacts
To learn more about a subcommand, run `./x.py <command> -h`
");
process::exit(n);
};
if args.len() == 0 {
println!("a command must be passed");
usage(1, &opts);
}
let parse = |opts: &Options| {
let m = opts.parse(&args[1..]).unwrap_or_else(|e| {
println!("failed to parse options: {}", e);
usage(1, opts);
});
if m.opt_present("h") {
usage(0, opts);
}
return m
};
let cwd = t!(env::current_dir());
let remaining_as_path = |m: &Matches| {
m.free.iter().map(|p| cwd.join(p)).collect::<Vec<_>>()
};
let m: Matches;
let cmd = match &args[0][..] {
"build" => { "build" => {
m = parse(&opts); Subcommand::Build { paths: paths }
Subcommand::Build { paths: remaining_as_path(&m) }
}
"doc" => {
m = parse(&opts);
Subcommand::Doc { paths: remaining_as_path(&m) }
} }
"test" => { "test" => {
opts.optmulti("", "test-args", "extra arguments", "ARGS");
m = parse(&opts);
Subcommand::Test { Subcommand::Test {
paths: remaining_as_path(&m), paths: paths,
test_args: m.opt_strs("test-args"), test_args: matches.opt_strs("test-args"),
} }
} }
"bench" => { "bench" => {
opts.optmulti("", "test-args", "extra arguments", "ARGS");
m = parse(&opts);
Subcommand::Bench { Subcommand::Bench {
paths: remaining_as_path(&m), paths: paths,
test_args: m.opt_strs("test-args"), test_args: matches.opt_strs("test-args"),
} }
} }
"doc" => {
Subcommand::Doc { paths: paths }
}
"clean" => { "clean" => {
m = parse(&opts); if paths.len() > 0 {
if m.free.len() > 0 { println!("\nclean takes no arguments\n");
println!("clean takes no arguments"); usage(1, &opts, &subcommand_help, &extra_help);
usage(1, &opts);
} }
Subcommand::Clean Subcommand::Clean
} }
"dist" => { "dist" => {
opts.optflag("", "install", "run installer as well");
m = parse(&opts);
Subcommand::Dist { Subcommand::Dist {
paths: remaining_as_path(&m), paths: paths,
install: m.opt_present("install"), install: matches.opt_present("install"),
} }
} }
"--help" => usage(0, &opts), _ => {
cmd => { usage(1, &opts, &subcommand_help, &extra_help);
println!("unknown command: {}", cmd);
usage(1, &opts);
} }
}; };
let cfg_file = m.opt_str("config").map(PathBuf::from).or_else(|| { let cfg_file = matches.opt_str("config").map(PathBuf::from).or_else(|| {
if fs::metadata("config.toml").is_ok() { if fs::metadata("config.toml").is_ok() {
Some(PathBuf::from("config.toml")) Some(PathBuf::from("config.toml"))
} else { } else {
@ -272,31 +298,29 @@ To learn more about a subcommand, run `./x.py <command> -h`
} }
}); });
let mut stage = m.opt_str("stage").map(|j| j.parse().unwrap()); let mut stage = matches.opt_str("stage").map(|j| j.parse().unwrap());
let incremental = m.opt_present("i"); if matches.opt_present("incremental") {
if incremental {
if stage.is_none() { if stage.is_none() {
stage = Some(1); stage = Some(1);
} }
} }
Flags { Flags {
verbose: m.opt_count("v"), verbose: matches.opt_count("verbose"),
stage: stage, stage: stage,
on_fail: m.opt_str("on-fail"), on_fail: matches.opt_str("on-fail"),
keep_stage: m.opt_str("keep-stage").map(|j| j.parse().unwrap()), keep_stage: matches.opt_str("keep-stage").map(|j| j.parse().unwrap()),
build: m.opt_str("build").unwrap_or_else(|| { build: matches.opt_str("build").unwrap_or_else(|| {
env::var("BUILD").unwrap() env::var("BUILD").unwrap()
}), }),
host: split(m.opt_strs("host")), host: split(matches.opt_strs("host")),
target: split(m.opt_strs("target")), target: split(matches.opt_strs("target")),
config: cfg_file, config: cfg_file,
src: m.opt_str("src").map(PathBuf::from), src: matches.opt_str("src").map(PathBuf::from),
jobs: m.opt_str("jobs").map(|j| j.parse().unwrap()), jobs: matches.opt_str("jobs").map(|j| j.parse().unwrap()),
cmd: cmd, cmd: cmd,
incremental: incremental, incremental: matches.opt_present("incremental"),
} }
} }
} }

View File

@ -49,8 +49,12 @@ pub fn install(build: &Build, stage: u32, host: &str) {
install_sh(&build, "docs", "rust-docs", stage, host, &prefix, install_sh(&build, "docs", "rust-docs", stage, host, &prefix,
&docdir, &libdir, &mandir, &empty_dir); &docdir, &libdir, &mandir, &empty_dir);
} }
install_sh(&build, "std", "rust-std", stage, host, &prefix,
for target in build.config.target.iter() {
install_sh(&build, "std", "rust-std", stage, target, &prefix,
&docdir, &libdir, &mandir, &empty_dir); &docdir, &libdir, &mandir, &empty_dir);
}
install_sh(&build, "rustc", "rustc", stage, host, &prefix, install_sh(&build, "rustc", "rustc", stage, host, &prefix,
&docdir, &libdir, &mandir, &empty_dir); &docdir, &libdir, &mandir, &empty_dir);
t!(fs::remove_dir_all(&empty_dir)); t!(fs::remove_dir_all(&empty_dir));

View File

@ -151,6 +151,7 @@ pub struct Build {
out: PathBuf, out: PathBuf,
rust_info: channel::GitInfo, rust_info: channel::GitInfo,
cargo_info: channel::GitInfo, cargo_info: channel::GitInfo,
rls_info: channel::GitInfo,
local_rebuild: bool, local_rebuild: bool,
// Probed tools at runtime // Probed tools at runtime
@ -181,7 +182,7 @@ struct Crate {
/// ///
/// These entries currently correspond to the various output directories of the /// These entries currently correspond to the various output directories of the
/// build system, with each mod generating output in a different directory. /// build system, with each mod generating output in a different directory.
#[derive(Clone, Copy, PartialEq)] #[derive(Clone, Copy, PartialEq, Eq)]
pub enum Mode { pub enum Mode {
/// This cargo is going to build the standard library, placing output in the /// This cargo is going to build the standard library, placing output in the
/// "stageN-std" directory. /// "stageN-std" directory.
@ -234,6 +235,7 @@ impl Build {
}; };
let rust_info = channel::GitInfo::new(&src); let rust_info = channel::GitInfo::new(&src);
let cargo_info = channel::GitInfo::new(&src.join("cargo")); let cargo_info = channel::GitInfo::new(&src.join("cargo"));
let rls_info = channel::GitInfo::new(&src.join("rls"));
let src_is_git = src.join(".git").exists(); let src_is_git = src.join(".git").exists();
Build { Build {
@ -246,6 +248,7 @@ impl Build {
rust_info: rust_info, rust_info: rust_info,
cargo_info: cargo_info, cargo_info: cargo_info,
rls_info: rls_info,
local_rebuild: local_rebuild, local_rebuild: local_rebuild,
cc: HashMap::new(), cc: HashMap::new(),
cxx: HashMap::new(), cxx: HashMap::new(),
@ -496,7 +499,7 @@ impl Build {
// For other crates, however, we know that we've already got a standard // For other crates, however, we know that we've already got a standard
// library up and running, so we can use the normal compiler to compile // library up and running, so we can use the normal compiler to compile
// build scripts in that situation. // build scripts in that situation.
if let Mode::Libstd = mode { if mode == Mode::Libstd {
cargo.env("RUSTC_SNAPSHOT", &self.rustc) cargo.env("RUSTC_SNAPSHOT", &self.rustc)
.env("RUSTC_SNAPSHOT_LIBDIR", self.rustc_snapshot_libdir()); .env("RUSTC_SNAPSHOT_LIBDIR", self.rustc_snapshot_libdir());
} else { } else {
@ -504,6 +507,27 @@ impl Build {
.env("RUSTC_SNAPSHOT_LIBDIR", self.rustc_libdir(compiler)); .env("RUSTC_SNAPSHOT_LIBDIR", self.rustc_libdir(compiler));
} }
// There are two invariants we must maintain:
// * stable crates cannot depend on unstable crates (general Rust rule),
// * crates that end up in the sysroot must be unstable (rustbuild rule).
//
// In order to enforce the latter, we pass the env var
// `RUSTBUILD_UNSTABLE` down the line for any crates which will end up
// in the sysroot. We read this in bootstrap/bin/rustc.rs and if it is
// set, then we pass the `rustbuild` feature to rustc when building
// the crate.
//
// In turn, crates that can be used here should recognise the `rustbuild`
// feature and opt-in to `rustc_private`.
//
// We can't always pass `rustbuild` because crates which are outside of
// the compiler, libs, and tests are stable and we don't want to make
// their deps unstable (since this would break the first invariant
// above).
if mode != Mode::Tool {
cargo.env("RUSTBUILD_UNSTABLE", "1");
}
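A rough sketch of the env-var handoff the comment describes. The helper name and the exact flag emitted here are illustrative assumptions; the real decision is made in the bootstrap rustc wrapper:

```rust
use std::env;

// Hypothetical wrapper-side logic: if the build system flagged this
// crate as sysroot-bound via RUSTBUILD_UNSTABLE, enable the `rustbuild`
// feature so the crate can opt in to `rustc_private`. The exact flag
// shape is an illustration, not the wrapper's literal output.
fn extra_rustc_args(unstable: bool) -> Vec<String> {
    if unstable {
        vec!["--cfg".to_string(), "feature=\"rustbuild\"".to_string()]
    } else {
        Vec::new()
    }
}

fn main() {
    // In the real wrapper this flag comes from the environment.
    let unstable = env::var_os("RUSTBUILD_UNSTABLE").is_some();
    println!("{} extra args", extra_rustc_args(unstable).len());
}
```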
// Ignore incremental modes except for stage0, since we're // Ignore incremental modes except for stage0, since we're
// not guaranteeing correctness across builds if the compiler // not guaranteeing correctness across builds if the compiler
// is changing under your feet. // is changing under your feet.
@ -529,7 +553,7 @@ impl Build {
.env(format!("CFLAGS_{}", target), self.cflags(target).join(" ")); .env(format!("CFLAGS_{}", target), self.cflags(target).join(" "));
} }
if self.config.channel == "nightly" && compiler.is_final_stage(self) { if self.config.extended && compiler.is_final_stage(self) {
cargo.env("RUSTC_SAVE_ANALYSIS", "api".to_string()); cargo.env("RUSTC_SAVE_ANALYSIS", "api".to_string());
} }
@ -851,7 +875,7 @@ impl Build {
.filter(|s| !s.starts_with("-O") && !s.starts_with("/O")) .filter(|s| !s.starts_with("-O") && !s.starts_with("/O"))
.collect::<Vec<_>>(); .collect::<Vec<_>>();
// If we're compiling on OSX then we add a few unconditional flags // If we're compiling on macOS then we add a few unconditional flags
// indicating that we want libc++ (more filled out than libstdc++) and // indicating that we want libc++ (more filled out than libstdc++) and
// we want to compile for 10.7. This way we can ensure that // we want to compile for 10.7. This way we can ensure that
// LLVM/jemalloc/etc are all properly compiled. // LLVM/jemalloc/etc are all properly compiled.
@ -1001,7 +1025,7 @@ impl Build {
/// Returns the value of `package_vers` above for Cargo /// Returns the value of `package_vers` above for Cargo
fn cargo_package_vers(&self) -> String { fn cargo_package_vers(&self) -> String {
self.package_vers(&self.cargo_release_num()) self.package_vers(&self.release_num("cargo"))
} }
/// Returns the `version` string associated with this compiler for Rust /// Returns the `version` string associated with this compiler for Rust
@ -1013,10 +1037,11 @@ impl Build {
self.rust_info.version(self, channel::CFG_RELEASE_NUM) self.rust_info.version(self, channel::CFG_RELEASE_NUM)
} }
/// Returns the `a.b.c` version that Cargo is at. /// Returns the `a.b.c` version that the given package is at.
fn cargo_release_num(&self) -> String { fn release_num(&self, package: &str) -> String {
let mut toml = String::new(); let mut toml = String::new();
t!(t!(File::open(self.src.join("cargo/Cargo.toml"))).read_to_string(&mut toml)); let toml_file_name = self.src.join(&format!("{}/Cargo.toml", package));
t!(t!(File::open(toml_file_name)).read_to_string(&mut toml));
for line in toml.lines() { for line in toml.lines() {
let prefix = "version = \""; let prefix = "version = \"";
let suffix = "\""; let suffix = "\"";
@ -1025,7 +1050,7 @@ impl Build {
} }
} }
panic!("failed to find version in cargo's Cargo.toml") panic!("failed to find version in {}'s Cargo.toml", package)
} }
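The `release_num` scan above can be exercised standalone. This is a hedged toy version that works on TOML text directly (the real code also opens the file and panics, rather than returning `None`, when no version is found):

```rust
// Toy version of the prefix/suffix scan above: find the first line of
// the form `version = "..."` and return the string between the quotes.
fn scrape_version(toml: &str) -> Option<&str> {
    let prefix = "version = \"";
    for line in toml.lines() {
        let line = line.trim();
        if line.starts_with(prefix) && line.ends_with('"') {
            return Some(&line[prefix.len()..line.len() - 1]);
        }
    }
    None
}

fn main() {
    let toml = "[package]\nname = \"cargo\"\nversion = \"0.19.0\"\n";
    assert_eq!(scrape_version(toml), Some("0.19.0"));
    println!("ok");
}
```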
/// Returns whether unstable features should be enabled for the compiler /// Returns whether unstable features should be enabled for the compiler

View File

@ -18,6 +18,7 @@
//! LLVM and compiler-rt are essentially just wired up to everything else to //! LLVM and compiler-rt are essentially just wired up to everything else to
//! ensure that they're always in place if needed. //! ensure that they're always in place if needed.
use std::env;
use std::fs::{self, File}; use std::fs::{self, File};
use std::io::{Read, Write}; use std::io::{Read, Write};
use std::path::Path; use std::path::Path;
@ -145,6 +146,10 @@ pub fn llvm(build: &Build, target: &str) {
cfg.define("CMAKE_CXX_FLAGS", build.cflags(target).join(" ")); cfg.define("CMAKE_CXX_FLAGS", build.cflags(target).join(" "));
} }
if env::var_os("SCCACHE_ERROR_LOG").is_some() {
cfg.env("RUST_LOG", "sccache=info");
}
// FIXME: we don't actually need to build all LLVM tools and all LLVM // FIXME: we don't actually need to build all LLVM tools and all LLVM
// libraries here, e.g. we just want a few components and a few // libraries here, e.g. we just want a few components and a few
// tools. Figure out how to filter them down and only build the right // tools. Figure out how to filter them down and only build the right
@ -222,9 +227,24 @@ pub fn openssl(build: &Build, target: &str) {
let tarball = out.join(&name); let tarball = out.join(&name);
if !tarball.exists() { if !tarball.exists() {
let tmp = tarball.with_extension("tmp"); let tmp = tarball.with_extension("tmp");
build.run(Command::new("curl") // originally from https://www.openssl.org/source/...
let url = format!("https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/{}",
name);
let mut ok = false;
for _ in 0..3 {
let status = Command::new("curl")
.arg("-o").arg(&tmp) .arg("-o").arg(&tmp)
.arg(format!("https://www.openssl.org/source/{}", name))); .arg(&url)
.status()
.expect("failed to spawn curl");
if status.success() {
ok = true;
break
}
}
if !ok {
panic!("failed to download openssl source")
}
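The download loop above is an instance of a simple bounded-retry pattern. A generic hedged sketch (illustrative helper, not part of rustbuild):

```rust
// Attempt a fallible step up to `attempts` times, stopping at the
// first success; returns whether any attempt succeeded. The curl loop
// above is this pattern with attempts = 3 and a panic on failure.
fn retry<F: FnMut() -> bool>(attempts: u32, mut step: F) -> bool {
    for _ in 0..attempts {
        if step() {
            return true;
        }
    }
    false
}

fn main() {
    // Simulate a download that fails twice, then succeeds.
    let mut calls = 0;
    let ok = retry(3, || {
        calls += 1;
        calls == 3
    });
    assert!(ok);
    assert_eq!(calls, 3);
    println!("ok");
}
```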
let mut shasum = if target.contains("apple") { let mut shasum = if target.contains("apple") {
let mut cmd = Command::new("shasum"); let mut cmd = Command::new("shasum");
cmd.arg("-a").arg("256"); cmd.arg("-a").arg("256");
@ -286,7 +306,7 @@ pub fn openssl(build: &Build, target: &str) {
println!("Configuring openssl for {}", target); println!("Configuring openssl for {}", target);
build.run_quiet(&mut configure); build.run_quiet(&mut configure);
println!("Building openssl for {}", target); println!("Building openssl for {}", target);
build.run_quiet(Command::new("make").current_dir(&obj)); build.run_quiet(Command::new("make").arg("-j1").current_dir(&obj));
println!("Installing openssl for {}", target); println!("Installing openssl for {}", target);
build.run_quiet(Command::new("make").arg("install").current_dir(&obj)); build.run_quiet(Command::new("make").arg("install").current_dir(&obj));

View File

@ -151,10 +151,10 @@ pub fn check(build: &mut Build) {
} }
for target in build.config.target.iter() { for target in build.config.target.iter() {
// Can't compile for iOS unless we're on OSX // Can't compile for iOS unless we're on macOS
if target.contains("apple-ios") && if target.contains("apple-ios") &&
!build.config.build.contains("apple-darwin") { !build.config.build.contains("apple-darwin") {
panic!("the iOS target is only supported on OSX"); panic!("the iOS target is only supported on macOS");
} }
// Make sure musl-root is valid if specified // Make sure musl-root is valid if specified

View File

@ -26,7 +26,7 @@
//! along with the actual implementation elsewhere. You can find more comments //! along with the actual implementation elsewhere. You can find more comments
//! about how to define rules themselves below. //! about how to define rules themselves below.
use std::collections::{BTreeMap, HashSet}; use std::collections::{BTreeMap, HashSet, HashMap};
use std::mem; use std::mem;
use check::{self, TestKind}; use check::{self, TestKind};
@ -137,7 +137,9 @@ pub fn build_rules<'a>(build: &'a Build) -> Rules {
while let Some(krate) = list.pop() { while let Some(krate) = list.pop() {
let default = krate == name; let default = krate == name;
let krate = &build.crates[krate]; let krate = &build.crates[krate];
let path = krate.path.strip_prefix(&build.src).unwrap(); let path = krate.path.strip_prefix(&build.src)
// This handles out of tree paths
.unwrap_or(&krate.path);
ret.push((krate, path.to_str().unwrap(), default)); ret.push((krate, path.to_str().unwrap(), default));
for dep in krate.deps.iter() { for dep in krate.deps.iter() {
if visited.insert(dep) && dep != "build_helper" { if visited.insert(dep) && dep != "build_helper" {
@ -533,34 +535,44 @@ pub fn build_rules<'a>(build: &'a Build) -> Rules {
// //
// Tools used during the build system but not shipped // Tools used during the build system but not shipped
rules.build("tool-rustbook", "src/tools/rustbook") rules.build("tool-rustbook", "src/tools/rustbook")
.dep(|s| s.name("librustc")) .dep(|s| s.name("maybe-clean-tools"))
.dep(|s| s.name("librustc-tool"))
.run(move |s| compile::tool(build, s.stage, s.target, "rustbook")); .run(move |s| compile::tool(build, s.stage, s.target, "rustbook"));
rules.build("tool-error-index", "src/tools/error_index_generator") rules.build("tool-error-index", "src/tools/error_index_generator")
.dep(|s| s.name("librustc")) .dep(|s| s.name("maybe-clean-tools"))
.dep(|s| s.name("librustc-tool"))
.run(move |s| compile::tool(build, s.stage, s.target, "error_index_generator")); .run(move |s| compile::tool(build, s.stage, s.target, "error_index_generator"));
rules.build("tool-tidy", "src/tools/tidy") rules.build("tool-tidy", "src/tools/tidy")
.dep(|s| s.name("libstd")) .dep(|s| s.name("maybe-clean-tools"))
.dep(|s| s.name("libstd-tool"))
.run(move |s| compile::tool(build, s.stage, s.target, "tidy")); .run(move |s| compile::tool(build, s.stage, s.target, "tidy"));
rules.build("tool-linkchecker", "src/tools/linkchecker") rules.build("tool-linkchecker", "src/tools/linkchecker")
.dep(|s| s.name("libstd")) .dep(|s| s.name("maybe-clean-tools"))
.dep(|s| s.name("libstd-tool"))
.run(move |s| compile::tool(build, s.stage, s.target, "linkchecker")); .run(move |s| compile::tool(build, s.stage, s.target, "linkchecker"));
rules.build("tool-cargotest", "src/tools/cargotest") rules.build("tool-cargotest", "src/tools/cargotest")
.dep(|s| s.name("libstd")) .dep(|s| s.name("maybe-clean-tools"))
.dep(|s| s.name("libstd-tool"))
.run(move |s| compile::tool(build, s.stage, s.target, "cargotest")); .run(move |s| compile::tool(build, s.stage, s.target, "cargotest"));
rules.build("tool-compiletest", "src/tools/compiletest") rules.build("tool-compiletest", "src/tools/compiletest")
.dep(|s| s.name("libtest")) .dep(|s| s.name("maybe-clean-tools"))
.dep(|s| s.name("libtest-tool"))
.run(move |s| compile::tool(build, s.stage, s.target, "compiletest")); .run(move |s| compile::tool(build, s.stage, s.target, "compiletest"));
rules.build("tool-build-manifest", "src/tools/build-manifest") rules.build("tool-build-manifest", "src/tools/build-manifest")
.dep(|s| s.name("libstd")) .dep(|s| s.name("maybe-clean-tools"))
.dep(|s| s.name("libstd-tool"))
.run(move |s| compile::tool(build, s.stage, s.target, "build-manifest")); .run(move |s| compile::tool(build, s.stage, s.target, "build-manifest"));
rules.build("tool-qemu-test-server", "src/tools/qemu-test-server") rules.build("tool-qemu-test-server", "src/tools/qemu-test-server")
.dep(|s| s.name("libstd")) .dep(|s| s.name("maybe-clean-tools"))
.dep(|s| s.name("libstd-tool"))
.run(move |s| compile::tool(build, s.stage, s.target, "qemu-test-server")); .run(move |s| compile::tool(build, s.stage, s.target, "qemu-test-server"));
rules.build("tool-qemu-test-client", "src/tools/qemu-test-client") rules.build("tool-qemu-test-client", "src/tools/qemu-test-client")
.dep(|s| s.name("libstd")) .dep(|s| s.name("maybe-clean-tools"))
.dep(|s| s.name("libstd-tool"))
.run(move |s| compile::tool(build, s.stage, s.target, "qemu-test-client")); .run(move |s| compile::tool(build, s.stage, s.target, "qemu-test-client"));
rules.build("tool-cargo", "cargo") rules.build("tool-cargo", "cargo")
.dep(|s| s.name("libstd")) .dep(|s| s.name("maybe-clean-tools"))
.dep(|s| s.name("libstd-tool"))
.dep(|s| s.stage(0).host(s.target).name("openssl")) .dep(|s| s.stage(0).host(s.target).name("openssl"))
.dep(move |s| { .dep(move |s| {
// Cargo depends on procedural macros, which requires a full host // Cargo depends on procedural macros, which requires a full host
@ -570,6 +582,36 @@ pub fn build_rules<'a>(build: &'a Build) -> Rules {
.host(&build.config.build) .host(&build.config.build)
}) })
.run(move |s| compile::tool(build, s.stage, s.target, "cargo")); .run(move |s| compile::tool(build, s.stage, s.target, "cargo"));
rules.build("tool-rls", "rls")
.host(true)
.dep(|s| s.name("librustc-tool"))
.dep(|s| s.stage(0).host(s.target).name("openssl"))
.dep(move |s| {
// rls, like cargo, uses procedural macros
s.name("librustc-link")
.target(&build.config.build)
.host(&build.config.build)
})
.run(move |s| compile::tool(build, s.stage, s.target, "rls"));
// "pseudo rule" which represents completely cleaning out the tools dir in
// one stage. This needs to happen whenever a dependency changes (e.g.
// libstd, libtest, librustc) and all of the tool compilations above will
// be sequenced after this rule.
rules.build("maybe-clean-tools", "path/to/nowhere")
.after("librustc-tool")
.after("libtest-tool")
.after("libstd-tool");
rules.build("librustc-tool", "path/to/nowhere")
.dep(|s| s.name("librustc"))
.run(move |s| compile::maybe_clean_tools(build, s.stage, s.target, Mode::Librustc));
rules.build("libtest-tool", "path/to/nowhere")
.dep(|s| s.name("libtest"))
.run(move |s| compile::maybe_clean_tools(build, s.stage, s.target, Mode::Libtest));
rules.build("libstd-tool", "path/to/nowhere")
.dep(|s| s.name("libstd"))
.run(move |s| compile::maybe_clean_tools(build, s.stage, s.target, Mode::Libstd));
// ======================================================================== // ========================================================================
// Documentation targets // Documentation targets
@ -581,7 +623,7 @@ pub fn build_rules<'a>(build: &'a Build) -> Rules {
.stage(0) .stage(0)
}) })
.default(build.config.docs) .default(build.config.docs)
.run(move |s| doc::rustbook(build, s.target, "book")); .run(move |s| doc::book(build, s.target, "book"));
rules.doc("doc-nomicon", "src/doc/nomicon") rules.doc("doc-nomicon", "src/doc/nomicon")
.dep(move |s| { .dep(move |s| {
s.name("tool-rustbook") s.name("tool-rustbook")
@ -690,10 +732,15 @@ pub fn build_rules<'a>(build: &'a Build) -> Rules {
.dep(|s| s.name("default:doc")) .dep(|s| s.name("default:doc"))
.run(move |s| dist::docs(build, s.stage, s.target)); .run(move |s| dist::docs(build, s.stage, s.target));
rules.dist("dist-analysis", "analysis") rules.dist("dist-analysis", "analysis")
.default(build.config.extended)
.dep(|s| s.name("dist-std")) .dep(|s| s.name("dist-std"))
.default(true)
.only_host_build(true) .only_host_build(true)
.run(move |s| dist::analysis(build, &s.compiler(), s.target)); .run(move |s| dist::analysis(build, &s.compiler(), s.target));
rules.dist("dist-rls", "rls")
.host(true)
.only_host_build(true)
.dep(|s| s.name("tool-rls"))
.run(move |s| dist::rls(build, s.stage, s.target));
rules.dist("install", "path/to/nowhere") rules.dist("install", "path/to/nowhere")
.dep(|s| s.name("default:dist")) .dep(|s| s.name("default:dist"))
.run(move |s| install::install(build, s.stage, s.target)); .run(move |s| install::install(build, s.stage, s.target));
@ -711,6 +758,8 @@ pub fn build_rules<'a>(build: &'a Build) -> Rules {
.dep(|d| d.name("dist-mingw")) .dep(|d| d.name("dist-mingw"))
.dep(|d| d.name("dist-docs")) .dep(|d| d.name("dist-docs"))
.dep(|d| d.name("dist-cargo")) .dep(|d| d.name("dist-cargo"))
.dep(|d| d.name("dist-rls"))
.dep(|d| d.name("dist-analysis"))
.run(move |s| dist::extended(build, s.stage, s.target)); .run(move |s| dist::extended(build, s.stage, s.target));
rules.dist("dist-sign", "hash-and-sign") rules.dist("dist-sign", "hash-and-sign")
@ -811,6 +860,11 @@ struct Rule<'a> {
/// Whether this rule is only for the build triple, not anything in hosts or /// Whether this rule is only for the build triple, not anything in hosts or
/// targets. /// targets.
only_build: bool, only_build: bool,
/// A list of "order only" dependencies. This rule does not actually
/// depend on these rules, but if they show up in the dependency graph then
/// this rule must be executed after all these rules.
after: Vec<&'a str>,
} }
#[derive(PartialEq)] #[derive(PartialEq)]
@ -834,6 +888,7 @@ impl<'a> Rule<'a> {
host: false, host: false,
only_host_build: false, only_host_build: false,
only_build: false, only_build: false,
after: Vec::new(),
} }
} }
} }
@ -853,6 +908,11 @@ impl<'a, 'b> RuleBuilder<'a, 'b> {
self self
} }
fn after(&mut self, step: &'a str) -> &mut Self {
self.rule.after.push(step);
self
}
fn run<F>(&mut self, f: F) -> &mut Self fn run<F>(&mut self, f: F) -> &mut Self
where F: Fn(&Step<'a>) + 'a, where F: Fn(&Step<'a>) + 'a,
{ {
@ -978,26 +1038,25 @@ invalid rule dependency graph detected, was a rule added and maybe typo'd?
} }
} }
pub fn print_help(&self, command: &str) { pub fn get_help(&self, command: &str) -> Option<String> {
let kind = match command { let kind = match command {
"build" => Kind::Build, "build" => Kind::Build,
"doc" => Kind::Doc, "doc" => Kind::Doc,
"test" => Kind::Test, "test" => Kind::Test,
"bench" => Kind::Bench, "bench" => Kind::Bench,
"dist" => Kind::Dist, "dist" => Kind::Dist,
_ => return, _ => return None,
}; };
let rules = self.rules.values().filter(|r| r.kind == kind); let rules = self.rules.values().filter(|r| r.kind == kind);
let rules = rules.filter(|r| !r.path.contains("nowhere")); let rules = rules.filter(|r| !r.path.contains("nowhere"));
let mut rules = rules.collect::<Vec<_>>(); let mut rules = rules.collect::<Vec<_>>();
rules.sort_by_key(|r| r.path); rules.sort_by_key(|r| r.path);
println!("Available paths:\n"); let mut help_string = String::from("Available paths:\n");
for rule in rules { for rule in rules {
print!(" ./x.py {} {}", command, rule.path); help_string.push_str(format!(" ./x.py {} {}\n", command, rule.path).as_str());
println!("");
} }
Some(help_string)
} }
/// Construct the top-level build steps that we're going to be executing, /// Construct the top-level build steps that we're going to be executing,
@ -1137,31 +1196,52 @@ invalid rule dependency graph detected, was a rule added and maybe typo'd?
     /// From the top level targets `steps` generate a topological ordering of
     /// all steps needed to run those steps.
     fn expand(&self, steps: &[Step<'a>]) -> Vec<Step<'a>> {
+        // First up build a graph of steps and their dependencies. The `nodes`
+        // map is a map from step to a unique number. The `edges` map is a
+        // map from these unique numbers to a list of other numbers,
+        // representing dependencies.
+        let mut nodes = HashMap::new();
+        nodes.insert(Step::noop(), 0);
+        let mut edges = HashMap::new();
+        edges.insert(0, HashSet::new());
+        for step in steps {
+            self.build_graph(step.clone(), &mut nodes, &mut edges);
+        }
+
+        // Now that we've built up the actual dependency graph, draw more
+        // dependency edges to satisfy the `after` dependencies field for each
+        // rule.
+        self.satisfy_after_deps(&nodes, &mut edges);
+
+        // And finally, perform a topological sort to return a list of steps to
+        // execute.
         let mut order = Vec::new();
-        let mut added = HashSet::new();
-        added.insert(Step::noop());
-        for step in steps.iter().cloned() {
-            self.fill(step, &mut order, &mut added);
+        let mut visited = HashSet::new();
+        visited.insert(0);
+        let idx_to_node = nodes.iter().map(|p| (*p.1, p.0)).collect::<HashMap<_, _>>();
+        for idx in 0..nodes.len() {
+            self.topo_sort(idx, &idx_to_node, &edges, &mut visited, &mut order);
         }
         return order
     }
-    /// Performs topological sort of dependencies rooted at the `step`
-    /// specified, pushing all results onto the `order` vector provided.
+    /// Builds the dependency graph rooted at `step`.
     ///
-    /// In other words, when this method returns, the `order` vector will
-    /// contain a list of steps which if executed in order will eventually
-    /// complete the `step` specified as well.
-    ///
-    /// The `added` set specified here is the set of steps that are already
-    /// present in `order` (and hence don't need to be added again).
-    fn fill(&self,
-            step: Step<'a>,
-            order: &mut Vec<Step<'a>>,
-            added: &mut HashSet<Step<'a>>) {
-        if !added.insert(step.clone()) {
-            return
+    /// The `nodes` and `edges` maps are filled out according to the rule
+    /// described by `step.name`.
+    fn build_graph(&self,
+                   step: Step<'a>,
+                   nodes: &mut HashMap<Step<'a>, usize>,
+                   edges: &mut HashMap<usize, HashSet<usize>>) -> usize {
+        use std::collections::hash_map::Entry;
+
+        let idx = nodes.len();
+        match nodes.entry(step.clone()) {
+            Entry::Vacant(e) => { e.insert(idx); }
+            Entry::Occupied(e) => return *e.get(),
         }
+
+        let mut deps = Vec::new();
         for dep in self.rules[step.name].deps.iter() {
             let dep = dep(&step);
             if dep.name.starts_with("default:") {
@@ -1173,13 +1253,61 @@ invalid rule dependency graph detected, was a rule added and maybe typo'd?
                 let host = self.build.config.host.iter().any(|h| h == dep.target);
                 let rules = self.rules.values().filter(|r| r.default);
                 for rule in rules.filter(|r| r.kind == kind && (!r.host || host)) {
-                    self.fill(dep.name(rule.name), order, added);
+                    deps.push(self.build_graph(dep.name(rule.name), nodes, edges));
                 }
             } else {
-                self.fill(dep, order, added);
+                deps.push(self.build_graph(dep, nodes, edges));
             }
         }
-        order.push(step);
+
+        edges.entry(idx).or_insert(HashSet::new()).extend(deps);
+        return idx
     }
+
+    /// Given a dependency graph with a finished list of `nodes`, fill out more
+    /// dependency `edges`.
+    ///
+    /// This is the step which satisfies all `after` listed dependencies in
+    /// `Rule` above.
+    fn satisfy_after_deps(&self,
+                          nodes: &HashMap<Step<'a>, usize>,
+                          edges: &mut HashMap<usize, HashSet<usize>>) {
+        // Reverse map from the name of a step to the node indices that it
+        // appears at.
+        let mut name_to_idx = HashMap::new();
+        for (step, &idx) in nodes {
+            name_to_idx.entry(step.name).or_insert(Vec::new()).push(idx);
+        }
+
+        for (step, idx) in nodes {
+            if *step == Step::noop() {
+                continue
+            }
+            for after in self.rules[step.name].after.iter() {
+                // This is the critical piece of an `after` dependency. If the
+                // dependency isn't actually in our graph then no edge is drawn,
+                // only if it's already present do we draw the edges.
+                if let Some(idxs) = name_to_idx.get(after) {
+                    edges.get_mut(idx).unwrap()
+                         .extend(idxs.iter().cloned());
+                }
+            }
+        }
+    }
+
+    fn topo_sort(&self,
+                 cur: usize,
+                 nodes: &HashMap<usize, &Step<'a>>,
+                 edges: &HashMap<usize, HashSet<usize>>,
+                 visited: &mut HashSet<usize>,
+                 order: &mut Vec<Step<'a>>) {
+        if !visited.insert(cur) {
+            return
+        }
+        for dep in edges[&cur].iter() {
+            self.topo_sort(*dep, nodes, edges, visited, order);
+        }
+        order.push(nodes[&cur].clone());
+    }
 }
@@ -152,18 +152,13 @@ For targets: `powerpc-unknown-linux-gnu`
 - Path and misc options > Patches origin = Bundled, then local
 - Path and misc options > Local patch directory = /tmp/patches
 - Target options > Target Architecture = powerpc
-- Target options > Emit assembly for CPU = power4 -- (+)
-- Target options > Tune for CPU = power6 -- (+)
+- Target options > Emit assembly for CPU = powerpc -- pure 32-bit PowerPC
 - Operating System > Target OS = linux
 - Operating System > Linux kernel version = 2.6.32.68 -- ~RHEL6 kernel
 - C-library > glibc version = 2.12.2 -- ~RHEL6 glibc
 - C compiler > gcc version = 4.9.3
-- C compiler > Core gcc extra config = --with-cpu-32=power4 --with-cpu=default32 -- (+)
-- C compiler > gcc extra config = --with-cpu-32=power4 --with-cpu=default32 -- (+)
 - C compiler > C++ = ENABLE -- to cross compile LLVM
-
-(+) These CPU options match the configuration of the toolchains in RHEL6.

 ## `powerpc64-linux-gnu.config`

 For targets: `powerpc64-unknown-linux-gnu`
@@ -13,7 +13,7 @@ RUN dpkg --add-architecture i386 && \
   cmake \
   unzip \
   expect \
-  openjdk-9-jre \
+  openjdk-9-jre-headless \
   sudo \
   libstdc++6:i386 \
   xz-utils \
@@ -10,7 +10,9 @@
 # except according to those terms.

 set -ex

-ANDROID_EMULATOR_FORCE_32BIT=true \
-  nohup nohup emulator @arm-18 -no-window -partition-size 2047 \
-    0<&- &>/dev/null &
+# Setting SHELL to a file instead of a symlink helps the android
+# emulator identify the system
+export SHELL=/bin/bash
+nohup nohup emulator @arm-18 -no-window -partition-size 2047 0<&- &>/dev/null &
 exec "$@"
@@ -74,6 +74,7 @@ ENV CC_mipsel_unknown_linux_musl=mipsel-openwrt-linux-gcc \
 ENV STAGING_DIR=/tmp

 ENV RUST_CONFIGURE_ARGS \
+      --enable-extended \
       --target=$TARGETS \
       --musl-root-arm=/usr/local/arm-linux-musleabi \
       --musl-root-armhf=/usr/local/arm-linux-musleabihf \
@@ -27,10 +27,6 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
   libssl-dev \
   pkg-config

-RUN curl -o /usr/local/bin/sccache \
-      https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-04-04-sccache-x86_64-unknown-linux-musl && \
-      chmod +x /usr/local/bin/sccache
-
 RUN curl -OL https://github.com/Yelp/dumb-init/releases/download/v1.2.0/dumb-init_1.2.0_amd64.deb && \
     dpkg -i dumb-init_*.deb && \
     rm dumb-init_*.deb
@@ -60,24 +56,22 @@ RUN mkdir /x-tools && chown rustbuild:rustbuild /x-tools
 USER rustbuild
 WORKDIR /tmp

-COPY armv7-linux-gnueabihf.config aarch64-linux-gnu.config build-toolchains.sh /tmp/
+COPY aarch64-linux-gnu.config build-toolchains.sh /tmp/
 RUN ./build-toolchains.sh

 USER root

+RUN curl -o /usr/local/bin/sccache \
+      https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-04-04-sccache-x86_64-unknown-linux-musl && \
+      chmod +x /usr/local/bin/sccache
+
 ENV PATH=$PATH:/x-tools/aarch64-unknown-linux-gnueabi/bin
-ENV PATH=$PATH:/x-tools/armv7-unknown-linux-gnueabihf/bin

 ENV CC_aarch64_unknown_linux_gnu=aarch64-unknown-linux-gnueabi-gcc \
     AR_aarch64_unknown_linux_gnu=aarch64-unknown-linux-gnueabi-ar \
-    CXX_aarch64_unknown_linux_gnu=aarch64-unknown-linux-gnueabi-g++ \
-    CC_armv7_unknown_linux_gnueabihf=armv7-unknown-linux-gnueabihf-gcc \
-    AR_armv7_unknown_linux_gnueabihf=armv7-unknown-linux-gnueabihf-ar \
-    CXX_armv7_unknown_linux_gnueabihf=armv7-unknown-linux-gnueabihf-g++
+    CXX_aarch64_unknown_linux_gnu=aarch64-unknown-linux-gnueabi-g++

-ENV HOSTS=armv7-unknown-linux-gnueabihf
-ENV HOSTS=$HOSTS,aarch64-unknown-linux-gnu
+ENV HOSTS=aarch64-unknown-linux-gnu

 ENV RUST_CONFIGURE_ARGS --host=$HOSTS --enable-extended
 ENV SCRIPT python2.7 ../x.py dist --host $HOSTS --target $HOSTS
@@ -0,0 +1,37 @@
#!/bin/bash
# Copyright 2017 The Rust Project Developers. See the COPYRIGHT
# file at the top-level directory of this distribution and at
# http://rust-lang.org/COPYRIGHT.
#
# Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
# http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
# <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
# option. This file may not be copied, modified, or distributed
# except according to those terms.
set -ex
hide_output() {
set +x
on_err="
echo ERROR: An error was encountered with the build.
cat /tmp/build.log
exit 1
"
trap "$on_err" ERR
bash -c "while true; do sleep 30; echo \$(date) - building ...; done" &
PING_LOOP_PID=$!
$@ &> /tmp/build.log
rm /tmp/build.log
trap - ERR
kill $PING_LOOP_PID
set -x
}
mkdir build
cd build
cp ../aarch64-linux-gnu.config .config
ct-ng oldconfig
hide_output ct-ng build
cd ..
rm -rf build
@@ -36,15 +36,18 @@ RUN curl -o /usr/local/bin/sccache \
       chmod +x /usr/local/bin/sccache

 ENV TARGETS=arm-linux-androideabi
+ENV TARGETS=$TARGETS,armv7-linux-androideabi
 ENV TARGETS=$TARGETS,i686-linux-android
 ENV TARGETS=$TARGETS,aarch64-linux-android
-ENV TARGETS=$TARGETS,armv7-linux-androideabi
+ENV TARGETS=$TARGETS,x86_64-linux-android

 ENV RUST_CONFIGURE_ARGS \
       --target=$TARGETS \
+      --enable-extended \
       --arm-linux-androideabi-ndk=/android/ndk-arm-9 \
       --armv7-linux-androideabi-ndk=/android/ndk-arm-9 \
       --i686-linux-android-ndk=/android/ndk-x86-9 \
-      --aarch64-linux-android-ndk=/android/ndk-aarch64
+      --aarch64-linux-android-ndk=/android/ndk-arm64-21 \
+      --x86_64-linux-android-ndk=/android/ndk-x86_64-21

 ENV SCRIPT python2.7 ../x.py dist --target $TARGETS
@@ -25,7 +25,7 @@ bash android-ndk-r11c/build/tools/make-standalone-toolchain.sh \
 bash android-ndk-r11c/build/tools/make-standalone-toolchain.sh \
     --platform=android-21 \
     --toolchain=aarch64-linux-android-4.9 \
-    --install-dir=/android/ndk-aarch64 \
+    --install-dir=/android/ndk-arm64-21 \
     --ndk-dir=/android/android-ndk-r11c \
     --arch=arm64
 bash android-ndk-r11c/build/tools/make-standalone-toolchain.sh \
@@ -34,5 +34,11 @@ bash android-ndk-r11c/build/tools/make-standalone-toolchain.sh \
     --install-dir=/android/ndk-x86-9 \
     --ndk-dir=/android/android-ndk-r11c \
     --arch=x86
+bash android-ndk-r11c/build/tools/make-standalone-toolchain.sh \
+    --platform=android-21 \
+    --toolchain=x86_64-4.9 \
+    --install-dir=/android/ndk-x86_64-21 \
+    --ndk-dir=/android/android-ndk-r11c \
+    --arch=x86_64
 rm -rf ./android-ndk-r11c-linux-x86_64.zip ./android-ndk-r11c
@@ -27,10 +27,6 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
   libssl-dev \
   pkg-config

-RUN curl -o /usr/local/bin/sccache \
-      https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-04-04-sccache-x86_64-unknown-linux-musl && \
-      chmod +x /usr/local/bin/sccache
-
 RUN curl -OL https://github.com/Yelp/dumb-init/releases/download/v1.2.0/dumb-init_1.2.0_amd64.deb && \
     dpkg -i dumb-init_*.deb && \
     rm dumb-init_*.deb
@@ -60,23 +56,22 @@ RUN mkdir /x-tools && chown rustbuild:rustbuild /x-tools
 USER rustbuild
 WORKDIR /tmp

-COPY arm-linux-gnueabihf.config arm-linux-gnueabi.config build-toolchains.sh /tmp/
+COPY arm-linux-gnueabi.config build-toolchains.sh /tmp/
 RUN ./build-toolchains.sh

 USER root

+RUN curl -o /usr/local/bin/sccache \
+      https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-04-04-sccache-x86_64-unknown-linux-musl && \
+      chmod +x /usr/local/bin/sccache
+
 ENV PATH=$PATH:/x-tools/arm-unknown-linux-gnueabi/bin
-ENV PATH=$PATH:/x-tools/arm-unknown-linux-gnueabihf/bin

 ENV CC_arm_unknown_linux_gnueabi=arm-unknown-linux-gnueabi-gcc \
     AR_arm_unknown_linux_gnueabi=arm-unknown-linux-gnueabi-ar \
-    CXX_arm_unknown_linux_gnueabi=arm-unknown-linux-gnueabi-g++ \
-    CC_arm_unknown_linux_gnueabihf=arm-unknown-linux-gnueabihf-gcc \
-    AR_arm_unknown_linux_gnueabihf=arm-unknown-linux-gnueabihf-ar \
-    CXX_arm_unknown_linux_gnueabihf=arm-unknown-linux-gnueabihf-g++
+    CXX_arm_unknown_linux_gnueabi=arm-unknown-linux-gnueabi-g++

 ENV HOSTS=arm-unknown-linux-gnueabi
-ENV HOSTS=$HOSTS,arm-unknown-linux-gnueabihf

 ENV RUST_CONFIGURE_ARGS --host=$HOSTS --enable-extended
 ENV SCRIPT python2.7 ../x.py dist --host $HOSTS --target $HOSTS
@@ -35,11 +35,3 @@ ct-ng oldconfig
 hide_output ct-ng build
 cd ..
 rm -rf build
-
-mkdir build
-cd build
-cp ../arm-linux-gnueabihf.config .config
-ct-ng oldconfig
-hide_output ct-ng build
-cd ..
-rm -rf build
@@ -0,0 +1,77 @@
FROM ubuntu:16.04
RUN apt-get update && apt-get install -y --no-install-recommends \
automake \
bison \
bzip2 \
ca-certificates \
cmake \
curl \
file \
flex \
g++ \
gawk \
gdb \
git \
gperf \
help2man \
libncurses-dev \
libtool-bin \
make \
patch \
python2.7 \
sudo \
texinfo \
wget \
xz-utils \
libssl-dev \
pkg-config
RUN curl -OL https://github.com/Yelp/dumb-init/releases/download/v1.2.0/dumb-init_1.2.0_amd64.deb && \
dpkg -i dumb-init_*.deb && \
rm dumb-init_*.deb
ENTRYPOINT ["/usr/bin/dumb-init", "--"]
# Ubuntu 16.04 (this container) ships with make 4, but something in the
# toolchains we build below chokes on that, so go back to make 3
RUN curl https://ftp.gnu.org/gnu/make/make-3.81.tar.gz | tar xzf - && \
cd make-3.81 && \
./configure --prefix=/usr && \
make && \
make install && \
cd .. && \
rm -rf make-3.81
RUN curl http://crosstool-ng.org/download/crosstool-ng/crosstool-ng-1.22.0.tar.bz2 | \
tar xjf - && \
cd crosstool-ng && \
./configure --prefix=/usr/local && \
make -j$(nproc) && \
make install && \
cd .. && \
rm -rf crosstool-ng
RUN groupadd -r rustbuild && useradd -m -r -g rustbuild rustbuild
RUN mkdir /x-tools && chown rustbuild:rustbuild /x-tools
USER rustbuild
WORKDIR /tmp
COPY arm-linux-gnueabihf.config build-toolchains.sh /tmp/
RUN ./build-toolchains.sh
USER root
RUN curl -o /usr/local/bin/sccache \
https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-04-04-sccache-x86_64-unknown-linux-musl && \
chmod +x /usr/local/bin/sccache
ENV PATH=$PATH:/x-tools/arm-unknown-linux-gnueabihf/bin
ENV CC_arm_unknown_linux_gnueabihf=arm-unknown-linux-gnueabihf-gcc \
AR_arm_unknown_linux_gnueabihf=arm-unknown-linux-gnueabihf-ar \
CXX_arm_unknown_linux_gnueabihf=arm-unknown-linux-gnueabihf-g++
ENV HOSTS=arm-unknown-linux-gnueabihf
ENV RUST_CONFIGURE_ARGS --host=$HOSTS --enable-extended
ENV SCRIPT python2.7 ../x.py dist --host $HOSTS --target $HOSTS
@@ -0,0 +1,37 @@
#!/bin/bash
# Copyright 2017 The Rust Project Developers. See the COPYRIGHT
# file at the top-level directory of this distribution and at
# http://rust-lang.org/COPYRIGHT.
#
# Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
# http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
# <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
# option. This file may not be copied, modified, or distributed
# except according to those terms.
set -ex
hide_output() {
set +x
on_err="
echo ERROR: An error was encountered with the build.
cat /tmp/build.log
exit 1
"
trap "$on_err" ERR
bash -c "while true; do sleep 30; echo \$(date) - building ...; done" &
PING_LOOP_PID=$!
$@ &> /tmp/build.log
rm /tmp/build.log
trap - ERR
kill $PING_LOOP_PID
set -x
}
mkdir build
cd build
cp ../arm-linux-gnueabihf.config .config
ct-ng oldconfig
hide_output ct-ng build
cd ..
rm -rf build
@@ -0,0 +1,77 @@
FROM ubuntu:16.04
RUN apt-get update && apt-get install -y --no-install-recommends \
automake \
bison \
bzip2 \
ca-certificates \
cmake \
curl \
file \
flex \
g++ \
gawk \
gdb \
git \
gperf \
help2man \
libncurses-dev \
libtool-bin \
make \
patch \
python2.7 \
sudo \
texinfo \
wget \
xz-utils \
libssl-dev \
pkg-config
RUN curl -OL https://github.com/Yelp/dumb-init/releases/download/v1.2.0/dumb-init_1.2.0_amd64.deb && \
dpkg -i dumb-init_*.deb && \
rm dumb-init_*.deb
ENTRYPOINT ["/usr/bin/dumb-init", "--"]
# Ubuntu 16.04 (this container) ships with make 4, but something in the
# toolchains we build below chokes on that, so go back to make 3
RUN curl https://ftp.gnu.org/gnu/make/make-3.81.tar.gz | tar xzf - && \
cd make-3.81 && \
./configure --prefix=/usr && \
make && \
make install && \
cd .. && \
rm -rf make-3.81
RUN curl http://crosstool-ng.org/download/crosstool-ng/crosstool-ng-1.22.0.tar.bz2 | \
tar xjf - && \
cd crosstool-ng && \
./configure --prefix=/usr/local && \
make -j$(nproc) && \
make install && \
cd .. && \
rm -rf crosstool-ng
RUN groupadd -r rustbuild && useradd -m -r -g rustbuild rustbuild
RUN mkdir /x-tools && chown rustbuild:rustbuild /x-tools
USER rustbuild
WORKDIR /tmp
COPY build-toolchains.sh armv7-linux-gnueabihf.config /tmp/
RUN ./build-toolchains.sh
USER root
RUN curl -o /usr/local/bin/sccache \
https://s3.amazonaws.com/rust-lang-ci/rust-ci-mirror/2017-04-04-sccache-x86_64-unknown-linux-musl && \
chmod +x /usr/local/bin/sccache
ENV PATH=$PATH:/x-tools/armv7-unknown-linux-gnueabihf/bin
ENV CC_armv7_unknown_linux_gnueabihf=armv7-unknown-linux-gnueabihf-gcc \
AR_armv7_unknown_linux_gnueabihf=armv7-unknown-linux-gnueabihf-ar \
CXX_armv7_unknown_linux_gnueabihf=armv7-unknown-linux-gnueabihf-g++
ENV HOSTS=armv7-unknown-linux-gnueabihf
ENV RUST_CONFIGURE_ARGS --host=$HOSTS --enable-extended
ENV SCRIPT python2.7 ../x.py dist --host $HOSTS --target $HOSTS
@@ -35,11 +35,3 @@ ct-ng oldconfig
 hide_output ct-ng build
 cd ..
 rm -rf build
-
-mkdir build
-cd build
-cp ../aarch64-linux-gnu.config .config
-ct-ng oldconfig
-hide_output ct-ng build
-cd ..
-rm -rf build
@@ -44,5 +44,5 @@ ENV \
 ENV TARGETS=x86_64-unknown-fuchsia
 ENV TARGETS=$TARGETS,aarch64-unknown-fuchsia

-ENV RUST_CONFIGURE_ARGS --target=$TARGETS
+ENV RUST_CONFIGURE_ARGS --target=$TARGETS --enable-extended
 ENV SCRIPT python2.7 ../x.py dist --target $TARGETS
@@ -31,7 +31,8 @@ RUN curl -o /usr/local/bin/sccache \
 ENV RUST_CONFIGURE_ARGS \
       --target=i686-unknown-linux-musl,i586-unknown-linux-gnu \
-      --musl-root-i686=/musl-i686
+      --musl-root-i686=/musl-i686 \
+      --enable-extended

 # Newer binutils broke things on some vms/distros (i.e., linking against
 # unknown relocs disabled by the following flag), so we need to go out of our
@@ -15,11 +15,14 @@ set -ex
 export CFLAGS="-fPIC -Wa,-mrelax-relocations=no"
 export CXXFLAGS="-Wa,-mrelax-relocations=no"

-MUSL=musl-1.1.14
+MUSL=musl-1.1.16
 curl https://www.musl-libc.org/releases/$MUSL.tar.gz | tar xzf -
 cd $MUSL
-CFLAGS="$CFLAGS -m32" ./configure --prefix=/musl-i686 --disable-shared --target=i686
-make -j10
+CC=gcc \
+  CFLAGS="$CFLAGS -m32" \
+  ./configure --prefix=/musl-i686 --disable-shared \
+  --target=i686
+make AR=ar RANLIB=ranlib -j10
 make install
 cd ..
@@ -17,7 +17,6 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
     pkg-config

 COPY build-toolchain.sh /tmp/
-RUN /tmp/build-toolchain.sh x86_64
 RUN /tmp/build-toolchain.sh i686

 RUN curl -OL https://github.com/Yelp/dumb-init/releases/download/v1.2.0/dumb-init_1.2.0_amd64.deb && \
@@ -30,15 +29,11 @@ RUN curl -o /usr/local/bin/sccache \
       chmod +x /usr/local/bin/sccache

 ENV \
-    AR_x86_64_unknown_freebsd=x86_64-unknown-freebsd10-ar \
-    CC_x86_64_unknown_freebsd=x86_64-unknown-freebsd10-gcc \
-    CXX_x86_64_unknown_freebsd=x86_64-unknown-freebsd10-g++ \
     AR_i686_unknown_freebsd=i686-unknown-freebsd10-ar \
     CC_i686_unknown_freebsd=i686-unknown-freebsd10-gcc \
     CXX_i686_unknown_freebsd=i686-unknown-freebsd10-g++

-ENV HOSTS=x86_64-unknown-freebsd
-ENV HOSTS=$HOSTS,i686-unknown-freebsd
+ENV HOSTS=i686-unknown-freebsd
 ENV RUST_CONFIGURE_ARGS --host=$HOSTS --enable-extended
 ENV SCRIPT python2.7 ../x.py dist --host $HOSTS --target $HOSTS
@@ -12,6 +12,7 @@ RUN yum upgrade -y && yum install -y \
       curl \
       bzip2 \
       gcc \
+      gcc-c++ \
       make \
       glibc-devel \
       perl \
@@ -85,7 +86,6 @@ RUN curl -o /usr/local/bin/sccache \
       chmod +x /usr/local/bin/sccache

 ENV HOSTS=i686-unknown-linux-gnu
-ENV HOSTS=$HOSTS,x86_64-unknown-linux-gnu

 ENV RUST_CONFIGURE_ARGS \
       --host=$HOSTS \
@@ -13,12 +13,14 @@ set -ex

 source shared.sh

-curl https://ftp.gnu.org/gnu/gcc/gcc-4.7.4/gcc-4.7.4.tar.bz2 | tar xjf -
-cd gcc-4.7.4
+GCC=4.8.5
+
+curl https://ftp.gnu.org/gnu/gcc/gcc-$GCC/gcc-$GCC.tar.bz2 | tar xjf -
+cd gcc-$GCC
 ./contrib/download_prerequisites
 mkdir ../gcc-build
 cd ../gcc-build
-hide_output ../gcc-4.7.4/configure \
+hide_output ../gcc-$GCC/configure \
     --prefix=/rustroot \
     --enable-languages=c,c++
 hide_output make -j10
@@ -27,5 +29,5 @@ ln -nsf gcc /rustroot/bin/cc
 cd ..
 rm -rf gcc-build
-rm -rf gcc-4.7.4
-yum erase -y gcc binutils
+rm -rf gcc-$GCC
+yum erase -y gcc gcc-c++ binutils
Some files were not shown because too many files have changed in this diff.