rustbuild: Document many more parts of the build

This commit expands the bootstrap build system's `README.md` as well as ensuring
that all API documentation is present and up-to-date. Additionally a new
`config.toml.example` file is checked in with commented out versions of all
possible configuration values.
Alex Crichton 2016-05-02 15:16:15 -07:00
parent d80497e628
commit f72bfe6661
19 changed files with 726 additions and 47 deletions

View File

@ -1,4 +1,4 @@
# Bootstrapping Rust
# rustbuild - Bootstrapping Rust
This is an in-progress README which is targeted at helping to explain how Rust
is bootstrapped and in general some of the technical details of the build
@ -8,20 +8,64 @@ system.
> intended to be the primarily used one just yet. The makefiles are currently
> the ones that are still "guaranteed to work" as much as possible at least.
## Using the new build system
## Using rustbuild
When configuring Rust via `./configure`, pass the following to enable building
via this build system:
```
./configure --enable-rustbuild
make
```
## ...
Afterwards the `Makefile` which is generated will have a few commands like
`make check`, `make tidy`, etc. For finer-grained control, the
`bootstrap.py` entry point can be used:
```
python src/bootstrap/bootstrap.py
```
This accepts a number of options like `--stage` and `--step` which can configure
what's actually being done.
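For example, a single invocation might look like this (the step name here is
illustrative; the steps that are actually accepted are defined in
`src/bootstrap/build/step.rs`):
```
python src/bootstrap/bootstrap.py --stage 1 --step doc --config config.toml
```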
## Configuring rustbuild
There are currently two primary methods for configuring the rustbuild build
system. First, the `./configure` options serialized in `config.mk` will be
parsed and read. That is, if any `./configure` options are passed, they'll be
handled naturally.
Next, rustbuild offers a TOML-based configuration system with a `config.toml`
file in the same location as `config.mk`. An example of this configuration can
be found at `src/bootstrap/config.toml.example`, and the configuration file
can also be passed as `--config path/to/config.toml` if the build system is
being invoked manually (via the python script).
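For instance, a minimal `config.toml` could look like the following (all of
these keys are described in `src/bootstrap/config.toml.example`; the values
here are only illustrative):
```
[llvm]
ninja = true

[rust]
debug-assertions = true

[build]
compiler-docs = true
```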
## Build stages
The rustbuild build system goes through a few phases to actually build the
compiler. What actually happens when you invoke rustbuild is:
1. The entry point script, `src/bootstrap/bootstrap.py`, is run. This script is
responsible for downloading the stage0 compiler/Cargo binaries and then
compiling the build system itself (this folder). Finally, it invokes the
actual `bootstrap` binary build system.
2. In Rust, `bootstrap` will slurp up all configuration, perform a number of
sanity checks (compilers exist for example), and then start building the
stage0 artifacts.
3. The stage0 `cargo` downloaded earlier is used to build the standard library
and the compiler, and these binaries are then copied to the `stage1`
directory. That compiler is then used to generate the stage1 artifacts, which
are copied to the stage2 directory, and finally the stage2 artifacts are
generated using that compiler.
The goal of each stage is to (a) leverage Cargo as much as possible and failing
that (b) leverage Rust as much as possible!
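The flow of artifacts described above looks roughly like this:
```
downloaded stage0 rustc/cargo
  -> builds the stage0 std/rustc artifacts, copied into `stage1/`
  -> the stage1 compiler builds the stage1 artifacts, copied into `stage2/`
  -> the stage2 compiler builds the stage2 artifacts
```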
## Directory Layout
This build system houses all output under the `target` directory, which looks
This build system houses all output under the `build` directory, which looks
like this:
```
@ -42,6 +86,12 @@ build/
debug/
release/
# Output of the dist-related steps like dist-std, dist-rustc, and dist-docs
dist/
# Temporary directory used for various input/output as part of various stages
tmp/
# Each remaining directory is scoped by the "host" triple of compilation at
# hand.
x86_64-unknown-linux-gnu/
@ -50,7 +100,8 @@ build/
# folder is under. The exact layout here will likely depend on the platform,
# and this is also built with CMake so the build system is also likely
# different.
compiler-rt/build/
compiler-rt/
build/
# Output folder for LLVM if it is compiled for this target
llvm/
@ -67,6 +118,17 @@ build/
share/
...
# Output folder for all documentation of this target. This is what's filled
# in whenever the `doc` step is run.
doc/
# Output for all compiletest-based test suites
test/
run-pass/
compile-fail/
debuginfo/
...
# Location where the stage0 Cargo and Rust compiler are unpacked. This
# directory is purely an extracted and overlaid tarball of these two (done
# by the bootstrap.py script). In theory the build system does not
@ -82,7 +144,9 @@ build/
# invocation. The build system instruments calling Cargo in the right order
# with the right variables to ensure these are filled in correctly.
stageN-std/
stageN-test/
stageN-rustc/
stageN-tools/
# This is a special case of the above directories, **not** filled in via
# Cargo but rather the build system itself. The stage0 compiler already has
@ -96,7 +160,7 @@ build/
# Basically this directory is just a temporary artifact used to configure the
# stage0 compiler to ensure that the libstd we just built is used to
# compile the stage1 compiler.
stage0-rustc/lib/
stage0-sysroot/lib/
# These output directories are intended to be standalone working
# implementations of the compiler (corresponding to each stage). The build
@ -108,3 +172,69 @@ build/
stage2/
stage3/
```
## Cargo projects
The current build is unfortunately not quite as simple as `cargo build` in a
directory, but rather the compiler is split into three different Cargo projects:
* `src/rustc/std_shim` - a project which builds and compiles libstd
* `src/rustc/test_shim` - a project which builds and compiles libtest
* `src/rustc` - the actual compiler itself
Each "project" has a corresponding Cargo.lock file with all dependencies, and
this means that building the compiler involves running Cargo three times. The
structure here serves two goals:
1. Facilitating dependencies coming from crates.io. These dependencies don't
depend on `std`, so libstd is a separate project compiled ahead of time
before the actual compiler builds.
2. Splitting "host artifacts" from "target artifacts". That is, when building
code for an arbitrary target you don't need the entire compiler, but you'll
end up needing libraries like libtest that depend on std but also want to use
crates.io dependencies. Hence, libtest is split out as its own project that
is sequenced after `std` but before `rustc`. This project is built for all
targets.
There is some loss in build parallelism here because libtest can be compiled in
parallel with a number of rustc artifacts, but in theory the loss isn't too bad!
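Conceptually the build therefore boils down to something like the following
(heavily simplified: the real build drives Cargo through the `cargo()` helper in
`build/mod.rs` with a pile of extra environment variables, and the manifest
paths here are illustrative):
```
cargo build --manifest-path src/rustc/std_shim/Cargo.toml
cargo build --manifest-path src/rustc/test_shim/Cargo.toml
cargo build --manifest-path src/rustc/Cargo.toml
```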
## Build tools
We've actually got quite a few tools that we use in the compiler's build system
and for testing. To organize these, each tool is a project in `src/tools` with a
corresponding `Cargo.toml`. All tools are compiled with Cargo (currently having
independent `Cargo.lock` files) and do not currently explicitly depend on the
compiler or standard library. Compiling each tool is sequenced after the
appropriate libstd/libtest/librustc compile above.
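Concretely, each tool ends up with a layout along these lines (the exact file
names are illustrative):
```
src/tools/
    linkchecker/
        Cargo.toml
        src/main.rs
    tidy/
        Cargo.toml
        src/main.rs
    ...
```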
## Extending rustbuild
So you'd like to add a feature to the rustbuild build system or just fix a bug.
Great! One of the major motivational factors for moving away from `make` is that
Rust is in theory much easier to read, modify, and write. If you find anything
excessively confusing, please open an issue about it and we'll try to get it
documented or simplified pronto.
First up, you'll probably want to read over the documentation above as that'll
give you a high level overview of what rustbuild is doing. You also probably
want to play around a bit yourself by just getting it up and running before you
dive too much into the actual build system itself.
After that, each module in rustbuild should have enough documentation to keep
you up and running. Some general areas that you may be interested in modifying
are:
* Adding a new build tool? Take a look at `build/step.rs` for examples of other
tools, as well as `build/mod.rs`.
* Adding a new compiler crate? Look no further! Adding crates can be done by
adding a new directory with `Cargo.toml` followed by configuring all
`Cargo.toml` files accordingly.
* Adding a new dependency from crates.io? We're still working on that, so hold
off on that for now.
* Adding a new configuration option? Take a look at `build/config.rs` or perhaps
`build/flags.rs` and then modify the build elsewhere to read that option.
* Adding a sanity check? Take a look at `build/sanity.rs`.
If you have any questions feel free to reach out on `#rust-internals` on IRC or
open an issue in the bug tracker!

View File

@ -8,6 +8,29 @@
// option. This file may not be copied, modified, or distributed
// except according to those terms.
//! C-compiler probing and detection.
//!
//! This module will fill out the `cc` and `cxx` maps of `Build` by looking for
//! C and C++ compilers for each target configured. A compiler is found through
//! a number of different ways (in order of precedence):
//!
//! 1. Configuration via `target.$target.cc` in `config.toml`.
//! 2. Configuration via `target.$target.android-ndk` in `config.toml`, if
//! applicable
//! 3. Special logic to probe on OpenBSD
//! 4. The `CC_$target` environment variable.
//! 5. The `CC` environment variable.
//! 6. "cc"
//!
//! Some of this logic is implemented here, but much of it is farmed out to the
//! `gcc` crate itself, so we end up having the same fallbacks as there.
//! Similar logic is then used to find a C++ compiler, just with s/cc/c++/
//! applied.
//!
//! It is intended that after this module has run no C/C++ compiler will
//! ever be probed for. Instead the compilers found here will be used for
//! everything.
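//!
//! For example, a compiler can be configured per target through `config.toml`,
//! along the lines of (the value here is illustrative):
//!
//! ```toml
//! [target.x86_64-unknown-linux-gnu]
//! cc = "clang"
//! ```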
use std::process::Command;
use build_helper::{cc2ar, output};

View File

@ -8,7 +8,13 @@
// option. This file may not be copied, modified, or distributed
// except according to those terms.
use std::env;
//! Build configuration for Rust's release channels.
//!
//! Implements the stable/beta/nightly channel distinctions by setting various
//! flags like the `unstable_features`, calculating variables like `release` and
//! `package_vers`, and otherwise indicating to the compiler what it should
//! print out as part of its version information.
use std::fs::{self, File};
use std::io::prelude::*;
use std::process::Command;
@ -19,6 +25,9 @@ use md5;
use build::Build;
pub fn collect(build: &mut Build) {
// Currently the canonical source for the release number (e.g. 1.10.0) and
// the prerelease version (e.g. `.1`) is in `mk/main.mk`. We "parse" that
// here to learn about those numbers.
let mut main_mk = String::new();
t!(t!(File::open(build.src.join("mk/main.mk"))).read_to_string(&mut main_mk));
let mut release_num = "";
@ -32,7 +41,8 @@ pub fn collect(build: &mut Build) {
}
}
// FIXME: this is duplicating makefile logic
// Depending on the channel, passed in `./configure --release-channel`,
// determine various properties of the build.
match &build.config.channel[..] {
"stable" => {
build.release = release_num.to_string();
@ -58,6 +68,8 @@ pub fn collect(build: &mut Build) {
}
build.version = build.release.clone();
// If we have a git directory, add in some SHA information about the
// commit this compiler was compiled from.
if fs::metadata(build.src.join(".git")).is_ok() {
let ver_date = output(Command::new("git").current_dir(&build.src)
.arg("log").arg("-1")
@ -80,11 +92,14 @@ pub fn collect(build: &mut Build) {
build.short_ver_hash = Some(short_ver_hash);
}
// Calculate this compiler's bootstrap key, which is currently defined as
// the first 8 characters of the md5 of the release string.
let key = md5::compute(build.release.as_bytes());
build.bootstrap_key = format!("{:02x}{:02x}{:02x}{:02x}",
key[0], key[1], key[2], key[3]);
env::set_var("RUSTC_BOOTSTRAP_KEY", &build.bootstrap_key);
// Slurp up the stage0 bootstrap key as we're bootstrapping from an
// otherwise stable compiler.
let mut s = String::new();
t!(t!(File::open(build.src.join("src/stage0.txt"))).read_to_string(&mut s));
if let Some(line) = s.lines().find(|l| l.starts_with("rustc_key")) {

View File

@ -8,6 +8,11 @@
// option. This file may not be copied, modified, or distributed
// except according to those terms.
//! Implementation of the various `check-*` targets of the build system.
//!
//! This file implements the various regression test suites that we execute on
//! our CI.
use std::fs;
use std::path::{PathBuf, Path};
use std::process::Command;
@ -16,6 +21,10 @@ use build_helper::output;
use build::{Build, Compiler};
/// Runs the `linkchecker` tool as compiled in `stage` by the `host` compiler.
///
/// This tool in `src/tools` will verify the validity of all our links in the
/// documentation to ensure we don't have a bunch of dead ones.
pub fn linkcheck(build: &Build, stage: u32, host: &str) {
println!("Linkcheck stage{} ({})", stage, host);
let compiler = Compiler::new(stage, host);
@ -23,8 +32,11 @@ pub fn linkcheck(build: &Build, stage: u32, host: &str) {
.arg(build.out.join(host).join("doc")));
}
/// Runs the `cargotest` tool as compiled in `stage` by the `host` compiler.
///
/// This tool in `src/tools` will check out a few Rust projects and run `cargo
/// test` to ensure that we don't regress the test suites there.
pub fn cargotest(build: &Build, stage: u32, host: &str) {
let ref compiler = Compiler::new(stage, host);
// Configure PATH to find the right rustc. NB. we have to use PATH
@ -47,6 +59,11 @@ pub fn cargotest(build: &Build, stage: u32, host: &str) {
.arg(&out_dir));
}
/// Runs the `tidy` tool as compiled in `stage` by the `host` compiler.
///
/// This tool in `src/tools` checks up on various bits and pieces of style and
/// otherwise just implements a few lint-like checks that are specific to the
/// compiler itself.
pub fn tidy(build: &Build, stage: u32, host: &str) {
println!("tidy check stage{} ({})", stage, host);
let compiler = Compiler::new(stage, host);
@ -58,6 +75,11 @@ fn testdir(build: &Build, host: &str) -> PathBuf {
build.out.join(host).join("test")
}
/// Executes the `compiletest` tool to run a suite of tests.
///
/// Compiles all tests with `compiler` for `target` with the specified
/// compiletest `mode` and `suite` arguments. For example `mode` can be
/// "run-pass" or `suite` can be something like `debuginfo`.
pub fn compiletest(build: &Build,
compiler: &Compiler,
target: &str,
@ -65,6 +87,9 @@ pub fn compiletest(build: &Build,
suite: &str) {
let mut cmd = build.tool_cmd(compiler, "compiletest");
// compiletest currently has... a lot of arguments, so let's just pass all
// of them!
cmd.arg("--compile-lib-path").arg(build.rustc_libdir(compiler));
cmd.arg("--run-lib-path").arg(build.sysroot_libdir(compiler, target));
cmd.arg("--rustc-path").arg(build.compiler_path(compiler));
@ -114,6 +139,8 @@ pub fn compiletest(build: &Build,
cmd.arg("--verbose");
}
// Only pass correct values for these flags for the `run-make` suite as it
// requires that a C++ compiler was configured which isn't always the case.
if suite == "run-make" {
let llvm_config = build.llvm_config(target);
let llvm_components = output(Command::new(&llvm_config).arg("--components"));
@ -140,11 +167,19 @@ pub fn compiletest(build: &Build,
}
}
}
build.add_bootstrap_key(compiler, &mut cmd);
build.run(&mut cmd);
}
/// Run `rustdoc --test` for all documentation in `src/doc`.
///
/// This will run all tests in our markdown documentation (e.g. the book)
/// located in `src/doc`. The `rustdoc` that's run is the one that sits next to
/// `compiler`.
pub fn docs(build: &Build, compiler: &Compiler) {
// Do a breadth-first traversal of the `src/doc` directory and just run
// tests for all files that end in `*.md`
let mut stack = vec![build.src.join("src/doc")];
while let Some(p) = stack.pop() {
@ -162,6 +197,12 @@ pub fn docs(build: &Build, compiler: &Compiler) {
}
}
/// Run the error index generator tool to execute the tests located in the error
/// index.
///
/// The `error_index_generator` tool lives in `src/tools` and is used to
/// generate a markdown file from the error indexes of the code base which is
/// then passed to `rustdoc --test`.
pub fn error_index(build: &Build, compiler: &Compiler) {
println!("Testing error-index stage{}", compiler.stage);

View File

@ -8,6 +8,13 @@
// option. This file may not be copied, modified, or distributed
// except according to those terms.
//! Implementation of `make clean` in rustbuild.
//!
//! Responsible for cleaning out a build directory of all old and stale
//! artifacts to prepare for a fresh build. Currently doesn't remove the
//! `build/cache` directory (download cache) or the `build/$target/llvm`
//! directory as we want those cached between builds.
use std::fs;
use std::path::Path;

View File

@ -8,6 +8,14 @@
// option. This file may not be copied, modified, or distributed
// except according to those terms.
//! Implementation of compiling various phases of the compiler and standard
//! library.
//!
//! This module contains some of the real meat in the rustbuild build system
//! which is where Cargo is used to compile the standard library, libtest, and
//! the compiler. This module is also responsible for assembling the sysroot as it
//! goes along from the output of the previous stage.
use std::collections::HashMap;
use std::fs;
use std::path::{Path, PathBuf};
@ -35,6 +43,8 @@ pub fn std<'a>(build: &'a Build, target: &str, compiler: &Compiler<'a>) {
copy(&build.compiler_rt_built.borrow()[target],
&libdir.join(staticlib("compiler-rt", target)));
// Some platforms have startup objects that may be required to produce the
// libstd dynamic library, for example.
build_startup_objects(build, target, &libdir);
let out_dir = build.cargo_out(compiler, Mode::Libstd, target);
@ -154,7 +164,6 @@ pub fn test_link(build: &Build,
add_to_sysroot(&out_dir, &libdir);
}
/// Build the compiler.
///
/// This will build the compiler for a particular stage of the build using

View File

@ -8,6 +8,11 @@
// option. This file may not be copied, modified, or distributed
// except according to those terms.
//! Serialized configuration of a build.
//!
//! This module implements parsing `config.mk` and `config.toml` configuration
//! files to tweak how the build runs.
use std::collections::HashMap;
use std::env;
use std::fs::File;
@ -27,7 +32,9 @@ use toml::{Parser, Decoder, Value};
/// is generated from `./configure`.
///
/// Note that this structure is not decoded directly into, but rather it is
/// filled out from the decoded forms of the structs below.
/// filled out from the decoded forms of the structs below. For documentation
/// of each field, see the corresponding field in
/// `src/bootstrap/config.toml.example`.
#[derive(Default)]
pub struct Config {
pub ccache: bool,
@ -250,6 +257,11 @@ impl Config {
return config
}
/// "Temporary" routine to parse `config.mk` into this configuration.
///
/// While we still have `./configure` this implements the ability to decode
/// that configuration into this structure. This isn't exactly a full-blown makefile
/// parser, but hey it gets the job done!
pub fn update_with_config_mk(&mut self) {
let mut config = String::new();
File::open("config.mk").unwrap().read_to_string(&mut config).unwrap();

View File

@ -8,6 +8,16 @@
// option. This file may not be copied, modified, or distributed
// except according to those terms.
//! Implementation of the various distribution aspects of the compiler.
//!
//! This module is responsible for creating tarballs of the standard library,
//! compiler, and documentation. This ends up being what we distribute to
//! everyone as well.
//!
//! No tarball is actually created in this module itself; rather, we still shell
//! out to `rust-installer`. This may one day be replaced with bits and
//! pieces of `rustup.rs`!
use std::fs::{self, File};
use std::io::Write;
use std::path::{PathBuf, Path};
@ -33,6 +43,9 @@ fn tmpdir(build: &Build) -> PathBuf {
build.out.join("tmp/dist")
}
/// Builds the `rust-docs` installer component.
///
/// Slurps up documentation from the `stage`'s `host`.
pub fn docs(build: &Build, stage: u32, host: &str) {
println!("Dist docs stage{} ({})", stage, host);
let name = format!("rust-docs-{}", package_vers(build));
@ -68,6 +81,12 @@ pub fn docs(build: &Build, stage: u32, host: &str) {
}
}
/// Build the `rust-mingw` installer component.
///
/// This contains all the bits and pieces to run the MinGW Windows targets
/// without any extra installed software (e.g. we bundle gcc, libraries, etc).
/// Currently just shells out to a python script, but that should be rewritten
/// in Rust.
pub fn mingw(build: &Build, host: &str) {
println!("Dist mingw ({})", host);
let name = format!("rust-mingw-{}", package_vers(build));
@ -102,6 +121,7 @@ pub fn mingw(build: &Build, host: &str) {
t!(fs::remove_dir_all(&image));
}
/// Creates the `rustc` installer component.
pub fn rustc(build: &Build, stage: u32, host: &str) {
println!("Dist rustc stage{} ({})", stage, host);
let name = format!("rustc-{}", package_vers(build));
@ -209,6 +229,7 @@ pub fn rustc(build: &Build, stage: u32, host: &str) {
}
}
/// Copies debugger scripts for `host` into the `sysroot` specified.
pub fn debugger_scripts(build: &Build,
sysroot: &Path,
host: &str) {
@ -237,7 +258,8 @@ pub fn debugger_scripts(build: &Build,
}
}
/// Creates the `rust-std` installer component as compiled by `compiler` for the
/// target `target`.
pub fn std(build: &Build, compiler: &Compiler, target: &str) {
println!("Dist std stage{} ({} -> {})", compiler.stage, compiler.host,
target);

View File

@ -8,6 +8,15 @@
// option. This file may not be copied, modified, or distributed
// except according to those terms.
//! Documentation generation for rustbuild.
//!
//! This module implements generation of all bits and pieces of documentation
//! for the Rust project. This notably includes suites like the rust book, the
//! nomicon, standalone documentation, etc.
//!
//! Everything here is basically just a shim around calling either `rustbook` or
//! `rustdoc`.
use std::fs::{self, File};
use std::io::prelude::*;
use std::path::Path;
@ -16,6 +25,11 @@ use std::process::Command;
use build::{Build, Compiler, Mode};
use build::util::{up_to_date, cp_r};
/// Invoke `rustbook` as compiled in `stage` for `target` for the doc book
/// `name` into the `out` path.
///
/// This will not actually generate any documentation if the documentation has
/// already been generated.
pub fn rustbook(build: &Build, stage: u32, target: &str, name: &str, out: &Path) {
t!(fs::create_dir_all(out));
@ -35,6 +49,14 @@ pub fn rustbook(build: &Build, stage: u32, target: &str, name: &str, out: &Path)
.arg(out));
}
/// Generates all standalone documentation as compiled by the rustdoc in `stage`
/// for the `target` into `out`.
///
/// This will list all of `src/doc` looking for markdown files and appropriately
/// perform transformations like substituting `VERSION`, `SHORT_HASH`, and
/// `STAMP` along with providing the various header/footer HTML we've customized.
///
/// In the end, this is just a glorified wrapper around rustdoc!
pub fn standalone(build: &Build, stage: u32, target: &str, out: &Path) {
println!("Documenting stage{} standalone ({})", stage, target);
t!(fs::create_dir_all(out));
@ -105,6 +127,10 @@ pub fn standalone(build: &Build, stage: u32, target: &str, out: &Path) {
}
}
/// Compile all standard library documentation.
///
/// This will generate all documentation for the standard library and its
/// dependencies. This is largely just a wrapper around `cargo doc`.
pub fn std(build: &Build, stage: u32, target: &str, out: &Path) {
println!("Documenting stage{} std ({})", stage, target);
t!(fs::create_dir_all(out));
@ -123,6 +149,10 @@ pub fn std(build: &Build, stage: u32, target: &str, out: &Path) {
cp_r(&out_dir, out)
}
/// Compile all libtest documentation.
///
/// This will generate all documentation for libtest and its dependencies. This
/// is largely just a wrapper around `cargo doc`.
pub fn test(build: &Build, stage: u32, target: &str, out: &Path) {
println!("Documenting stage{} test ({})", stage, target);
let compiler = Compiler::new(stage, &build.config.build);
@ -139,6 +169,10 @@ pub fn test(build: &Build, stage: u32, target: &str, out: &Path) {
cp_r(&out_dir, out)
}
/// Generate all compiler documentation.
///
/// This will generate all documentation for the compiler libraries and their
/// dependencies. This is largely just a wrapper around `cargo doc`.
pub fn rustc(build: &Build, stage: u32, target: &str, out: &Path) {
println!("Documenting stage{} compiler ({})", stage, target);
let compiler = Compiler::new(stage, &build.config.build);
@ -156,6 +190,8 @@ pub fn rustc(build: &Build, stage: u32, target: &str, out: &Path) {
cp_r(&out_dir, out)
}
/// Generates the HTML rendered error-index by running the
/// `error_index_generator` tool.
pub fn error_index(build: &Build, stage: u32, target: &str, out: &Path) {
println!("Documenting stage{} error index ({})", stage, target);
t!(fs::create_dir_all(out));

View File

@ -8,6 +8,11 @@
// option. This file may not be copied, modified, or distributed
// except according to those terms.
//! Command-line interface of the rustbuild build system.
//!
//! This module implements the command-line parsing of the build system which
//! has various flags to configure how it's run.
use std::fs;
use std::path::PathBuf;
use std::process;
@ -15,6 +20,7 @@ use std::slice;
use getopts::Options;
/// Deserialized version of all flags for this compile.
pub struct Flags {
pub verbose: bool,
pub stage: Option<u32>,

View File

@ -8,6 +8,15 @@
// option. This file may not be copied, modified, or distributed
// except according to those terms.
//! Implementation of rustbuild, the Rust build system.
//!
//! This module, and its descendants, are the implementation of the Rust build
//! system. Most of this build system is backed by Cargo but the outer layer
//! here is responsible for orchestrating calls to Cargo, sequencing Cargo
//! builds, building artifacts like LLVM, etc.
//!
//! More documentation can be found in each respective module below.
use std::cell::RefCell;
use std::collections::HashMap;
use std::env;
@ -21,6 +30,14 @@ use num_cpus;
use build::util::{exe, mtime, libdir, add_lib_path};
/// A helper macro to `unwrap` a result except also print out details like:
///
/// * The file/line of the panic
/// * The expression that failed
/// * The error itself
///
/// This is currently used judiciously throughout the build system rather than
/// using a `Result` with `try!`, but this may change one day...
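///
/// A typical use looks like (illustrative):
///
/// ```ignore
/// t!(fs::create_dir_all("build/tmp"));
/// ```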
macro_rules! t {
($e:expr) => (match $e {
Ok(e) => e,
@ -53,12 +70,27 @@ mod job {
pub use build::config::Config;
pub use build::flags::Flags;
/// A structure representing a Rust compiler.
///
/// Each compiler has a `stage` that it is associated with and a `host` that
/// corresponds to the platform the compiler runs on. This structure is used as
/// a parameter to many methods below.
#[derive(Eq, PartialEq, Clone, Copy, Hash, Debug)]
pub struct Compiler<'a> {
stage: u32,
host: &'a str,
}
/// Global configuration for the build system.
///
/// This structure transitively contains all configuration for the build system.
/// All filesystem-encoded configuration is in `config`, all flags are in
/// `flags`, and then parsed or probed information is listed in the keys below.
///
/// This structure is a parameter of almost all methods in the build system,
/// although most functions are implemented as free functions rather than
/// methods specifically on this structure itself (to make it easier to
/// organize).
pub struct Build {
// User-specified configuration via config.toml
config: Config,
@ -92,14 +124,33 @@ pub struct Build {
compiler_rt_built: RefCell<HashMap<String, PathBuf>>,
}
/// The various "modes" of invoking Cargo.
///
/// These entries currently correspond to the various output directories of the
/// build system, with each mode generating output in a different directory.
pub enum Mode {
/// This cargo is going to build the standard library, placing output in the
/// "stageN-std" directory.
Libstd,
/// This cargo is going to build libtest, placing output in the
/// "stageN-test" directory.
Libtest,
/// This cargo is going to build librustc and compiler libraries, placing
/// output in the "stageN-rustc" directory.
Librustc,
/// This cargo is going to build some build tool, placing output in the
/// "stageN-tools" directory.
Tool,
}
impl Build {
/// Creates a new set of build configuration from the `flags` on the command
/// line and the filesystem `config`.
///
/// By default all build output will be placed in the current directory.
pub fn new(flags: Flags, config: Config) -> Build {
let cwd = t!(env::current_dir());
let src = flags.src.clone().unwrap_or(cwd.clone());
@ -141,6 +192,7 @@ impl Build {
}
}
/// Executes the entire build, as configured by the flags and configuration.
pub fn build(&mut self) {
use build::step::Source::*;
@ -161,6 +213,16 @@ impl Build {
self.verbose("updating submodules");
self.update_submodules();
// The main loop of the build system.
//
// The `step::all` function returns a topologically sorted list of all
// steps that need to be executed as part of this build. Each step has a
// corresponding entry in `step.rs` and indicates some unit of work that
// needs to be done as part of the build.
//
// Almost all of these are simple one-liners that shell out to the
// corresponding functionality in the extra modules, where more
// documentation can be found.
for target in step::all(self) {
let doc_out = self.out.join(&target.target).join("doc");
match target.src {
@ -338,6 +400,10 @@ impl Build {
}
}
/// Updates all git submodules that we have.
///
/// This will detect if any submodules are out of date and run the necessary
/// commands to sync them all with upstream.
fn update_submodules(&self) {
if !self.config.submodules {
return
@ -350,6 +416,11 @@ impl Build {
cmd.current_dir(&self.src).arg("submodule");
return cmd
};
// FIXME: this takes a seriously long time to execute on Windows and a
// nontrivial amount of time on Unix, we should have a better way
// of detecting whether we need to run all the submodule commands
// below.
let out = output(git_submodule().arg("status"));
if !out.lines().any(|l| l.starts_with("+") || l.starts_with("-")) {
return
@ -366,8 +437,9 @@ impl Build {
.arg("git").arg("checkout").arg("."));
}
/// Clear out `dir` if our build has been flagged as dirty, and also set
/// ourselves as dirty if `file` changes when `f` is executed.
/// Clear out `dir` if `input` is newer.
///
/// After this executes, it will also ensure that `dir` exists.
fn clear_if_dirty(&self, dir: &Path, input: &Path) {
let stamp = dir.join(".stamp");
if mtime(&stamp) < mtime(input) {
@ -381,8 +453,10 @@ impl Build {
/// Prepares an invocation of `cargo` to be run.
///
/// This will create a `Command` that represents a pending execution of
/// Cargo for the specified stage, whether or not the standard library is
/// being built, and using the specified compiler targeting `target`.
/// Cargo. This cargo will be configured to use `compiler` as the actual
/// rustc compiler, its output will be scoped by `mode`'s output directory,
/// it will pass the `--target` flag for the specified `target`, and will be
/// executing the Cargo command `cmd`.
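///
/// A rough usage sketch (illustrative; the real call sites live in modules like
/// `build/compile.rs`):
///
/// ```ignore
/// let mut cargo = build.cargo(compiler, Mode::Libstd, target, "build");
/// build.run(&mut cargo);
/// ```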
fn cargo(&self,
compiler: &Compiler,
mode: Mode,
@ -398,6 +472,9 @@ impl Build {
// Customize the compiler we're running. Specify the compiler to cargo
// as our shim and then pass it some various options used to configure
// how the actual compiler itself is called.
//
// These variables are primarily all read by
// src/bootstrap/{rustc,rustdoc.rs}
cargo.env("RUSTC", self.out.join("bootstrap/debug/rustc"))
.env("RUSTC_REAL", self.compiler_path(compiler))
.env("RUSTC_STAGE", compiler.stage.to_string())
@ -414,16 +491,7 @@ impl Build {
.env("RUSTDOC_REAL", self.rustdoc(compiler))
.env("RUSTC_FLAGS", self.rustc_flags(target).join(" "));
// Set the bootstrap key depending on which stage compiler we're using.
// In stage0 we're using a previously released stable compiler, so we
// use the stage0 bootstrap key. Otherwise we use our own build's
// bootstrap key.
let bootstrap_key = if compiler.is_snapshot(self) {
&self.bootstrap_key_stage0
} else {
&self.bootstrap_key
};
cargo.env("RUSTC_BOOTSTRAP_KEY", bootstrap_key);
self.add_bootstrap_key(compiler, &mut cargo);
// Specify some various options for build scripts used throughout
// the build.
@ -443,7 +511,7 @@ impl Build {
// Environment variables *required* throughout the build
//
// FIXME: should update code to not require this env vars
// FIXME: should update code to not require this env var
cargo.env("CFG_COMPILER_HOST_TRIPLE", target);
if self.config.verbose || self.flags.verbose {
@ -522,6 +590,12 @@ impl Build {
if self.config.rust_optimize {"release"} else {"debug"}
}
/// Returns the sysroot for the `compiler` specified that *this build system
/// generates*.
///
/// That is, the sysroot for the stage0 compiler is not what the compiler
/// thinks it is by default, but it's the same as the default for stages
/// 1-3.
fn sysroot(&self, compiler: &Compiler) -> PathBuf {
if compiler.stage == 0 {
self.out.join(compiler.host).join("stage0-sysroot")
@ -530,6 +604,8 @@ impl Build {
}
}
/// Returns the libdir where the standard library and other artifacts are
/// found for a compiler's sysroot.
fn sysroot_libdir(&self, compiler: &Compiler, target: &str) -> PathBuf {
self.sysroot(compiler).join("lib").join("rustlib")
.join(target).join("lib")
@ -561,11 +637,17 @@ impl Build {
}
/// Root output directory for LLVM compiled for `target`
///
/// Note that if LLVM is configured externally then the directory returned
/// will likely be empty.
fn llvm_out(&self, target: &str) -> PathBuf {
self.out.join(target).join("llvm")
}
/// Returns the path to `llvm-config` for the specified target
/// Returns the path to `llvm-config` for the specified target.
///
/// If a custom `llvm-config` was specified for target then that's returned
/// instead.
fn llvm_config(&self, target: &str) -> PathBuf {
let target_config = self.config.target_config.get(target);
if let Some(s) = target_config.and_then(|c| c.llvm_config.as_ref()) {
@ -576,7 +658,7 @@ impl Build {
}
}
/// Returns the path to `llvm-config` for the specified target
/// Returns the path to the `FileCheck` binary for the specified target
fn llvm_filecheck(&self, target: &str) -> PathBuf {
let target_config = self.config.target_config.get(target);
if let Some(s) = target_config.and_then(|c| c.llvm_config.as_ref()) {
@ -603,15 +685,37 @@ impl Build {
self.out.join(target).join("rust-test-helpers")
}
/// Adds the compiler's directory of dynamic libraries to `cmd`'s dynamic
/// library lookup path.
fn add_rustc_lib_path(&self, compiler: &Compiler, cmd: &mut Command) {
// Windows doesn't need dylib path munging because the dlls for the
// compiler live next to the compiler and the system will find them
// automatically.
if cfg!(windows) { return }
if cfg!(windows) {
return
}
add_lib_path(vec![self.rustc_libdir(compiler)], cmd);
}
/// Adds the compiler's bootstrap key to the environment of `cmd`.
fn add_bootstrap_key(&self, compiler: &Compiler, cmd: &mut Command) {
// In stage0 we're using a previously released stable compiler, so we
// use the stage0 bootstrap key. Otherwise we use our own build's
// bootstrap key.
let bootstrap_key = if compiler.is_snapshot(self) {
&self.bootstrap_key_stage0
} else {
&self.bootstrap_key
};
cmd.env("RUSTC_BOOTSTRAP_KEY", bootstrap_key);
}
/// Returns the compiler's libdir where it stores the dynamic libraries that
/// it itself links against.
///
/// For example this returns `<sysroot>/lib` on Unix and `<sysroot>/bin` on
/// Windows.
fn rustc_libdir(&self, compiler: &Compiler) -> PathBuf {
if compiler.is_snapshot(self) {
self.rustc_snapshot_libdir()
@ -620,30 +724,38 @@ impl Build {
}
}
/// Returns the libdir of the snapshot compiler.
fn rustc_snapshot_libdir(&self) -> PathBuf {
self.rustc.parent().unwrap().parent().unwrap()
.join(libdir(&self.config.build))
}
/// Runs a command, printing out nice contextual information if it fails.
fn run(&self, cmd: &mut Command) {
self.verbose(&format!("running: {:?}", cmd));
run_silent(cmd)
}
/// Prints a message if this build is configured in verbose mode.
fn verbose(&self, msg: &str) {
if self.flags.verbose || self.config.verbose {
println!("{}", msg);
}
}
/// Returns the number of parallel jobs that have been configured for this
/// build.
fn jobs(&self) -> u32 {
self.flags.jobs.unwrap_or(num_cpus::get() as u32)
}
/// Returns the path to the C compiler for the target specified.
fn cc(&self, target: &str) -> &Path {
self.cc[target].0.path()
}
/// Returns a list of flags to pass to the C compiler for the target
/// specified.
fn cflags(&self, target: &str) -> Vec<String> {
// Filter out -O and /O (the optimization flags) that we picked up from
// gcc-rs because the build scripts will determine that for themselves.
@ -663,15 +775,26 @@ impl Build {
return base
}
/// Returns the path to the `ar` archive utility for the target specified.
fn ar(&self, target: &str) -> &Path {
&self.cc[target].1
}
/// Returns the path to the C++ compiler for the target specified; this may
/// panic if no C++ compiler was configured for the target.
fn cxx(&self, target: &str) -> &Path {
self.cxx[target].path()
}
/// Returns flags to pass to the compiler to generate code for `target`.
fn rustc_flags(&self, target: &str) -> Vec<String> {
// New flags should be added here with great caution!
//
// It's quite unfortunate to **require** flags to generate code for a
// target, so it should only be passed here if absolutely necessary!
// Most default configuration should be done through target specs rather
// than an entry here.
let mut base = Vec::new();
if target != self.config.build && !target.contains("msvc") {
base.push(format!("-Clinker={}", self.cc(target).display()));
@ -681,10 +804,12 @@ impl Build {
}
impl<'a> Compiler<'a> {
/// Creates a new compiler for the specified stage/host
fn new(stage: u32, host: &'a str) -> Compiler<'a> {
Compiler { stage: stage, host: host }
}
/// Returns whether this is a snapshot compiler for `build`'s configuration
fn is_snapshot(&self, build: &Build) -> bool {
self.stage == 0 && self.host == build.config.build
}

View File

@ -8,6 +8,16 @@
// option. This file may not be copied, modified, or distributed
// except according to those terms.
//! Compilation of native dependencies like LLVM.
//!
//! Native projects like LLVM unfortunately aren't well suited just yet to being
//! compiled from Cargo build scripts. This is because the compilation
//! takes a *very* long time, but also because we don't want to
//! compile LLVM 3 times as part of a normal bootstrap (we want it cached).
//!
//! LLVM and compiler-rt are essentially just wired up to everything else to
//! ensure that they're always in place if needed.
use std::path::Path;
use std::process::Command;
use std::fs;
@ -19,6 +29,7 @@ use gcc;
use build::Build;
use build::util::{exe, staticlib, up_to_date};
/// Compile LLVM for `target`.
pub fn llvm(build: &Build, target: &str) {
// If we're using a custom LLVM bail out here, but we can only use a
// custom LLVM for the build triple.
@ -116,6 +127,10 @@ fn check_llvm_version(build: &Build, llvm_config: &Path) {
panic!("\n\nbad LLVM version: {}, need >=3.5\n\n", version)
}
/// Compiles the `compiler-rt` library, or at least the builtins part of it.
///
/// This uses the CMake build system and an existing LLVM build directory to
/// compile the project.
pub fn compiler_rt(build: &Build, target: &str) {
let dst = build.compiler_rt_out(target);
let arch = target.split('-').next().unwrap();
@ -171,6 +186,8 @@ pub fn compiler_rt(build: &Build, target: &str) {
cfg.build();
}
/// Compiles the `rust_test_helpers.c` library which is used in various
/// `run-pass` test suites for ABI testing.
pub fn test_helpers(build: &Build, target: &str) {
let dst = build.test_helpers_out(target);
let src = build.src.join("src/rt/rust_test_helpers.c");

View File

@ -8,6 +8,16 @@
// option. This file may not be copied, modified, or distributed
// except according to those terms.
//! Sanity checking performed by rustbuild before actually executing anything.
//!
//! This module contains the implementation of ensuring that the build
//! environment looks reasonable before progressing. This will verify that
//! various programs like git and python exist, along with ensuring that all C
//! compilers for cross-compiling are found.
//!
//! In theory if we get past this phase it's a bug if a build fails, but in
//! practice that's likely not true!
use std::collections::HashSet;
use std::env;
use std::ffi::{OsStr, OsString};

View File

@ -8,6 +8,18 @@
// option. This file may not be copied, modified, or distributed
// except according to those terms.
//! Major workhorse of rustbuild: the definition of, and dependencies between,
//! stages of the compile.
//!
//! The primary purpose of this module is to define the various `Step`s of
//! execution of the build. Each `Step` has a corresponding `Source` indicating
//! what it's actually doing along with a number of dependencies which must be
//! executed first.
//!
//! This module will take the CLI as input and calculate the steps required for
//! the build requested, ensuring that all intermediate pieces are in place.
//! Essentially this module is a `make`-replacement, but not as good.
use std::collections::HashSet;
use build::{Build, Compiler};
@ -18,6 +30,15 @@ pub struct Step<'a> {
pub target: &'a str,
}
/// Macro used to iterate over all targets that are recognized by the build
/// system.
///
/// Whenever a new step is added it will involve adding an entry here, updating
/// the dependencies section below, and then adding an implementation of the
/// step in `build/mod.rs`.
///
/// This macro takes another macro as an argument and then calls that macro with
/// all steps that the build system knows about.
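///
/// A rough sketch of the higher-order-macro pattern in play here (the entry
/// format below is illustrative, not the real step list):
///
/// ```ignore
/// macro_rules! print_steps {
///     ($($all:tt)*) => { /* expand once, with every step as input */ }
/// }
/// targets!(print_steps);
/// ```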
macro_rules! targets {
($m:ident) => {
$m! {
@ -110,6 +131,9 @@ macro_rules! targets {
}
}
// Define the `Source` enum by iterating over all the steps and peeling out just
// the types that we want to define.
macro_rules! item { ($a:item) => ($a) }
macro_rules! define_source {
@ -125,6 +149,12 @@ macro_rules! define_source {
targets!(define_source);
/// Calculate a list of all steps described by `build`.
///
/// This will inspect the flags passed in on the command line and use that to
/// build up a list of steps to execute. These steps will then be transformed
/// into a topologically sorted list which when executed left-to-right will
/// correctly sequence the entire build.
pub fn all(build: &Build) -> Vec<Step> {
let mut ret = Vec::new();
let mut all = HashSet::new();
@ -146,6 +176,8 @@ pub fn all(build: &Build) -> Vec<Step> {
}
}
/// Determines what top-level targets are requested as part of this build,
/// returning them as a list.
fn top_level(build: &Build) -> Vec<Step> {
let mut targets = Vec::new();
let stage = build.flags.stage.unwrap_or(2);
@ -161,8 +193,10 @@ fn top_level(build: &Build) -> Vec<Step> {
.unwrap_or(host.target)
};
// First, try to find steps on the command line.
add_steps(build, stage, &host, &target, &mut targets);
// If none are specified, then build everything.
if targets.len() == 0 {
let t = Step {
src: Source::Llvm { _dummy: () },
@ -260,8 +294,14 @@ impl<'a> Step<'a> {
Step { target: target, src: self.src.clone() }
}
// Define ergonomic constructors for each step defined above so they can be
// easily constructed.
targets!(constructors);
/// Mapping of all dependencies for rustbuild.
///
/// This function receives a step, the build that we're building for, and
/// then returns a list of all the dependencies of that step.
pub fn deps(&self, build: &'a Build) -> Vec<Step<'a>> {
match self.src {
Source::Rustc { stage: 0 } => {

View File

@ -8,6 +8,11 @@
// option. This file may not be copied, modified, or distributed
// except according to those terms.
//! Various utility functions used throughout rustbuild.
//!
//! Simple wrappers around the various filesystem operations used here and
//! there; not a lot of interesting happenings here, unfortunately.
use std::env;
use std::path::{Path, PathBuf};
use std::fs;
@ -16,6 +21,7 @@ use std::process::Command;
use bootstrap::{dylib_path, dylib_path_var};
use filetime::FileTime;
/// Returns the `name` as the filename of a static library for `target`.
pub fn staticlib(name: &str, target: &str) -> String {
if target.contains("windows-msvc") {
format!("{}.lib", name)
@ -24,12 +30,15 @@ pub fn staticlib(name: &str, target: &str) -> String {
}
}
/// Returns the last-modified time for `path`, or zero if it doesn't exist.
pub fn mtime(path: &Path) -> FileTime {
fs::metadata(path).map(|f| {
FileTime::from_last_modification_time(&f)
}).unwrap_or(FileTime::zero())
}
/// Copies a file from `src` to `dst`, attempting to use hard links and then
/// falling back to an actual filesystem copy if necessary.
pub fn copy(src: &Path, dst: &Path) {
let res = fs::hard_link(src, dst);
let res = res.or_else(|_| fs::copy(src, dst).map(|_| ()));
@ -39,6 +48,8 @@ pub fn copy(src: &Path, dst: &Path) {
}
}
/// Copies the `src` directory recursively to `dst`. Both are assumed to exist
/// when this function is called.
pub fn cp_r(src: &Path, dst: &Path) {
for f in t!(fs::read_dir(src)) {
let f = t!(f);
@ -66,14 +77,18 @@ pub fn exe(name: &str, target: &str) -> String {
}
}
/// Returns whether the file name given looks like a dynamic library.
pub fn is_dylib(name: &str) -> bool {
name.ends_with(".dylib") || name.ends_with(".so") || name.ends_with(".dll")
}
/// Returns the corresponding relative library directory that the compiler's
/// dylibs will be found in.
pub fn libdir(target: &str) -> &'static str {
if target.contains("windows") {"bin"} else {"lib"}
}
/// Adds a list of lookup paths to `cmd`'s dynamic library lookup path.
pub fn add_lib_path(path: Vec<PathBuf>, cmd: &mut Command) {
let mut list = dylib_path();
for path in path {
@ -82,7 +97,10 @@ pub fn add_lib_path(path: Vec<PathBuf>, cmd: &mut Command) {
cmd.env(dylib_path_var(), t!(env::join_paths(list)));
}
#[allow(dead_code)] // this will be used soon
/// Returns whether `dst` is up to date given that the file or files in `src`
/// are used to generate it.
///
/// Uses last-modified time checks to verify this.
pub fn up_to_date(src: &Path, dst: &Path) -> bool {
let threshold = mtime(dst);
let meta = t!(fs::metadata(src));

View File

@ -0,0 +1,154 @@
# Sample TOML configuration file for building Rust.
#
# All options are commented out by default in this file; the commented values
# shown are the defaults. The build system by default looks for
# `config.toml` in the current directory of a build for build configuration, but
# a custom configuration file can also be specified with `--config` to the build
# system.
# =============================================================================
# Tweaking how LLVM is compiled
# =============================================================================
[llvm]
# Indicates whether the LLVM build is a Release or Debug build
#optimize = true
# Indicates whether the LLVM assertions are enabled or not
#assertions = false
# Indicates whether ccache is used when building LLVM
#ccache = false
# If an external LLVM root is specified, we automatically check the version by
# default to make sure it's within the range that we're expecting, but setting
# this flag will indicate that this version check should not be done.
#version-check = false
# Link libstdc++ statically into librustc_llvm instead of relying on a
# dynamic version to be available.
#static-libstdcpp = false
# Tell the LLVM build system to use Ninja instead of the platform default for
# the generated build system. This can sometimes be faster than make, for
# example.
#ninja = false
# =============================================================================
# General build configuration options
# =============================================================================
[build]
# Build triple for the original snapshot compiler. This must be a compiler that
# nightlies are already produced for. The current platform must be able to run
# binaries of this build triple and the nightly will be used to bootstrap the
# first compiler.
#build = "x86_64-unknown-linux-gnu" # defaults to your host platform
# In addition to the build triple, other triples to produce full compiler
# toolchains for. Each of these triples will be bootstrapped from the build
# triple and then will continue to bootstrap themselves. This platform must
# currently be able to run all of the triples provided here.
#host = ["x86_64-unknown-linux-gnu"] # defaults to just the build triple
# In addition to all host triples, other triples to produce the standard library
# for. Each host triple will be used to produce a copy of the standard library
# for each target triple.
#target = ["x86_64-unknown-linux-gnu"] # defaults to just the build triple
# Instead of downloading the version of Cargo specified in src/nightlies.txt,
# use this Cargo binary to build all Rust code
#cargo = "/path/to/bin/cargo"
# Instead of downloading the version of the compiler specified in
# src/nightlies.txt, use this rustc binary as the stage0 snapshot compiler.
#rustc = "/path/to/bin/rustc"
# Flag to specify whether any documentation is built. If false, rustdoc and
# friends will still be compiled but they will not be used to generate any
# documentation.
#docs = true
# Indicate whether the compiler should be documented in addition to the standard
# library and facade crates.
#compiler-docs = false
# =============================================================================
# Options for compiling Rust code itself
# =============================================================================
[rust]
# Whether or not to optimize the compiler and standard library
#optimize = true
# Number of codegen units to use for each compiler invocation. A value of 0
# means "the number of cores on this machine", and 1+ is passed through to the
# compiler.
#codegen-units = 1
# Whether or not debug assertions are enabled for the compiler and standard
# library
#debug-assertions = false
# Whether or not debuginfo is emitted
#debuginfo = false
# Whether or not jemalloc is built and enabled
#use-jemalloc = true
# Whether or not jemalloc is built with its debug option set
#debug-jemalloc = false
# The default linker that will be used by the generated compiler. Note that this
# is not the linker used to link said compiler.
#default-linker = "cc"
# The default ar utility that will be used by the generated compiler if LLVM
# cannot be used. Note that this is not used to assemble said compiler.
#default-ar = "ar"
# The "channel" for the Rust build to produce. The stable/beta channels only
# allow using stable features, whereas the nightly and dev channels allow using
# nightly features
#channel = "dev"
# The root location of the MUSL installation directory. The library directory
# will also need to contain libunwind.a for an unwinding implementation.
#musl-root = "..."
# By default the `rustc` executable is built with `-Wl,-rpath` flags on Unix
# platforms to ensure that the compiler is usable by default from the build
# directory (as it links to a number of dynamic libraries). This may not be
# desired in distributions, for example.
#rpath = true
# =============================================================================
# Options for specific targets
#
# Each of the following options is scoped to the specific target triple in
# question and is used for determining how to compile each target.
# =============================================================================
[target.x86_64-unknown-linux-gnu]
# C compiler to be used to compile C code and link Rust code. Note that the
# default value is platform specific, and if not specified it may also depend on
# which platform is being cross-compiled to which.
#cc = "cc"
# C++ compiler to be used to compile C++ code (e.g. LLVM and our LLVM shims).
# This is only used for host targets.
#cxx = "c++"
# Path to the `llvm-config` binary of the installation of a custom LLVM to link
# against. Note that if this is specified we don't compile LLVM at all for this
# target.
#llvm-config = "../path/to/llvm/root/bin/llvm-config"
# Path to the custom jemalloc static library to link into the standard library
# by default. This is only used if jemalloc is still enabled above
#jemalloc = "/path/to/jemalloc/libjemalloc_pic.a"
# If this target is for Android, this option will be required to specify where
# the NDK for the target lives. This is used to find the C compiler to link and
# build native code.
#android-ndk = "/path/to/ndk"

View File

@ -8,10 +8,17 @@
// option. This file may not be copied, modified, or distributed
// except according to those terms.
//! A small helper library shared between the build system's executables
//!
//! Currently this just has some simple utilities for modifying the dynamic
//! library lookup path.
use std::env;
use std::ffi::OsString;
use std::path::PathBuf;
/// Returns the environment variable which the dynamic library lookup path
/// resides in for this platform.
pub fn dylib_path_var() -> &'static str {
if cfg!(target_os = "windows") {
"PATH"
@ -22,6 +29,8 @@ pub fn dylib_path_var() -> &'static str {
}
}
/// Parses the `dylib_path_var()` environment variable, returning a list of
/// paths that are members of this lookup path.
pub fn dylib_path() -> Vec<PathBuf> {
env::split_paths(&env::var_os(dylib_path_var()).unwrap_or(OsString::new()))
.collect()

View File

@ -8,6 +8,13 @@
// option. This file may not be copied, modified, or distributed
// except according to those terms.
//! rustbuild, the Rust build system
//!
//! This is the entry point for the build system used to compile the `rustc`
//! compiler. Lots of documentation can be found in the `README.md` file next to
//! this file, and otherwise documentation can be found throughout the `build`
//! directory in each respective module.
#![deny(warnings)]
extern crate bootstrap;
@ -32,8 +39,11 @@ fn main() {
let args = env::args().skip(1).collect::<Vec<_>>();
let flags = Flags::parse(&args);
let mut config = Config::parse(&flags.build, flags.config.clone());
// compat with `./configure` while we're still using that
if std::fs::metadata("config.mk").is_ok() {
config.update_with_config_mk();
}
Build::new(flags, config).build();
}

View File

@ -29,7 +29,6 @@ extern crate bootstrap;
use std::env;
use std::ffi::OsString;
use std::path::PathBuf;
use std::process::Command;
fn main() {
@ -54,16 +53,9 @@ fn main() {
cmd.args(&args)
.arg("--cfg").arg(format!("stage{}", env::var("RUSTC_STAGE").unwrap()));
if target.is_none() {
// Build scripts are always built with the snapshot compiler, so we need
// to be sure to set up the right path information for the OS dynamic
// linker to find the libraries in question.
if let Some(p) = env::var_os("RUSTC_SNAPSHOT_LIBDIR") {
let mut path = bootstrap::dylib_path();
path.insert(0, PathBuf::from(p));
cmd.env(bootstrap::dylib_path_var(), env::join_paths(path).unwrap());
}
} else {
if let Some(target) = target {
// The stage0 compiler has a special sysroot distinct from what we
// actually downloaded, so we just always pass the `--sysroot` option.
cmd.arg("--sysroot").arg(env::var_os("RUSTC_SYSROOT").unwrap());
// When we build Rust dylibs they're all intended for intermediate
@ -71,20 +63,23 @@ fn main() {
// linking all deps statically into the dylib.
cmd.arg("-Cprefer-dynamic");
// Help the libc crate compile by assisting it in finding the MUSL
// native libraries.
if let Some(s) = env::var_os("MUSL_ROOT") {
let mut root = OsString::from("native=");
root.push(&s);
root.push("/lib");
cmd.arg("-L").arg(&root);
}
// Pass down extra flags, commonly used to configure `-Clinker` when
// cross compiling.
if let Ok(s) = env::var("RUSTC_FLAGS") {
cmd.args(&s.split(" ").filter(|s| !s.is_empty()).collect::<Vec<_>>());
}
}
// Set various options from config.toml to configure how we're building
// code.
if let Some(target) = target {
// Set various options from config.toml to configure how we're building
// code.
if env::var("RUSTC_DEBUGINFO") == Ok("true".to_string()) {
cmd.arg("-g");
}