Auto merge of #43454 - Mark-Simulacrum:rollup, r=Mark-Simulacrum
Rollup of 11 pull requests - Successful merges: #43297, #43322, #43342, #43361, #43366, #43374, #43379, #43401, #43421, #43428, #43446 - Failed merges:
commit 598eddf4f7
configure | 8 (vendored)
@@ -560,8 +560,8 @@ case "$CFG_RELEASE_CHANNEL" in
         *-pc-windows-gnu)
             ;;
         *)
-            CFG_ENABLE_DEBUGINFO_LINES=1
-            CFG_ENABLE_DEBUGINFO_ONLY_STD=1
+            enable_if_not_disabled debuginfo-lines
+            enable_if_not_disabled debuginfo-only-std
             ;;
     esac
 
@@ -572,8 +572,8 @@ case "$CFG_RELEASE_CHANNEL" in
         *-pc-windows-gnu)
             ;;
         *)
-            CFG_ENABLE_DEBUGINFO_LINES=1
-            CFG_ENABLE_DEBUGINFO_ONLY_STD=1
+            enable_if_not_disabled debuginfo-lines
+            enable_if_not_disabled debuginfo-only-std
             ;;
     esac
     ;;
@@ -194,6 +194,14 @@ pub extern fn rust_begin_panic(_msg: core::fmt::Arguments,
 }
 ```
 
+In many cases, you may need to manually link to the `compiler_builtins` crate
+when building a `no_std` binary. You may observe this via linker error messages
+such as "```undefined reference to `__rust_probestack'```". Using this crate
+also requires enabling the library feature `compiler_builtins_lib`. You can read
+more about this [here][compiler-builtins-lib].
+
+[compiler-builtins-lib]: library-features/compiler-builtins-lib.html
+
 ## More about the language items
 
 The compiler currently makes a few assumptions about symbols which are
@@ -0,0 +1,35 @@
+# `compiler_builtins_lib`
+
+The tracking issue for this feature is: None.
+
+------------------------
+
+This feature is required to link to the `compiler_builtins` crate which contains
+"compiler intrinsics". Compiler intrinsics are software implementations of basic
+operations like multiplication of `u64`s. These intrinsics are only required on
+platforms where these operations don't directly map to a hardware instruction.
+
+You should never need to explicitly link to the `compiler_builtins` crate when
+building "std" programs as `compiler_builtins` is already in the dependency
+graph of `std`. But you may need it when building `no_std` **binary** crates. If
+you get a *linker* error like:
+
+``` text
+$PWD/src/main.rs:11: undefined reference to `__aeabi_lmul'
+$PWD/src/main.rs:11: undefined reference to `__aeabi_uldivmod'
+```
+
+That means that you need to link to this crate.
+
+When you link to this crate, make sure it only appears once in your crate
+dependency graph. Also, it doesn't matter where in the dependency graph you
+place the `compiler_builtins` crate.
+
+<!-- NOTE(ignore) doctests don't support `no_std` binaries -->
+
+``` rust,ignore
+#![feature(compiler_builtins_lib)]
+#![no_std]
+
+extern crate compiler_builtins;
+```
@@ -1252,12 +1252,13 @@ impl<T> [T] {
     ///
     /// # Current implementation
     ///
-    /// The current algorithm is based on Orson Peters' [pattern-defeating quicksort][pdqsort],
-    /// which is a quicksort variant designed to be very fast on certain kinds of patterns,
-    /// sometimes achieving linear time. It is randomized but deterministic, and falls back to
-    /// heapsort on degenerate inputs.
+    /// The current algorithm is based on [pattern-defeating quicksort][pdqsort] by Orson Peters,
+    /// which combines the fast average case of randomized quicksort with the fast worst case of
+    /// heapsort, while achieving linear time on slices with certain patterns. It uses some
+    /// randomization to avoid degenerate cases, but with a fixed seed to always provide
+    /// deterministic behavior.
     ///
-    /// It is generally faster than stable sorting, except in a few special cases, e.g. when the
+    /// It is typically faster than stable sorting, except in a few special cases, e.g. when the
     /// slice consists of several concatenated sorted sequences.
     ///
     /// # Examples
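For context, a minimal usage sketch of the slice method this documentation belongs to (assuming, from the surrounding `impl<T> [T]`, that it is `sort_unstable`):

```rust
fn main() {
    let mut v = [-5, 4, 1, -3, 2];
    // In-place unstable sort: equal elements may be reordered relative to
    // each other, which is what lets the pdqsort-based algorithm run fast.
    v.sort_unstable();
    assert_eq!(v, [-5, -3, 1, 2, 4]);
}
```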
@@ -1286,12 +1287,13 @@ impl<T> [T] {
     ///
     /// # Current implementation
     ///
-    /// The current algorithm is based on Orson Peters' [pattern-defeating quicksort][pdqsort],
-    /// which is a quicksort variant designed to be very fast on certain kinds of patterns,
-    /// sometimes achieving linear time. It is randomized but deterministic, and falls back to
-    /// heapsort on degenerate inputs.
+    /// The current algorithm is based on [pattern-defeating quicksort][pdqsort] by Orson Peters,
+    /// which combines the fast average case of randomized quicksort with the fast worst case of
+    /// heapsort, while achieving linear time on slices with certain patterns. It uses some
+    /// randomization to avoid degenerate cases, but with a fixed seed to always provide
+    /// deterministic behavior.
     ///
-    /// It is generally faster than stable sorting, except in a few special cases, e.g. when the
+    /// It is typically faster than stable sorting, except in a few special cases, e.g. when the
     /// slice consists of several concatenated sorted sequences.
     ///
     /// # Examples
@@ -1323,12 +1325,13 @@ impl<T> [T] {
     ///
     /// # Current implementation
     ///
-    /// The current algorithm is based on Orson Peters' [pattern-defeating quicksort][pdqsort],
-    /// which is a quicksort variant designed to be very fast on certain kinds of patterns,
-    /// sometimes achieving linear time. It is randomized but deterministic, and falls back to
-    /// heapsort on degenerate inputs.
+    /// The current algorithm is based on [pattern-defeating quicksort][pdqsort] by Orson Peters,
+    /// which combines the fast average case of randomized quicksort with the fast worst case of
+    /// heapsort, while achieving linear time on slices with certain patterns. It uses some
+    /// randomization to avoid degenerate cases, but with a fixed seed to always provide
+    /// deterministic behavior.
     ///
-    /// It is generally faster than stable sorting, except in a few special cases, e.g. when the
+    /// It is typically faster than stable sorting, except in a few special cases, e.g. when the
     /// slice consists of several concatenated sorted sequences.
     ///
     /// # Examples
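This same documentation block is repeated across the variants of the method; assuming those are `sort_unstable_by` and `sort_unstable_by_key`, a minimal sketch of the key-based form:

```rust
fn main() {
    let mut v = [-5i32, 4, 1, -3, 2];
    // Sort by a derived key; ties between equal keys may be reordered.
    v.sort_unstable_by_key(|k| k.abs());
    assert_eq!(v, [1, 2, -3, 4, -5]);
}
```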
@@ -847,7 +847,7 @@ macro_rules! generate_pattern_iterators {
         internal:
             $internal_iterator:ident yielding ($iterty:ty);
 
-        // Kind of delgation - either single ended or double ended
+        // Kind of delegation - either single ended or double ended
         delegate $($t:tt)*
     } => {
         $(#[$forward_iterator_attribute])*
@@ -83,7 +83,7 @@ pub enum SearchStep {
     /// Note that there might be more than one `Reject` between two `Match`es,
     /// there is no requirement for them to be combined into one.
     Reject(usize, usize),
-    /// Expresses that every byte of the haystack has been visted, ending
+    /// Expresses that every byte of the haystack has been visited, ending
     /// the iteration.
     Done
 }
@@ -101,7 +101,7 @@ pub enum SearchStep {
 /// the haystack. This enables consumers of this trait to
 /// slice the haystack without additional runtime checks.
 pub unsafe trait Searcher<'a> {
-    /// Getter for the underlaying string to be searched in
+    /// Getter for the underlying string to be searched in
     ///
     /// Will always return the same `&str`
     fn haystack(&self) -> &'a str;
@@ -1153,7 +1153,7 @@ impl TwoWaySearcher {
     // The maximal suffix is a possible critical factorization (u', v') of `arr`.
     //
     // Returns `i` where `i` is the starting index of v', from the back;
-    // returns immedately when a period of `known_period` is reached.
+    // returns immediately when a period of `known_period` is reached.
     //
     // `order_greater` determines if lexical order is `<` or `>`. Both
     // orders must be computed -- the ordering with the largest `i` gives
@@ -1 +1 @@
-Subproject commit 2015cf17a6a2a2280e93d9c57214ba92dbbaf42f
+Subproject commit ec1e5ab1ef8baca57f8776bbebd9343572a87082
src/librustc/build.rs | 15 (new file)
@@ -0,0 +1,15 @@
+// Copyright 2017 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+fn main() {
+    println!("cargo:rerun-if-changed=build.rs");
+    println!("cargo:rerun-if-env-changed=CFG_LIBDIR_RELATIVE");
+    println!("cargo:rerun-if-env-changed=CFG_COMPILER_HOST_TRIPLE");
+}
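These `cargo:` lines are Cargo's change-detection protocol: `rerun-if-changed` and `rerun-if-env-changed` make Cargo re-run the build script only when the named file or environment variable changes. A minimal sketch of the pattern (the variable name here is illustrative, not one the compiler uses):

```rust
use std::env;

fn main() {
    // Re-run only when the script itself or the named env var changes.
    println!("cargo:rerun-if-changed=build.rs");
    println!("cargo:rerun-if-env-changed=MY_CONFIG_VAR");

    // Optionally forward the value to the crate being compiled.
    if let Ok(val) = env::var("MY_CONFIG_VAR") {
        println!("cargo:rustc-env=MY_CONFIG_VAR={}", val);
    }
}
```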
@@ -1708,7 +1708,7 @@ not apply to structs.
 representation of enums isn't strictly defined in Rust, and this attribute
 won't work on enums.
 
-`#[repr(simd)]` will give a struct consisting of a homogenous series of machine
+`#[repr(simd)]` will give a struct consisting of a homogeneous series of machine
 types (i.e. `u8`, `i32`, etc) a representation that permits vectorization via
 SIMD. This doesn't make much sense for enums since they don't consist of a
 single list of data.
@@ -136,6 +136,10 @@ impl DefIndex {
     pub fn as_array_index(&self) -> usize {
         (self.0 & !DEF_INDEX_HI_START.0) as usize
     }
+
+    pub fn from_array_index(i: usize, address_space: DefIndexAddressSpace) -> DefIndex {
+        DefIndex::new(address_space.start() + i)
+    }
 }
 
 /// The start of the "high" range of DefIndexes.
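A self-contained sketch of the two-address-space scheme that `as_array_index` and the new `from_array_index` implement: a high bit partitions one index type into two spaces, so each space can grow an independent `Vec` while sharing `DefIndex`. The constant below is illustrative, not necessarily the compiler's actual value:

```rust
const DEF_INDEX_HI_START: u32 = 1 << 31;

fn from_array_index(i: u32, high_space: bool) -> u32 {
    i + if high_space { DEF_INDEX_HI_START } else { 0 }
}

fn as_array_index(def_index: u32) -> u32 {
    // Mask off the address-space bit to recover the position in the Vec.
    def_index & !DEF_INDEX_HI_START
}

fn main() {
    let idx = from_array_index(5, true);
    assert_eq!(as_array_index(idx), 5);
}
```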
@@ -18,7 +18,7 @@ use hir;
 use hir::def_id::{CrateNum, DefId, DefIndex, LOCAL_CRATE, DefIndexAddressSpace,
                   CRATE_DEF_INDEX};
 use ich::Fingerprint;
-use rustc_data_structures::fx::FxHashMap;
+use rustc_data_structures::fx::{FxHashMap, FxHashSet};
 use rustc_data_structures::indexed_vec::IndexVec;
 use rustc_data_structures::stable_hasher::StableHasher;
 use serialize::{Encodable, Decodable, Encoder, Decoder};
@@ -36,7 +36,6 @@ use util::nodemap::NodeMap;
 /// There is one DefPathTable for each crate.
 pub struct DefPathTable {
     index_to_key: [Vec<DefKey>; 2],
-    key_to_index: FxHashMap<DefKey, DefIndex>,
     def_path_hashes: [Vec<DefPathHash>; 2],
 }
 
@@ -47,7 +46,6 @@ impl Clone for DefPathTable {
         DefPathTable {
             index_to_key: [self.index_to_key[0].clone(),
                            self.index_to_key[1].clone()],
-            key_to_index: self.key_to_index.clone(),
             def_path_hashes: [self.def_path_hashes[0].clone(),
                               self.def_path_hashes[1].clone()],
         }
@@ -65,10 +63,9 @@ impl DefPathTable {
             let index_to_key = &mut self.index_to_key[address_space.index()];
             let index = DefIndex::new(index_to_key.len() + address_space.start());
             debug!("DefPathTable::insert() - {:?} <-> {:?}", key, index);
-            index_to_key.push(key.clone());
+            index_to_key.push(key);
             index
         };
-        self.key_to_index.insert(key, index);
         self.def_path_hashes[address_space.index()].push(def_path_hash);
         debug_assert!(self.def_path_hashes[address_space.index()].len() ==
                       self.index_to_key[address_space.index()].len());
@@ -87,47 +84,6 @@ impl DefPathTable {
         [index.as_array_index()]
     }
 
-    #[inline(always)]
-    pub fn def_index_for_def_key(&self, key: &DefKey) -> Option<DefIndex> {
-        self.key_to_index.get(key).cloned()
-    }
-
-    #[inline(always)]
-    pub fn contains_key(&self, key: &DefKey) -> bool {
-        self.key_to_index.contains_key(key)
-    }
-
-    pub fn retrace_path(&self,
-                        path_data: &[DisambiguatedDefPathData])
-                        -> Option<DefIndex> {
-        let root_key = DefKey {
-            parent: None,
-            disambiguated_data: DisambiguatedDefPathData {
-                data: DefPathData::CrateRoot,
-                disambiguator: 0,
-            },
-        };
-
-        let root_index = self.key_to_index
-                             .get(&root_key)
-                             .expect("no root key?")
-                             .clone();
-
-        debug!("retrace_path: root_index={:?}", root_index);
-
-        let mut index = root_index;
-        for data in path_data {
-            let key = DefKey { parent: Some(index), disambiguated_data: data.clone() };
-            debug!("retrace_path: key={:?}", key);
-            match self.key_to_index.get(&key) {
-                Some(&i) => index = i,
-                None => return None,
-            }
-        }
-
-        Some(index)
-    }
-
     pub fn add_def_path_hashes_to(&self,
                                   cnum: CrateNum,
                                   out: &mut FxHashMap<DefPathHash, DefId>) {
@@ -149,7 +105,7 @@ impl DefPathTable {
     }
 
     pub fn size(&self) -> usize {
-        self.key_to_index.len()
+        self.index_to_key.iter().map(|v| v.len()).sum()
     }
 }
 
@@ -179,19 +135,8 @@ impl Decodable for DefPathTable {
         let index_to_key = [index_to_key_lo, index_to_key_hi];
         let def_path_hashes = [def_path_hashes_lo, def_path_hashes_hi];
 
-        let mut key_to_index = FxHashMap();
-
-        for space in &[DefIndexAddressSpace::Low, DefIndexAddressSpace::High] {
-            key_to_index.extend(index_to_key[space.index()]
-                .iter()
-                .enumerate()
-                .map(|(index, key)| (key.clone(),
-                                     DefIndex::new(index + space.start()))))
-        }
-
         Ok(DefPathTable {
             index_to_key,
-            key_to_index,
             def_path_hashes,
         })
     }
@@ -208,6 +153,7 @@ pub struct Definitions {
     pub(super) node_to_hir_id: IndexVec<ast::NodeId, hir::HirId>,
     macro_def_scopes: FxHashMap<Mark, DefId>,
     expansions: FxHashMap<DefIndex, Mark>,
+    keys_created: FxHashSet<DefKey>,
 }
 
 // Unfortunately we have to provide a manual impl of Clone because of the
@@ -224,6 +170,7 @@ impl Clone for Definitions {
             node_to_hir_id: self.node_to_hir_id.clone(),
             macro_def_scopes: self.macro_def_scopes.clone(),
             expansions: self.expansions.clone(),
+            keys_created: self.keys_created.clone(),
         }
     }
 }
@@ -448,7 +395,6 @@ impl Definitions {
         Definitions {
             table: DefPathTable {
                 index_to_key: [vec![], vec![]],
-                key_to_index: FxHashMap(),
                 def_path_hashes: [vec![], vec![]],
             },
             node_to_def_index: NodeMap(),
@@ -456,6 +402,7 @@ impl Definitions {
             node_to_hir_id: IndexVec::new(),
             macro_def_scopes: FxHashMap(),
             expansions: FxHashMap(),
+            keys_created: FxHashSet(),
         }
     }
 
@@ -478,10 +425,6 @@ impl Definitions {
         self.table.def_path_hash(index)
     }
 
-    pub fn def_index_for_def_key(&self, key: DefKey) -> Option<DefIndex> {
-        self.table.def_index_for_def_key(&key)
-    }
-
     /// Returns the path from the crate root to `index`. The root
     /// nodes are not included in the path (i.e., this will be an
     /// empty vector for the crate root). For an inlined item, this
@@ -583,9 +526,10 @@ impl Definitions {
             }
         };
 
-        while self.table.contains_key(&key) {
+        while self.keys_created.contains(&key) {
             key.disambiguated_data.disambiguator += 1;
         }
+        self.keys_created.insert(key.clone());
 
         let parent_hash = self.table.def_path_hash(parent);
         let def_path_hash = key.compute_stable_hash(parent_hash);
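The hunk above switches disambiguation from a table lookup to a dedicated `keys_created` set: bump the disambiguator until the key is fresh, then record it. A self-contained sketch of that pattern, using std's `HashSet` in place of `FxHashSet` and an illustrative key type:

```rust
use std::collections::HashSet;

#[derive(Clone, PartialEq, Eq, Hash)]
struct Key {
    name: String,
    disambiguator: u32,
}

// Bump the disambiguator until the key is unique, then record it.
fn disambiguate(keys_created: &mut HashSet<Key>, mut key: Key) -> Key {
    while keys_created.contains(&key) {
        key.disambiguator += 1;
    }
    keys_created.insert(key.clone());
    key
}

fn main() {
    let mut created = HashSet::new();
    let a = disambiguate(&mut created, Key { name: "f".into(), disambiguator: 0 });
    let b = disambiguate(&mut created, Key { name: "f".into(), disambiguator: 0 });
    assert_eq!((a.disambiguator, b.disambiguator), (0, 1));
}
```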
@@ -710,6 +654,8 @@ macro_rules! define_global_metadata_kind {
         $($variant),*
     }
 
+    const GLOBAL_MD_ADDRESS_SPACE: DefIndexAddressSpace = DefIndexAddressSpace::High;
+
     impl GlobalMetaDataKind {
         fn allocate_def_indices(definitions: &mut Definitions) {
             $({
@@ -718,7 +664,7 @@ macro_rules! define_global_metadata_kind {
                     CRATE_DEF_INDEX,
                     ast::DUMMY_NODE_ID,
                     DefPathData::GlobalMetaData(instance.name()),
-                    DefIndexAddressSpace::High,
+                    GLOBAL_MD_ADDRESS_SPACE,
                     Mark::root()
                 );
 
|
|||||||
}
|
}
|
||||||
};
|
};
|
||||||
|
|
||||||
def_path_table.key_to_index[&def_key]
|
// These DefKeys are all right after the root,
|
||||||
|
// so a linear search is fine.
|
||||||
|
let index = def_path_table.index_to_key[GLOBAL_MD_ADDRESS_SPACE.index()]
|
||||||
|
.iter()
|
||||||
|
.position(|k| *k == def_key)
|
||||||
|
.unwrap();
|
||||||
|
|
||||||
|
DefIndex::from_array_index(index, GLOBAL_MD_ADDRESS_SPACE)
|
||||||
}
|
}
|
||||||
|
|
||||||
fn name(&self) -> Symbol {
|
fn name(&self) -> Symbol {
|
||||||
|
@@ -17,7 +17,7 @@ pub use self::definitions::{Definitions, DefKey, DefPath, DefPathData,
 
 use dep_graph::{DepGraph, DepNode, DepKind};
 
-use hir::def_id::{CRATE_DEF_INDEX, DefId, DefIndex, DefIndexAddressSpace};
+use hir::def_id::{CRATE_DEF_INDEX, DefId, DefIndexAddressSpace};
 
 use syntax::abi::Abi;
 use syntax::ast::{self, Name, NodeId, CRATE_NODE_ID};
@@ -377,10 +377,6 @@ impl<'hir> Map<'hir> {
         self.definitions.def_path(def_id.index)
     }
 
-    pub fn def_index_for_def_key(&self, def_key: DefKey) -> Option<DefIndex> {
-        self.definitions.def_index_for_def_key(def_key)
-    }
-
     pub fn local_def_id(&self, node: NodeId) -> DefId {
         self.opt_local_def_id(node).unwrap_or_else(|| {
             bug!("local_def_id: no entry for `{}`, which has a map of `{:?}`",
@@ -25,8 +25,7 @@
 use hir::def;
 use hir::def_id::{CrateNum, DefId, DefIndex};
 use hir::map as hir_map;
-use hir::map::definitions::{Definitions, DefKey, DisambiguatedDefPathData,
-                            DefPathTable};
+use hir::map::definitions::{Definitions, DefKey, DefPathTable};
 use hir::svh::Svh;
 use ich;
 use middle::lang_items;
@@ -269,10 +268,6 @@ pub trait CrateStore {
     fn is_no_builtins(&self, cnum: CrateNum) -> bool;
 
     // resolve
-    fn retrace_path(&self,
-                    cnum: CrateNum,
-                    path_data: &[DisambiguatedDefPathData])
-                    -> Option<DefId>;
     fn def_key(&self, def: DefId) -> DefKey;
     fn def_path(&self, def: DefId) -> hir_map::DefPath;
     fn def_path_hash(&self, def: DefId) -> hir_map::DefPathHash;
@@ -392,13 +387,6 @@ impl CrateStore for DummyCrateStore {
     fn is_no_builtins(&self, cnum: CrateNum) -> bool { bug!("is_no_builtins") }
 
     // resolve
-    fn retrace_path(&self,
-                    cnum: CrateNum,
-                    path_data: &[DisambiguatedDefPathData])
-                    -> Option<DefId> {
-        None
-    }
-
     fn def_key(&self, def: DefId) -> DefKey { bug!("def_key") }
     fn def_path(&self, def: DefId) -> hir_map::DefPath {
         bug!("relative_def_path")
@@ -88,7 +88,7 @@ pub fn translate_substs<'a, 'gcx, 'tcx>(infcx: &InferCtxt<'a, 'gcx, 'tcx>,
     // vary across impls
     let target_substs = match target_node {
         specialization_graph::Node::Impl(target_impl) => {
-            // no need to translate if we're targetting the impl we started with
+            // no need to translate if we're targeting the impl we started with
             if source_impl == target_impl {
                 return source_substs;
             }
@@ -96,7 +96,7 @@ pub fn translate_substs<'a, 'gcx, 'tcx>(infcx: &InferCtxt<'a, 'gcx, 'tcx>,
             fulfill_implication(infcx, param_env, source_trait_ref, target_impl)
                 .unwrap_or_else(|_| {
                     bug!("When translating substitutions for specialization, the expected \
-                          specializaiton failed to hold")
+                          specialization failed to hold")
                 })
         }
         specialization_graph::Node::Trait(..) => source_trait_ref.substs,
@@ -107,7 +107,7 @@ pub fn translate_substs<'a, 'gcx, 'tcx>(infcx: &InferCtxt<'a, 'gcx, 'tcx>,
 }
 
 /// Given a selected impl described by `impl_data`, returns the
-/// definition and substitions for the method with the name `name`
+/// definition and substitutions for the method with the name `name`
 /// the kind `kind`, and trait method substitutions `substs`, in
 /// that impl, a less specialized impl, or the trait default,
 /// whichever applies.
@@ -305,7 +305,7 @@ pub(super) fn specialization_graph_provider<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx
     // The coherence checking implementation seems to rely on impls being
     // iterated over (roughly) in definition order, so we are sorting by
     // negated CrateNum (so remote definitions are visited first) and then
-    // by a flattend version of the DefIndex.
+    // by a flattened version of the DefIndex.
     trait_impls.sort_unstable_by_key(|def_id| {
         (-(def_id.krate.as_u32() as i64),
          def_id.index.address_space().index(),
@@ -18,7 +18,7 @@ use hir::TraitMap;
 use hir::def::{Def, ExportMap};
 use hir::def_id::{CrateNum, DefId, LOCAL_CRATE};
 use hir::map as hir_map;
-use hir::map::{DisambiguatedDefPathData, DefPathHash};
+use hir::map::DefPathHash;
 use middle::free_region::FreeRegionMap;
 use middle::lang_items;
 use middle::resolve_lifetime;
@@ -570,23 +570,6 @@ impl<'a, 'gcx, 'tcx> TyCtxt<'a, 'gcx, 'tcx> {
         }
     }
 
-    pub fn retrace_path(self,
-                        krate: CrateNum,
-                        path_data: &[DisambiguatedDefPathData])
-                        -> Option<DefId> {
-        debug!("retrace_path(path={:?}, krate={:?})", path_data, self.crate_name(krate));
-
-        if krate == LOCAL_CRATE {
-            self.hir
-                .definitions()
-                .def_path_table()
-                .retrace_path(path_data)
-                .map(|def_index| DefId { krate: krate, index: def_index })
-        } else {
-            self.sess.cstore.retrace_path(krate, path_data)
-        }
-    }
-
     pub fn alloc_generics(self, generics: ty::Generics) -> &'gcx ty::Generics {
         self.global_arenas.generics.alloc(generics)
     }
src/librustc_back/build.rs | 15 (new file)
@@ -0,0 +1,15 @@
+// Copyright 2017 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+fn main() {
+    println!("cargo:rerun-if-changed=build.rs");
+    println!("cargo:rerun-if-env-changed=CFG_DEFAULT_LINKER");
+    println!("cargo:rerun-if-env-changed=CFG_DEFAULT_AR");
+}
@@ -117,11 +117,11 @@ enum NodeState {
     /// non-ambiguous result.
     Pending,
 
-    /// This obligation was selected successfuly, but may or
+    /// This obligation was selected successfully, but may or
     /// may not have subobligations.
     Success,
 
-    /// This obligation was selected sucessfully, but it has
+    /// This obligation was selected successfully, but it has
     /// a pending subobligation.
     Waiting,
 
src/librustc_driver/build.rs | 17 (new file)
@@ -0,0 +1,17 @@
+// Copyright 2017 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+fn main() {
+    println!("cargo:rerun-if-changed=build.rs");
+    println!("cargo:rerun-if-env-changed=CFG_RELEASE");
+    println!("cargo:rerun-if-env-changed=CFG_VERSION");
+    println!("cargo:rerun-if-env-changed=CFG_VER_DATE");
+    println!("cargo:rerun-if-env-changed=CFG_VER_HASH");
+}
src/librustc_incremental/build.rs | 14 (new file)
@@ -0,0 +1,14 @@
+// Copyright 2017 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+fn main() {
+    println!("cargo:rerun-if-changed=build.rs");
+    println!("cargo:rerun-if-env-changed=CFG_VERSION");
+}
src/librustc_metadata/build.rs | 14 (new file)
@@ -0,0 +1,14 @@
+// Copyright 2017 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+fn main() {
+    println!("cargo:rerun-if-changed=build.rs");
+    println!("cargo:rerun-if-env-changed=CFG_VERSION");
+}
@@ -22,7 +22,7 @@ use rustc::session::Session;
 use rustc::ty::{self, TyCtxt};
 use rustc::ty::maps::Providers;
 use rustc::hir::def_id::{CrateNum, DefId, DefIndex, CRATE_DEF_INDEX, LOCAL_CRATE};
-use rustc::hir::map::{DefKey, DefPath, DisambiguatedDefPathData, DefPathHash};
+use rustc::hir::map::{DefKey, DefPath, DefPathHash};
 use rustc::hir::map::blocks::FnLikeNode;
 use rustc::hir::map::definitions::{DefPathTable, GlobalMetaDataKind};
 use rustc::util::nodemap::{NodeSet, DefIdMap};
@@ -307,16 +307,6 @@ impl CrateStore for cstore::CStore {
         self.get_crate_data(cnum).is_no_builtins(&self.dep_graph)
     }
 
-    fn retrace_path(&self,
-                    cnum: CrateNum,
-                    path: &[DisambiguatedDefPathData])
-                    -> Option<DefId> {
-        let cdata = self.get_crate_data(cnum);
-        cdata.def_path_table
-             .retrace_path(&path)
-             .map(|index| DefId { krate: cnum, index: index })
-    }
-
     /// Returns the `DefKey` for a given `DefId`. This indicates the
     /// parent `DefId` as well as some idea of what kind of data the
     /// `DefId` refers to.
@@ -259,7 +259,7 @@ impl<'a, 'tcx> MoveDataBuilder<'a, 'tcx> {
     /// NOTE: lvalues behind references *do not* get a move path, which is
     /// problematic for borrowck.
     ///
-    /// Maybe we should have seperate "borrowck" and "moveck" modes.
+    /// Maybe we should have separate "borrowck" and "moveck" modes.
     fn move_path_for(&mut self, lval: &Lvalue<'tcx>)
                      -> Result<MovePathIndex, MovePathError>
     {
@@ -238,7 +238,7 @@ impl Uniform {
 
 pub trait LayoutExt<'tcx> {
     fn is_aggregate(&self) -> bool;
-    fn homogenous_aggregate<'a>(&self, ccx: &CrateContext<'a, 'tcx>) -> Option<Reg>;
+    fn homogeneous_aggregate<'a>(&self, ccx: &CrateContext<'a, 'tcx>) -> Option<Reg>;
 }
 
 impl<'tcx> LayoutExt<'tcx> for TyLayout<'tcx> {
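The renamed helper decides whether an aggregate can be passed in registers: it is homogeneous when every leaf field classifies to the same register kind. A toy, self-contained model of that computation (the real code works on `TyLayout` and is considerably more involved):

```rust
#[derive(Clone, Copy, PartialEq, Debug)]
enum Reg { Int, Float }

enum Layout {
    Scalar(Reg),
    Aggregate(Vec<Layout>),
}

fn homogeneous_aggregate(l: &Layout) -> Option<Reg> {
    match l {
        Layout::Scalar(r) => Some(*r),
        Layout::Aggregate(fields) => {
            let mut unit = None;
            for f in fields {
                let r = homogeneous_aggregate(f)?; // every field must classify
                match unit {
                    None => unit = Some(r),
                    Some(u) if u == r => {}
                    _ => return None, // mixed kinds: not homogeneous
                }
            }
            unit
        }
    }
}

fn main() {
    let pair = Layout::Aggregate(vec![Layout::Scalar(Reg::Float), Layout::Scalar(Reg::Float)]);
    assert_eq!(homogeneous_aggregate(&pair), Some(Reg::Float));
    let mixed = Layout::Aggregate(vec![Layout::Scalar(Reg::Int), Layout::Scalar(Reg::Float)]);
    assert_eq!(homogeneous_aggregate(&mixed), None);
}
```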
@@ -258,7 +258,7 @@ impl<'tcx> LayoutExt<'tcx> for TyLayout<'tcx> {
         }
     }
 
-    fn homogenous_aggregate<'a>(&self, ccx: &CrateContext<'a, 'tcx>) -> Option<Reg> {
+    fn homogeneous_aggregate<'a>(&self, ccx: &CrateContext<'a, 'tcx>) -> Option<Reg> {
         match *self.layout {
             // The primitives for this algorithm.
             Layout::Scalar { value, .. } |
@@ -291,7 +291,7 @@ impl<'tcx> LayoutExt<'tcx> for TyLayout<'tcx> {
 
             Layout::Array { count, .. } => {
                 if count > 0 {
-                    self.field(ccx, 0).homogenous_aggregate(ccx)
+                    self.field(ccx, 0).homogeneous_aggregate(ccx)
                 } else {
                     None
                 }
@@ -307,8 +307,8 @@ impl<'tcx> LayoutExt<'tcx> for TyLayout<'tcx> {
                     }
 
                     let field = self.field(ccx, i);
-                    match (result, field.homogenous_aggregate(ccx)) {
-                        // The field itself must be a homogenous aggregate.
+                    match (result, field.homogeneous_aggregate(ccx)) {
+                        // The field itself must be a homogeneous aggregate.
                         (_, None) => return None,
                         // If this is the first field, record the unit.
                         (None, Some(unit)) => {
@@ -344,8 +344,8 @@ impl<'tcx> LayoutExt<'tcx> for TyLayout<'tcx> {
 
                 for i in 0..self.field_count() {
                     let field = self.field(ccx, i);
-                    match (result, field.homogenous_aggregate(ccx)) {
-                        // The field itself must be a homogenous aggregate.
+                    match (result, field.homogeneous_aggregate(ccx)) {
+                        // The field itself must be a homogeneous aggregate.
                         (_, None) => return None,
                         // If this is the first field, record the unit.
                         (None, Some(unit)) => {
@@ -830,7 +830,7 @@ impl<'a, 'tcx> FnType<'tcx> {
 
             let size = arg.layout.size(ccx);
 
-            if let Some(unit) = arg.layout.homogenous_aggregate(ccx) {
+            if let Some(unit) = arg.layout.homogeneous_aggregate(ccx) {
                 // Replace newtypes with their inner-most type.
                 if unit.size == size {
                     // Needs a cast as we've unpacked a newtype.
src/librustc_trans/build.rs | 16 (new file)
@@ -0,0 +1,16 @@
+// Copyright 2017 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+fn main() {
+    println!("cargo:rerun-if-changed=build.rs");
+    println!("cargo:rerun-if-env-changed=CFG_VERSION");
+    println!("cargo:rerun-if-env-changed=CFG_PREFIX");
+    println!("cargo:rerun-if-env-changed=CFG_LLVM_ROOT");
+}
@@ -11,9 +11,9 @@
 use abi::{FnType, ArgType, LayoutExt, Reg, RegKind, Uniform};
 use context::CrateContext;
 
-fn is_homogenous_aggregate<'a, 'tcx>(ccx: &CrateContext<'a, 'tcx>, arg: &mut ArgType<'tcx>)
+fn is_homogeneous_aggregate<'a, 'tcx>(ccx: &CrateContext<'a, 'tcx>, arg: &mut ArgType<'tcx>)
                                      -> Option<Uniform> {
-    arg.layout.homogenous_aggregate(ccx).and_then(|unit| {
+    arg.layout.homogeneous_aggregate(ccx).and_then(|unit| {
         let size = arg.layout.size(ccx);
 
         // Ensure we have at most four uniquely addressable members.
@@ -43,7 +43,7 @@ fn classify_ret_ty<'a, 'tcx>(ccx: &CrateContext<'a, 'tcx>, ret: &mut ArgType<'tc
         ret.extend_integer_width_to(32);
         return;
     }
-    if let Some(uniform) = is_homogenous_aggregate(ccx, ret) {
+    if let Some(uniform) = is_homogeneous_aggregate(ccx, ret) {
         ret.cast_to(ccx, uniform);
         return;
     }
@@ -74,7 +74,7 @@ fn classify_arg_ty<'a, 'tcx>(ccx: &CrateContext<'a, 'tcx>, arg: &mut ArgType<'tc
         arg.extend_integer_width_to(32);
         return;
     }
-    if let Some(uniform) = is_homogenous_aggregate(ccx, arg) {
+    if let Some(uniform) = is_homogeneous_aggregate(ccx, arg) {
         arg.cast_to(ccx, uniform);
         return;
     }
@@ -18,7 +18,7 @@ use context::CrateContext;
 
 fn classify_ret_ty<'a, 'tcx>(ccx: &CrateContext<'a, 'tcx>, ret: &mut ArgType<'tcx>) {
     if ret.layout.is_aggregate() {
-        if let Some(unit) = ret.layout.homogenous_aggregate(ccx) {
+        if let Some(unit) = ret.layout.homogeneous_aggregate(ccx) {
             let size = ret.layout.size(ccx);
             if unit.size == size {
                 ret.cast_to(ccx, Uniform {
@@ -15,9 +15,9 @@
 use abi::{FnType, ArgType, LayoutExt, Reg, RegKind, Uniform};
 use context::CrateContext;
 
-fn is_homogenous_aggregate<'a, 'tcx>(ccx: &CrateContext<'a, 'tcx>, arg: &mut ArgType<'tcx>)
+fn is_homogeneous_aggregate<'a, 'tcx>(ccx: &CrateContext<'a, 'tcx>, arg: &mut ArgType<'tcx>)
                                      -> Option<Uniform> {
-    arg.layout.homogenous_aggregate(ccx).and_then(|unit| {
+    arg.layout.homogeneous_aggregate(ccx).and_then(|unit| {
         let size = arg.layout.size(ccx);
 
         // Ensure we have at most eight uniquely addressable members.
@@ -53,7 +53,7 @@ fn classify_ret_ty<'a, 'tcx>(ccx: &CrateContext<'a, 'tcx>, ret: &mut ArgType<'tc
         ret.make_indirect(ccx);
     }
 
-    if let Some(uniform) = is_homogenous_aggregate(ccx, ret) {
+    if let Some(uniform) = is_homogeneous_aggregate(ccx, ret) {
         ret.cast_to(ccx, uniform);
         return;
     }
@@ -86,7 +86,7 @@ fn classify_arg_ty<'a, 'tcx>(ccx: &CrateContext<'a, 'tcx>, arg: &mut ArgType<'tc
         return;
     }
 
-    if let Some(uniform) = is_homogenous_aggregate(ccx, arg) {
+    if let Some(uniform) = is_homogeneous_aggregate(ccx, arg) {
         arg.cast_to(ccx, uniform);
         return;
     }
@@ -13,9 +13,9 @@
 use abi::{FnType, ArgType, LayoutExt, Reg, RegKind, Uniform};
 use context::CrateContext;
 
-fn is_homogenous_aggregate<'a, 'tcx>(ccx: &CrateContext<'a, 'tcx>, arg: &mut ArgType<'tcx>)
+fn is_homogeneous_aggregate<'a, 'tcx>(ccx: &CrateContext<'a, 'tcx>, arg: &mut ArgType<'tcx>)
                                      -> Option<Uniform> {
-    arg.layout.homogenous_aggregate(ccx).and_then(|unit| {
+    arg.layout.homogeneous_aggregate(ccx).and_then(|unit| {
         let size = arg.layout.size(ccx);
 
         // Ensure we have at most eight uniquely addressable members.
@@ -46,7 +46,7 @@ fn classify_ret_ty<'a, 'tcx>(ccx: &CrateContext<'a, 'tcx>, ret: &mut ArgType<'tc
         return;
     }
 
-    if let Some(uniform) = is_homogenous_aggregate(ccx, ret) {
+    if let Some(uniform) = is_homogeneous_aggregate(ccx, ret) {
         ret.cast_to(ccx, uniform);
         return;
     }
@@ -80,7 +80,7 @@ fn classify_arg_ty<'a, 'tcx>(ccx: &CrateContext<'a, 'tcx>, arg: &mut ArgType<'tc
         return;
     }
 
-    if let Some(uniform) = is_homogenous_aggregate(ccx, arg) {
+    if let Some(uniform) = is_homogeneous_aggregate(ccx, arg) {
         arg.cast_to(ccx, uniform);
         return;
     }
@@ -74,7 +74,7 @@ pub fn compute_abi_info<'a, 'tcx>(ccx: &CrateContext<'a, 'tcx>,
         if arg.is_ignore() || arg.is_indirect() { continue; }
 
         // At this point we know this must be a primitive of sorts.
-        let unit = arg.layout.homogenous_aggregate(ccx).unwrap();
+        let unit = arg.layout.homogeneous_aggregate(ccx).unwrap();
         let size = arg.layout.size(ccx);
         assert_eq!(unit.size, size);
         if unit.kind == RegKind::Float {
@@ -251,7 +251,7 @@ pub fn partition<'a, 'tcx, I>(scx: &SharedCrateContext<'a, 'tcx>,
                                                           exported_symbols,
                                                           trans_items);
 
-    debug_dump(tcx, "INITIAL PARTITONING:", initial_partitioning.codegen_units.iter());
+    debug_dump(tcx, "INITIAL PARTITIONING:", initial_partitioning.codegen_units.iter());
 
     // If the partitioning should produce a fixed count of codegen units, merge
     // until that count is reached.
@@ -261,7 +261,7 @@ pub fn partition<'a, 'tcx, I>(scx: &SharedCrateContext<'a, 'tcx>,
         debug_dump(tcx, "POST MERGING:", initial_partitioning.codegen_units.iter());
     }
 
-    // In the next step, we use the inlining map to determine which addtional
+    // In the next step, we use the inlining map to determine which additional
     // translation items have to go into each codegen unit. These additional
     // translation items can be drop-glue, functions from external crates, and
     // local functions the definition of which is marked with #[inline].
@@ -1065,6 +1065,7 @@
         block("macro", "Macros");
         block("struct", "Structs");
         block("enum", "Enums");
+        block("union", "Unions");
        block("constant", "Constants");
        block("static", "Statics");
        block("trait", "Traits");
@@ -729,7 +729,7 @@ pub trait SpecializationError {
     /// `S` is the encoder/decoder state type,
     /// `T` is the type being encoded/decoded, and
     /// the arguments are the names of the trait
-    /// and method that should've been overriden.
+    /// and method that should've been overridden.
     fn not_found<S, T: ?Sized>(trait_name: &'static str,
                                method_name: &'static str) -> Self;
 }
@@ -737,7 +737,7 @@ pub trait SpecializationError {
 impl<E> SpecializationError for E {
     default fn not_found<S, T: ?Sized>(trait_name: &'static str,
                                        method_name: &'static str) -> E {
-        panic!("missing specializaiton: `<{} as {}<{}>>::{}` not overriden",
+        panic!("missing specialization: `<{} as {}<{}>>::{}` not overridden",
               unsafe { intrinsics::type_name::<S>() },
               trait_name,
               unsafe { intrinsics::type_name::<T>() },
@@ -203,7 +203,7 @@ const DISPLACEMENT_THRESHOLD: usize = 128;
 // so we round that up to 128.
 //
 // At a load factor of α, the odds of finding the target bucket after exactly n
-// unsuccesful probes[1] are
+// unsuccessful probes[1] are
 //
 // Pr_α{displacement = n} =
 //     (1 - α) / α * ∑_{k≥1} e^(-kα) * (kα)^(k+n) / (k + n)! * (1 - kα / (k + n + 1))
@@ -37,7 +37,7 @@ use memchr;
 /// use std::fs::File;
 ///
 /// # fn foo() -> std::io::Result<()> {
-/// let mut f = File::open("log.txt")?;
+/// let f = File::open("log.txt")?;
 /// let mut reader = BufReader::new(f);
 ///
 /// let mut line = String::new();
@@ -64,8 +64,8 @@ impl<R: Read> BufReader<R> {
     /// use std::fs::File;
     ///
     /// # fn foo() -> std::io::Result<()> {
-    /// let mut f = File::open("log.txt")?;
-    /// let mut reader = BufReader::new(f);
+    /// let f = File::open("log.txt")?;
+    /// let reader = BufReader::new(f);
     /// # Ok(())
     /// # }
    /// ```
@@ -85,8 +85,8 @@ impl<R: Read> BufReader<R> {
     /// use std::fs::File;
     ///
     /// # fn foo() -> std::io::Result<()> {
-    /// let mut f = File::open("log.txt")?;
-    /// let mut reader = BufReader::with_capacity(10, f);
+    /// let f = File::open("log.txt")?;
+    /// let reader = BufReader::with_capacity(10, f);
     /// # Ok(())
     /// # }
     /// ```
@@ -116,8 +116,8 @@ impl<R: Read> BufReader<R> {
     /// use std::fs::File;
     ///
     /// # fn foo() -> std::io::Result<()> {
-    /// let mut f1 = File::open("log.txt")?;
-    /// let mut reader = BufReader::new(f1);
+    /// let f1 = File::open("log.txt")?;
+    /// let reader = BufReader::new(f1);
     ///
     /// let f2 = reader.get_ref();
     /// # Ok(())
@@ -137,7 +137,7 @@ impl<R: Read> BufReader<R> {
     /// use std::fs::File;
     ///
     /// # fn foo() -> std::io::Result<()> {
-    /// let mut f1 = File::open("log.txt")?;
+    /// let f1 = File::open("log.txt")?;
     /// let mut reader = BufReader::new(f1);
     ///
     /// let f2 = reader.get_mut();
@@ -158,8 +158,8 @@ impl<R: Read> BufReader<R> {
     /// use std::fs::File;
     ///
     /// # fn foo() -> std::io::Result<()> {
-    /// let mut f1 = File::open("log.txt")?;
-    /// let mut reader = BufReader::new(f1);
+    /// let f1 = File::open("log.txt")?;
+    /// let reader = BufReader::new(f1);
     ///
     /// let f2 = reader.into_inner();
     /// # Ok(())
@@ -476,7 +476,7 @@ impl<'a> Hash for PrefixComponent<'a> {
 
 /// A single component of a path.
 ///
-/// A `Component` roughtly corresponds to a substring between path separators
+/// A `Component` roughly corresponds to a substring between path separators
 /// (`/` or `\`).
 ///
 /// This `enum` is created by iterating over [`Components`], which in turn is
@@ -571,7 +571,7 @@ impl<'a> AsRef<OsStr> for Component<'a> {
     }
 }
 
-/// An interator over the [`Component`]s of a [`Path`].
+/// An iterator over the [`Component`]s of a [`Path`].
 ///
 /// This `struct` is created by the [`components`] method on [`Path`].
 /// See its documentation for more.
@@ -2019,7 +2019,7 @@ impl Path {
 /// * Repeated separators are ignored, so `a/b` and `a//b` both have
 /// `a` and `b` as components.
 ///
-/// * Occurences of `.` are normalized away, except if they are at the
+/// * Occurrences of `.` are normalized away, except if they are at the
 /// beginning of the path. For example, `a/./b`, `a/b/`, `a/b/.` and
 /// `a/b` all have `a` and `b` as components, but `./a/b` starts with
 /// an additional [`CurDir`] component.
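Those normalization rules are easy to check directly; a small sketch exercising them (not from the commit):

```rust
use std::path::{Component, Path};

fn main() {
    // Repeated separators, a trailing `/`, and interior `.` normalize away.
    let comps: Vec<Component> = Path::new("a/.//b/").components().collect();
    assert_eq!(comps.len(), 2); // Normal("a"), Normal("b")

    // A leading `.` survives as a `CurDir` component.
    let mut comps = Path::new("./a/b").components();
    assert_eq!(comps.next(), Some(Component::CurDir));
}
```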
@@ -799,8 +799,8 @@ impl From<fs::File> for Stdio {
 pub struct ExitStatus(imp::ExitStatus);
 
 impl ExitStatus {
-    /// Was termination successful? Signal termination not considered a success,
-    /// and success is defined as a zero exit status.
+    /// Was termination successful? Signal termination is not considered a
+    /// success, and success is defined as a zero exit status.
     ///
     /// # Examples
     ///
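To illustrate the rewritten sentence: `success()` means exactly "exit status zero", and a signal-terminated process reports no exit code at all. A sketch, assuming a Unix-like system with `ls` on `PATH`:

```rust
use std::process::Command;

fn main() {
    let status = Command::new("ls").status().expect("failed to run ls");
    if status.success() {
        println!("exited with status zero");
    } else {
        // `code()` is `None` when a signal terminated the process (Unix).
        println!("failed, exit code: {:?}", status.code());
    }
}
```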
@@ -190,7 +190,7 @@ pub use self::local::{LocalKey, LocalKeyState, AccessError};
 /// - [`name`]: allows to give a name to the thread which is currently
 /// only used in `panic` messages.
 /// - [`stack_size`]: specifies the desired stack size. Note that this can
-/// be overriden by the OS.
+/// be overridden by the OS.
 ///
 /// If the [`stack_size`] field is not specified, the stack size
 /// will be the `RUST_MIN_STACK` environment variable. If it is
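A short sketch of the two `Builder` knobs mentioned above (the values are arbitrary):

```rust
use std::thread;

fn main() {
    let handle = thread::Builder::new()
        .name("worker".to_string())  // currently surfaces in panic messages
        .stack_size(4 * 1024 * 1024) // a request; the OS may override it
        .spawn(|| println!("hello from {:?}", thread::current().name()))
        .unwrap();
    handle.join().unwrap();
}
```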
@@ -529,7 +529,7 @@ pub fn current() -> Thread {
 /// Thus the pattern of `yield`ing after a failed poll is rather common when
 /// implementing low-level shared resources or synchronization primitives.
 ///
-/// However programmers will usualy prefer to use, [`channel`]s, [`Condvar`]s,
+/// However programmers will usually prefer to use, [`channel`]s, [`Condvar`]s,
 /// [`Mutex`]es or [`join`] for their synchronisation routines, as they avoid
 /// thinking about thread schedulling.
 ///
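The "yield after a failed poll" pattern in miniature; a sketch only, since (as the doc says) channels or a `Condvar` are usually the better tool:

```rust
use std::sync::Arc;
use std::sync::atomic::{AtomicBool, Ordering};
use std::thread;

fn main() {
    let ready = Arc::new(AtomicBool::new(false));
    let flag = ready.clone();
    let t = thread::spawn(move || flag.store(true, Ordering::Release));

    // Poll, handing back the timeslice after each failed attempt instead
    // of spinning at full speed.
    while !ready.load(Ordering::Acquire) {
        thread::yield_now();
    }
    t.join().unwrap();
}
```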
@@ -770,7 +770,7 @@ pub fn park_timeout_ms(ms: u32) {
 /// preemption or platform differences that may not cause the maximum
 /// amount of time waited to be precisely `dur` long.
 ///
-/// See the [park dococumentation][park] for more details.
+/// See the [park documentation][park] for more details.
 ///
 /// # Platform behavior
 ///
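Because of spurious wakeups and early `unpark`s, callers typically re-check elapsed time in a loop; a sketch:

```rust
use std::thread::park_timeout;
use std::time::{Duration, Instant};

fn main() {
    let timeout = Duration::from_millis(100);
    let start = Instant::now();
    loop {
        let elapsed = start.elapsed();
        if elapsed >= timeout {
            break; // waited at least `timeout` in total
        }
        // May return early; the loop makes up the remainder.
        park_timeout(timeout - elapsed);
    }
}
```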
@@ -891,7 +891,7 @@ struct Inner {
 /// The [`thread::current`] function is available even for threads not spawned
 /// by the APIs of this module.
 ///
-/// There is usualy no need to create a `Thread` struct yourself, one
+/// There is usually no need to create a `Thread` struct yourself, one
 /// should instead use a function like `spawn` to create new threads, see the
 /// docs of [`Builder`] and [`spawn`] for more details.
 ///
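A `Thread` handle is obtained rather than constructed; a one-liner sketch:

```rust
use std::thread;

fn main() {
    // `Thread` values come from `spawn` handles or `thread::current()`,
    // never from a constructor.
    let me = thread::current();
    println!("name: {:?}", me.name());
}
```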
15
src/libsyntax/build.rs
Normal file
@@ -0,0 +1,15 @@
+// Copyright 2017 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+fn main() {
+    println!("cargo:rerun-if-changed=build.rs");
+    println!("cargo:rerun-if-env-changed=CFG_RELEASE_CHANNEL");
+    println!("cargo:rerun-if-env-changed=CFG_DISABLE_UNSTABLE_FEATURES");
+}
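The `rerun-if-env-changed` directives matter because such variables are baked in when the crate itself is compiled. A hedged sketch of the consuming side (the exact call site in `libsyntax` is assumed, not shown in this diff):

```rust
// `option_env!` is evaluated at compile time, so without the build-script
// directive a change to the variable would not trigger a rebuild.
fn unstable_features_disabled() -> bool {
    option_env!("CFG_DISABLE_UNSTABLE_FEATURES").is_some()
}

fn main() {
    println!("unstable features disabled: {}", unstable_features_disabled());
}
```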
@@ -312,7 +312,7 @@ impl MultiSpan {
 &self.primary_spans
 }
 
-/// Replaces all occurances of one Span with another. Used to move Spans in areas that don't
+/// Replaces all occurrences of one Span with another. Used to move Spans in areas that don't
 /// display well (like std macros). Returns true if replacements occurred.
 pub fn replace(&mut self, before: Span, after: Span) -> bool {
 let mut replacements_occurred = false;
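A plausible shape for `replace`, inferred from its signature and doc comment; stand-in types are used so the sketch compiles on its own (the real method lives in `syntax_pos` and also rewrites span labels):

```rust
// Stand-ins for the real types, for illustration only.
#[derive(Clone, Copy, PartialEq, Debug)]
struct Span(u32);

struct MultiSpan {
    primary_spans: Vec<Span>,
}

impl MultiSpan {
    /// Hypothetical body matching the documented behavior.
    fn replace(&mut self, before: Span, after: Span) -> bool {
        let mut replacements_occurred = false;
        for sp in &mut self.primary_spans {
            if *sp == before {
                *sp = after;
                replacements_occurred = true;
            }
        }
        replacements_occurred
    }
}

fn main() {
    let mut ms = MultiSpan { primary_spans: vec![Span(1), Span(2), Span(1)] };
    assert!(ms.replace(Span(1), Span(9)));
    assert_eq!(ms.primary_spans, [Span(9), Span(2), Span(9)]);
}
```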
@@ -10,7 +10,7 @@
 
 // compile-flags: -Z parse-only
 
-// Test successful and unsucessful parsing of the `default` contextual keyword
+// Test successful and unsuccessful parsing of the `default` contextual keyword
 
 trait Foo {
 fn foo<T: Default>() -> T;
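For context: `default` is contextual because it acts as a keyword only in impl-item position (specialization) and as an ordinary identifier elsewhere. A nightly-only sketch:

```rust
// Requires a nightly compiler; specialization is unstable.
#![feature(specialization)]

trait Greet {
    fn greet(&self) -> &'static str;
}

impl<T> Greet for T {
    // `default` marks this item as overridable by more specific impls.
    default fn greet(&self) -> &'static str { "hello" }
}

impl Greet for i32 {
    fn greet(&self) -> &'static str { "hello, i32" }
}

fn main() {
    assert_eq!(0u8.greet(), "hello");
    assert_eq!(0i32.greet(), "hello, i32");
}
```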
@@ -16,7 +16,7 @@ pub fn callback<F>(f: F) where F: FnOnce((&'static str, u32)) {
 }
 
 // LLVM does not yet output the required debug info to support showing inlined
-// function calls in backtraces when targetting MSVC, so disable inlining in
+// function calls in backtraces when targeting MSVC, so disable inlining in
 // this case.
 #[cfg_attr(not(target_env = "msvc"), inline(always))]
 #[cfg_attr(target_env = "msvc", inline(never))]
@@ -10,7 +10,7 @@
 
 // We disable tail merging here because it can't preserve debuginfo and thus
 // potentially breaks the backtraces. Also, subtle changes can decide whether
-// tail merging suceeds, so the test might work today but fail tomorrow due to a
+// tail merging succeeds, so the test might work today but fail tomorrow due to a
 // seemingly completely unrelated change.
 // Unfortunately, LLVM has no "disable" option for this, so we have to set
 // "enable" to 0 instead.
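In compiletest headers this is conventionally spelled as an LLVM argument passed through rustc; the exact flag line below is an assumption based on the comment, not shown in this hunk:

```rust
// Hypothetical test header matching the comment above:
// compile-flags: -g -Cllvm-args=-enable-tail-merge=0
```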
@@ -88,7 +88,7 @@ fn inner(counter: &mut i32, main_pos: Pos, outer_pos: Pos) {
 }
 
 // LLVM does not yet output the required debug info to support showing inlined
-// function calls in backtraces when targetting MSVC, so disable inlining in
+// function calls in backtraces when targeting MSVC, so disable inlining in
 // this case.
 #[cfg_attr(not(target_env = "msvc"), inline(always))]
 #[cfg_attr(target_env = "msvc", inline(never))]
@@ -247,6 +247,18 @@ pub fn collect_lang_features(base_src_path: &Path) -> Features {
 
 pub fn collect_lib_features(base_src_path: &Path) -> Features {
     let mut lib_features = Features::new();
+
+    // This library feature is defined in the `compiler_builtins` crate, which
+    // has been moved out-of-tree. Now it can no longer be auto-discovered by
+    // `tidy`, because we need to filter out its (submodule) directory. Manually
+    // add it to the set of known library features so we can still generate docs.
+    lib_features.insert("compiler_builtins_lib".to_owned(), Feature {
+        level: Status::Unstable,
+        since: "".to_owned(),
+        has_gate_test: false,
+        tracking_issue: None,
+    });
+
     map_lib_features(base_src_path,
                      &mut |res, _, _| {
                          match res {
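From the fields used in that insertion one can infer roughly what tidy's `Feature` record looks like; the definitions below are assumptions for illustration, not code from this commit:

```rust
// Assumed shapes, reconstructed from the fields used above.
#[derive(Debug, PartialEq)]
enum Status {
    Stable,
    Unstable,
}

#[derive(Debug)]
struct Feature {
    level: Status,               // stability level of the library feature
    since: String,               // stabilization version; empty while unstable
    has_gate_test: bool,         // whether a feature-gate test exists
    tracking_issue: Option<u32>, // GitHub tracking issue, if any
}

fn main() {
    let f = Feature {
        level: Status::Unstable,
        since: String::new(),
        has_gate_test: false,
        tracking_issue: None,
    };
    assert_eq!(f.level, Status::Unstable);
}
```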