std: Stabilize and deprecate APIs for 1.13
This commit is intended to be backported to the 1.13 branch, and works with the
following APIs:
Stabilized
* `i32::checked_abs`
* `i32::wrapping_abs`
* `i32::overflowing_abs`
* `RefCell::try_borrow`
* `RefCell::try_borrow_mut`
* `DefaultHasher`
* `DefaultHasher::new`
* `DefaultHasher::default`
Deprecated
* `BinaryHeap::push_pop`
* `BinaryHeap::replace`
* `SipHash13`
* `SipHash24`
* `SipHasher` - use `DefaultHasher` instead in the `std::collections::hash_map`
module
Closes #28147
Closes #34767
Closes #35057
Closes #35070
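For illustration (not part of the original commit message), a quick tour of the newly stable APIs:

```rust
use std::cell::RefCell;
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

fn main() {
    // `checked_abs` returns None where plain `abs` would overflow.
    assert_eq!((-5i32).checked_abs(), Some(5));
    assert_eq!(i32::min_value().checked_abs(), None);

    // `wrapping_abs` wraps: the absolute value of i32::MIN wraps back to i32::MIN.
    assert_eq!(i32::min_value().wrapping_abs(), i32::min_value());

    // `overflowing_abs` returns the wrapped value plus an overflow flag.
    assert_eq!(i32::min_value().overflowing_abs(), (i32::min_value(), true));

    // `try_borrow`/`try_borrow_mut` return Err instead of panicking
    // when a conflicting borrow is active.
    let cell = RefCell::new(0);
    let shared = cell.borrow();
    assert!(cell.try_borrow().is_ok());      // another shared borrow is fine
    assert!(cell.try_borrow_mut().is_err()); // a mutable borrow is not
    drop(shared);

    // `DefaultHasher` is the replacement for the deprecated `SipHasher`.
    let mut hasher = DefaultHasher::new();
    42u32.hash(&mut hasher);
    let _digest: u64 = hasher.finish();
}
```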
Fix a few errant `Krate` edges
Exploring the effect of small changes on `syntex` reuse, I discovered the following sources of unnecessary edges from `Krate`.
r? @michaelwoerister
incr. comp.: Take spans into account for ICH
This PR makes the ICH (incr. comp. hash) take spans into account when debuginfo is enabled.
A side-effect of this is that the SVH (which is based on the ICHs of all items in the crate) becomes sensitive to the tiniest change in a code base if debuginfo is enabled. Since we are not trying to model ABI compatibility via the SVH anymore (this is done via the crate disambiguator now), this should not be a problem.
Fixes #33888.
Fixes #32753.
Implement synchronization scheme for incr. comp. directory
This PR implements a copy-on-write-based synchronization scheme for the incremental compilation cache directory. For technical details, see the documentation at the beginning of `rustc_incremental/persist/fs.rs`.
The PR contains unit tests for some functions, but for testing whether the scheme properly handles races, a more elaborate test setup would be needed. It would probably involve a small tool that allows manipulating the incremental compilation directory in a controlled way, and then letting a compiler instance run against directories in different states. I don't know if it's worth the trouble of adding another test category to `compiletest`, but I'd be happy to do so.
Fixes #32754
Fixes #34957
Experimentally, this fixes the poor re-use observed in
libsyntex-syntax. I'm not sure how to make a regression test for this,
though, given the non-deterministic nature of it.
This avoids the compile-time overhead of computing them twice. It also fixes
an issue where the hash computed after typeck is different from the hash before,
because typeck mutates the def-map in place.
Fixes #35549.
Fixes #35593.
This massively speeds up serialization. It also
seems to produce deterministic metadata hashes
(before I was seeing inconsistent results).
Fixes #35232.
When we hash the inputs to a MetaData node, we have to hash them in a
consistent order. We achieve this by sorting the stringified `DefPath`
entries. Also, micro-optimize by caching more results across the saving
process.
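A minimal self-contained sketch of the ordering fix (plain `String`s stand in for stringified `DefPath` entries; not the actual rustc internals):

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Sort the stringified paths first, then feed them to the hasher, so
// the resulting hash no longer depends on graph traversal order.
fn hash_metadata_inputs(inputs: &[String]) -> u64 {
    let mut paths: Vec<&String> = inputs.iter().collect();
    paths.sort();
    let mut hasher = DefaultHasher::new();
    for path in paths {
        path.hash(&mut hasher);
    }
    hasher.finish()
}
```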
The biggest problem, actually, is krate numbers being removed entirely,
which can lead to array-index-out-of-bounds errors.
cc #35123 -- not a complete fix, since really we ought to "map" the old
crate numbers to the new ones, not just detect changes.
The way we do HIR inlining introduces reads of the "Hir" into the graph,
but this Hir in fact belongs to other crates, so when we try to load
later, we ICE because the Hir nodes in question don't belong to the
crate (and we haven't done inlining yet). This pass rewrites those HIR
nodes to the metadata from which the inlined HIR was loaded.
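A self-contained sketch of the rewrite (the `DepNode`/`DefId` types here are simplified stand-ins, not rustc's exact ones):

```rust
#[derive(Clone, Copy, PartialEq)]
struct DefId { krate: u32, index: u32 }

const LOCAL_CRATE: u32 = 0;

enum DepNode {
    Hir(DefId),
    MetaData(DefId),
}

// A read of inlined HIR that belongs to another crate is rewritten to
// a read of the metadata it was loaded from.
fn rewrite_read(node: DepNode) -> DepNode {
    match node {
        DepNode::Hir(def_id) if def_id.krate != LOCAL_CRATE => {
            DepNode::MetaData(def_id)
        }
        other => other,
    }
}
```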
Better attribute and metaitem encapsulation throughout the compiler
This PR refactors most (hopefully all?) of the `MetaItem` interactions outside of `libsyntax` (and a few inside) to interact with `MetaItem`s through the provided traits instead of directly creating / destructuring / matching against them. This is a necessary first step to eventually converting `MetaItem`s to internally use `TokenStream` representations (which will make `MetaItem` interactions much nicer for macro writers once the new macro system is in place).
r? @nrc
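A self-contained model of the encapsulation idea (these are not libsyntax's actual types, though `check_name` and `value_str` mirror its accessor style):

```rust
// Callers query a meta item through accessor methods instead of
// matching on its internal representation, so the internals can later
// change (e.g. to a TokenStream-based form) without breaking them.
struct MetaItem {
    name: String,
    value: Option<String>, // e.g. `#[crate_name = "foo"]`
}

impl MetaItem {
    fn check_name(&self, name: &str) -> bool {
        self.name == name
    }
    fn value_str(&self) -> Option<&str> {
        self.value.as_ref().map(|s| s.as_str())
    }
}

fn crate_name(items: &[MetaItem]) -> Option<&str> {
    items.iter()
         .find(|item| item.check_name("crate_name"))
         .and_then(|item| item.value_str())
}
```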
In the older version, a `.o` and a `.bc` file were separate
work-products. This newer version keeps, for each codegen-unit, a set
of files of different kinds. We assume that if any kinds are available
then all the kinds we need are available, since the precise set of
switches will depend on attributes and command-line switches.
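A minimal sketch of the newer shape (illustrative types, not rustc's exact ones):

```rust
// One work-product per codegen unit, holding every emitted file kind
// together. If any kind is present, all needed kinds are assumed present.
#[derive(PartialEq, Eq, Hash)]
enum OutputKind { Object, Bytecode }

struct WorkProduct {
    cgu_name: String,
    saved_files: Vec<(OutputKind, String)>,
}
```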
Should probably test this: the effect of changing attributes in
particular might not be successfully tracked?
This checks the `previous_work_products` data from the dep-graph and
tries to simply copy a `.o` file if possible. We also add new
work-products into the dep-graph, and create edges to/from the dep-node
for a work-product.
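A hedged sketch of the reuse step, assuming `saved_file` comes from a previous work-product recorded in the dep-graph:

```rust
use std::fs;
use std::io;
use std::path::Path;

// Reuse the old object file by copying it out of the incremental
// directory instead of re-translating the codegen unit.
fn copy_previous_object(incr_dir: &Path, saved_file: &str, target: &Path) -> io::Result<u64> {
    fs::copy(incr_dir.join(saved_file), target)
}
```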
We used to use `Name`, but the session outlives the tokenizer, which
means that attempts to read this field after trans has completed would
otherwise panic. All reads want an `InternedString` anyhow.
To handle the general case, we include a vector of def-ids, so that we
can account for things like `(Foo, Bar)` which references both `Foo` and
`Bar`. This means it is not Copy, so re-jigger some APIs to use
borrowing more intelligently.
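A minimal sketch with illustrative types (not rustc's):

```rust
#[derive(Clone, PartialEq, Eq, Hash)]
struct DefId { krate: u32, index: u32 }

// The node owns a Vec of all DefIds the item references, so
// `(Foo, Bar)` records both. The Vec makes the type non-Copy, which is
// why the surrounding APIs now borrow instead.
#[derive(Clone, PartialEq, Eq, Hash)]
struct ItemIds {
    def_ids: Vec<DefId>,
}
```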
Apply visit_path to import prefixes by default
Overriding `visit_path` is not enough to visit all paths; some import prefixes are not visited, and `visit_path_list_item` needs to be overridden as well. This PR removes this catch; it should be less error-prone this way. Also, the prefix is now visited once, not repeatedly for each path list item.
r? @eddyb
This makes the "shadowing labels" warning *not* print the entire loop
as a span, but only the lifetime.
Also makes #31719 go away, but does not fix its root cause (the span
of the expanded loop is still wonky, but not used anymore).
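For illustration, a case that triggers the warning; the span now covers only the inner `'a`, not the whole loop:

```rust
fn main() {
    'a: loop {
        'a: loop { // warning: label name `'a` shadows a label name that is already in scope
            break 'a; // refers to the inner loop
        }
        break;
    }
}
```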
This commit reorganizes how the persist code treats hashing. The idea is
that each crate saves a file containing hashes representing the metadata
for each item X. When we see a read from `MetaData(X)`, we can load this
hash up (if we don't find a file for that crate, we just use the SVH for
the entire crate).
To compute the hash for `MetaData(Y)`, where Y is some local item, we
examine all the predecessors of the `MetaData(Y)` node and hash their
hashes together.
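A minimal sketch of the lookup, with illustrative types:

```rust
use std::collections::HashMap;

// Resolve the hash recorded for a `MetaData(X)` read, falling back to
// the whole crate's SVH when the per-item hash file for that crate is
// missing.
fn metadata_hash(item: u32, saved: Option<&HashMap<u32, u64>>, crate_svh: u64) -> u64 {
    saved.and_then(|hashes| hashes.get(&item).cloned())
         .unwrap_or(crate_svh)
}
```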
For external crates, we must build up a map that goes from
the DefKey to the DefIndex. We do this by iterating over each
index that is found in the metadata and loading the associated
DefKey.
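A self-contained sketch (simplified types, not rustc's):

```rust
use std::collections::HashMap;

#[derive(PartialEq, Eq, Hash, Clone)]
struct DefKey(String);

struct Metadata { keys: Vec<DefKey> }

impl Metadata {
    fn def_key(&self, index: usize) -> DefKey {
        self.keys[index].clone()
    }
}

// Walk every DefIndex present in the external crate's metadata, load
// its DefKey, and record the reverse mapping.
fn build_reverse_map(metadata: &Metadata) -> HashMap<DefKey, usize> {
    (0..metadata.keys.len())
        .map(|index| (metadata.def_key(index), index))
        .collect()
}
```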
The compiler creates a directory as part of `-Z incremental`, but that directory may be used concurrently by several compiler instances, so we need to protect ourselves against that.
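One possible guard, sketched with std only (the real scheme is the copy-on-write design documented in `rustc_incremental/persist/fs.rs`):

```rust
use std::fs::OpenOptions;
use std::io;
use std::path::Path;

// Claim a session directory by creating a lock file atomically;
// `create_new` fails with AlreadyExists if another process got there first.
fn try_claim(session_dir: &Path) -> io::Result<()> {
    OpenOptions::new()
        .write(true)
        .create_new(true)
        .open(session_dir.join(".lock"))
        .map(|_| ())
}
```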
This commit rewrites all of the tidy checks we have, namely:
* featureck
* errorck
* tidy
* binaries
into Rust under a new `tidy` tool inside of the `src/tools` directory. This at
the same time deletes all the corresponding Python tidy checks so we can be sure
to only have one source of truth for all the tidy checks.
cc #31590