Remove global call sets: IRA

For -fipa-ra, IRA already keeps track of which specific registers
are call-clobbered in a region, rather than using global information.
The patch generalises this so that it tracks which ABIs are used
by calls in the region.
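
Concretely, the liveness pass now records, for each allocno, a bitmask of the
ABI identifiers used by the calls it crosses, alongside the existing set of
clobbered registers.  The accumulation looks like this (excerpted from the
ira-lives.c change below):

		    function_abi callee_abi = insn_callee_abi (insn);
		    ...
		    ALLOCNO_CROSSED_CALLS_ABIS (a) |= 1 << callee_abi.id ();
		    ALLOCNO_CROSSED_CALLS_CLOBBERED_REGS (a)
		      |= callee_abi.full_and_partial_reg_clobbers ();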

We can then use the new ABI descriptors to handle partially-clobbered
registers in the same way as fully-clobbered registers, without needing
special code for targetm.hard_regno_call_part_clobbered.  This in turn
makes -fipa-ra work for partially-clobbered registers too.
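
The query side then intersects the union of the crossed ABIs' mode-dependent
clobbers with the recorded clobber set.  That is what the new
ira_need_caller_save_regs helper (added to ira-int.h below) boils down to:

  return call_clobbers_in_region (ALLOCNO_CROSSED_CALLS_ABIS (a),
				  ALLOCNO_CROSSED_CALLS_CLOBBERED_REGS (a),
				  ALLOCNO_MODE (a));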

A side-effect of allowing multiple ABIs is that we no longer have
an obvious set of conflicting registers for the self-described
"fragile hack" in ira-conflicts.c.  This code kicks in for
user-defined variables that aren't live across a call at -O0,
and it tries to avoid allocating a call-clobbered register to them.
Here I've used the set of call-clobbered registers in the current
function's ABI, applied on top of any registers that are clobbered by
called functions.  This is enough to keep gcc.dg/debug/dwarf2/pr5948.c
happy.
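
The -O0 part of the hack in ira_build_conflicts therefore becomes (excerpted
from the ira-conflicts.c change below):

	      if (optimize == 0
		  && REG_USERVAR_P (allocno_reg)
		  && ! reg_is_parm_p (allocno_reg))
		{
		  HARD_REG_SET new_conflict_regs = crtl->abi->full_reg_clobbers ();
		  OBJECT_TOTAL_CONFLICT_HARD_REGS (obj) |= new_conflict_regs;
		  OBJECT_CONFLICT_HARD_REGS (obj) |= new_conflict_regs;
		}

with the registers clobbered by crossed calls added separately for allocnos
that cross calls.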

The handling of GENERIC_STACK_CHECK in do_reload seemed to have
a reversed condition:

      for (int i = 0; i < FIRST_PSEUDO_REGISTER; i++)
	if (df_regs_ever_live_p (i)
	    && !fixed_regs[i]
	    && call_used_or_fixed_reg_p (i))
	  size += UNITS_PER_WORD;

The final part of the condition counts registers that don't need to be
saved in the prologue, but I think the opposite was intended.
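
The patch therefore flips the test so that the loop counts registers that the
current function's ABI does not clobber, i.e. the ones that would need a
prologue save:

      for (int i = 0; i < FIRST_PSEUDO_REGISTER; i++)
	if (df_regs_ever_live_p (i)
	    && !fixed_regs[i]
	    && !crtl->abi->clobbers_full_reg_p (i))
	  size += UNITS_PER_WORD;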

2019-09-30  Richard Sandiford  <richard.sandiford@arm.com>

gcc/
	* function-abi.h (call_clobbers_in_region): Declare.
	(call_clobbered_in_region_p): New function.
	* function-abi.cc (call_clobbers_in_region): Likewise.
	* ira-int.h: Include function-abi.h.
	(ira_allocno::crossed_calls_abis): New field.
	(ALLOCNO_CROSSED_CALLS_ABIS): New macro.
	(ira_need_caller_save_regs): New function.
	(ira_need_caller_save_p): Likewise.
	* ira.c (setup_reg_renumber): Use ira_need_caller_save_p instead
	of call_used_or_fixed_regs.
	(do_reload): Use crtl->abi to test whether the current function
	needs to save a register in the prologue.  Count registers that
	need to be saved rather than registers that don't.
	* ira-build.c (create_cap_allocno): Copy ALLOCNO_CROSSED_CALLS_ABIS.
	Remove unnecessary | from ALLOCNO_CROSSED_CALLS_CLOBBERED_REGS.
	(propagate_allocno_info): Merge ALLOCNO_CROSSED_CALLS_ABIS too.
	(propagate_some_info_from_allocno): Likewise.
	(copy_info_to_removed_store_destinations): Likewise.
	(ira_flattening): Say that ALLOCNO_CROSSED_CALLS_ABIS and
	ALLOCNO_CROSSED_CALLS_CLOBBERED_REGS are handled conservatively.
	(ira_build): Use ira_need_caller_save_regs instead of
	call_used_or_fixed_regs.
	* ira-color.c (calculate_saved_nregs): Use crtl->abi to test
	whether the current function would need to save a register
	before using it.
	(calculate_spill_cost): Likewise.
	(allocno_reload_assign): Use ira_need_caller_save_regs and
	ira_need_caller_save_p instead of call_used_or_fixed_regs.
	* ira-conflicts.c (ira_build_conflicts): Use
	ira_need_caller_save_regs rather than call_used_or_fixed_regs
	as the set of call-clobbered registers.  Remove the
	call_used_or_fixed_regs mask from the calculation of
	temp_hard_reg_set and mask its use instead.  Remove special
	handling of partially-clobbered registers.
	* ira-costs.c (ira_tune_allocno_costs): Use ira_need_caller_save_p.
	* ira-lives.c (process_bb_node_lives): Use mode_clobbers to
	calculate the set of conflicting registers for calls that
	can throw.  Record the ABIs of calls in ALLOCNO_CROSSED_CALLS_ABIS.
	Use full_and_partial_reg_clobbers rather than full_reg_clobbers
	for the calculation of ALLOCNO_CROSSED_CALLS_CLOBBERED_REGS.
	Use eh_edge_abi to calculate the set of registers that could
	be clobbered by an EH edge.  Include partially-clobbered as
	well as fully-clobbered registers.

From-SVN: r276325
---
 10 files changed, 178 insertions(+), 80 deletions(-)

--- a/gcc/function-abi.cc
+++ b/gcc/function-abi.cc
@@ -126,6 +126,31 @@ predefined_function_abi::add_full_reg_clobber (unsigned int regno)
     SET_HARD_REG_BIT (m_mode_clobbers[i], regno);
 }
 
+/* Return the set of registers that cannot be used to hold a value of
+   mode MODE across the calls in a region described by ABIS and MASK, where:
+
+   * Bit ID of ABIS is set if the region contains a call with
+     function_abi identifier ID.
+
+   * MASK contains all the registers that are fully or partially
+     clobbered by calls in the region.
+
+   This is not quite as accurate as testing each individual call,
+   but it's a close and conservatively-correct approximation.
+   It's much better for some targets than just using MASK.  */
+
+HARD_REG_SET
+call_clobbers_in_region (unsigned int abis, const_hard_reg_set mask,
+			 machine_mode mode)
+{
+  HARD_REG_SET result;
+  CLEAR_HARD_REG_SET (result);
+  for (unsigned int id = 0; abis; abis >>= 1, ++id)
+    if (abis & 1)
+      result |= function_abis[id].mode_clobbers (mode);
+  return result & mask;
+}
+
 /* Return the predefined ABI used by functions with type TYPE.  */
 
 const predefined_function_abi &

--- a/gcc/function-abi.h
+++ b/gcc/function-abi.h
@@ -265,6 +265,32 @@ extern target_function_abi_info *this_target_function_abi_info;
   (this_target_function_abi_info->x_function_abis[0])
 #define eh_edge_abi default_function_abi
 
+extern HARD_REG_SET call_clobbers_in_region (unsigned int, const_hard_reg_set,
+					     machine_mode mode);
+
+/* Return true if (reg:MODE REGNO) might be clobbered by one of the
+   calls in a region described by ABIS and MASK, where:
+
+   * Bit ID of ABIS is set if the region contains a call with
+     function_abi identifier ID.
+
+   * MASK contains all the registers that are fully or partially
+     clobbered by calls in the region.
+
+   This is not quite as accurate as testing each individual call,
+   but it's a close and conservatively-correct approximation.
+   It's much better for some targets than:
+
+     overlaps_hard_reg_set_p (MASK, MODE, REGNO).  */
+
+inline bool
+call_clobbered_in_region_p (unsigned int abis, const_hard_reg_set mask,
+			    machine_mode mode, unsigned int regno)
+{
+  HARD_REG_SET clobbers = call_clobbers_in_region (abis, mask, mode);
+  return overlaps_hard_reg_set_p (clobbers, mode, regno);
+}
+
 extern const predefined_function_abi &fntype_abi (const_tree);
 extern function_abi fndecl_abi (const_tree);
 extern function_abi insn_callee_abi (const rtx_insn *);

--- a/gcc/ira-build.c
+++ b/gcc/ira-build.c
@@ -903,8 +903,9 @@ create_cap_allocno (ira_allocno_t a)
   ALLOCNO_CALLS_CROSSED_NUM (cap) = ALLOCNO_CALLS_CROSSED_NUM (a);
   ALLOCNO_CHEAP_CALLS_CROSSED_NUM (cap) = ALLOCNO_CHEAP_CALLS_CROSSED_NUM (a);
+  ALLOCNO_CROSSED_CALLS_ABIS (cap) = ALLOCNO_CROSSED_CALLS_ABIS (a);
   ALLOCNO_CROSSED_CALLS_CLOBBERED_REGS (cap)
-    |= ALLOCNO_CROSSED_CALLS_CLOBBERED_REGS (a);
+    = ALLOCNO_CROSSED_CALLS_CLOBBERED_REGS (a);
   if (internal_flag_ira_verbose > 2 && ira_dump_file != NULL)
     {
       fprintf (ira_dump_file, "    Creating cap ");
@@ -2032,6 +2033,8 @@ propagate_allocno_info (void)
 	    += ALLOCNO_CALLS_CROSSED_NUM (a);
 	  ALLOCNO_CHEAP_CALLS_CROSSED_NUM (parent_a)
 	    += ALLOCNO_CHEAP_CALLS_CROSSED_NUM (a);
+	  ALLOCNO_CROSSED_CALLS_ABIS (parent_a)
+	    |= ALLOCNO_CROSSED_CALLS_ABIS (a);
 	  ALLOCNO_CROSSED_CALLS_CLOBBERED_REGS (parent_a)
 	    |= ALLOCNO_CROSSED_CALLS_CLOBBERED_REGS (a);
 	  ALLOCNO_EXCESS_PRESSURE_POINTS_NUM (parent_a)
@@ -2415,6 +2418,7 @@ propagate_some_info_from_allocno (ira_allocno_t a, ira_allocno_t from_a)
   ALLOCNO_CALLS_CROSSED_NUM (a) += ALLOCNO_CALLS_CROSSED_NUM (from_a);
   ALLOCNO_CHEAP_CALLS_CROSSED_NUM (a)
     += ALLOCNO_CHEAP_CALLS_CROSSED_NUM (from_a);
+  ALLOCNO_CROSSED_CALLS_ABIS (a) |= ALLOCNO_CROSSED_CALLS_ABIS (from_a);
   ALLOCNO_CROSSED_CALLS_CLOBBERED_REGS (a)
     |= ALLOCNO_CROSSED_CALLS_CLOBBERED_REGS (from_a);
@@ -3056,6 +3060,8 @@ copy_info_to_removed_store_destinations (int regno)
 	+= ALLOCNO_CALLS_CROSSED_NUM (a);
       ALLOCNO_CHEAP_CALLS_CROSSED_NUM (parent_a)
 	+= ALLOCNO_CHEAP_CALLS_CROSSED_NUM (a);
+      ALLOCNO_CROSSED_CALLS_ABIS (parent_a)
+	|= ALLOCNO_CROSSED_CALLS_ABIS (a);
       ALLOCNO_CROSSED_CALLS_CLOBBERED_REGS (parent_a)
 	|= ALLOCNO_CROSSED_CALLS_CLOBBERED_REGS (a);
       ALLOCNO_EXCESS_PRESSURE_POINTS_NUM (parent_a)
@@ -3155,6 +3161,9 @@ ira_flattening (int max_regno_before_emit, int ira_max_point_before_emit)
 	    -= ALLOCNO_CALLS_CROSSED_NUM (a);
 	  ALLOCNO_CHEAP_CALLS_CROSSED_NUM (parent_a)
 	    -= ALLOCNO_CHEAP_CALLS_CROSSED_NUM (a);
+	  /* Assume that ALLOCNO_CROSSED_CALLS_ABIS and
+	     ALLOCNO_CROSSED_CALLS_CLOBBERED_REGS stay the same.
+	     We'd need to rebuild the IR to do better.  */
 	  ALLOCNO_EXCESS_PRESSURE_POINTS_NUM (parent_a)
 	    -= ALLOCNO_EXCESS_PRESSURE_POINTS_NUM (a);
 	  ira_assert (ALLOCNO_CALLS_CROSSED_NUM (parent_a) >= 0
@@ -3462,7 +3471,7 @@ ira_build (void)
 	     allocno crossing calls.  */
 	  FOR_EACH_ALLOCNO (a, ai)
 	    if (ALLOCNO_CALLS_CROSSED_NUM (a) != 0)
-	      ior_hard_reg_conflicts (a, call_used_or_fixed_regs);
+	      ior_hard_reg_conflicts (a, ira_need_caller_save_regs (a));
 	}
       if (internal_flag_ira_verbose > 2 && ira_dump_file != NULL)
 	print_copies (ira_dump_file);

--- a/gcc/ira-color.c
+++ b/gcc/ira-color.c
@@ -1650,7 +1650,7 @@ calculate_saved_nregs (int hard_regno, machine_mode mode)
   ira_assert (hard_regno >= 0);
   for (i = hard_regno_nregs (hard_regno, mode) - 1; i >= 0; i--)
     if (!allocated_hardreg_p[hard_regno + i]
-	&& !TEST_HARD_REG_BIT (call_used_or_fixed_regs, hard_regno + i)
+	&& !crtl->abi->clobbers_full_reg_p (hard_regno + i)
 	&& !LOCAL_REGNO (hard_regno + i))
       nregs++;
   return nregs;
@@ -4379,7 +4379,7 @@ allocno_reload_assign (ira_allocno_t a, HARD_REG_SET forbidden_regs)
       saved[i] = OBJECT_TOTAL_CONFLICT_HARD_REGS (obj);
       OBJECT_TOTAL_CONFLICT_HARD_REGS (obj) |= forbidden_regs;
       if (! flag_caller_saves && ALLOCNO_CALLS_CROSSED_NUM (a) != 0)
-	OBJECT_TOTAL_CONFLICT_HARD_REGS (obj) |= call_used_or_fixed_regs;
+	OBJECT_TOTAL_CONFLICT_HARD_REGS (obj) |= ira_need_caller_save_regs (a);
     }
   ALLOCNO_ASSIGNED_P (a) = false;
   aclass = ALLOCNO_CLASS (a);
@@ -4398,9 +4398,7 @@ allocno_reload_assign (ira_allocno_t a, HARD_REG_SET forbidden_regs)
 		? ALLOCNO_CLASS_COST (a)
 		: ALLOCNO_HARD_REG_COSTS (a)[ira_class_hard_reg_index
 					     [aclass][hard_regno]]));
-      if (ALLOCNO_CALLS_CROSSED_NUM (a) != 0
-	  && ira_hard_reg_set_intersection_p (hard_regno, ALLOCNO_MODE (a),
-					      call_used_or_fixed_regs))
+      if (ira_need_caller_save_p (a, regno))
 	{
 	  ira_assert (flag_caller_saves);
 	  caller_save_needed = 1;
@@ -4687,16 +4685,16 @@ ira_mark_new_stack_slot (rtx x, int regno, poly_uint64 total_size)
    given IN and OUT for INSN.  Return also number points (through
    EXCESS_PRESSURE_LIVE_LENGTH) where the pseudo-register lives and
    the register pressure is high, number of references of the
-   pseudo-registers (through NREFS), number of callee-clobbered
-   hard-registers occupied by the pseudo-registers (through
-   CALL_USED_COUNT), and the first hard regno occupied by the
+   pseudo-registers (through NREFS), the number of psuedo registers
+   whose allocated register wouldn't need saving in the prologue
+   (through CALL_USED_COUNT), and the first hard regno occupied by the
    pseudo-registers (through FIRST_HARD_REGNO).  */
 static int
 calculate_spill_cost (int *regnos, rtx in, rtx out, rtx_insn *insn,
 		      int *excess_pressure_live_length,
 		      int *nrefs, int *call_used_count, int *first_hard_regno)
 {
-  int i, cost, regno, hard_regno, j, count, saved_cost, nregs;
+  int i, cost, regno, hard_regno, count, saved_cost;
   bool in_p, out_p;
   int length;
   ira_allocno_t a;
@@ -4713,11 +4711,8 @@ calculate_spill_cost (int *regnos, rtx in, rtx out, rtx_insn *insn,
       a = ira_regno_allocno_map[regno];
       length += ALLOCNO_EXCESS_PRESSURE_POINTS_NUM (a) / ALLOCNO_NUM_OBJECTS (a);
       cost += ALLOCNO_MEMORY_COST (a) - ALLOCNO_CLASS_COST (a);
-      nregs = hard_regno_nregs (hard_regno, ALLOCNO_MODE (a));
-      for (j = 0; j < nregs; j++)
-	if (! TEST_HARD_REG_BIT (call_used_or_fixed_regs, hard_regno + j))
-	  break;
-      if (j == nregs)
+      if (in_hard_reg_set_p (crtl->abi->full_reg_clobbers (),
+			     ALLOCNO_MODE (a), hard_regno))
 	count++;
       in_p = in && REG_P (in) && (int) REGNO (in) == hard_regno;
       out_p = out && REG_P (out) && (int) REGNO (out) == hard_regno;

--- a/gcc/ira-conflicts.c
+++ b/gcc/ira-conflicts.c
@@ -770,9 +770,7 @@ ira_build_conflicts (void)
       if (! targetm.class_likely_spilled_p (base))
 	CLEAR_HARD_REG_SET (temp_hard_reg_set);
       else
-	temp_hard_reg_set = (reg_class_contents[base]
-			     & ~ira_no_alloc_regs
-			     & call_used_or_fixed_regs);
+	temp_hard_reg_set = reg_class_contents[base] & ~ira_no_alloc_regs;
       FOR_EACH_ALLOCNO (a, ai)
 	{
 	  int i, n = ALLOCNO_NUM_OBJECTS (a);
@@ -780,29 +778,28 @@ ira_build_conflicts (void)
 	  for (i = 0; i < n; i++)
 	    {
 	      ira_object_t obj = ALLOCNO_OBJECT (a, i);
-	      machine_mode obj_mode = obj->allocno->mode;
 	      rtx allocno_reg = regno_reg_rtx [ALLOCNO_REGNO (a)];
 
-	      if ((! flag_caller_saves && ALLOCNO_CALLS_CROSSED_NUM (a) != 0)
-		  /* For debugging purposes don't put user defined variables in
-		     callee-clobbered registers.  However, do allow parameters
-		     in callee-clobbered registers to improve debugging.  This
-		     is a bit of a fragile hack.  */
-		  || (optimize == 0
-		      && REG_USERVAR_P (allocno_reg)
-		      && ! reg_is_parm_p (allocno_reg)))
+	      /* For debugging purposes don't put user defined variables in
+		 callee-clobbered registers.  However, do allow parameters
+		 in callee-clobbered registers to improve debugging.  This
+		 is a bit of a fragile hack.  */
+	      if (optimize == 0
+		  && REG_USERVAR_P (allocno_reg)
+		  && ! reg_is_parm_p (allocno_reg))
 		{
-		  OBJECT_TOTAL_CONFLICT_HARD_REGS (obj) |= call_used_or_fixed_regs;
-		  OBJECT_CONFLICT_HARD_REGS (obj) |= call_used_or_fixed_regs;
+		  HARD_REG_SET new_conflict_regs = crtl->abi->full_reg_clobbers ();
+		  OBJECT_TOTAL_CONFLICT_HARD_REGS (obj) |= new_conflict_regs;
+		  OBJECT_CONFLICT_HARD_REGS (obj) |= new_conflict_regs;
 		}
-	      else if (ALLOCNO_CALLS_CROSSED_NUM (a) != 0)
+
+	      if (ALLOCNO_CALLS_CROSSED_NUM (a) != 0)
 		{
-		  HARD_REG_SET no_caller_save_reg_set
-		    = (call_used_or_fixed_regs & ~savable_regs);
-		  OBJECT_TOTAL_CONFLICT_HARD_REGS (obj) |= no_caller_save_reg_set;
-		  OBJECT_TOTAL_CONFLICT_HARD_REGS (obj) |= temp_hard_reg_set;
-		  OBJECT_CONFLICT_HARD_REGS (obj) |= no_caller_save_reg_set;
-		  OBJECT_CONFLICT_HARD_REGS (obj) |= temp_hard_reg_set;
+		  HARD_REG_SET new_conflict_regs = ira_need_caller_save_regs (a);
+		  if (flag_caller_saves)
+		    new_conflict_regs &= (~savable_regs | temp_hard_reg_set);
+		  OBJECT_TOTAL_CONFLICT_HARD_REGS (obj) |= new_conflict_regs;
+		  OBJECT_CONFLICT_HARD_REGS (obj) |= new_conflict_regs;
 		}
 
 	      /* Now we deal with paradoxical subreg cases where certain registers
@@ -829,23 +826,6 @@ ira_build_conflicts (void)
 		      }
 		    }
 		}
-
-	      if (ALLOCNO_CALLS_CROSSED_NUM (a) != 0)
-		{
-		  int regno;
-
-		  /* Allocnos bigger than the saved part of call saved
-		     regs must conflict with them.  */
-		  for (regno = 0; regno < FIRST_PSEUDO_REGISTER; regno++)
-		    if (!TEST_HARD_REG_BIT (call_used_or_fixed_regs, regno)
-			&& targetm.hard_regno_call_part_clobbered (0, regno,
-								   obj_mode))
-		      {
-			SET_HARD_REG_BIT (OBJECT_CONFLICT_HARD_REGS (obj), regno);
-			SET_HARD_REG_BIT (OBJECT_TOTAL_CONFLICT_HARD_REGS (obj),
-					  regno);
-		      }
-		}
 	    }
 	}
   if (optimize && ira_conflicts_p

--- a/gcc/ira-costs.c
+++ b/gcc/ira-costs.c
@@ -2340,7 +2340,6 @@ ira_tune_allocno_costs (void)
   ira_allocno_object_iterator oi;
   ira_object_t obj;
   bool skip_p;
-  HARD_REG_SET *crossed_calls_clobber_regs;
 
   FOR_EACH_ALLOCNO (a, ai)
     {
@@ -2375,14 +2374,7 @@ ira_tune_allocno_costs (void)
 		    continue;
 		  rclass = REGNO_REG_CLASS (regno);
 		  cost = 0;
-		  crossed_calls_clobber_regs
-		    = &(ALLOCNO_CROSSED_CALLS_CLOBBERED_REGS (a));
-		  if (ira_hard_reg_set_intersection_p (regno, mode,
-						       *crossed_calls_clobber_regs)
-		      && (ira_hard_reg_set_intersection_p (regno, mode,
-							   call_used_or_fixed_regs)
-			  || targetm.hard_regno_call_part_clobbered (0, regno,
-								     mode)))
+		  if (ira_need_caller_save_p (a, regno))
 		    cost += (ALLOCNO_CALL_FREQ (a)
 			     * (ira_memory_move_cost[mode][rclass][0]
 				+ ira_memory_move_cost[mode][rclass][1]));

--- a/gcc/ira-int.h
+++ b/gcc/ira-int.h
@@ -22,6 +22,7 @@ along with GCC; see the file COPYING3.  If not see
 #define GCC_IRA_INT_H
 
 #include "recog.h"
+#include "function-abi.h"
 
 /* To provide consistency in naming, all IRA external variables,
    functions, common typedefs start with prefix ira_.  */
@@ -287,6 +288,9 @@ struct ira_allocno
   /* Register class which should be used for allocation for given
      allocno.  NO_REGS means that we should use memory.  */
   ENUM_BITFIELD (reg_class) aclass : 16;
+  /* A bitmask of the ABIs used by calls that occur while the allocno
+     is live.  */
+  unsigned int crossed_calls_abis : NUM_ABI_IDS;
   /* During the reload, value TRUE means that we should not reassign a
      hard register to the allocno got memory earlier.  It is set up
      when we removed memory-memory move insn before each iteration of
@@ -423,6 +427,7 @@ struct ira_allocno
 #define ALLOCNO_CALL_FREQ(A) ((A)->call_freq)
 #define ALLOCNO_CALLS_CROSSED_NUM(A) ((A)->calls_crossed_num)
 #define ALLOCNO_CHEAP_CALLS_CROSSED_NUM(A) ((A)->cheap_calls_crossed_num)
+#define ALLOCNO_CROSSED_CALLS_ABIS(A) ((A)->crossed_calls_abis)
 #define ALLOCNO_CROSSED_CALLS_CLOBBERED_REGS(A) \
   ((A)->crossed_calls_clobbered_regs)
 #define ALLOCNO_MEM_OPTIMIZED_DEST(A) ((A)->mem_optimized_dest)
@@ -1510,4 +1515,28 @@ ira_allocate_and_set_or_copy_costs (int **vec, enum reg_class aclass,
 extern rtx ira_create_new_reg (rtx);
 extern int first_moveable_pseudo, last_moveable_pseudo;
 
+/* Return the set of registers that would need a caller save if allocno A
+   overlapped them.  */
+
+inline HARD_REG_SET
+ira_need_caller_save_regs (ira_allocno_t a)
+{
+  return call_clobbers_in_region (ALLOCNO_CROSSED_CALLS_ABIS (a),
+				  ALLOCNO_CROSSED_CALLS_CLOBBERED_REGS (a),
+				  ALLOCNO_MODE (a));
+}
+
+/* Return true if we would need to save allocno A around a call if we
+   assigned hard register REGNO.  */
+
+inline bool
+ira_need_caller_save_p (ira_allocno_t a, unsigned int regno)
+{
+  if (ALLOCNO_CALLS_CROSSED_NUM (a) == 0)
+    return false;
+  return call_clobbered_in_region_p (ALLOCNO_CROSSED_CALLS_ABIS (a),
+				     ALLOCNO_CROSSED_CALLS_CLOBBERED_REGS (a),
+				     ALLOCNO_MODE (a), regno);
+}
+
 #endif /* GCC_IRA_INT_H */

--- a/gcc/ira-lives.c
+++ b/gcc/ira-lives.c
@@ -1255,11 +1255,7 @@ process_bb_node_lives (ira_loop_tree_node_t loop_tree_node)
 		    ira_object_t obj = ira_object_id_map[i];
 		    a = OBJECT_ALLOCNO (obj);
 		    int num = ALLOCNO_NUM (a);
-		    HARD_REG_SET this_call_used_reg_set
-		      = insn_callee_abi (insn).full_reg_clobbers ();
-		    /* ??? This preserves traditional behavior; it might not
-		       be needed.  */
-		    this_call_used_reg_set |= fixed_reg_set;
+		    function_abi callee_abi = insn_callee_abi (insn);
 
 		    /* Don't allocate allocnos that cross setjmps or any
 		       call, if this function receives a nonlocal
@@ -1275,9 +1271,9 @@ process_bb_node_lives (ira_loop_tree_node_t loop_tree_node)
 		    if (can_throw_internal (insn))
 		      {
 			OBJECT_CONFLICT_HARD_REGS (obj)
-			  |= this_call_used_reg_set;
+			  |= callee_abi.mode_clobbers (ALLOCNO_MODE (a));
 			OBJECT_TOTAL_CONFLICT_HARD_REGS (obj)
-			  |= this_call_used_reg_set;
+			  |= callee_abi.mode_clobbers (ALLOCNO_MODE (a));
 		      }
 
 		    if (sparseset_bit_p (allocnos_processed, num))
@@ -1294,8 +1290,9 @@ process_bb_node_lives (ira_loop_tree_node_t loop_tree_node)
 		    /* Mark it as saved at the next call.  */
 		    allocno_saved_at_call[num] = last_call_num + 1;
 		    ALLOCNO_CALLS_CROSSED_NUM (a)++;
+		    ALLOCNO_CROSSED_CALLS_ABIS (a) |= 1 << callee_abi.id ();
 		    ALLOCNO_CROSSED_CALLS_CLOBBERED_REGS (a)
-		      |= this_call_used_reg_set;
+		      |= callee_abi.full_and_partial_reg_clobbers ();
 		    if (cheap_reg != NULL_RTX
 			&& ALLOCNO_REGNO (a) == (int) REGNO (cheap_reg))
 		      ALLOCNO_CHEAP_CALLS_CROSSED_NUM (a)++;
@@ -1359,10 +1356,11 @@ process_bb_node_lives (ira_loop_tree_node_t loop_tree_node)
   /* Allocnos can't go in stack regs at the start of a basic block
-     that is reached by an abnormal edge.  Likewise for call
-     clobbered regs, because caller-save, fixup_abnormal_edges and
-     possibly the table driven EH machinery are not quite ready to
-     handle such allocnos live across such edges.  */
+     that is reached by an abnormal edge.  Likewise for registers
+     that are at least partly call clobbered, because caller-save,
+     fixup_abnormal_edges and possibly the table driven EH machinery
+     are not quite ready to handle such allocnos live across such
+     edges.  */
   if (bb_has_abnormal_pred (bb))
     {
 #ifdef STACK_REGS
@@ -1382,7 +1380,7 @@ process_bb_node_lives (ira_loop_tree_node_t loop_tree_node)
 	  if (!cfun->has_nonlocal_label
 	      && has_abnormal_call_or_eh_pred_edge_p (bb))
 	    for (px = 0; px < FIRST_PSEUDO_REGISTER; px++)
-	      if (call_used_or_fixed_reg_p (px)
+	      if (eh_edge_abi.clobbers_at_least_part_of_reg_p (px)
 #ifdef REAL_PIC_OFFSET_TABLE_REGNUM
 		  /* We should create a conflict of PIC pseudo with
 		     PIC hard reg as PIC hard reg can have a wrong

--- a/gcc/ira.c
+++ b/gcc/ira.c
@@ -2368,9 +2368,7 @@ setup_reg_renumber (void)
 	      OBJECT_TOTAL_CONFLICT_HARD_REGS (obj)
 		|= ~reg_class_contents[pclass];
 	    }
-	  if (ALLOCNO_CALLS_CROSSED_NUM (a) != 0
-	      && ira_hard_reg_set_intersection_p (hard_regno, ALLOCNO_MODE (a),
-						  call_used_or_fixed_regs))
+	  if (ira_need_caller_save_p (a, hard_regno))
 	    {
 	      ira_assert (!optimize || flag_caller_saves
 			  || (ALLOCNO_CALLS_CROSSED_NUM (a)
@@ -5591,7 +5589,7 @@ do_reload (void)
       for (int i = 0; i < FIRST_PSEUDO_REGISTER; i++)
 	if (df_regs_ever_live_p (i)
 	    && !fixed_regs[i]
 	    && call_used_or_fixed_reg_p (i))
+	    && !crtl->abi->clobbers_full_reg_p (i))
 	  size += UNITS_PER_WORD;
 
       if (constant_lower_bound (size) > STACK_CHECK_MAX_FRAME_SIZE)