Replace call_used_reg_set with call_used_or_fixed_regs

CALL_USED_REGISTERS and call_used_regs infamously contain all fixed
registers (hence the need for CALL_REALLY_USED_REGISTERS etc.).
We try to recover from this to some extent with:

  /* Contains 1 for registers that are set or clobbered by calls.  */
  /* ??? Ideally, this would be just call_used_regs plus global_regs, but
     for someone's bright idea to have call_used_regs strictly include
     fixed_regs.  Which leaves us guessing as to the set of fixed_regs
     that are actually preserved.  We know for sure that those associated
     with the local stack frame are safe, but scant others.  */
  HARD_REG_SET x_regs_invalidated_by_call;

Since global registers are added to fixed_reg_set and call_used_reg_set
too, it's always the case that:

  call_used_reg_set == regs_invalidated_by_call | fixed_reg_set

This patch replaces all uses of call_used_reg_set with a new macro
call_used_or_fixed_regs to make this clearer.

This is part of a series that allows call_used_regs to be what is
now call_really_used_regs.  It's a purely mechanical replacement;
later patches clean up obvious oddities like
"call_used_or_fixed_regs & ~fixed_regs".

2019-09-10  Richard Sandiford  <richard.sandiford@arm.com>

gcc/
	* hard-reg-set.h (target_hard_regs::x_call_used_reg_set): Delete.
	(call_used_reg_set): Delete.
	(call_used_or_fixed_regs): New macro.
	* reginfo.c (init_reg_sets_1, globalize_reg): Remove initialization
	of call_used_reg_set.
	* caller-save.c (setup_save_areas): Use call_used_or_fixed_regs
	instead of call_used_regs.
	(save_call_clobbered_regs): Likewise.
	* cfgcleanup.c (old_insns_match_p): Likewise.
	* config/c6x/c6x.c (c6x_call_saved_register_used): Likewise.
	* config/epiphany/epiphany.c (epiphany_conditional_register_usage):
	Likewise.
	* config/frv/frv.c (frv_ifcvt_modify_tests): Likewise.
	* config/sh/sh.c (output_stack_adjust): Likewise.
	* final.c (collect_fn_hard_reg_usage): Likewise.
	* ira-build.c (ira_build): Likewise.
	* ira-color.c (calculate_saved_nregs): Likewise.
	(allocno_reload_assign, calculate_spill_cost): Likewise.
	* ira-conflicts.c (ira_build_conflicts): Likewise.
	* ira-costs.c (ira_tune_allocno_costs): Likewise.
	* ira-lives.c (process_bb_node_lives): Likewise.
	* ira.c (setup_reg_renumber): Likewise.
	* lra-assigns.c (find_hard_regno_for_1, lra_assign): Likewise.
	* lra-constraints.c (need_for_call_save_p): Likewise.
	(need_for_split_p, inherit_in_ebb): Likewise.
	* lra-lives.c (process_bb_lives): Likewise.
	* lra-remat.c (call_used_input_regno_present_p): Likewise.
	* postreload.c (reload_combine): Likewise.
	* regrename.c (find_rename_reg): Likewise.
	* reload1.c (reload_as_needed): Likewise.
	* rtlanal.c (find_all_hard_reg_sets): Likewise.
	* sel-sched.c (mark_unavailable_hard_regs): Likewise.
	* shrink-wrap.c (requires_stack_frame_p): Likewise.

From-SVN: r275600
Richard Sandiford 2019-09-10 18:56:37 +00:00 committed by Richard Sandiford
parent 026116ce2a
commit a5647ae846
26 changed files with 81 additions and 53 deletions

gcc/caller-save.c

@@ -426,7 +426,7 @@ setup_save_areas (void)
 freq = REG_FREQ_FROM_BB (BLOCK_FOR_INSN (insn));
 REG_SET_TO_HARD_REG_SET (hard_regs_to_save,
			  &chain->live_throughout);
-get_call_reg_set_usage (insn, &used_regs, call_used_reg_set);
+get_call_reg_set_usage (insn, &used_regs, call_used_or_fixed_regs);
 /* Record all registers set in this call insn.  These don't
    need to be saved.  N.B. the call insn might set a subreg
@@ -509,7 +509,7 @@ setup_save_areas (void)
 REG_SET_TO_HARD_REG_SET (hard_regs_to_save,
			  &chain->live_throughout);
-get_call_reg_set_usage (insn, &used_regs, call_used_reg_set);
+get_call_reg_set_usage (insn, &used_regs, call_used_or_fixed_regs);
 /* Record all registers set in this call insn.  These don't
    need to be saved.  N.B. the call insn might set a subreg
@@ -839,7 +839,7 @@ save_call_clobbered_regs (void)
 | hard_regs_saved);
 hard_regs_to_save &= savable_regs;
 get_call_reg_set_usage (insn, &call_def_reg_set,
-			call_used_reg_set);
+			call_used_or_fixed_regs);
 hard_regs_to_save &= call_def_reg_set;
 for (regno = 0; regno < FIRST_PSEUDO_REGISTER; regno++)
@@ -855,7 +855,8 @@ save_call_clobbered_regs (void)
 if (cheap
     && HARD_REGISTER_P (cheap)
-    && TEST_HARD_REG_BIT (call_used_reg_set, REGNO (cheap)))
+    && TEST_HARD_REG_BIT (call_used_or_fixed_regs,
+			  REGNO (cheap)))
 {
 rtx dest, newpat;
 rtx pat = PATTERN (insn);

gcc/cfgcleanup.c

@@ -1228,8 +1228,8 @@ old_insns_match_p (int mode ATTRIBUTE_UNUSED, rtx_insn *i1, rtx_insn *i2)
 HARD_REG_SET i1_used, i2_used;
-get_call_reg_set_usage (i1, &i1_used, call_used_reg_set);
-get_call_reg_set_usage (i2, &i2_used, call_used_reg_set);
+get_call_reg_set_usage (i1, &i1_used, call_used_or_fixed_regs);
+get_call_reg_set_usage (i2, &i2_used, call_used_or_fixed_regs);
 if (i1_used != i2_used)
 return dir_none;

gcc/config/c6x/c6x.c

@@ -1094,7 +1094,7 @@ c6x_call_saved_register_used (tree call_expr)
 INIT_CUMULATIVE_ARGS (cum_v, NULL, NULL, 0, 0);
 cum = pack_cumulative_args (&cum_v);
-call_saved_regset = ~call_used_reg_set;
+call_saved_regset = ~call_used_or_fixed_regs;
 for (i = 0; i < call_expr_nargs (call_expr); i++)
 {
 parameter = CALL_EXPR_ARG (call_expr, i);

gcc/config/epiphany/epiphany.c

@@ -2242,7 +2242,7 @@ epiphany_conditional_register_usage (void)
 CLEAR_HARD_REG_SET (reg_class_contents[SHORT_INSN_REGS]);
 reg_class_contents[SIBCALL_REGS] = reg_class_contents[GENERAL_REGS];
 /* It would be simpler and quicker if we could just use
-   &~, alas, call_used_reg_set is yet uninitialized;
+   &~, alas, call_used_or_fixed_regs is yet uninitialized;
    it is set up later by our caller.  */
 for (i = 0; i < FIRST_PSEUDO_REGISTER; i++)
 if (!call_used_regs[i])

gcc/config/frv/frv.c

@@ -5201,7 +5201,7 @@ frv_ifcvt_modify_tests (ce_if_block *ce_info, rtx *p_true, rtx *p_false)
 not fixed.  However, allow the ICC/ICR temporary registers to be allocated
 if we did not need to use them in reloading other registers.  */
 memset (&tmp_reg->regs, 0, sizeof (tmp_reg->regs));
-tmp_reg->regs = call_used_reg_set &~ fixed_reg_set;
+tmp_reg->regs = call_used_or_fixed_regs &~ fixed_reg_set;
 SET_HARD_REG_BIT (tmp_reg->regs, ICC_TEMP);
 SET_HARD_REG_BIT (tmp_reg->regs, ICR_TEMP);

gcc/config/sh/sh.c

@@ -6707,7 +6707,7 @@ output_stack_adjust (int size, rtx reg, int epilogue_p,
 temp = -1;
 if (temp < 0 && ! current_function_interrupt && epilogue_p >= 0)
 {
-HARD_REG_SET temps = (call_used_reg_set
+HARD_REG_SET temps = (call_used_or_fixed_regs
		       & ~fixed_reg_set
		       & savable_regs);
 if (epilogue_p > 0)

gcc/final.c

@@ -5007,7 +5007,7 @@ collect_fn_hard_reg_usage (void)
 && !self_recursive_call_p (insn))
 {
 if (!get_call_reg_set_usage (insn, &insn_used_regs,
-			      call_used_reg_set))
+			      call_used_or_fixed_regs))
 return;
 function_used_regs |= insn_used_regs;
@@ -5030,7 +5030,7 @@ collect_fn_hard_reg_usage (void)
 /* The information we have gathered is only interesting if it exposes a
    register from the call_used_regs that is not used in this function.  */
-if (hard_reg_set_subset_p (call_used_reg_set, function_used_regs))
+if (hard_reg_set_subset_p (call_used_or_fixed_regs, function_used_regs))
 return;
 node = cgraph_node::rtl_info (current_function_decl);

gcc/hard-reg-set.h

@@ -397,9 +397,6 @@ struct target_hard_regs {
 char x_call_really_used_regs[FIRST_PSEUDO_REGISTER];
-  /* The same info as a HARD_REG_SET.  */
-  HARD_REG_SET x_call_used_reg_set;
 /* For targets that use reload rather than LRA, this is the set
    of registers that we are able to save and restore around calls
    (i.e. those for which we know a suitable mode and set of
@@ -480,12 +477,12 @@ extern struct target_hard_regs *this_target_hard_regs;
 (this_target_hard_regs->x_call_used_regs)
 #define call_really_used_regs \
 (this_target_hard_regs->x_call_really_used_regs)
-#define call_used_reg_set \
-  (this_target_hard_regs->x_call_used_reg_set)
 #define savable_regs \
 (this_target_hard_regs->x_savable_regs)
 #define regs_invalidated_by_call \
 (this_target_hard_regs->x_regs_invalidated_by_call)
+#define call_used_or_fixed_regs \
+  (regs_invalidated_by_call | fixed_reg_set)
 #define reg_alloc_order \
 (this_target_hard_regs->x_reg_alloc_order)
 #define inv_reg_alloc_order \

gcc/ira-build.c

@@ -3462,7 +3462,7 @@ ira_build (void)
 allocno crossing calls.  */
 FOR_EACH_ALLOCNO (a, ai)
 if (ALLOCNO_CALLS_CROSSED_NUM (a) != 0)
-ior_hard_reg_conflicts (a, call_used_reg_set);
+ior_hard_reg_conflicts (a, call_used_or_fixed_regs);
 }
 if (internal_flag_ira_verbose > 2 && ira_dump_file != NULL)
 print_copies (ira_dump_file);

gcc/ira-color.c

@@ -1650,7 +1650,7 @@ calculate_saved_nregs (int hard_regno, machine_mode mode)
 ira_assert (hard_regno >= 0);
 for (i = hard_regno_nregs (hard_regno, mode) - 1; i >= 0; i--)
 if (!allocated_hardreg_p[hard_regno + i]
-    && !TEST_HARD_REG_BIT (call_used_reg_set, hard_regno + i)
+    && !TEST_HARD_REG_BIT (call_used_or_fixed_regs, hard_regno + i)
     && !LOCAL_REGNO (hard_regno + i))
 nregs++;
 return nregs;
@@ -4379,7 +4379,7 @@ allocno_reload_assign (ira_allocno_t a, HARD_REG_SET forbidden_regs)
 saved[i] = OBJECT_TOTAL_CONFLICT_HARD_REGS (obj);
 OBJECT_TOTAL_CONFLICT_HARD_REGS (obj) |= forbidden_regs;
 if (! flag_caller_saves && ALLOCNO_CALLS_CROSSED_NUM (a) != 0)
-OBJECT_TOTAL_CONFLICT_HARD_REGS (obj) |= call_used_reg_set;
+OBJECT_TOTAL_CONFLICT_HARD_REGS (obj) |= call_used_or_fixed_regs;
 }
 ALLOCNO_ASSIGNED_P (a) = false;
 aclass = ALLOCNO_CLASS (a);
@@ -4400,7 +4400,7 @@ allocno_reload_assign (ira_allocno_t a, HARD_REG_SET forbidden_regs)
 [aclass][hard_regno]]));
 if (ALLOCNO_CALLS_CROSSED_NUM (a) != 0
     && ira_hard_reg_set_intersection_p (hard_regno, ALLOCNO_MODE (a),
-					call_used_reg_set))
+					call_used_or_fixed_regs))
 {
 ira_assert (flag_caller_saves);
 caller_save_needed = 1;
@@ -4715,7 +4715,7 @@ calculate_spill_cost (int *regnos, rtx in, rtx out, rtx_insn *insn,
 cost += ALLOCNO_MEMORY_COST (a) - ALLOCNO_CLASS_COST (a);
 nregs = hard_regno_nregs (hard_regno, ALLOCNO_MODE (a));
 for (j = 0; j < nregs; j++)
-if (! TEST_HARD_REG_BIT (call_used_reg_set, hard_regno + j))
+if (! TEST_HARD_REG_BIT (call_used_or_fixed_regs, hard_regno + j))
 break;
 if (j == nregs)
 count++;

gcc/ira-conflicts.c

@@ -740,7 +740,7 @@ ira_build_conflicts (void)
 else
 temp_hard_reg_set = (reg_class_contents[base]
		      & ~ira_no_alloc_regs
-		      & call_used_reg_set);
+		      & call_used_or_fixed_regs);
 FOR_EACH_ALLOCNO (a, ai)
 {
 int i, n = ALLOCNO_NUM_OBJECTS (a);
@@ -760,13 +760,13 @@ ira_build_conflicts (void)
 && REG_USERVAR_P (allocno_reg)
 && ! reg_is_parm_p (allocno_reg)))
 {
-OBJECT_TOTAL_CONFLICT_HARD_REGS (obj) |= call_used_reg_set;
-OBJECT_CONFLICT_HARD_REGS (obj) |= call_used_reg_set;
+OBJECT_TOTAL_CONFLICT_HARD_REGS (obj) |= call_used_or_fixed_regs;
+OBJECT_CONFLICT_HARD_REGS (obj) |= call_used_or_fixed_regs;
 }
 else if (ALLOCNO_CALLS_CROSSED_NUM (a) != 0)
 {
 HARD_REG_SET no_caller_save_reg_set
-  = (call_used_reg_set & ~savable_regs);
+  = (call_used_or_fixed_regs & ~savable_regs);
 OBJECT_TOTAL_CONFLICT_HARD_REGS (obj) |= no_caller_save_reg_set;
 OBJECT_TOTAL_CONFLICT_HARD_REGS (obj) |= temp_hard_reg_set;
 OBJECT_CONFLICT_HARD_REGS (obj) |= no_caller_save_reg_set;
@@ -805,7 +805,7 @@ ira_build_conflicts (void)
 /* Allocnos bigger than the saved part of call saved
    regs must conflict with them.  */
 for (regno = 0; regno < FIRST_PSEUDO_REGISTER; regno++)
-if (!TEST_HARD_REG_BIT (call_used_reg_set, regno)
+if (!TEST_HARD_REG_BIT (call_used_or_fixed_regs, regno)
     && targetm.hard_regno_call_part_clobbered (NULL, regno,
					        obj_mode))
 {

gcc/ira-costs.c

@@ -2380,7 +2380,7 @@ ira_tune_allocno_costs (void)
 if (ira_hard_reg_set_intersection_p (regno, mode,
				      *crossed_calls_clobber_regs)
     && (ira_hard_reg_set_intersection_p (regno, mode,
-					  call_used_reg_set)
+					  call_used_or_fixed_regs)
	 || targetm.hard_regno_call_part_clobbered (NULL, regno,
						    mode)))
 cost += (ALLOCNO_CALL_FREQ (a)

gcc/ira-lives.c

@@ -1257,7 +1257,7 @@ process_bb_node_lives (ira_loop_tree_node_t loop_tree_node)
 HARD_REG_SET this_call_used_reg_set;
 get_call_reg_set_usage (insn, &this_call_used_reg_set,
-			 call_used_reg_set);
+			 call_used_or_fixed_regs);
 /* Don't allocate allocnos that cross setjmps or any
    call, if this function receives a nonlocal

gcc/ira.c

@@ -2370,7 +2370,7 @@ setup_reg_renumber (void)
 }
 if (ALLOCNO_CALLS_CROSSED_NUM (a) != 0
     && ira_hard_reg_set_intersection_p (hard_regno, ALLOCNO_MODE (a),
-					 call_used_reg_set))
+					 call_used_or_fixed_regs))
 {
 ira_assert (!optimize || flag_caller_saves
	     || (ALLOCNO_CALLS_CROSSED_NUM (a)

gcc/lra-assigns.c

@@ -654,7 +654,7 @@ find_hard_regno_for_1 (int regno, int *cost, int try_only_hard_regno,
 for (j = 0;
      j < hard_regno_nregs (hard_regno, PSEUDO_REGNO_MODE (regno));
      j++)
-if (! TEST_HARD_REG_BIT (call_used_reg_set, hard_regno + j)
+if (! TEST_HARD_REG_BIT (call_used_or_fixed_regs, hard_regno + j)
     && ! df_regs_ever_live_p (hard_regno + j))
 /* It needs save restore.  */
 hard_regno_costs[hard_regno]
@@ -1641,7 +1641,7 @@ lra_assign (bool &fails_p)
 for (i = FIRST_PSEUDO_REGISTER; i < max_regno; i++)
 if (lra_reg_info[i].nrefs != 0 && reg_renumber[i] >= 0
     && lra_reg_info[i].call_insn
-    && overlaps_hard_reg_set_p (call_used_reg_set,
+    && overlaps_hard_reg_set_p (call_used_or_fixed_regs,
				 PSEUDO_REGNO_MODE (i), reg_renumber[i]))
 gcc_unreachable ();
 /* Setup insns to process on the next constraint pass.  */

gcc/lra-constraints.c

@@ -5439,7 +5439,7 @@ need_for_call_save_p (int regno)
 ((flag_ipa_ra &&
   ! hard_reg_set_empty_p (lra_reg_info[regno].actual_call_used_reg_set))
  ? lra_reg_info[regno].actual_call_used_reg_set
- : call_used_reg_set,
+ : call_used_or_fixed_regs,
  PSEUDO_REGNO_MODE (regno), reg_renumber[regno])
 || (targetm.hard_regno_call_part_clobbered
     (lra_reg_info[regno].call_insn,
@@ -5483,7 +5483,7 @@ need_for_split_p (HARD_REG_SET potential_reload_hard_regs, int regno)
 true) the assign pass assumes that all pseudos living
 through calls are assigned to call saved hard regs.  */
 && (regno >= FIRST_PSEUDO_REGISTER
-    || ! TEST_HARD_REG_BIT (call_used_reg_set, regno)
+    || ! TEST_HARD_REG_BIT (call_used_or_fixed_regs, regno)
     || usage_insns[regno].calls_num == calls_num)
 /* We need at least 2 reloads to make pseudo splitting
    profitable.  We should provide hard regno splitting in
@@ -6458,7 +6458,7 @@ inherit_in_ebb (rtx_insn *head, rtx_insn *tail)
 /* If there are pending saves/restores, the
    optimization is not worth.  */
 && usage_insns[regno].calls_num == calls_num - 1
-&& TEST_HARD_REG_BIT (call_used_reg_set, hard_regno))
+&& TEST_HARD_REG_BIT (call_used_or_fixed_regs, hard_regno))
 {
 /* Restore the pseudo from the call result as
    REG_RETURNED note says that the pseudo value is

gcc/lra-lives.c

@@ -928,12 +928,12 @@ process_bb_lives (basic_block bb, int &curr_point, bool dead_insn_p)
 {
 call_insn = curr_insn;
 if (! flag_ipa_ra && ! targetm.return_call_with_max_clobbers)
-last_call_used_reg_set = call_used_reg_set;
+last_call_used_reg_set = call_used_or_fixed_regs;
 else
 {
 HARD_REG_SET this_call_used_reg_set;
 get_call_reg_set_usage (curr_insn, &this_call_used_reg_set,
-			 call_used_reg_set);
+			 call_used_or_fixed_regs);
 bool flush = (! hard_reg_set_empty_p (last_call_used_reg_set)
	       && (last_call_used_reg_set

gcc/lra-remat.c

@@ -69,9 +69,9 @@ along with GCC; see the file COPYING3.  If not see
 /* Number of candidates for rematerialization.  */
 static unsigned int cands_num;
-/* The following is used for representation of call_used_reg_set in
+/* The following is used for representation of call_used_or_fixed_regs in
    form array whose elements are hard register numbers with nonzero bit
-   in CALL_USED_REG_SET.  */
+   in CALL_USED_OR_FIXED_REGS.  */
 static int call_used_regs_arr_len;
 static int call_used_regs_arr[FIRST_PSEUDO_REGISTER];
@@ -710,7 +710,7 @@ call_used_input_regno_present_p (rtx_insn *insn)
 reg != NULL;
 reg = reg->next)
 if (reg->type == OP_IN && reg->regno < FIRST_PSEUDO_REGISTER
-    && TEST_HARD_REG_BIT (call_used_reg_set, reg->regno))
+    && TEST_HARD_REG_BIT (call_used_or_fixed_regs, reg->regno))
 return true;
 return false;
 }

gcc/postreload.c

@@ -1332,7 +1332,7 @@ reload_combine (void)
 rtx link;
 HARD_REG_SET used_regs;
-get_call_reg_set_usage (insn, &used_regs, call_used_reg_set);
+get_call_reg_set_usage (insn, &used_regs, call_used_or_fixed_regs);
 for (r = 0; r < FIRST_PSEUDO_REGISTER; r++)
 if (TEST_HARD_REG_BIT (used_regs, r))

gcc/reginfo.c

@@ -350,7 +350,6 @@ init_reg_sets_1 (void)
 /* Initialize "constant" tables.  */
 CLEAR_HARD_REG_SET (fixed_reg_set);
-CLEAR_HARD_REG_SET (call_used_reg_set);
 CLEAR_HARD_REG_SET (regs_invalidated_by_call);
 operand_reg_set &= accessible_reg_set;
@@ -384,9 +383,6 @@ init_reg_sets_1 (void)
 if (fixed_regs[i])
 SET_HARD_REG_BIT (fixed_reg_set, i);
-if (call_used_regs[i])
-SET_HARD_REG_BIT (call_used_reg_set, i);
 /* There are a couple of fixed registers that we know are safe to
    exclude from being clobbered by calls:
@@ -426,7 +422,6 @@ init_reg_sets_1 (void)
 {
 fixed_regs[i] = call_used_regs[i] = 1;
 SET_HARD_REG_BIT (fixed_reg_set, i);
-SET_HARD_REG_BIT (call_used_reg_set, i);
 }
 }
@@ -779,7 +774,6 @@ globalize_reg (tree decl, int i)
 #endif
 SET_HARD_REG_BIT (fixed_reg_set, i);
-SET_HARD_REG_BIT (call_used_reg_set, i);
 reinit_regs ();
 }

gcc/regrename.c

@@ -367,7 +367,7 @@ find_rename_reg (du_head_p this_head, enum reg_class super_class,
 If the chain needs a call-saved register, mark the call-used
 registers as unavailable.  */
 if (this_head->need_caller_save_reg)
-*unavailable |= call_used_reg_set;
+*unavailable |= call_used_or_fixed_regs;
 /* Mark registers that overlap this chain's lifetime as unavailable.  */
 merge_overlapping_regs (unavailable, this_head);

gcc/reload1.c

@@ -4784,7 +4784,7 @@ reload_as_needed (int live_known)
 be partially clobbered by the call.  */
 else if (CALL_P (insn))
 {
-reg_reloaded_valid &= ~(call_used_reg_set
+reg_reloaded_valid &= ~(call_used_or_fixed_regs
			 | reg_reloaded_call_part_clobbered);
 /* If this is a call to a setjmp-type function, we must not

gcc/rtlanal.c

@@ -1477,7 +1477,7 @@ find_all_hard_reg_sets (const rtx_insn *insn, HARD_REG_SET *pset, bool implicit)
 CLEAR_HARD_REG_SET (*pset);
 note_stores (insn, record_hard_reg_sets, pset);
 if (CALL_P (insn) && implicit)
-*pset |= call_used_reg_set;
+*pset |= call_used_or_fixed_regs;
 for (link = REG_NOTES (insn); link; link = XEXP (link, 1))
 if (REG_NOTE_KIND (link) == REG_INC)
 record_hard_reg_sets (XEXP (link, 0), NULL, pset);

gcc/sel-sched.c

@@ -1224,10 +1224,10 @@ mark_unavailable_hard_regs (def_t def, struct reg_rename *reg_rename_p,
 reg_rename_p->unavailable_hard_regs |= sel_hrd.stack_regs;
 #endif
-/* If there's a call on this path, make regs from call_used_reg_set
+/* If there's a call on this path, make regs from call_used_or_fixed_regs
    unavailable.  */
 if (def->crosses_call)
-reg_rename_p->unavailable_hard_regs |= call_used_reg_set;
+reg_rename_p->unavailable_hard_regs |= call_used_or_fixed_regs;
 /* Stop here before reload: we need FRAME_REGS, STACK_REGS, and crosses_call,
    but not register classes.  */

gcc/shrink-wrap.c

@@ -76,7 +76,7 @@ requires_stack_frame_p (rtx_insn *insn, HARD_REG_SET prologue_used,
 }
 if (hard_reg_set_intersect_p (hardregs, prologue_used))
 return true;
-hardregs &= ~call_used_reg_set;
+hardregs &= ~call_used_or_fixed_regs;
 for (regno = 0; regno < FIRST_PSEUDO_REGISTER; regno++)
 if (TEST_HARD_REG_BIT (hardregs, regno)
     && df_regs_ever_live_p (regno))