Remove clobber_high

The AArch64 SVE tlsdesc patterns were the main motivating reason
for clobber_high.  It's no longer needed now that the patterns use
calls instead.

At the time, one of the possible future uses for clobber_high was for
asm statements.  However, the current code wouldn't handle that case
without modification, so I think we might as well remove it for now.
We can always reapply it in future if it turns out to be useful again.
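
For reference, clobber_high appeared as one element of an insn's PARALLEL
and named a fixed register whose low bits (given by the expression's mode)
were preserved while everything above was clobbered.  A rough sketch in
aarch64.md style; the register choice, mode, operand and unspec here are
purely illustrative and not the exact SVE tlsdesc pattern:

  (parallel
    [(set (reg:DI R0_REGNUM)
          (unspec:DI [(match_operand 0 "" "")] UNSPEC_TLSDESC))
     (clobber (reg:DI LR_REGNUM))
     ;; The low 128 bits (TImode) of V0 survive; the upper SVE bits do not.
     (clobber_high (reg:TI V0_REGNUM))])

With the tlsdesc patterns now represented as calls, the same information
comes from the target's call-clobber handling instead, so a separate rtx
code is unnecessary.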

2019-10-01  Richard Sandiford  <richard.sandiford@arm.com>

gcc/
	* rtl.def (CLOBBER_HIGH): Delete.
	* doc/rtl.texi (clobber_high): Remove documentation.
	* rtl.h (SET_DEST): Remove CLOBBER_HIGH from the list of codes.
	(reg_is_clobbered_by_clobber_high): Delete.
	(gen_hard_reg_clobber_high): Likewise.
	* alias.c (record_set): Remove CLOBBER_HIGH handling.
	* cfgexpand.c (expand_gimple_stmt): Likewise.
	* combine-stack-adj.c (single_set_for_csa): Likewise.
	* combine.c (find_single_use_1, set_nonzero_bits_and_sign_copies)
	(can_combine_p, is_parallel_of_n_reg_sets, try_combine)
	(record_dead_and_set_regs_1, reg_dead_at_p_1): Likewise.
	* cse.c (invalidate_reg): Remove clobber_high parameter.
	(invalidate): Update call accordingly.
	(canonicalize_insn): Remove CLOBBER_HIGH handling.
	(invalidate_from_clobbers, invalidate_from_sets_and_clobbers)
	(count_reg_usage, insn_live_p): Likewise.
	* cselib.h (cselib_invalidate_rtx): Remove sett argument.
	* cselib.c (cselib_invalidate_regno, cselib_invalidate_rtx): Likewise.
	(cselib_invalidate_rtx_note_stores): Update call accordingly.
	(cselib_expand_value_rtx_1): Remove CLOBBER_HIGH handling.
	(cselib_invalidate_regno, cselib_process_insn): Likewise.
	* dce.c (deletable_insn_p, mark_nonreg_stores_1): Likewise.
	(mark_nonreg_stores_2): Likewise.
	* df-scan.c (df_find_hard_reg_defs, df_uses_record): Likewise.
	(df_get_call_refs): Likewise.
	* dwarf2out.c (mem_loc_descriptor): Likewise.
	* emit-rtl.c (verify_rtx_sharing): Likewise.
	(copy_insn_1, copy_rtx_if_shared_1): Likewise.
	(hard_reg_clobbers_high, gen_hard_reg_clobber_high): Delete.
	* genconfig.c (walk_insn_part): Remove CLOBBER_HIGH handling.
	* genemit.c (gen_exp, gen_insn): Likewise.
	* genrecog.c (validate_pattern, remove_clobbers): Likewise.
	* haifa-sched.c (haifa_classify_rtx): Likewise.
	* ira-build.c (create_insn_allocnos): Likewise.
	* ira-costs.c (scan_one_insn): Likewise.
	* ira.c (equiv_init_movable_p, memref_referenced_p): Likewise.
	(rtx_moveable_p, interesting_dest_for_shprep): Likewise.
	* jump.c (mark_jump_label_1): Likewise.
	* lra-int.h (lra_insn_reg::clobber_high): Delete.
	* lra-eliminations.c (lra_eliminate_regs_1): Remove CLOBBER_HIGH
	handling.
	(mark_not_eliminable): Likewise.
	* lra-lives.c (process_bb_lives): Likewise.
	* lra.c (new_insn_reg): Remove clobber_high parameter.
	(collect_non_operand_hard_regs): Likewise.  Update call to new
	insn_reg.  Remove CLOBBER_HIGH handling.
	(lra_set_insn_recog_data): Remove CLOBBER_HIGH handling.  Update call
	to collect_non_operand_hard_regs.
	(add_regs_to_insn_regno_info): Remove CLOBBER_HIGH handling.
	Update call to new_insn_reg.
	(lra_update_insn_regno_info): Remove CLOBBER_HIGH handling.
	* postreload.c (reload_cse_simplify, reload_combine_note_use)
	(move2add_note_store): Likewise.
	* print-rtl.c (print_pattern): Likewise.
	* recog.c (store_data_bypass_p_1, store_data_bypass_p): Likewise.
	(if_test_bypass_p): Likewise.
	* regcprop.c (kill_clobbered_value, kill_set_value): Likewise.
	* reginfo.c (reg_scan_mark_refs): Likewise.
	* reload1.c (maybe_fix_stack_asms, eliminate_regs_1): Likewise.
	(elimination_effects, mark_not_eliminable, scan_paradoxical_subregs)
	(forget_old_reloads_1): Likewise.
	* reorg.c (find_end_label, try_merge_delay_insns, redundant_insn)
	(own_thread_p, fill_simple_delay_slots, fill_slots_from_thread)
	(dbr_schedule): Likewise.
	* resource.c (update_live_status, mark_referenced_resources)
	(mark_set_resources): Likewise.
	* rtl.c (copy_rtx): Likewise.
	* rtlanal.c (reg_referenced_p, set_of_1, single_set_2, noop_move_p)
	(note_pattern_stores): Likewise.
	(reg_is_clobbered_by_clobber_high): Delete.
	* sched-deps.c (sched_analyze_reg, sched_analyze_insn): Remove
	CLOBBER_HIGH handling.

From-SVN: r276393

gcc/alias.c

@ -1556,16 +1556,6 @@ record_set (rtx dest, const_rtx set, void *data ATTRIBUTE_UNUSED)
new_reg_base_value[regno] = 0;
return;
}
/* A CLOBBER_HIGH only wipes out the old value if the mode of the old
value is greater than that of the clobber. */
else if (GET_CODE (set) == CLOBBER_HIGH)
{
if (new_reg_base_value[regno] != 0
&& reg_is_clobbered_by_clobber_high (
regno, GET_MODE (new_reg_base_value[regno]), XEXP (set, 0)))
new_reg_base_value[regno] = 0;
return;
}
src = SET_SRC (set);
}

gcc/cfgexpand.c

@ -3891,7 +3891,6 @@ expand_gimple_stmt (gimple *stmt)
/* If we want exceptions for non-call insns, any
may_trap_p instruction may throw. */
&& GET_CODE (PATTERN (insn)) != CLOBBER
&& GET_CODE (PATTERN (insn)) != CLOBBER_HIGH
&& GET_CODE (PATTERN (insn)) != USE
&& insn_could_throw_p (insn))
make_reg_eh_region_note (insn, 0, lp_nr);

gcc/combine-stack-adj.c

@ -133,7 +133,6 @@ single_set_for_csa (rtx_insn *insn)
&& SET_SRC (this_rtx) == SET_DEST (this_rtx))
;
else if (GET_CODE (this_rtx) != CLOBBER
&& GET_CODE (this_rtx) != CLOBBER_HIGH
&& GET_CODE (this_rtx) != USE)
return NULL_RTX;
}

gcc/combine.c

@ -572,7 +572,6 @@ find_single_use_1 (rtx dest, rtx *loc)
case SYMBOL_REF:
CASE_CONST_ANY:
case CLOBBER:
case CLOBBER_HIGH:
return 0;
case SET:
@ -1763,9 +1762,6 @@ set_nonzero_bits_and_sign_copies (rtx x, const_rtx set, void *data)
return;
}
/* Should not happen as we only using pseduo registers. */
gcc_assert (GET_CODE (set) != CLOBBER_HIGH);
/* If this register is being initialized using itself, and the
register is uninitialized in this basic block, and there are
no LOG_LINKS which set the register, then part of the
@ -1924,7 +1920,6 @@ can_combine_p (rtx_insn *insn, rtx_insn *i3, rtx_insn *pred ATTRIBUTE_UNUSED,
/* We can ignore CLOBBERs. */
case CLOBBER:
case CLOBBER_HIGH:
break;
case SET:
@ -2595,8 +2590,6 @@ is_parallel_of_n_reg_sets (rtx pat, int n)
if (XEXP (XVECEXP (pat, 0, i), 0) == const0_rtx)
return false;
break;
case CLOBBER_HIGH:
break;
default:
return false;
}
@ -2897,8 +2890,7 @@ try_combine (rtx_insn *i3, rtx_insn *i2, rtx_insn *i1, rtx_insn *i0,
for (i = 0; ok && i < XVECLEN (p2, 0); i++)
{
if ((GET_CODE (XVECEXP (p2, 0, i)) == SET
|| GET_CODE (XVECEXP (p2, 0, i)) == CLOBBER
|| GET_CODE (XVECEXP (p2, 0, i)) == CLOBBER_HIGH)
|| GET_CODE (XVECEXP (p2, 0, i)) == CLOBBER)
&& reg_overlap_mentioned_p (SET_DEST (PATTERN (i3)),
SET_DEST (XVECEXP (p2, 0, i))))
ok = false;
@ -13409,15 +13401,6 @@ record_dead_and_set_regs_1 (rtx dest, const_rtx setter, void *data)
? SET_SRC (setter)
: gen_lowpart (GET_MODE (dest),
SET_SRC (setter)));
else if (GET_CODE (setter) == CLOBBER_HIGH)
{
reg_stat_type *rsp = &reg_stat[REGNO (dest)];
if (rsp->last_set_value
&& reg_is_clobbered_by_clobber_high
(REGNO (dest), GET_MODE (rsp->last_set_value),
XEXP (setter, 0)))
record_value_for_reg (dest, NULL, NULL_RTX);
}
else
record_value_for_reg (dest, record_dead_insn, NULL_RTX);
}
@ -13863,10 +13846,6 @@ reg_dead_at_p_1 (rtx dest, const_rtx x, void *data ATTRIBUTE_UNUSED)
if (!REG_P (dest))
return;
if (GET_CODE (x) == CLOBBER_HIGH
&& !reg_is_clobbered_by_clobber_high (reg_dead_reg, XEXP (x, 0)))
return;
regno = REGNO (dest);
endregno = END_REGNO (dest);
if (reg_dead_endregno > regno && reg_dead_regno < endregno)

gcc/cse.c

@ -561,7 +561,6 @@ static struct table_elt *insert_with_costs (rtx, struct table_elt *, unsigned,
static struct table_elt *insert (rtx, struct table_elt *, unsigned,
machine_mode);
static void merge_equiv_classes (struct table_elt *, struct table_elt *);
static void invalidate_reg (rtx, bool);
static void invalidate (rtx, machine_mode);
static void remove_invalid_refs (unsigned int);
static void remove_invalid_subreg_refs (unsigned int, poly_uint64,
@ -1822,12 +1821,10 @@ check_dependence (const_rtx x, rtx exp, machine_mode mode, rtx addr)
}
/* Remove from the hash table, or mark as invalid, all expressions whose
values could be altered by storing in register X.
CLOBBER_HIGH is set if X was part of a CLOBBER_HIGH expression. */
values could be altered by storing in register X. */
static void
invalidate_reg (rtx x, bool clobber_high)
invalidate_reg (rtx x)
{
gcc_assert (GET_CODE (x) == REG);
@ -1852,10 +1849,7 @@ invalidate_reg (rtx x, bool clobber_high)
SUBREG_TICKED (regno) = -1;
if (regno >= FIRST_PSEUDO_REGISTER)
{
gcc_assert (!clobber_high);
remove_pseudo_from_table (x, hash);
}
remove_pseudo_from_table (x, hash);
else
{
HOST_WIDE_INT in_table = TEST_HARD_REG_BIT (hard_regs_in_table, regno);
@ -1883,18 +1877,10 @@ invalidate_reg (rtx x, bool clobber_high)
if (!REG_P (p->exp) || REGNO (p->exp) >= FIRST_PSEUDO_REGISTER)
continue;
if (clobber_high)
{
if (reg_is_clobbered_by_clobber_high (p->exp, x))
remove_from_table (p, hash);
}
else
{
unsigned int tregno = REGNO (p->exp);
unsigned int tendregno = END_REGNO (p->exp);
if (tendregno > regno && tregno < endregno)
remove_from_table (p, hash);
}
unsigned int tregno = REGNO (p->exp);
unsigned int tendregno = END_REGNO (p->exp);
if (tendregno > regno && tregno < endregno)
remove_from_table (p, hash);
}
}
}
@ -1921,7 +1907,7 @@ invalidate (rtx x, machine_mode full_mode)
switch (GET_CODE (x))
{
case REG:
invalidate_reg (x, false);
invalidate_reg (x);
return;
case SUBREG:
@ -4425,8 +4411,6 @@ canonicalize_insn (rtx_insn *insn, struct set **psets, int n_sets)
if (MEM_P (XEXP (x, 0)))
canon_reg (XEXP (x, 0), insn);
}
else if (GET_CODE (x) == CLOBBER_HIGH)
gcc_assert (REG_P (XEXP (x, 0)));
else if (GET_CODE (x) == USE
&& ! (REG_P (XEXP (x, 0))
&& REGNO (XEXP (x, 0)) < FIRST_PSEUDO_REGISTER))
@ -4458,8 +4442,6 @@ canonicalize_insn (rtx_insn *insn, struct set **psets, int n_sets)
if (MEM_P (XEXP (y, 0)))
canon_reg (XEXP (y, 0), insn);
}
else if (GET_CODE (y) == CLOBBER_HIGH)
gcc_assert (REG_P (XEXP (y, 0)));
else if (GET_CODE (y) == USE
&& ! (REG_P (XEXP (y, 0))
&& REGNO (XEXP (y, 0)) < FIRST_PSEUDO_REGISTER))
@ -6149,12 +6131,6 @@ invalidate_from_clobbers (rtx_insn *insn)
invalidate (XEXP (ref, 0), GET_MODE (ref));
}
}
if (GET_CODE (x) == CLOBBER_HIGH)
{
rtx ref = XEXP (x, 0);
gcc_assert (REG_P (ref));
invalidate_reg (ref, true);
}
else if (GET_CODE (x) == PARALLEL)
{
int i;
@ -6171,12 +6147,6 @@ invalidate_from_clobbers (rtx_insn *insn)
|| GET_CODE (ref) == ZERO_EXTRACT)
invalidate (XEXP (ref, 0), GET_MODE (ref));
}
else if (GET_CODE (y) == CLOBBER_HIGH)
{
rtx ref = XEXP (y, 0);
gcc_assert (REG_P (ref));
invalidate_reg (ref, true);
}
}
}
}
@ -6198,12 +6168,6 @@ invalidate_from_sets_and_clobbers (rtx_insn *insn)
rtx temx = XEXP (tem, 0);
if (GET_CODE (temx) == CLOBBER)
invalidate (SET_DEST (temx), VOIDmode);
else if (GET_CODE (temx) == CLOBBER_HIGH)
{
rtx temref = XEXP (temx, 0);
gcc_assert (REG_P (temref));
invalidate_reg (temref, true);
}
}
}
@ -6231,12 +6195,6 @@ invalidate_from_sets_and_clobbers (rtx_insn *insn)
|| GET_CODE (clobbered) == ZERO_EXTRACT)
invalidate (XEXP (clobbered, 0), GET_MODE (clobbered));
}
else if (GET_CODE (y) == CLOBBER_HIGH)
{
rtx ref = XEXP (y, 0);
gcc_assert (REG_P (ref));
invalidate_reg (ref, true);
}
else if (GET_CODE (y) == SET && GET_CODE (SET_SRC (y)) == CALL)
invalidate (SET_DEST (y), VOIDmode);
}
@ -6896,10 +6854,6 @@ count_reg_usage (rtx x, int *counts, rtx dest, int incr)
count_reg_usage (XEXP (XEXP (x, 0), 0), counts, NULL_RTX, incr);
return;
case CLOBBER_HIGH:
gcc_assert (REG_P ((XEXP (x, 0))));
return;
case SET:
/* Unless we are setting a REG, count everything in SET_DEST. */
if (!REG_P (SET_DEST (x)))
@ -6952,8 +6906,7 @@ count_reg_usage (rtx x, int *counts, rtx dest, int incr)
|| (REG_NOTE_KIND (x) != REG_NONNEG && GET_CODE (XEXP (x,0)) == USE)
/* FUNCTION_USAGE expression lists may include (CLOBBER (mem /u)),
involving registers in the address. */
|| GET_CODE (XEXP (x, 0)) == CLOBBER
|| GET_CODE (XEXP (x, 0)) == CLOBBER_HIGH)
|| GET_CODE (XEXP (x, 0)) == CLOBBER)
count_reg_usage (XEXP (x, 0), counts, NULL_RTX, incr);
count_reg_usage (XEXP (x, 1), counts, NULL_RTX, incr);
@ -7037,9 +6990,7 @@ insn_live_p (rtx_insn *insn, int *counts)
if (set_live_p (elt, insn, counts))
return true;
}
else if (GET_CODE (elt) != CLOBBER
&& GET_CODE (elt) != CLOBBER_HIGH
&& GET_CODE (elt) != USE)
else if (GET_CODE (elt) != CLOBBER && GET_CODE (elt) != USE)
return true;
}
return false;

gcc/cselib.c

@ -55,8 +55,7 @@ static unsigned int cselib_hash_rtx (rtx, int, machine_mode);
static cselib_val *new_cselib_val (unsigned int, machine_mode, rtx);
static void add_mem_for_addr (cselib_val *, cselib_val *, rtx);
static cselib_val *cselib_lookup_mem (rtx, int);
static void cselib_invalidate_regno (unsigned int, machine_mode,
const_rtx = NULL);
static void cselib_invalidate_regno (unsigned int, machine_mode);
static void cselib_invalidate_mem (rtx);
static void cselib_record_set (rtx, cselib_val *, cselib_val *);
static void cselib_record_sets (rtx_insn *);
@ -1663,7 +1662,6 @@ cselib_expand_value_rtx_1 (rtx orig, struct expand_value_data *evd,
/* SCRATCH must be shared because they represent distinct values. */
return orig;
case CLOBBER:
case CLOBBER_HIGH:
if (REG_P (XEXP (orig, 0)) && HARD_REGISTER_NUM_P (REGNO (XEXP (orig, 0))))
return orig;
break;
@ -2166,8 +2164,7 @@ cselib_lookup (rtx x, machine_mode mode,
invalidating call clobbered registers across a call. */
static void
cselib_invalidate_regno (unsigned int regno, machine_mode mode,
const_rtx setter)
cselib_invalidate_regno (unsigned int regno, machine_mode mode)
{
unsigned int endregno;
unsigned int i;
@ -2190,9 +2187,6 @@ cselib_invalidate_regno (unsigned int regno, machine_mode mode,
i = regno - max_value_regs;
endregno = end_hard_regno (mode, regno);
if (setter && GET_CODE (setter) == CLOBBER_HIGH)
gcc_assert (endregno == regno + 1);
}
else
{
@ -2225,19 +2219,6 @@ cselib_invalidate_regno (unsigned int regno, machine_mode mode,
continue;
}
/* Ignore if clobber high and the register isn't clobbered. */
if (setter && GET_CODE (setter) == CLOBBER_HIGH)
{
gcc_assert (endregno == regno + 1);
const_rtx x = XEXP (setter, 0);
if (!reg_is_clobbered_by_clobber_high (i, GET_MODE (v->val_rtx),
x))
{
l = &(*l)->next;
continue;
}
}
/* We have an overlap. */
if (*l == REG_VALUES (i))
{
@ -2372,10 +2353,10 @@ cselib_invalidate_mem (rtx mem_rtx)
*vp = &dummy_val;
}
/* Invalidate DEST, which is being assigned to or clobbered by SETTER. */
/* Invalidate DEST. */
void
cselib_invalidate_rtx (rtx dest, const_rtx setter)
cselib_invalidate_rtx (rtx dest)
{
while (GET_CODE (dest) == SUBREG
|| GET_CODE (dest) == ZERO_EXTRACT
@ -2383,7 +2364,7 @@ cselib_invalidate_rtx (rtx dest, const_rtx setter)
dest = XEXP (dest, 0);
if (REG_P (dest))
cselib_invalidate_regno (REGNO (dest), GET_MODE (dest), setter);
cselib_invalidate_regno (REGNO (dest), GET_MODE (dest));
else if (MEM_P (dest))
cselib_invalidate_mem (dest);
}
@ -2391,10 +2372,10 @@ cselib_invalidate_rtx (rtx dest, const_rtx setter)
/* A wrapper for cselib_invalidate_rtx to be called via note_stores. */
static void
cselib_invalidate_rtx_note_stores (rtx dest, const_rtx setter,
cselib_invalidate_rtx_note_stores (rtx dest, const_rtx,
void *data ATTRIBUTE_UNUSED)
{
cselib_invalidate_rtx (dest, setter);
cselib_invalidate_rtx (dest);
}
/* Record the result of a SET instruction. DEST is being set; the source
@ -2809,11 +2790,9 @@ cselib_process_insn (rtx_insn *insn)
if (CALL_P (insn))
{
for (x = CALL_INSN_FUNCTION_USAGE (insn); x; x = XEXP (x, 1))
{
gcc_assert (GET_CODE (XEXP (x, 0)) != CLOBBER_HIGH);
if (GET_CODE (XEXP (x, 0)) == CLOBBER)
cselib_invalidate_rtx (XEXP (XEXP (x, 0), 0));
}
if (GET_CODE (XEXP (x, 0)) == CLOBBER)
cselib_invalidate_rtx (XEXP (XEXP (x, 0), 0));
/* Flush everything on setjmp. */
if (cselib_preserve_constants
&& find_reg_note (insn, REG_SETJMP, NULL))

gcc/cselib.h

@ -92,7 +92,7 @@ extern bool cselib_dummy_expand_value_rtx_cb (rtx, bitmap, int,
cselib_expand_callback, void *);
extern rtx cselib_subst_to_values (rtx, machine_mode);
extern rtx cselib_subst_to_values_from_insn (rtx, machine_mode, rtx_insn *);
extern void cselib_invalidate_rtx (rtx, const_rtx = NULL);
extern void cselib_invalidate_rtx (rtx);
extern void cselib_reset_table (unsigned int);
extern unsigned int cselib_get_next_uid (void);

gcc/dce.c

@ -174,7 +174,6 @@ deletable_insn_p (rtx_insn *insn, bool fast, bitmap arg_stores)
return false;
case CLOBBER:
case CLOBBER_HIGH:
if (fast)
{
/* A CLOBBER of a dead pseudo register serves no purpose.
@ -244,10 +243,7 @@ static void
mark_nonreg_stores_1 (rtx dest, const_rtx pattern, void *data)
{
if (GET_CODE (pattern) != CLOBBER && !REG_P (dest))
{
gcc_checking_assert (GET_CODE (pattern) != CLOBBER_HIGH);
mark_insn ((rtx_insn *) data, true);
}
mark_insn ((rtx_insn *) data, true);
}
@ -258,10 +254,7 @@ static void
mark_nonreg_stores_2 (rtx dest, const_rtx pattern, void *data)
{
if (GET_CODE (pattern) != CLOBBER && !REG_P (dest))
{
gcc_checking_assert (GET_CODE (pattern) != CLOBBER_HIGH);
mark_insn ((rtx_insn *) data, false);
}
mark_insn ((rtx_insn *) data, false);
}

gcc/df-scan.c

@ -2775,7 +2775,6 @@ df_find_hard_reg_defs (rtx x, HARD_REG_SET *defs)
break;
case CLOBBER:
case CLOBBER_HIGH:
df_find_hard_reg_defs_1 (XEXP (x, 0), defs);
break;
@ -2835,10 +2834,6 @@ df_uses_record (class df_collection_rec *collection_rec,
/* If we're clobbering a REG then we have a def so ignore. */
return;
case CLOBBER_HIGH:
gcc_assert (REG_P (XEXP (x, 0)));
return;
case MEM:
df_uses_record (collection_rec,
&XEXP (x, 0), DF_REF_REG_MEM_LOAD,
@ -3133,7 +3128,6 @@ df_get_call_refs (class df_collection_rec *collection_rec,
for (note = CALL_INSN_FUNCTION_USAGE (insn_info->insn); note;
note = XEXP (note, 1))
{
gcc_assert (GET_CODE (XEXP (note, 0)) != CLOBBER_HIGH);
if (GET_CODE (XEXP (note, 0)) == USE)
df_uses_record (collection_rec, &XEXP (XEXP (note, 0), 0),
DF_REF_REG_USE, bb, insn_info, flags);

gcc/doc/rtl.texi

@ -3295,18 +3295,6 @@ There is one other known use for clobbering a pseudo register in a
clobbered by the insn. In this case, using the same pseudo register in
the clobber and elsewhere in the insn produces the expected results.
@findex clobber_high
@item (clobber_high @var{x})
Represents the storing or possible storing of an unpredictable,
undescribed value into the upper parts of @var{x}. The mode of the expression
represents the lower parts of the register which will not be overwritten.
@code{reg} must be a reg expression.
One place this is used is when calling into functions where the registers are
preserved, but only up to a given number of bits. For example when using
Aarch64 SVE, calling a TLS descriptor will cause only the lower 128 bits of
each of the vector registers to be preserved.
@findex use
@item (use @var{x})
Represents the use of the value of @var{x}. It indicates that the
@ -3360,8 +3348,7 @@ Represents several side effects performed in parallel. The square
brackets stand for a vector; the operand of @code{parallel} is a
vector of expressions. @var{x0}, @var{x1} and so on are individual
side effect expressions---expressions of code @code{set}, @code{call},
@code{return}, @code{simple_return}, @code{clobber} @code{use} or
@code{clobber_high}.
@code{return}, @code{simple_return}, @code{clobber} or @code{use}.
``In parallel'' means that first all the values used in the individual
side-effects are computed, and second all the actual side-effects are

gcc/dwarf2out.c

@ -16430,7 +16430,6 @@ mem_loc_descriptor (rtx rtl, machine_mode mode,
case CONST_FIXED:
case CLRSB:
case CLOBBER:
case CLOBBER_HIGH:
break;
case CONST_STRING:

gcc/emit-rtl.c

@ -2898,7 +2898,6 @@ verify_rtx_sharing (rtx orig, rtx insn)
/* SCRATCH must be shared because they represent distinct values. */
return;
case CLOBBER:
case CLOBBER_HIGH:
/* Share clobbers of hard registers (like cc0), but do not share pseudo reg
clobbers or clobbers of hard registers that originated as pseudos.
This is needed to allow safe register renaming. */
@ -3152,7 +3151,6 @@ repeat:
/* SCRATCH must be shared because they represent distinct values. */
return;
case CLOBBER:
case CLOBBER_HIGH:
/* Share clobbers of hard registers (like cc0), but do not share pseudo reg
clobbers or clobbers of hard registers that originated as pseudos.
This is needed to allow safe register renaming. */
@ -5726,7 +5724,6 @@ copy_insn_1 (rtx orig)
case SIMPLE_RETURN:
return orig;
case CLOBBER:
case CLOBBER_HIGH:
/* Share clobbers of hard registers (like cc0), but do not share pseudo reg
clobbers or clobbers of hard registers that originated as pseudos.
This is needed to allow safe register renaming. */
@ -6538,21 +6535,6 @@ gen_hard_reg_clobber (machine_mode mode, unsigned int regno)
gen_rtx_CLOBBER (VOIDmode, gen_rtx_REG (mode, regno)));
}
static GTY((deletable)) rtx
hard_reg_clobbers_high[NUM_MACHINE_MODES][FIRST_PSEUDO_REGISTER];
/* Return a CLOBBER_HIGH expression for register REGNO that clobbers MODE,
caching into HARD_REG_CLOBBERS_HIGH. */
rtx
gen_hard_reg_clobber_high (machine_mode mode, unsigned int regno)
{
if (hard_reg_clobbers_high[mode][regno])
return hard_reg_clobbers_high[mode][regno];
else
return (hard_reg_clobbers_high[mode][regno]
= gen_rtx_CLOBBER_HIGH (VOIDmode, gen_rtx_REG (mode, regno)));
}
location_t prologue_location;
location_t epilogue_location;

gcc/genconfig.c

@ -72,7 +72,6 @@ walk_insn_part (rtx part, int recog_p, int non_pc_set_src)
switch (code)
{
case CLOBBER:
case CLOBBER_HIGH:
clobbers_seen_this_insn++;
break;

gcc/genemit.c

@ -169,15 +169,6 @@ gen_exp (rtx x, enum rtx_code subroutine_type, char *used, md_rtx_info *info)
return;
}
break;
case CLOBBER_HIGH:
if (!REG_P (XEXP (x, 0)))
error ("CLOBBER_HIGH argument is not a register expr, at %s:%d",
info->loc.filename, info->loc.lineno);
printf ("gen_hard_reg_clobber_high (%smode, %i)",
GET_MODE_NAME (GET_MODE (XEXP (x, 0))),
REGNO (XEXP (x, 0)));
return;
break;
case CC0:
printf ("cc0_rtx");
return;
@ -343,8 +334,7 @@ gen_insn (md_rtx_info *info)
for (i = XVECLEN (insn, 1) - 1; i > 0; i--)
{
if (GET_CODE (XVECEXP (insn, 1, i)) != CLOBBER
&& GET_CODE (XVECEXP (insn, 1, i)) != CLOBBER_HIGH)
if (GET_CODE (XVECEXP (insn, 1, i)) != CLOBBER)
break;
if (REG_P (XEXP (XVECEXP (insn, 1, i), 0)))

gcc/genrecog.c

@ -718,7 +718,6 @@ validate_pattern (rtx pattern, md_rtx_info *info, rtx set, int set_code)
}
case CLOBBER:
case CLOBBER_HIGH:
validate_pattern (SET_DEST (pattern), info, pattern, '=');
return;
@ -5313,7 +5312,7 @@ remove_clobbers (acceptance_type *acceptance_ptr, rtx *pattern_ptr)
for (i = XVECLEN (pattern, 0); i > 0; i--)
{
rtx x = XVECEXP (pattern, 0, i - 1);
if ((GET_CODE (x) != CLOBBER && GET_CODE (x) != CLOBBER_HIGH)
if (GET_CODE (x) != CLOBBER
|| (!REG_P (XEXP (x, 0))
&& GET_CODE (XEXP (x, 0)) != MATCH_SCRATCH))
break;

gcc/haifa-sched.c

@ -530,9 +530,6 @@ haifa_classify_rtx (const_rtx x)
/* Test if it is a 'store'. */
tmp_class = may_trap_exp (XEXP (x, 0), 1);
break;
case CLOBBER_HIGH:
gcc_assert (REG_P (XEXP (x, 0)));
break;
case SET:
/* Test if it is a store. */
tmp_class = may_trap_exp (SET_DEST (x), 1);

gcc/ira-build.c

@ -1873,11 +1873,6 @@ create_insn_allocnos (rtx x, rtx outer, bool output_p)
create_insn_allocnos (XEXP (x, 0), NULL, true);
return;
}
else if (code == CLOBBER_HIGH)
{
gcc_assert (REG_P (XEXP (x, 0)) && HARD_REGISTER_P (XEXP (x, 0)));
return;
}
else if (code == MEM)
{
create_insn_allocnos (XEXP (x, 0), NULL, false);

gcc/ira-costs.c

@ -1477,13 +1477,6 @@ scan_one_insn (rtx_insn *insn)
return insn;
}
if (pat_code == CLOBBER_HIGH)
{
gcc_assert (REG_P (XEXP (PATTERN (insn), 0))
&& HARD_REGISTER_P (XEXP (PATTERN (insn), 0)));
return insn;
}
counted_mem = false;
set = single_set (insn);
extract_insn (insn);

gcc/ira.c

@ -3063,7 +3063,6 @@ equiv_init_movable_p (rtx x, int regno)
case CC0:
case CLOBBER:
case CLOBBER_HIGH:
return 0;
case PRE_INC:
@ -3170,7 +3169,6 @@ memref_referenced_p (rtx memref, rtx x, bool read_p)
return memref_referenced_p (memref, SET_SRC (x), true);
case CLOBBER:
case CLOBBER_HIGH:
if (process_set_for_memref_referenced_p (memref, XEXP (x, 0)))
return true;
@ -4451,7 +4449,6 @@ rtx_moveable_p (rtx *loc, enum op_type type)
&& rtx_moveable_p (&XEXP (x, 2), OP_IN));
case CLOBBER:
case CLOBBER_HIGH:
return rtx_moveable_p (&SET_DEST (x), OP_OUT);
case UNSPEC_VOLATILE:
@ -4904,9 +4901,7 @@ interesting_dest_for_shprep (rtx_insn *insn, basic_block call_dom)
for (int i = 0; i < XVECLEN (pat, 0); i++)
{
rtx sub = XVECEXP (pat, 0, i);
if (GET_CODE (sub) == USE
|| GET_CODE (sub) == CLOBBER
|| GET_CODE (sub) == CLOBBER_HIGH)
if (GET_CODE (sub) == USE || GET_CODE (sub) == CLOBBER)
continue;
if (GET_CODE (sub) != SET
|| side_effects_p (sub))

gcc/jump.c

@ -1094,7 +1094,6 @@ mark_jump_label_1 (rtx x, rtx_insn *insn, bool in_mem, bool is_target)
case CC0:
case REG:
case CLOBBER:
case CLOBBER_HIGH:
case CALL:
return;

gcc/lra-eliminations.c

@ -655,7 +655,6 @@ lra_eliminate_regs_1 (rtx_insn *insn, rtx x, machine_mode mem_mode,
return x;
case CLOBBER:
case CLOBBER_HIGH:
case SET:
gcc_unreachable ();
@ -808,16 +807,6 @@ mark_not_eliminable (rtx x, machine_mode mem_mode)
setup_can_eliminate (ep, false);
return;
case CLOBBER_HIGH:
gcc_assert (REG_P (XEXP (x, 0)));
gcc_assert (REGNO (XEXP (x, 0)) < FIRST_PSEUDO_REGISTER);
for (ep = reg_eliminate;
ep < &reg_eliminate[NUM_ELIMINABLE_REGS];
ep++)
if (reg_is_clobbered_by_clobber_high (ep->to_rtx, XEXP (x, 0)))
setup_can_eliminate (ep, false);
return;
case SET:
if (SET_DEST (x) == stack_pointer_rtx
&& GET_CODE (SET_SRC (x)) == PLUS

gcc/lra-int.h

@ -154,8 +154,6 @@ struct lra_insn_reg
/* True if the reg is accessed through a subreg and the subreg is
just a part of the register. */
unsigned int subreg_p : 1;
/* True if the reg is clobber highed by the operand. */
unsigned int clobber_high : 1;
/* The corresponding regno of the register. */
int regno;
/* Next reg info of the same insn. */

gcc/lra-lives.c

@ -668,7 +668,7 @@ process_bb_lives (basic_block bb, int &curr_point, bool dead_insn_p)
bool call_p;
int n_alt, dst_regno, src_regno;
rtx set;
struct lra_insn_reg *reg, *hr;
struct lra_insn_reg *reg;
if (!NONDEBUG_INSN_P (curr_insn))
continue;
@ -700,7 +700,7 @@ process_bb_lives (basic_block bb, int &curr_point, bool dead_insn_p)
break;
}
for (reg = curr_static_id->hard_regs; reg != NULL; reg = reg->next)
if (reg->type != OP_IN && !reg->clobber_high)
if (reg->type != OP_IN)
{
remove_p = false;
break;
@ -837,23 +837,13 @@ process_bb_lives (basic_block bb, int &curr_point, bool dead_insn_p)
unused values because they still conflict with quantities
that are live at the time of the definition. */
for (reg = curr_id->regs; reg != NULL; reg = reg->next)
{
if (reg->type != OP_IN)
{
update_pseudo_point (reg->regno, curr_point, USE_POINT);
mark_regno_live (reg->regno, reg->biggest_mode);
/* ??? Should be a no-op for unused registers. */
check_pseudos_live_through_calls (reg->regno, last_call_abi);
}
if (!HARD_REGISTER_NUM_P (reg->regno))
for (hr = curr_static_id->hard_regs; hr != NULL; hr = hr->next)
if (hr->clobber_high
&& maybe_gt (GET_MODE_SIZE (PSEUDO_REGNO_MODE (reg->regno)),
GET_MODE_SIZE (hr->biggest_mode)))
SET_HARD_REG_BIT (lra_reg_info[reg->regno].conflict_hard_regs,
hr->regno);
}
if (reg->type != OP_IN)
{
update_pseudo_point (reg->regno, curr_point, USE_POINT);
mark_regno_live (reg->regno, reg->biggest_mode);
/* ??? Should be a no-op for unused registers. */
check_pseudos_live_through_calls (reg->regno, last_call_abi);
}
for (reg = curr_static_id->hard_regs; reg != NULL; reg = reg->next)
if (reg->type != OP_IN)

gcc/lra.c

@ -540,13 +540,12 @@ object_allocator<lra_insn_reg> lra_insn_reg_pool ("insn regs");
is reference through subreg (SUBREG_P), and reference to the next
insn reg info (NEXT). If REGNO can be early clobbered,
alternatives in which it can be early clobbered are given by
EARLY_CLOBBER_ALTS. CLOBBER_HIGH marks if reference is a clobber
high. */
EARLY_CLOBBER_ALTS. */
static struct lra_insn_reg *
new_insn_reg (rtx_insn *insn, int regno, enum op_type type,
machine_mode mode, bool subreg_p,
alternative_mask early_clobber_alts,
struct lra_insn_reg *next, bool clobber_high)
struct lra_insn_reg *next)
{
lra_insn_reg *ir = lra_insn_reg_pool.allocate ();
ir->type = type;
@ -556,7 +555,6 @@ new_insn_reg (rtx_insn *insn, int regno, enum op_type type,
lra_reg_info[regno].biggest_mode = mode;
ir->subreg_p = subreg_p;
ir->early_clobber_alts = early_clobber_alts;
ir->clobber_high = clobber_high;
ir->regno = regno;
ir->next = next;
return ir;
@ -824,13 +822,12 @@ setup_operand_alternative (lra_insn_recog_data_t data,
not the insn operands, in X with TYPE (in/out/inout) and flag that
it is early clobbered in the insn (EARLY_CLOBBER) and add the info
to LIST. X is a part of insn given by DATA. Return the result
list. CLOBBER_HIGH marks if X is a clobber high. */
list. */
static struct lra_insn_reg *
collect_non_operand_hard_regs (rtx_insn *insn, rtx *x,
lra_insn_recog_data_t data,
struct lra_insn_reg *list,
enum op_type type, bool early_clobber,
bool clobber_high)
enum op_type type, bool early_clobber)
{
int i, j, regno, last;
bool subreg_p;
@ -890,8 +887,7 @@ collect_non_operand_hard_regs (rtx_insn *insn, rtx *x,
&& regno <= LAST_STACK_REG));
#endif
list = new_insn_reg (data->insn, regno, type, mode, subreg_p,
early_clobber ? ALL_ALTERNATIVES : 0, list,
clobber_high);
early_clobber ? ALL_ALTERNATIVES : 0, list);
}
}
return list;
@ -900,31 +896,24 @@ collect_non_operand_hard_regs (rtx_insn *insn, rtx *x,
{
case SET:
list = collect_non_operand_hard_regs (insn, &SET_DEST (op), data,
list, OP_OUT, false, false);
list, OP_OUT, false);
list = collect_non_operand_hard_regs (insn, &SET_SRC (op), data,
list, OP_IN, false, false);
list, OP_IN, false);
break;
case CLOBBER:
/* We treat clobber of non-operand hard registers as early clobber. */
list = collect_non_operand_hard_regs (insn, &XEXP (op, 0), data,
list, OP_OUT, true, false);
break;
case CLOBBER_HIGH:
/* Clobber high should always span exactly one register. */
gcc_assert (REG_NREGS (XEXP (op, 0)) == 1);
/* We treat clobber of non-operand hard registers as early clobber. */
list = collect_non_operand_hard_regs (insn, &XEXP (op, 0), data,
list, OP_OUT, true, true);
list, OP_OUT, true);
break;
case PRE_INC: case PRE_DEC: case POST_INC: case POST_DEC:
list = collect_non_operand_hard_regs (insn, &XEXP (op, 0), data,
list, OP_INOUT, false, false);
list, OP_INOUT, false);
break;
case PRE_MODIFY: case POST_MODIFY:
list = collect_non_operand_hard_regs (insn, &XEXP (op, 0), data,
list, OP_INOUT, false, false);
list, OP_INOUT, false);
list = collect_non_operand_hard_regs (insn, &XEXP (op, 1), data,
list, OP_IN, false, false);
list, OP_IN, false);
break;
default:
fmt = GET_RTX_FORMAT (code);
@ -932,12 +921,11 @@ collect_non_operand_hard_regs (rtx_insn *insn, rtx *x,
{
if (fmt[i] == 'e')
list = collect_non_operand_hard_regs (insn, &XEXP (op, i), data,
list, OP_IN, false, false);
list, OP_IN, false);
else if (fmt[i] == 'E')
for (j = XVECLEN (op, i) - 1; j >= 0; j--)
list = collect_non_operand_hard_regs (insn, &XVECEXP (op, i, j),
data, list, OP_IN, false,
false);
data, list, OP_IN, false);
}
}
return list;
@ -1086,7 +1074,7 @@ lra_set_insn_recog_data (rtx_insn *insn)
else
insn_static_data->hard_regs
= collect_non_operand_hard_regs (insn, &PATTERN (insn), data,
NULL, OP_IN, false, false);
NULL, OP_IN, false);
data->arg_hard_regs = NULL;
if (CALL_P (insn))
{
@ -1112,10 +1100,6 @@ lra_set_insn_recog_data (rtx_insn *insn)
arg_hard_regs[n_hard_regs++]
= regno + i + (use_p ? 0 : FIRST_PSEUDO_REGISTER);
}
else if (GET_CODE (XEXP (link, 0)) == CLOBBER_HIGH)
/* We could support CLOBBER_HIGH and treat it in the same way as
HARD_REGNO_CALL_PART_CLOBBERED, but no port needs that yet. */
gcc_unreachable ();
if (n_hard_regs != 0)
{
@ -1475,7 +1459,7 @@ add_regs_to_insn_regno_info (lra_insn_recog_data_t data, rtx x,
if (bitmap_set_bit (&lra_reg_info[regno].insn_bitmap, INSN_UID (insn)))
{
data->regs = new_insn_reg (data->insn, regno, type, mode, subreg_p,
early_clobber_alts, data->regs, false);
early_clobber_alts, data->regs);
return;
}
else
@ -1488,7 +1472,7 @@ add_regs_to_insn_regno_info (lra_insn_recog_data_t data, rtx x,
structure. */
data->regs = new_insn_reg (data->insn, regno, type, mode,
subreg_p, early_clobber_alts,
data->regs, false);
data->regs);
else
{
if (curr->type != type)
@ -1513,8 +1497,6 @@ add_regs_to_insn_regno_info (lra_insn_recog_data_t data, rtx x,
add_regs_to_insn_regno_info (data, XEXP (x, 0), insn, OP_OUT,
ALL_ALTERNATIVES);
break;
case CLOBBER_HIGH:
gcc_unreachable ();
case PRE_INC: case PRE_DEC: case POST_INC: case POST_DEC:
add_regs_to_insn_regno_info (data, XEXP (x, 0), insn, OP_INOUT, 0);
break;
@ -1650,9 +1632,6 @@ lra_update_insn_regno_info (rtx_insn *insn)
link = XEXP (link, 1))
{
code = GET_CODE (XEXP (link, 0));
/* We could support CLOBBER_HIGH and treat it in the same way as
HARD_REGNO_CALL_PART_CLOBBERED, but no port needs that yet. */
gcc_assert (code != CLOBBER_HIGH);
if ((code == USE || code == CLOBBER)
&& MEM_P (XEXP (XEXP (link, 0), 0)))
add_regs_to_insn_regno_info (data, XEXP (XEXP (link, 0), 0), insn,

gcc/postreload.c

@ -134,8 +134,6 @@ reload_cse_simplify (rtx_insn *insn, rtx testreg)
for (i = XVECLEN (body, 0) - 1; i >= 0; --i)
{
rtx part = XVECEXP (body, 0, i);
/* asms can only have full clobbers, not clobber_highs. */
gcc_assert (GET_CODE (part) != CLOBBER_HIGH);
if (GET_CODE (part) == CLOBBER && REG_P (XEXP (part, 0)))
cselib_invalidate_rtx (XEXP (part, 0));
}
@ -158,9 +156,7 @@ reload_cse_simplify (rtx_insn *insn, rtx testreg)
value = SET_DEST (part);
}
}
else if (GET_CODE (part) != CLOBBER
&& GET_CODE (part) != CLOBBER_HIGH
&& GET_CODE (part) != USE)
else if (GET_CODE (part) != CLOBBER && GET_CODE (part) != USE)
break;
}
@ -1515,10 +1511,6 @@ reload_combine_note_use (rtx *xp, rtx_insn *insn, int ruid, rtx containing_mem)
}
break;
case CLOBBER_HIGH:
gcc_assert (REG_P (SET_DEST (x)));
return;
case PLUS:
/* We are interested in (plus (reg) (const_int)) . */
if (!REG_P (XEXP (x, 0))
@ -2284,13 +2276,6 @@ move2add_note_store (rtx dst, const_rtx set, void *data)
move2add_record_mode (dst);
}
else if (GET_CODE (set) == CLOBBER_HIGH)
{
/* Only invalidate if actually clobbered. */
if (reg_mode[regno] == BLKmode
|| reg_is_clobbered_by_clobber_high (regno, reg_mode[regno], dst))
goto invalidate;
}
else
{
invalidate:

gcc/print-rtl.c

@ -1763,7 +1763,6 @@ print_pattern (pretty_printer *pp, const_rtx x, int verbose)
print_exp (pp, x, verbose);
break;
case CLOBBER:
case CLOBBER_HIGH:
case USE:
pp_printf (pp, "%s ", GET_RTX_NAME (GET_CODE (x)));
print_value (pp, XEXP (x, 0), verbose);

gcc/recog.c

@ -3726,8 +3726,7 @@ store_data_bypass_p_1 (rtx_insn *out_insn, rtx in_set)
{
rtx out_exp = XVECEXP (out_pat, 0, i);
if (GET_CODE (out_exp) == CLOBBER || GET_CODE (out_exp) == USE
|| GET_CODE (out_exp) == CLOBBER_HIGH)
if (GET_CODE (out_exp) == CLOBBER || GET_CODE (out_exp) == USE)
continue;
gcc_assert (GET_CODE (out_exp) == SET);
@ -3758,8 +3757,7 @@ store_data_bypass_p (rtx_insn *out_insn, rtx_insn *in_insn)
{
rtx in_exp = XVECEXP (in_pat, 0, i);
if (GET_CODE (in_exp) == CLOBBER || GET_CODE (in_exp) == USE
|| GET_CODE (in_exp) == CLOBBER_HIGH)
if (GET_CODE (in_exp) == CLOBBER || GET_CODE (in_exp) == USE)
continue;
gcc_assert (GET_CODE (in_exp) == SET);
@ -3811,7 +3809,7 @@ if_test_bypass_p (rtx_insn *out_insn, rtx_insn *in_insn)
{
rtx exp = XVECEXP (out_pat, 0, i);
if (GET_CODE (exp) == CLOBBER || GET_CODE (exp) == CLOBBER_HIGH)
if (GET_CODE (exp) == CLOBBER)
continue;
gcc_assert (GET_CODE (exp) == SET);

gcc/regcprop.c

@ -238,11 +238,8 @@ static void
kill_clobbered_value (rtx x, const_rtx set, void *data)
{
struct value_data *const vd = (struct value_data *) data;
gcc_assert (GET_CODE (set) != CLOBBER_HIGH || REG_P (x));
if (GET_CODE (set) == CLOBBER
|| (GET_CODE (set) == CLOBBER_HIGH
&& reg_is_clobbered_by_clobber_high (x, XEXP (set, 0))))
if (GET_CODE (set) == CLOBBER)
kill_value (x, vd);
}
@ -263,8 +260,7 @@ kill_set_value (rtx x, const_rtx set, void *data)
if (rtx_equal_p (x, ksvd->ignore_set_reg))
return;
gcc_assert (GET_CODE (set) != CLOBBER_HIGH || REG_P (x));
if (GET_CODE (set) != CLOBBER && GET_CODE (set) != CLOBBER_HIGH)
if (GET_CODE (set) != CLOBBER)
{
kill_value (x, ksvd->vd);
if (REG_P (x))

gcc/reginfo.c

@ -1025,10 +1025,6 @@ reg_scan_mark_refs (rtx x, rtx_insn *insn)
reg_scan_mark_refs (XEXP (XEXP (x, 0), 0), insn);
break;
case CLOBBER_HIGH:
gcc_assert (!(MEM_P (XEXP (x, 0))));
break;
case SET:
/* Count a set of the destination if it is a register. */
for (dest = SET_DEST (x);

gcc/reload1.c

@ -1337,8 +1337,6 @@ maybe_fix_stack_asms (void)
rtx t = XVECEXP (pat, 0, i);
if (GET_CODE (t) == CLOBBER && STACK_REG_P (XEXP (t, 0)))
SET_HARD_REG_BIT (clobbered, REGNO (XEXP (t, 0)));
/* CLOBBER_HIGH is only supported for LRA. */
gcc_assert (GET_CODE (t) != CLOBBER_HIGH);
}
/* Get the operand values and constraints out of the insn. */
@ -2879,7 +2877,6 @@ eliminate_regs_1 (rtx x, machine_mode mem_mode, rtx insn,
return x;
case CLOBBER:
case CLOBBER_HIGH:
case ASM_OPERANDS:
gcc_assert (insn && DEBUG_INSN_P (insn));
break;
@ -3090,10 +3087,6 @@ elimination_effects (rtx x, machine_mode mem_mode)
elimination_effects (XEXP (x, 0), mem_mode);
return;
case CLOBBER_HIGH:
/* CLOBBER_HIGH is only supported for LRA. */
return;
case SET:
/* Check for setting a register that we know about. */
if (REG_P (SET_DEST (x)))
@ -3725,9 +3718,6 @@ mark_not_eliminable (rtx dest, const_rtx x, void *data ATTRIBUTE_UNUSED)
if (dest == hard_frame_pointer_rtx)
return;
/* CLOBBER_HIGH is only supported for LRA. */
gcc_assert (GET_CODE (x) != CLOBBER_HIGH);
for (i = 0; i < NUM_ELIMINABLE_REGS; i++)
if (reg_eliminate[i].can_eliminate && dest == reg_eliminate[i].to_rtx
&& (GET_CODE (x) != SET
@ -4355,7 +4345,6 @@ scan_paradoxical_subregs (rtx x)
case PC:
case USE:
case CLOBBER:
case CLOBBER_HIGH:
return;
case SUBREG:
@ -4809,8 +4798,7 @@ reload_as_needed (int live_known)
to be forgotten later. */
static void
forget_old_reloads_1 (rtx x, const_rtx setter,
void *data)
forget_old_reloads_1 (rtx x, const_rtx, void *data)
{
unsigned int regno;
unsigned int nr;
@ -4829,9 +4817,6 @@ forget_old_reloads_1 (rtx x, const_rtx setter,
if (!REG_P (x))
return;
/* CLOBBER_HIGH is only supported for LRA. */
gcc_assert (setter == NULL_RTX || GET_CODE (setter) != CLOBBER_HIGH);
regno = REGNO (x);
if (regno >= FIRST_PSEUDO_REGISTER)

gcc/reorg.c

@ -410,8 +410,7 @@ find_end_label (rtx kind)
while (NOTE_P (insn)
|| (NONJUMP_INSN_P (insn)
&& (GET_CODE (PATTERN (insn)) == USE
|| GET_CODE (PATTERN (insn)) == CLOBBER
|| GET_CODE (PATTERN (insn)) == CLOBBER_HIGH)))
|| GET_CODE (PATTERN (insn)) == CLOBBER)))
insn = PREV_INSN (insn);
/* When a target threads its epilogue we might already have a
@ -1311,8 +1310,7 @@ try_merge_delay_insns (rtx_insn *insn, rtx_insn *thread)
/* TRIAL must be a CALL_INSN or INSN. Skip USE and CLOBBER. */
if (NONJUMP_INSN_P (trial)
&& (GET_CODE (pat) == USE || GET_CODE (pat) == CLOBBER
|| GET_CODE (pat) == CLOBBER_HIGH))
&& (GET_CODE (pat) == USE || GET_CODE (pat) == CLOBBER))
continue;
if (GET_CODE (next_to_match) == GET_CODE (trial)
@ -1506,8 +1504,7 @@ redundant_insn (rtx insn, rtx_insn *target, const vec<rtx_insn *> &delay_list)
--insns_to_search;
pat = PATTERN (trial);
if (GET_CODE (pat) == USE || GET_CODE (pat) == CLOBBER
|| GET_CODE (pat) == CLOBBER_HIGH)
if (GET_CODE (pat) == USE || GET_CODE (pat) == CLOBBER)
continue;
if (GET_CODE (trial) == DEBUG_INSN)
@ -1605,8 +1602,7 @@ redundant_insn (rtx insn, rtx_insn *target, const vec<rtx_insn *> &delay_list)
--insns_to_search;
pat = PATTERN (trial);
if (GET_CODE (pat) == USE || GET_CODE (pat) == CLOBBER
|| GET_CODE (pat) == CLOBBER_HIGH)
if (GET_CODE (pat) == USE || GET_CODE (pat) == CLOBBER)
continue;
if (GET_CODE (trial) == DEBUG_INSN)
@ -1718,8 +1714,7 @@ own_thread_p (rtx thread, rtx label, int allow_fallthrough)
|| LABEL_P (insn)
|| (NONJUMP_INSN_P (insn)
&& GET_CODE (PATTERN (insn)) != USE
&& GET_CODE (PATTERN (insn)) != CLOBBER
&& GET_CODE (PATTERN (insn)) != CLOBBER_HIGH))
&& GET_CODE (PATTERN (insn)) != CLOBBER))
return 0;
return 1;
@ -2042,8 +2037,7 @@ fill_simple_delay_slots (int non_jumps_p)
pat = PATTERN (trial);
/* Stand-alone USE and CLOBBER are just for flow. */
if (GET_CODE (pat) == USE || GET_CODE (pat) == CLOBBER
|| GET_CODE (pat) == CLOBBER_HIGH)
if (GET_CODE (pat) == USE || GET_CODE (pat) == CLOBBER)
continue;
/* And DEBUG_INSNs never go into delay slots. */
@ -2169,8 +2163,7 @@ fill_simple_delay_slots (int non_jumps_p)
pat = PATTERN (trial);
/* Stand-alone USE and CLOBBER are just for flow. */
if (GET_CODE (pat) == USE || GET_CODE (pat) == CLOBBER
|| GET_CODE (pat) == CLOBBER_HIGH)
if (GET_CODE (pat) == USE || GET_CODE (pat) == CLOBBER)
continue;
/* And DEBUG_INSNs do not go in delay slots. */
@ -2438,8 +2431,7 @@ fill_slots_from_thread (rtx_jump_insn *insn, rtx condition,
}
pat = PATTERN (trial);
if (GET_CODE (pat) == USE || GET_CODE (pat) == CLOBBER
|| GET_CODE (pat) == CLOBBER_HIGH)
if (GET_CODE (pat) == USE || GET_CODE (pat) == CLOBBER)
continue;
if (GET_CODE (trial) == DEBUG_INSN)
@ -3833,8 +3825,7 @@ dbr_schedule (rtx_insn *first)
if (! insn->deleted ()
&& NONJUMP_INSN_P (insn)
&& GET_CODE (PATTERN (insn)) != USE
&& GET_CODE (PATTERN (insn)) != CLOBBER
&& GET_CODE (PATTERN (insn)) != CLOBBER_HIGH)
&& GET_CODE (PATTERN (insn)) != CLOBBER)
{
if (GET_CODE (PATTERN (insn)) == SEQUENCE)
{

gcc/resource.c

@ -109,11 +109,6 @@ update_live_status (rtx dest, const_rtx x, void *data ATTRIBUTE_UNUSED)
if (GET_CODE (x) == CLOBBER)
for (i = first_regno; i < last_regno; i++)
CLEAR_HARD_REG_BIT (current_live_regs, i);
else if (GET_CODE (x) == CLOBBER_HIGH)
/* No current target supports both branch delay slots and CLOBBER_HIGH.
We'd need more elaborate liveness tracking to handle that
combination. */
gcc_unreachable ();
else
for (i = first_regno; i < last_regno; i++)
{
@ -299,7 +294,6 @@ mark_referenced_resources (rtx x, struct resources *res,
return;
case CLOBBER:
case CLOBBER_HIGH:
return;
case CALL_INSN:
@ -670,15 +664,9 @@ mark_set_resources (rtx x, struct resources *res, int in_dest,
for (link = CALL_INSN_FUNCTION_USAGE (call_insn);
link; link = XEXP (link, 1))
{
/* We could support CLOBBER_HIGH and treat it in the same way as
HARD_REGNO_CALL_PART_CLOBBERED, but no port needs that
yet. */
gcc_assert (GET_CODE (XEXP (link, 0)) != CLOBBER_HIGH);
if (GET_CODE (XEXP (link, 0)) == CLOBBER)
mark_set_resources (SET_DEST (XEXP (link, 0)), res, 1,
MARK_SRC_DEST);
}
if (GET_CODE (XEXP (link, 0)) == CLOBBER)
mark_set_resources (SET_DEST (XEXP (link, 0)), res, 1,
MARK_SRC_DEST);
/* Check for a REG_SETJMP. If it exists, then we must
assume that this call can clobber any register. */
@ -721,12 +709,6 @@ mark_set_resources (rtx x, struct resources *res, int in_dest,
mark_set_resources (XEXP (x, 0), res, 1, MARK_SRC_DEST);
return;
case CLOBBER_HIGH:
/* No current target supports both branch delay slots and CLOBBER_HIGH.
We'd need more elaborate liveness tracking to handle that
combination. */
gcc_unreachable ();
case SEQUENCE:
{
rtx_sequence *seq = as_a <rtx_sequence *> (x);

gcc/rtl.c

@ -310,10 +310,6 @@ copy_rtx (rtx orig)
return orig;
break;
case CLOBBER_HIGH:
gcc_assert (REG_P (XEXP (orig, 0)));
return orig;
case CONST:
if (shared_const_p (orig))
return orig;

gcc/rtl.def

@ -312,16 +312,6 @@ DEF_RTL_EXPR(USE, "use", "e", RTX_EXTRA)
is considered undeletable before reload. */
DEF_RTL_EXPR(CLOBBER, "clobber", "e", RTX_EXTRA)
/* Indicate that the upper parts of something are clobbered in a way that we
don't want to explain. The MODE references the lower bits that will be
preserved. Anything above that size will be clobbered.
CLOBBER_HIGH only occurs as the operand of a PARALLEL rtx. It cannot appear
in other contexts, and unlike CLOBBER, it cannot appear on its own.
CLOBBER_HIGH can only be used with fixed register rtxes. */
DEF_RTL_EXPR(CLOBBER_HIGH, "clobber_high", "e", RTX_EXTRA)
/* Call a subroutine.
Operand 1 is the address to call.
Operand 2 is the number of arguments. */

gcc/rtl.h

@ -2679,7 +2679,7 @@ do { \
/* For a SET rtx, SET_DEST is the place that is set
and SET_SRC is the value it is set to. */
#define SET_DEST(RTX) XC3EXP (RTX, 0, SET, CLOBBER, CLOBBER_HIGH)
#define SET_DEST(RTX) XC2EXP (RTX, 0, SET, CLOBBER)
#define SET_SRC(RTX) XCEXP (RTX, 1, SET)
#define SET_IS_RETURN_P(RTX) \
(RTL_FLAG_CHECK1 ("SET_IS_RETURN_P", (RTX), SET)->jump)
@ -3524,16 +3524,6 @@ extern rtx tablejump_casesi_pattern (const rtx_insn *insn);
extern int computed_jump_p (const rtx_insn *);
extern bool tls_referenced_p (const_rtx);
extern bool contains_mem_rtx_p (rtx x);
extern bool reg_is_clobbered_by_clobber_high (unsigned int, machine_mode,
const_rtx);
/* Convenient wrapper for reg_is_clobbered_by_clobber_high. */
inline bool
reg_is_clobbered_by_clobber_high (const_rtx x, const_rtx clobber_high_op)
{
return reg_is_clobbered_by_clobber_high (REGNO (x), GET_MODE (x),
clobber_high_op);
}
/* Overload for refers_to_regno_p for checking a single register. */
inline bool
@ -4330,7 +4320,6 @@ extern void vt_equate_reg_base_value (const_rtx, const_rtx);
extern bool memory_modified_in_insn_p (const_rtx, const_rtx);
extern bool may_be_sp_based_p (rtx);
extern rtx gen_hard_reg_clobber (machine_mode, unsigned int);
extern rtx gen_hard_reg_clobber_high (machine_mode, unsigned int);
extern rtx get_reg_known_value (unsigned int);
extern bool get_reg_known_equiv_p (unsigned int);
extern rtx get_reg_base_value (unsigned int);

gcc/rtlanal.c

@ -1216,10 +1216,6 @@ reg_referenced_p (const_rtx x, const_rtx body)
return 1;
return 0;
case CLOBBER_HIGH:
gcc_assert (REG_P (XEXP (body, 0)));
return 0;
case COND_EXEC:
if (reg_overlap_mentioned_p (x, COND_EXEC_TEST (body)))
return 1;
@ -1442,11 +1438,7 @@ set_of_1 (rtx x, const_rtx pat, void *data1)
{
struct set_of_data *const data = (struct set_of_data *) (data1);
if (rtx_equal_p (x, data->pat)
|| (GET_CODE (pat) == CLOBBER_HIGH
&& REGNO(data->pat) == REGNO(XEXP (pat, 0))
&& reg_is_clobbered_by_clobber_high (data->pat, XEXP (pat, 0)))
|| (GET_CODE (pat) != CLOBBER_HIGH && !MEM_P (x)
&& reg_overlap_mentioned_p (data->pat, x)))
|| (!MEM_P (x) && reg_overlap_mentioned_p (data->pat, x)))
data->found = pat;
}
@ -1533,7 +1525,6 @@ single_set_2 (const rtx_insn *insn, const_rtx pat)
{
case USE:
case CLOBBER:
case CLOBBER_HIGH:
break;
case SET:
@ -1687,9 +1678,7 @@ noop_move_p (const rtx_insn *insn)
{
rtx tem = XVECEXP (pat, 0, i);
if (GET_CODE (tem) == USE
|| GET_CODE (tem) == CLOBBER
|| GET_CODE (tem) == CLOBBER_HIGH)
if (GET_CODE (tem) == USE || GET_CODE (tem) == CLOBBER)
continue;
if (GET_CODE (tem) != SET || ! set_noop_p (tem))
@ -1923,9 +1912,7 @@ note_pattern_stores (const_rtx x,
if (GET_CODE (x) == COND_EXEC)
x = COND_EXEC_CODE (x);
if (GET_CODE (x) == SET
|| GET_CODE (x) == CLOBBER
|| GET_CODE (x) == CLOBBER_HIGH)
if (GET_CODE (x) == SET || GET_CODE (x) == CLOBBER)
{
rtx dest = SET_DEST (x);
@ -6658,32 +6645,3 @@ tls_referenced_p (const_rtx x)
return true;
return false;
}
/* Return true if reg REGNO with mode REG_MODE would be clobbered by the
clobber_high operand in CLOBBER_HIGH_OP. */
bool
reg_is_clobbered_by_clobber_high (unsigned int regno, machine_mode reg_mode,
const_rtx clobber_high_op)
{
unsigned int clobber_regno = REGNO (clobber_high_op);
machine_mode clobber_mode = GET_MODE (clobber_high_op);
unsigned char regno_nregs = hard_regno_nregs (regno, reg_mode);
/* Clobber high should always span exactly one register. */
gcc_assert (REG_NREGS (clobber_high_op) == 1);
/* Clobber high needs to match with one of the registers in X. */
if (clobber_regno < regno || clobber_regno >= regno + regno_nregs)
return false;
gcc_assert (reg_mode != BLKmode && clobber_mode != BLKmode);
if (reg_mode == VOIDmode)
return clobber_mode != VOIDmode;
/* Clobber high will clobber if its size might be greater than the size of
register regno. */
return maybe_gt (exact_div (GET_MODE_SIZE (reg_mode), regno_nregs),
GET_MODE_SIZE (clobber_mode));
}

gcc/sched-deps.c

@ -2320,13 +2320,6 @@ sched_analyze_reg (class deps_desc *deps, int regno, machine_mode mode,
while (--i >= 0)
note_reg_use (regno + i);
}
else if (ref == CLOBBER_HIGH)
{
gcc_assert (i == 1);
/* We don't know the current state of the register, so have to treat
the clobber high as a full clobber. */
note_reg_clobber (regno);
}
else
{
while (--i >= 0)
@ -2350,8 +2343,6 @@ sched_analyze_reg (class deps_desc *deps, int regno, machine_mode mode,
else if (ref == USE)
note_reg_use (regno);
else
/* For CLOBBER_HIGH, we don't know the current state of the register,
so have to treat it as a full clobber. */
note_reg_clobber (regno);
/* Pseudos that are REG_EQUIV to something may be replaced
@ -2974,7 +2965,7 @@ sched_analyze_insn (class deps_desc *deps, rtx x, rtx_insn *insn)
sub = COND_EXEC_CODE (sub);
code = GET_CODE (sub);
}
else if (code == SET || code == CLOBBER || code == CLOBBER_HIGH)
else if (code == SET || code == CLOBBER)
sched_analyze_1 (deps, sub, insn);
else
sched_analyze_2 (deps, sub, insn);
@ -2990,10 +2981,6 @@ sched_analyze_insn (class deps_desc *deps, rtx x, rtx_insn *insn)
{
if (GET_CODE (XEXP (link, 0)) == CLOBBER)
sched_analyze_1 (deps, XEXP (link, 0), insn);
else if (GET_CODE (XEXP (link, 0)) == CLOBBER_HIGH)
/* We could support CLOBBER_HIGH and treat it in the same way as
HARD_REGNO_CALL_PART_CLOBBERED, but no port needs that yet. */
gcc_unreachable ();
else if (GET_CODE (XEXP (link, 0)) != SET)
sched_analyze_2 (deps, XEXP (link, 0), insn);
}