Sat Mar 25 09:12:10 2000  Richard Kenner  <kenner@vlsi1.ultra.nyu.edu>

	* Rework fields used to describe positions of bitfields and
	modify sizes to be unsigned and use HOST_WIDE_INT.
	* alias.c (reg_known_value_size): Now unsigned.
	* c-typeck.c (build_unary_op, case ADDR_EXPR): Use byte_position.
	(really_start_incremental_init): Use bitsize_zero_node.
	(push_init_level, pop_init_level, output_init_element): Likewise.
	Use bitsize_unit_node and bitsize_one_node.
	(output_pending_init_elements, process_init_element): Likewise.
	* combine.c (combine_max_regno, reg_sign_bit_copies): Now unsigned.
	(make_extraction): Position and length HOST_WIDE_INT and unsigned
	HOST_WIDE_INT, respectively.
	(get_pos_from_mask): Passed in value is unsigned HOST_WIDE_INT.
	(num_sign_bit_copies): Returns unsigned.
	BITWIDTH now unsigned; rework arithmetic.
	Remove recursive call from arg to MAX.
	(combine_instructions, init_reg_last_arrays): NREGS now unsigned.
	(setup_incoming_promotions, can_combine_p, try_combine, simplify_set):
	REGNO now unsigned.
	(set_nonzero_bit_and_sign_copies): NUM now unsigned.
	(find_split_point, expand_compound_operation, make_extraction): LEN
	now unsigned HOST_WIDE_INT, POS now HOST_WIDE_INT.
	(make_field_assignment): Likewise.
	(combine_simplify_rtx): Add cast.
	(expand_compound_operation): MODEWIDTH now unsigned; rework arithmetic.
	(force_to_mode): WIDTH now unsigned; add cast.
	(if_then_else_cond): SIZE now unsigned.
	(nonzero_bits): MODE_WIDTH, RESULT_WIDTH, and WIDTH now unsigned.
	(extended_count): Now returns unsigned.
	(simplify_shift_const): COUNT unsigned; arg is now INPUT_COUNT.
	Add SIGNED_COUNT variable; MODE_WORDS and FIRST_COUNT now unsigned.
	(simplify_comparison): MODE_WIDTH now unsigned.
	(update_table_tick): REGNO and ENDREGNO now unsigned; new var R.
	(mark_used_regs_combine): Likewise; rework arithmetic.
	(record_value_for_reg): REGNO, ENDREGNO, and I now unsigned.
	(record_dead_and_set_regs, reg_dead_at_p, distribute_notes): Likewise.
	(record_promoted_value): REGNO now unsigned.
	(get_last_value_validate): REGNO, ENDREGNO, and J now unsigned.
	(get_last_value): REGNO now unsigned.
	(use_crosses_set_p): REGNO and ENDREGNO now unsigned.
	(reg_dead_regno, reg_dead_endregno): Now unsigned.
	(remove_death): Arg REGNO now unsigned.
	(move_deaths): REGNO, DEADREGNO, DEADEND, OUREND, and I now unsigned.
	(reg_bitfield_target_p): REGNO, TREGNO, ENDREGNO, and ENDTREGNO
	now unsigned.
	* convert.c (convert_to_integer): INPREC and OUTPREC now unsigned.
	* cse.c (struct qty_table_elem): FIRST_REG and LAST_REG now unsigned.
	(struct cse_reg_info): REGNO now unsigned.
	(cached_regno): Now unsigned.
	(REGNO_QTY_VALID_P): Add cast.
	(make_new_qty, make_regs_eqv, delete_reg_equiv): Regno args unsigned.
	(remove_invalid_regs): Likewise.
	(remove_invalid_subreg_refs): Likewise; arg WORD also unsigned
	as are variables END and I.
	(get_cse_reg_info, insert): Likewise.
	(mention_regs, invalidate_for_call): REGNO, ENDREGNO, and I unsigned.
	(canon_hash): Likewise.
	(insert_regs, lookup_for_remove): REGNO now unsigned.
	(invalidate): REGNO, ENDREGNO, TREGNO, and TENDREGNO now unsigned.
	New variable RN.
	* dbxout.c (dbxout_parms, dbxout_reg_parms): Don't check for REGNO < 0.
	* dwarf2out.c (dwarf2out_frame_debug_expr): Remove cast.
	* emit-rtl.c (subreg_realpart_p): Add cast.
	(operand_subword): Arg I is now unsigned as is var PARTWORDS.
	(operand_subword_force): Arg I is now unsigned.
	* except.c (eh_regs): Variable I is now unsigned.
	* explow.c (hard_function_value): BYTES is unsigned HOST_WIDE_INT.
	* expmed.c (store_fixed_bit_field): Position is HOST_WIDE_INT;
	length is unsigned HOST_WIDE_INT; likewise for internal variables.
	(store_split_bit_field, extract_fixed_bit_field): Likewise.
	(extract_split_bit_field, store_bit_field, extract_bit_field):
	Likewise.
	* expr.c (store_constructor_fields, store_constructor, store_field):
	Positions are HOST_WIDE_INT and lengths are unsigned HOST_WIDE_INT.
	(expand_assignment, expand_expr, expand_expr_unaligned): Likewise.
	(do_jump): Likewise.
	(move_by_pieces, move_by_pieces_ninsns, clear_by_pieces):
	MAX_SIZE is now unsigned.
	(emit_group_load): BYTEPOS is HOST_WIDE_INT; BYTELEN is unsigned.
	(emit_group_store): Likewise.
	(emit_move_insn): I now unsigned.
	(store_constructor): Use host_integerp, tree_low_cst, and
	bitsize_unit_node.
	(get_inner_reference): Return bitpos and bitsize as HOST_WIDE_INT.
	Rework all calculations to use trees and new fields.
	* expr.h (promoted_input_arg): Regno now unsigned.
	(store_bit_field, extract_bit_field): Adjust types of pos and size.
	(mark_seen_cases): Arg is HOST_WIDE_INT.
	* flow.c (verify_wide_reg_1): REGNO now unsigned.
	* fold-const.c (decode_field_reference): Size and pos HOST_WIDE_INT;
	precisions and alignments are unsigned.
	(optimize_bit_field_compare, fold_truthop): Likewise.
	(int_const_binop): Adjust threshold for size_int_type_wide call.
	(fold_convert): Likewise.
	(size_int_type_wide): Make table larger and fix thinko that only
	had half of table used.
	(all_ones_mask_p, fold): Precisions are unsigned.
	* function.c (put_reg_into_stack): REGNO is unsigned.
	(instantiate_decl): Size is HOST_WIDE_INT.
	(instantiate_virtual_regs): I is unsigned.
	(assign_parms): REGNO, REGNOI, and REGNOR are unsigned.
	(promoted_input_arg): REGNO is unsigned.
	* function.h (struct function): x_max_parm_reg is now unsigned.
	* gcse.c (max_gcse_regno): Now unsigned.
	(struct null_pointer_info): min_reg and max_reg now unsigned.
	(lookup_set, next_set): REGNO arg now unsigned.
	(compute_hash_table): REGNO and I now unsigned.
	(handle_avail_expr): regnum_for_replacing now unsigned.
	(cprop_insn): REGNO now unsigned.
	(delete_null_pointer_checks_1): BLOCK_REG now pointer to unsigned.
	* ggc-common.c (ggc_mark_tree_children, case FIELD_DECL): New case.
	* global.c (set_preference): SRC_REGNO, DEST_REGNO, and I now unsigned.
	* hard-reg-set.h (reg_class_size): Now unsigned.
	* integrate.c (mark_stores): LAST_REG and I now unsigned; new UREGNO.
	* jump.c (mark_modified_reg): I now unsigned; add cast.
	(rtx_equal_for_thread_p): Add cast.
	* loop.c (max_reg_before_loop): Now unsigned.
	(struct movable): REGNO now unsigned.
	(try_copy_prop): REGNO arg unsigned.
	(regs_match_p): XN and YN now unsigned.
	(consec_sets_invariant_p, maybe_eliminate_biv): REGNO now unsigned.
	(strength_reduce): Likewise; NREGS also unsigned.
	(first_increment_giv, last_increment_giv): Now unsigned.
	* loop.h (struct iv_class): REGNO now unsigned.
	(max_reg_before_loop, first_increment_giv, last_increment_giv):
	Now unsigned.
	* machmode.h (mode_size, mode_unit_size): Now unsigned.
	(mode_for_size, smallest_mode_for_size): Pass size as unsigned.
	* optabs.c (expand_binop): I and NWORDS now unsigned.
	(expand_unop): I now unsigned.
	* print-tree.c (print_node): Don't print DECL_FIELD_BITPOS, but do
	print DECL_FIELD_OFFSET and DECL_FIELD_BIT_OFFSET.
	* real.c (significand_size): Now returns unsigned.
	* real.h (significand_size): Likewise.
	* regclass.c (reg_class_size): Now unsigned.
	(choose_hard_reg_mode): Both operands now unsigned.
	(record_reg_classes): REGNO and NR now unsigned.
	(reg_scan): NREGS now unsigned.
	(reg_scan_update): old_max_regno now unsigned.
	(reg_scan_mark_refs): Arg MIN_REGNO and var REGNO now unsigned.
	* reload.c (find_valid_class): BEST_SIZE now unsigned.
	(find_dummy_reload): REGNO, NWORDS, and I now unsigned.
	(hard_reg_set_here_p): Args BEG_REGNO and END_REGNO now unsigned.
	Likewise for variable R.
	(refers_to_regno_for_reload_p): Args REGNO and END_REGNO now unsigned,
	as are variables INNER_REGNO and INNER_ENDREGNO; add new variable R.
	(find_equiv_reg): Add casts.
	(regno_clobbered_p): Arg REGNO now unsigned.
	* reload.h (struct reload): NREGS now unsigned.
	(refers_to_regno_for_reload_p): Regno args are unsigned.
	(regno_clobbered_p): Likewise.
	* reload1.c (reg_max_ref_width, spill_stack_slot_width): Now unsigned.
	(compute_use_by_pseudos): REGNO now unsigned.
	(find_reg): I and J now unsigned, new variable K, and change loop
	variables accordingly; THIS_NREGS now unsigned.
	(alter_reg): INHERENT_SIZE and TOTAL_SIZE now unsigned.
	(spill_hard_reg): REGNO arg now unsigned; add casts.
	(forget_old_reloads_1): REGNO, NR, and I now unsigned.
	(mark_reload_reg_in_use): Arg REGNO and vars NREGS and I now unsigned.
	(clear_reload_reg_in_use): Arg REGNO and vars NREGS, START_REGNO,
	END_REGNO, CONFLICT_START, and CONFLICT_END now unsigned.
	(reload_reg_free_p, reload_reg_reaches_end_p): Arg REGNO now unsigned.
	(choose_reload_regs): MAX_GROUP_SIZE now unsigned.
	(emit_reload_insns): REGNO now unsigned.
	(reload_cse_move2add): Add cast.
	(move2add_note_store): REGNO and I now unsigned; new variable ENDREGNO
	and rework loop.
	* resource.c (mark_referenced_resources, mark_set_resources): New
	variable R; REGNO and LAST_REGNO now unsigned.
	(mark_target_live_regs): J and REGNO now unsigned.
	* rtl.c (mode_size, mode_unit_size): Now unsigned.
	* rtl.h (union rtunion_def): New field rtuint.
	(XCUINT): New macro.
	(ADDRESSOF_REGNO, REGNO, SUBREG_WORD): New XCUINT.
	(operand_subword, operand_subword_force): Word number is unsigned.
	(choose_hard_reg_mode): Operands are unsigned.
	(refers_to_regno_p, dead_or_set_regno_p): Regno arg is unsigned.
	(find_regno_note, find_regno_fusage, replace_regs): Likewise.
	(regno_use_in, combine_instructions, remove_death): Likewise.
	(reg_scan, reg_scan_update): Likewise.
	(extended_count): Return is unsigned.
	* rtlanal.c (refers_to_regno_p): Args REGNO and ENDREGNO and vars I,
	INNER_REGNO, and INNER_ENDREGNO now unsigned; new variable X_REGNO.
	(reg_overlap_mentioned_p): REGNO and ENDREGNO now unsigned.
	(reg_set_last_first_regno, reg_set_last_last_regno): Now unsigned.
	(reg_reg_last_1): FIRST and LAST now unsigned.
	(dead_or_set_p): REGNO, LAST_REGNO, and I now unsigned.
	(dead_or_set_regno_p): Arg TEST_REGNO and vars REGNO and ENDREGNO
	now unsigned.
	(find_regno_note, regno_use_in): Arg REGNO now unsigned.
	(find_regno_fusage): Likewise; also var REGNOTE now unsigned.
	(find_reg_fusage): Variables REGNO, END_REGNO, and I now unsigned.
	(replace_regs): Arg NREGS now unsigned.
	* sdbout.c (sdbout_parms, sdbout_reg_parms): Don't check REGNO < 0.
	* simplify-rtx.c (simplify_unary_operation): WIDTH now unsigned.
	(simplify_binary_operation): Likewise.
	(cselib_invalidate_regno): Arg REGNO and variables ENDREGNO, I, and
	THIS_LAST now unsigned.
	(cselib_record_set): Add cast.
	* ssa.c (ssa_max_reg_num): Now unsigned.
	(rename_block): REGNO now unsigned.
	* stmt.c (expand_return): Bit positions unsigned HOST_WIDE_INT;
	sizes now unsigned.
	(all_cases_count): Just return -1 not -2.
	COUNT, MINVAL, and LASTVAL now HOST_WIDE_INT.
	Rework tests to use trees whenever possible.
	Use host_integerp and tree_low_cst.
	(mark_seen_cases): COUNT arg now HOST_WIDE_INT;
	Likewise variable NEXT_NODE_OFFSET; XLO now unsigned.
	(check_for_full_enumeration_handling): BYTES_NEEDED and I now
	HOST_WIDE_INT.
	* stor-layout.c (mode_for_size): SIZE arg now unsigned.
	(smallest_mode_for_size): Likewise.
	(layout_decl): Simplify handling of a specified DECL_SIZE_UNIT.
	KNOWN_ALIGN is now an alignment, so simplify code.
	Don't turn off DECL_BIT_FIELD if field is BLKmode, but not type.
	(start_record_layout): Renamed from new_record_layout_info.
	Update to new fields.
	(debug_rli, normalize_rli, rli_size_unit_so_far, rli_size_so_far):
	New functions.
	(place_union_field): Renamed from layout_union_field.
	Update to use new fields in rli.
	(place_field): Renamed from layout_field.
	Major rewrite to use new fields in rli; pass alignment to layout_decl.
	(finalize_record_size): Rework to use new fields in rli and handle
	union.
	(compute_record_mode): Rework to simplify and to use new DECL fields.
	(finalize_type_size): Make rounding more consistent.
	(finish_union_layout): Deleted.
	(layout_type, case VOID_TYPE): Don't set TYPE_SIZE_UNIT either.
	(layout_type, case RECORD_TYPE): Call new function names.
	(initialize_sizetypes): Set TYPE_IS_SIZETYPE.
	(set_sizetype): Set TYPE_IS_SIZETYPE earlier.
	(get_best_mode): UNIT is now unsigned; remove casts.
	* tree.c (bit_position): Compute from new fields.
	(byte_position, int_byte_position): New functions.
	(print_type_hash_statistics): Cast to remove warning.
	(build_range_type): Use host_integerp and tree_low_cst to try to hash.
	(build_index_type): Likewise; make subtype of sizetype.
	(build_index_2_type): Pass sizetype to build_range_type.
	(build_common_tree_nodes): Use size_int and bitsize_int to
	initialize nodes; add bitsize_{zero,one,unit}_node.
	* tree.h (DECL_FIELD_CONTEXT): Use FIELD_DECL_CHECK.
	(DECL_BIT_FIELD_TYPE, DECL_QUALIFIER, DECL_FCONTEXT): Likewise.
	(DECL_PACKED, DECL_BIT_FIELD): Likewise.
	(DECL_FIELD_BITPOS): Deleted.
	(DECL_FIELD_OFFSET, DECL_FIELD_BIT_OFFSET): New fields.
	(DECL_RESULT, DECL_SAVED_INSNS): Use FUNCTION_DECL_CHECK.
	(DECL_FRAME_SIZE, DECL_FUNCTION_CODE, DECL_NO_STATIC_CHAIN): Likewise.
	(DECL_INLINE, DECL_BUILT_IN_NONANSI, DECL_IS_MALLOC): Likewise.
	(DECL_BUILT_IN_CLASS, DECL_STATIC_CONSTRUCTOR): Likewise.
	(DECL_STATIC_DESTRUCTOR, DECL_NO_CHECK_MEMORY_USAGE): Likewise.
	(DECL_NO_INSTRUMENT_FUNCTION_ENTRY_EXIT, DECL_NO_LIMIT_STACK): Likewise.
	(DECL_ORIGINAL_TYPE, TYPE_DECL_SUPPRESS_DEBUG): Use TYPE_DECL_CHECK.
	(DECL_ARG_TYPE_AS_WRITTEN, DECL_ARG_TYPE): Use PARM_DECL_CHECK.
	(DECL_INCOMING_RTL, DECL_TRANSPARENT_UNION): Likewise.
	(DECL_ALIGN): Adjust to new field in union.
	(DECL_OFFSET_ALIGN): New field.
	(DECL_ERROR_ISSUED, DECL_TOO_LATE): Use LABEL_DECL_CHECK.
	(DECL_IN_TEXT_SECTION): Use VAR_DECL_CHECK.
	(union tree_decl): Add struct for both aligns.
	(enum tree_index): Add TI_BITSIZE_{ZERO,ONE,UNIT}.
	(bitsize_zero_node, bitsize_one_node, bitsize_unit_node): Added.
	(struct record_layout_info): Rework fields to have offset
	alignment and byte and bit position.
	(start_record_layout, place_field): Renamed from old names.
	(rli_size_so_far, rli_size_unit_so_far, normalize_rli): New decls.
	(byte_position, int_byte_position): Likewise.
	(get_inner_reference): Change types of position and length.
	* unroll.c (unroll_loop): New variable R; use for some loops.
	MAX_LOCAL_REGNUM and MAXREGNUM now unsigned.
	(calculate_giv_inc): Arg REGNO now unsigned.
	(copy_loop_body): REGNO and SRC_REGNO now unsigned.
	* varasm.c (assemble_variable): Clean up handling of size using
	host_integerp and tree_low_cst.
	(decode_addr_const): Use byte, not bit, position.
	(output_constructor): bitpos and offsets are HOST_WIDE_INT;
	use tree_low_cst and int_bit_position.
	* objc/objc-act.c (build_ivar_list_initializer): Use byte_position.
	* ch/actions.c (check_missing_cases): BYTES_NEEDED is HOST_WIDE_INT.
	* ch/typeck.c (expand_constant_to_buffer): Use int_byte_position.
	(extract_constant_from_buffer): Likewise.
	* cp/class.c (build_vbase_pointer_fields): layout_field now
	place_field.
	(get_vfield_offset): Use byte_position.
	(set_rtti_entry): Set OFFSET to ssizetype zero.
	(get_binfo_offset_as_int): Deleted.
	(dfs_record_base_offsets): Use tree_low_cst.
	(dfs_search_base_offsets): Likewise.
	(layout_nonempty_base_or_field): Reflect changes in RLI format
	and call byte_position.
	(layout_empty_base): Convert offset to ssizetype.
	(build_base_field): use rli_size_unit_so_far.
	(dfs_propagate_binfo_offsets): Do computation in proper type.
	(layout_virtual_bases): Pass ssizetype to propagate_binfo_offsets.
	(layout_class_type): Reflect changes in RLI names and fields.
	(finish_struct_1): Set DECL_FIELD_OFFSET.
	* cp/dump.c (dequeue_and_dump): Call bit_position.
	* cp/expr.c (cplus_expand_constant): Use byte_position.
	* cp/rtti.c (expand_class_desc): Use bitsize_one_node.
	* cp/typeck.c (build_component_addr): Use byte_position and don't
	special case for zero offset.
	* f/com.c (ffecom_tree_canonize_ptr_): Use bitsize_zero_node.
	(ffecom_tree_canonize_ref_): Likewise.
	* java/class.c (make_field_value): Use byte_position.
	* java/expr.c (JAVA_ARRAY_LENGTH_OFFSET): Use byte_position.
	(java_array_data_offset): Likewise.
	* java/java-tree.h (MAYBE_CREATE_TYPE_TYPE_LANG_SPECIFIC): Add case to
	bzero call.

From-SVN: r32742
filled = size_binop (CEIL_DIV_EXPR,
constructor_bit_index,
bitsize_int (BITS_PER_UNIT));
filled = size_binop (CEIL_DIV_EXPR, constructor_bit_index,
bitsize_unit_node);
else if (TREE_CODE (constructor_type) == ARRAY_TYPE)
{
@ -5406,7 +5394,7 @@ pop_init_level (implicit)
{
tree maxindex
= copy_node (size_diffop (constructor_unfilled_index,
bitsize_int (1)));
bitsize_one_node));
TYPE_DOMAIN (constructor_type) = build_index_type (maxindex);
TREE_TYPE (maxindex) = TYPE_DOMAIN (constructor_type);
@ -5914,7 +5902,7 @@ output_init_element (value, type, field, pending)
(size_binop (TRUNC_DIV_EXPR,
size_binop (MINUS_EXPR, bit_position (field),
constructor_bit_index),
bitsize_int (BITS_PER_UNIT)),
bitsize_unit_node),
0));
output_constant (digest_init (type, value,
@ -5936,7 +5924,7 @@ output_init_element (value, type, field, pending)
if (TREE_CODE (constructor_type) == ARRAY_TYPE)
constructor_unfilled_index
= size_binop (PLUS_EXPR, constructor_unfilled_index,
bitsize_int (1));
bitsize_one_node);
else if (TREE_CODE (constructor_type) == RECORD_TYPE)
{
constructor_unfilled_fields
@ -6089,7 +6077,7 @@ output_pending_init_elements (all)
if (constructor_incremental)
{
tree filled;
tree nextpos_tree = bitsize_int (0);
tree nextpos_tree = bitsize_zero_node;
if (TREE_CODE (constructor_type) == RECORD_TYPE
|| TREE_CODE (constructor_type) == UNION_TYPE)
@ -6105,17 +6093,13 @@ output_pending_init_elements (all)
if (tail)
/* Find the offset of the end of that field. */
filled = size_binop (CEIL_DIV_EXPR,
size_binop (PLUS_EXPR,
bit_position (tail),
size_binop (PLUS_EXPR, bit_position (tail),
DECL_SIZE (tail)),
bitsize_int (BITS_PER_UNIT));
bitsize_unit_node);
else
filled = bitsize_int (0);
nextpos_tree = size_binop (CEIL_DIV_EXPR,
bit_position (next),
bitsize_int (BITS_PER_UNIT));
filled = bitsize_zero_node;
nextpos_tree = convert (bitsizetype, byte_position (next));
constructor_bit_index = bit_position (next);
constructor_unfilled_fields = next;
}
@ -6395,7 +6379,7 @@ process_init_element (value)
}
constructor_index
= size_binop (PLUS_EXPR, constructor_index, bitsize_int (1));
= size_binop (PLUS_EXPR, constructor_index, bitsize_one_node);
if (! value)
/* If we are doing the bookkeeping for an element that was

@ -1,3 +1,9 @@
Sat Mar 25 09:12:10 2000 Richard Kenner <kenner@vlsi1.ultra.nyu.edu>
* actions.c (check_missing_cases): BYTES_NEEDED is HOST_WIDE_INT.
* typeck.c (expand_constant_to_buffer): Use int_byte_position.
(extract_constant_from_buffer): Likewise.
Fri Mar 17 08:09:14 2000 Richard Kenner <kenner@vlsi1.ultra.nyu.edu>
* typeck.c (min_precision): New function.

@ -1453,7 +1453,8 @@ check_missing_cases (type)
unsigned char *cases_seen;
/* The number of possible selector values. */
HOST_WIDE_INT size = all_cases_count (type, &is_sparse);
long bytes_needed = (size+HOST_BITS_PER_CHAR)/HOST_BITS_PER_CHAR;
HOST_WIDE_INT bytes_needed
= (size + HOST_BITS_PER_CHAR) / HOST_BITS_PER_CHAR;
if (size == -1)
warning ("CASE selector with variable range");

@ -830,7 +830,7 @@ expand_constant_to_buffer (value, buffer, buf_size)
if (DECL_BIT_FIELD (field))
return 0;
offset = int_bit_position (field) / BITS_PER_UNIT;
offset = int_byte_position (field);
if (!expand_constant_to_buffer (TREE_VALUE (list),
buffer + offset,
buf_size - offset))
@ -946,7 +946,7 @@ extract_constant_from_buffer (type, buffer, buf_size)
tree field = TYPE_FIELDS (type);
for (; field != NULL_TREE; field = TREE_CHAIN (field))
{
HOST_WIDE_INT offset = int_bit_position (field) / BITS_PER_UNIT;
HOST_WIDE_INT offset = int_byte_position (field);
if (DECL_BIT_FIELD (field))
return 0;

@ -142,7 +142,7 @@ static int max_uid_cuid;
/* Maximum register number, which is the size of the tables below. */
static int combine_max_regno;
static unsigned int combine_max_regno;
/* Record last point of death of (hard or pseudo) register n. */
@ -291,7 +291,7 @@ static enum machine_mode nonzero_bits_mode;
/* Nonzero if we know that a register has some leading bits that are always
equal to the sign bit. */
static char *reg_sign_bit_copies;
static unsigned char *reg_sign_bit_copies;
/* Nonzero when reg_nonzero_bits and reg_sign_bit_copies can be safely used.
It is zero while computing them and after combine has completed. This
@ -371,11 +371,13 @@ static rtx simplify_set PARAMS ((rtx));
static rtx simplify_logical PARAMS ((rtx, int));
static rtx expand_compound_operation PARAMS ((rtx));
static rtx expand_field_assignment PARAMS ((rtx));
static rtx make_extraction PARAMS ((enum machine_mode, rtx, int, rtx, int,
int, int, int));
static rtx make_extraction PARAMS ((enum machine_mode, rtx, HOST_WIDE_INT,
rtx, unsigned HOST_WIDE_INT, int,
int, int));
static rtx extract_left_shift PARAMS ((rtx, int));
static rtx make_compound_operation PARAMS ((rtx, enum rtx_code));
static int get_pos_from_mask PARAMS ((unsigned HOST_WIDE_INT, int *));
static int get_pos_from_mask PARAMS ((unsigned HOST_WIDE_INT,
unsigned HOST_WIDE_INT *));
static rtx force_to_mode PARAMS ((rtx, enum machine_mode,
unsigned HOST_WIDE_INT, rtx, int));
static rtx if_then_else_cond PARAMS ((rtx, rtx *, rtx *));
@ -386,7 +388,7 @@ static rtx apply_distributive_law PARAMS ((rtx));
static rtx simplify_and_const_int PARAMS ((rtx, enum machine_mode, rtx,
unsigned HOST_WIDE_INT));
static unsigned HOST_WIDE_INT nonzero_bits PARAMS ((rtx, enum machine_mode));
static int num_sign_bit_copies PARAMS ((rtx, enum machine_mode));
static unsigned int num_sign_bit_copies PARAMS ((rtx, enum machine_mode));
static int merge_outer_ops PARAMS ((enum rtx_code *, HOST_WIDE_INT *,
enum rtx_code, HOST_WIDE_INT,
enum machine_mode, int *));
@ -488,7 +490,7 @@ do_SUBST_INT(into, newval)
int
combine_instructions (f, nregs)
rtx f;
int nregs;
unsigned int nregs;
{
register rtx insn, next;
#ifdef HAVE_cc0
@ -508,7 +510,8 @@ combine_instructions (f, nregs)
reg_nonzero_bits = ((unsigned HOST_WIDE_INT *)
xcalloc (nregs, sizeof (unsigned HOST_WIDE_INT)));
reg_sign_bit_copies = (char *) xcalloc (nregs, sizeof (char));
reg_sign_bit_copies
= (unsigned char *) xcalloc (nregs, sizeof (unsigned char));
reg_last_death = (rtx *) xmalloc (nregs * sizeof (rtx));
reg_last_set = (rtx *) xmalloc (nregs * sizeof (rtx));
@ -764,7 +767,7 @@ combine_instructions (f, nregs)
static void
init_reg_last_arrays ()
{
int nregs = combine_max_regno;
unsigned int nregs = combine_max_regno;
bzero ((char *) reg_last_death, nregs * sizeof (rtx));
bzero ((char *) reg_last_set, nregs * sizeof (rtx));
@ -783,7 +786,7 @@ static void
setup_incoming_promotions ()
{
#ifdef PROMOTE_FUNCTION_ARGS
int regno;
unsigned int regno;
rtx reg;
enum machine_mode mode;
int unsignedp;
@ -825,7 +828,7 @@ set_nonzero_bits_and_sign_copies (x, set, data)
rtx set;
void *data ATTRIBUTE_UNUSED;
{
int num;
unsigned int num;
if (GET_CODE (x) == REG
&& REGNO (x) >= FIRST_PSEUDO_REGISTER
@ -967,10 +970,12 @@ can_combine_p (insn, i3, pred, succ, pdest, psrc)
{
rtx i3pat = PATTERN (i3);
int i = XVECLEN (i3pat, 0) - 1;
int regno = REGNO (XEXP (elt, 0));
unsigned int regno = REGNO (XEXP (elt, 0));
do
{
rtx i3elt = XVECEXP (i3pat, 0, i);
if (GET_CODE (i3elt) == USE
&& GET_CODE (XEXP (i3elt, 0)) == REG
&& (REGNO (XEXP (i3elt, 0)) == regno
@ -1866,7 +1871,7 @@ try_combine (i3, i2, i1, new_direct_jump_p)
i2src, const0_rtx))
!= GET_MODE (SET_DEST (newpat))))
{
int regno = REGNO (SET_DEST (newpat));
unsigned int regno = REGNO (SET_DEST (newpat));
rtx new_dest = gen_rtx_REG (compare_mode, regno);
if (regno < FIRST_PSEUDO_REGISTER
@ -2431,7 +2436,7 @@ try_combine (i3, i2, i1, new_direct_jump_p)
rtx i3notes, i2notes, i1notes = 0;
rtx i3links, i2links, i1links = 0;
rtx midnotes = 0;
register int regno;
unsigned int regno;
/* Compute which registers we expect to eliminate. newi2pat may be setting
either i3dest or i2dest, so we must check it. Also, i1dest may be the
same as i3dest, in which case newi2pat may be setting i1dest. */
@ -2691,9 +2696,7 @@ try_combine (i3, i2, i1, new_direct_jump_p)
regno = REGNO (i1dest);
if (! added_sets_1 && ! i1dest_in_i1src)
{
REG_N_SETS (regno)--;
}
REG_N_SETS (regno)--;
}
/* Update reg_nonzero_bits et al for any changes that may have been made
@ -2795,7 +2798,9 @@ find_split_point (loc, insn)
rtx x = *loc;
enum rtx_code code = GET_CODE (x);
rtx *split;
int len = 0, pos = 0, unsignedp = 0;
unsigned HOST_WIDE_INT len = 0;
HOST_WIDE_INT pos = 0;
int unsignedp = 0;
rtx inner = NULL_RTX;
/* First special-case some codes. */
@ -2930,9 +2935,9 @@ find_split_point (loc, insn)
<= GET_MODE_BITSIZE (GET_MODE (XEXP (SET_DEST (x), 0))))
&& ! side_effects_p (XEXP (SET_DEST (x), 0)))
{
int pos = INTVAL (XEXP (SET_DEST (x), 2));
int len = INTVAL (XEXP (SET_DEST (x), 1));
int src = INTVAL (SET_SRC (x));
HOST_WIDE_INT pos = INTVAL (XEXP (SET_DEST (x), 2));
unsigned HOST_WIDE_INT len = INTVAL (XEXP (SET_DEST (x), 1));
unsigned HOST_WIDE_INT src = INTVAL (SET_SRC (x));
rtx dest = XEXP (SET_DEST (x), 0);
enum machine_mode mode = GET_MODE (dest);
unsigned HOST_WIDE_INT mask = ((HOST_WIDE_INT) 1 << len) - 1;
@ -2940,7 +2945,7 @@ find_split_point (loc, insn)
if (BITS_BIG_ENDIAN)
pos = GET_MODE_BITSIZE (mode) - len - pos;
if ((unsigned HOST_WIDE_INT) src == mask)
if (src == mask)
SUBST (SET_SRC (x),
gen_binary (IOR, mode, dest, GEN_INT (src << pos)));
else
@ -4143,7 +4148,7 @@ combine_simplify_rtx (x, op0_mode, last, in_dest)
== ((HOST_WIDE_INT) 1 << (i + 1)) - 1))
|| (GET_CODE (XEXP (XEXP (x, 0), 0)) == ZERO_EXTEND
&& (GET_MODE_BITSIZE (GET_MODE (XEXP (XEXP (XEXP (x, 0), 0), 0)))
== i + 1))))
== (unsigned int) i + 1))))
return simplify_shift_const
(NULL_RTX, ASHIFTRT, mode,
simplify_shift_const (NULL_RTX, ASHIFT, mode,
@ -4866,7 +4871,7 @@ simplify_set (x)
which case we can safely change its mode. */
if (compare_mode != GET_MODE (dest))
{
int regno = REGNO (dest);
unsigned int regno = REGNO (dest);
rtx new_dest = gen_rtx_REG (compare_mode, regno);
if (regno < FIRST_PSEUDO_REGISTER
@ -5458,9 +5463,9 @@ static rtx
expand_compound_operation (x)
rtx x;
{
int pos = 0, len;
unsigned HOST_WIDE_INT pos = 0, len;
int unsignedp = 0;
int modewidth;
unsigned int modewidth;
rtx tem;
switch (GET_CODE (x))
@ -5608,7 +5613,7 @@ expand_compound_operation (x)
such a position. */
modewidth = GET_MODE_BITSIZE (GET_MODE (x));
if (modewidth >= pos - len)
if (modewidth + len >= pos)
tem = simplify_shift_const (NULL_RTX, unsignedp ? LSHIFTRT : ASHIFTRT,
GET_MODE (x),
simplify_shift_const (NULL_RTX, ASHIFT,
@ -5800,9 +5805,9 @@ make_extraction (mode, inner, pos, pos_rtx, len,
unsignedp, in_dest, in_compare)
enum machine_mode mode;
rtx inner;
int pos;
HOST_WIDE_INT pos;
rtx pos_rtx;
int len;
unsigned HOST_WIDE_INT len;
int unsignedp;
int in_dest, in_compare;
{
@ -5819,7 +5824,7 @@ make_extraction (mode, inner, pos, pos_rtx, len,
int spans_byte = 0;
rtx new = 0;
rtx orig_pos_rtx = pos_rtx;
int orig_pos;
HOST_WIDE_INT orig_pos;
/* Get some information about INNER and get the innermost object. */
if (GET_CODE (inner) == USE)
@ -6528,7 +6533,7 @@ make_compound_operation (x, in_code)
static int
get_pos_from_mask (m, plen)
unsigned HOST_WIDE_INT m;
int *plen;
unsigned HOST_WIDE_INT *plen;
{
/* Get the bit number of the first 1 bit from the right, -1 if none. */
int pos = exact_log2 (m & - m);
@ -6748,7 +6753,7 @@ force_to_mode (x, mode, mask, reg, just_select)
This may eliminate that PLUS and, later, the AND. */
{
int width = GET_MODE_BITSIZE (mode);
unsigned int width = GET_MODE_BITSIZE (mode);
unsigned HOST_WIDE_INT smask = mask;
/* If MODE is narrower than HOST_WIDE_INT and mask is a negative
@ -6920,7 +6925,7 @@ force_to_mode (x, mode, mask, reg, just_select)
+ num_sign_bit_copies (XEXP (x, 0), GET_MODE (XEXP (x, 0))))
>= GET_MODE_BITSIZE (GET_MODE (x)))
&& exact_log2 (mask + 1) >= 0
&& (num_sign_bit_copies (XEXP (x, 0), GET_MODE (XEXP (x, 0)))
&& ((int) num_sign_bit_copies (XEXP (x, 0), GET_MODE (XEXP (x, 0)))
>= exact_log2 (mask + 1)))
x = gen_binary (LSHIFTRT, GET_MODE (x), XEXP (x, 0),
GEN_INT (GET_MODE_BITSIZE (GET_MODE (x))
@ -7119,7 +7124,7 @@ if_then_else_cond (x, ptrue, pfalse)
{
enum machine_mode mode = GET_MODE (x);
enum rtx_code code = GET_CODE (x);
int size = GET_MODE_BITSIZE (mode);
unsigned int size = GET_MODE_BITSIZE (mode);
rtx cond0, cond1, true0, true1, false0, false1;
unsigned HOST_WIDE_INT nz;
@ -7455,7 +7460,8 @@ make_field_assignment (x)
rtx assign;
rtx rhs, lhs;
HOST_WIDE_INT c1;
int pos, len;
HOST_WIDE_INT pos;
unsigned HOST_WIDE_INT len;
rtx other;
enum machine_mode mode;
@ -7802,7 +7808,7 @@ nonzero_bits (x, mode)
unsigned HOST_WIDE_INT nonzero = GET_MODE_MASK (mode);
unsigned HOST_WIDE_INT inner_nz;
enum rtx_code code;
int mode_width = GET_MODE_BITSIZE (mode);
unsigned int mode_width = GET_MODE_BITSIZE (mode);
rtx tem;
/* For floating-point values, assume all bits are needed. */
@ -8050,7 +8056,7 @@ nonzero_bits (x, mode)
= (nz0 & ((HOST_WIDE_INT) 1 << (mode_width - 1)));
HOST_WIDE_INT op1_maybe_minusp
= (nz1 & ((HOST_WIDE_INT) 1 << (mode_width - 1)));
int result_width = mode_width;
unsigned int result_width = mode_width;
int result_low = 0;
switch (code)
@ -8171,7 +8177,7 @@ nonzero_bits (x, mode)
&& INTVAL (XEXP (x, 1)) < HOST_BITS_PER_WIDE_INT)
{
enum machine_mode inner_mode = GET_MODE (x);
int width = GET_MODE_BITSIZE (inner_mode);
unsigned int width = GET_MODE_BITSIZE (inner_mode);
int count = INTVAL (XEXP (x, 1));
unsigned HOST_WIDE_INT mode_mask = GET_MODE_MASK (inner_mode);
unsigned HOST_WIDE_INT op_nonzero = nonzero_bits (XEXP (x, 0), mode);
@ -8228,13 +8234,13 @@ nonzero_bits (x, mode)
VOIDmode, X will be used in its own mode. The returned value will always
be between 1 and the number of bits in MODE. */
static int
static unsigned int
num_sign_bit_copies (x, mode)
rtx x;
enum machine_mode mode;
{
enum rtx_code code = GET_CODE (x);
int bitwidth;
unsigned int bitwidth;
int num0, num1, result;
unsigned HOST_WIDE_INT nonzero;
rtx tem;
@ -8253,8 +8259,11 @@ num_sign_bit_copies (x, mode)
/* For a smaller object, just ignore the high bits. */
if (bitwidth < GET_MODE_BITSIZE (GET_MODE (x)))
return MAX (1, (num_sign_bit_copies (x, GET_MODE (x))
- (GET_MODE_BITSIZE (GET_MODE (x)) - bitwidth)));
{
num0 = num_sign_bit_copies (x, GET_MODE (x));
return MAX (1,
num0 - (int) (GET_MODE_BITSIZE (GET_MODE (x)) - bitwidth));
}
if (GET_MODE (x) != VOIDmode && bitwidth > GET_MODE_BITSIZE (GET_MODE (x)))
{
@ -8310,7 +8319,8 @@ num_sign_bit_copies (x, mode)
#ifdef LOAD_EXTEND_OP
/* Some RISC machines sign-extend all loads of smaller than a word. */
if (LOAD_EXTEND_OP (GET_MODE (x)) == SIGN_EXTEND)
return MAX (1, bitwidth - GET_MODE_BITSIZE (GET_MODE (x)) + 1);
return MAX (1, ((int) bitwidth
- (int) GET_MODE_BITSIZE (GET_MODE (x)) + 1));
#endif
break;
@ -8330,16 +8340,20 @@ num_sign_bit_copies (x, mode)
high-order bits are known to be sign bit copies. */
if (SUBREG_PROMOTED_VAR_P (x) && ! SUBREG_PROMOTED_UNSIGNED_P (x))
return MAX (bitwidth - GET_MODE_BITSIZE (GET_MODE (x)) + 1,
num_sign_bit_copies (SUBREG_REG (x), mode));
{
num0 = num_sign_bit_copies (SUBREG_REG (x), mode);
return MAX ((int) bitwidth
- (int) GET_MODE_BITSIZE (GET_MODE (x)) + 1,
num0);
}
/* For a smaller object, just ignore the high bits. */
if (bitwidth <= GET_MODE_BITSIZE (GET_MODE (SUBREG_REG (x))))
{
num0 = num_sign_bit_copies (SUBREG_REG (x), VOIDmode);
return MAX (1, (num0
- (GET_MODE_BITSIZE (GET_MODE (SUBREG_REG (x)))
- bitwidth)));
- (int) (GET_MODE_BITSIZE (GET_MODE (SUBREG_REG (x)))
- bitwidth)));
}
#ifdef WORD_REGISTER_OPERATIONS
@ -8364,7 +8378,7 @@ num_sign_bit_copies (x, mode)
case SIGN_EXTRACT:
if (GET_CODE (XEXP (x, 1)) == CONST_INT)
return MAX (1, bitwidth - INTVAL (XEXP (x, 1)));
return MAX (1, (int) bitwidth - INTVAL (XEXP (x, 1)));
break;
case SIGN_EXTEND:
@ -8374,8 +8388,8 @@ num_sign_bit_copies (x, mode)
case TRUNCATE:
/* For a smaller object, just ignore the high bits. */
num0 = num_sign_bit_copies (XEXP (x, 0), VOIDmode);
return MAX (1, (num0 - (GET_MODE_BITSIZE (GET_MODE (XEXP (x, 0)))
- bitwidth)));
return MAX (1, (num0 - (int) (GET_MODE_BITSIZE (GET_MODE (XEXP (x, 0)))
- bitwidth)));
case NOT:
return num_sign_bit_copies (XEXP (x, 0), mode);
@ -8389,7 +8403,7 @@ num_sign_bit_copies (x, mode)
{
num0 = num_sign_bit_copies (XEXP (x, 0), mode);
return MAX (1, num0 - (code == ROTATE ? INTVAL (XEXP (x, 1))
: bitwidth - INTVAL (XEXP (x, 1))));
: (int) bitwidth - INTVAL (XEXP (x, 1))));
}
break;
@ -8557,7 +8571,7 @@ num_sign_bit_copies (x, mode)
This function will always return 0 unless called during combine, which
implies that it must be called from a define_split. */
int
unsigned int
extended_count (x, mode, unsignedp)
rtx x;
enum machine_mode mode;
@ -8568,8 +8582,9 @@ extended_count (x, mode, unsignedp)
return (unsignedp
? (GET_MODE_BITSIZE (mode) <= HOST_BITS_PER_WIDE_INT
&& (GET_MODE_BITSIZE (mode) - 1
- floor_log2 (nonzero_bits (x, mode))))
? (GET_MODE_BITSIZE (mode) - 1
- floor_log2 (nonzero_bits (x, mode)))
: 0)
: num_sign_bit_copies (x, mode) - 1);
}
@ -8719,18 +8734,20 @@ merge_outer_ops (pop0, pconst0, op1, const1, mode, pcomp_p)
are ASHIFTRT and ROTATE, which are always done in their original mode. */
static rtx
simplify_shift_const (x, code, result_mode, varop, count)
simplify_shift_const (x, code, result_mode, varop, input_count)
rtx x;
enum rtx_code code;
enum machine_mode result_mode;
rtx varop;
int count;
int input_count;
{
enum rtx_code orig_code = code;
int orig_count = count;
int orig_count = input_count;
unsigned int count;
int signed_count;
enum machine_mode mode = result_mode;
enum machine_mode shift_mode, tmode;
int mode_words
unsigned int mode_words
= (GET_MODE_SIZE (mode) + (UNITS_PER_WORD - 1)) / UNITS_PER_WORD;
/* We form (outer_op (code varop count) (outer_const)). */
enum rtx_code outer_op = NIL;
@ -8742,14 +8759,16 @@ simplify_shift_const (x, code, result_mode, varop, count)
/* If we were given an invalid count, don't do anything except exactly
what was requested. */
if (count < 0 || count > GET_MODE_BITSIZE (mode))
if (input_count < 0 || input_count > (int) GET_MODE_BITSIZE (mode))
{
if (x)
return x;
return gen_rtx_fmt_ee (code, mode, varop, GEN_INT (count));
return gen_rtx_fmt_ee (code, mode, varop, GEN_INT (input_count));
}
count = input_count;
/* Unless one of the branches of the `if' in this loop does a `continue',
we will `break' the loop after the `if'. */
@ -8803,12 +8822,6 @@ simplify_shift_const (x, code, result_mode, varop, count)
}
}
/* Negative counts are invalid and should not have been made (a
programmer-specified negative count should have been handled
above). */
else if (count < 0)
abort ();
/* An arithmetic right shift of a quantity known to be -1 or 0
is a no-op. */
if (code == ASHIFTRT
@ -8931,8 +8944,9 @@ simplify_shift_const (x, code, result_mode, varop, count)
if (GET_CODE (XEXP (varop, 1)) == CONST_INT
&& exact_log2 (INTVAL (XEXP (varop, 1))) >= 0)
{
varop = gen_binary (ASHIFT, GET_MODE (varop), XEXP (varop, 0),
GEN_INT (exact_log2 (INTVAL (XEXP (varop, 1)))));
varop
= gen_binary (ASHIFT, GET_MODE (varop), XEXP (varop, 0),
GEN_INT (exact_log2 (INTVAL (XEXP (varop, 1)))));
continue;
}
break;
@ -8942,8 +8956,9 @@ simplify_shift_const (x, code, result_mode, varop, count)
if (GET_CODE (XEXP (varop, 1)) == CONST_INT
&& exact_log2 (INTVAL (XEXP (varop, 1))) >= 0)
{
varop = gen_binary (LSHIFTRT, GET_MODE (varop), XEXP (varop, 0),
GEN_INT (exact_log2 (INTVAL (XEXP (varop, 1)))));
varop
= gen_binary (LSHIFTRT, GET_MODE (varop), XEXP (varop, 0),
GEN_INT (exact_log2 (INTVAL (XEXP (varop, 1)))));
continue;
}
break;
@ -8971,7 +8986,7 @@ simplify_shift_const (x, code, result_mode, varop, count)
&& GET_MODE_BITSIZE (mode) <= HOST_BITS_PER_WIDE_INT)
{
enum rtx_code first_code = GET_CODE (varop);
int first_count = INTVAL (XEXP (varop, 1));
unsigned int first_count = INTVAL (XEXP (varop, 1));
unsigned HOST_WIDE_INT mask;
rtx mask_rtx;
@ -9012,10 +9027,14 @@ simplify_shift_const (x, code, result_mode, varop, count)
&& (num_sign_bit_copies (XEXP (varop, 0), shift_mode)
> first_count))
{
count -= first_count;
if (count < 0)
count = - count, code = ASHIFT;
varop = XEXP (varop, 0);
signed_count = count - first_count;
if (signed_count < 0)
count = - signed_count, code = ASHIFT;
else
count = signed_count;
continue;
}
@ -9075,22 +9094,25 @@ simplify_shift_const (x, code, result_mode, varop, count)
/* If the shifts are in the same direction, we add the
counts. Otherwise, we subtract them. */
signed_count = count;
if ((code == ASHIFTRT || code == LSHIFTRT)
== (first_code == ASHIFTRT || first_code == LSHIFTRT))
count += first_count;
signed_count += first_count;
else
count -= first_count;
signed_count -= first_count;
/* If COUNT is positive, the new shift is usually CODE,
except for the two exceptions below, in which case it is
FIRST_CODE. If the count is negative, FIRST_CODE should
always be used */
if (count > 0
if (signed_count > 0
&& ((first_code == ROTATE && code == ASHIFT)
|| (first_code == ASHIFTRT && code == LSHIFTRT)))
code = first_code;
else if (count < 0)
code = first_code, count = - count;
code = first_code, count = signed_count;
else if (signed_count < 0)
code = first_code, count = - signed_count;
else
count = signed_count;
varop = XEXP (varop, 0);
continue;
@ -9191,7 +9213,8 @@ simplify_shift_const (x, code, result_mode, varop, count)
&& count == GET_MODE_BITSIZE (result_mode) - 1
&& GET_MODE_BITSIZE (result_mode) <= HOST_BITS_PER_WIDE_INT
&& ((STORE_FLAG_VALUE
& ((HOST_WIDE_INT) 1 << (GET_MODE_BITSIZE (result_mode) - 1))))
& ((HOST_WIDE_INT) 1
<< (GET_MODE_BITSIZE (result_mode) - 1))))
&& nonzero_bits (XEXP (varop, 0), result_mode) == 1
&& merge_outer_ops (&outer_op, &outer_const, XOR,
(HOST_WIDE_INT) 1, result_mode,
@ -9276,7 +9299,7 @@ simplify_shift_const (x, code, result_mode, varop, count)
&& (new = simplify_binary_operation (ASHIFT, result_mode,
XEXP (varop, 1),
GEN_INT (count))) != 0
&& GET_CODE(new) == CONST_INT
&& GET_CODE (new) == CONST_INT
&& merge_outer_ops (&outer_op, &outer_const, PLUS,
INTVAL (new), result_mode, &complement_p))
{
@ -9324,10 +9347,11 @@ simplify_shift_const (x, code, result_mode, varop, count)
{
rtx varop_inner = XEXP (varop, 0);
varop_inner = gen_rtx_combine (LSHIFTRT,
GET_MODE (varop_inner),
XEXP (varop_inner, 0),
GEN_INT (count + INTVAL (XEXP (varop_inner, 1))));
varop_inner
= gen_rtx_combine (LSHIFTRT, GET_MODE (varop_inner),
XEXP (varop_inner, 0),
GEN_INT (count
+ INTVAL (XEXP (varop_inner, 1))));
varop = gen_rtx_combine (TRUNCATE, GET_MODE (varop),
varop_inner);
count = 0;
@ -9968,7 +9992,7 @@ simplify_comparison (code, pop0, pop1)
while (GET_CODE (op1) == CONST_INT)
{
enum machine_mode mode = GET_MODE (op0);
int mode_width = GET_MODE_BITSIZE (mode);
unsigned int mode_width = GET_MODE_BITSIZE (mode);
unsigned HOST_WIDE_INT mask = GET_MODE_MASK (mode);
int equality_comparison_p;
int sign_bit_comparison_p;
@ -10943,12 +10967,14 @@ update_table_tick (x)
if (code == REG)
{
int regno = REGNO (x);
int endregno = regno + (regno < FIRST_PSEUDO_REGISTER
? HARD_REGNO_NREGS (regno, GET_MODE (x)) : 1);
unsigned int regno = REGNO (x);
unsigned int endregno
= regno + (regno < FIRST_PSEUDO_REGISTER
? HARD_REGNO_NREGS (regno, GET_MODE (x)) : 1);
unsigned int r;
for (i = regno; i < endregno; i++)
reg_last_set_table_tick[i] = label_tick;
for (r = regno; r < endregno; r++)
reg_last_set_table_tick[r] = label_tick;
return;
}
@ -10971,10 +10997,11 @@ record_value_for_reg (reg, insn, value)
rtx insn;
rtx value;
{
int regno = REGNO (reg);
int endregno = regno + (regno < FIRST_PSEUDO_REGISTER
? HARD_REGNO_NREGS (regno, GET_MODE (reg)) : 1);
int i;
unsigned int regno = REGNO (reg);
unsigned int endregno
= regno + (regno < FIRST_PSEUDO_REGISTER
? HARD_REGNO_NREGS (regno, GET_MODE (reg)) : 1);
unsigned int i;
/* If VALUE contains REG and we have a previous value for REG, substitute
the previous value. */
@ -11007,10 +11034,11 @@ record_value_for_reg (reg, insn, value)
we don't know about its bitwise content, that its value has been
updated, and that we don't know the location of the death of the
register. */
for (i = regno; i < endregno; i ++)
for (i = regno; i < endregno; i++)
{
if (insn)
reg_last_set[i] = insn;
reg_last_set_value[i] = 0;
reg_last_set_mode[i] = 0;
reg_last_set_nonzero_bits[i] = 0;
@ -11118,15 +11146,15 @@ record_dead_and_set_regs (insn)
rtx insn;
{
register rtx link;
int i;
unsigned int i;
for (link = REG_NOTES (insn); link; link = XEXP (link, 1))
{
if (REG_NOTE_KIND (link) == REG_DEAD
&& GET_CODE (XEXP (link, 0)) == REG)
{
int regno = REGNO (XEXP (link, 0));
int endregno
unsigned int regno = REGNO (XEXP (link, 0));
unsigned int endregno
= regno + (regno < FIRST_PSEUDO_REGISTER
? HARD_REGNO_NREGS (regno, GET_MODE (XEXP (link, 0)))
: 1);
@ -11171,7 +11199,7 @@ record_promoted_value (insn, subreg)
rtx subreg;
{
rtx links, set;
int regno = REGNO (SUBREG_REG (subreg));
unsigned int regno = REGNO (SUBREG_REG (subreg));
enum machine_mode mode = GET_MODE (subreg);
if (GET_MODE_BITSIZE (mode) >= HOST_BITS_PER_WIDE_INT)
@ -11262,10 +11290,11 @@ get_last_value_validate (loc, insn, tick, replace)
if (GET_CODE (x) == REG)
{
int regno = REGNO (x);
int endregno = regno + (regno < FIRST_PSEUDO_REGISTER
? HARD_REGNO_NREGS (regno, GET_MODE (x)) : 1);
int j;
unsigned int regno = REGNO (x);
unsigned int endregno
= regno + (regno < FIRST_PSEUDO_REGISTER
? HARD_REGNO_NREGS (regno, GET_MODE (x)) : 1);
unsigned int j;
for (j = regno; j < endregno; j++)
if (reg_last_set_invalid[j]
@ -11273,7 +11302,8 @@ get_last_value_validate (loc, insn, tick, replace)
live at the beginning of the function, it is always valid. */
|| (! (regno >= FIRST_PSEUDO_REGISTER
&& REG_N_SETS (regno) == 1
&& ! REGNO_REG_SET_P (BASIC_BLOCK (0)->global_live_at_start, regno))
&& (! REGNO_REG_SET_P
(BASIC_BLOCK (0)->global_live_at_start, regno)))
&& reg_last_set_label[j] > tick))
{
if (replace)
@ -11313,7 +11343,7 @@ static rtx
get_last_value (x)
rtx x;
{
int regno;
unsigned int regno;
rtx value;
/* If this is a non-paradoxical SUBREG, get the value of its operand and
@ -11346,7 +11376,8 @@ get_last_value (x)
|| (reg_last_set_label[regno] != label_tick
&& (regno < FIRST_PSEUDO_REGISTER
|| REG_N_SETS (regno) != 1
|| REGNO_REG_SET_P (BASIC_BLOCK (0)->global_live_at_start, regno))))
|| (REGNO_REG_SET_P
(BASIC_BLOCK (0)->global_live_at_start, regno)))))
return 0;
/* If the value was set in a later insn than the ones we are processing,
@ -11384,8 +11415,8 @@ use_crosses_set_p (x, from_cuid)
if (code == REG)
{
register int regno = REGNO (x);
int endreg = regno + (regno < FIRST_PSEUDO_REGISTER
unsigned int regno = REGNO (x);
unsigned int endreg = regno + (regno < FIRST_PSEUDO_REGISTER
? HARD_REGNO_NREGS (regno, GET_MODE (x)) : 1);
#ifdef PUSH_ROUNDING
@ -11394,7 +11425,7 @@ use_crosses_set_p (x, from_cuid)
if (regno == STACK_POINTER_REGNUM)
return 1;
#endif
for (;regno < endreg; regno++)
for (; regno < endreg; regno++)
if (reg_last_set[regno]
&& INSN_CUID (reg_last_set[regno]) > from_cuid)
return 1;
@ -11425,7 +11456,7 @@ use_crosses_set_p (x, from_cuid)
/* Define three variables used for communication between the following
routines. */
static int reg_dead_regno, reg_dead_endregno;
static unsigned int reg_dead_regno, reg_dead_endregno;
static int reg_dead_flag;
/* Function called via note_stores from reg_dead_at_p.
@ -11439,7 +11470,7 @@ reg_dead_at_p_1 (dest, x, data)
rtx x;
void *data ATTRIBUTE_UNUSED;
{
int regno, endregno;
unsigned int regno, endregno;
if (GET_CODE (dest) != REG)
return;
@ -11465,7 +11496,8 @@ reg_dead_at_p (reg, insn)
rtx reg;
rtx insn;
{
int block, i;
int block;
unsigned int i;
/* Set variables for reg_dead_at_p_1. */
reg_dead_regno = REGNO (reg);
@ -11524,8 +11556,8 @@ static void
mark_used_regs_combine (x)
rtx x;
{
register RTX_CODE code = GET_CODE (x);
register int regno;
RTX_CODE code = GET_CODE (x);
unsigned int regno;
int i;
switch (code)
@ -11559,6 +11591,8 @@ mark_used_regs_combine (x)
If so, mark all of them just like the first. */
if (regno < FIRST_PSEUDO_REGISTER)
{
unsigned int endregno, r;
/* None of this applies to the stack, frame or arg pointers */
if (regno == STACK_POINTER_REGNUM
#if FRAME_POINTER_REGNUM != HARD_FRAME_POINTER_REGNUM
@ -11570,9 +11604,9 @@ mark_used_regs_combine (x)
|| regno == FRAME_POINTER_REGNUM)
return;
i = HARD_REGNO_NREGS (regno, GET_MODE (x));
while (i-- > 0)
SET_HARD_REG_BIT (newpat_used_regs, regno + i);
endregno = regno + HARD_REGNO_NREGS (regno, GET_MODE (x));
for (r = regno; r < endregno; r++)
SET_HARD_REG_BIT (newpat_used_regs, r);
}
return;
@ -11626,7 +11660,7 @@ mark_used_regs_combine (x)
rtx
remove_death (regno, insn)
int regno;
unsigned int regno;
rtx insn;
{
register rtx note = find_regno_note (insn, REG_DEAD, regno);
@ -11664,13 +11698,13 @@ move_deaths (x, maybe_kill_insn, from_cuid, to_insn, pnotes)
if (code == REG)
{
register int regno = REGNO (x);
unsigned int regno = REGNO (x);
register rtx where_dead = reg_last_death[regno];
register rtx before_dead, after_dead;
/* Don't move the register if it gets killed in between from and to */
if (maybe_kill_insn && reg_set_p (x, maybe_kill_insn)
&& !reg_referenced_p (x, maybe_kill_insn))
&& ! reg_referenced_p (x, maybe_kill_insn))
return;
/* WHERE_DEAD could be a USE insn made by combine, so first we
@ -11678,6 +11712,7 @@ move_deaths (x, maybe_kill_insn, from_cuid, to_insn, pnotes)
before_dead = where_dead;
while (before_dead && INSN_UID (before_dead) > max_uid_cuid)
before_dead = PREV_INSN (before_dead);
after_dead = where_dead;
while (after_dead && INSN_UID (after_dead) > max_uid_cuid)
after_dead = NEXT_INSN (after_dead);
@ -11703,12 +11738,13 @@ move_deaths (x, maybe_kill_insn, from_cuid, to_insn, pnotes)
&& (GET_MODE_SIZE (GET_MODE (XEXP (note, 0)))
> GET_MODE_SIZE (GET_MODE (x))))
{
int deadregno = REGNO (XEXP (note, 0));
int deadend
unsigned int deadregno = REGNO (XEXP (note, 0));
unsigned int deadend
= (deadregno + HARD_REGNO_NREGS (deadregno,
GET_MODE (XEXP (note, 0))));
int ourend = regno + HARD_REGNO_NREGS (regno, GET_MODE (x));
int i;
unsigned int ourend
= regno + HARD_REGNO_NREGS (regno, GET_MODE (x));
unsigned int i;
for (i = deadregno; i < deadend; i++)
if (i < regno || i >= ourend)
@ -11717,6 +11753,7 @@ move_deaths (x, maybe_kill_insn, from_cuid, to_insn, pnotes)
gen_rtx_REG (reg_raw_mode[i], i),
REG_NOTES (where_dead));
}
/* If we didn't find any note, or if we found a REG_DEAD note that
covers only part of the given reg, and we have a multi-reg hard
register, then to be safe we must check for REG_DEAD notes
@ -11729,8 +11766,9 @@ move_deaths (x, maybe_kill_insn, from_cuid, to_insn, pnotes)
&& regno < FIRST_PSEUDO_REGISTER
&& HARD_REGNO_NREGS (regno, GET_MODE (x)) > 1)
{
int ourend = regno + HARD_REGNO_NREGS (regno, GET_MODE (x));
int i, offset;
unsigned int ourend
= regno + HARD_REGNO_NREGS (regno, GET_MODE (x));
unsigned int i, offset;
rtx oldnotes = 0;
if (note)
@ -11829,7 +11867,7 @@ reg_bitfield_target_p (x, body)
{
rtx dest = SET_DEST (body);
rtx target;
int regno, tregno, endregno, endtregno;
unsigned int regno, tregno, endregno, endtregno;
if (GET_CODE (dest) == ZERO_EXTRACT)
target = XEXP (dest, 0);
@ -11949,7 +11987,8 @@ distribute_notes (notes, from_insn, i3, i2, elim_i2, elim_i1)
is one already. */
else if (reg_referenced_p (XEXP (note, 0), PATTERN (i3))
&& ! (GET_CODE (XEXP (note, 0)) == REG
? find_regno_note (i3, REG_DEAD, REGNO (XEXP (note, 0)))
? find_regno_note (i3, REG_DEAD,
REGNO (XEXP (note, 0)))
: find_reg_note (i3, REG_DEAD, XEXP (note, 0))))
{
PUT_REG_NOTE_KIND (note, REG_DEAD);
@ -12219,14 +12258,12 @@ distribute_notes (notes, from_insn, i3, i2, elim_i2, elim_i1)
of the block. If the existing life info says the reg
was dead, there's nothing left to do. Otherwise, we'll
need to do a global life update after combine. */
if (REG_NOTE_KIND (note) == REG_DEAD && place == 0)
if (REG_NOTE_KIND (note) == REG_DEAD && place == 0
&& REGNO_REG_SET_P (bb->global_live_at_start,
REGNO (XEXP (note, 0))))
{
int regno = REGNO (XEXP (note, 0));
if (REGNO_REG_SET_P (bb->global_live_at_start, regno))
{
SET_BIT (refresh_blocks, this_basic_block);
need_refresh = 1;
}
SET_BIT (refresh_blocks, this_basic_block);
need_refresh = 1;
}
}
@ -12238,7 +12275,7 @@ distribute_notes (notes, from_insn, i3, i2, elim_i2, elim_i1)
if (place && REG_NOTE_KIND (note) == REG_DEAD)
{
int regno = REGNO (XEXP (note, 0));
unsigned int regno = REGNO (XEXP (note, 0));
if (dead_or_set_p (place, XEXP (note, 0))
|| reg_bitfield_target_p (XEXP (note, 0), PATTERN (place)))
@ -12267,11 +12304,11 @@ distribute_notes (notes, from_insn, i3, i2, elim_i2, elim_i1)
if (place && regno < FIRST_PSEUDO_REGISTER
&& HARD_REGNO_NREGS (regno, GET_MODE (XEXP (note, 0))) > 1)
{
int endregno
unsigned int endregno
= regno + HARD_REGNO_NREGS (regno,
GET_MODE (XEXP (note, 0)));
int all_used = 1;
int i;
unsigned int i;
for (i = regno; i < endregno; i++)
if (! refers_to_regno_p (i, i + 1, PATTERN (place), 0)


@ -120,8 +120,8 @@ convert_to_integer (type, expr)
{
enum tree_code ex_form = TREE_CODE (expr);
tree intype = TREE_TYPE (expr);
int inprec = TYPE_PRECISION (intype);
int outprec = TYPE_PRECISION (type);
unsigned int inprec = TYPE_PRECISION (intype);
unsigned int outprec = TYPE_PRECISION (type);
/* An INTEGER_TYPE cannot be incomplete, but an ENUMERAL_TYPE can
be. Consider `enum E = { a, b = (enum E) 3 };'. */


@ -1,3 +1,25 @@
Sat Mar 25 09:12:10 2000 Richard Kenner <kenner@vlsi1.ultra.nyu.edu>
* class.c (build_vbase_pointer_fields): layout_field now place_field.
(get_vfield_offset): Use byte_position.
(set_rtti_entry): Set OFFSET to ssizetype zero.
(get_binfo_offset_as_int): Deleted.
(dfs_record_base_offsets): Use tree_low_cst.
(dfs_search_base_offsets): Likewise.
(layout_nonempty_base_or_field): Reflect changes in RLI format
and call byte_position.
(layout_empty_base): Convert offset to ssizetype.
(build_base_field): use rli_size_unit_so_far.
(dfs_propagate_binfo_offsets): Do computation in proper type.
(layout_virtual_bases): Pass ssizetype to propagate_binfo_offsets.
(layout_class_type): Reflect changes in RLI names and fields.
(finish_struct_1): Set DECL_FIELD_OFFSET.
* dump.c (dequeue_and_dump): Call bit_position.
* expr.c (cplus_expand_constant): Use byte_position.
* rtti.c (expand_class_desc): Use bitsize_one_node.
* typeck.c (build_component_addr): Use byte_position and don't
special case for zero offset.
2000-03-24 Nathan Sidwell <nathan@codesourcery.com>
* decl.c (vtype_decl_p): Use TYPE_POLYMORPHIC_P.


@ -243,7 +243,7 @@ build_vbase_pointer_fields (rli, empty_p)
empty_p);
BINFO_VPTR_FIELD (base_binfo) = decl;
TREE_CHAIN (decl) = vbase_decls;
layout_field (rli, decl);
place_field (rli, decl);
vbase_decls = decl;
*empty_p = 0;
@ -912,13 +912,9 @@ tree
get_vfield_offset (binfo)
tree binfo;
{
tree tmp
= size_binop (FLOOR_DIV_EXPR,
bit_position (TYPE_VFIELD (BINFO_TYPE (binfo))),
bitsize_int (BITS_PER_UNIT));
return size_binop (PLUS_EXPR, convert (sizetype, tmp),
BINFO_OFFSET (binfo));
return
size_binop (PLUS_EXPR, byte_position (TYPE_VFIELD (BINFO_TYPE (binfo))),
BINFO_OFFSET (binfo));
}
/* Get the offset to the start of the original binfo that we derived
@ -981,7 +977,7 @@ set_rtti_entry (virtuals, offset, type)
/* The next node holds the decl. */
virtuals = TREE_CHAIN (virtuals);
offset = integer_zero_node;
offset = ssize_int (0);
}
/* This slot holds the function to call. */
@ -2794,7 +2790,6 @@ dfs_accumulate_vtbl_inits (binfo, data)
&& CLASSTYPE_VFIELDS (BINFO_TYPE (binfo))
&& BINFO_NEW_VTABLE_MARKED (binfo, t))
{
/* If this is a secondary vtable, record its location. */
if (binfo != TYPE_BINFO (t))
{
@ -4122,22 +4117,6 @@ build_vtbl_or_vbase_field (name, assembler_name, type, class_type, fcontext,
return field;
}
/* Return the BINFO_OFFSET for BINFO as a native integer, not an
INTEGER_CST. */
static unsigned HOST_WIDE_INT
get_binfo_offset_as_int (binfo)
tree binfo;
{
tree offset;
offset = BINFO_OFFSET (binfo);
my_friendly_assert (TREE_CODE (offset) == INTEGER_CST, 20000313);
my_friendly_assert (TREE_INT_CST_HIGH (offset) == 0, 20000313);
return (unsigned HOST_WIDE_INT) TREE_INT_CST_LOW (offset);
}
/* Record the type of BINFO in the slot in DATA (which is really a
`varray_type *') corresponding to the BINFO_OFFSET. */
@ -4147,7 +4126,7 @@ dfs_record_base_offsets (binfo, data)
void *data;
{
varray_type *v;
unsigned HOST_WIDE_INT offset = get_binfo_offset_as_int (binfo);
unsigned HOST_WIDE_INT offset = tree_low_cst (BINFO_OFFSET (binfo), 1);
v = (varray_type *) data;
while (VARRAY_SIZE (*v) <= offset)
@ -4184,11 +4163,10 @@ dfs_search_base_offsets (binfo, data)
if (is_empty_class (BINFO_TYPE (binfo)))
{
varray_type v = (varray_type) data;
unsigned HOST_WIDE_INT offset;
/* Find the offset for this BINFO. */
unsigned HOST_WIDE_INT offset = tree_low_cst (BINFO_OFFSET (binfo), 1);
tree t;
/* Find the offset for this BINFO. */
offset = get_binfo_offset_as_int (binfo);
/* If we haven't yet encountered any objects at offsets that
big, then there's no conflict. */
if (VARRAY_SIZE (v) <= offset)
@ -4238,14 +4216,14 @@ layout_nonempty_base_or_field (rli, decl, binfo, v)
while (1)
{
tree offset;
struct record_layout_info old_rli = *rli;
/* Layout this field. */
layout_field (rli, decl);
/* Place this field. */
place_field (rli, decl);
/* Now that we know where it will be placed, update its
BINFO_OFFSET. */
offset = size_int (CEIL (TREE_INT_CST_LOW (DECL_FIELD_BITPOS (decl)),
BITS_PER_UNIT));
offset = convert (ssizetype, byte_position (decl));
if (binfo)
propagate_binfo_offsets (binfo, offset);
@ -4267,17 +4245,20 @@ layout_nonempty_base_or_field (rli, decl, binfo, v)
if (binfo && flag_new_abi && layout_conflict_p (binfo, v))
{
/* Undo the propagate_binfo_offsets call. */
offset = convert (sizetype,
size_diffop (size_zero_node, offset));
offset = size_diffop (size_zero_node, offset);
propagate_binfo_offsets (binfo, offset);
/* Strip off the size allocated to this field. That puts us
at the first place we could have put the field with
proper alignment. */
rli->const_size -= TREE_INT_CST_LOW (DECL_SIZE (decl));
/* Bump up by th alignment required for the type, without
*rli = old_rli;
/* Bump up by the alignment required for the type, without
virtual base classes. */
rli->const_size += CLASSTYPE_ALIGN (BINFO_TYPE (binfo));
rli->bitpos
= size_binop (PLUS_EXPR, rli->bitpos,
bitsize_int (CLASSTYPE_ALIGN (BINFO_TYPE (binfo))));
normalize_rli (rli);
}
else
/* There was no conflict. We're done laying out this field. */
@ -4312,7 +4293,7 @@ layout_empty_base (binfo, eoc, binfo_offsets)
{
/* That didn't work. Now, we move forward from the next
available spot in the class. */
propagate_binfo_offsets (binfo, eoc);
propagate_binfo_offsets (binfo, convert (ssizetype, eoc));
while (1)
{
if (!layout_conflict_p (binfo, binfo_offsets))
@ -4320,7 +4301,7 @@ layout_empty_base (binfo, eoc, binfo_offsets)
break;
/* There's overlap here, too. Bump along to the next spot. */
propagate_binfo_offsets (binfo, size_one_node);
propagate_binfo_offsets (binfo, ssize_int (1));
}
}
}
@ -4379,9 +4360,7 @@ build_base_field (rli, binfo, empty_p, base_align, v)
layout_nonempty_base_or_field (rli, decl, binfo, *v);
}
else
layout_empty_base (binfo,
size_int (CEIL (rli->const_size, BITS_PER_UNIT)),
*v);
layout_empty_base (binfo, rli_size_unit_so_far (rli), *v);
/* Check for inaccessible base classes. If the same base class
appears more than once in the hierarchy, but isn't virtual, then
@ -4749,12 +4728,12 @@ dfs_propagate_binfo_offsets (binfo, data)
{
tree offset = (tree) data;
/* Update the BINFO_OFFSET for this base. */
BINFO_OFFSET (binfo) = fold (build (PLUS_EXPR,
sizetype,
BINFO_OFFSET (binfo),
offset));
/* Update the BINFO_OFFSET for this base. Allow for the case where it
might be negative. */
BINFO_OFFSET (binfo)
= convert (sizetype, size_binop (PLUS_EXPR,
convert (ssizetype, BINFO_OFFSET (binfo)),
offset));
SET_BINFO_MARKED (binfo);
return NULL_TREE;
@ -4890,7 +4869,7 @@ layout_virtual_bases (t, base_offsets)
dsize = CEIL (dsize, desired_align) * desired_align;
/* And compute the offset of the virtual base. */
propagate_binfo_offsets (vbase,
size_int (CEIL (dsize, BITS_PER_UNIT)));
ssize_int (CEIL (dsize, BITS_PER_UNIT)));
/* Every virtual baseclass takes a least a UNIT, so that
we can take it's address and get something different
for each base. */
@ -4934,8 +4913,8 @@ layout_virtual_bases (t, base_offsets)
dsize = CEIL (dsize, TYPE_ALIGN (t)) * TYPE_ALIGN (t);
TYPE_SIZE (t) = bitsize_int (dsize);
TYPE_SIZE_UNIT (t) = convert (sizetype,
size_binop (FLOOR_DIV_EXPR, TYPE_SIZE (t),
bitsize_int (BITS_PER_UNIT)));
size_binop (CEIL_DIV_EXPR, TYPE_SIZE (t),
bitsize_unit_node));
/* Check for ambiguous virtual bases. */
if (extra_warnings)
@ -5009,8 +4988,8 @@ layout_class_type (t, empty_p, has_virtual_p,
/* Keep track of the first non-static data member. */
non_static_data_members = TYPE_FIELDS (t);
/* Initialize the layout information. */
rli = new_record_layout_info (t);
/* Start laying out the record. */
rli = start_record_layout (t);
/* If possible, we reuse the virtual function table pointer from one
of our base classes. */
@ -5025,7 +5004,7 @@ layout_class_type (t, empty_p, has_virtual_p,
if (flag_new_abi && vptr)
{
TYPE_FIELDS (t) = chainon (vptr, TYPE_FIELDS (t));
layout_field (rli, vptr);
place_field (rli, vptr);
}
/* Add pointers to all of our virtual base-classes. */
@ -5040,9 +5019,7 @@ layout_class_type (t, empty_p, has_virtual_p,
fixup_inline_methods (t);
/* Layout the non-static data members. */
for (field = non_static_data_members;
field;
field = TREE_CHAIN (field))
for (field = non_static_data_members; field; field = TREE_CHAIN (field))
{
tree binfo;
tree type;
@ -5052,7 +5029,7 @@ layout_class_type (t, empty_p, has_virtual_p,
the back-end, in case it wants to do something with them. */
if (TREE_CODE (field) != FIELD_DECL)
{
layout_field (rli, field);
place_field (rli, field);
continue;
}
@ -5067,9 +5044,9 @@ layout_class_type (t, empty_p, has_virtual_p,
&& ((flag_new_abi
&& INT_CST_LT (TYPE_SIZE (type), DECL_SIZE (field)))
|| (!flag_new_abi
&& compare_tree_int (DECL_SIZE (field),
TYPE_PRECISION
(long_long_unsigned_type_node)) > 0)))
&& 0 < compare_tree_int (DECL_SIZE (field),
TYPE_PRECISION
(long_long_unsigned_type_node)))))
{
integer_type_kind itk;
tree integer_type;
@ -5087,8 +5064,8 @@ layout_class_type (t, empty_p, has_virtual_p,
field. We have to back up by one to find the largest
type that fits. */
integer_type = integer_types[itk - 1];
padding = size_diffop (DECL_SIZE (field),
TYPE_SIZE (integer_type));
padding = size_binop (MINUS_EXPR, DECL_SIZE (field),
TYPE_SIZE (integer_type));
DECL_SIZE (field) = TYPE_SIZE (integer_type);
DECL_ALIGN (field) = TYPE_ALIGN (integer_type);
}
@ -5122,13 +5099,15 @@ layout_class_type (t, empty_p, has_virtual_p,
offset. However, now we need to make sure that RLI is big enough
to reflect the entire class. */
eoc = end_of_class (t, /*include_virtuals_p=*/0);
if (eoc * BITS_PER_UNIT > rli->const_size)
if (TREE_CODE (rli_size_unit_so_far (rli)) == INTEGER_CST
&& compare_tree_int (rli_size_unit_so_far (rli), eoc) < 0)
{
/* We don't handle zero-sized base classes specially under the
old ABI, so if we get here, we had better be operating under
the new ABI rules. */
my_friendly_assert (flag_new_abi, 20000321);
rli->const_size = (eoc + 1) * BITS_PER_UNIT;
rli->offset = size_binop (MAX_EXPR, rli->offset, size_int (eoc + 1));
rli->bitpos = bitsize_zero_node;
}
/* We make all structures have at least one element, so that they
@ -5141,7 +5120,7 @@ layout_class_type (t, empty_p, has_virtual_p,
tree padding;
padding = build_lang_decl (FIELD_DECL, NULL_TREE, char_type_node);
layout_field (rli, padding);
place_field (rli, padding);
TYPE_NONCOPIED_PARTS (t)
= tree_cons (NULL_TREE, padding, TYPE_NONCOPIED_PARTS (t));
TREE_STATIC (TYPE_NONCOPIED_PARTS (t)) = 1;
@ -5151,7 +5130,7 @@ layout_class_type (t, empty_p, has_virtual_p,
class. */
if (!flag_new_abi && vptr)
{
layout_field (rli, vptr);
place_field (rli, vptr);
TYPE_FIELDS (t) = chainon (TYPE_FIELDS (t), vptr);
}
@ -5170,7 +5149,7 @@ layout_class_type (t, empty_p, has_virtual_p,
the virtual bases. */
if (*empty_p && flag_new_abi)
{
CLASSTYPE_SIZE (t) = bitsize_int (0);
CLASSTYPE_SIZE (t) = bitsize_zero_node;
CLASSTYPE_SIZE_UNIT (t) = size_zero_node;
}
else if (flag_new_abi && TYPE_HAS_COMPLEX_INIT_REF (t)
@ -5283,18 +5262,15 @@ finish_struct_1 (t)
if (vfield != NULL_TREE
&& DECL_FIELD_CONTEXT (vfield) != t)
{
tree binfo = get_binfo (DECL_FIELD_CONTEXT (vfield), t, 0);
tree offset = convert (bitsizetype, BINFO_OFFSET (binfo));
vfield = copy_node (vfield);
copy_lang_decl (vfield);
if (! integer_zerop (offset))
offset = size_binop (MULT_EXPR, offset, bitsize_int (BITS_PER_UNIT));
DECL_FIELD_CONTEXT (vfield) = t;
DECL_FIELD_BITPOS (vfield)
= size_binop (PLUS_EXPR, offset, bit_position (vfield));
DECL_FIELD_OFFSET (vfield)
= size_binop (PLUS_EXPR,
BINFO_OFFSET (get_binfo (DECL_FIELD_CONTEXT (vfield),
t, 0)),
DECL_FIELD_OFFSET (vfield));
TYPE_VFIELD (t) = vfield;
}


@ -550,7 +550,7 @@ dequeue_and_dump (di)
{
if (DECL_C_BIT_FIELD (t))
dump_string (di, "bitfield");
dump_child ("bpos", DECL_FIELD_BITPOS (t));
dump_child ("bpos", bit_position (t));
}
break;


@ -52,7 +52,6 @@ cplus_expand_constant (cst)
{
tree type = TREE_TYPE (cst);
tree member;
tree offset;
/* Find the member. */
member = PTRMEM_CST_MEMBER (cst);
@ -60,10 +59,7 @@ cplus_expand_constant (cst)
if (TREE_CODE (member) == FIELD_DECL)
{
/* Find the offset for the field. */
offset = convert (sizetype,
size_binop (EASY_DIV_EXPR,
bit_position (member),
bitsize_int (BITS_PER_UNIT)));
tree offset = byte_position (member);
if (flag_new_abi)
/* Under the new ABI, we use -1 to represent the NULL
@ -80,15 +76,10 @@ cplus_expand_constant (cst)
}
else
{
tree delta;
tree idx;
tree pfn;
tree delta2;
tree delta, idx, pfn, delta2;
expand_ptrmemfunc_cst (cst, &delta, &idx, &pfn, &delta2);
cst = build_ptrmemfunc1 (type, delta, idx,
pfn, delta2);
cst = build_ptrmemfunc1 (type, delta, idx, pfn, delta2);
}
}
break;


@ -1228,7 +1228,7 @@ build_mangled_name (parmtypes, begin, end)
if (end)
OB_FINISH ();
return (char *)obstack_base (&scratch_obstack);
return (char *) obstack_base (&scratch_obstack);
}
/* Emit modifiers such as constant, read-only, and volatile. */


@ -513,30 +513,23 @@ get_base_offset (binfo, parent)
tree binfo;
tree parent;
{
tree offset;
if (!TREE_VIA_VIRTUAL (binfo))
offset = BINFO_OFFSET (binfo);
else if (!vbase_offsets_in_vtable_p ())
if (! TREE_VIA_VIRTUAL (binfo))
return BINFO_OFFSET (binfo);
else if (! vbase_offsets_in_vtable_p ())
{
tree t = BINFO_TYPE (binfo);
const char *name;
tree field;
FORMAT_VBASE_NAME (name, t);
field = lookup_field (parent, get_identifier (name), 0, 0);
offset = size_binop (FLOOR_DIV_EXPR, bit_position (field),
bitsize_int (BITS_PER_UNIT));
offset = convert (sizetype, offset);
FORMAT_VBASE_NAME (name, BINFO_TYPE (binfo));
return byte_position (lookup_field (parent, get_identifier (name),
0, 0));
}
else
{
/* Under the new ABI, we store the vtable offset at which
the virtual base offset can be found. */
tree vbase = BINFO_FOR_VBASE (BINFO_TYPE (binfo), parent);
offset = convert (sizetype, BINFO_VPTR_FIELD (vbase));
}
return offset;
/* Under the new ABI, we store the vtable offset at which
the virtual base offset can be found. */
return convert (sizetype,
BINFO_VPTR_FIELD (BINFO_FOR_VBASE (BINFO_TYPE (binfo),
parent)));
}
/* Execute a dynamic cast, as described in section 5.2.6 of the 9/93 working
@ -941,7 +934,7 @@ expand_class_desc (tdecl, type)
fields [2] = build_lang_decl (FIELD_DECL, NULL_TREE, boolean_type_node);
DECL_BIT_FIELD (fields[2]) = 1;
DECL_SIZE (fields[2]) = bitsize_int (1);
DECL_SIZE (fields[2]) = bitsize_one_node;
/* Actually enum access */
fields [3] = build_lang_decl (FIELD_DECL, NULL_TREE, integer_type_node);


@ -4279,18 +4279,8 @@ build_component_addr (arg, argtype)
/* This conversion is harmless. */
rval = convert_force (argtype, rval, 0);
if (! integer_zerop (bit_position (field)))
{
tree offset = size_binop (EASY_DIV_EXPR, bit_position (field),
bitsize_int (BITS_PER_UNIT));
int flag = TREE_CONSTANT (rval);
offset = convert (sizetype, offset);
rval = fold (build (PLUS_EXPR, argtype,
rval, cp_convert (argtype, offset)));
TREE_CONSTANT (rval) = flag;
}
return rval;
return fold (build (PLUS_EXPR, argtype, rval,
cp_convert (argtype, byte_position (field))));
}
/* Construct and perhaps optimize a tree representation

gcc/cse.c

@ -246,7 +246,7 @@ struct qty_table_elem
rtx const_insn;
rtx comparison_const;
int comparison_qty;
int first_reg, last_reg;
unsigned int first_reg, last_reg;
enum machine_mode mode;
enum rtx_code comparison_code;
};
@ -302,7 +302,7 @@ struct cse_reg_info
struct cse_reg_info *next;
/* Search key */
int regno;
unsigned int regno;
/* The quantity number of the register's current contents. */
int reg_qty;
@ -336,7 +336,7 @@ static struct cse_reg_info *reg_hash[REGHASH_SIZE];
/* The last lookup we did into the cse_reg_info_tree. This allows us
to cache repeated lookups. */
static int cached_regno;
static unsigned int cached_regno;
static struct cse_reg_info *cached_cse_reg_info;
/* A HARD_REG_SET containing all the hard registers for which there is
@ -531,7 +531,7 @@ struct table_elt
/* Determine if the quantity number for register X represents a valid index
into the qty_table. */
#define REGNO_QTY_VALID_P(N) (REG_QTY (N) != (N))
#define REGNO_QTY_VALID_P(N) (REG_QTY (N) != (int) (N))
#ifdef ADDRESS_COST
/* The ADDRESS_COST macro does not deal with ADDRESSOF nodes. But,
@ -653,9 +653,9 @@ struct cse_basic_block_data
static int notreg_cost PARAMS ((rtx));
static void new_basic_block PARAMS ((void));
static void make_new_qty PARAMS ((int, enum machine_mode));
static void make_regs_eqv PARAMS ((int, int));
static void delete_reg_equiv PARAMS ((int));
static void make_new_qty PARAMS ((unsigned int, enum machine_mode));
static void make_regs_eqv PARAMS ((unsigned int, unsigned int));
static void delete_reg_equiv PARAMS ((unsigned int));
static int mention_regs PARAMS ((rtx));
static int insert_regs PARAMS ((rtx, struct table_elt *, int));
static void remove_from_table PARAMS ((struct table_elt *, unsigned));
@ -668,8 +668,9 @@ static void merge_equiv_classes PARAMS ((struct table_elt *,
struct table_elt *));
static void invalidate PARAMS ((rtx, enum machine_mode));
static int cse_rtx_varies_p PARAMS ((rtx));
static void remove_invalid_refs PARAMS ((int));
static void remove_invalid_subreg_refs PARAMS ((int, int, enum machine_mode));
static void remove_invalid_refs PARAMS ((unsigned int));
static void remove_invalid_subreg_refs PARAMS ((unsigned int, unsigned int,
enum machine_mode));
static void rehash_using_reg PARAMS ((rtx));
static void invalidate_memory PARAMS ((void));
static void invalidate_for_call PARAMS ((void));
@ -699,7 +700,7 @@ static void cse_set_around_loop PARAMS ((rtx, rtx, rtx));
static rtx cse_basic_block PARAMS ((rtx, rtx, struct branch_path *, int));
static void count_reg_usage PARAMS ((rtx, int *, rtx, int));
extern void dump_class PARAMS ((struct table_elt*));
static struct cse_reg_info* get_cse_reg_info PARAMS ((int));
static struct cse_reg_info * get_cse_reg_info PARAMS ((unsigned int));
static void flush_hash_table PARAMS ((void));
@ -845,7 +846,7 @@ rtx_cost (x, outer_code)
static struct cse_reg_info *
get_cse_reg_info (regno)
int regno;
unsigned int regno;
{
struct cse_reg_info **hash_head = &reg_hash[REGHASH_FN (regno)];
struct cse_reg_info *p;
@ -949,8 +950,8 @@ new_basic_block ()
static void
make_new_qty (reg, mode)
register int reg;
register enum machine_mode mode;
unsigned int reg;
enum machine_mode mode;
{
register int q;
register struct qty_table_elem *ent;
@ -976,11 +977,11 @@ make_new_qty (reg, mode)
static void
make_regs_eqv (new, old)
register int new, old;
unsigned int new, old;
{
register int lastr, firstr;
register int q = REG_QTY (old);
register struct qty_table_elem *ent;
unsigned int lastr, firstr;
int q = REG_QTY (old);
struct qty_table_elem *ent;
ent = &qty_table[q];
@ -1040,14 +1041,14 @@ make_regs_eqv (new, old)
static void
delete_reg_equiv (reg)
register int reg;
unsigned int reg;
{
register struct qty_table_elem *ent;
register int q = REG_QTY (reg);
register int p, n;
/* If invalid, do nothing. */
if (q == reg)
if (q == (int) reg)
return;
ent = &qty_table[q];
@ -1094,11 +1095,11 @@ mention_regs (x)
code = GET_CODE (x);
if (code == REG)
{
register int regno = REGNO (x);
register int endregno
unsigned int regno = REGNO (x);
unsigned int endregno
= regno + (regno >= FIRST_PSEUDO_REGISTER ? 1
: HARD_REGNO_NREGS (regno, GET_MODE (x)));
int i;
unsigned int i;
for (i = regno; i < endregno; i++)
{
@ -1117,7 +1118,7 @@ mention_regs (x)
if (code == SUBREG && GET_CODE (SUBREG_REG (x)) == REG
&& REGNO (SUBREG_REG (x)) >= FIRST_PSEUDO_REGISTER)
{
int i = REGNO (SUBREG_REG (x));
unsigned int i = REGNO (SUBREG_REG (x));
if (REG_IN_TABLE (i) >= 0 && REG_IN_TABLE (i) != REG_TICK (i))
{
@ -1193,8 +1194,8 @@ insert_regs (x, classp, modified)
{
if (GET_CODE (x) == REG)
{
register int regno = REGNO (x);
register int qty_valid;
unsigned int regno = REGNO (x);
int qty_valid;
/* If REGNO is in the equivalence table already but is of the
wrong mode for that equivalence, don't do anything here. */
@ -1237,7 +1238,7 @@ insert_regs (x, classp, modified)
else if (GET_CODE (x) == SUBREG && GET_CODE (SUBREG_REG (x)) == REG
&& ! REGNO_QTY_VALID_P (REGNO (SUBREG_REG (x))))
{
int regno = REGNO (SUBREG_REG (x));
unsigned int regno = REGNO (SUBREG_REG (x));
insert_regs (SUBREG_REG (x), NULL_PTR, 0);
/* Mention_regs checks if REG_TICK is exactly one larger than
@ -1324,6 +1325,7 @@ remove_from_table (elt, hash)
if (elt->related_value != 0 && elt->related_value != elt)
{
register struct table_elt *p = elt->related_value;
while (p->related_value != elt)
p = p->related_value;
p->related_value = elt->related_value;
@ -1374,7 +1376,8 @@ lookup_for_remove (x, hash, mode)
if (GET_CODE (x) == REG)
{
int regno = REGNO (x);
unsigned int regno = REGNO (x);
/* Don't check the machine mode when comparing registers;
invalidating (REG:SI 0) also invalidates (REG:DF 0). */
for (p = table[hash]; p; p = p->next_same_hash)
@ -1400,8 +1403,9 @@ lookup_as_function (x, code)
rtx x;
enum rtx_code code;
{
register struct table_elt *p = lookup (x, safe_hash (x, VOIDmode) & HASH_MASK,
GET_MODE (x));
register struct table_elt *p
= lookup (x, safe_hash (x, VOIDmode) & HASH_MASK, GET_MODE (x));
/* If we are looking for a CONST_INT, the mode doesn't really matter, as
long as we are narrowing. So if we looked in vain for a mode narrower
than word_mode before, look for word_mode now. */
@ -1417,12 +1421,10 @@ lookup_as_function (x, code)
return 0;
for (p = p->first_same_value; p; p = p->next_same_value)
{
if (GET_CODE (p->exp) == code
/* Make sure this is a valid entry in the table. */
&& exp_equiv_p (p->exp, p->exp, 1, 0))
return p->exp;
}
if (GET_CODE (p->exp) == code
/* Make sure this is a valid entry in the table. */
&& exp_equiv_p (p->exp, p->exp, 1, 0))
return p->exp;
return 0;
}
@ -1470,12 +1472,12 @@ insert (x, classp, hash, mode)
/* If X is a hard register, show it is being put in the table. */
if (GET_CODE (x) == REG && REGNO (x) < FIRST_PSEUDO_REGISTER)
{
int regno = REGNO (x);
int endregno = regno + HARD_REGNO_NREGS (regno, GET_MODE (x));
int i;
unsigned int regno = REGNO (x);
unsigned int endregno = regno + HARD_REGNO_NREGS (regno, GET_MODE (x));
unsigned int i;
for (i = regno; i < endregno; i++)
SET_HARD_REG_BIT (hard_regs_in_table, i);
SET_HARD_REG_BIT (hard_regs_in_table, i);
}
/* If X is a label, show we recorded it. */
@ -1488,9 +1490,7 @@ insert (x, classp, hash, mode)
elt = free_element_chain;
if (elt)
{
free_element_chain = elt->next_same_hash;
}
free_element_chain = elt->next_same_hash;
else
{
n_elements_made++;
@ -1538,12 +1538,15 @@ insert (x, classp, hash, mode)
/* Insert not at head of the class. */
/* Put it after the last element cheaper than X. */
register struct table_elt *p, *next;
for (p = classp; (next = p->next_same_value) && CHEAPER (next, elt);
p = next);
/* Put it after P and before NEXT. */
elt->next_same_value = next;
if (next)
next->prev_same_value = elt;
elt->prev_same_value = p;
p->next_same_value = elt;
elt->first_same_value = classp;
@ -1591,7 +1594,8 @@ insert (x, classp, hash, mode)
int x_q = REG_QTY (REGNO (x));
struct qty_table_elem *x_ent = &qty_table[x_q];
x_ent->const_rtx = gen_lowpart_if_possible (GET_MODE (x), p->exp);
x_ent->const_rtx
= gen_lowpart_if_possible (GET_MODE (x), p->exp);
x_ent->const_insn = this_insn;
break;
}
@ -1661,7 +1665,7 @@ merge_equiv_classes (class1, class2)
for (elt = class2; elt; elt = next)
{
unsigned hash;
unsigned int hash;
rtx exp = elt->exp;
enum machine_mode mode = elt->mode;
@ -1740,8 +1744,8 @@ invalidate (x, full_mode)
through the qty number mechanism. Just change the qty number of
the register, mark it as invalid for expressions that refer to it,
and remove it itself. */
register int regno = REGNO (x);
register unsigned hash = HASH (x, GET_MODE (x));
unsigned int regno = REGNO (x);
unsigned int hash = HASH (x, GET_MODE (x));
/* Remove REGNO from any quantity list it might be on and indicate
that its value might have changed. If it is a pseudo, remove its
@ -1768,18 +1772,19 @@ invalidate (x, full_mode)
{
HOST_WIDE_INT in_table
= TEST_HARD_REG_BIT (hard_regs_in_table, regno);
int endregno = regno + HARD_REGNO_NREGS (regno, GET_MODE (x));
int tregno, tendregno;
unsigned int endregno
= regno + HARD_REGNO_NREGS (regno, GET_MODE (x));
unsigned int tregno, tendregno, rn;
register struct table_elt *p, *next;
CLEAR_HARD_REG_BIT (hard_regs_in_table, regno);
for (i = regno + 1; i < endregno; i++)
for (rn = regno + 1; rn < endregno; rn++)
{
in_table |= TEST_HARD_REG_BIT (hard_regs_in_table, i);
CLEAR_HARD_REG_BIT (hard_regs_in_table, i);
delete_reg_equiv (i);
REG_TICK (i)++;
in_table |= TEST_HARD_REG_BIT (hard_regs_in_table, rn);
CLEAR_HARD_REG_BIT (hard_regs_in_table, rn);
delete_reg_equiv (rn);
REG_TICK (rn)++;
}
if (in_table)
@ -1851,10 +1856,10 @@ invalidate (x, full_mode)
static void
remove_invalid_refs (regno)
int regno;
unsigned int regno;
{
register int i;
register struct table_elt *p, *next;
unsigned int i;
struct table_elt *p, *next;
for (i = 0; i < HASH_SIZE; i++)
for (p = table[i]; p; p = next)
@ -1869,13 +1874,13 @@ remove_invalid_refs (regno)
/* Likewise for a subreg with subreg_reg WORD and mode MODE. */
static void
remove_invalid_subreg_refs (regno, word, mode)
int regno;
int word;
unsigned int regno;
unsigned int word;
enum machine_mode mode;
{
register int i;
register struct table_elt *p, *next;
int end = word + (GET_MODE_SIZE (mode) - 1) / UNITS_PER_WORD;
unsigned int i;
struct table_elt *p, *next;
unsigned int end = word + (GET_MODE_SIZE (mode) - 1) / UNITS_PER_WORD;
for (i = 0; i < HASH_SIZE; i++)
for (p = table[i]; p; p = next)
@ -1956,8 +1961,8 @@ rehash_using_reg (x)
static void
invalidate_for_call ()
{
int regno, endregno;
int i;
unsigned int regno, endregno;
unsigned int i;
unsigned hash;
struct table_elt *p, *next;
int in_table = 0;
@ -2111,7 +2116,7 @@ canon_hash (x, mode)
{
case REG:
{
register int regno = REGNO (x);
unsigned int regno = REGNO (x);
/* On some machines, we can't record any non-fixed hard register,
because extending its life will cause reload problems. We
@ -2136,6 +2141,7 @@ canon_hash (x, mode)
do_not_record = 1;
return 0;
}
hash += ((unsigned) REG << 7) + (unsigned) REG_QTY (regno);
return hash;
}
@ -2374,11 +2380,11 @@ exp_equiv_p (x, y, validate, equal_values)
case REG:
{
int regno = REGNO (y);
int endregno
unsigned int regno = REGNO (y);
unsigned int endregno
= regno + (regno >= FIRST_PSEUDO_REGISTER ? 1
: HARD_REGNO_NREGS (regno, GET_MODE (y)));
int i;
unsigned int i;
/* If the quantities are not the same, the expressions are not
equivalent. If there are and we are not to validate, they
@ -5703,11 +5709,11 @@ cse_insn (insn, libcall_insn)
This code is similar to the REG case in mention_regs,
but it knows that reg_tick has been incremented, and
it leaves reg_in_table as -1 . */
register int regno = REGNO (x);
register int endregno
unsigned int regno = REGNO (x);
unsigned int endregno
= regno + (regno >= FIRST_PSEUDO_REGISTER ? 1
: HARD_REGNO_NREGS (regno, GET_MODE (x)));
int i;
unsigned int i;
for (i = regno; i < endregno; i++)
{


@ -2370,8 +2370,7 @@ dbxout_parms (parms)
If we use DECL_RTL, then we must use the declared type of
the variable, not the type that it arrived in. */
if (REGNO (DECL_RTL (parms)) >= 0
&& REGNO (DECL_RTL (parms)) < FIRST_PSEUDO_REGISTER)
if (REGNO (DECL_RTL (parms)) < FIRST_PSEUDO_REGISTER)
{
best_rtl = DECL_RTL (parms);
parm_type = TREE_TYPE (parms);
@ -2430,8 +2429,7 @@ dbxout_parms (parms)
/* DECL_RTL looks like (MEM (REG...). Get the register number.
If it is an unallocated pseudo-reg, then use the register where
it was passed instead. */
if (REGNO (XEXP (DECL_RTL (parms), 0)) >= 0
&& REGNO (XEXP (DECL_RTL (parms), 0)) < FIRST_PSEUDO_REGISTER)
if (REGNO (XEXP (DECL_RTL (parms), 0)) < FIRST_PSEUDO_REGISTER)
current_sym_value = REGNO (XEXP (DECL_RTL (parms), 0));
else
current_sym_value = REGNO (DECL_INCOMING_RTL (parms));
@ -2558,7 +2556,6 @@ dbxout_reg_parms (parms)
/* Report parms that live in registers during the function
but were passed in memory. */
if (GET_CODE (DECL_RTL (parms)) == REG
&& REGNO (DECL_RTL (parms)) >= 0
&& REGNO (DECL_RTL (parms)) < FIRST_PSEUDO_REGISTER)
dbxout_symbol_location (parms, TREE_TYPE (parms),
0, DECL_RTL (parms));


@ -218,7 +218,7 @@ static void reg_save PARAMS ((char *, unsigned, unsigned,
static void initial_return_save PARAMS ((rtx));
static void output_cfi PARAMS ((dw_cfi_ref, dw_fde_ref));
static void output_call_frame_info PARAMS ((int));
static unsigned reg_number PARAMS ((rtx));
static unsigned int reg_number PARAMS ((rtx));
static void dwarf2out_stack_adjust PARAMS ((rtx));
static void dwarf2out_frame_debug_expr PARAMS ((rtx, char *));
@ -553,7 +553,7 @@ stripattributes (s)
/* Return the register number described by a given RTL node. */
static unsigned
static unsigned int
reg_number (rtl)
register rtx rtl;
{
@ -1314,7 +1314,7 @@ dwarf2out_frame_debug_expr (expr, label)
/* Without an offset. */
case REG:
if (cfa_store_reg != (unsigned) REGNO (XEXP (dest, 0)))
if (cfa_store_reg != REGNO (XEXP (dest, 0)))
abort();
offset = -cfa_store_offset;
break;
@@ -2670,9 +2670,9 @@ static inline int
is_pseudo_reg (rtl)
register rtx rtl;
{
return (((GET_CODE (rtl) == REG) && (REGNO (rtl) >= FIRST_PSEUDO_REGISTER))
|| ((GET_CODE (rtl) == SUBREG)
&& (REGNO (XEXP (rtl, 0)) >= FIRST_PSEUDO_REGISTER)));
return ((GET_CODE (rtl) == REG && REGNO (rtl) >= FIRST_PSEUDO_REGISTER)
|| (GET_CODE (rtl) == SUBREG
&& REGNO (XEXP (rtl, 0)) >= FIRST_PSEUDO_REGISTER));
}
/* Return a reference to a type, with its const and volatile qualifiers

Index: gcc/emit-rtl.c

@@ -924,7 +924,8 @@ subreg_realpart_p (x)
if (GET_CODE (x) != SUBREG)
abort ();
return SUBREG_WORD (x) * UNITS_PER_WORD < GET_MODE_UNIT_SIZE (GET_MODE (SUBREG_REG (x)));
return ((unsigned int) SUBREG_WORD (x) * UNITS_PER_WORD
< GET_MODE_UNIT_SIZE (GET_MODE (SUBREG_REG (x))));
}
/* Assuming that X is an rtx (e.g., MEM, REG or SUBREG) for a value,
@@ -1104,7 +1105,7 @@ subreg_lowpart_p (x)
rtx
operand_subword (op, i, validate_address, mode)
rtx op;
int i;
unsigned int i;
int validate_address;
enum machine_mode mode;
{
@@ -1181,7 +1182,9 @@ operand_subword (op, i, validate_address, mode)
return gen_rtx_SUBREG (word_mode, SUBREG_REG (op), i + SUBREG_WORD (op));
else if (GET_CODE (op) == CONCAT)
{
int partwords = GET_MODE_UNIT_SIZE (GET_MODE (op)) / UNITS_PER_WORD;
unsigned int partwords
= GET_MODE_UNIT_SIZE (GET_MODE (op)) / UNITS_PER_WORD;
if (i < partwords)
return operand_subword (XEXP (op, 0), i, validate_address, mode);
return operand_subword (XEXP (op, 1), i - partwords,
@@ -1428,7 +1431,7 @@ operand_subword (op, i, validate_address, mode)
rtx
operand_subword_force (op, i, mode)
rtx op;
int i;
unsigned int i;
enum machine_mode mode;
{
rtx result = operand_subword (op, i, 1, mode);

Index: gcc/except.c

@@ -2924,7 +2924,7 @@ eh_regs (pcontext, psp, pra, outgoing)
int outgoing ATTRIBUTE_UNUSED;
{
rtx rcontext, rsp, rra;
int i;
unsigned int i;
#ifdef FUNCTION_OUTGOING_VALUE
if (outgoing)

Index: gcc/explow.c

@@ -1584,17 +1584,20 @@ hard_function_value (valtype, func, outgoing)
int outgoing ATTRIBUTE_UNUSED;
{
rtx val;
#ifdef FUNCTION_OUTGOING_VALUE
if (outgoing)
val = FUNCTION_OUTGOING_VALUE (valtype, func);
else
#endif
val = FUNCTION_VALUE (valtype, func);
if (GET_CODE (val) == REG
&& GET_MODE (val) == BLKmode)
{
int bytes = int_size_in_bytes (valtype);
unsigned HOST_WIDE_INT bytes = int_size_in_bytes (valtype);
enum machine_mode tmpmode;
for (tmpmode = GET_CLASS_NARROWEST_MODE (MODE_INT);
tmpmode != VOIDmode;
tmpmode = GET_MODE_WIDER_MODE (tmpmode))

Index: gcc/expmed.c

@@ -35,18 +35,24 @@ Boston, MA 02111-1307, USA. */
#include "real.h"
#include "recog.h"
static void store_fixed_bit_field PARAMS ((rtx, int, int, int, rtx,
static void store_fixed_bit_field PARAMS ((rtx, unsigned HOST_WIDE_INT,
unsigned HOST_WIDE_INT,
unsigned HOST_WIDE_INT, rtx,
unsigned int));
static void store_split_bit_field PARAMS ((rtx, int, int, rtx,
unsigned int));
static rtx extract_fixed_bit_field PARAMS ((enum machine_mode, rtx, int,
int, int, rtx, int,
static void store_split_bit_field PARAMS ((rtx, unsigned HOST_WIDE_INT,
unsigned HOST_WIDE_INT, rtx,
unsigned int));
static rtx extract_fixed_bit_field PARAMS ((enum machine_mode, rtx,
unsigned HOST_WIDE_INT,
unsigned HOST_WIDE_INT,
unsigned HOST_WIDE_INT,
rtx, int, unsigned int));
static rtx mask_rtx PARAMS ((enum machine_mode, int,
int, int));
static rtx lshift_value PARAMS ((enum machine_mode, rtx,
int, int));
static rtx extract_split_bit_field PARAMS ((rtx, int, int, int,
static rtx extract_split_bit_field PARAMS ((rtx, unsigned HOST_WIDE_INT,
unsigned HOST_WIDE_INT, int,
unsigned int));
static void do_cmp_and_jump PARAMS ((rtx, rtx, enum rtx_code,
enum machine_mode, rtx));
@@ -225,19 +231,20 @@ negate_rtx (mode, x)
rtx
store_bit_field (str_rtx, bitsize, bitnum, fieldmode, value, align, total_size)
rtx str_rtx;
register int bitsize;
int bitnum;
unsigned HOST_WIDE_INT bitsize;
unsigned HOST_WIDE_INT bitnum;
enum machine_mode fieldmode;
rtx value;
unsigned int align;
int total_size;
HOST_WIDE_INT total_size;
{
int unit = (GET_CODE (str_rtx) == MEM) ? BITS_PER_UNIT : BITS_PER_WORD;
register int offset = bitnum / unit;
register int bitpos = bitnum % unit;
unsigned int unit
= (GET_CODE (str_rtx) == MEM) ? BITS_PER_UNIT : BITS_PER_WORD;
unsigned HOST_WIDE_INT offset = bitnum / unit;
unsigned HOST_WIDE_INT bitpos = bitnum % unit;
register rtx op0 = str_rtx;
#ifdef HAVE_insv
int insv_bitsize;
unsigned HOST_WIDE_INT insv_bitsize;
enum machine_mode op_mode;
op_mode = insn_data[(int) CODE_FOR_insv].operand[3].mode;
@@ -384,10 +391,9 @@ store_bit_field (str_rtx, bitsize, bitnum, fieldmode, value, align, total_size)
be less than full.
However, only do that if the value is not BLKmode. */
int backwards = WORDS_BIG_ENDIAN && fieldmode != BLKmode;
int nwords = (bitsize + (BITS_PER_WORD - 1)) / BITS_PER_WORD;
int i;
unsigned int backwards = WORDS_BIG_ENDIAN && fieldmode != BLKmode;
unsigned int nwords = (bitsize + (BITS_PER_WORD - 1)) / BITS_PER_WORD;
unsigned int i;
/* This is the mode we must force value to, so that there will be enough
subwords to extract. Note that fieldmode will often (always?) be
@@ -400,10 +406,13 @@ store_bit_field (str_rtx, bitsize, bitnum, fieldmode, value, align, total_size)
{
/* If I is 0, use the low-order word in both field and target;
if I is 1, use the next to lowest word; and so on. */
int wordnum = (backwards ? nwords - i - 1 : i);
int bit_offset = (backwards
? MAX (bitsize - (i + 1) * BITS_PER_WORD, 0)
: i * BITS_PER_WORD);
unsigned int wordnum = (backwards ? nwords - i - 1 : i);
unsigned int bit_offset = (backwards
? MAX ((int) bitsize - ((int) i + 1)
* BITS_PER_WORD,
0)
: (int) i * BITS_PER_WORD);
store_bit_field (op0, MIN (BITS_PER_WORD,
bitsize - i * BITS_PER_WORD),
bitnum + bit_offset, word_mode,
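The casts back to `int` inside `MAX` are the subtle part of the hunk above: now that `bitsize` and `i` are unsigned, the subtraction would wrap to a huge positive value instead of going negative, and the clamp against 0 would never fire. A sketch of the failure mode and the patch's fix, assuming 32-bit words (function names are invented for illustration):

```c
#define BITS_PER_WORD 32
#define MAX(a, b) ((a) > (b) ? (a) : (b))

/* Wrong once bitsize is unsigned: the subtraction wraps to a huge
   value instead of going negative, so the clamp to 0 never fires.  */
static unsigned int
bit_offset_buggy (unsigned int bitsize, unsigned int i)
{
  return MAX (bitsize - (i + 1) * BITS_PER_WORD, 0u);
}

/* The patch's approach: cast back to int, subtract, then clamp.  */
static unsigned int
bit_offset_fixed (unsigned int bitsize, unsigned int i)
{
  return MAX ((int) bitsize - ((int) i + 1) * BITS_PER_WORD, 0);
}
```

For a 40-bit field and word index 1, the signed version correctly clamps to 0 while the unsigned subtraction yields a value near `UINT_MAX`.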
@@ -513,7 +522,7 @@ store_bit_field (str_rtx, bitsize, bitnum, fieldmode, value, align, total_size)
if (bestmode == VOIDmode
|| (SLOW_UNALIGNED_ACCESS (bestmode, align)
&& GET_MODE_SIZE (bestmode) > (int) align))
&& GET_MODE_SIZE (bestmode) > align))
goto insv_loses;
/* Adjust address to point to the containing unit of that mode. */
@@ -626,12 +635,12 @@ store_bit_field (str_rtx, bitsize, bitnum, fieldmode, value, align, total_size)
static void
store_fixed_bit_field (op0, offset, bitsize, bitpos, value, struct_align)
register rtx op0;
register int offset, bitsize, bitpos;
unsigned HOST_WIDE_INT offset, bitsize, bitpos;
register rtx value;
unsigned int struct_align;
{
register enum machine_mode mode;
int total_bits = BITS_PER_WORD;
unsigned int total_bits = BITS_PER_WORD;
rtx subtarget, temp;
int all_zero = 0;
int all_one = 0;
@@ -797,12 +806,12 @@ store_fixed_bit_field (op0, offset, bitsize, bitpos, value, struct_align)
static void
store_split_bit_field (op0, bitsize, bitpos, value, align)
rtx op0;
int bitsize, bitpos;
unsigned HOST_WIDE_INT bitsize, bitpos;
rtx value;
unsigned int align;
{
int unit;
int bitsdone = 0;
unsigned int unit;
unsigned int bitsdone = 0;
/* Make sure UNIT isn't larger than BITS_PER_WORD, we can only handle that
much at a time. */
@@ -831,10 +840,10 @@ store_split_bit_field (op0, bitsize, bitpos, value, align)
while (bitsdone < bitsize)
{
int thissize;
unsigned HOST_WIDE_INT thissize;
rtx part, word;
int thispos;
int offset;
unsigned HOST_WIDE_INT thispos;
unsigned HOST_WIDE_INT offset;
offset = (bitpos + bitsdone) / unit;
thispos = (bitpos + bitsdone) % unit;
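The `offset`/`thispos` arithmetic above is how `store_split_bit_field` walks a field that crosses storage-unit boundaries, handling at most one unit's worth of bits per iteration. A sketch of that chunking logic, with plain `unsigned long` standing in for `unsigned HOST_WIDE_INT` and an invented helper name:

```c
/* Count how many per-unit chunks a split bit-field write needs.
   Mirrors the loop structure: each pass covers the bits remaining
   in the current unit, never more than `unit` at a time.  */
static unsigned int
count_split_chunks (unsigned long bitsize, unsigned long bitpos,
                    unsigned long unit)
{
  unsigned long bitsdone = 0;
  unsigned int nchunks = 0;

  while (bitsdone < bitsize)
    {
      unsigned long offset = (bitpos + bitsdone) / unit;   /* which unit */
      unsigned long thispos = (bitpos + bitsdone) % unit;  /* bit within it */
      unsigned long thissize = unit - thispos;             /* room left */

      if (thissize > bitsize - bitsdone)
        thissize = bitsize - bitsdone;                     /* last chunk */

      (void) offset;   /* the real code uses this to address the word */
      bitsdone += thissize;
      nchunks++;
    }

  return nchunks;
}
```

For a 30-bit field starting at bit 5 of 32-bit units, the first pass covers 27 bits and the second the remaining 3.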
@@ -951,27 +960,28 @@ rtx
extract_bit_field (str_rtx, bitsize, bitnum, unsignedp,
target, mode, tmode, align, total_size)
rtx str_rtx;
register int bitsize;
int bitnum;
unsigned HOST_WIDE_INT bitsize;
unsigned HOST_WIDE_INT bitnum;
int unsignedp;
rtx target;
enum machine_mode mode, tmode;
unsigned int align;
int total_size;
HOST_WIDE_INT total_size;
{
int unit = (GET_CODE (str_rtx) == MEM) ? BITS_PER_UNIT : BITS_PER_WORD;
register int offset = bitnum / unit;
register int bitpos = bitnum % unit;
unsigned int unit
= (GET_CODE (str_rtx) == MEM) ? BITS_PER_UNIT : BITS_PER_WORD;
unsigned HOST_WIDE_INT offset = bitnum / unit;
unsigned HOST_WIDE_INT bitpos = bitnum % unit;
register rtx op0 = str_rtx;
rtx spec_target = target;
rtx spec_target_subreg = 0;
enum machine_mode int_mode;
#ifdef HAVE_extv
int extv_bitsize;
unsigned HOST_WIDE_INT extv_bitsize;
enum machine_mode extv_mode;
#endif
#ifdef HAVE_extzv
int extzv_bitsize;
unsigned HOST_WIDE_INT extzv_bitsize;
enum machine_mode extzv_mode;
#endif
@@ -1107,8 +1117,8 @@ extract_bit_field (str_rtx, bitsize, bitnum, unsignedp,
This is because the most significant word is the one which may
be less than full. */
int nwords = (bitsize + (BITS_PER_WORD - 1)) / BITS_PER_WORD;
int i;
unsigned int nwords = (bitsize + (BITS_PER_WORD - 1)) / BITS_PER_WORD;
unsigned int i;
if (target == 0 || GET_CODE (target) != REG)
target = gen_reg_rtx (mode);
@@ -1121,13 +1131,15 @@ extract_bit_field (str_rtx, bitsize, bitnum, unsignedp,
/* If I is 0, use the low-order word in both field and target;
if I is 1, use the next to lowest word; and so on. */
/* Word number in TARGET to use. */
int wordnum = (WORDS_BIG_ENDIAN
? GET_MODE_SIZE (GET_MODE (target)) / UNITS_PER_WORD - i - 1
: i);
unsigned int wordnum
= (WORDS_BIG_ENDIAN
? GET_MODE_SIZE (GET_MODE (target)) / UNITS_PER_WORD - i - 1
: i);
/* Offset from start of field in OP0. */
int bit_offset = (WORDS_BIG_ENDIAN
? MAX (0, bitsize - (i + 1) * BITS_PER_WORD)
: i * BITS_PER_WORD);
unsigned int bit_offset = (WORDS_BIG_ENDIAN
? MAX (0, ((int) bitsize - ((int) i + 1)
* BITS_PER_WORD))
: (int) i * BITS_PER_WORD);
rtx target_part = operand_subword (target, wordnum, 1, VOIDmode);
rtx result_part
= extract_bit_field (op0, MIN (BITS_PER_WORD,
@@ -1149,7 +1161,7 @@ extract_bit_field (str_rtx, bitsize, bitnum, unsignedp,
need to be zero'd out. */
if (GET_MODE_SIZE (GET_MODE (target)) > nwords * UNITS_PER_WORD)
{
int i,total_words;
unsigned int i, total_words;
total_words = GET_MODE_SIZE (GET_MODE (target)) / UNITS_PER_WORD;
for (i = nwords; i < total_words; i++)
@@ -1215,7 +1227,7 @@ extract_bit_field (str_rtx, bitsize, bitnum, unsignedp,
&& ! ((GET_CODE (op0) == REG || GET_CODE (op0) == SUBREG)
&& (bitsize + bitpos > extzv_bitsize)))
{
int xbitpos = bitpos, xoffset = offset;
unsigned HOST_WIDE_INT xbitpos = bitpos, xoffset = offset;
rtx bitsize_rtx, bitpos_rtx;
rtx last = get_last_insn ();
rtx xop0 = op0;
@@ -1258,7 +1270,7 @@ extract_bit_field (str_rtx, bitsize, bitnum, unsignedp,
if (bestmode == VOIDmode
|| (SLOW_UNALIGNED_ACCESS (bestmode, align)
&& GET_MODE_SIZE (bestmode) > (int) align))
&& GET_MODE_SIZE (bestmode) > align))
goto extzv_loses;
/* Compute offset as multiple of this unit,
@@ -1396,7 +1408,7 @@ extract_bit_field (str_rtx, bitsize, bitnum, unsignedp,
if (bestmode == VOIDmode
|| (SLOW_UNALIGNED_ACCESS (bestmode, align)
&& GET_MODE_SIZE (bestmode) > (int) align))
&& GET_MODE_SIZE (bestmode) > align))
goto extv_loses;
/* Compute offset as multiple of this unit,
@@ -1533,11 +1545,11 @@ extract_fixed_bit_field (tmode, op0, offset, bitsize, bitpos,
target, unsignedp, align)
enum machine_mode tmode;
register rtx op0, target;
register int offset, bitsize, bitpos;
unsigned HOST_WIDE_INT offset, bitsize, bitpos;
int unsignedp;
unsigned int align;
{
int total_bits = BITS_PER_WORD;
unsigned int total_bits = BITS_PER_WORD;
enum machine_mode mode;
if (GET_CODE (op0) == SUBREG || GET_CODE (op0) == REG)
@@ -1753,11 +1765,12 @@ lshift_value (mode, value, bitpos, bitsize)
static rtx
extract_split_bit_field (op0, bitsize, bitpos, unsignedp, align)
rtx op0;
int bitsize, bitpos, unsignedp;
unsigned HOST_WIDE_INT bitsize, bitpos;
int unsignedp;
unsigned int align;
{
int unit;
int bitsdone = 0;
unsigned int unit;
unsigned int bitsdone = 0;
rtx result = NULL_RTX;
int first = 1;
@@ -1770,10 +1783,10 @@ extract_split_bit_field (op0, bitsize, bitpos, unsignedp, align)
while (bitsdone < bitsize)
{
int thissize;
unsigned HOST_WIDE_INT thissize;
rtx part, word;
int thispos;
int offset;
unsigned HOST_WIDE_INT thispos;
unsigned HOST_WIDE_INT offset;
offset = (bitpos + bitsdone) / unit;
thispos = (bitpos + bitsdone) % unit;

Index: gcc/expr.c

@@ -143,12 +143,15 @@ static void clear_by_pieces_1 PARAMS ((rtx (*) (rtx, ...),
struct clear_by_pieces *));
static int is_zeros_p PARAMS ((tree));
static int mostly_zeros_p PARAMS ((tree));
static void store_constructor_field PARAMS ((rtx, int, int, enum machine_mode,
static void store_constructor_field PARAMS ((rtx, unsigned HOST_WIDE_INT,
HOST_WIDE_INT, enum machine_mode,
tree, tree, unsigned int, int));
static void store_constructor PARAMS ((tree, rtx, unsigned int, int, int));
static rtx store_field PARAMS ((rtx, int, int, enum machine_mode,
static void store_constructor PARAMS ((tree, rtx, unsigned int, int,
unsigned HOST_WIDE_INT));
static rtx store_field PARAMS ((rtx, HOST_WIDE_INT,
HOST_WIDE_INT, enum machine_mode,
tree, enum machine_mode, int,
unsigned int, int, int));
unsigned int, HOST_WIDE_INT, int));
static enum memory_use_mode
get_memory_usage_from_modifier PARAMS ((enum expand_modifier));
static tree save_noncopied_parts PARAMS ((tree, tree));
@@ -162,7 +165,8 @@ static rtx expand_increment PARAMS ((tree, int, int));
static void preexpand_calls PARAMS ((tree));
static void do_jump_by_parts_greater PARAMS ((tree, int, rtx, rtx));
static void do_jump_by_parts_equality PARAMS ((tree, rtx, rtx));
static void do_compare_and_jump PARAMS ((tree, enum rtx_code, enum rtx_code, rtx, rtx));
static void do_compare_and_jump PARAMS ((tree, enum rtx_code, enum rtx_code,
rtx, rtx));
static rtx do_store_flag PARAMS ((tree, rtx, enum machine_mode, int));
/* Record for each mode whether we can move a register directly to or
@@ -1368,7 +1372,7 @@ move_by_pieces (to, from, len, align)
{
struct move_by_pieces data;
rtx to_addr = XEXP (to, 0), from_addr = XEXP (from, 0);
int max_size = MOVE_MAX_PIECES + 1;
unsigned int max_size = MOVE_MAX_PIECES + 1;
enum machine_mode mode = VOIDmode, tmode;
enum insn_code icode;
@@ -1479,7 +1483,7 @@ move_by_pieces_ninsns (l, align)
unsigned int align;
{
register int n_insns = 0;
int max_size = MOVE_MAX + 1;
unsigned int max_size = MOVE_MAX + 1;
if (! SLOW_UNALIGNED_ACCESS (word_mode, align)
|| align > MOVE_MAX || align >= BIGGEST_ALIGNMENT / BITS_PER_UNIT)
@@ -1920,8 +1924,8 @@ emit_group_load (dst, orig_src, ssize, align)
for (i = start; i < XVECLEN (dst, 0); i++)
{
enum machine_mode mode = GET_MODE (XEXP (XVECEXP (dst, 0, i), 0));
int bytepos = INTVAL (XEXP (XVECEXP (dst, 0, i), 1));
int bytelen = GET_MODE_SIZE (mode);
HOST_WIDE_INT bytepos = INTVAL (XEXP (XVECEXP (dst, 0, i), 1));
unsigned int bytelen = GET_MODE_SIZE (mode);
int shift = 0;
/* Handle trailing fragments that run over the size of the struct. */
@@ -2050,9 +2054,9 @@ emit_group_store (orig_dst, src, ssize, align)
/* Process the pieces. */
for (i = start; i < XVECLEN (src, 0); i++)
{
int bytepos = INTVAL (XEXP (XVECEXP (src, 0, i), 1));
HOST_WIDE_INT bytepos = INTVAL (XEXP (XVECEXP (src, 0, i), 1));
enum machine_mode mode = GET_MODE (tmps[i]);
int bytelen = GET_MODE_SIZE (mode);
unsigned int bytelen = GET_MODE_SIZE (mode);
/* Handle trailing fragments that run over the size of the struct. */
if (ssize >= 0 && bytepos + bytelen > ssize)
@@ -2238,7 +2242,7 @@ clear_by_pieces (to, len, align)
{
struct clear_by_pieces data;
rtx to_addr = XEXP (to, 0);
int max_size = MOVE_MAX_PIECES + 1;
unsigned int max_size = MOVE_MAX_PIECES + 1;
enum machine_mode mode = VOIDmode, tmode;
enum insn_code icode;
@@ -2587,7 +2591,7 @@ emit_move_insn_1 (x, y)
enum machine_mode mode = GET_MODE (x);
enum machine_mode submode;
enum mode_class class = GET_MODE_CLASS (mode);
int i;
unsigned int i;
if (mode >= MAX_MACHINE_MODE)
abort ();
@@ -3323,8 +3327,7 @@ expand_assignment (to, from, want_value, suggest_reg)
|| TREE_CODE (to) == ARRAY_REF)
{
enum machine_mode mode1;
int bitsize;
int bitpos;
HOST_WIDE_INT bitsize, bitpos;
tree offset;
int unsignedp;
int volatilep = 0;
@@ -4051,7 +4054,8 @@ static void
store_constructor_field (target, bitsize, bitpos,
mode, exp, type, align, cleared)
rtx target;
int bitsize, bitpos;
unsigned HOST_WIDE_INT bitsize;
HOST_WIDE_INT bitpos;
enum machine_mode mode;
tree exp, type;
unsigned int align;
@@ -4095,7 +4099,7 @@ store_constructor (exp, target, align, cleared, size)
rtx target;
unsigned int align;
int cleared;
int size;
unsigned HOST_WIDE_INT size;
{
tree type = TREE_TYPE (exp);
#ifdef WORD_REGISTER_OPERATIONS
@@ -4175,10 +4179,10 @@ store_constructor (exp, target, align, cleared, size)
tree value = TREE_VALUE (elt);
#endif
register enum machine_mode mode;
int bitsize;
int bitpos = 0;
HOST_WIDE_INT bitsize;
HOST_WIDE_INT bitpos = 0;
int unsignedp;
tree pos, constant = 0, offset = 0;
tree offset;
rtx to_rtx = target;
/* Just ignore missing fields.
@@ -4190,8 +4194,8 @@ store_constructor (exp, target, align, cleared, size)
if (cleared && is_zeros_p (TREE_VALUE (elt)))
continue;
if (TREE_CODE (DECL_SIZE (field)) == INTEGER_CST)
bitsize = TREE_INT_CST_LOW (DECL_SIZE (field));
if (host_integerp (DECL_SIZE (field), 1))
bitsize = tree_low_cst (DECL_SIZE (field), 1);
else
bitsize = -1;
@@ -4200,18 +4204,16 @@ store_constructor (exp, target, align, cleared, size)
if (DECL_BIT_FIELD (field))
mode = VOIDmode;
pos = DECL_FIELD_BITPOS (field);
if (TREE_CODE (pos) == INTEGER_CST)
constant = pos;
else if (TREE_CODE (pos) == PLUS_EXPR
&& TREE_CODE (TREE_OPERAND (pos, 1)) == INTEGER_CST)
constant = TREE_OPERAND (pos, 1), offset = TREE_OPERAND (pos, 0);
offset = DECL_FIELD_OFFSET (field);
if (host_integerp (offset, 0)
&& host_integerp (bit_position (field), 0))
{
bitpos = int_bit_position (field);
offset = 0;
}
else
offset = pos;
if (constant)
bitpos = TREE_INT_CST_LOW (constant);
bitpos = tree_low_cst (DECL_FIELD_BIT_OFFSET (field), 0);
if (offset)
{
rtx offset_rtx;
@@ -4220,8 +4222,7 @@ store_constructor (exp, target, align, cleared, size)
offset = build (WITH_RECORD_EXPR, bitsizetype,
offset, make_tree (TREE_TYPE (exp), target));
offset = size_binop (EXACT_DIV_EXPR, offset,
bitsize_int (BITS_PER_UNIT));
offset = size_binop (EXACT_DIV_EXPR, offset, bitsize_unit_node);
offset = convert (sizetype, offset);
offset_rtx = expand_expr (offset, NULL_RTX, VOIDmode, 0);
@@ -4257,8 +4258,7 @@ store_constructor (exp, target, align, cleared, size)
start of a word, try to widen it to a full word.
This special case allows us to output C++ member function
initializations in a form that the optimizers can understand. */
if (constant
&& GET_CODE (target) == REG
if (GET_CODE (target) == REG
&& bitsize < BITS_PER_WORD
&& bitpos % BITS_PER_WORD == 0
&& GET_MODE_CLASS (mode) == MODE_INT
@@ -4707,13 +4707,14 @@ static rtx
store_field (target, bitsize, bitpos, mode, exp, value_mode,
unsignedp, align, total_size, alias_set)
rtx target;
int bitsize, bitpos;
HOST_WIDE_INT bitsize;
HOST_WIDE_INT bitpos;
enum machine_mode mode;
tree exp;
enum machine_mode value_mode;
int unsignedp;
unsigned int align;
int total_size;
HOST_WIDE_INT total_size;
int alias_set;
{
HOST_WIDE_INT width_mask = 0;
@@ -4929,25 +4930,29 @@ tree
get_inner_reference (exp, pbitsize, pbitpos, poffset, pmode,
punsignedp, pvolatilep, palignment)
tree exp;
int *pbitsize;
int *pbitpos;
HOST_WIDE_INT *pbitsize;
HOST_WIDE_INT *pbitpos;
tree *poffset;
enum machine_mode *pmode;
int *punsignedp;
int *pvolatilep;
unsigned int *palignment;
{
tree orig_exp = exp;
tree size_tree = 0;
enum machine_mode mode = VOIDmode;
tree offset = size_zero_node;
tree bit_offset = bitsize_zero_node;
unsigned int alignment = BIGGEST_ALIGNMENT;
tree tem;
/* First get the mode, signedness, and size. We do this from just the
outermost expression. */
if (TREE_CODE (exp) == COMPONENT_REF)
{
size_tree = DECL_SIZE (TREE_OPERAND (exp, 1));
if (! DECL_BIT_FIELD (TREE_OPERAND (exp, 1)))
mode = DECL_MODE (TREE_OPERAND (exp, 1));
*punsignedp = TREE_UNSIGNED (TREE_OPERAND (exp, 1));
}
else if (TREE_CODE (exp) == BIT_FIELD_REF)
@@ -4958,122 +4963,71 @@ get_inner_reference (exp, pbitsize, pbitpos, poffset, pmode,
else
{
mode = TYPE_MODE (TREE_TYPE (exp));
*punsignedp = TREE_UNSIGNED (TREE_TYPE (exp));
if (mode == BLKmode)
size_tree = TYPE_SIZE (TREE_TYPE (exp));
*pbitsize = GET_MODE_BITSIZE (mode);
*punsignedp = TREE_UNSIGNED (TREE_TYPE (exp));
else
*pbitsize = GET_MODE_BITSIZE (mode);
}
if (size_tree)
if (size_tree != 0)
{
if (TREE_CODE (size_tree) != INTEGER_CST)
if (! host_integerp (size_tree, 1))
mode = BLKmode, *pbitsize = -1;
else
*pbitsize = TREE_INT_CST_LOW (size_tree);
*pbitsize = tree_low_cst (size_tree, 1);
}
/* Compute cumulative bit-offset for nested component-refs and array-refs,
and find the ultimate containing object. */
*pbitpos = 0;
while (1)
{
if (TREE_CODE (exp) == COMPONENT_REF || TREE_CODE (exp) == BIT_FIELD_REF)
if (TREE_CODE (exp) == BIT_FIELD_REF)
bit_offset = size_binop (PLUS_EXPR, bit_offset, TREE_OPERAND (exp, 2));
else if (TREE_CODE (exp) == COMPONENT_REF)
{
tree pos = (TREE_CODE (exp) == COMPONENT_REF
? DECL_FIELD_BITPOS (TREE_OPERAND (exp, 1))
: TREE_OPERAND (exp, 2));
tree constant = bitsize_int (0), var = pos;
tree field = TREE_OPERAND (exp, 1);
tree this_offset = DECL_FIELD_OFFSET (field);
/* If this field hasn't been filled in yet, don't go
past it. This should only happen when folding expressions
made during type construction. */
if (pos == 0)
if (this_offset == 0)
break;
else if (! TREE_CONSTANT (this_offset)
&& contains_placeholder_p (this_offset))
this_offset = build (WITH_RECORD_EXPR, sizetype, this_offset, exp);
/* Assume here that the offset is a multiple of a unit.
If not, there should be an explicitly added constant. */
if (TREE_CODE (pos) == PLUS_EXPR
&& TREE_CODE (TREE_OPERAND (pos, 1)) == INTEGER_CST)
constant = TREE_OPERAND (pos, 1), var = TREE_OPERAND (pos, 0);
else if (TREE_CODE (pos) == INTEGER_CST)
constant = pos, var = bitsize_int (0);
offset = size_binop (PLUS_EXPR, offset, DECL_FIELD_OFFSET (field));
bit_offset = size_binop (PLUS_EXPR, bit_offset,
DECL_FIELD_BIT_OFFSET (field));
*pbitpos += TREE_INT_CST_LOW (constant);
offset
= size_binop (PLUS_EXPR, offset,
convert (sizetype,
size_binop (EXACT_DIV_EXPR, var,
bitsize_int (BITS_PER_UNIT))));
if (! host_integerp (offset, 0))
alignment = MIN (alignment, DECL_OFFSET_ALIGN (field));
}
else if (TREE_CODE (exp) == ARRAY_REF)
{
/* This code is based on the code in case ARRAY_REF in expand_expr
below. We assume here that the size of an array element is
always an integral multiple of BITS_PER_UNIT. */
tree index = TREE_OPERAND (exp, 1);
tree domain = TYPE_DOMAIN (TREE_TYPE (TREE_OPERAND (exp, 0)));
tree low_bound
= domain ? TYPE_MIN_VALUE (domain) : integer_zero_node;
tree index_type = TREE_TYPE (index);
tree xindex;
tree low_bound = (domain ? TYPE_MIN_VALUE (domain) : 0);
if (TYPE_PRECISION (index_type) != TYPE_PRECISION (sizetype))
{
index = convert (type_for_size (TYPE_PRECISION (sizetype), 0),
index);
index_type = TREE_TYPE (index);
}
/* We assume all arrays have sizes that are a multiple of a byte.
First subtract the lower bound, if any, in the type of the
index, then convert to sizetype and multiply by the size of the
array element. */
if (low_bound != 0 && ! integer_zerop (low_bound))
index = fold (build (MINUS_EXPR, TREE_TYPE (index),
index, low_bound));
/* Optimize the special-case of a zero lower bound.
We convert the low_bound to sizetype to avoid some problems
with constant folding. (E.g. suppose the lower bound is 1,
and its mode is QI. Without the conversion, (ARRAY
+(INDEX-(unsigned char)1)) becomes ((ARRAY+(-(unsigned char)1))
+INDEX), which becomes (ARRAY+255+INDEX). Oops!)
But sizetype isn't quite right either (especially if
the lowbound is negative). FIXME */
if (! TREE_CONSTANT (index)
&& contains_placeholder_p (index))
index = build (WITH_RECORD_EXPR, TREE_TYPE (index), index, exp);
if (! integer_zerop (low_bound))
index = fold (build (MINUS_EXPR, index_type, index,
convert (sizetype, low_bound)));
if (TREE_CODE (index) == INTEGER_CST)
{
index = convert (sbitsizetype, index);
index_type = TREE_TYPE (index);
}
xindex = fold (build (MULT_EXPR, sbitsizetype, index,
convert (sbitsizetype,
TYPE_SIZE (TREE_TYPE (exp)))));
if (TREE_CODE (xindex) == INTEGER_CST
&& TREE_INT_CST_HIGH (xindex) == 0)
*pbitpos += TREE_INT_CST_LOW (xindex);
else
{
/* Either the bit offset calculated above is not constant, or
it overflowed. In either case, redo the multiplication
against the size in units. This is especially important
in the non-constant case to avoid a division at runtime. */
xindex
= fold (build (MULT_EXPR, ssizetype, index,
convert (ssizetype,
TYPE_SIZE_UNIT (TREE_TYPE (exp)))));
if (contains_placeholder_p (xindex))
xindex = build (WITH_RECORD_EXPR, ssizetype, xindex, exp);
offset
= size_binop (PLUS_EXPR, offset, convert (sizetype, xindex));
}
offset = size_binop (PLUS_EXPR, offset,
size_binop (MULT_EXPR,
convert (sizetype, index),
TYPE_SIZE_UNIT (TREE_TYPE (exp))));
}
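The rewritten ARRAY_REF case accumulates the offset in bytes: the lower-bound-adjusted index is multiplied by `TYPE_SIZE_UNIT`, avoiding the old bit-based product that later required dividing by `BITS_PER_UNIT` (potentially at run time). A sketch with host integers in place of the tree arithmetic (the helper name is invented):

```c
/* Byte offset of element `index` in an array whose elements occupy
   elt_size_units bytes and whose domain starts at low_bound.
   Mirrors: offset += (index - low_bound) * TYPE_SIZE_UNIT.  */
static long
array_ref_byte_offset (long index, long low_bound, long elt_size_units)
{
  return (index - low_bound) * elt_size_units;
}
```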
else if (TREE_CODE (exp) != NON_LVALUE_EXPR
&& ! ((TREE_CODE (exp) == NOP_EXPR
@@ -5088,7 +5042,7 @@ get_inner_reference (exp, pbitsize, pbitpos, poffset, pmode,
/* If the offset is non-constant already, then we can't assume any
alignment more than the alignment here. */
if (! integer_zerop (offset))
if (! TREE_CONSTANT (offset))
alignment = MIN (alignment, TYPE_ALIGN (TREE_TYPE (exp)));
exp = TREE_OPERAND (exp, 0);
@@ -5099,19 +5053,24 @@ get_inner_reference (exp, pbitsize, pbitpos, poffset, pmode,
else if (TREE_TYPE (exp) != 0)
alignment = MIN (alignment, TYPE_ALIGN (TREE_TYPE (exp)));
if (integer_zerop (offset))
offset = 0;
if (offset != 0 && contains_placeholder_p (offset))
offset = build (WITH_RECORD_EXPR, sizetype, offset, orig_exp);
/* If OFFSET is constant, see if we can return the whole thing as a
constant bit position. Otherwise, split it up. */
if (host_integerp (offset, 0)
&& 0 != (tem = size_binop (MULT_EXPR, convert (bitsizetype, offset),
bitsize_unit_node))
&& 0 != (tem = size_binop (PLUS_EXPR, tem, bit_offset))
&& host_integerp (tem, 0))
*pbitpos = tree_low_cst (tem, 0), *poffset = 0;
else
*pbitpos = tree_low_cst (bit_offset, 0), *poffset = offset;
*pmode = mode;
*poffset = offset;
*palignment = alignment / BITS_PER_UNIT;
return exp;
}
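The new ending of `get_inner_reference` makes a single decision: if the byte offset is a compile-time constant and `offset * BITS_PER_UNIT + bit_offset` still fits in a host integer, fold everything into `*pbitpos` and clear `*poffset`; otherwise return the two pieces separately. A rough model of that split, with `long` standing in for the tree computation (the helper and its signature are hypothetical):

```c
#include <limits.h>

#define BITS_PER_UNIT 8

/* Fold a (byte_offset, bit_offset) pair into one constant bit
   position when it is safe; otherwise keep the pieces apart.
   On return, *phave_byteoff says whether *pbyteoff is live.  */
static void
split_position (long byte_offset, int offset_is_constant, long bit_offset,
                long *pbitpos, long *pbyteoff, int *phave_byteoff)
{
  if (offset_is_constant
      && byte_offset <= LONG_MAX / BITS_PER_UNIT)   /* overflow guard */
    {
      *pbitpos = byte_offset * BITS_PER_UNIT + bit_offset;
      *pbyteoff = 0;
      *phave_byteoff = 0;
    }
  else
    {
      *pbitpos = bit_offset;
      *pbyteoff = byte_offset;
      *phave_byteoff = 1;
    }
}
```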
/* Subroutine of expand_exp: compute memory_usage from modifier. */
static enum memory_use_mode
get_memory_usage_from_modifier (modifier)
enum expand_modifier modifier;
@@ -6615,8 +6574,7 @@ expand_expr (exp, target, tmode, modifier)
{
enum machine_mode mode1;
int bitsize;
int bitpos;
HOST_WIDE_INT bitsize, bitpos;
tree offset;
int volatilep = 0;
unsigned int alignment;
@@ -8616,8 +8574,7 @@ expand_expr_unaligned (exp, palign)
{
enum machine_mode mode1;
int bitsize;
int bitpos;
HOST_WIDE_INT bitsize, bitpos;
tree offset;
int volatilep = 0;
unsigned int alignment;
@@ -9350,7 +9307,8 @@ do_jump (exp, if_false_label, if_true_label)
case BIT_FIELD_REF:
case ARRAY_REF:
{
int bitsize, bitpos, unsignedp;
HOST_WIDE_INT bitsize, bitpos;
int unsignedp;
enum machine_mode mode;
tree type;
tree offset;

Index: gcc/expr.h

@@ -1108,7 +1108,8 @@ extern rtx label_rtx PARAMS ((tree));
#endif
/* Indicate how an input argument register was promoted. */
extern rtx promoted_input_arg PARAMS ((int, enum machine_mode *, int *));
extern rtx promoted_input_arg PARAMS ((unsigned int, enum machine_mode *,
int *));
/* Return an rtx like arg but sans any constant terms.
Returns the original rtx if it has no constant terms.
@@ -1206,11 +1207,14 @@ extern rtx hard_libcall_value PARAMS ((enum machine_mode));
of STACK_BOUNDARY / BITS_PER_UNIT. */
extern rtx round_push PARAMS ((rtx));
extern rtx store_bit_field PARAMS ((rtx, int, int, enum machine_mode, rtx,
unsigned int, int));
extern rtx extract_bit_field PARAMS ((rtx, int, int, int, rtx,
extern rtx store_bit_field PARAMS ((rtx, unsigned HOST_WIDE_INT,
unsigned HOST_WIDE_INT,
enum machine_mode, rtx,
unsigned int, HOST_WIDE_INT));
extern rtx extract_bit_field PARAMS ((rtx, unsigned HOST_WIDE_INT,
unsigned HOST_WIDE_INT, int, rtx,
enum machine_mode, enum machine_mode,
unsigned int, int));
unsigned int, HOST_WIDE_INT));
extern rtx expand_mult PARAMS ((enum machine_mode, rtx, rtx, rtx, int));
extern rtx expand_mult_add PARAMS ((rtx, rtx, rtx, rtx,enum machine_mode, int));
extern rtx expand_mult_highpart_adjust PARAMS ((enum machine_mode, rtx, rtx, rtx, rtx, int));
@@ -1240,5 +1244,5 @@ extern void do_jump_by_parts_greater_rtx PARAMS ((enum machine_mode,
#ifdef TREE_CODE /* Don't lose if tree.h not included. */
extern void mark_seen_cases PARAMS ((tree, unsigned char *,
long, int));
HOST_WIDE_INT, int));
#endif

Index: gcc/f/ChangeLog

@@ -1,3 +1,8 @@
Sat Mar 25 09:12:10 2000 Richard Kenner <kenner@vlsi1.ultra.nyu.edu>
* com.c (ffecom_tree_canonize_ptr_): Use bitsize_zero_node.
(ffecom_tree_canonize_ref_): Likewise.
Mon Mar 20 15:49:40 2000 Jim Wilson <wilson@cygnus.com>
* f/target.h (FFETARGET_32bit_longs): New. Define for alpha, sparc64,

Index: gcc/f/com.c

@@ -9097,7 +9097,7 @@ ffecom_tree_canonize_ptr_ (tree *decl, tree *offset,
case PARM_DECL:
*decl = t;
*offset = bitsize_int (0);
*offset = bitsize_zero_node;
break;
case ADDR_EXPR:
@@ -9105,7 +9105,7 @@
{
/* A reference to COMMON. */
*decl = TREE_OPERAND (t, 0);
*offset = bitsize_int (0);
*offset = bitsize_zero_node;
break;
}
/* Fall through. */
@@ -9226,7 +9226,7 @@ ffecom_tree_canonize_ref_ (tree *decl, tree *offset,
case VAR_DECL:
case PARM_DECL:
*decl = t;
*offset = bitsize_int (0);
*offset = bitsize_zero_node;
*size = TYPE_SIZE (TREE_TYPE (t));
return;

Index: gcc/flow.c

@@ -2579,7 +2579,7 @@ verify_wide_reg_1 (px, pregno)
void *pregno;
{
rtx x = *px;
int regno = *(int *) pregno;
unsigned int regno = *(int *) pregno;
if (GET_CODE (x) == REG && REGNO (x) == regno)
{

Index: gcc/fold-const.c

@@ -80,7 +80,8 @@ static tree distribute_bit_expr PARAMS ((enum tree_code, tree, tree, tree));
static tree make_bit_field_ref PARAMS ((tree, tree, int, int, int));
static tree optimize_bit_field_compare PARAMS ((enum tree_code, tree,
tree, tree));
static tree decode_field_reference PARAMS ((tree, int *, int *,
static tree decode_field_reference PARAMS ((tree, HOST_WIDE_INT *,
HOST_WIDE_INT *,
enum machine_mode *, int *,
int *, tree *, tree *));
static int all_ones_mask_p PARAMS ((tree, int));
@@ -1491,18 +1492,15 @@ int_const_binop (code, arg1, arg2, notrunc, forsize)
/* It's unclear from the C standard whether shifts can overflow.
The following code ignores overflow; perhaps a C standard
interpretation ruling is needed. */
lshift_double (int1l, int1h, int2l,
TYPE_PRECISION (TREE_TYPE (arg1)),
&low, &hi,
!uns);
lshift_double (int1l, int1h, int2l, TYPE_PRECISION (TREE_TYPE (arg1)),
&low, &hi, !uns);
no_overflow = 1;
break;
case RROTATE_EXPR:
int2l = - int2l;
case LROTATE_EXPR:
lrotate_double (int1l, int1h, int2l,
TYPE_PRECISION (TREE_TYPE (arg1)),
lrotate_double (int1l, int1h, int2l, TYPE_PRECISION (TREE_TYPE (arg1)),
&low, &hi);
break;
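The fall-through from `RROTATE_EXPR` into the `LROTATE_EXPR` code works because rotating right by `n` is the same as rotating left by `-n`, taken modulo the precision. A 32-bit sketch of that identity (helper names are invented):

```c
/* Left-rotate a 32-bit value; a negative count rotates right.
   The count is reduced modulo 32 first, so the shift amounts
   stay in range and a zero count is a no-op.  */
static unsigned int
lrotate32 (unsigned int x, long count)
{
  unsigned int c = (unsigned int) (((count % 32) + 32) % 32);
  return c == 0 ? x : (x << c) | (x >> (32 - c));
}

/* Right rotate: negate the count and reuse the left-rotate,
   the same trick as the RROTATE_EXPR fall-through above.  */
static unsigned int
rrotate32 (unsigned int x, long count)
{
  return lrotate32 (x, -count);
}
```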
@@ -1599,7 +1597,7 @@
abort ();
}
if (forsize && hi == 0 && low < 1000)
if (forsize && hi == 0 && low < 10000)
return size_int_type_wide (low, TREE_TYPE (arg1));
else
{
@@ -1850,7 +1848,7 @@ size_int_type_wide (number, type)
tree type;
{
/* Type-size nodes already made for small sizes. */
static tree size_table[2 * HOST_BITS_PER_WIDE_INT + 1];
static tree size_table[2048 + 1];
static int init_p = 0;
tree t;
@@ -1864,8 +1862,7 @@
/* If this is a positive number that fits in the table we use to hold
cached entries, see if it is already in the table and put it there
if not. */
if (number >= 0
&& number < (int) (sizeof size_table / sizeof size_table[0]) / 2)
if (number >= 0 && number < (int) (sizeof size_table / sizeof size_table[0]))
{
if (size_table[number] != 0)
for (t = size_table[number]; t != 0; t = TREE_CHAIN (t))
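`size_table` interns small non-negative size constants so repeated requests for the same value share one node; the hunk widens the cache to 2048 entries and drops the stale `/ 2` from the bounds check. A sketch of the interning idea, with heap-allocated ints standing in for `INTEGER_CST` nodes and an invented function name:

```c
#include <stdlib.h>

#define CACHE_LIMIT 2048

static int *size_cache[CACHE_LIMIT];   /* zero-initialized */

/* Return a node for `number`, sharing one node per small value.
   Large or negative sizes fall through to a fresh allocation.  */
static int *
get_size_node (long number)
{
  int *t;

  if (number >= 0 && number < CACHE_LIMIT)
    {
      if (size_cache[number] == 0)
        {
          t = malloc (sizeof (int));
          *t = (int) number;
          size_cache[number] = t;
        }
      return size_cache[number];       /* shared, cached node */
    }

  t = malloc (sizeof (int));           /* uncached: fresh node */
  *t = (int) number;
  return t;
}
```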
@@ -2021,7 +2018,7 @@ fold_convert (t, arg1)
/* If we are trying to make a sizetype for a small integer, use
size_int to pick up cached types to reduce duplicate nodes. */
if (TREE_CODE (type) == INTEGER_CST && TYPE_IS_SIZETYPE (type)
&& compare_tree_int (arg1, 1000) < 0)
&& compare_tree_int (arg1, 10000) < 0)
return size_int_type_wide (TREE_INT_CST_LOW (arg1), type);
/* Given an integer constant, make new constant with new type,
@@ -2432,7 +2429,7 @@ operand_equal_for_comparison_p (arg0, arg1, other)
{
int unsignedp1, unsignedpo;
tree primarg0, primarg1, primother;
unsigned correct_width;
unsigned int correct_width;
if (operand_equal_p (arg0, arg1, 0))
return 1;
@@ -2909,14 +2906,14 @@ optimize_bit_field_compare (code, compare_type, lhs, rhs)
tree compare_type;
tree lhs, rhs;
{
int lbitpos, lbitsize, rbitpos, rbitsize, nbitpos, nbitsize;
HOST_WIDE_INT lbitpos, lbitsize, rbitpos, rbitsize, nbitpos, nbitsize;
tree type = TREE_TYPE (lhs);
tree signed_type, unsigned_type;
int const_p = TREE_CODE (rhs) == INTEGER_CST;
enum machine_mode lmode, rmode, nmode;
int lunsignedp, runsignedp;
int lvolatilep = 0, rvolatilep = 0;
int alignment;
unsigned int alignment;
tree linner, rinner = NULL_TREE;
tree mask;
tree offset;
@@ -3085,7 +3082,7 @@ static tree
decode_field_reference (exp, pbitsize, pbitpos, pmode, punsignedp,
pvolatilep, pmask, pand_mask)
tree exp;
int *pbitsize, *pbitpos;
HOST_WIDE_INT *pbitsize, *pbitpos;
enum machine_mode *pmode;
int *punsignedp, *pvolatilep;
tree *pmask;
@@ -3094,8 +3091,8 @@ decode_field_reference (exp, pbitsize, pbitpos, pmode, punsignedp,
tree and_mask = 0;
tree mask, inner, offset;
tree unsigned_type;
int precision;
int alignment;
unsigned int precision;
unsigned int alignment;
/* All the optimizations using this function assume integer fields.
There are problems with FP fields since the type_for_size call
@@ -3151,7 +3148,7 @@ all_ones_mask_p (mask, size)
int size;
{
tree type = TREE_TYPE (mask);
int precision = TYPE_PRECISION (type);
unsigned int precision = TYPE_PRECISION (type);
tree tmask;
tmask = build_int_2 (~0, ~0);
@@ -3893,10 +3890,10 @@ fold_truthop (code, truth_type, lhs, rhs)
enum tree_code lcode, rcode;
tree ll_arg, lr_arg, rl_arg, rr_arg;
tree ll_inner, lr_inner, rl_inner, rr_inner;
int ll_bitsize, ll_bitpos, lr_bitsize, lr_bitpos;
int rl_bitsize, rl_bitpos, rr_bitsize, rr_bitpos;
int xll_bitpos, xlr_bitpos, xrl_bitpos, xrr_bitpos;
int lnbitsize, lnbitpos, rnbitsize, rnbitpos;
HOST_WIDE_INT ll_bitsize, ll_bitpos, lr_bitsize, lr_bitpos;
HOST_WIDE_INT rl_bitsize, rl_bitpos, rr_bitsize, rr_bitpos;
HOST_WIDE_INT xll_bitpos, xlr_bitpos, xrl_bitpos, xrr_bitpos;
HOST_WIDE_INT lnbitsize, lnbitpos, rnbitsize, rnbitpos;
int ll_unsignedp, lr_unsignedp, rl_unsignedp, rr_unsignedp;
enum machine_mode ll_mode, lr_mode, rl_mode, rr_mode;
enum machine_mode lnmode, rnmode;
@@ -5042,17 +5039,17 @@ fold (expr)
int inside_int = INTEGRAL_TYPE_P (inside_type);
int inside_ptr = POINTER_TYPE_P (inside_type);
int inside_float = FLOAT_TYPE_P (inside_type);
int inside_prec = TYPE_PRECISION (inside_type);
unsigned int inside_prec = TYPE_PRECISION (inside_type);
int inside_unsignedp = TREE_UNSIGNED (inside_type);
int inter_int = INTEGRAL_TYPE_P (inter_type);
int inter_ptr = POINTER_TYPE_P (inter_type);
int inter_float = FLOAT_TYPE_P (inter_type);
int inter_prec = TYPE_PRECISION (inter_type);
unsigned int inter_prec = TYPE_PRECISION (inter_type);
int inter_unsignedp = TREE_UNSIGNED (inter_type);
int final_int = INTEGRAL_TYPE_P (final_type);
int final_ptr = POINTER_TYPE_P (final_type);
int final_float = FLOAT_TYPE_P (final_type);
int final_prec = TYPE_PRECISION (final_type);
unsigned int final_prec = TYPE_PRECISION (final_type);
int final_unsignedp = TREE_UNSIGNED (final_type);
/* In addition to the cases of two conversions in a row
@@ -5690,7 +5687,9 @@ fold (expr)
if (TREE_CODE (arg0) == INTEGER_CST && TREE_CODE (arg1) == NOP_EXPR
&& TREE_UNSIGNED (TREE_TYPE (TREE_OPERAND (arg1, 0))))
{
int prec = TYPE_PRECISION (TREE_TYPE (TREE_OPERAND (arg1, 0)));
unsigned int prec
= TYPE_PRECISION (TREE_TYPE (TREE_OPERAND (arg1, 0)));
if (prec < BITS_PER_WORD && prec < HOST_BITS_PER_WIDE_INT
&& (~TREE_INT_CST_LOW (arg0)
& (((HOST_WIDE_INT) 1 << prec) - 1)) == 0)
@@ -5699,7 +5698,9 @@ fold (expr)
if (TREE_CODE (arg1) == INTEGER_CST && TREE_CODE (arg0) == NOP_EXPR
&& TREE_UNSIGNED (TREE_TYPE (TREE_OPERAND (arg0, 0))))
{
int prec = TYPE_PRECISION (TREE_TYPE (TREE_OPERAND (arg0, 0)));
unsigned int prec
= TYPE_PRECISION (TREE_TYPE (TREE_OPERAND (arg0, 0)));
if (prec < BITS_PER_WORD && prec < HOST_BITS_PER_WIDE_INT
&& (~TREE_INT_CST_LOW (arg1)
& (((HOST_WIDE_INT) 1 << prec) - 1)) == 0)
@@ -6108,7 +6109,7 @@ fold (expr)
(TREE_OPERAND
(TREE_OPERAND (varop, 0), 1)));
tree mask, unsigned_type;
int precision;
unsigned int precision;
tree folded_compare;
/* First check whether the comparison would come out
@@ -6165,7 +6166,7 @@ fold (expr)
(TREE_OPERAND
(TREE_OPERAND (varop, 0), 1)));
tree mask, unsigned_type;
int precision;
unsigned int precision;
tree folded_compare;
if (constopnum == 0)


@@ -247,7 +247,8 @@ static rtx assign_stack_temp_for_type PARAMS ((enum machine_mode,
static struct temp_slot *find_temp_slot_from_address PARAMS ((rtx));
static void put_reg_into_stack PARAMS ((struct function *, rtx, tree,
enum machine_mode, enum machine_mode,
int, int, int, struct hash_table *));
int, unsigned int, int,
struct hash_table *));
static void fixup_var_refs PARAMS ((rtx, enum machine_mode, int,
struct hash_table *));
static struct fixup_replacement
@@ -262,7 +263,7 @@ static rtx fixup_stack_1 PARAMS ((rtx, rtx));
static void optimize_bit_field PARAMS ((rtx, rtx, rtx *));
static void instantiate_decls PARAMS ((tree, int));
static void instantiate_decls_1 PARAMS ((tree, int));
static void instantiate_decl PARAMS ((rtx, int, int));
static void instantiate_decl PARAMS ((rtx, HOST_WIDE_INT, int));
static int instantiate_virtual_regs_1 PARAMS ((rtx *, rtx, int));
static void delete_handlers PARAMS ((void));
static void pad_to_arg_alignment PARAMS ((struct args_size *, int,
@@ -1451,19 +1452,20 @@ put_reg_into_stack (function, reg, type, promoted_mode, decl_mode, volatile_p,
tree type;
enum machine_mode promoted_mode, decl_mode;
int volatile_p;
int original_regno;
unsigned int original_regno;
int used_p;
struct hash_table *ht;
{
struct function *func = function ? function : cfun;
rtx new = 0;
int regno = original_regno;
unsigned int regno = original_regno;
if (regno == 0)
regno = REGNO (reg);
if (regno < func->x_max_parm_reg)
new = func->x_parm_reg_stack_loc[regno];
if (new == 0)
new = assign_stack_local_1 (decl_mode, GET_MODE_SIZE (decl_mode), 0, func);
@@ -3328,7 +3330,7 @@ instantiate_virtual_regs (fndecl, insns)
rtx insns;
{
rtx insn;
int i;
unsigned int i;
/* Compute the offsets to use for this function. */
in_arg_offset = FIRST_PARM_OFFSET (fndecl);
@@ -3446,7 +3448,7 @@ instantiate_decls_1 (let, valid_only)
static void
instantiate_decl (x, size, valid_only)
rtx x;
int size;
HOST_WIDE_INT size;
int valid_only;
{
enum machine_mode mode;
@@ -3476,21 +3478,23 @@ instantiate_decl (x, size, valid_only)
instantiate_virtual_regs_1 (&addr, NULL_RTX, 0);
if (valid_only)
if (valid_only && size >= 0)
{
unsigned HOST_WIDE_INT decl_size = size;
/* Now verify that the resulting address is valid for every integer or
floating-point mode up to and including SIZE bytes long. We do this
since the object might be accessed in any mode and frame addresses
are shared. */
for (mode = GET_CLASS_NARROWEST_MODE (MODE_INT);
mode != VOIDmode && GET_MODE_SIZE (mode) <= size;
mode != VOIDmode && GET_MODE_SIZE (mode) <= decl_size;
mode = GET_MODE_WIDER_MODE (mode))
if (! memory_address_p (mode, addr))
return;
for (mode = GET_CLASS_NARROWEST_MODE (MODE_FLOAT);
mode != VOIDmode && GET_MODE_SIZE (mode) <= size;
mode != VOIDmode && GET_MODE_SIZE (mode) <= decl_size;
mode = GET_MODE_WIDER_MODE (mode))
if (! memory_address_p (mode, addr))
return;
@@ -4523,7 +4527,7 @@ assign_parms (fndecl)
may need to do it in a wider mode. */
register rtx parmreg;
int regno, regnoi = 0, regnor = 0;
unsigned int regno, regnoi = 0, regnor = 0;
unsignedp = TREE_UNSIGNED (TREE_TYPE (parm));
@@ -4917,7 +4921,7 @@ assign_parms (fndecl)
rtx
promoted_input_arg (regno, pmode, punsignedp)
int regno;
unsigned int regno;
enum machine_mode *pmode;
int *punsignedp;
{


@@ -399,7 +399,7 @@ struct function
/* 1 + last pseudo register number possibly used for loading a copy
of a parameter of this function. */
int x_max_parm_reg;
unsigned int x_max_parm_reg;
/* Vector indexed by REGNO, containing location on stack in which
to put the parm which is nominally in pseudo register REGNO,


@@ -399,7 +399,7 @@ static rtx *cuid_insn;
/* Maximum register number in function prior to doing gcse + 1.
Registers created during this pass have regno >= max_gcse_regno.
This is named with "gcse" to not collide with global of same name. */
static int max_gcse_regno;
static unsigned int max_gcse_regno;
/* Maximum number of cse-able expressions found. */
static int n_exprs;
@@ -519,9 +519,9 @@ struct null_pointer_info
/* The basic block being processed. */
int current_block;
/* The first register to be handled in this pass. */
int min_reg;
unsigned int min_reg;
/* One greater than the last register to be handled in this pass. */
int max_reg;
unsigned int max_reg;
sbitmap *nonnull_local;
sbitmap *nonnull_killed;
};
@@ -566,8 +566,8 @@ static void compute_expr_hash_table PARAMS ((void));
static void dump_hash_table PARAMS ((FILE *, const char *, struct expr **,
int, int));
static struct expr *lookup_expr PARAMS ((rtx));
static struct expr *lookup_set PARAMS ((int, rtx));
static struct expr *next_set PARAMS ((int, struct expr *));
static struct expr *lookup_set PARAMS ((unsigned int, rtx));
static struct expr *next_set PARAMS ((unsigned int, struct expr *));
static void reset_opr_set_tables PARAMS ((void));
static int oprs_not_set_p PARAMS ((rtx, rtx));
static void mark_call PARAMS ((rtx));
@@ -628,7 +628,8 @@ static int handle_avail_expr PARAMS ((rtx, struct expr *));
static int classic_gcse PARAMS ((void));
static int one_classic_gcse_pass PARAMS ((int));
static void invalidate_nonnull_info PARAMS ((rtx, rtx, void *));
static void delete_null_pointer_checks_1 PARAMS ((int *, sbitmap *, sbitmap *,
static void delete_null_pointer_checks_1 PARAMS ((unsigned int *, sbitmap *,
sbitmap *,
struct null_pointer_info *));
static rtx process_insert_insn PARAMS ((struct expr *));
static int pre_edge_insert PARAMS ((struct edge_list *, struct expr **));
@@ -2124,9 +2125,9 @@ compute_hash_table (set_p)
for (bb = 0; bb < n_basic_blocks; bb++)
{
rtx insn;
int regno;
unsigned int regno;
int in_libcall_block;
int i;
unsigned int i;
/* First pass over the instructions records information used to
determine when registers and memory are first and last set.
@@ -2135,6 +2136,7 @@ compute_hash_table (set_p)
for (i = 0; i < max_gcse_regno; i++)
reg_first_set[i] = reg_last_set[i] = NEVER_SET;
mem_first_set = NEVER_SET;
mem_last_set = NEVER_SET;
@@ -2321,7 +2323,7 @@ lookup_expr (pat)
static struct expr *
lookup_set (regno, pat)
int regno;
unsigned int regno;
rtx pat;
{
unsigned int hash = hash_set (regno, set_hash_table_size);
@@ -2347,7 +2349,7 @@ lookup_set (regno, pat)
static struct expr *
next_set (regno, expr)
int regno;
unsigned int regno;
struct expr *expr;
{
do
@@ -3074,7 +3076,7 @@ handle_avail_expr (insn, expr)
{
/* This is the case when the available expression that reaches
here has already been handled as an available expression. */
int regnum_for_replacing
unsigned int regnum_for_replacing
= REGNO (SET_SRC (PATTERN (insn_computes_expr)));
/* If the register was created by GCSE we can't use `reg_set_table',
@@ -3093,7 +3095,7 @@ handle_avail_expr (insn, expr)
if (!found_setting)
{
int regnum_for_replacing
unsigned int regnum_for_replacing
= REGNO (SET_DEST (PATTERN (insn_computes_expr)));
/* This shouldn't happen. */
@@ -3836,7 +3838,7 @@ cprop_insn (insn, alter_jumps)
for (reg_used = &reg_use_table[0]; reg_use_count > 0;
reg_used++, reg_use_count--)
{
int regno = REGNO (reg_used->reg_rtx);
unsigned int regno = REGNO (reg_used->reg_rtx);
rtx pat, src;
struct expr *set;
@@ -4868,10 +4870,8 @@ invalidate_nonnull_info (x, setter, data)
rtx setter ATTRIBUTE_UNUSED;
void *data;
{
int offset, regno;
struct null_pointer_info* npi = (struct null_pointer_info *) data;
offset = 0;
unsigned int regno;
struct null_pointer_info *npi = (struct null_pointer_info *) data;
while (GET_CODE (x) == SUBREG)
x = SUBREG_REG (x);
@@ -4894,7 +4894,7 @@ invalidate_nonnull_info (x, setter, data)
static void
delete_null_pointer_checks_1 (block_reg, nonnull_avin, nonnull_avout, npi)
int *block_reg;
unsigned int *block_reg;
sbitmap *nonnull_avin;
sbitmap *nonnull_avout;
struct null_pointer_info *npi;
@@ -5063,7 +5063,7 @@ delete_null_pointer_checks (f)
rtx f;
{
sbitmap *nonnull_avin, *nonnull_avout;
int *block_reg;
unsigned int *block_reg;
int bb;
int reg;
int regs_per_pass;


@@ -622,7 +622,7 @@ attr_rtx VPARAMS ((enum rtx_code code, ...))
else if (GET_RTX_LENGTH (code) == 1
&& GET_RTX_FORMAT (code)[0] == 's')
{
char * arg0 = va_arg (p, char *);
char *arg0 = va_arg (p, char *);
if (code == SYMBOL_REF)
arg0 = attr_string (arg0, strlen (arg0));


@@ -363,6 +363,10 @@ ggc_mark_tree_children (t)
ggc_mark_rtx (DECL_INCOMING_RTL (t));
break;
case FIELD_DECL:
ggc_mark_tree (DECL_FIELD_BIT_OFFSET (t));
break;
case IDENTIFIER_NODE:
ggc_mark_string (IDENTIFIER_POINTER (t));
lang_mark_tree (t);


@@ -1,22 +1,22 @@
/* Garbage collection for the GNU compiler.
Copyright (C) 1998, 1999, 2000 Free Software Foundation, Inc.
This file is part of GNU CC.
This file is part of GNU CC.
GNU CC is free software; you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation; either version 2, or (at your option)
any later version.
GNU CC is free software; you can redistribute it and/or modify it
under the terms of the GNU General Public License as published by the
Free Software Foundation; either version 2, or (at your option) any
later version.
GNU CC is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
GNU CC is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License
for more details.
You should have received a copy of the GNU General Public License
along with GNU CC; see the file COPYING. If not, write to
the Free Software Foundation, 59 Temple Place - Suite 330,
Boston, MA 02111-1307, USA. */
You should have received a copy of the GNU General Public License
along with GNU CC; see the file COPYING. If not, write to the Free
Software Foundation, 59 Temple Place - Suite 330, Boston, MA
02111-1307, USA. */
#include "gansidecl.h"


@@ -1585,11 +1585,11 @@ static void
set_preference (dest, src)
rtx dest, src;
{
int src_regno, dest_regno;
unsigned int src_regno, dest_regno;
/* Amount to add to the hard regno for SRC, or subtract from that for DEST,
to compensate for subregs in SRC or DEST. */
int offset = 0;
int i;
unsigned int i;
int copy = 1;
if (GET_RTX_FORMAT (GET_CODE (src))[0] == 'e')
@@ -1633,7 +1633,7 @@ set_preference (dest, src)
&& reg_allocno[src_regno] >= 0)
{
dest_regno -= offset;
if (dest_regno >= 0 && dest_regno < FIRST_PSEUDO_REGISTER)
if (dest_regno < FIRST_PSEUDO_REGISTER)
{
if (copy)
SET_REGBIT (hard_reg_copy_preferences,
@@ -1652,7 +1652,7 @@ set_preference (dest, src)
&& reg_allocno[dest_regno] >= 0)
{
src_regno += offset;
if (src_regno >= 0 && src_regno < FIRST_PSEUDO_REGISTER)
if (src_regno < FIRST_PSEUDO_REGISTER)
{
if (copy)
SET_REGBIT (hard_reg_copy_preferences,


@@ -1,5 +1,5 @@
/* Sets (bit vectors) of hard registers, and operations on them.
Copyright (C) 1987, 1992, 1994 Free Software Foundation, Inc.
Copyright (C) 1987, 1992, 1994, 2000 Free Software Foundation, Inc.
This file is part of GNU CC
@@ -445,7 +445,7 @@ extern HARD_REG_SET reg_class_contents[];
/* For each reg class, number of regs it contains. */
extern int reg_class_size[N_REG_CLASSES];
extern unsigned int reg_class_size[N_REG_CLASSES];
/* For each reg class, table listing all the containing classes. */


@@ -2572,15 +2572,16 @@ mark_stores (dest, x, data)
if (regno >= 0)
{
int last_reg = (regno >= FIRST_PSEUDO_REGISTER ? regno
: regno + HARD_REGNO_NREGS (regno, mode) - 1);
int i;
unsigned int uregno = regno;
unsigned int last_reg = (uregno >= FIRST_PSEUDO_REGISTER ? uregno
: uregno + HARD_REGNO_NREGS (uregno, mode) - 1);
unsigned int i;
/* Ignore virtual stack var or virtual arg register since those
are handled separately. */
if (regno != VIRTUAL_INCOMING_ARGS_REGNUM
&& regno != VIRTUAL_STACK_VARS_REGNUM)
for (i = regno; i <= last_reg; i++)
if (uregno != VIRTUAL_INCOMING_ARGS_REGNUM
&& uregno != VIRTUAL_STACK_VARS_REGNUM)
for (i = uregno; i <= last_reg; i++)
if ((size_t) i < VARRAY_SIZE (global_const_equiv_varray))
VARRAY_CONST_EQUIV (global_const_equiv_varray, i).rtx = 0;
}


@@ -1,3 +1,11 @@
Sat Mar 25 09:12:10 2000 Richard Kenner <kenner@vlsi1.ultra.nyu.edu>
* class.c (make_field_value): Use byte_position.
* expr.c (JAVA_ARRAY_LENGTH_OFFSET): Use byte_position.
(java_array_data_offset): Likewise.
* java-tree.h (MAYBE_CREATE_TYPE_TYPE_LANG_SPECIFIC): Add cast to
bzero call.
2000-03-22 Alexandre Petit-Bianco <apbianco@cygnus.com>
* parse.y (check_abstract_method_definitions): New local


@@ -1071,7 +1071,7 @@ static tree
make_field_value (fdecl)
tree fdecl;
{
tree finit, info;
tree finit;
int flags;
tree type = TREE_TYPE (fdecl);
int resolved = is_compiled_class (type);
@@ -1083,33 +1083,30 @@ make_field_value (fdecl)
else
{
tree signature = build_java_signature (type);
type = build_utf8_ref (unmangle_classname
(IDENTIFIER_POINTER(signature),
IDENTIFIER_LENGTH(signature)));
(IDENTIFIER_POINTER (signature),
IDENTIFIER_LENGTH (signature)));
}
PUSH_FIELD_VALUE (finit, "type", type);
flags = get_access_flags_from_decl (fdecl);
if (! resolved)
flags |= 0x8000 /* FIELD_UNRESOLVED_FLAG */;
PUSH_FIELD_VALUE (finit, "accflags", build_int_2 (flags, 0));
PUSH_FIELD_VALUE (finit, "bsize", TYPE_SIZE_UNIT (TREE_TYPE (fdecl)));
if (FIELD_STATIC (fdecl))
{
tree cfield = TREE_CHAIN (TYPE_FIELDS (field_info_union_node));
tree faddr = build_address_of (build_static_field_ref (fdecl));
info = build (CONSTRUCTOR, field_info_union_node, NULL_TREE,
build_tree_list (cfield, faddr));
}
else
info = build (CONSTRUCTOR, field_info_union_node, NULL_TREE,
build_tree_list (TYPE_FIELDS (field_info_union_node),
build_int_2 ((int_bit_position (fdecl)
/ BITS_PER_UNIT),
0)));
PUSH_FIELD_VALUE (finit, "info", info);
PUSH_FIELD_VALUE
(finit, "info",
build (CONSTRUCTOR, field_info_union_node, NULL_TREE,
build_tree_list
((FIELD_STATIC (fdecl)
? TREE_CHAIN (TYPE_FIELDS (field_info_union_node))
: TYPE_FIELDS (field_info_union_node)),
(FIELD_STATIC (fdecl)
? build_address_of (build_static_field_ref (fdecl))
: byte_position (fdecl)))));
FINISH_RECORD_CONSTRUCTOR (finit);
return finit;


@@ -575,11 +575,8 @@ build_java_ret (location)
/* Array core info access macros */
#define JAVA_ARRAY_LENGTH_OFFSET(A) \
size_binop (CEIL_DIV_EXPR, \
(DECL_FIELD_BITPOS \
(TREE_CHAIN (TYPE_FIELDS (TREE_TYPE (TREE_TYPE (A)))))), \
bitsize_int (BITS_PER_UNIT))
#define JAVA_ARRAY_LENGTH_OFFSET(A) \
byte_position (TREE_CHAIN (TYPE_FIELDS (TREE_TYPE (TREE_TYPE (A)))))
tree
decode_newarray_type (atype)
@@ -690,10 +687,11 @@ java_array_data_offset (array)
{
tree array_type = TREE_TYPE (TREE_TYPE (array));
tree data_fld = TREE_CHAIN (TREE_CHAIN (TYPE_FIELDS (array_type)));
if (data_fld == NULL_TREE)
return size_in_bytes (array_type);
else
return build_int_2 (int_bit_position (data_fld) / BITS_PER_UNIT, 0);
return byte_position (data_fld);
}
/* Implement array indexing (either as l-value or r-value).


@@ -541,9 +541,12 @@ struct lang_decl_var
if (TYPE_LANG_SPECIFIC ((T)) == NULL) \
{ \
TYPE_LANG_SPECIFIC ((T)) = \
(struct lang_type *)xmalloc (sizeof (struct lang_type)); \
bzero (TYPE_LANG_SPECIFIC ((T)), sizeof (struct lang_type)); \
(struct lang_type *) xmalloc (sizeof (struct lang_type)); \
\
bzero ((char *) TYPE_LANG_SPECIFIC ((T)), \
sizeof (struct lang_type)); \
}
#define TYPE_FINIT_STMT_LIST(T) (TYPE_LANG_SPECIFIC(T)->finit_stmt_list)
#define TYPE_CLINIT_STMT_LIST(T) (TYPE_LANG_SPECIFIC(T)->clinit_stmt_list)
#define TYPE_II_STMT_LIST(T) (TYPE_LANG_SPECIFIC(T)->ii_block)


@@ -4998,7 +4998,8 @@ mark_modified_reg (dest, x, data)
rtx x ATTRIBUTE_UNUSED;
void *data ATTRIBUTE_UNUSED;
{
int regno, i;
int regno;
unsigned int i;
if (GET_CODE (dest) == SUBREG)
dest = SUBREG_REG (dest);
@@ -5286,7 +5287,7 @@ rtx_equal_for_thread_p (x, y, yinsn)
return 1;
}
else
return (same_regs[REGNO (x)] == REGNO (y));
return (same_regs[REGNO (x)] == (int) REGNO (y));
break;
@@ -5310,7 +5311,7 @@ rtx_equal_for_thread_p (x, y, yinsn)
if (GET_CODE (SET_DEST (x)) == REG
&& GET_CODE (SET_DEST (y)) == REG)
{
if (same_regs[REGNO (SET_DEST (x))] == REGNO (SET_DEST (y)))
if (same_regs[REGNO (SET_DEST (x))] == (int) REGNO (SET_DEST (y)))
{
same_regs[REGNO (SET_DEST (x))] = -1;
num_same_regs--;


@@ -1252,9 +1252,10 @@ block_alloc (b)
for (link = REG_NOTES (insn); link; link = XEXP (link, 1))
if (REG_NOTE_KIND (link) == REG_DEAD
&& GET_CODE (XEXP (link, 0)) == REG
&& combined_regno != REGNO (XEXP (link, 0))
&& (no_conflict_combined_regno != REGNO (XEXP (link, 0))
|| ! find_reg_note (insn, REG_NO_CONFLICT, XEXP (link, 0))))
&& combined_regno != (int) REGNO (XEXP (link, 0))
&& (no_conflict_combined_regno != (int) REGNO (XEXP (link, 0))
|| ! find_reg_note (insn, REG_NO_CONFLICT,
XEXP (link, 0))))
wipe_dead_reg (XEXP (link, 0), 0);
/* Allocate qty numbers for all registers local to this block


@@ -162,7 +162,7 @@ static int num_mem_sets;
/* Bound on pseudo register number before loop optimization.
A pseudo has valid regscan info if its number is < max_reg_before_loop. */
int max_reg_before_loop;
unsigned int max_reg_before_loop;
/* The value to pass to the next call of reg_scan_update. */
static int loop_max_reg;
@@ -194,7 +194,7 @@ struct movable
of any registers used within the LIBCALL. */
int consec; /* Number of consecutive following insns
that must be moved with this one. */
int regno; /* The register it sets */
unsigned int regno; /* The register it sets */
short lifetime; /* lifetime of that register;
may be adjusted when matching movables
that load the same value are found. */
@@ -306,7 +306,7 @@ static int insert_loop_mem PARAMS ((rtx *, void *));
static int replace_loop_mem PARAMS ((rtx *, void *));
static int replace_loop_reg PARAMS ((rtx *, void *));
static void note_reg_stored PARAMS ((rtx, rtx, void *));
static void try_copy_prop PARAMS ((const struct loop *, rtx, int));
static void try_copy_prop PARAMS ((const struct loop *, rtx, unsigned int));
static int replace_label PARAMS ((rtx *, void *));
typedef struct rtx_and_int {
@@ -1536,8 +1536,8 @@ regs_match_p (x, y, movables)
rtx x, y;
struct movable *movables;
{
int xn = REGNO (x);
int yn = REGNO (y);
unsigned int xn = REGNO (x);
unsigned int yn = REGNO (y);
struct movable *mx, *my;
for (mx = movables; mx; mx = mx->next)
@@ -3348,8 +3348,8 @@ consec_sets_invariant_p (loop, reg, n_sets, insn)
int n_sets;
rtx reg, insn;
{
register rtx p = insn;
register int regno = REGNO (reg);
rtx p = insn;
unsigned int regno = REGNO (reg);
rtx temp;
/* Number of sets we have to insist on finding after INSN. */
int count = n_sets - 1;
@@ -3657,7 +3657,7 @@ struct iv_class *loop_iv_list;
/* Givs made from biv increments are always splittable for loop unrolling.
Since there is no regscan info for them, we have to keep track of them
separately. */
int first_increment_giv, last_increment_giv;
unsigned int first_increment_giv, last_increment_giv;
/* Communication with routines called via `note_stores'. */
@@ -4089,7 +4089,7 @@ strength_reduce (loop, insn_count, unroll_p, bct_p)
&& CONSTANT_P (XEXP (src, 1))
&& ((increment = biv_total_increment (bl)) != NULL_RTX))
{
int regno = REGNO (XEXP (src, 0));
unsigned int regno = REGNO (XEXP (src, 0));
for (bl2 = loop_iv_list; bl2; bl2 = bl2->next)
if (bl2->regno == regno)
@@ -4215,7 +4215,7 @@ strength_reduce (loop, insn_count, unroll_p, bct_p)
markers. */
if (n_extra_increment && ! loop_info->has_volatile)
{
int nregs = first_increment_giv + n_extra_increment;
unsigned int nregs = first_increment_giv + n_extra_increment;
/* Reallocate reg_iv_type and reg_iv_info. */
VARRAY_GROW (reg_iv_type, nregs);
@@ -8458,7 +8458,7 @@ maybe_eliminate_biv (loop, bl, eliminate_p, threshold, insn_count)
if (set && GET_CODE (SET_DEST (set)) == REG)
{
int regno = REGNO (SET_DEST (set));
unsigned int regno = REGNO (SET_DEST (set));
if (regno < max_reg_before_loop
&& REG_IV_TYPE (regno) == GENERAL_INDUCT
@@ -10064,11 +10064,12 @@ note_reg_stored (x, setter, arg)
There must be exactly one insn that sets this pseudo; it will be
deleted if all replacements succeed and we can prove that the register
is not used after the loop. */
static void
try_copy_prop (loop, replacement, regno)
const struct loop *loop;
rtx replacement;
int regno;
unsigned int regno;
{
/* This is the reg that we are copying from. */
rtx reg_rtx = regno_reg_rtx[regno];


@@ -137,7 +137,7 @@ struct induction
/* A `struct iv_class' is created for each biv. */
struct iv_class {
int regno; /* Pseudo reg which is the biv. */
unsigned int regno; /* Pseudo reg which is the biv. */
int biv_count; /* Number of insns setting this reg. */
struct induction *biv; /* List of all insns that set this reg. */
int giv_count; /* Number of DEST_REG givs computed from this
@@ -211,7 +211,7 @@ enum iv_mode { UNKNOWN_INDUCT, BASIC_INDUCT, NOT_BASIC_INDUCT,
extern int *uid_luid;
extern int max_uid_for_loop;
extern int max_reg_before_loop;
extern unsigned int max_reg_before_loop;
extern struct loop **uid_loop;
extern FILE *loop_dump_stream;
@@ -226,7 +226,7 @@ extern varray_type reg_iv_info;
extern struct iv_class **reg_biv_class;
extern struct iv_class *loop_iv_list;
extern int first_increment_giv, last_increment_giv;
extern unsigned int first_increment_giv, last_increment_giv;
/* Forward declarations for non-static functions declared in loop.c and
unroll.c. */


@@ -68,12 +68,12 @@ extern const enum mode_class mode_class[];
/* Get the size in bytes of an object of mode MODE. */
extern const int mode_size[];
extern const unsigned int mode_size[];
#define GET_MODE_SIZE(MODE) (mode_size[(int) (MODE)])
/* Get the size in bytes of the basic parts of an object of mode MODE. */
extern const int mode_unit_size[];
extern const unsigned int mode_unit_size[];
#define GET_MODE_UNIT_SIZE(MODE) (mode_unit_size[(int) (MODE)])
/* Get the number of units in the object. */
@@ -106,12 +106,13 @@ extern const unsigned char mode_wider_mode[];
If LIMIT is nonzero, then don't use modes bigger than MAX_FIXED_MODE_SIZE.
The value is BLKmode if no other mode is found. */
extern enum machine_mode mode_for_size PARAMS ((int, enum mode_class, int));
extern enum machine_mode mode_for_size PARAMS ((unsigned int,
enum mode_class, int));
/* Similar, but find the smallest mode for a given width. */
extern enum machine_mode smallest_mode_for_size
PARAMS ((int, enum mode_class));
PARAMS ((unsigned int, enum mode_class));
/* Return an integer mode of the exact same size as the input mode,


@@ -3817,13 +3817,8 @@ build_ivar_list_initializer (type, field_decl)
ivar);
obstack_free (&util_obstack, util_firstobj);
/* set offset */
ivar
= tree_cons
(NULL_TREE,
build_int_2 ((int_bit_position (field_decl) / BITS_PER_UNIT), 0),
ivar);
/* Set offset. */
ivar = tree_cons (NULL_TREE, byte_position (field_decl), ivar);
initlist = tree_cons (NULL_TREE,
build_constructor (type, nreverse (ivar)),
initlist);


@@ -829,7 +829,7 @@ expand_binop (mode, binoptab, op0, op1, target, unsignedp, methods)
&& GET_MODE_SIZE (mode) > UNITS_PER_WORD
&& binoptab->handlers[(int) word_mode].insn_code != CODE_FOR_nothing)
{
int i;
unsigned int i;
rtx insns;
rtx equiv_value;
@@ -1120,10 +1120,10 @@ expand_binop (mode, binoptab, op0, op1, target, unsignedp, methods)
&& GET_MODE_SIZE (mode) >= 2 * UNITS_PER_WORD
&& binoptab->handlers[(int) word_mode].insn_code != CODE_FOR_nothing)
{
int i;
unsigned int i;
rtx carry_tmp = gen_reg_rtx (word_mode);
optab otheroptab = binoptab == add_optab ? sub_optab : add_optab;
int nwords = GET_MODE_BITSIZE (mode) / BITS_PER_WORD;
unsigned int nwords = GET_MODE_BITSIZE (mode) / BITS_PER_WORD;
rtx carry_in = NULL_RTX, carry_out = NULL_RTX;
rtx xop0, xop1;
@@ -2090,7 +2090,7 @@ expand_unop (mode, unoptab, op0, target, unsignedp)
&& GET_MODE_SIZE (mode) > UNITS_PER_WORD
&& unoptab->handlers[(int) word_mode].insn_code != CODE_FOR_nothing)
{
int i;
unsigned int i;
rtx insns;
if (target == 0 || target == op0)


@@ -421,7 +421,11 @@ print_node (file, prefix, node, indent)
fprintf (file, " alias set %d", DECL_POINTER_ALIAS_SET (node));
if (TREE_CODE (node) == FIELD_DECL)
print_node (file, "bitpos", DECL_FIELD_BITPOS (node), indent + 4);
{
print_node (file, "offset", DECL_FIELD_OFFSET (node), indent + 4);
print_node (file, "bit offset", DECL_FIELD_BIT_OFFSET (node),
indent + 4);
}
print_node_brief (file, "context", DECL_CONTEXT (node), indent + 4);
print_node_brief (file, "machine_attributes",


@@ -6843,7 +6843,7 @@ esqrt (x, y)
floating point mode. The mode can hold an integer value
that many bits wide, without losing any bits. */
int
unsigned int
significand_size (mode)
enum machine_mode mode;
{


@@ -119,7 +119,7 @@ typedef struct {
#endif /* no TFmode support */
#endif /* no XFmode support */
extern int significand_size PARAMS ((enum machine_mode));
extern unsigned int significand_size PARAMS ((enum machine_mode));
/* If emulation has been enabled by defining REAL_ARITHMETIC or by
setting LONG_DOUBLE_TYPE_SIZE to 96 or 128, then define macros so that


@@ -140,7 +140,7 @@ static unsigned int_reg_class_contents[N_REG_CLASSES][N_REG_INTS]
/* For each reg class, number of regs it contains. */
int reg_class_size[N_REG_CLASSES];
unsigned int reg_class_size[N_REG_CLASSES];
/* For each reg class, table listing all the containing classes. */
@@ -554,8 +554,8 @@ memory_move_secondary_cost (mode, class, in)
enum machine_mode
choose_hard_reg_mode (regno, nregs)
int regno ATTRIBUTE_UNUSED;
int nregs;
unsigned int regno ATTRIBUTE_UNUSED;
unsigned int nregs;
{
enum machine_mode found_mode = VOIDmode, mode;
@@ -730,7 +730,7 @@ static void record_address_regs PARAMS ((rtx, enum reg_class, int));
#ifdef FORBIDDEN_INC_DEC_CLASSES
static int auto_inc_dec_reg_p PARAMS ((rtx, enum machine_mode));
#endif
static void reg_scan_mark_refs PARAMS ((rtx, rtx, int, int));
static void reg_scan_mark_refs PARAMS ((rtx, rtx, int, unsigned int));
/* Return the reg_class in which pseudo reg number REGNO is best allocated.
This function is sometimes called before the info has been computed.
@@ -1681,10 +1681,10 @@ record_reg_classes (n_alts, n_ops, ops, modes, subreg_changes_size,
for (i = 0; i <= 1; i++)
if (REGNO (ops[i]) >= FIRST_PSEUDO_REGISTER)
{
int regno = REGNO (ops[!i]);
unsigned int regno = REGNO (ops[!i]);
enum machine_mode mode = GET_MODE (ops[!i]);
int class;
int nr;
unsigned int nr;
if (regno >= FIRST_PSEUDO_REGISTER && reg_pref != 0)
{
@@ -1704,13 +1704,14 @@ record_reg_classes (n_alts, n_ops, ops, modes, subreg_changes_size,
op_costs[i].cost[class] = -1;
else
{
for (nr = 0; nr < HARD_REGNO_NREGS(regno, mode); nr++)
for (nr = 0; nr < HARD_REGNO_NREGS (regno, mode); nr++)
{
if (!TEST_HARD_REG_BIT (reg_class_contents[class], regno + nr))
if (! TEST_HARD_REG_BIT (reg_class_contents[class],
regno + nr))
break;
}
if (nr == HARD_REGNO_NREGS(regno,mode))
if (nr == HARD_REGNO_NREGS (regno,mode))
op_costs[i].cost[class] = -1;
}
}
@@ -2142,7 +2143,7 @@ int max_parallel;
void
reg_scan (f, nregs, repeat)
rtx f;
int nregs;
unsigned int nregs;
int repeat ATTRIBUTE_UNUSED;
{
register rtx insn;
@@ -2171,10 +2172,10 @@ reg_scan (f, nregs, repeat)
such a REG. We only update information for those. */
void
reg_scan_update(first, last, old_max_regno)
reg_scan_update (first, last, old_max_regno)
rtx first;
rtx last;
int old_max_regno;
unsigned int old_max_regno;
{
register rtx insn;
@ -2205,7 +2206,7 @@ reg_scan_mark_refs (x, insn, note_flag, min_regno)
rtx x;
rtx insn;
int note_flag;
int min_regno;
unsigned int min_regno;
{
register enum rtx_code code;
register rtx dest;
@ -2227,7 +2228,7 @@ reg_scan_mark_refs (x, insn, note_flag, min_regno)
case REG:
{
register int regno = REGNO (x);
unsigned int regno = REGNO (x);
if (regno >= min_regno)
{


@ -253,11 +253,12 @@ static int find_reusable_reload PARAMS ((rtx *, rtx, enum reg_class,
static rtx find_dummy_reload PARAMS ((rtx, rtx, rtx *, rtx *,
enum machine_mode, enum machine_mode,
enum reg_class, int, int));
static int hard_reg_set_here_p PARAMS ((int, int, rtx));
static int hard_reg_set_here_p PARAMS ((unsigned int, unsigned int, rtx));
static struct decomposition decompose PARAMS ((rtx));
static int immune_p PARAMS ((rtx, rtx, struct decomposition));
static int alternative_allows_memconst PARAMS ((const char *, int));
static rtx find_reloads_toplev PARAMS ((rtx, int, enum reload_type, int, int, rtx));
static rtx find_reloads_toplev PARAMS ((rtx, int, enum reload_type, int,
int, rtx));
static rtx make_memloc PARAMS ((rtx, int));
static int find_reloads_address PARAMS ((enum machine_mode, rtx *, rtx, rtx *,
int, enum reload_type, int, rtx));
@ -659,7 +660,7 @@ find_valid_class (m1, n)
int class;
int regno;
enum reg_class best_class = NO_REGS;
int best_size = 0;
unsigned int best_size = 0;
for (class = 1; class < N_REG_CLASSES; class++)
{
@ -1823,8 +1824,8 @@ find_dummy_reload (real_in, real_out, inloc, outloc,
if (GET_CODE (out) == REG
&& REGNO (out) < FIRST_PSEUDO_REGISTER)
{
register int regno = REGNO (out) + out_offset;
int nwords = HARD_REGNO_NREGS (regno, outmode);
unsigned int regno = REGNO (out) + out_offset;
unsigned int nwords = HARD_REGNO_NREGS (regno, outmode);
rtx saved_rtx;
/* When we consider whether the insn uses OUT,
@ -1843,7 +1844,8 @@ find_dummy_reload (real_in, real_out, inloc, outloc,
&& ! refers_to_regno_for_reload_p (regno, regno + nwords,
PATTERN (this_insn), outloc))
{
int i;
unsigned int i;
for (i = 0; i < nwords; i++)
if (! TEST_HARD_REG_BIT (reg_class_contents[(int) class],
regno + i))
@ -1882,8 +1884,8 @@ find_dummy_reload (real_in, real_out, inloc, outloc,
(GET_MODE (out) != VOIDmode
? GET_MODE (out) : outmode)))
{
register int regno = REGNO (in) + in_offset;
int nwords = HARD_REGNO_NREGS (regno, inmode);
unsigned int regno = REGNO (in) + in_offset;
unsigned int nwords = HARD_REGNO_NREGS (regno, inmode);
if (! refers_to_regno_for_reload_p (regno, regno + nwords, out, NULL_PTR)
&& ! hard_reg_set_here_p (regno, regno + nwords,
@ -1892,7 +1894,8 @@ find_dummy_reload (real_in, real_out, inloc, outloc,
|| ! refers_to_regno_for_reload_p (regno, regno + nwords,
PATTERN (this_insn), inloc)))
{
int i;
unsigned int i;
for (i = 0; i < nwords; i++)
if (! TEST_HARD_REG_BIT (reg_class_contents[(int) class],
regno + i))
@ -1942,17 +1945,19 @@ earlyclobber_operand_p (x)
static int
hard_reg_set_here_p (beg_regno, end_regno, x)
register int beg_regno, end_regno;
unsigned int beg_regno, end_regno;
rtx x;
{
if (GET_CODE (x) == SET || GET_CODE (x) == CLOBBER)
{
register rtx op0 = SET_DEST (x);
while (GET_CODE (op0) == SUBREG)
op0 = SUBREG_REG (op0);
if (GET_CODE (op0) == REG)
{
register int r = REGNO (op0);
unsigned int r = REGNO (op0);
/* See if this reg overlaps range under consideration. */
if (r < end_regno
&& r + HARD_REGNO_NREGS (r, GET_MODE (op0)) > beg_regno)
@ -1962,6 +1967,7 @@ hard_reg_set_here_p (beg_regno, end_regno, x)
else if (GET_CODE (x) == PARALLEL)
{
register int i = XVECLEN (x, 0) - 1;
for (; i >= 0; i--)
if (hard_reg_set_here_p (beg_regno, end_regno, XVECEXP (x, 0, i)))
return 1;
@ -5689,13 +5695,14 @@ find_replacement (loc)
int
refers_to_regno_for_reload_p (regno, endregno, x, loc)
int regno, endregno;
unsigned int regno, endregno;
rtx x;
rtx *loc;
{
register int i;
register RTX_CODE code;
register const char *fmt;
int i;
unsigned int r;
RTX_CODE code;
const char *fmt;
if (x == 0)
return 0;
@ -5706,26 +5713,26 @@ refers_to_regno_for_reload_p (regno, endregno, x, loc)
switch (code)
{
case REG:
i = REGNO (x);
r = REGNO (x);
/* If this is a pseudo, a hard register must not have been allocated.
X must therefore either be a constant or be in memory. */
if (i >= FIRST_PSEUDO_REGISTER)
if (r >= FIRST_PSEUDO_REGISTER)
{
if (reg_equiv_memory_loc[i])
if (reg_equiv_memory_loc[r])
return refers_to_regno_for_reload_p (regno, endregno,
reg_equiv_memory_loc[i],
reg_equiv_memory_loc[r],
NULL_PTR);
if (reg_equiv_constant[i])
if (reg_equiv_constant[r])
return 0;
abort ();
}
return (endregno > i
&& regno < i + (i < FIRST_PSEUDO_REGISTER
? HARD_REGNO_NREGS (i, GET_MODE (x))
return (endregno > r
&& regno < r + (r < FIRST_PSEUDO_REGISTER
? HARD_REGNO_NREGS (r, GET_MODE (x))
: 1));
case SUBREG:
@ -5734,8 +5741,8 @@ refers_to_regno_for_reload_p (regno, endregno, x, loc)
if (GET_CODE (SUBREG_REG (x)) == REG
&& REGNO (SUBREG_REG (x)) < FIRST_PSEUDO_REGISTER)
{
int inner_regno = REGNO (SUBREG_REG (x)) + SUBREG_WORD (x);
int inner_endregno
unsigned int inner_regno = REGNO (SUBREG_REG (x)) + SUBREG_WORD (x);
unsigned int inner_endregno
= inner_regno + (inner_regno < FIRST_PSEUDO_REGISTER
? HARD_REGNO_NREGS (regno, GET_MODE (x)) : 1);
@ -5983,21 +5990,24 @@ find_equiv_reg (goal, insn, class, other, reload_reg_p, goalreg, mode)
p = PREV_INSN (p);
if (p == 0 || GET_CODE (p) == CODE_LABEL)
return 0;
if (GET_CODE (p) == INSN
/* If we don't want spill regs ... */
&& (! (reload_reg_p != 0
&& reload_reg_p != (short *) (HOST_WIDE_INT) 1)
/* ... then ignore insns introduced by reload; they aren't useful
and can cause results in reload_as_needed to be different
from what they were when calculating the need for spills.
If we notice an input-reload insn here, we will reject it below,
but it might hide a usable equivalent. That makes bad code.
It may even abort: perhaps no reg was spilled for this insn
because it was assumed we would find that equivalent. */
/* ... then ignore insns introduced by reload; they aren't
useful and can cause results in reload_as_needed to be
different from what they were when calculating the need for
spills. If we notice an input-reload insn here, we will
reject it below, but it might hide a usable equivalent.
That makes bad code. It may even abort: perhaps no reg was
spilled for this insn because it was assumed we would find
that equivalent. */
|| INSN_UID (p) < reload_first_uid))
{
rtx tem;
pat = single_set (p);
/* First check for something that sets some reg equal to GOAL. */
if (pat != 0
&& ((regno >= 0
@ -6098,8 +6108,8 @@ find_equiv_reg (goal, insn, class, other, reload_reg_p, goalreg, mode)
/* Reject registers that overlap GOAL. */
if (!goal_mem && !goal_const
&& regno + HARD_REGNO_NREGS (regno, mode) > valueno
&& regno < valueno + HARD_REGNO_NREGS (valueno, mode))
&& regno + (int) HARD_REGNO_NREGS (regno, mode) > valueno
&& regno < valueno + (int) HARD_REGNO_NREGS (valueno, mode))
return 0;
/* Reject VALUE if it is one of the regs reserved for reloads.
@ -6388,7 +6398,7 @@ find_inc_amount (x, inced)
int
regno_clobbered_p (regno, insn)
int regno;
unsigned int regno;
rtx insn;
{
if (GET_CODE (PATTERN (insn)) == CLOBBER


@ -104,7 +104,7 @@ struct reload
enum machine_mode mode;
/* the largest number of registers this reload will require. */
int nregs;
unsigned int nregs;
/* Positive amount to increment or decrement by if
reload_in is a PRE_DEC, PRE_INC, POST_DEC, POST_INC.
@ -319,7 +319,8 @@ extern rtx find_replacement PARAMS ((rtx *));
/* Return nonzero if register in range [REGNO, ENDREGNO)
appears either explicitly or implicitly in X
other than being stored into. */
extern int refers_to_regno_for_reload_p PARAMS ((int, int, rtx, rtx *));
extern int refers_to_regno_for_reload_p PARAMS ((unsigned int, unsigned int,
rtx, rtx *));
/* Nonzero if modifying X will affect IN. */
extern int reg_overlap_mentioned_for_reload_p PARAMS ((rtx, rtx));
@ -334,7 +335,7 @@ extern rtx find_equiv_reg PARAMS ((rtx, rtx, enum reg_class, int, short *,
int, enum machine_mode));
/* Return 1 if register REGNO is the subject of a clobber in insn INSN. */
extern int regno_clobbered_p PARAMS ((int, rtx));
extern int regno_clobbered_p PARAMS ((unsigned int, rtx));
/* Return 1 if X is an operand of an insn that is being earlyclobbered. */
int earlyclobber_operand_p PARAMS ((rtx));


@ -120,7 +120,7 @@ rtx *reg_equiv_address;
rtx *reg_equiv_mem;
/* Widest width in which each pseudo reg is referred to (via subreg). */
static int *reg_max_ref_width;
static unsigned int *reg_max_ref_width;
/* Element N is the list of insns that initialized reg N from its equivalent
constant or memory slot. */
@ -237,7 +237,7 @@ char double_reg_address_ok;
static rtx spill_stack_slot[FIRST_PSEUDO_REGISTER];
/* Width allocated so far for that stack slot. */
static int spill_stack_slot_width[FIRST_PSEUDO_REGISTER];
static unsigned int spill_stack_slot_width[FIRST_PSEUDO_REGISTER];
/* Record which pseudos needed to be spilled. */
static regset_head spilled_pseudos;
@ -393,7 +393,7 @@ static void set_initial_label_offsets PARAMS ((void));
static void set_offsets_for_label PARAMS ((rtx));
static void init_elim_table PARAMS ((void));
static void update_eliminables PARAMS ((HARD_REG_SET *));
static void spill_hard_reg PARAMS ((int, FILE *, int));
static void spill_hard_reg PARAMS ((unsigned int, FILE *, int));
static int finish_spills PARAMS ((int, FILE *));
static void ior_hard_reg_set PARAMS ((HARD_REG_SET *, HARD_REG_SET *));
static void scan_paradoxical_subregs PARAMS ((rtx));
@ -402,28 +402,33 @@ static void order_regs_for_reload PARAMS ((struct insn_chain *));
static void reload_as_needed PARAMS ((int));
static void forget_old_reloads_1 PARAMS ((rtx, rtx, void *));
static int reload_reg_class_lower PARAMS ((const PTR, const PTR));
static void mark_reload_reg_in_use PARAMS ((int, int, enum reload_type,
enum machine_mode));
static void clear_reload_reg_in_use PARAMS ((int, int, enum reload_type,
enum machine_mode));
static int reload_reg_free_p PARAMS ((int, int, enum reload_type));
static void mark_reload_reg_in_use PARAMS ((unsigned int, int,
enum reload_type,
enum machine_mode));
static void clear_reload_reg_in_use PARAMS ((unsigned int, int,
enum reload_type,
enum machine_mode));
static int reload_reg_free_p PARAMS ((unsigned int, int,
enum reload_type));
static int reload_reg_free_for_value_p PARAMS ((int, int, enum reload_type,
rtx, rtx, int, int));
static int reload_reg_reaches_end_p PARAMS ((int, int, enum reload_type));
static int allocate_reload_reg PARAMS ((struct insn_chain *, int, int));
rtx, rtx, int, int));
static int reload_reg_reaches_end_p PARAMS ((unsigned int, int,
enum reload_type));
static int allocate_reload_reg PARAMS ((struct insn_chain *, int,
int));
static void failed_reload PARAMS ((rtx, int));
static int set_reload_reg PARAMS ((int, int));
static void choose_reload_regs_init PARAMS ((struct insn_chain *, rtx *));
static void choose_reload_regs PARAMS ((struct insn_chain *));
static void merge_assigned_reloads PARAMS ((rtx));
static void emit_input_reload_insns PARAMS ((struct insn_chain *,
struct reload *, rtx, int));
struct reload *, rtx, int));
static void emit_output_reload_insns PARAMS ((struct insn_chain *,
struct reload *, int));
struct reload *, int));
static void do_input_reload PARAMS ((struct insn_chain *,
struct reload *, int));
struct reload *, int));
static void do_output_reload PARAMS ((struct insn_chain *,
struct reload *, int));
struct reload *, int));
static void emit_reload_insns PARAMS ((struct insn_chain *));
static void delete_output_reload PARAMS ((rtx, int, int));
static void delete_address_reloads PARAMS ((rtx, rtx));
@ -434,16 +439,16 @@ static void reload_cse_regs_1 PARAMS ((rtx));
static int reload_cse_noop_set_p PARAMS ((rtx));
static int reload_cse_simplify_set PARAMS ((rtx, rtx));
static int reload_cse_simplify_operands PARAMS ((rtx));
static void reload_combine PARAMS ((void));
static void reload_combine_note_use PARAMS ((rtx *, rtx));
static void reload_combine_note_store PARAMS ((rtx, rtx, void *));
static void reload_cse_move2add PARAMS ((rtx));
static void move2add_note_store PARAMS ((rtx, rtx, void *));
static void reload_combine PARAMS ((void));
static void reload_combine_note_use PARAMS ((rtx *, rtx));
static void reload_combine_note_store PARAMS ((rtx, rtx, void *));
static void reload_cse_move2add PARAMS ((rtx));
static void move2add_note_store PARAMS ((rtx, rtx, void *));
#ifdef AUTO_INC_DEC
static void add_auto_inc_notes PARAMS ((rtx, rtx));
static void add_auto_inc_notes PARAMS ((rtx, rtx));
#endif
static rtx gen_mode_int PARAMS ((enum machine_mode,
HOST_WIDE_INT));
HOST_WIDE_INT));
static void failed_reload PARAMS ((rtx, int));
static int set_reload_reg PARAMS ((int, int));
extern void dump_needs PARAMS ((struct insn_chain *, FILE *));
@ -534,17 +539,20 @@ new_insn_chain ()
/* Small utility function to set all regs in hard reg set TO which are
allocated to pseudos in regset FROM. */
void
compute_use_by_pseudos (to, from)
HARD_REG_SET *to;
regset from;
{
int regno;
unsigned int regno;
EXECUTE_IF_SET_IN_REG_SET
(from, FIRST_PSEUDO_REGISTER, regno,
{
int r = reg_renumber[regno];
int nregs;
if (r < 0)
{
/* reload_combine uses the information from
@ -1475,6 +1483,7 @@ static int spill_cost[FIRST_PSEUDO_REGISTER];
static int spill_add_cost[FIRST_PSEUDO_REGISTER];
/* Update the spill cost arrays, considering that pseudo REG is live. */
static void
count_pseudo (reg)
int reg;
@ -1552,6 +1561,7 @@ static HARD_REG_SET used_spill_regs_local;
SPILLED_NREGS. Determine how pseudo REG, which is live during the insn,
is affected. We will add it to SPILLED_PSEUDOS if necessary, and we will
update SPILL_COST/SPILL_ADD_COST. */
static void
count_spilled_pseudo (spilled, spilled_nregs, reg)
int spilled, spilled_nregs, reg;
@ -1582,7 +1592,8 @@ find_reg (chain, order, dumpfile)
struct reload *rl = rld + rnum;
int best_cost = INT_MAX;
int best_reg = -1;
int i, j;
unsigned int i, j;
int k;
HARD_REG_SET not_usable;
HARD_REG_SET used_by_other_reload;
@ -1591,9 +1602,10 @@ find_reg (chain, order, dumpfile)
IOR_COMPL_HARD_REG_SET (not_usable, reg_class_contents[rl->class]);
CLEAR_HARD_REG_SET (used_by_other_reload);
for (i = 0; i < order; i++)
for (k = 0; k < order; k++)
{
int other = reload_order[i];
int other = reload_order[k];
if (rld[other].regno >= 0 && reloads_conflict (other, rnum))
for (j = 0; j < rld[other].nregs; j++)
SET_HARD_REG_BIT (used_by_other_reload, rld[other].regno + j);
@ -1601,14 +1613,15 @@ find_reg (chain, order, dumpfile)
for (i = 0; i < FIRST_PSEUDO_REGISTER; i++)
{
int regno = i;
unsigned int regno = i;
if (! TEST_HARD_REG_BIT (not_usable, regno)
&& ! TEST_HARD_REG_BIT (used_by_other_reload, regno)
&& HARD_REGNO_MODE_OK (regno, rl->mode))
{
int this_cost = spill_cost[regno];
int ok = 1;
int this_nregs = HARD_REGNO_NREGS (regno, rl->mode);
unsigned int this_nregs = HARD_REGNO_NREGS (regno, rl->mode);
for (j = 1; j < this_nregs; j++)
{
@ -1643,8 +1656,10 @@ find_reg (chain, order, dumpfile)
}
if (best_reg == -1)
return 0;
if (dumpfile)
fprintf (dumpfile, "Using reg %d for reload %d\n", best_reg, rnum);
rl->nregs = HARD_REGNO_NREGS (best_reg, rl->mode);
rl->regno = best_reg;
@ -1653,6 +1668,7 @@ find_reg (chain, order, dumpfile)
{
count_spilled_pseudo (best_reg, rl->nregs, j);
});
EXECUTE_IF_SET_IN_REG_SET
(&chain->dead_or_set, FIRST_PSEUDO_REGISTER, j,
{
@ -1693,7 +1709,8 @@ find_reload_regs (chain, dumpfile)
{
int regno = REGNO (chain->rld[i].reg_rtx);
chain->rld[i].regno = regno;
chain->rld[i].nregs = HARD_REGNO_NREGS (regno, GET_MODE (chain->rld[i].reg_rtx));
chain->rld[i].nregs
= HARD_REGNO_NREGS (regno, GET_MODE (chain->rld[i].reg_rtx));
}
else
chain->rld[i].regno = -1;
@ -1868,8 +1885,8 @@ alter_reg (i, from_reg)
&& reg_equiv_memory_loc[i] == 0)
{
register rtx x;
int inherent_size = PSEUDO_REGNO_BYTES (i);
int total_size = MAX (inherent_size, reg_max_ref_width[i]);
unsigned int inherent_size = PSEUDO_REGNO_BYTES (i);
unsigned int total_size = MAX (inherent_size, reg_max_ref_width[i]);
int adjust = 0;
/* Each pseudo reg has an inherent size which comes from its own mode,
@ -1970,6 +1987,7 @@ mark_home_live (regno)
int regno;
{
register int i, lim;
i = reg_renumber[regno];
if (i < 0)
return;
@ -3419,7 +3437,7 @@ init_elim_table ()
static void
spill_hard_reg (regno, dumpfile, cant_eliminate)
register int regno;
unsigned int regno;
FILE *dumpfile ATTRIBUTE_UNUSED;
int cant_eliminate;
{
@ -3436,9 +3454,9 @@ spill_hard_reg (regno, dumpfile, cant_eliminate)
for (i = FIRST_PSEUDO_REGISTER; i < max_regno; i++)
if (reg_renumber[i] >= 0
&& reg_renumber[i] <= regno
&& (reg_renumber[i]
+ HARD_REGNO_NREGS (reg_renumber[i],
&& (unsigned int) reg_renumber[i] <= regno
&& ((unsigned int) reg_renumber[i]
+ HARD_REGNO_NREGS ((unsigned int) reg_renumber[i],
PSEUDO_REGNO_MODE (i))
> regno))
SET_REGNO_REG_SET (&spilled_pseudos, i);
@ -3446,6 +3464,7 @@ spill_hard_reg (regno, dumpfile, cant_eliminate)
/* I'm getting weird preprocessor errors if I use IOR_HARD_REG_SET
from within EXECUTE_IF_SET_IN_REG_SET. Hence this awkwardness. */
static void
ior_hard_reg_set (set1, set2)
HARD_REG_SET *set1, *set2;
@ -3956,8 +3975,8 @@ forget_old_reloads_1 (x, ignored, data)
rtx ignored ATTRIBUTE_UNUSED;
void *data ATTRIBUTE_UNUSED;
{
register int regno;
int nr;
unsigned int regno;
unsigned int nr;
int offset = 0;
/* note_stores does give us subregs of hard regs. */
@ -3976,7 +3995,8 @@ forget_old_reloads_1 (x, ignored, data)
nr = 1;
else
{
int i;
unsigned int i;
nr = HARD_REGNO_NREGS (regno, GET_MODE (x));
/* Storing into a spilled-reg invalidates its contents.
This can happen if a block-local pseudo is allocated to that reg
@ -4045,13 +4065,13 @@ static HARD_REG_SET reg_used_in_insn;
static void
mark_reload_reg_in_use (regno, opnum, type, mode)
int regno;
unsigned int regno;
int opnum;
enum reload_type type;
enum machine_mode mode;
{
int nregs = HARD_REGNO_NREGS (regno, mode);
int i;
unsigned int nregs = HARD_REGNO_NREGS (regno, mode);
unsigned int i;
for (i = regno; i < nregs + regno; i++)
{
@ -4110,13 +4130,13 @@ mark_reload_reg_in_use (regno, opnum, type, mode)
static void
clear_reload_reg_in_use (regno, opnum, type, mode)
int regno;
unsigned int regno;
int opnum;
enum reload_type type;
enum machine_mode mode;
{
int nregs = HARD_REGNO_NREGS (regno, mode);
int start_regno, end_regno;
unsigned int nregs = HARD_REGNO_NREGS (regno, mode);
unsigned int start_regno, end_regno, r;
int i;
/* A complication is that for some reload types, inheritance might
allow multiple reloads of the same types to share a reload register.
@ -4196,8 +4216,8 @@ clear_reload_reg_in_use (regno, opnum, type, mode)
&& (check_any || rld[i].opnum == opnum)
&& rld[i].reg_rtx)
{
int conflict_start = true_regnum (rld[i].reg_rtx);
int conflict_end
unsigned int conflict_start = true_regnum (rld[i].reg_rtx);
unsigned int conflict_end
= (conflict_start
+ HARD_REGNO_NREGS (conflict_start, rld[i].mode));
@ -4212,8 +4232,9 @@ clear_reload_reg_in_use (regno, opnum, type, mode)
}
}
}
for (i = start_regno; i < end_regno; i++)
CLEAR_HARD_REG_BIT (*used_in_set, i);
for (r = start_regno; r < end_regno; r++)
CLEAR_HARD_REG_BIT (*used_in_set, r);
}
/* 1 if reg REGNO is free as a reload reg for a reload of the sort
@ -4221,7 +4242,7 @@ clear_reload_reg_in_use (regno, opnum, type, mode)
static int
reload_reg_free_p (regno, opnum, type)
int regno;
unsigned int regno;
int opnum;
enum reload_type type;
{
@ -4381,7 +4402,7 @@ reload_reg_free_p (regno, opnum, type)
static int
reload_reg_reaches_end_p (regno, opnum, type)
int regno;
unsigned int regno;
int opnum;
enum reload_type type;
{
@ -5101,7 +5122,7 @@ choose_reload_regs (chain)
{
rtx insn = chain->insn;
register int i, j;
int max_group_size = 1;
unsigned int max_group_size = 1;
enum reg_class group_class = NO_REGS;
int pass, win, inheritance;
@ -5124,7 +5145,8 @@ choose_reload_regs (chain)
if (rld[j].nregs > 1)
{
max_group_size = MAX (rld[j].nregs, max_group_size);
group_class = reg_class_superunion[(int)rld[j].class][(int)group_class];
group_class
= reg_class_superunion[(int)rld[j].class][(int)group_class];
}
save_reload_reg_rtx[j] = rld[j].reg_rtx;
@ -5146,11 +5168,11 @@ choose_reload_regs (chain)
/* Process the reloads in order of preference just found.
Beyond this point, subregs can be found in reload_reg_rtx.
This used to look for an existing reloaded home for all
of the reloads, and only then perform any new reloads.
But that could lose if the reloads were done out of reg-class order
because a later reload with a looser constraint might have an old
home in a register needed by an earlier reload with a tighter constraint.
This used to look for an existing reloaded home for all of the
reloads, and only then perform any new reloads. But that could lose
if the reloads were done out of reg-class order because a later
reload with a looser constraint might have an old home in a register
needed by an earlier reload with a tighter constraint.
To solve this, we make two passes over the reloads, in the order
described above. In the first pass we try to inherit a reload
@ -5873,6 +5895,7 @@ static HARD_REG_SET reg_reloaded_died;
/* Generate insns to perform reload RL, which is for the insn in CHAIN and
has the number J. OLD contains the value to be used as input. */
static void
emit_input_reload_insns (chain, rl, old, j)
struct insn_chain *chain;
@ -5957,7 +5980,7 @@ emit_input_reload_insns (chain, rl, old, j)
if (oldequiv)
{
int regno = true_regnum (oldequiv);
unsigned int regno = true_regnum (oldequiv);
/* Don't use OLDEQUIV if any other reload changes it at an
earlier stage of this insn or at this stage. */
@ -8784,6 +8807,7 @@ reload_combine_note_use (xp, insn)
reg_offset[n] / reg_base_reg[n] / reg_mode[n] are only valid if
reg_set_luid[n] is larger than last_label_luid[n] . */
static int reg_set_luid[FIRST_PSEUDO_REGISTER];
/* reg_offset[n] has to be CONST_INT for it and reg_base_reg[n] /
reg_mode[n] to be valid.
If reg_offset[n] is a CONST_INT and reg_base_reg[n] is negative, register n
@ -8794,12 +8818,14 @@ static int reg_set_luid[FIRST_PSEUDO_REGISTER];
static rtx reg_offset[FIRST_PSEUDO_REGISTER];
static int reg_base_reg[FIRST_PSEUDO_REGISTER];
static enum machine_mode reg_mode[FIRST_PSEUDO_REGISTER];
/* move2add_luid is linearly increased while scanning the instructions
from first to last. It is used to set reg_set_luid in
reload_cse_move2add and move2add_note_store. */
static int move2add_luid;
/* Generate a CONST_INT and force it in the range of MODE. */
static rtx
gen_mode_int (mode, value)
enum machine_mode mode;
@ -8900,7 +8926,7 @@ reload_cse_move2add (first)
...
(set (REGX) (plus (REGX) (CONST_INT B-A))) */
else if (GET_CODE (src) == REG
&& reg_base_reg[regno] == REGNO (src)
&& reg_base_reg[regno] == (int) REGNO (src)
&& reg_set_luid[regno] > reg_set_luid[REGNO (src)])
{
rtx next = next_nonnote_insn (insn);
@ -8985,20 +9011,22 @@ reload_cse_move2add (first)
/* SET is a SET or CLOBBER that sets DST.
Update reg_set_luid, reg_offset and reg_base_reg accordingly.
Called from reload_cse_move2add via note_stores. */
static void
move2add_note_store (dst, set, data)
rtx dst, set;
void *data ATTRIBUTE_UNUSED;
{
int regno = 0;
int i;
unsigned int regno = 0;
unsigned int i;
enum machine_mode mode = GET_MODE (dst);
if (GET_CODE (dst) == SUBREG)
{
regno = SUBREG_WORD (dst);
dst = SUBREG_REG (dst);
}
if (GET_CODE (dst) != REG)
return;
@ -9017,6 +9045,7 @@ move2add_note_store (dst, set, data)
case PLUS:
{
rtx src0 = XEXP (src, 0);
if (GET_CODE (src0) == REG)
{
if (REGNO (src0) != regno
@ -9025,9 +9054,11 @@ move2add_note_store (dst, set, data)
reg_base_reg[regno] = REGNO (src0);
reg_set_luid[regno] = move2add_luid;
}
reg_offset[regno] = XEXP (src, 1);
break;
}
reg_set_luid[regno] = move2add_luid;
reg_offset[regno] = set; /* Invalidate contents. */
break;
@ -9048,7 +9079,9 @@ move2add_note_store (dst, set, data)
}
else
{
for (i = regno + HARD_REGNO_NREGS (regno, mode) - 1; i >= regno; i--)
unsigned int endregno = regno + HARD_REGNO_NREGS (regno, mode);
for (i = regno; i < endregno; i++)
{
/* Indicate that this register has been recently written to,
but the exact contents are not available. */


@ -185,8 +185,9 @@ mark_referenced_resources (x, res, include_delayed_effects)
register struct resources *res;
register int include_delayed_effects;
{
register enum rtx_code code = GET_CODE (x);
register int i, j;
enum rtx_code code = GET_CODE (x);
int i, j;
unsigned int r;
register const char *format_ptr;
/* Handle leaf items for which we set resource flags. Also, special-case
@ -206,16 +207,18 @@ mark_referenced_resources (x, res, include_delayed_effects)
mark_referenced_resources (SUBREG_REG (x), res, 0);
else
{
int regno = REGNO (SUBREG_REG (x)) + SUBREG_WORD (x);
int last_regno = regno + HARD_REGNO_NREGS (regno, GET_MODE (x));
for (i = regno; i < last_regno; i++)
SET_HARD_REG_BIT (res->regs, i);
unsigned int regno = REGNO (SUBREG_REG (x)) + SUBREG_WORD (x);
unsigned int last_regno
= regno + HARD_REGNO_NREGS (regno, GET_MODE (x));
for (r = regno; r < last_regno; r++)
SET_HARD_REG_BIT (res->regs, r);
}
return;
case REG:
for (i = 0; i < HARD_REGNO_NREGS (REGNO (x), GET_MODE (x)); i++)
SET_HARD_REG_BIT (res->regs, REGNO (x) + i);
for (r = 0; r < HARD_REGNO_NREGS (REGNO (x), GET_MODE (x)); r++)
SET_HARD_REG_BIT (res->regs, REGNO (x) + r);
return;
case MEM:
@ -594,9 +597,10 @@ mark_set_resources (x, res, in_dest, include_delayed_effects)
int in_dest;
int include_delayed_effects;
{
register enum rtx_code code;
register int i, j;
register const char *format_ptr;
enum rtx_code code;
int i, j;
unsigned int r;
const char *format_ptr;
restart:
@ -634,9 +638,9 @@ mark_set_resources (x, res, in_dest, include_delayed_effects)
rtx link;
res->cc = res->memory = 1;
for (i = 0; i < FIRST_PSEUDO_REGISTER; i++)
if (call_used_regs[i] || global_regs[i])
SET_HARD_REG_BIT (res->regs, i);
for (r = 0; r < FIRST_PSEUDO_REGISTER; r++)
if (call_used_regs[r] || global_regs[r])
SET_HARD_REG_BIT (res->regs, r);
/* If X is part of a delay slot sequence, then NEXT should be
the first insn after the sequence. */
@ -731,18 +735,20 @@ mark_set_resources (x, res, in_dest, include_delayed_effects)
in_dest, include_delayed_effects);
else
{
int regno = REGNO (SUBREG_REG (x)) + SUBREG_WORD (x);
int last_regno = regno + HARD_REGNO_NREGS (regno, GET_MODE (x));
for (i = regno; i < last_regno; i++)
SET_HARD_REG_BIT (res->regs, i);
unsigned int regno = REGNO (SUBREG_REG (x)) + SUBREG_WORD (x);
unsigned int last_regno
= regno + HARD_REGNO_NREGS (regno, GET_MODE (x));
for (r = regno; r < last_regno; r++)
SET_HARD_REG_BIT (res->regs, r);
}
}
return;
case REG:
if (in_dest)
for (i = 0; i < HARD_REGNO_NREGS (REGNO (x), GET_MODE (x)); i++)
SET_HARD_REG_BIT (res->regs, REGNO (x) + i);
for (r = 0; r < HARD_REGNO_NREGS (REGNO (x), GET_MODE (x)); r++)
SET_HARD_REG_BIT (res->regs, REGNO (x) + r);
return;
case UNSPEC_VOLATILE:
@ -905,8 +911,8 @@ mark_target_live_regs (insns, target, res)
if (b != -1)
{
regset regs_live = BASIC_BLOCK (b)->global_live_at_start;
int j;
int regno;
unsigned int j;
unsigned int regno;
rtx start_insn, stop_insn;
/* Compute hard regs live at start of block -- this is the real hard regs
@ -918,12 +924,15 @@ mark_target_live_regs (insns, target, res)
EXECUTE_IF_SET_IN_REG_SET
(regs_live, FIRST_PSEUDO_REGISTER, i,
{
if ((regno = reg_renumber[i]) >= 0)
for (j = regno;
j < regno + HARD_REGNO_NREGS (regno,
PSEUDO_REGNO_MODE (i));
j++)
SET_HARD_REG_BIT (current_live_regs, j);
if (reg_renumber[i] >= 0)
{
regno = reg_renumber[i];
for (j = regno;
j < regno + HARD_REGNO_NREGS (regno,
PSEUDO_REGNO_MODE (i));
j++)
SET_HARD_REG_BIT (current_live_regs, j);
}
});
/* Get starting and ending insn, handling the case where each might


@ -136,7 +136,7 @@ const enum mode_class mode_class[(int) MAX_MACHINE_MODE] = {
#define DEF_MACHMODE(SYM, NAME, CLASS, SIZE, UNIT, WIDER) SIZE,
const int mode_size[(int) MAX_MACHINE_MODE] = {
const unsigned int mode_size[(int) MAX_MACHINE_MODE] = {
#include "machmode.def"
};
@ -147,7 +147,7 @@ const int mode_size[(int) MAX_MACHINE_MODE] = {
#define DEF_MACHMODE(SYM, NAME, CLASS, SIZE, UNIT, WIDER) UNIT,
const int mode_unit_size[(int) MAX_MACHINE_MODE] = {
const unsigned int mode_unit_size[(int) MAX_MACHINE_MODE] = {
#include "machmode.def" /* machine modes are documented here */
};


@ -87,6 +87,7 @@ typedef union rtunion_def
{
HOST_WIDE_INT rtwint;
int rtint;
unsigned int rtuint;
const char *rtstr;
struct rtx_def *rtx;
struct rtvec_def *rtvec;
@ -338,6 +339,7 @@ extern void rtvec_check_failed_bounds PARAMS ((rtvec, int,
#define XCWINT(RTX, N, C) (RTL_CHECKC1(RTX, N, C).rtwint)
#define XCINT(RTX, N, C) (RTL_CHECKC1(RTX, N, C).rtint)
#define XCUINT(RTX, N, C) (RTL_CHECKC1(RTX, N, C).rtuint)
#define XCSTR(RTX, N, C) (RTL_CHECKC1(RTX, N, C).rtstr)
#define XCEXP(RTX, N, C) (RTL_CHECKC1(RTX, N, C).rtx)
#define XCVEC(RTX, N, C) (RTL_CHECKC1(RTX, N, C).rtvec)
@ -613,7 +615,7 @@ extern const char * const note_insn_name[];
#define LABEL_ALTERNATE_NAME(RTX) XCSTR(RTX, 7, CODE_LABEL)
/* The original regno this ADDRESSOF was built for. */
#define ADDRESSOF_REGNO(RTX) XCINT(RTX, 1, ADDRESSOF)
#define ADDRESSOF_REGNO(RTX) XCUINT(RTX, 1, ADDRESSOF)
/* The variable in the register we took the address of. */
#define ADDRESSOF_DECL(RTX) XCTREE(RTX, 2, ADDRESSOF)
@ -642,7 +644,7 @@ extern const char * const note_insn_name[];
/* For a REG rtx, REGNO extracts the register number. */
#define REGNO(RTX) XCINT(RTX, 0, REG)
#define REGNO(RTX) XCUINT(RTX, 0, REG)
/* For a REG rtx, REG_FUNCTION_VALUE_P is nonzero if the reg
is the current function's return value. */
@ -660,7 +662,7 @@ extern const char * const note_insn_name[];
SUBREG_WORD extracts the word-number. */
#define SUBREG_REG(RTX) XCEXP(RTX, 0, SUBREG)
#define SUBREG_WORD(RTX) XCINT(RTX, 1, SUBREG)
#define SUBREG_WORD(RTX) XCUINT(RTX, 1, SUBREG)
/* 1 if the REG contained in SUBREG_REG is already known to be
sign- or zero-extended from the mode of the SUBREG to the mode of
@ -999,8 +1001,10 @@ extern rtx gen_lowpart_if_possible PARAMS ((enum machine_mode, rtx));
extern rtx gen_highpart PARAMS ((enum machine_mode, rtx));
extern rtx gen_realpart PARAMS ((enum machine_mode, rtx));
extern rtx gen_imagpart PARAMS ((enum machine_mode, rtx));
extern rtx operand_subword PARAMS ((rtx, int, int, enum machine_mode));
extern rtx operand_subword_force PARAMS ((rtx, int, enum machine_mode));
extern rtx operand_subword PARAMS ((rtx, unsigned int, int,
enum machine_mode));
extern rtx operand_subword_force PARAMS ((rtx, unsigned int,
enum machine_mode));
extern int subreg_lowpart_p PARAMS ((rtx));
extern rtx make_safe_from PARAMS ((rtx, rtx));
extern rtx convert_memory_address PARAMS ((enum machine_mode, rtx));
@ -1101,8 +1105,10 @@ extern rtx gen_bge PARAMS ((rtx));
extern rtx gen_ble PARAMS ((rtx));
extern rtx gen_mem_addressof PARAMS ((rtx, union tree_node *));
extern rtx eliminate_constant_term PARAMS ((rtx, rtx *));
extern rtx expand_complex_abs PARAMS ((enum machine_mode, rtx, rtx, int));
extern enum machine_mode choose_hard_reg_mode PARAMS ((int, int));
extern rtx expand_complex_abs PARAMS ((enum machine_mode, rtx, rtx,
int));
extern enum machine_mode choose_hard_reg_mode PARAMS ((unsigned int,
unsigned int));
extern void set_unique_reg_note PARAMS ((rtx, enum reg_note, rtx));
/* Functions in rtlanal.c */
@ -1126,16 +1132,21 @@ extern int reg_set_p PARAMS ((rtx, rtx));
extern rtx single_set PARAMS ((rtx));
extern int multiple_sets PARAMS ((rtx));
extern rtx find_last_value PARAMS ((rtx, rtx *, rtx, int));
extern int refers_to_regno_p PARAMS ((int, int, rtx, rtx *));
extern int refers_to_regno_p PARAMS ((unsigned int, unsigned int,
rtx, rtx *));
extern int reg_overlap_mentioned_p PARAMS ((rtx, rtx));
extern void note_stores PARAMS ((rtx, void (*)(rtx, rtx, void *), void *));
extern void note_stores PARAMS ((rtx,
void (*) (rtx, rtx, void *),
void *));
extern rtx reg_set_last PARAMS ((rtx, rtx));
extern int dead_or_set_p PARAMS ((rtx, rtx));
extern int dead_or_set_regno_p PARAMS ((rtx, int));
extern int dead_or_set_regno_p PARAMS ((rtx, unsigned int));
extern rtx find_reg_note PARAMS ((rtx, enum reg_note, rtx));
extern rtx find_regno_note PARAMS ((rtx, enum reg_note, int));
extern rtx find_regno_note PARAMS ((rtx, enum reg_note,
unsigned int));
extern int find_reg_fusage PARAMS ((rtx, enum rtx_code, rtx));
extern int find_regno_fusage PARAMS ((rtx, enum rtx_code, int));
extern int find_regno_fusage PARAMS ((rtx, enum rtx_code,
unsigned int));
extern void remove_note PARAMS ((rtx, rtx));
extern int side_effects_p PARAMS ((rtx));
extern int volatile_refs_p PARAMS ((rtx));
@ -1143,11 +1154,12 @@ extern int volatile_insn_p PARAMS ((rtx));
extern int may_trap_p PARAMS ((rtx));
extern int inequality_comparisons_p PARAMS ((rtx));
extern rtx replace_rtx PARAMS ((rtx, rtx, rtx));
extern rtx replace_regs PARAMS ((rtx, rtx *, int, int));
extern rtx replace_regs PARAMS ((rtx, rtx *, unsigned int,
int));
extern int computed_jump_p PARAMS ((rtx));
typedef int (*rtx_function) PARAMS ((rtx *, void *));
extern int for_each_rtx PARAMS ((rtx *, rtx_function, void *));
extern rtx regno_use_in PARAMS ((int, rtx));
extern rtx regno_use_in PARAMS ((unsigned int, rtx));
extern int auto_inc_p PARAMS ((rtx));
extern void remove_node_from_expr_list PARAMS ((rtx, rtx *));
extern int insns_safe_to_move_p PARAMS ((rtx, rtx, rtx *));
@ -1486,9 +1498,9 @@ extern void remove_unncessary_notes PARAMS ((void));
extern void add_clobbers PARAMS ((rtx, int));
/* In combine.c */
extern int combine_instructions PARAMS ((rtx, int));
extern int extended_count PARAMS ((rtx, enum machine_mode, int));
extern rtx remove_death PARAMS ((int, rtx));
extern int combine_instructions PARAMS ((rtx, unsigned int));
extern unsigned int extended_count PARAMS ((rtx, enum machine_mode, int));
extern rtx remove_death PARAMS ((unsigned int, rtx));
#ifdef BUFSIZ
extern void dump_combine_stats PARAMS ((FILE *));
extern void dump_combine_total_stats PARAMS ((FILE *));
@ -1585,8 +1597,8 @@ extern void init_reg_sets PARAMS ((void));
extern void regset_release_memory PARAMS ((void));
extern void regclass_init PARAMS ((void));
extern void regclass PARAMS ((rtx, int, FILE *));
extern void reg_scan PARAMS ((rtx, int, int));
extern void reg_scan_update PARAMS ((rtx, rtx, int));
extern void reg_scan PARAMS ((rtx, unsigned int, int));
extern void reg_scan_update PARAMS ((rtx, rtx, unsigned int));
extern void fix_register PARAMS ((const char *, int, int));
extern void delete_null_pointer_checks PARAMS ((rtx));


@ -824,13 +824,14 @@ find_last_value (x, pinsn, valid_to, allow_hwreg)
int
refers_to_regno_p (regno, endregno, x, loc)
int regno, endregno;
unsigned int regno, endregno;
rtx x;
rtx *loc;
{
register int i;
register RTX_CODE code;
register const char *fmt;
int i;
unsigned int x_regno;
RTX_CODE code;
const char *fmt;
repeat:
/* The contents of a REG_NONNEG note are always zero, so we must come here
@ -843,22 +844,22 @@ refers_to_regno_p (regno, endregno, x, loc)
switch (code)
{
case REG:
i = REGNO (x);
x_regno = REGNO (x);
/* If we are modifying the stack, frame, or argument pointer, it will
clobber a virtual register. In fact, we could be more precise,
but it isn't worth it. */
if ((i == STACK_POINTER_REGNUM
if ((x_regno == STACK_POINTER_REGNUM
#if FRAME_POINTER_REGNUM != ARG_POINTER_REGNUM
|| i == ARG_POINTER_REGNUM
|| x_regno == ARG_POINTER_REGNUM
#endif
|| i == FRAME_POINTER_REGNUM)
|| x_regno == FRAME_POINTER_REGNUM)
&& regno >= FIRST_VIRTUAL_REGISTER && regno <= LAST_VIRTUAL_REGISTER)
return 1;
return (endregno > i
&& regno < i + (i < FIRST_PSEUDO_REGISTER
? HARD_REGNO_NREGS (i, GET_MODE (x))
return (endregno > x_regno
&& regno < x_regno + (x_regno < FIRST_PSEUDO_REGISTER
? HARD_REGNO_NREGS (x_regno, GET_MODE (x))
: 1));
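The return expression above is a half-open interval overlap test: the ranges [regno, endregno) and [x_regno, x_regno + nregs) intersect exactly when each range starts before the other one ends. A minimal standalone sketch (the function name is illustrative, not GCC's):

```c
#include <assert.h>

/* Half-open register ranges [regno, endregno) and [x_regno, x_regno + nregs)
   overlap exactly when each range begins before the other one ends, which is
   the same test refers_to_regno_p performs.  */
static int
ranges_overlap (unsigned int regno, unsigned int endregno,
		unsigned int x_regno, unsigned int nregs)
{
  return endregno > x_regno && regno < x_regno + nregs;
}
```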
case SUBREG:
@ -867,8 +868,8 @@ refers_to_regno_p (regno, endregno, x, loc)
if (GET_CODE (SUBREG_REG (x)) == REG
&& REGNO (SUBREG_REG (x)) < FIRST_PSEUDO_REGISTER)
{
int inner_regno = REGNO (SUBREG_REG (x)) + SUBREG_WORD (x);
int inner_endregno
unsigned int inner_regno = REGNO (SUBREG_REG (x)) + SUBREG_WORD (x);
unsigned int inner_endregno
= inner_regno + (inner_regno < FIRST_PSEUDO_REGISTER
? HARD_REGNO_NREGS (regno, GET_MODE (x)) : 1);
@ -939,7 +940,7 @@ int
reg_overlap_mentioned_p (x, in)
rtx x, in;
{
int regno, endregno;
unsigned int regno, endregno;
/* Overly conservative. */
if (GET_CODE (x) == STRICT_LOW_PART)
@ -1000,7 +1001,7 @@ reg_overlap_mentioned_p (x, in)
static int reg_set_last_unknown;
static rtx reg_set_last_value;
static int reg_set_last_first_regno, reg_set_last_last_regno;
static unsigned int reg_set_last_first_regno, reg_set_last_last_regno;
/* Called via note_stores from reg_set_last. */
@ -1010,7 +1011,7 @@ reg_set_last_1 (x, pat, data)
rtx pat;
void *data ATTRIBUTE_UNUSED;
{
int first, last;
unsigned int first, last;
/* If X is not a register, or is not one in the range we care
about, ignore. */
@ -1149,6 +1150,7 @@ note_stores (x, fun, data)
&& GET_MODE (dest) == BLKmode)
{
register int i;
for (i = XVECLEN (dest, 0) - 1; i >= 0; i--)
(*fun) (SET_DEST (XVECEXP (dest, 0, i)), y, data);
}
@ -1181,8 +1183,8 @@ dead_or_set_p (insn, x)
rtx insn;
rtx x;
{
register int regno, last_regno;
register int i;
unsigned int regno, last_regno;
unsigned int i;
/* Can't use cc0_rtx below since this file is used by genattrtab.c. */
if (GET_CODE (x) == CC0)
@ -1208,9 +1210,9 @@ dead_or_set_p (insn, x)
int
dead_or_set_regno_p (insn, test_regno)
rtx insn;
int test_regno;
unsigned int test_regno;
{
int regno, endregno;
unsigned int regno, endregno;
rtx link;
/* See if there is a death note for something that includes
@ -1323,7 +1325,7 @@ rtx
find_regno_note (insn, kind, regno)
rtx insn;
enum reg_note kind;
int regno;
unsigned int regno;
{
register rtx link;
@ -1376,15 +1378,16 @@ find_reg_fusage (insn, code, datum)
}
else
{
register int regno = REGNO (datum);
unsigned int regno = REGNO (datum);
/* CALL_INSN_FUNCTION_USAGE information cannot contain references
to pseudo registers, so don't bother checking. */
if (regno < FIRST_PSEUDO_REGISTER)
{
int end_regno = regno + HARD_REGNO_NREGS (regno, GET_MODE (datum));
int i;
unsigned int end_regno
= regno + HARD_REGNO_NREGS (regno, GET_MODE (datum));
unsigned int i;
for (i = regno; i < end_regno; i++)
if (find_regno_fusage (insn, code, i))
@ -1402,7 +1405,7 @@ int
find_regno_fusage (insn, code, regno)
rtx insn;
enum rtx_code code;
int regno;
unsigned int regno;
{
register rtx link;
@ -1415,8 +1418,8 @@ find_regno_fusage (insn, code, regno)
for (link = CALL_INSN_FUNCTION_USAGE (insn); link; link = XEXP (link, 1))
{
register int regnote;
register rtx op, reg;
unsigned int regnote;
rtx op, reg;
if (GET_CODE (op = XEXP (link, 0)) == code
&& GET_CODE (reg = XEXP (op, 0)) == REG
@ -1889,7 +1892,7 @@ rtx
replace_regs (x, reg_map, nregs, replace_dest)
rtx x;
rtx *reg_map;
int nregs;
unsigned int nregs;
int replace_dest;
{
register enum rtx_code code;
@ -2165,7 +2168,7 @@ for_each_rtx (x, f, data)
rtx
regno_use_in (regno, x)
int regno;
unsigned int regno;
rtx x;
{
register const char *fmt;


@ -1358,7 +1358,6 @@ sdbout_parms (parms)
current_sym_value = 0;
if (GET_CODE (DECL_RTL (parms)) == REG
&& REGNO (DECL_RTL (parms)) >= 0
&& REGNO (DECL_RTL (parms)) < FIRST_PSEUDO_REGISTER)
type = DECL_ARG_TYPE (parms);
else
@ -1406,8 +1405,7 @@ sdbout_parms (parms)
pretend the parm was passed there. It would be more consistent
to describe the register where the parm was passed,
but in practice that register usually holds something else. */
if (REGNO (DECL_RTL (parms)) >= 0
&& REGNO (DECL_RTL (parms)) < FIRST_PSEUDO_REGISTER)
if (REGNO (DECL_RTL (parms)) < FIRST_PSEUDO_REGISTER)
best_rtl = DECL_RTL (parms);
/* If the parm lives nowhere,
use the register where it was passed. */
@ -1469,7 +1467,6 @@ sdbout_reg_parms (parms)
/* Report parms that live in registers during the function
but were passed in memory. */
if (GET_CODE (DECL_RTL (parms)) == REG
&& REGNO (DECL_RTL (parms)) >= 0
&& REGNO (DECL_RTL (parms)) < FIRST_PSEUDO_REGISTER
&& PARM_PASSED_IN_MEMORY (parms))
{

View File

@ -149,7 +149,7 @@ simplify_unary_operation (code, mode, op, op_mode)
rtx op;
enum machine_mode op_mode;
{
register int width = GET_MODE_BITSIZE (mode);
unsigned int width = GET_MODE_BITSIZE (mode);
/* The order of these tests is critical so that, for example, we don't
check the wrong mode (input vs. output) for a conversion operation,
@ -550,7 +550,7 @@ simplify_binary_operation (code, mode, op0, op1)
{
register HOST_WIDE_INT arg0, arg1, arg0s, arg1s;
HOST_WIDE_INT val;
int width = GET_MODE_BITSIZE (mode);
unsigned int width = GET_MODE_BITSIZE (mode);
rtx tem;
/* Relational operations don't work here. We must know the mode
@ -1975,16 +1975,20 @@ static int discard_useless_locs PARAMS ((void **, void *));
static int discard_useless_values PARAMS ((void **, void *));
static void remove_useless_values PARAMS ((void));
static unsigned int hash_rtx PARAMS ((rtx, enum machine_mode, int));
static cselib_val *new_cselib_val PARAMS ((unsigned int, enum machine_mode));
static void add_mem_for_addr PARAMS ((cselib_val *, cselib_val *, rtx));
static cselib_val *new_cselib_val PARAMS ((unsigned int,
enum machine_mode));
static void add_mem_for_addr PARAMS ((cselib_val *, cselib_val *,
rtx));
static cselib_val *cselib_lookup_mem PARAMS ((rtx, int));
static rtx cselib_subst_to_values PARAMS ((rtx));
static void cselib_invalidate_regno PARAMS ((int, enum machine_mode));
static void cselib_invalidate_regno PARAMS ((unsigned int,
enum machine_mode));
static int cselib_mem_conflict_p PARAMS ((rtx, rtx));
static int cselib_invalidate_mem_1 PARAMS ((void **, void *));
static void cselib_invalidate_mem PARAMS ((rtx));
static void cselib_invalidate_rtx PARAMS ((rtx, rtx, void *));
static void cselib_record_set PARAMS ((rtx, cselib_val *, cselib_val *));
static void cselib_record_set PARAMS ((rtx, cselib_val *,
cselib_val *));
static void cselib_record_sets PARAMS ((rtx));
/* There are three ways in which cselib can look up an rtx:
@ -2779,13 +2783,14 @@ cselib_lookup (x, mode, create)
is used to determine how many hard registers are being changed. If MODE
is VOIDmode, then only REGNO is being changed; this is used when
invalidating call clobbered registers across a call. */
static void
cselib_invalidate_regno (regno, mode)
int regno;
unsigned int regno;
enum machine_mode mode;
{
int endregno;
int i;
unsigned int endregno;
unsigned int i;
/* If we see pseudos after reload, something is _wrong_. */
if (reload_completed && regno >= FIRST_PSEUDO_REGISTER
@ -2810,15 +2815,17 @@ cselib_invalidate_regno (regno, mode)
{
cselib_val *v = (*l)->elt;
struct elt_loc_list **p;
int this_last = i;
unsigned int this_last = i;
if (i < FIRST_PSEUDO_REGISTER)
this_last += HARD_REGNO_NREGS (i, GET_MODE (v->u.val_rtx)) - 1;
if (this_last < regno)
{
l = &(*l)->next;
continue;
}
/* We have an overlap. */
unchain_one_elt_list (l);
@ -2827,6 +2834,7 @@ cselib_invalidate_regno (regno, mode)
for (p = &v->locs; ; p = &(*p)->next)
{
rtx x = (*p)->loc;
if (GET_CODE (x) == REG && REGNO (x) == i)
{
unchain_one_elt_loc_list (p);
@ -2986,12 +2994,13 @@ cselib_invalidate_rtx (dest, ignore, data)
/* Record the result of a SET instruction. DEST is being set; the source
contains the value described by SRC_ELT. If DEST is a MEM, DEST_ADDR_ELT
describes its address. */
static void
cselib_record_set (dest, src_elt, dest_addr_elt)
rtx dest;
cselib_val *src_elt, *dest_addr_elt;
{
int dreg = GET_CODE (dest) == REG ? REGNO (dest) : -1;
int dreg = GET_CODE (dest) == REG ? (int) REGNO (dest) : -1;
if (src_elt == 0 || side_effects_p (dest))
return;
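The `(int)` cast added to the `dreg` initializer above matters because of how the conditional operator balances its operands: with `REGNO` now returning `unsigned int`, the `-1` arm would otherwise be converted to unsigned. A sketch of the idiom (names illustrative, not GCC's):

```c
#include <assert.h>

/* In a ?: expression mixing unsigned int and int, the signed arm is
   converted to unsigned, so -1 would become UINT_MAX.  Casting the
   unsigned arm to int instead keeps -1 usable as a "no register"
   sentinel, as in the cselib_record_set change.  */
static int
reg_or_sentinel (int is_reg, unsigned int regno)
{
  return is_reg ? (int) regno : -1;
}
```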


@ -1,22 +1,22 @@
/* Static Single Assignment conversion routines for the GNU compiler.
Copyright (C) 2000 Free Software Foundation, Inc.
This file is part of GNU CC.
This file is part of GNU CC.
GNU CC is free software; you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation; either version 2, or (at your option)
any later version.
GNU CC is free software; you can redistribute it and/or modify it
under the terms of the GNU General Public License as published by the
Free Software Foundation; either version 2, or (at your option) any
later version.
GNU CC is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
GNU CC is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License
for more details.
You should have received a copy of the GNU General Public License
along with GNU CC; see the file COPYING. If not, write to
the Free Software Foundation, 59 Temple Place - Suite 330,
Boston, MA 02111-1307, USA. */
You should have received a copy of the GNU General Public License
along with GNU CC; see the file COPYING. If not, write to the Free
Software Foundation, 59 Temple Place - Suite 330, Boston, MA
02111-1307, USA. */
/* References:
@ -73,7 +73,7 @@ varray_type ssa_rename_from;
static rtx *ssa_rename_to;
/* The number of registers that were live on entry to the SSA routines. */
static int ssa_max_reg_num;
static unsigned int ssa_max_reg_num;
/* Local function prototypes. */
@ -689,7 +689,7 @@ rename_block (bb, idom)
while (PHI_NODE_P (insn))
{
rtx phi = PATTERN (insn);
int regno;
unsigned int regno;
rtx reg;
/* Find out which of our outgoing registers this node is


@ -2952,12 +2952,14 @@ expand_return (retval)
&& TYPE_MODE (TREE_TYPE (retval_rhs)) == BLKmode
&& GET_CODE (result_rtl) == REG)
{
int i, bitpos, xbitpos;
int big_endian_correction = 0;
int bytes = int_size_in_bytes (TREE_TYPE (retval_rhs));
int i;
unsigned HOST_WIDE_INT bitpos, xbitpos;
unsigned HOST_WIDE_INT big_endian_correction = 0;
unsigned HOST_WIDE_INT bytes
= int_size_in_bytes (TREE_TYPE (retval_rhs));
int n_regs = (bytes + UNITS_PER_WORD - 1) / UNITS_PER_WORD;
int bitsize = MIN (TYPE_ALIGN (TREE_TYPE (retval_rhs)),
(unsigned int)BITS_PER_WORD);
unsigned int bitsize
= MIN (TYPE_ALIGN (TREE_TYPE (retval_rhs)), BITS_PER_WORD);
rtx *result_pseudos = (rtx *) alloca (sizeof (rtx) * n_regs);
rtx result_reg, src = NULL_RTX, dst = NULL_RTX;
rtx result_val = expand_expr (retval_rhs, NULL_RTX, VOIDmode, 0);
@ -4905,8 +4907,8 @@ add_case_node (low, high, label, duplicate)
/* Returns the number of possible values of TYPE.
Returns -1 if the number is unknown or variable.
Returns -2 if the number does not fit in a HOST_WIDE_INT.
Returns -1 if the number is unknown, variable, or if the number does not
fit in a HOST_WIDE_INT.
Sets *SPARENESS to 2 if TYPE is an ENUMERAL_TYPE whose values
do not increase monotonically (there may be duplicates);
to 1 if the values increase monotonically, but not always by 1;
@ -4917,73 +4919,60 @@ all_cases_count (type, spareness)
tree type;
int *spareness;
{
HOST_WIDE_INT count;
tree t;
HOST_WIDE_INT count, minval, lastval;
*spareness = 0;
switch (TREE_CODE (type))
{
tree t;
case BOOLEAN_TYPE:
count = 2;
break;
case CHAR_TYPE:
count = 1 << BITS_PER_UNIT;
break;
default:
case INTEGER_TYPE:
if (TREE_CODE (TYPE_MIN_VALUE (type)) != INTEGER_CST
|| TYPE_MAX_VALUE (type) == NULL
|| TREE_CODE (TYPE_MAX_VALUE (type)) != INTEGER_CST)
return -1;
if (TYPE_MAX_VALUE (type) != 0
&& 0 != (t = fold (build (MINUS_EXPR, type, TYPE_MAX_VALUE (type),
TYPE_MIN_VALUE (type))))
&& 0 != (t = fold (build (PLUS_EXPR, type, t,
convert (type, integer_zero_node))))
&& host_integerp (t, 1))
count = tree_low_cst (t, 1);
else
{
/* count
= TREE_INT_CST_LOW (TYPE_MAX_VALUE (type))
- TREE_INT_CST_LOW (TYPE_MIN_VALUE (type)) + 1
but with overflow checking. */
tree mint = TYPE_MIN_VALUE (type);
tree maxt = TYPE_MAX_VALUE (type);
HOST_WIDE_INT lo, hi;
neg_double(TREE_INT_CST_LOW (mint), TREE_INT_CST_HIGH (mint),
&lo, &hi);
add_double(TREE_INT_CST_LOW (maxt), TREE_INT_CST_HIGH (maxt),
lo, hi, &lo, &hi);
add_double (lo, hi, 1, 0, &lo, &hi);
if (hi != 0 || lo < 0)
return -2;
count = lo;
}
return -1;
break;
case ENUMERAL_TYPE:
/* Don't waste time with enumeral types with huge values. */
if (! host_integerp (TYPE_MIN_VALUE (type), 0)
|| TYPE_MAX_VALUE (type) == 0
|| ! host_integerp (TYPE_MAX_VALUE (type), 0))
return -1;
lastval = minval = tree_low_cst (TYPE_MIN_VALUE (type), 0);
count = 0;
for (t = TYPE_VALUES (type); t != NULL_TREE; t = TREE_CHAIN (t))
{
if (TREE_CODE (TYPE_MIN_VALUE (type)) != INTEGER_CST
|| TREE_CODE (TREE_VALUE (t)) != INTEGER_CST
|| (TREE_INT_CST_LOW (TYPE_MIN_VALUE (type)) + count
!= TREE_INT_CST_LOW (TREE_VALUE (t))))
HOST_WIDE_INT thisval = tree_low_cst (TREE_VALUE (t), 0);
if (*spareness == 2 || thisval < lastval)
*spareness = 2;
else if (thisval != minval + count)
*spareness = 1;
count++;
}
if (*spareness == 1)
{
tree prev = TREE_VALUE (TYPE_VALUES (type));
for (t = TYPE_VALUES (type); t = TREE_CHAIN (t), t != NULL_TREE; )
{
if (! tree_int_cst_lt (prev, TREE_VALUE (t)))
{
*spareness = 2;
break;
}
prev = TREE_VALUE (t);
}
}
}
return count;
}
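The rewritten ENUMERAL_TYPE loop above classifies the value list in a single pass instead of the old two-loop scheme. A sketch of that classification over a plain array (names are illustrative, and plain longs stand in for tree constants):

```c
#include <assert.h>

/* Classify an enum's value list the way the rewritten loop in
   all_cases_count does: 0 = values are consecutive from the minimum,
   1 = increasing but with gaps, 2 = not monotonically increasing
   (duplicates or reordering).  */
static int
spareness_of (const long *vals, int n)
{
  int spareness = 0;
  long minval, lastval;
  int i;

  if (n == 0)
    return 0;

  minval = lastval = vals[0];
  for (i = 0; i < n; i++)
    {
      if (spareness == 2 || vals[i] < lastval)
	spareness = 2;
      else if (vals[i] != minval + i)
	spareness = 1;
      lastval = vals[i];
    }

  return spareness;
}
```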
#define BITARRAY_TEST(ARRAY, INDEX) \
((ARRAY)[(unsigned) (INDEX) / HOST_BITS_PER_CHAR]\
& (1 << ((unsigned) (INDEX) % HOST_BITS_PER_CHAR)))
@ -5003,21 +4992,22 @@ void
mark_seen_cases (type, cases_seen, count, sparseness)
tree type;
unsigned char *cases_seen;
long count;
HOST_WIDE_INT count;
int sparseness;
{
tree next_node_to_try = NULL_TREE;
long next_node_offset = 0;
HOST_WIDE_INT next_node_offset = 0;
register struct case_node *n, *root = case_stack->data.case_stmt.case_list;
tree val = make_node (INTEGER_CST);
TREE_TYPE (val) = type;
if (! root)
; /* Do nothing */
else if (sparseness == 2)
{
tree t;
HOST_WIDE_INT xlo;
unsigned HOST_WIDE_INT xlo;
/* This less efficient loop is only needed to handle
duplicate case values (multiple enum constants
@ -5053,6 +5043,7 @@ mark_seen_cases (type, cases_seen, count, sparseness)
{
if (root->left)
case_stack->data.case_stmt.case_list = root = case_tree2list (root, 0);
for (n = root; n; n = n->right)
{
TREE_INT_CST_LOW (val) = TREE_INT_CST_LOW (n->low);
@ -5063,8 +5054,10 @@ mark_seen_cases (type, cases_seen, count, sparseness)
The element with lowest value has offset 0, the next smallest
element has offset 1, etc. */
HOST_WIDE_INT xlo, xhi;
unsigned HOST_WIDE_INT xlo;
HOST_WIDE_INT xhi;
tree t;
if (sparseness && TYPE_VALUES (type) != NULL_TREE)
{
/* The TYPE_VALUES will be in increasing order, so
@ -5107,8 +5100,9 @@ mark_seen_cases (type, cases_seen, count, sparseness)
&xlo, &xhi);
}
if (xhi == 0 && xlo >= 0 && xlo < count)
if (xhi == 0 && xlo < (unsigned HOST_WIDE_INT) count)
BITARRAY_SET (cases_seen, xlo);
add_double (TREE_INT_CST_LOW (val), TREE_INT_CST_HIGH (val),
1, 0,
&TREE_INT_CST_LOW (val), &TREE_INT_CST_HIGH (val));
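Making `xlo` unsigned is what lets the single comparison `xlo < (unsigned HOST_WIDE_INT) count` replace the old pair `xlo >= 0 && xlo < count`: a negative value reinterpreted as unsigned wraps to a huge number and fails the upper bound. A standalone sketch of the idiom:

```c
#include <assert.h>

/* One unsigned comparison subsumes "idx >= 0 && idx < count": negative
   indices wrap to very large unsigned values and fail the upper bound.
   This mirrors the bounds test in mark_seen_cases.  */
static int
in_range (long idx, long count)
{
  return (unsigned long) idx < (unsigned long) count;
}
```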
@ -5150,7 +5144,7 @@ check_for_full_enumeration_handling (type)
unsigned char *cases_seen;
/* The allocated size of cases_seen, in chars. */
long bytes_needed;
HOST_WIDE_INT bytes_needed;
if (! warn_switch)
return;
@ -5164,7 +5158,7 @@ check_for_full_enumeration_handling (type)
aborting, as xmalloc would do. */
&& (cases_seen = (unsigned char *) calloc (bytes_needed, 1)) != NULL)
{
long i;
HOST_WIDE_INT i;
tree v = TYPE_VALUES (type);
/* The time complexity of this code is normally O(N), where
@ -5174,12 +5168,10 @@ check_for_full_enumeration_handling (type)
mark_seen_cases (type, cases_seen, size, sparseness);
for (i = 0; v != NULL_TREE && i < size; i++, v = TREE_CHAIN (v))
{
if (BITARRAY_TEST(cases_seen, i) == 0)
warning ("enumeration value `%s' not handled in switch",
IDENTIFIER_POINTER (TREE_PURPOSE (v)));
}
for (i = 0; v != NULL_TREE && i < size; i++, v = TREE_CHAIN (v))
if (BITARRAY_TEST(cases_seen, i) == 0)
warning ("enumeration value `%s' not handled in switch",
IDENTIFIER_POINTER (TREE_PURPOSE (v)));
free (cases_seen);
}



@ -2321,7 +2321,11 @@ tree
bit_position (field)
tree field;
{
return DECL_FIELD_BITPOS (field);
return size_binop (PLUS_EXPR, DECL_FIELD_BIT_OFFSET (field),
size_binop (MULT_EXPR,
convert (bitsizetype,
DECL_FIELD_OFFSET (field)),
bitsize_unit_node));
}
/* Likewise, but return as an integer. Abort if it cannot be represented
@ -2335,6 +2339,31 @@ int_bit_position (field)
return tree_low_cst (bit_position (field), 0);
}
/* Return the byte position of FIELD, in bytes from the start of the record.
This is a tree of type sizetype. */
tree
byte_position (field)
tree field;
{
return size_binop (PLUS_EXPR, DECL_FIELD_OFFSET (field),
convert (sizetype,
size_binop (FLOOR_DIV_EXPR,
DECL_FIELD_BIT_OFFSET (field),
bitsize_unit_node)));
}
/* Likewise, but return as an integer. Abort if it cannot be represented
in that way (since it could be a signed value, we don't have the option
of returning -1 like int_size_in_bytes can). */
HOST_WIDE_INT
int_byte_position (field)
tree field;
{
return tree_low_cst (byte_position (field), 0);
}
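The new `bit_position` and `byte_position` above recombine the two halves of a field's position: a byte offset (`DECL_FIELD_OFFSET`) plus a bit offset relative to it (`DECL_FIELD_BIT_OFFSET`). A sketch of that arithmetic with plain integers standing in for the tree constants, assuming 8-bit units:

```c
#include <assert.h>

#define BITS_PER_UNIT 8

/* Total bit position: the bit offset plus the byte offset scaled to
   bits, as bit_position computes via size_binop.  */
static long
field_bit_position (long byte_offset, long bit_offset)
{
  return bit_offset + byte_offset * BITS_PER_UNIT;
}

/* Byte position: the byte offset plus however many whole bytes the bit
   offset contributes, as byte_position computes with FLOOR_DIV_EXPR.  */
static long
field_byte_position (long byte_offset, long bit_offset)
{
  return byte_offset + bit_offset / BITS_PER_UNIT;
}
```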
/* Return the strictest alignment, in bits, that T is known to have. */
unsigned int
@ -4091,8 +4120,8 @@ type_hash_canon (hashcode, type)
obstack_free (TYPE_OBSTACK (type), type);
#ifdef GATHER_STATISTICS
tree_node_counts[(int)t_kind]--;
tree_node_sizes[(int)t_kind] -= sizeof (struct tree_type);
tree_node_counts[(int) t_kind]--;
tree_node_sizes[(int) t_kind] -= sizeof (struct tree_type);
#endif
return t1;
}
@ -4112,7 +4141,9 @@ mark_hash_entry (entry, param)
void *param ATTRIBUTE_UNUSED;
{
struct type_hash *p = *(struct type_hash **)entry;
ggc_mark_tree (p->type);
/* Continue scan. */
return 1;
}
@ -4124,14 +4155,16 @@ mark_type_hash (arg)
void *arg;
{
htab_t t = *(htab_t *) arg;
htab_traverse (t, mark_hash_entry, 0);
}
static void
print_type_hash_statistics ()
{
fprintf (stderr, "Type hash: size %d, %d elements, %f collisions\n",
htab_size (type_hash_table), htab_elements (type_hash_table),
fprintf (stderr, "Type hash: size %ld, %ld elements, %f collisions\n",
(long) htab_size (type_hash_table),
(long) htab_elements (type_hash_table),
htab_collisions (type_hash_table));
}
@ -4594,6 +4627,7 @@ build_index_type (maxval)
{
register tree itype = make_node (INTEGER_TYPE);
TREE_TYPE (itype) = sizetype;
TYPE_PRECISION (itype) = TYPE_PRECISION (sizetype);
TYPE_MIN_VALUE (itype) = size_zero_node;
@ -4605,20 +4639,9 @@ build_index_type (maxval)
TYPE_SIZE (itype) = TYPE_SIZE (sizetype);
TYPE_SIZE_UNIT (itype) = TYPE_SIZE_UNIT (sizetype);
TYPE_ALIGN (itype) = TYPE_ALIGN (sizetype);
if (TREE_CODE (maxval) == INTEGER_CST)
{
int maxint = TREE_INT_CST_LOW (maxval);
/* If the domain should be empty, make sure the maxval
remains -1 and is not spoiled by truncation. */
if (tree_int_cst_sgn (maxval) < 0)
{
TYPE_MAX_VALUE (itype) = build_int_2 (-1, -1);
TREE_TYPE (TYPE_MAX_VALUE (itype)) = sizetype;
}
return type_hash_canon (maxint < 0 ? ~maxint : maxint, itype);
}
if (host_integerp (maxval, 1))
return type_hash_canon (tree_low_cst (maxval, 1), itype);
else
return itype;
}
@ -4648,21 +4671,11 @@ build_range_type (type, lowval, highval)
TYPE_SIZE (itype) = TYPE_SIZE (type);
TYPE_SIZE_UNIT (itype) = TYPE_SIZE_UNIT (type);
TYPE_ALIGN (itype) = TYPE_ALIGN (type);
if (TREE_CODE (lowval) == INTEGER_CST)
{
HOST_WIDE_INT lowint, highint;
int maxint;
lowint = TREE_INT_CST_LOW (lowval);
if (highval && TREE_CODE (highval) == INTEGER_CST)
highint = TREE_INT_CST_LOW (highval);
else
highint = (~(unsigned HOST_WIDE_INT) 0) >> 1;
maxint = (int) (highint - lowint);
return type_hash_canon (maxint < 0 ? ~maxint : maxint, itype);
}
if (host_integerp (lowval, 0) && highval != 0 && host_integerp (highval, 0))
return type_hash_canon (tree_low_cst (highval, 0)
- tree_low_cst (lowval, 0),
itype);
else
return itype;
}
@ -4674,7 +4687,7 @@ tree
build_index_2_type (lowval,highval)
tree lowval, highval;
{
return build_range_type (NULL_TREE, lowval, highval);
return build_range_type (sizetype, lowval, highval);
}
/* Return nonzero iff ITYPE1 and ITYPE2 are equal (in the LISP sense).
@ -5700,10 +5713,11 @@ build_common_tree_nodes_2 (short_double)
integer_one_node = build_int_2 (1, 0);
TREE_TYPE (integer_one_node) = integer_type_node;
size_zero_node = build_int_2 (0, 0);
TREE_TYPE (size_zero_node) = sizetype;
size_one_node = build_int_2 (1, 0);
TREE_TYPE (size_one_node) = sizetype;
size_zero_node = size_int (0);
size_one_node = size_int (1);
bitsize_zero_node = bitsize_int (0);
bitsize_one_node = bitsize_int (1);
bitsize_unit_node = bitsize_int (BITS_PER_UNIT);
void_type_node = make_node (VOID_TYPE);
layout_type (void_type_node);


@ -1063,27 +1063,30 @@ struct tree_type
containing function, the RECORD_TYPE or UNION_TYPE for the containing
type, or NULL_TREE if the given decl has "file scope". */
#define DECL_CONTEXT(NODE) (DECL_CHECK (NODE)->decl.context)
#define DECL_FIELD_CONTEXT(NODE) (DECL_CHECK (NODE)->decl.context)
#define DECL_FIELD_CONTEXT(NODE) (FIELD_DECL_CHECK (NODE)->decl.context)
/* In a DECL this is the field where configuration dependent machine
attributes are stored. */
#define DECL_MACHINE_ATTRIBUTES(NODE) (DECL_CHECK (NODE)->decl.machine_attributes)
/* In a FIELD_DECL, this is the field position, counting in bits,
of the bit closest to the beginning of the structure. */
#define DECL_FIELD_BITPOS(NODE) (DECL_CHECK (NODE)->decl.arguments)
/* In a FIELD_DECL, this is the field position, counting in bytes, of the
byte containing the bit closest to the beginning of the structure. */
#define DECL_FIELD_OFFSET(NODE) (FIELD_DECL_CHECK (NODE)->decl.arguments)
/* In a FIELD_DECL, this is the offset, in bits, of the first bit of the
field from DECL_FIELD_OFFSET. */
#define DECL_FIELD_BIT_OFFSET(NODE) (FIELD_DECL_CHECK (NODE)->decl.u2.t)
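The old `DECL_FIELD_BITPOS` held one total bit count; the new scheme splits it into the two fields documented above. A sketch of that decomposition with plain integers (in GCC the byte offset is additionally kept aligned per `DECL_OFFSET_ALIGN`, which this simple split ignores):

```c
#include <assert.h>

#define BITS_PER_UNIT 8

/* Split a total bit position into a byte offset (DECL_FIELD_OFFSET) and
   a residual bit offset (DECL_FIELD_BIT_OFFSET), assuming 8-bit units.  */
static void
split_bit_position (long bitpos, long *byte_offset, long *bit_offset)
{
  *byte_offset = bitpos / BITS_PER_UNIT;
  *bit_offset = bitpos % BITS_PER_UNIT;
}
```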
/* In a FIELD_DECL, this indicates whether the field was a bit-field and
if so, the type that was originally specified for it.
TREE_TYPE may have been modified (in finish_struct). */
#define DECL_BIT_FIELD_TYPE(NODE) (DECL_CHECK (NODE)->decl.result)
#define DECL_BIT_FIELD_TYPE(NODE) (FIELD_DECL_CHECK (NODE)->decl.result)
/* In FUNCTION_DECL, a chain of ..._DECL nodes. */
/* VAR_DECL and PARM_DECL reserve the arguments slot
for language-specific uses. */
#define DECL_ARGUMENTS(NODE) (DECL_CHECK (NODE)->decl.arguments)
/* In FUNCTION_DECL, holds the decl for the return value. */
#define DECL_RESULT(NODE) (DECL_CHECK (NODE)->decl.result)
#define DECL_RESULT(NODE) (FUNCTION_DECL_CHECK (NODE)->decl.result)
/* For a TYPE_DECL, holds the "original" type. (TREE_TYPE has the copy.) */
#define DECL_ORIGINAL_TYPE(NODE) (DECL_CHECK (NODE)->decl.result)
#define DECL_ORIGINAL_TYPE(NODE) (TYPE_DECL_CHECK (NODE)->decl.result)
/* In PARM_DECL, holds the type as written (perhaps a function or array). */
#define DECL_ARG_TYPE_AS_WRITTEN(NODE) (DECL_CHECK (NODE)->decl.result)
#define DECL_ARG_TYPE_AS_WRITTEN(NODE) (PARM_DECL_CHECK (NODE)->decl.result)
/* For a FUNCTION_DECL, holds the tree of BINDINGs.
For a VAR_DECL, holds the initial value.
For a PARM_DECL, not used--default
@ -1092,10 +1095,10 @@ struct tree_type
#define DECL_INITIAL(NODE) (DECL_CHECK (NODE)->decl.initial)
/* For a PARM_DECL, records the data type used to pass the argument,
which may be different from the type seen in the program. */
#define DECL_ARG_TYPE(NODE) (DECL_CHECK (NODE)->decl.initial)
#define DECL_ARG_TYPE(NODE) (PARM_DECL_CHECK (NODE)->decl.initial)
/* For a FIELD_DECL in a QUAL_UNION_TYPE, records the expression, which
if nonzero, indicates that the field occupies the type. */
#define DECL_QUALIFIER(NODE) (DECL_CHECK (NODE)->decl.initial)
#define DECL_QUALIFIER(NODE) (FIELD_DECL_CHECK (NODE)->decl.initial)
/* These two fields describe where in the source code the declaration was. */
#define DECL_SOURCE_FILE(NODE) (DECL_CHECK (NODE)->decl.filename)
#define DECL_SOURCE_LINE(NODE) (DECL_CHECK (NODE)->decl.linenum)
@ -1105,7 +1108,9 @@ struct tree_type
/* Likewise for the size in bytes. */
#define DECL_SIZE_UNIT(NODE) (DECL_CHECK (NODE)->decl.size_unit)
/* Holds the alignment required for the datum. */
#define DECL_ALIGN(NODE) (DECL_CHECK (NODE)->decl.u1.u)
#define DECL_ALIGN(NODE) (DECL_CHECK (NODE)->decl.u1.a.align)
/* For FIELD_DECLs, holds the alignment that DECL_FIELD_OFFSET has. */
#define DECL_OFFSET_ALIGN(NODE) (FIELD_DECL_CHECK (NODE)->decl.u1.a.off_align)
/* Holds the machine mode corresponding to the declaration of a variable or
field. Always equal to TYPE_MODE (TREE_TYPE (decl)) except for a
FIELD_DECL. */
@ -1121,15 +1126,15 @@ struct tree_type
#define DECL_LIVE_RANGE_RTL(NODE) (DECL_CHECK (NODE)->decl.live_range_rtl)
/* For PARM_DECL, holds an RTL for the stack slot or register
where the data was actually passed. */
#define DECL_INCOMING_RTL(NODE) (DECL_CHECK (NODE)->decl.u2.r)
#define DECL_INCOMING_RTL(NODE) (PARM_DECL_CHECK (NODE)->decl.u2.r)
/* For FUNCTION_DECL, if it is inline, holds the saved insn chain. */
#define DECL_SAVED_INSNS(NODE) (DECL_CHECK (NODE)->decl.u2.f)
#define DECL_SAVED_INSNS(NODE) (FUNCTION_DECL_CHECK (NODE)->decl.u2.f)
/* For FUNCTION_DECL, if it is inline,
holds the size of the stack frame, as an integer. */
#define DECL_FRAME_SIZE(NODE) (DECL_CHECK (NODE)->decl.u1.i)
#define DECL_FRAME_SIZE(NODE) (FUNCTION_DECL_CHECK (NODE)->decl.u1.i)
/* For FUNCTION_DECL, if it is built-in,
this identifies which built-in operation it is. */
#define DECL_FUNCTION_CODE(NODE) (DECL_CHECK (NODE)->decl.u1.f)
#define DECL_FUNCTION_CODE(NODE) (FUNCTION_DECL_CHECK (NODE)->decl.u1.f)
/* The DECL_VINDEX is used for FUNCTION_DECLS in two different ways.
Before the struct containing the FUNCTION_DECL is laid out,
@ -1142,7 +1147,7 @@ struct tree_type
/* For FIELD_DECLS, DECL_FCONTEXT is the *first* baseclass in
which this FIELD_DECL is defined. This information is needed when
writing debugging information about vfield and vbase decls for C++. */
#define DECL_FCONTEXT(NODE) (DECL_CHECK (NODE)->decl.vindex)
#define DECL_FCONTEXT(NODE) (FIELD_DECL_CHECK (NODE)->decl.vindex)
/* Every ..._DECL node gets a unique number. */
#define DECL_UID(NODE) (DECL_CHECK (NODE)->decl.uid)
@ -1206,19 +1211,20 @@ struct tree_type
nonzero means the detail info about this type is not dumped into stabs.
Instead it will generate cross reference ('x') of names.
This uses the same flag as DECL_EXTERNAL. */
#define TYPE_DECL_SUPPRESS_DEBUG(NODE) (DECL_CHECK (NODE)->decl.external_flag)
#define TYPE_DECL_SUPPRESS_DEBUG(NODE) \
(TYPE_DECL_CHECK (NODE)->decl.external_flag)
/* In VAR_DECL and PARM_DECL nodes, nonzero means declared `register'. */
#define DECL_REGISTER(NODE) (DECL_CHECK (NODE)->decl.regdecl_flag)
/* In LABEL_DECL nodes, nonzero means that an error message about
jumping into such a binding contour has been printed for this label. */
#define DECL_ERROR_ISSUED(NODE) (DECL_CHECK (NODE)->decl.regdecl_flag)
#define DECL_ERROR_ISSUED(NODE) (LABEL_DECL_CHECK (NODE)->decl.regdecl_flag)
/* In a FIELD_DECL, indicates this field should be bit-packed. */
#define DECL_PACKED(NODE) (DECL_CHECK (NODE)->decl.regdecl_flag)
#define DECL_PACKED(NODE) (FIELD_DECL_CHECK (NODE)->decl.regdecl_flag)
/* In a FUNCTION_DECL with a non-zero DECL_CONTEXT, indicates that a
static chain is not needed. */
#define DECL_NO_STATIC_CHAIN(NODE) (DECL_CHECK (NODE)->decl.regdecl_flag)
#define DECL_NO_STATIC_CHAIN(NODE) \
(FUNCTION_DECL_CHECK (NODE)->decl.regdecl_flag)
/* Nonzero in a ..._DECL means this variable is ref'd from a nested function.
For VAR_DECL nodes, PARM_DECL nodes, and FUNCTION_DECL nodes.
@ -1231,35 +1237,37 @@ struct tree_type
/* Nonzero in a FUNCTION_DECL means this function can be substituted
where it is called. */
#define DECL_INLINE(NODE) (DECL_CHECK (NODE)->decl.inline_flag)
#define DECL_INLINE(NODE) (FUNCTION_DECL_CHECK (NODE)->decl.inline_flag)
/* Nonzero in a FUNCTION_DECL means this is a built-in function
that is not specified by ansi C and that users are supposed to be allowed
to redefine for any purpose whatever. */
#define DECL_BUILT_IN_NONANSI(NODE) ((NODE)->common.unsigned_flag)
#define DECL_BUILT_IN_NONANSI(NODE) \
(FUNCTION_DECL_CHECK (NODE)->common.unsigned_flag)
/* Nonzero in a FUNCTION_DECL means this function should be treated
as if it were a malloc, meaning it returns a pointer that is
not an alias. */
#define DECL_IS_MALLOC(NODE) (DECL_CHECK (NODE)->decl.malloc_flag)
#define DECL_IS_MALLOC(NODE) (FUNCTION_DECL_CHECK (NODE)->decl.malloc_flag)
/* Nonzero in a FIELD_DECL means it is a bit field, and must be accessed
specially. */
#define DECL_BIT_FIELD(NODE) (DECL_CHECK (NODE)->decl.bit_field_flag)
#define DECL_BIT_FIELD(NODE) (FIELD_DECL_CHECK (NODE)->decl.bit_field_flag)
/* In a LABEL_DECL, nonzero means label was defined inside a binding
contour that restored a stack level and which is now exited. */
#define DECL_TOO_LATE(NODE) (DECL_CHECK (NODE)->decl.bit_field_flag)
#define DECL_TOO_LATE(NODE) (LABEL_DECL_CHECK (NODE)->decl.bit_field_flag)
/* Unused in FUNCTION_DECL. */
/* In a VAR_DECL that's static,
nonzero if the space is in the text section. */
#define DECL_IN_TEXT_SECTION(NODE) (DECL_CHECK (NODE)->decl.bit_field_flag)
#define DECL_IN_TEXT_SECTION(NODE) (VAR_DECL_CHECK (NODE)->decl.bit_field_flag)
/* In a FUNCTION_DECL, nonzero means a built in function. */
#define DECL_BUILT_IN(NODE) (DECL_BUILT_IN_CLASS (NODE) != NOT_BUILT_IN)
/* For a builtin function, identify which part of the compiler defined it. */
#define DECL_BUILT_IN_CLASS(NODE) (DECL_CHECK (NODE)->decl.built_in_class)
#define DECL_BUILT_IN_CLASS(NODE) \
(FUNCTION_DECL_CHECK (NODE)->decl.built_in_class)
/* Used in VAR_DECLs to indicate that the variable is a vtable.
Used in FIELD_DECLs for vtable pointers.
@@ -1273,12 +1281,16 @@ struct tree_type
/* Used in PARM_DECLs whose type are unions to indicate that the
argument should be passed in the same way that the first union
alternative would be passed. */
#define DECL_TRANSPARENT_UNION(NODE) (DECL_CHECK (NODE)->decl.transparent_union)
#define DECL_TRANSPARENT_UNION(NODE) \
(PARM_DECL_CHECK (NODE)->decl.transparent_union)
/* Used in FUNCTION_DECLs to indicate that they should be run automatically
at the beginning or end of execution. */
#define DECL_STATIC_CONSTRUCTOR(NODE) (DECL_CHECK (NODE)->decl.static_ctor_flag)
#define DECL_STATIC_DESTRUCTOR(NODE) (DECL_CHECK (NODE)->decl.static_dtor_flag)
#define DECL_STATIC_CONSTRUCTOR(NODE) \
(FUNCTION_DECL_CHECK (NODE)->decl.static_ctor_flag)
#define DECL_STATIC_DESTRUCTOR(NODE) \
(FUNCTION_DECL_CHECK (NODE)->decl.static_dtor_flag)
/* Used to indicate that this DECL represents a compiler-generated entity. */
#define DECL_ARTIFICIAL(NODE) (DECL_CHECK (NODE)->decl.artificial_flag)
@@ -1303,15 +1315,18 @@ struct tree_type
/* Used in FUNCTION_DECLs to indicate that function entry and exit should
be instrumented with calls to support routines. */
#define DECL_NO_INSTRUMENT_FUNCTION_ENTRY_EXIT(NODE) ((NODE)->decl.no_instrument_function_entry_exit)
#define DECL_NO_INSTRUMENT_FUNCTION_ENTRY_EXIT(NODE) \
(FUNCTION_DECL_CHECK (NODE)->decl.no_instrument_function_entry_exit)
/* Used in FUNCTION_DECLs to indicate that check-memory-usage should be
disabled in this function. */
#define DECL_NO_CHECK_MEMORY_USAGE(NODE) ((NODE)->decl.no_check_memory_usage)
#define DECL_NO_CHECK_MEMORY_USAGE(NODE) \
(FUNCTION_DECL_CHECK (NODE)->decl.no_check_memory_usage)
/* Used in FUNCTION_DECLs to indicate that limit-stack-* should be
disabled in this function. */
#define DECL_NO_LIMIT_STACK(NODE) ((NODE)->decl.no_limit_stack)
#define DECL_NO_LIMIT_STACK(NODE) \
(FUNCTION_DECL_CHECK (NODE)->decl.no_limit_stack)
/* Additional flags for language-specific uses. */
#define DECL_LANG_FLAG_0(NODE) (DECL_CHECK (NODE)->decl.lang_flag_0)
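The hunks above consistently replace bare DECL_CHECK accesses with kind-specific checking macros (FUNCTION_DECL_CHECK, FIELD_DECL_CHECK, LABEL_DECL_CHECK), so that a flag bit shared between decl kinds (like regdecl_flag) can only be reached through a macro that verifies the node's code first. A minimal sketch of that pattern, using hypothetical names rather than GCC's actual tree-checking machinery:

```c
#include <assert.h>
#include <stdio.h>
#include <stdlib.h>

/* Illustrative model of kind-checked decl accessors: each macro
   routes through a check function that aborts on a code mismatch,
   so DECL_INLINE on a FIELD_DECL fails loudly instead of silently
   reading a flag bit that means something else for that kind.  */

enum node_code { FUNCTION_DECL, FIELD_DECL, LABEL_DECL };

struct node {
  enum node_code code;
  unsigned int inline_flag : 1;
  unsigned int regdecl_flag : 1;   /* shared by several decl kinds */
};

/* Return NODE after verifying its code; abort on a mismatch.  */
static struct node *
node_check (struct node *node, enum node_code expected)
{
  if (node->code != expected)
    {
      fprintf (stderr, "tree check failure\n");
      abort ();
    }
  return node;
}

#define DECL_INLINE(N)  (node_check ((N), FUNCTION_DECL)->inline_flag)
#define DECL_PACKED(N)  (node_check ((N), FIELD_DECL)->regdecl_flag)
```

Because node_check returns the pointer it was given, the macros remain usable as lvalues, just like the unchecked DECL_CHECK versions they replace.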
@@ -1391,17 +1406,17 @@ struct tree_decl
/* For a FUNCTION_DECL, if inline, this is the size of frame needed.
If built-in, this is the code for which built-in function.
For other kinds of decls, this is DECL_ALIGN. */
For other kinds of decls, this is DECL_ALIGN and DECL_OFFSET_ALIGN. */
union {
HOST_WIDE_INT i;
unsigned int u;
enum built_in_function f;
struct {unsigned int align : 24; unsigned int off_align : 8;} a;
} u1;
union tree_node *size_unit;
union tree_node *name;
union tree_node *context;
union tree_node *arguments; /* Also used for DECL_FIELD_BITPOS */
union tree_node *arguments; /* Also used for DECL_FIELD_OFFSET */
union tree_node *result; /* Also used for DECL_BIT_FIELD_TYPE */
union tree_node *initial; /* Also used for DECL_QUALIFIER */
union tree_node *abstract_origin;
@@ -1412,6 +1427,7 @@ struct tree_decl
struct rtx_def *live_range_rtl;
/* In FUNCTION_DECL, if it is inline, holds the saved insn chain.
In FIELD_DECL, is DECL_FIELD_BIT_OFFSET.
In PARM_DECL, holds an RTL for the stack slot
of register where the data was actually passed.
Used by Chill and Java in LABEL_DECL and by C++ and Java in VAR_DECL. */
@@ -1471,7 +1487,11 @@ enum tree_index
TI_SIZE_ZERO,
TI_SIZE_ONE,
TI_BITSIZE_ZERO,
TI_BITSIZE_ONE,
TI_BITSIZE_UNIT,
TI_COMPLEX_INTEGER_TYPE,
TI_COMPLEX_FLOAT_TYPE,
TI_COMPLEX_DOUBLE_TYPE,
@@ -1510,6 +1530,10 @@ extern tree global_trees[TI_MAX];
#define integer_one_node global_trees[TI_INTEGER_ONE]
#define size_zero_node global_trees[TI_SIZE_ZERO]
#define size_one_node global_trees[TI_SIZE_ONE]
#define bitsize_zero_node global_trees[TI_BITSIZE_ZERO]
#define bitsize_one_node global_trees[TI_BITSIZE_ONE]
#define bitsize_unit_node global_trees[TI_BITSIZE_UNIT]
#define null_pointer_node global_trees[TI_NULL_POINTER]
#define float_type_node global_trees[TI_FLOAT_TYPE]
@@ -1749,37 +1773,37 @@ extern void layout_type PARAMS ((tree));
/* These functions allow a front-end to perform a manual layout of a
RECORD_TYPE. (For instance, if the placement of subsequent fields
depends on the placement of fields so far.) Begin by calling
new_record_layout_info. Then, call layout_field for each of the
start_record_layout. Then, call place_field for each of the
fields. Then, call finish_record_layout. See layout_type for the
default way in which these functions are used. */
struct record_layout_info_s
typedef struct record_layout_info
{
/* The RECORD_TYPE that we are laying out. */
tree t;
/* The size of the record so far, in bits. */
unsigned HOST_WIDE_INT const_size;
/* The offset into the record so far, in bytes, not including bits in
BITPOS. */
tree offset;
/* The last known alignment of SIZE. */
unsigned int offset_align;
/* The bit position within the last OFFSET_ALIGN bits, in bits. */
tree bitpos;
/* The alignment of the record so far, in bits. */
unsigned int record_align;
/* If the record can have a variable size, then this will be
non-NULL, and the total size will be CONST_SIZE + VAR_SIZE. */
tree var_size;
/* If the record can have a variable size, then this will be the
maximum alignment that we know VAR_SIZE has. */
unsigned int var_align;
/* The alignment of the record so far, not including padding, in bits. */
unsigned int unpacked_align;
/* The static variables (i.e., class variables, as opposed to
instance variables) encountered in T. */
tree pending_statics;
unsigned int unpacked_align;
int packed_maybe_necessary;
};
} *record_layout_info;
typedef struct record_layout_info_s *record_layout_info;
extern record_layout_info new_record_layout_info
PARAMS ((tree));
extern void layout_field PARAMS ((record_layout_info, tree));
extern void finish_record_layout PARAMS ((record_layout_info));
extern record_layout_info start_record_layout PARAMS ((tree));
extern tree rli_size_unit_so_far PARAMS ((record_layout_info));
extern tree rli_size_so_far PARAMS ((record_layout_info));
extern void normalize_rli PARAMS ((record_layout_info));
extern void place_field PARAMS ((record_layout_info, tree));
extern void finish_record_layout PARAMS ((record_layout_info));
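The reworked record_layout_info_s keeps a field's position as a byte OFFSET plus a BITPOS confined to the last OFFSET_ALIGN bits, and normalize_rli maintains that invariant. A small runnable model of the split (illustrative only, not GCC code; BITS_PER_UNIT assumed to be 8):

```c
#include <assert.h>

/* Model of the offset/bitpos representation: OFFSET counts bytes,
   BITPOS counts bits and is kept smaller than OFFSET_ALIGN by
   moving whole alignment units into the byte offset, mirroring
   what normalize_rli does for a record_layout_info.  */
struct rli_pos {
  unsigned long offset;   /* bytes */
  unsigned long bitpos;   /* bits within the last OFFSET_ALIGN bits */
};

static void
rli_normalize (struct rli_pos *p, unsigned int offset_align)
{
  /* Move whole OFFSET_ALIGN-bit units from bitpos into offset;
     offset is in bytes, so each unit adds offset_align / 8 bytes.  */
  unsigned long units = p->bitpos / offset_align;
  p->offset += units * (offset_align / 8);
  p->bitpos %= offset_align;
}
```

For example, a running position of 70 bits with OFFSET_ALIGN of 32 normalizes to 8 bytes plus 6 bits.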
/* Given a hashcode and a ..._TYPE node (for which the hashcode was made),
return a canonicalized ..._TYPE node, so that duplicates are not made.
@@ -1817,6 +1841,8 @@ extern tree size_in_bytes PARAMS ((tree));
extern HOST_WIDE_INT int_size_in_bytes PARAMS ((tree));
extern tree bit_position PARAMS ((tree));
extern HOST_WIDE_INT int_bit_position PARAMS ((tree));
extern tree byte_position PARAMS ((tree));
extern HOST_WIDE_INT int_byte_position PARAMS ((tree));
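The new byte_position/int_byte_position accessors pair with the split field representation (DECL_FIELD_OFFSET in bytes plus DECL_FIELD_BIT_OFFSET in bits) that this patch introduces. A toy model of how the two views relate, with purely illustrative names:

```c
#include <assert.h>

#define BITS_PER_UNIT 8

/* Hypothetical miniature of the split position: a byte offset plus
   a residual bit offset.  bit_position combines both into bits;
   byte_position truncates the total to whole bytes, which is what
   callers like build_unary_op's ADDR_EXPR case want.  */
struct field_pos {
  long field_offset;   /* DECL_FIELD_OFFSET analogue, in bytes */
  long bit_offset;     /* DECL_FIELD_BIT_OFFSET analogue, in bits */
};

static long
bit_position (const struct field_pos *f)
{
  return f->field_offset * BITS_PER_UNIT + f->bit_offset;
}

static long
byte_position (const struct field_pos *f)
{
  return f->field_offset + f->bit_offset / BITS_PER_UNIT;
}
```

So a field at byte 4 plus 3 bits has bit_position 35 but byte_position 4, which is why decode_addr_const below can switch from dividing int_bit_position by BITS_PER_UNIT to calling int_byte_position directly.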
/* Define data structures, macros, and functions for handling sizes
and the various types used to represent sizes. */
@@ -2060,9 +2086,10 @@ extern tree maybe_build_cleanup PARAMS ((tree));
look for nested component-refs or array-refs at constant positions
and find the ultimate containing object, which is returned. */
extern tree get_inner_reference PARAMS ((tree, int *, int *, tree *,
enum machine_mode *, int *,
int *, unsigned int *));
extern tree get_inner_reference PARAMS ((tree, HOST_WIDE_INT *,
HOST_WIDE_INT *, tree *,
enum machine_mode *, int *,
int *, unsigned int *));
/* Given a DECL or TYPE, return the scope in which it was declared, or
NUL_TREE if there is no containing scope. */


@@ -196,7 +196,7 @@ static int *splittable_regs_updates;
/* Forward declarations. */
static void init_reg_map PARAMS ((struct inline_remap *, int));
static rtx calculate_giv_inc PARAMS ((rtx, rtx, int));
static rtx calculate_giv_inc PARAMS ((rtx, rtx, unsigned int));
static rtx initial_reg_note_copy PARAMS ((rtx, struct inline_remap *));
static void final_reg_note_copy PARAMS ((rtx, struct inline_remap *));
static void copy_loop_body PARAMS ((rtx, rtx, struct inline_remap *, rtx, int,
@@ -234,6 +234,7 @@ unroll_loop (loop, insn_count, end_insert_before, strength_reduce_p)
int strength_reduce_p;
{
int i, j;
unsigned int r;
unsigned HOST_WIDE_INT temp;
int unroll_number = 1;
rtx copy_start, copy_end;
@@ -243,8 +244,8 @@ unroll_loop (loop, insn_count, end_insert_before, strength_reduce_p)
struct inline_remap *map;
char *local_label = NULL;
char *local_regno;
int max_local_regnum;
int maxregnum;
unsigned int max_local_regnum;
unsigned int maxregnum;
rtx exit_label = 0;
rtx start_label;
struct iv_class *bl;
@@ -829,11 +830,11 @@ unroll_loop (loop, insn_count, end_insert_before, strength_reduce_p)
results in better code. */
/* We must limit the generic test to max_reg_before_loop, because only
these pseudo registers have valid regno_first_uid info. */
for (j = FIRST_PSEUDO_REGISTER; j < max_reg_before_loop; ++j)
if (REGNO_FIRST_UID (j) > 0 && REGNO_FIRST_UID (j) <= max_uid_for_loop
&& uid_luid[REGNO_FIRST_UID (j)] >= copy_start_luid
&& REGNO_LAST_UID (j) > 0 && REGNO_LAST_UID (j) <= max_uid_for_loop
&& uid_luid[REGNO_LAST_UID (j)] <= copy_end_luid)
for (r = FIRST_PSEUDO_REGISTER; r < max_reg_before_loop; ++r)
if (REGNO_FIRST_UID (r) > 0 && REGNO_FIRST_UID (r) <= max_uid_for_loop
&& uid_luid[REGNO_FIRST_UID (r)] >= copy_start_luid
&& REGNO_LAST_UID (r) > 0 && REGNO_LAST_UID (r) <= max_uid_for_loop
&& uid_luid[REGNO_LAST_UID (r)] <= copy_end_luid)
{
/* However, we must also check for loop-carried dependencies.
If the value the pseudo has at the end of iteration X is
@@ -844,26 +845,26 @@ unroll_loop (loop, insn_count, end_insert_before, strength_reduce_p)
regno_last_uid. */
/* ??? This check is simplistic. We would get better code if
this check was more sophisticated. */
if (set_dominates_use (j, REGNO_FIRST_UID (j), REGNO_LAST_UID (j),
if (set_dominates_use (r, REGNO_FIRST_UID (r), REGNO_LAST_UID (r),
copy_start, copy_end))
local_regno[j] = 1;
local_regno[r] = 1;
if (loop_dump_stream)
{
if (local_regno[j])
fprintf (loop_dump_stream, "Marked reg %d as local\n", j);
if (local_regno[r])
fprintf (loop_dump_stream, "Marked reg %d as local\n", r);
else
fprintf (loop_dump_stream, "Did not mark reg %d as local\n",
j);
r);
}
}
/* Givs that have been created from multiple biv increments always have
local registers. */
for (j = first_increment_giv; j <= last_increment_giv; j++)
for (r = first_increment_giv; r <= last_increment_giv; r++)
{
local_regno[j] = 1;
local_regno[r] = 1;
if (loop_dump_stream)
fprintf (loop_dump_stream, "Marked reg %d as local\n", j);
fprintf (loop_dump_stream, "Marked reg %d as local\n", r);
}
}
@@ -1080,12 +1081,13 @@ unroll_loop (loop, insn_count, end_insert_before, strength_reduce_p)
if (local_label[j])
set_label_in_map (map, j, gen_label_rtx ());
for (j = FIRST_PSEUDO_REGISTER; j < max_local_regnum; j++)
if (local_regno[j])
for (r = FIRST_PSEUDO_REGISTER; r < max_local_regnum; r++)
if (local_regno[r])
{
map->reg_map[j] = gen_reg_rtx (GET_MODE (regno_reg_rtx[j]));
record_base_value (REGNO (map->reg_map[j]),
regno_reg_rtx[j], 0);
map->reg_map[r]
= gen_reg_rtx (GET_MODE (regno_reg_rtx[r]));
record_base_value (REGNO (map->reg_map[r]),
regno_reg_rtx[r], 0);
}
/* The last copy needs the compare/branch insns at the end,
so reset copy_end here if the loop ends with a conditional
@@ -1223,12 +1225,12 @@ unroll_loop (loop, insn_count, end_insert_before, strength_reduce_p)
if (local_label[j])
set_label_in_map (map, j, gen_label_rtx ());
for (j = FIRST_PSEUDO_REGISTER; j < max_local_regnum; j++)
if (local_regno[j])
for (r = FIRST_PSEUDO_REGISTER; r < max_local_regnum; r++)
if (local_regno[r])
{
map->reg_map[j] = gen_reg_rtx (GET_MODE (regno_reg_rtx[j]));
record_base_value (REGNO (map->reg_map[j]),
regno_reg_rtx[j], 0);
map->reg_map[r] = gen_reg_rtx (GET_MODE (regno_reg_rtx[r]));
record_base_value (REGNO (map->reg_map[r]),
regno_reg_rtx[r], 0);
}
/* If loop starts with a branch to the test, then fix it so that
@@ -1532,7 +1534,7 @@ init_reg_map (map, maxregnum)
static rtx
calculate_giv_inc (pattern, src_insn, regno)
rtx pattern, src_insn;
int regno;
unsigned int regno;
{
rtx increment;
rtx increment_total = 0;
@@ -1763,7 +1765,7 @@ copy_loop_body (copy_start, copy_end, map, exit_label, last_iteration,
{
struct iv_class *bl;
struct induction *v, *tv;
int regno = REGNO (SET_DEST (set));
unsigned int regno = REGNO (SET_DEST (set));
v = addr_combined_regs[REGNO (SET_DEST (set))];
bl = reg_biv_class[REGNO (v->src_reg)];
@@ -1856,8 +1858,8 @@ copy_loop_body (copy_start, copy_end, map, exit_label, last_iteration,
&& GET_CODE (SET_DEST (set)) == REG
&& splittable_regs[REGNO (SET_DEST (set))])
{
int regno = REGNO (SET_DEST (set));
int src_regno;
unsigned int regno = REGNO (SET_DEST (set));
unsigned int src_regno;
dest_reg_was_split = 1;


@@ -1331,7 +1331,6 @@ assemble_variable (decl, top_level, at_end, dont_output_data)
{
register const char *name;
unsigned int align;
tree size_tree = NULL_TREE;
int reloc = 0;
enum in_section saved_in_section;
@@ -1423,21 +1422,11 @@ assemble_variable (decl, top_level, at_end, dont_output_data)
app_disable ();
if (! dont_output_data)
if (! dont_output_data
&& ! host_integerp (DECL_SIZE_UNIT (decl), 1))
{
unsigned int size;
if (TREE_CODE (DECL_SIZE_UNIT (decl)) != INTEGER_CST)
goto finish;
size_tree = DECL_SIZE_UNIT (decl);
size = TREE_INT_CST_LOW (size_tree);
if (compare_tree_int (size_tree, size) != 0)
{
error_with_decl (decl, "size of variable `%s' is too large");
goto finish;
}
error_with_decl (decl, "size of variable `%s' is too large");
goto finish;
}
name = XSTR (XEXP (DECL_RTL (decl), 0), 0);
@@ -1503,12 +1492,14 @@ assemble_variable (decl, top_level, at_end, dont_output_data)
&& DECL_SECTION_NAME (decl) == NULL_TREE
&& ! dont_output_data)
{
int size = TREE_INT_CST_LOW (size_tree);
int rounded = size;
unsigned HOST_WIDE_INT size = tree_low_cst (DECL_SIZE_UNIT (decl), 1);
unsigned HOST_WIDE_INT rounded = size;
/* Don't allocate zero bytes of common,
since that means "undefined external" in the linker. */
if (size == 0) rounded = 1;
if (size == 0)
rounded = 1;
/* Round size up to multiple of BIGGEST_ALIGNMENT bits
so that each uninitialized object starts on such a boundary. */
rounded += (BIGGEST_ALIGNMENT / BITS_PER_UNIT) - 1;
@@ -1516,7 +1507,7 @@ assemble_variable (decl, top_level, at_end, dont_output_data)
* (BIGGEST_ALIGNMENT / BITS_PER_UNIT));
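The two statements above implement the standard add-then-truncate round-up: add (align - 1), then chop back down with integer division. A minimal sketch of the idiom, assuming (as BIGGEST_ALIGNMENT / BITS_PER_UNIT is) that the alignment is a power of two:

```c
#include <assert.h>

/* Sketch of the round-up idiom used when sizing common blocks:
   adding align - 1 pushes any partial unit past the next boundary,
   and the divide/multiply truncates back to a multiple of align.  */
static unsigned long long
round_up_to (unsigned long long size, unsigned long long align)
{
  return (size + align - 1) / align * align;
}
```

With a 32-byte boundary, a 1-byte object rounds to 32 and a 33-byte object to 64; exact multiples are left unchanged.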
#if !defined(ASM_OUTPUT_ALIGNED_COMMON) && !defined(ASM_OUTPUT_ALIGNED_BSS)
if ((DECL_ALIGN (decl) / BITS_PER_UNIT) > (unsigned int) rounded)
if (DECL_ALIGN (decl) / BITS_PER_UNIT > rounded)
warning_with_decl
(decl, "requested alignment for %s is greater than implemented alignment of %d.",rounded);
#endif
@@ -1650,10 +1641,11 @@ assemble_variable (decl, top_level, at_end, dont_output_data)
{
if (DECL_INITIAL (decl))
/* Output the actual data. */
output_constant (DECL_INITIAL (decl), TREE_INT_CST_LOW (size_tree));
output_constant (DECL_INITIAL (decl),
tree_low_cst (DECL_SIZE_UNIT (decl), 1));
else
/* Leave space for it. */
assemble_zeros (TREE_INT_CST_LOW (size_tree));
assemble_zeros (tree_low_cst (DECL_SIZE_UNIT (decl), 1));
}
finish:
@@ -2279,22 +2271,16 @@ decode_addr_const (exp, value)
while (1)
{
if (TREE_CODE (target) == COMPONENT_REF
&& host_integerp (bit_position (TREE_OPERAND (target, 1)), 0))
&& host_integerp (byte_position (TREE_OPERAND (target, 1)), 0))
{
offset
+= int_bit_position (TREE_OPERAND (target, 1)) / BITS_PER_UNIT;
offset += int_byte_position (TREE_OPERAND (target, 1));
target = TREE_OPERAND (target, 0);
}
else if (TREE_CODE (target) == ARRAY_REF)
{
if (TREE_CODE (TREE_OPERAND (target, 1)) != INTEGER_CST
|| TREE_CODE (TYPE_SIZE (TREE_TYPE (target))) != INTEGER_CST)
abort ();
offset += ((TREE_INT_CST_LOW (TYPE_SIZE (TREE_TYPE (target)))
* TREE_INT_CST_LOW (TREE_OPERAND (target, 1)))
/ BITS_PER_UNIT);
offset += (tree_low_cst (TYPE_SIZE_UNIT (TREE_TYPE (target)), 1)
* tree_low_cst (TREE_OPERAND (target, 1), 0));
target = TREE_OPERAND (target, 0);
}
else
@@ -4420,13 +4406,12 @@ output_constructor (exp, size)
register int fieldsize;
/* Since this structure is static,
we know the positions are constant. */
int bitpos = (field ? (TREE_INT_CST_LOW (DECL_FIELD_BITPOS (field))
/ BITS_PER_UNIT)
: 0);
HOST_WIDE_INT bitpos = field ? int_byte_position (field) : 0;
if (index != 0)
bitpos = (TREE_INT_CST_LOW (TYPE_SIZE (TREE_TYPE (val)))
/ BITS_PER_UNIT
* (TREE_INT_CST_LOW (index) - min_index));
bitpos
= (tree_low_cst (TYPE_SIZE_UNIT (TREE_TYPE (val)), 1)
* (tree_low_cst (index, 0) - min_index));
/* Output any buffered-up bit-fields preceding this element. */
if (byte_buffer_in_use)
@@ -4472,9 +4457,9 @@ output_constructor (exp, size)
{
/* Element that is a bit-field. */
int next_offset = TREE_INT_CST_LOW (DECL_FIELD_BITPOS (field));
int end_offset
= (next_offset + TREE_INT_CST_LOW (DECL_SIZE (field)));
HOST_WIDE_INT next_offset = int_bit_position (field);
HOST_WIDE_INT end_offset
= (next_offset + tree_low_cst (DECL_SIZE (field), 1));
if (val == 0)
val = integer_zero_node;
@@ -4572,17 +4557,15 @@ output_constructor (exp, size)
take first the least significant bits of the value
and pack them starting at the least significant
bits of the bytes. */
shift = (next_offset
- TREE_INT_CST_LOW (DECL_FIELD_BITPOS (field)));
shift = next_offset - int_bit_position (field);
/* Don't try to take a bunch of bits that cross
the word boundary in the INTEGER_CST. We can
only select bits from the LOW or HIGH part
not from both. */
if (shift < HOST_BITS_PER_WIDE_INT
&& shift + this_time > HOST_BITS_PER_WIDE_INT)
{
this_time = (HOST_BITS_PER_WIDE_INT - shift);
}
this_time = (HOST_BITS_PER_WIDE_INT - shift);
/* Now get the bits from the appropriate constant word. */
if (shift < HOST_BITS_PER_WIDE_INT)
@@ -4594,6 +4577,7 @@ output_constructor (exp, size)
}
else
abort ();
/* Get the result. This works only when:
1 <= this_time <= HOST_BITS_PER_WIDE_INT. */
byte |= (((value >> shift)
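The clamp in the hunk above (`this_time = HOST_BITS_PER_WIDE_INT - shift`) keeps a single bit extraction from straddling the low/high words of a two-word INTEGER_CST, so each selection comes from exactly one word. A self-contained sketch of that logic, with illustrative names and a 64-bit word assumed:

```c
#include <assert.h>

#define HOST_BITS_PER_WIDE_INT 64
typedef unsigned long long wide_word;

/* When a run of THIS_TIME bits starting at SHIFT would cross the
   boundary between the low and high words, trim it to stop at the
   boundary, as output_constructor does for bit-field elements.  */
static unsigned
clamp_this_time (unsigned shift, unsigned this_time)
{
  if (shift < HOST_BITS_PER_WIDE_INT
      && shift + this_time > HOST_BITS_PER_WIDE_INT)
    this_time = HOST_BITS_PER_WIDE_INT - shift;
  return this_time;
}

/* Extract a clamped run of bits from the appropriate word of a
   two-word constant (LOW holds bits 0..63, HIGH bits 64..127).  */
static wide_word
extract_bits (wide_word low, wide_word high,
              unsigned shift, unsigned this_time)
{
  wide_word word = shift < HOST_BITS_PER_WIDE_INT ? low : high;
  unsigned s = shift % HOST_BITS_PER_WIDE_INT;
  wide_word mask = this_time >= HOST_BITS_PER_WIDE_INT
                   ? ~0ULL : ((1ULL << this_time) - 1);
  return (word >> s) & mask;
}
```

For instance, a request for 8 bits at shift 60 is clamped to 4 bits; the remaining 4 come from the high word on the next iteration.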