[multiple changes]

Fri Oct 29 15:25:07 1999  Arnaud Charlet  <charlet@ACT-Europe.FR>

	* gcov.c (DIR_SEPARATOR): Provide default.
	(output_data): Add test for MS-DOS format absolute filename.
	(fancy_abort): Correct program name.
	(open_files): Open all files in binary mode.
	* libgcc2.c (__bb_exit_func): Likewise.

	* profile.c (init_branch_prob): Specify binary when opening files.

	* flags.h (flag_unwind_tables): New decl.
	* toplev.c (flag_unwind_tables): New definition.
	(f_options): Add -funwind-tables.
	(decode_g_option): Clarify warning when unknown -g option is given.
	(rest_of_compilation): If inside an inlined external function,
	pretend we are just being declared.

	* dwarf2out.c (dwarf2out_do_frame): Check -funwind-tables.
	(dwarf2out_frame_finish): Likewise.

Fri Oct 29 06:32:44 1999  Geoffrey Keating  <geoffk@cygnus.com>

	* flow.c (propagate_block): When the last reference to a label
 	before an ADDR_VEC is deleted because the reference is a dead
 	store, delete the ADDR_VEC.

Fri Oct 29 07:44:26 1999  Vasco Pedro  <vp@di.fct.unl.pt>

	* fold-const.c (merge_ranges): When the range is in in1 but not
	in in0, handle equal upper bounds like the subset case.
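The merge_ranges entry turns on a containment test: when two ranges share an upper bound, whether one is a subset of the other reduces to comparing lower bounds. A minimal sketch of that test (illustrative only, not the fold-const.c code):

```c
#include <assert.h>

/* A closed integer range [lo, hi].  */
struct range { int lo, hi; };

/* Nonzero if A is contained in B -- the "subset case".  When the two
   upper bounds are equal, containment reduces to comparing the lower
   bounds, which is the situation the entry above describes.  */
static int
range_subset_p (struct range a, struct range b)
{
  return a.lo >= b.lo && a.hi <= b.hi;
}
```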

Thu Oct 28 19:22:24 1999  Douglas Rupp  <rupp@gnat.com>

	* dbxout.c (dbxout_parms): Generate a second stabs line for parameters
	passed in a register but moved to the stack.

Thu Oct 28 19:12:57 1999  Sam Tardieu  <tardieu@act-europe.fr>

	* gcc.c (pass_exit_codes, greatest_status): New variables.
	(struct option_map): Add entry for "--pass-exit-codes".
	(execute): Update greatest_status if error.
	(display_help): Add documentation for -pass-exit-codes.
	(process_command): Handle -pass-exit-codes.
	(main): Look at pass_exit_codes and greatest_status on call to exit.

Thu Oct 28 18:06:50 1999  Richard Kenner  <kenner@vlsi1.ultra.nyu.edu>

	* reload.c (find_reloads): Refine test for no input reload
	case to not include reloads emitted after insn.

	* function.c (find_temp_slots_from_address): Handle sum involving
	a register that points to a temp slot.
	(update_temp_slot_address): Make recursive call if both old and
	new are PLUS with a common operand.
	* calls.c (expand_call): Mark temp slot for result as having
	address taken.

	* rtlanal.c (reg_referenced_p, case IF_THEN_ELSE): New case.

	* gcc.c (process_command): Add standard_exec_prefix with "GCC"
	component as well as "BINUTILS".

	* integrate.h (copy_rtx_and_substitute): New arg, FOR_LHS.
	* integrate.c (copy_rtx_and_substitute): Likewise.
	(expand_inline_function, integrate_parm_decls, integrate_decl_tree):
	All callers changed.
	* unroll.c (initial_reg_note_copy, copy_loop_body): Likewise.

	* dbxout.c (dbxout_type, case INTEGER_TYPE_NODE): If can use
	gdb extensions, write size of type; also be more consistent
	in using references when this is a subtype.

	* pa.md (extv, extzv, insv): Use define_expand to reject constant
	that is out of range.

	* loop.c (unknown_constant_address_altered): New variable.
	(prescan_loop): Initialize it.
	(note_addr_stored): Set it for RTX_UNCHANGING_P MEM.
	(invariant_p, case MEM): Remove handling for volatile and readonly;
	check new variable if readonly.
	(check_dbra_loop): Check unknown_constant_address_altered.

	* cse.c (canon_hash, case MEM): Do not record if BLKmode.
	(addr_affects_sp_p): Removed from note_mem_written and only
	define #ifdef AUTO_INC_DEC.

	* alpha.c (input_operand, case ADDRESSOF): Treat as REG.

	* regclass.c (record_reg_classes): Properly handle register move
	directions.

	* varasm.c (initializer_constant_valid_p, case MINUS_EXPR):
	Don't consider valid if both operands are invalid.
	(struct constant_descriptor): New field RTL.
	(mark_const_hash_entry): Mark it.
	(record_constant{,_rtx}): Initialize it.
	(output_constant_def): Allocate RTL in permanent obstack and
	save in table.
	({record,compare}_constant_1): Modes must match for
	CONSTRUCTOR of ARRAY_TYPE.

	* c-common.h (initializer_constant_valid_p): Delete decl from here.
	* output.h (initializer_constant_valid_p): Move decl to here.
	* c-common.c (initializer_constant_valid_p): Delete function from here.
	* varasm.c (initializer_constant_valid_p): Move function to here.

	* tree.h (STRIP_SIGN_NOPS): New macro.
	* fold-const.c (optimize_minmax_comparison): New function.
	(invert_truthvalue, case WITH_RECORD_EXPR): New case.
	(fold): Use STRIP_SIGN_NOPS instead of STRIP_TYPE_NOPS.
	(fold, case EQ_EXPR): Call optimize_minmax_comparison and add
	cases with ABS_EXPR, NEGATE_EXPR, PLUS_EXPR, MINUS_EXPR, and
	widening conversions.
	(fold, case LE_EXPR): Rework changing unsigned to signed comparisons
	to look at size of mode, not precision of type; also add missing cases.
	(optimize_bit_field_compare, decode_field_reference): Don't try to
	optimize COMPONENT_REF of a PLACEHOLDER_EXPR.
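The new EQ_EXPR folds rest on simple algebraic identities: abs(x) == 0 iff x == 0, -x == c iff x == -c, and x ± c1 == c2 iff x == c2 ∓ c1, absent overflow. A quick sanity check of those identities (illustrative only, not the fold-const.c code):

```c
#include <assert.h>
#include <stdlib.h>

/* Check, over a small range, the identities that justify rewriting an
   equality whose operand is an ABS_EXPR, NEGATE_EXPR, PLUS_EXPR or
   MINUS_EXPR, as the fold-const.c entry above describes.  */
static void
check_fold_identities (void)
{
  int x;
  for (x = -8; x <= 8; x++)
    {
      assert ((abs (x) == 0) == (x == 0));    /* ABS_EXPR */
      assert ((-x == 3) == (x == -3));        /* NEGATE_EXPR */
      assert ((x + 5 == 7) == (x == 2));      /* PLUS_EXPR */
      assert ((x - 5 == 2) == (x == 7));      /* MINUS_EXPR */
    }
}
```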

	* dwarf2out.c (ctype.h): Include.
	(dwarf2out_set_demangle_name_func): New function.
	(size_of_line_info): Deleted.
	(output_line_info): Compute size of line info table from difference
	of labels.
	(base_type_die, add_name_attribute): Call demangle function, if any.
	(field_byte_offset): Use bits per word for variable length fields.
	(gen_array_type_die): Add array name.
	(gen_subprogram_die): Ignore DECL_INLINE if -fno-inline.
	(dwarf2out_add_library_unit_info): New function.
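The output_line_info change replaces a hand-computed size (the deleted size_of_line_info) with the difference of two emitted labels, letting the assembler do the arithmetic. The underlying idiom, bracket a region with two symbols and subtract, can be illustrated in C with pointer arithmetic (an analogy, not the dwarf2out.c code):

```c
#include <stddef.h>

/* A table bracketed by start and end symbols; its size falls out of a
   subtraction, the way .Lend - .Lstart does in the emitted assembly.  */
static const unsigned char line_table_start[] = { 0x01, 0x02, 0x03, 0x04, 0x05 };
static const unsigned char *const line_table_end
  = line_table_start + sizeof line_table_start;

static ptrdiff_t
line_table_size (void)
{
  /* The difference of the two "labels" is the table's size.  */
  return line_table_end - line_table_start;
}
```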

	* explow.c (set_stack_check_libfunc): New function.
	(stack_check_libfunc): New static variable.
	(probe_stack_range): Allow front-end to set up a libfunc to call.

	* combine.c (simplify_comparison): When making comparison in wider
	mode, check for having commuted an AND and a SUBREG.
	(contains_muldiv): New function.
	(try_combine): Call it when dividing a PARALLEL.
	(simplify_rtx, case TRUNCATE): Don't remove for umulsi3_highpart.
	(simplify_comparison, case ASHIFTRT): Recognize sign-extension of
	a PLUS.
	(record_value_for_reg): If TEM is a binary operation with two CLOBBERs,
	use one of the CLOBBERs instead.
	(if_then_else_cond): If comparing against zero, just return thing
	being compared.

	* optabs.c (expand_abs): If machine has MAX, ABS (x) is MAX (x, -x).
	Don't generate shifts and subtract if have conditional arithmetic.
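The optabs.c change exploits the identity |x| = max(x, -x) on targets with a MAX instruction, instead of the shift-and-subtract sequence. The identity itself is easy to verify (a minimal sketch; overflow on the most negative value is the usual caveat):

```c
/* abs implemented as MAX (x, -x); valid whenever -x does not overflow.  */
static int
abs_via_max (int x)
{
  int neg = -x;
  return x > neg ? x : neg;  /* MAX (x, -x) */
}
```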

	* rtl.h (delete_barrier): New declaration.
	* jump.c (jump_optimize): Set up to handle conditional call.
	In conditional arithmetic case, handle CALL_INSN followed by a BARRIER.
	(delete_barrier): New function.

	* rtl.c (read_rtx): Call fatal if bad RTL code; check for bad mode.

	* recog.c (nonmemory_operand): Accept ADDRESSOF.

	* tree.c (build_type_attribute_variant): Push to obstack of
	ttype around type_hash_canon call.

	* expr.c (placeholder_list): Move decl to file scope.
	(expand_expr): Don't force access to volatile just because its
	address is taken.
	If ignoring reference operations, just expand the operands.
	(expand_expr, case COMPONENT_REF): Propagate
	EXPAND_CONST_ADDRESS to recursive call when expanding inner.
	Refine test for using bitfield operations vs pointer punning.
	(expand_expr, case CONVERT_EXPR): If converting to
	BLKmode UNION_TYPE from BLKmode, just return inner object.
	Use proper mode in store_field call.
	Properly set sizes of object to store and total size in store_field
	call for convert to union.
	(expand_expr, case ARRAY_REF): If OP0 is in a register, put it in
	memory (like for ADDR_EXPR).  Also, don't put constant in register if
	we'll want it in memory.
	(readonly_fields_p): New function.
	(expand_expr, case INDIRECT_REF): Call it if LHS.
	(expand_assignment): Handle a RESULT_DECL where
	DECL_RTL is a PARALLEL.
	(do_jump, case WITH_RECORD_EXPR): New case.
	(get_inner_reference): Always go inside a CONVERT_EXPR
	and NOP_EXPR if both modes are the same.
	(store_field): Use bitfield operations if size of bitsize is not same
	as size of RHS's type.
	Check for bitpos not a multiple of alignment in BLKmode case.
	Do block move in largest possible alignment.
	(store_constructor): Set BITSIZE to -1 for variable size and properly
 	in case of array of BLKmode.
	(expand_expr_unaligned): New function.
	(do_compare_and_jump): Call it.

	* mips/iris5.h (SWITCHES_NEED_SPACES): New macro.
	* collect2.c (main): Only allow -ofoo if SWITCHES_NEED_SPACES
	does not include 'o'.

	* function.c (instantiate_virtual_regs_1, case SET): Handle case where
	both SET_DEST and SET_SRC reference a virtual register.
	(gen_mem_addressof): Copy RTX_UNCHANGING_P from new REG to old REG.

	* integrate.c (expand_inline_function): Handle case of setting
	virtual stack vars register (from built in setjmp); when parameter
	lives in memory, expand virtual_{stack_vars,incoming_args}_rtx early.
	(subst_constant): Add new parm, MEMONLY.
	(expand_inline_function, integrate_parm_decls): Pass new parm.
	(integrate_decl_tree): Likewise.
	(copy_rtx_and_substitute, case MEM): Do copy RTX_UNCHANGING_P.
	(try_constants): Call subst_constants twice, with MEMONLY 0 and 1.
	(copy_rtx_and_substitute, case SET): Add explicit calls to
	copy_rtx_and_substitute for both sides.

	* stmt.c (expand_asm_operands): Don't use TREE_STRING_LENGTH for
	constraints.
	(pushcase{,_range}): Convert to NOMINAL_TYPE after checking for
	within INDEX_TYPE, instead of before.
	(fixup_gotos): Use f->target_rtl, not the next insn,
	since latter may be from a later fixup.
	(expand_value_return): Correctly convert VAL when promoting function
	return; support RETURN_REG being a PARALLEL.
	(expand_return): When checking for result in regs and having
	cleanup, consider PARALLEL in DECL_RTL as being in regs.

From-SVN: r30299
Commit 14a774a9d2 (parent 8f65050e41)
Richard Kenner  1999-10-31 20:11:22 -05:00
42 changed files with 2133 additions and 893 deletions


gcc/c-common.c

@@ -3845,161 +3845,6 @@ build_va_arg (expr, type)
 {
   return build1 (VA_ARG_EXPR, type, expr);
 }
-
-/* Return nonzero if VALUE is a valid constant-valued expression
-   for use in initializing a static variable; one that can be an
-   element of a "constant" initializer.
-
-   Return null_pointer_node if the value is absolute;
-   if it is relocatable, return the variable that determines the relocation.
-   We assume that VALUE has been folded as much as possible;
-   therefore, we do not need to check for such things as
-   arithmetic-combinations of integers.  */
-
-tree
-initializer_constant_valid_p (value, endtype)
-     tree value;
-     tree endtype;
-{
-  switch (TREE_CODE (value))
-    {
-    case CONSTRUCTOR:
-      if ((TREE_CODE (TREE_TYPE (value)) == UNION_TYPE
-           || TREE_CODE (TREE_TYPE (value)) == RECORD_TYPE)
-          && TREE_CONSTANT (value)
-          && CONSTRUCTOR_ELTS (value))
-        return
-          initializer_constant_valid_p (TREE_VALUE (CONSTRUCTOR_ELTS (value)),
-                                        endtype);
-
-      return TREE_STATIC (value) ? null_pointer_node : 0;
-
-    case INTEGER_CST:
-    case REAL_CST:
-    case STRING_CST:
-    case COMPLEX_CST:
-      return null_pointer_node;
-
-    case ADDR_EXPR:
-      return TREE_OPERAND (value, 0);
-
-    case NON_LVALUE_EXPR:
-      return initializer_constant_valid_p (TREE_OPERAND (value, 0), endtype);
-
-    case CONVERT_EXPR:
-    case NOP_EXPR:
-      /* Allow conversions between pointer types.  */
-      if (POINTER_TYPE_P (TREE_TYPE (value))
-          && POINTER_TYPE_P (TREE_TYPE (TREE_OPERAND (value, 0))))
-        return initializer_constant_valid_p (TREE_OPERAND (value, 0), endtype);
-
-      /* Allow conversions between real types.  */
-      if (FLOAT_TYPE_P (TREE_TYPE (value))
-          && FLOAT_TYPE_P (TREE_TYPE (TREE_OPERAND (value, 0))))
-        return initializer_constant_valid_p (TREE_OPERAND (value, 0), endtype);
-
-      /* Allow length-preserving conversions between integer types.  */
-      if (INTEGRAL_TYPE_P (TREE_TYPE (value))
-          && INTEGRAL_TYPE_P (TREE_TYPE (TREE_OPERAND (value, 0)))
-          && (TYPE_PRECISION (TREE_TYPE (value))
-              == TYPE_PRECISION (TREE_TYPE (TREE_OPERAND (value, 0)))))
-        return initializer_constant_valid_p (TREE_OPERAND (value, 0), endtype);
-
-      /* Allow conversions between other integer types only if
-         explicit value.  */
-      if (INTEGRAL_TYPE_P (TREE_TYPE (value))
-          && INTEGRAL_TYPE_P (TREE_TYPE (TREE_OPERAND (value, 0))))
-        {
-          tree inner = initializer_constant_valid_p (TREE_OPERAND (value, 0),
-                                                     endtype);
-          if (inner == null_pointer_node)
-            return null_pointer_node;
-          break;
-        }
-
-      /* Allow (int) &foo provided int is as wide as a pointer.  */
-      if (INTEGRAL_TYPE_P (TREE_TYPE (value))
-          && POINTER_TYPE_P (TREE_TYPE (TREE_OPERAND (value, 0)))
-          && (TYPE_PRECISION (TREE_TYPE (value))
-              >= TYPE_PRECISION (TREE_TYPE (TREE_OPERAND (value, 0)))))
-        return initializer_constant_valid_p (TREE_OPERAND (value, 0),
-                                             endtype);
-
-      /* Likewise conversions from int to pointers, but also allow
-         conversions from 0.  */
-      if (POINTER_TYPE_P (TREE_TYPE (value))
-          && INTEGRAL_TYPE_P (TREE_TYPE (TREE_OPERAND (value, 0))))
-        {
-          if (integer_zerop (TREE_OPERAND (value, 0)))
-            return null_pointer_node;
-          else if (TYPE_PRECISION (TREE_TYPE (value))
-                   <= TYPE_PRECISION (TREE_TYPE (TREE_OPERAND (value, 0))))
-            return initializer_constant_valid_p (TREE_OPERAND (value, 0),
-                                                 endtype);
-        }
-
-      /* Allow conversions to union types if the value inside is okay.  */
-      if (TREE_CODE (TREE_TYPE (value)) == UNION_TYPE)
-        return initializer_constant_valid_p (TREE_OPERAND (value, 0),
-                                             endtype);
-      break;
-
-    case PLUS_EXPR:
-      if (! INTEGRAL_TYPE_P (endtype)
-          || TYPE_PRECISION (endtype) >= POINTER_SIZE)
-        {
-          tree valid0 = initializer_constant_valid_p (TREE_OPERAND (value, 0),
-                                                      endtype);
-          tree valid1 = initializer_constant_valid_p (TREE_OPERAND (value, 1),
-                                                      endtype);
-          /* If either term is absolute, use the other terms relocation.  */
-          if (valid0 == null_pointer_node)
-            return valid1;
-          if (valid1 == null_pointer_node)
-            return valid0;
-        }
-      break;
-
-    case MINUS_EXPR:
-      if (! INTEGRAL_TYPE_P (endtype)
-          || TYPE_PRECISION (endtype) >= POINTER_SIZE)
-        {
-          tree valid0 = initializer_constant_valid_p (TREE_OPERAND (value, 0),
-                                                      endtype);
-          tree valid1 = initializer_constant_valid_p (TREE_OPERAND (value, 1),
-                                                      endtype);
-          /* Win if second argument is absolute.  */
-          if (valid1 == null_pointer_node)
-            return valid0;
-          /* Win if both arguments have the same relocation.
-             Then the value is absolute.  */
-          if (valid0 == valid1)
-            return null_pointer_node;
-        }
-
-      /* Support differences between labels.  */
-      if (INTEGRAL_TYPE_P (endtype))
-        {
-          tree op0, op1;
-          op0 = TREE_OPERAND (value, 0);
-          op1 = TREE_OPERAND (value, 1);
-          STRIP_NOPS (op0);
-          STRIP_NOPS (op1);
-
-          if (TREE_CODE (op0) == ADDR_EXPR
-              && TREE_CODE (TREE_OPERAND (op0, 0)) == LABEL_DECL
-              && TREE_CODE (op1) == ADDR_EXPR
-              && TREE_CODE (TREE_OPERAND (op1, 0)) == LABEL_DECL)
-            return null_pointer_node;
-        }
-      break;
-
-    default:
-      break;
-    }
-
-  return 0;
-}
-
 /* Given a type, apply default promotions wrt unnamed function arguments
    and return the new type.  Return NULL_TREE if no change.  */

gcc/c-common.h

@@ -119,8 +119,6 @@ extern void c_common_nodes_and_builtins PROTO((int, int, int));
 extern tree build_va_arg PROTO((tree, tree));
 
-extern tree initializer_constant_valid_p PROTO((tree, tree));
-
 /* Nonzero if the type T promotes to itself.
    ANSI C states explicitly the list of types that promote;
    in particular, short promotes to int even if they have the same width.  */

gcc/calls.c

@@ -1763,6 +1763,7 @@ expand_call (exp, target, ignore)
 	    d = build_decl (VAR_DECL, NULL_TREE, TREE_TYPE (exp));
 	    DECL_RTL (d) = assign_temp (TREE_TYPE (exp), 1, 0, 1);
 	    mark_addressable (d);
+	    mark_temp_addr_taken (DECL_RTL (d));
 	    structure_value_addr = XEXP (DECL_RTL (d), 0);
 	    TREE_USED (d) = 1;
 	    target = 0;

gcc/collect2.c

@@ -1200,7 +1200,12 @@ main (argc, argv)
 	case 'o':
 	  if (arg[2] == '\0')
 	    output_file = *ld1++ = *ld2++ = *++argv;
-	  else
+	  else if (1
+#ifdef SWITCHES_NEED_SPACES
+		   && ! index (SWITCHES_NEED_SPACES, arg[1])
+#endif
+		   )
 	    output_file = &arg[2];
 	  break;

gcc/combine.c

@@ -358,6 +358,7 @@ static void set_nonzero_bits_and_sign_copies PROTO((rtx, rtx, void *));
 static int can_combine_p	PROTO((rtx, rtx, rtx, rtx, rtx *, rtx *));
 static int sets_function_arg_p	PROTO((rtx));
 static int combinable_i3pat	PROTO((rtx, rtx *, rtx, rtx, int, rtx *));
+static int contains_muldiv	PROTO((rtx));
 static rtx try_combine		PROTO((rtx, rtx, rtx));
 static void undo_all		PROTO((void));
 static rtx *find_split_point	PROTO((rtx *, rtx));

@@ -1350,6 +1351,37 @@ combinable_i3pat (i3, loc, i2dest, i1dest, i1_not_in_src, pi3dest_killed)
 
   return 1;
 }
+
+/* Return 1 if X is an arithmetic expression that contains a multiplication
+   and division.  We don't count multiplications by powers of two here.  */
+
+static int
+contains_muldiv (x)
+     rtx x;
+{
+  switch (GET_CODE (x))
+    {
+    case MOD:  case DIV:  case UMOD:  case UDIV:
+      return 1;
+
+    case MULT:
+      return ! (GET_CODE (XEXP (x, 1)) == CONST_INT
+		&& exact_log2 (INTVAL (XEXP (x, 1))) >= 0);
+
+    default:
+      switch (GET_RTX_CLASS (GET_CODE (x)))
+	{
+	case 'c':  case '<':  case '2':
+	  return contains_muldiv (XEXP (x, 0))
+	    || contains_muldiv (XEXP (x, 1));
+
+	case '1':
+	  return contains_muldiv (XEXP (x, 0));
+
+	default:
+	  return 0;
+	}
+    }
+}
 
 /* Try to combine the insns I1 and I2 into I3.
    Here I1 and I2 appear earlier than I3.
    I1 can be zero; then we combine just I2 into I3.

@@ -2201,7 +2233,9 @@ try_combine (i3, i2, i1)
 	  && ! reg_referenced_p (SET_DEST (XVECEXP (newpat, 0, 1)),
 				 XVECEXP (newpat, 0, 0))
 	  && ! reg_referenced_p (SET_DEST (XVECEXP (newpat, 0, 0)),
-				 XVECEXP (newpat, 0, 1)))
+				 XVECEXP (newpat, 0, 1))
+	  && ! (contains_muldiv (SET_SRC (XVECEXP (newpat, 0, 0)))
+		&& contains_muldiv (SET_SRC (XVECEXP (newpat, 0, 1)))))
 	{
 	  /* Normally, it doesn't matter which of the two is done first,
 	     but it does if one references cc0.  In that case, it has to

@@ -3848,12 +3882,16 @@ combine_simplify_rtx (x, op0_mode, last, in_dest)
 	return SUBREG_REG (XEXP (x, 0));
 
       /* If we know that the value is already truncated, we can
-	 replace the TRUNCATE with a SUBREG if TRULY_NOOP_TRUNCATION is
-	 nonzero for the corresponding modes.  */
+	 replace the TRUNCATE with a SUBREG if TRULY_NOOP_TRUNCATION
+	 is nonzero for the corresponding modes.  But don't do this
+	 for an (LSHIFTRT (MULT ...)) since this will cause problems
+	 with the umulXi3_highpart patterns.  */
       if (TRULY_NOOP_TRUNCATION (GET_MODE_BITSIZE (mode),
 				 GET_MODE_BITSIZE (GET_MODE (XEXP (x, 0))))
 	  && num_sign_bit_copies (XEXP (x, 0), GET_MODE (XEXP (x, 0)))
-	     >= GET_MODE_BITSIZE (mode) + 1)
+	     >= GET_MODE_BITSIZE (mode) + 1
+	  && ! (GET_CODE (XEXP (x, 0)) == LSHIFTRT
+		&& GET_CODE (XEXP (XEXP (x, 0), 0)) == MULT))
 	return gen_lowpart_for_combine (mode, XEXP (x, 0));
 
       /* A truncate of a comparison can be replaced with a subreg if

@@ -6898,10 +6936,19 @@ if_then_else_cond (x, ptrue, pfalse)
   rtx cond0, cond1, true0, true1, false0, false1;
   unsigned HOST_WIDE_INT nz;
 
+  /* If we are comparing a value against zero, we are done.  */
+  if ((code == NE || code == EQ)
+      && GET_CODE (XEXP (x, 1)) == CONST_INT && INTVAL (XEXP (x, 1)) == 0)
+    {
+      *ptrue = (code == NE) ? const1_rtx : const0_rtx;
+      *pfalse = (code == NE) ? const0_rtx : const1_rtx;
+      return XEXP (x, 0);
+    }
+
   /* If this is a unary operation whose operand has one of two values, apply
      our opcode to compute those values.  */
-  if (GET_RTX_CLASS (code) == '1'
-      && (cond0 = if_then_else_cond (XEXP (x, 0), &true0, &false0)) != 0)
+  else if (GET_RTX_CLASS (code) == '1'
+	   && (cond0 = if_then_else_cond (XEXP (x, 0), &true0, &false0)) != 0)
     {
       *ptrue = gen_unary (code, mode, GET_MODE (XEXP (x, 0)), true0);
       *pfalse = gen_unary (code, mode, GET_MODE (XEXP (x, 0)), false0);

@@ -10459,6 +10506,32 @@ simplify_comparison (code, pop0, pop1)
 	      continue;
 	    }
 
+	  /* Likewise if OP0 is a PLUS of a sign extension with a
+	     constant, which is usually represented with the PLUS
+	     between the shifts.  */
+	  if (! unsigned_comparison_p
+	      && GET_CODE (XEXP (op0, 1)) == CONST_INT
+	      && GET_CODE (XEXP (op0, 0)) == PLUS
+	      && GET_CODE (XEXP (XEXP (op0, 0), 1)) == CONST_INT
+	      && GET_CODE (XEXP (XEXP (op0, 0), 0)) == ASHIFT
+	      && XEXP (op0, 1) == XEXP (XEXP (XEXP (op0, 0), 0), 1)
+	      && (tmode = mode_for_size (mode_width - INTVAL (XEXP (op0, 1)),
+					 MODE_INT, 1)) != BLKmode
+	      && ((unsigned HOST_WIDE_INT) const_op <= GET_MODE_MASK (tmode)
+		  || ((unsigned HOST_WIDE_INT) - const_op
+		      <= GET_MODE_MASK (tmode))))
+	    {
+	      rtx inner = XEXP (XEXP (XEXP (op0, 0), 0), 0);
+	      rtx add_const = XEXP (XEXP (op0, 0), 1);
+	      rtx new_const = gen_binary (ASHIFTRT, GET_MODE (op0), add_const,
+					  XEXP (op0, 1));
+
+	      op0 = gen_binary (PLUS, tmode,
+				gen_lowpart_for_combine (tmode, inner),
+				new_const);
+	      continue;
+	    }
+
 	  /* ... fall through ... */
 
 	case LSHIFTRT:
 	  /* If we have (compare (xshiftrt FOO N) (const_int C)) and

@@ -10563,6 +10636,17 @@ simplify_comparison (code, pop0, pop1)
 	      && (num_sign_bit_copies (op1, tmode)
 		  > GET_MODE_BITSIZE (tmode) - GET_MODE_BITSIZE (mode))))
 	{
+	  /* If OP0 is an AND and we don't have an AND in MODE either,
+	     make a new AND in the proper mode.  */
+	  if (GET_CODE (op0) == AND
+	      && (add_optab->handlers[(int) mode].insn_code
+		  == CODE_FOR_nothing))
+	    op0 = gen_binary (AND, tmode,
+			      gen_lowpart_for_combine (tmode,
+						       XEXP (op0, 0)),
+			      gen_lowpart_for_combine (tmode,
+						       XEXP (op0, 1)));
+
 	  op0 = gen_lowpart_for_combine (tmode, op0);
 	  op1 = gen_lowpart_for_combine (tmode, op1);
 	  break;

@@ -10690,8 +10774,20 @@ record_value_for_reg (reg, insn, value)
       subst_low_cuid = INSN_CUID (insn);
       tem = get_last_value (reg);
 
+      /* If TEM is simply a binary operation with two CLOBBERs as operands,
+	 it isn't going to be useful and will take a lot of time to process,
+	 so just use the CLOBBER.  */
+
       if (tem)
-	value = replace_rtx (copy_rtx (value), reg, tem);
+	{
+	  if ((GET_RTX_CLASS (GET_CODE (tem)) == '2'
+	       || GET_RTX_CLASS (GET_CODE (tem)) == 'c')
+	      && GET_CODE (XEXP (tem, 0)) == CLOBBER
+	      && GET_CODE (XEXP (tem, 1)) == CLOBBER)
+	    tem = XEXP (tem, 0);
+
+	  value = replace_rtx (copy_rtx (value), reg, tem);
+	}
     }
 
   /* For each register modified, show we don't know its value, that

gcc/config/alpha/alpha.c

@@ -625,6 +625,7 @@ input_operand (op, mode)
       return mode == ptr_mode || mode == DImode;
 
     case REG:
+    case ADDRESSOF:
       return 1;
 
    case SUBREG:

gcc/config/mips/iris5.h

@@ -43,6 +43,9 @@ Boston, MA 02111-1307, USA.  */
 #define LD_INIT_SWITCH "-init"
 #define LD_FINI_SWITCH "-fini"
 
+/* The linker needs a space after "-o".  */
+#define SWITCHES_NEED_SPACES "o"
+
 /* Specify wchar_t types.  */
 #undef WCHAR_TYPE
 #undef WCHAR_TYPE_SIZE

View File

@ -5106,7 +5106,20 @@
DONE;
}")
(define_insn "extzv"
(define_expand "extzv"
[(set (match_operand:SI 0 "register_operand" "")
(zero_extract:SI (match_operand:SI 1 "register_operand" "")
(match_operand:SI 2 "uint5_operand" "")
(match_operand:SI 3 "uint5_operand" "")))]
""
"
{
if (! uint5_operand (operands[2], SImode)
|| ! uint5_operand (operands[3], SImode))
FAIL;
}")
(define_insn ""
[(set (match_operand:SI 0 "register_operand" "=r")
(zero_extract:SI (match_operand:SI 1 "register_operand" "r")
(match_operand:SI 2 "uint5_operand" "")
@ -5126,7 +5139,20 @@
[(set_attr "type" "shift")
(set_attr "length" "4")])
(define_insn "extv"
(define_expand "extv"
[(set (match_operand:SI 0 "register_operand" "")
(sign_extract:SI (match_operand:SI 1 "register_operand" "")
(match_operand:SI 2 "uint5_operand" "")
(match_operand:SI 3 "uint5_operand" "")))]
""
"
{
if (! uint5_operand (operands[2], SImode)
|| ! uint5_operand (operands[3], SImode))
FAIL;
}")
(define_insn ""
[(set (match_operand:SI 0 "register_operand" "=r")
(sign_extract:SI (match_operand:SI 1 "register_operand" "r")
(match_operand:SI 2 "uint5_operand" "")
@@ -5146,7 +5172,20 @@
[(set_attr "type" "shift")
(set_attr "length" "4")])
(define_insn "insv"
(define_expand "insv"
[(set (zero_extract:SI (match_operand:SI 0 "register_operand" "")
(match_operand:SI 1 "uint5_operand" "")
(match_operand:SI 2 "uint5_operand" ""))
(match_operand:SI 3 "arith5_operand" "r,L"))]
""
"
{
if (! uint5_operand (operands[1], SImode)
|| ! uint5_operand (operands[2], SImode))
FAIL;
}")
(define_insn ""
[(set (zero_extract:SI (match_operand:SI 0 "register_operand" "+r,r")
(match_operand:SI 1 "uint5_operand" "")
(match_operand:SI 2 "uint5_operand" ""))

gcc/cse.c

@@ -291,13 +291,14 @@ static rtx this_insn;
static int *reg_next_eqv;
static int *reg_prev_eqv;
struct cse_reg_info {
struct cse_reg_info
{
/* The number of times the register has been altered in the current
basic block. */
int reg_tick;
/* The next cse_reg_info structure in the free or used list. */
struct cse_reg_info* next;
struct cse_reg_info *next;
/* The REG_TICK value at which rtx's containing this register are
valid in the hash table. If this does not equal the current
@@ -576,7 +577,8 @@ static int constant_pool_entries_cost;
/* This data describes a block that will be processed by cse_basic_block. */
struct cse_basic_block_data {
struct cse_basic_block_data
{
/* Lowest CUID value of insns in block. */
int low_cuid;
/* Highest CUID value of insns in block. */
@@ -588,14 +590,15 @@ struct cse_basic_block_data {
/* Size of current branch path, if any. */
int path_size;
/* Current branch path, indicating which branches will be taken. */
struct branch_path {
/* The branch insn. */
rtx branch;
/* Whether it should be taken or not. AROUND is the same as taken
except that it is used when the destination label is not preceded
struct branch_path
{
/* The branch insn. */
rtx branch;
/* Whether it should be taken or not. AROUND is the same as taken
except that it is used when the destination label is not preceded
by a BARRIER. */
enum taken {TAKEN, NOT_TAKEN, AROUND} status;
} path[PATHLENGTH];
enum taken {TAKEN, NOT_TAKEN, AROUND} status;
} path[PATHLENGTH];
};
/* Nonzero if X has the form (PLUS frame-pointer integer). We check for
@@ -692,7 +695,9 @@ static void record_jump_equiv PROTO((rtx, int));
static void record_jump_cond PROTO((enum rtx_code, enum machine_mode,
rtx, rtx, int));
static void cse_insn PROTO((rtx, rtx));
static int note_mem_written PROTO((rtx));
#ifdef AUTO_INC_DEC
static int addr_affects_sp_p PROTO((rtx));
#endif
static void invalidate_from_clobbers PROTO((rtx));
static rtx cse_process_notes PROTO((rtx, rtx));
static void cse_around_loop PROTO((rtx));
@@ -1721,21 +1726,18 @@ flush_hash_table ()
remove_from_table (p, i);
}
}
/* Remove from the hash table, or mark as invalid, all expressions whose
values could be altered by storing in X. X is a register, a subreg, or
a memory reference with nonvarying address (because, when a memory
reference with a varying address is stored in, all memory references are
removed by invalidate_memory so specific invalidation is superfluous).
FULL_MODE, if not VOIDmode, indicates that this much should be
invalidated instead of just the amount indicated by the mode of X. This
is only used for bitfield stores into memory.
/* Remove from the hash table, or mark as invalid,
all expressions whose values could be altered by storing in X.
X is a register, a subreg, or a memory reference with nonvarying address
(because, when a memory reference with a varying address is stored in,
all memory references are removed by invalidate_memory
so specific invalidation is superfluous).
FULL_MODE, if not VOIDmode, indicates that this much should be invalidated
instead of just the amount indicated by the mode of X. This is only used
for bitfield stores into memory.
A nonvarying address may be just a register or just
a symbol reference, or it may be either of those plus
a numeric offset. */
A nonvarying address may be just a register or just a symbol reference,
or it may be either of those plus a numeric offset. */
static void
invalidate (x, full_mode)
@@ -1745,130 +1747,118 @@ invalidate (x, full_mode)
register int i;
register struct table_elt *p;
/* If X is a register, dependencies on its contents
are recorded through the qty number mechanism.
Just change the qty number of the register,
mark it as invalid for expressions that refer to it,
and remove it itself. */
if (GET_CODE (x) == REG)
switch (GET_CODE (x))
{
register int regno = REGNO (x);
register unsigned hash = HASH (x, GET_MODE (x));
case REG:
{
/* If X is a register, dependencies on its contents are recorded
through the qty number mechanism. Just change the qty number of
the register, mark it as invalid for expressions that refer to it,
and remove it itself. */
register int regno = REGNO (x);
register unsigned hash = HASH (x, GET_MODE (x));
/* Remove REGNO from any quantity list it might be on and indicate
that its value might have changed. If it is a pseudo, remove its
entry from the hash table.
/* Remove REGNO from any quantity list it might be on and indicate
that its value might have changed. If it is a pseudo, remove its
entry from the hash table.
For a hard register, we do the first two actions above for any
additional hard registers corresponding to X. Then, if any of these
registers are in the table, we must remove any REG entries that
overlap these registers. */
For a hard register, we do the first two actions above for any
additional hard registers corresponding to X. Then, if any of these
registers are in the table, we must remove any REG entries that
overlap these registers. */
delete_reg_equiv (regno);
REG_TICK (regno)++;
delete_reg_equiv (regno);
REG_TICK (regno)++;
if (regno >= FIRST_PSEUDO_REGISTER)
{
/* Because a register can be referenced in more than one mode,
we might have to remove more than one table entry. */
if (regno >= FIRST_PSEUDO_REGISTER)
{
/* Because a register can be referenced in more than one mode,
we might have to remove more than one table entry. */
struct table_elt *elt;
struct table_elt *elt;
while ((elt = lookup_for_remove (x, hash, GET_MODE (x))))
remove_from_table (elt, hash);
}
else
{
HOST_WIDE_INT in_table
= TEST_HARD_REG_BIT (hard_regs_in_table, regno);
int endregno = regno + HARD_REGNO_NREGS (regno, GET_MODE (x));
int tregno, tendregno;
register struct table_elt *p, *next;
while ((elt = lookup_for_remove (x, hash, GET_MODE (x))))
remove_from_table (elt, hash);
}
else
{
HOST_WIDE_INT in_table
= TEST_HARD_REG_BIT (hard_regs_in_table, regno);
int endregno = regno + HARD_REGNO_NREGS (regno, GET_MODE (x));
int tregno, tendregno;
register struct table_elt *p, *next;
CLEAR_HARD_REG_BIT (hard_regs_in_table, regno);
CLEAR_HARD_REG_BIT (hard_regs_in_table, regno);
for (i = regno + 1; i < endregno; i++)
{
in_table |= TEST_HARD_REG_BIT (hard_regs_in_table, i);
CLEAR_HARD_REG_BIT (hard_regs_in_table, i);
delete_reg_equiv (i);
REG_TICK (i)++;
}
for (i = regno + 1; i < endregno; i++)
{
in_table |= TEST_HARD_REG_BIT (hard_regs_in_table, i);
CLEAR_HARD_REG_BIT (hard_regs_in_table, i);
delete_reg_equiv (i);
REG_TICK (i)++;
}
if (in_table)
for (hash = 0; hash < NBUCKETS; hash++)
for (p = table[hash]; p; p = next)
{
next = p->next_same_hash;
if (in_table)
for (hash = 0; hash < NBUCKETS; hash++)
for (p = table[hash]; p; p = next)
{
next = p->next_same_hash;
if (GET_CODE (p->exp) != REG
|| REGNO (p->exp) >= FIRST_PSEUDO_REGISTER)
continue;
tregno = REGNO (p->exp);
tendregno
= tregno + HARD_REGNO_NREGS (tregno, GET_MODE (p->exp));
if (tendregno > regno && tregno < endregno)
remove_from_table (p, hash);
}
}
tregno = REGNO (p->exp);
tendregno
= tregno + HARD_REGNO_NREGS (tregno, GET_MODE (p->exp));
if (tendregno > regno && tregno < endregno)
remove_from_table (p, hash);
}
}
}
return;
}
if (GET_CODE (x) == SUBREG)
{
if (GET_CODE (SUBREG_REG (x)) != REG)
abort ();
case SUBREG:
invalidate (SUBREG_REG (x), VOIDmode);
return;
}
/* If X is a parallel, invalidate all of its elements. */
if (GET_CODE (x) == PARALLEL)
{
case PARALLEL:
for (i = XVECLEN (x, 0) - 1; i >= 0 ; --i)
invalidate (XVECEXP (x, 0, i), VOIDmode);
return;
}
/* If X is an expr_list, this is part of a disjoint return value;
extract the location in question ignoring the offset. */
if (GET_CODE (x) == EXPR_LIST)
{
case EXPR_LIST:
/* This is part of a disjoint return value; extract the location in
question ignoring the offset. */
invalidate (XEXP (x, 0), VOIDmode);
return;
}
/* X is not a register; it must be a memory reference with
a nonvarying address. Remove all hash table elements
that refer to overlapping pieces of memory. */
case MEM:
/* Remove all hash table elements that refer to overlapping pieces of
memory. */
if (full_mode == VOIDmode)
full_mode = GET_MODE (x);
if (GET_CODE (x) != MEM)
abort ();
if (full_mode == VOIDmode)
full_mode = GET_MODE (x);
for (i = 0; i < NBUCKETS; i++)
{
register struct table_elt *next;
for (p = table[i]; p; p = next)
for (i = 0; i < NBUCKETS; i++)
{
next = p->next_same_hash;
/* Invalidate ASM_OPERANDS which reference memory (this is easier
than checking all the aliases). */
if (p->in_memory
&& (GET_CODE (p->exp) != MEM
|| true_dependence (x, full_mode, p->exp, cse_rtx_varies_p)))
remove_from_table (p, i);
register struct table_elt *next;
for (p = table[i]; p; p = next)
{
next = p->next_same_hash;
if (p->in_memory
&& (GET_CODE (p->exp) != MEM
|| true_dependence (x, full_mode, p->exp,
cse_rtx_varies_p)))
remove_from_table (p, i);
}
}
return;
default:
abort ();
}
}
/* Remove all expressions that refer to register REGNO,
since they are already invalid, and we are about to
mark that register valid again and don't want the old
@@ -2215,7 +2205,9 @@ canon_hash (x, mode)
return hash;
case MEM:
if (MEM_VOLATILE_P (x))
/* We don't record if marked volatile or if BLKmode since we don't
know the size of the move. */
if (MEM_VOLATILE_P (x) || GET_MODE (x) == BLKmode)
{
do_not_record = 1;
return 0;
@@ -6171,6 +6163,7 @@ cse_insn (insn, libcall_insn)
}
/* Remove from the hash table all expressions that reference memory. */
static void
invalidate_memory ()
{
@@ -6186,13 +6179,15 @@ invalidate_memory ()
}
}
/* XXX ??? The name of this function bears little resemblance to
what this function actually does. FIXME. */
#ifdef AUTO_INC_DEC
/* If ADDR is an address that implicitly affects the stack pointer, return
1 and update the register tables to show the effect. Else, return 0. */
static int
note_mem_written (addr)
addr_affects_sp_p (addr)
register rtx addr;
{
/* Pushing or popping the stack invalidates just the stack pointer. */
if ((GET_CODE (addr) == PRE_DEC || GET_CODE (addr) == PRE_INC
|| GET_CODE (addr) == POST_DEC || GET_CODE (addr) == POST_INC)
&& GET_CODE (XEXP (addr, 0)) == REG
@@ -6204,10 +6199,13 @@ note_mem_written (addr)
/* This should be *very* rare. */
if (TEST_HARD_REG_BIT (hard_regs_in_table, STACK_POINTER_REGNUM))
invalidate (stack_pointer_rtx, VOIDmode);
return 1;
}
return 0;
}
#endif
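For context, the stack-pointer test that addr_affects_sp_p performs can be modeled in isolation. The sketch below is hypothetical: it uses an invented `struct addr` and register number rather than GCC's real rtx representation, purely to illustrate the rule that a pre/post increment or decrement applied to the stack pointer implicitly modifies it.

```c
/* Simplified, invented stand-ins for GCC's rtx codes and operands. */
enum addr_code
{
  REG_ADDR, PRE_DEC_ADDR, PRE_INC_ADDR, POST_DEC_ADDR, POST_INC_ADDR
};

struct addr
{
  enum addr_code code;
  int regno;                    /* register the side effect applies to */
};

#define STACK_POINTER_REGNUM 7  /* assumed register number, for illustration */

/* Pushing or popping the stack invalidates just the stack pointer. */
static int
model_addr_affects_sp_p (const struct addr *a)
{
  return (a->code == PRE_DEC_ADDR || a->code == PRE_INC_ADDR
          || a->code == POST_DEC_ADDR || a->code == POST_INC_ADDR)
         && a->regno == STACK_POINTER_REGNUM;
}
```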
/* Perform invalidation on the basis of everything about an insn
except for invalidating the actual places that are SET in it.
@@ -6432,7 +6430,9 @@ invalidate_skipped_set (dest, set, data)
enum rtx_code code = GET_CODE (dest);
if (code == MEM
&& ! note_mem_written (dest) /* If this is not a stack push ... */
#ifdef AUTO_INC_DEC
&& ! addr_affects_sp_p (dest) /* If this is not a stack push ... */
#endif
/* There are times when an address can appear varying and be a PLUS
during this scan when it would be a fixed address were we to know
the proper equivalences. So invalidate all memory if there is
@@ -6605,10 +6605,13 @@ cse_set_around_loop (x, insn, loop_start)
}
}
/* Now invalidate anything modified by X. */
note_mem_written (SET_DEST (x));
#ifdef AUTO_INC_DEC
/* Deal with the destination of X affecting the stack pointer. */
addr_affects_sp_p (SET_DEST (x));
#endif
/* See comment on similar code in cse_insn for explanation of these tests. */
/* See comment on similar code in cse_insn for explanation of these
tests. */
if (GET_CODE (SET_DEST (x)) == REG || GET_CODE (SET_DEST (x)) == SUBREG
|| GET_CODE (SET_DEST (x)) == MEM)
invalidate (SET_DEST (x), VOIDmode);


@@ -1153,25 +1153,48 @@ dbxout_type (type, full, show_arg_types)
dbxout_type_index (type);
fprintf (asmfile, ";0;127;");
}
/* This used to check if the type's precision was more than
HOST_BITS_PER_WIDE_INT. That is wrong since gdb uses a
long (it has no concept of HOST_BITS_PER_WIDE_INT). */
else if (use_gnu_debug_info_extensions
&& (TYPE_PRECISION (type) > TYPE_PRECISION (integer_type_node)
|| TYPE_PRECISION (type) >= HOST_BITS_PER_LONG))
{
/* This used to say `r1' and we used to take care
to make sure that `int' was type number 1. */
fprintf (asmfile, "r");
dbxout_type_index (integer_type_node);
fprintf (asmfile, ";");
print_int_cst_octal (TYPE_MIN_VALUE (type));
fprintf (asmfile, ";");
print_int_cst_octal (TYPE_MAX_VALUE (type));
fprintf (asmfile, ";");
}
else /* Output other integer types as subranges of `int'. */
/* If this is a subtype of another integer type, always prefer to
write it as a subtype. */
else if (TREE_TYPE (type) != 0
&& TREE_CODE (TREE_TYPE (type)) == INTEGER_CST)
dbxout_range_type (type);
else
{
/* If the size is non-standard, say what it is if we can use
GDB extensions. */
if (use_gnu_debug_info_extensions
&& TYPE_PRECISION (type) != TYPE_PRECISION (integer_type_node))
fprintf (asmfile, "@s%d;", TYPE_PRECISION (type));
/* If we can use GDB extensions and the size is wider than a
long (the size used by GDB to read them) or we may have
trouble writing the bounds the usual way, write them in
octal. Note the test is for the *target's* size of "long",
not that of the host. The host test is just to make sure we
can write it out in case the host wide int is narrower than the
target "long". */
if (use_gnu_debug_info_extensions
&& (TYPE_PRECISION (type) > TYPE_PRECISION (integer_type_node)
|| TYPE_PRECISION (type) > HOST_BITS_PER_WIDE_INT))
{
fprintf (asmfile, "r");
dbxout_type_index (type);
fprintf (asmfile, ";");
print_int_cst_octal (TYPE_MIN_VALUE (type));
fprintf (asmfile, ";");
print_int_cst_octal (TYPE_MAX_VALUE (type));
fprintf (asmfile, ";");
}
else
/* Output other integer types as subranges of `int'. */
dbxout_range_type (type);
}
CHARS (22);
break;
@@ -2427,17 +2450,26 @@ dbxout_parms (parms)
&& ! CONSTANT_P (XEXP (DECL_RTL (parms), 0)))
{
/* Parm was passed in registers but lives on the stack. */
int aux_sym_value = 0;
current_sym_code = N_PSYM;
/* DECL_RTL looks like (MEM (PLUS (REG...) (CONST_INT...))),
in which case we want the value of that CONST_INT,
or (MEM (REG ...)) or (MEM (MEM ...)),
in which case we use a value of zero. */
if (GET_CODE (XEXP (DECL_RTL (parms), 0)) == REG
|| GET_CODE (XEXP (DECL_RTL (parms), 0)) == MEM)
if (GET_CODE (XEXP (DECL_RTL (parms), 0)) == REG)
current_sym_value = 0;
else if (GET_CODE (XEXP (DECL_RTL (parms), 0)) == MEM)
{
/* Remember the location on the stack the parm is moved to. */
aux_sym_value
= INTVAL (XEXP (XEXP (XEXP (DECL_RTL (parms), 0), 0), 1));
current_sym_value = 0;
}
else
current_sym_value = INTVAL (XEXP (XEXP (DECL_RTL (parms), 0), 1));
current_sym_value
= INTVAL (XEXP (XEXP (DECL_RTL (parms), 0), 1));
current_sym_addr = 0;
/* Make a big endian correction if the mode of the type of the
@@ -2452,7 +2484,8 @@ dbxout_parms (parms)
FORCE_TEXT;
if (DECL_NAME (parms))
{
current_sym_nchars = 2 + strlen (IDENTIFIER_POINTER (DECL_NAME (parms)));
current_sym_nchars
= 2 + strlen (IDENTIFIER_POINTER (DECL_NAME (parms)));
fprintf (asmfile, "%s \"%s:%c", ASM_STABS_OP,
IDENTIFIER_POINTER (DECL_NAME (parms)),
@@ -2470,6 +2503,17 @@ dbxout_parms (parms)
XEXP (DECL_RTL (parms), 0));
dbxout_type (TREE_TYPE (parms), 0, 0);
dbxout_finish_symbol (parms);
if (aux_sym_value != 0)
{
/* Generate an entry for the stack location. */
fprintf (asmfile, "%s \"%s:", ASM_STABS_OP,
IDENTIFIER_POINTER (DECL_NAME (parms)));
current_sym_value = aux_sym_value;
current_sym_code = N_LSYM;
dbxout_type (build_reference_type (TREE_TYPE (parms)), 0, 0);
dbxout_finish_symbol (parms);
}
}
}
}


@@ -44,6 +44,7 @@ Boston, MA 02111-1307, USA. */
#include "dwarf2out.h"
#include "toplev.h"
#include "dyn-string.h"
#include <ctype.h>
/* We cannot use <assert.h> in GCC source, since that would include
GCC's assert.h, which may not be compatible with the host compiler. */
@@ -65,6 +66,7 @@ dwarf2out_do_frame ()
|| DWARF2_FRAME_INFO
#endif
#ifdef DWARF2_UNWIND_INFO
|| flag_unwind_tables
|| (flag_exceptions && ! exceptions_via_longjmp)
#endif
);
@@ -1904,11 +1906,11 @@ dwarf2out_frame_finish ()
#ifdef MIPS_DEBUGGING_INFO
if (write_symbols == DWARF2_DEBUG)
output_call_frame_info (0);
if (flag_exceptions && ! exceptions_via_longjmp)
if (flag_unwind_tables || (flag_exceptions && ! exceptions_via_longjmp))
output_call_frame_info (1);
#else
if (write_symbols == DWARF2_DEBUG
|| (flag_exceptions && ! exceptions_via_longjmp))
|| flag_unwind_tables || (flag_exceptions && ! exceptions_via_longjmp))
output_call_frame_info (1);
#endif
}
@@ -2441,7 +2443,6 @@ static int constant_size PROTO((long unsigned));
static unsigned long size_of_die PROTO((dw_die_ref));
static void calc_die_sizes PROTO((dw_die_ref));
static unsigned long size_of_line_prolog PROTO((void));
static unsigned long size_of_line_info PROTO((void));
static unsigned long size_of_pubnames PROTO((void));
static unsigned long size_of_aranges PROTO((void));
static enum dwarf_form value_format PROTO((dw_val_ref));
@@ -2655,6 +2656,18 @@ static char debug_line_section_label[MAX_ARTIFICIAL_LABEL_BYTES];
while (0)
#endif
/* We allow a language front-end to designate a function that is to be
called to "demangle" any name before it is put into a DIE. */
static char *(*demangle_name_func) PROTO((char *));
void
dwarf2out_set_demangle_name_func (func)
char *(*func) PROTO((char *));
{
demangle_name_func = func;
}
/* Convert an integer constant expression into assembler syntax. Addition
and subtraction are the only arithmetic that may appear in these
expressions. This is an adaptation of output_addr_const in final.c.
@@ -4720,165 +4733,6 @@ size_of_line_prolog ()
return size;
}
/* Return the size of the line information generated for this
compilation unit. */
static unsigned long
size_of_line_info ()
{
register unsigned long size;
register unsigned long lt_index;
register unsigned long current_line;
register long line_offset;
register long line_delta;
register unsigned long current_file;
register unsigned long function;
unsigned long size_of_set_address;
/* Size of a DW_LNE_set_address instruction. */
size_of_set_address = 1 + size_of_uleb128 (1 + PTR_SIZE) + 1 + PTR_SIZE;
/* Version number. */
size = 2;
/* Prolog length specifier. */
size += DWARF_OFFSET_SIZE;
/* Prolog. */
size += size_of_line_prolog ();
current_file = 1;
current_line = 1;
for (lt_index = 1; lt_index < line_info_table_in_use; ++lt_index)
{
register dw_line_info_ref line_info = &line_info_table[lt_index];
if (line_info->dw_line_num == current_line
&& line_info->dw_file_num == current_file)
continue;
/* Advance pc instruction. */
/* ??? See the DW_LNS_advance_pc comment in output_line_info. */
if (0)
size += 1 + 2;
else
size += size_of_set_address;
if (line_info->dw_file_num != current_file)
{
/* Set file number instruction. */
size += 1;
current_file = line_info->dw_file_num;
size += size_of_uleb128 (current_file);
}
if (line_info->dw_line_num != current_line)
{
line_offset = line_info->dw_line_num - current_line;
line_delta = line_offset - DWARF_LINE_BASE;
current_line = line_info->dw_line_num;
if (line_delta >= 0 && line_delta < (DWARF_LINE_RANGE - 1))
/* 1-byte special line number instruction. */
size += 1;
else
{
/* Advance line instruction. */
size += 1;
size += size_of_sleb128 (line_offset);
/* Generate line entry instruction. */
size += 1;
}
}
}
/* Advance pc instruction. */
if (0)
size += 1 + 2;
else
size += size_of_set_address;
/* End of line number info. marker. */
size += 1 + size_of_uleb128 (1) + 1;
function = 0;
current_file = 1;
current_line = 1;
for (lt_index = 0; lt_index < separate_line_info_table_in_use; )
{
register dw_separate_line_info_ref line_info
= &separate_line_info_table[lt_index];
if (line_info->dw_line_num == current_line
&& line_info->dw_file_num == current_file
&& line_info->function == function)
goto cont;
if (function != line_info->function)
{
function = line_info->function;
/* Set address register instruction. */
size += size_of_set_address;
}
else
{
/* Advance pc instruction. */
if (0)
size += 1 + 2;
else
size += size_of_set_address;
}
if (line_info->dw_file_num != current_file)
{
/* Set file number instruction. */
size += 1;
current_file = line_info->dw_file_num;
size += size_of_uleb128 (current_file);
}
if (line_info->dw_line_num != current_line)
{
line_offset = line_info->dw_line_num - current_line;
line_delta = line_offset - DWARF_LINE_BASE;
current_line = line_info->dw_line_num;
if (line_delta >= 0 && line_delta < (DWARF_LINE_RANGE - 1))
/* 1-byte special line number instruction. */
size += 1;
else
{
/* Advance line instruction. */
size += 1;
size += size_of_sleb128 (line_offset);
/* Generate line entry instruction. */
size += 1;
}
}
cont:
++lt_index;
/* If we're done with a function, end its sequence. */
if (lt_index == separate_line_info_table_in_use
|| separate_line_info_table[lt_index].function != function)
{
current_file = 1;
current_line = 1;
/* Advance pc instruction. */
if (0)
size += 1 + 2;
else
size += size_of_set_address;
/* End of line number info. marker. */
size += 1 + size_of_uleb128 (1) + 1;
}
}
return size;
}
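The size computations above depend on the LEB128 size helpers defined elsewhere in dwarf2out.c. As a point of reference, minimal standalone versions are sketched below, assuming the standard DWARF LEB128 encodings (7 payload bits per byte) and an arithmetic right shift for signed values, as GCC's own helpers do:

```c
/* Number of bytes needed to encode VALUE as an unsigned LEB128. */
static unsigned long
size_of_uleb128 (unsigned long value)
{
  unsigned long size = 0;

  do
    {
      value >>= 7;
      size++;
    }
  while (value != 0);

  return size;
}

/* Number of bytes needed to encode VALUE as a signed LEB128. */
static unsigned long
size_of_sleb128 (long value)
{
  unsigned long size = 0;
  int byte;

  do
    {
      byte = value & 0x7f;
      value >>= 7;          /* assumes arithmetic right shift */
      size++;
    }
  while (! ((value == 0 && (byte & 0x40) == 0)
            || (value == -1 && (byte & 0x40) != 0)));

  return size;
}
```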
/* Return the size of the .debug_pubnames table generated for the
compilation unit. */
@@ -5650,10 +5504,7 @@ output_aranges ()
}
/* Output the source line number correspondence information. This
information goes into the .debug_line section.
If the format of this data changes, then the function size_of_line_info
must also be adjusted the same way. */
information goes into the .debug_line section. */
static void
output_line_info ()
@@ -5670,12 +5521,13 @@ output_line_info ()
register unsigned long current_file;
register unsigned long function;
ASM_OUTPUT_DWARF_DATA (asm_out_file, size_of_line_info ());
ASM_OUTPUT_DWARF_DELTA (asm_out_file, ".LTEND", ".LTSTART");
if (flag_debug_asm)
fprintf (asm_out_file, "\t%s Length of Source Line Info.",
ASM_COMMENT_START);
fputc ('\n', asm_out_file);
ASM_OUTPUT_LABEL (asm_out_file, ".LTSTART");
ASM_OUTPUT_DWARF_DATA2 (asm_out_file, DWARF_VERSION);
if (flag_debug_asm)
fprintf (asm_out_file, "\t%s DWARF Version", ASM_COMMENT_START);
@@ -5932,6 +5784,7 @@ output_line_info ()
}
/* Output the marker for the end of the line number info. */
ASM_OUTPUT_LABEL (asm_out_file, ".LTEND");
ASM_OUTPUT_DWARF_DATA1 (asm_out_file, 0);
if (flag_debug_asm)
fprintf (asm_out_file, "\t%s DW_LNE_end_sequence", ASM_COMMENT_START);
@@ -6234,6 +6087,9 @@ base_type_die (type)
}
base_type_result = new_die (DW_TAG_base_type, comp_unit_die);
if (demangle_name_func)
type_name = (*demangle_name_func) (type_name);
add_AT_string (base_type_result, DW_AT_name, type_name);
add_AT_unsigned (base_type_result, DW_AT_byte_size,
int_size_in_bytes (type));
@@ -6774,17 +6630,20 @@ field_byte_offset (decl)
bitpos_tree = DECL_FIELD_BITPOS (decl);
field_size_tree = DECL_SIZE (decl);
/* We cannot yet cope with fields whose positions or sizes are variable, so
/* We cannot yet cope with fields whose positions are variable, so
for now, when we see such things, we simply return 0. Someday, we may
be able to handle such cases, but it will be damn difficult. */
if (TREE_CODE (bitpos_tree) != INTEGER_CST)
return 0;
bitpos_int = (unsigned) TREE_INT_CST_LOW (bitpos_tree);
if (TREE_CODE (field_size_tree) != INTEGER_CST)
return 0;
/* If we don't know the size of the field, pretend it's a full word. */
if (TREE_CODE (field_size_tree) == INTEGER_CST)
field_size_in_bits = (unsigned) TREE_INT_CST_LOW (field_size_tree);
else
field_size_in_bits = BITS_PER_WORD;
field_size_in_bits = (unsigned) TREE_INT_CST_LOW (field_size_tree);
type_size_in_bits = simple_type_size_in_bits (type);
type_align_in_bits = simple_type_align_in_bits (type);
type_align_in_bytes = type_align_in_bits / BITS_PER_UNIT;
@@ -7229,7 +7088,12 @@ add_name_attribute (die, name_string)
register const char *name_string;
{
if (name_string != NULL && *name_string != 0)
add_AT_string (die, DW_AT_name, name_string);
{
if (demangle_name_func)
name_string = (*demangle_name_func) (name_string);
add_AT_string (die, DW_AT_name, name_string);
}
}
/* Given a tree node describing an array bound (either lower or upper) output
@@ -7923,6 +7787,7 @@ gen_array_type_die (type, context_die)
#endif
add_subscript_info (array_die, type);
add_name_attribute (array_die, type_tag (type));
equate_type_number_to_die (type, array_die);
/* Add representation of the type of the elements of this array type. */
@@ -8422,13 +8287,13 @@ gen_subprogram_die (decl, context_die)
inline is not saved anywhere. */
if (DECL_DEFER_OUTPUT (decl))
{
if (DECL_INLINE (decl))
if (DECL_INLINE (decl) && !flag_no_inline)
add_AT_unsigned (subr_die, DW_AT_inline, DW_INL_declared_inlined);
else
add_AT_unsigned (subr_die, DW_AT_inline,
DW_INL_declared_not_inlined);
}
else if (DECL_INLINE (decl))
else if (DECL_INLINE (decl) && !flag_no_inline)
add_AT_unsigned (subr_die, DW_AT_inline, DW_INL_inlined);
else
abort ();
@@ -9577,6 +9442,30 @@ gen_decl_die (decl, context_die)
}
}
/* Add Ada "use" clause information for SGI Workshop debugger. */
void
dwarf2out_add_library_unit_info (filename, context_list)
char *filename;
char *context_list;
{
unsigned int file_index;
if (filename != NULL)
{
dw_die_ref unit_die = new_die (DW_TAG_module, comp_unit_die);
tree context_list_decl
= build_decl (LABEL_DECL, get_identifier (context_list),
void_type_node);
TREE_PUBLIC (context_list_decl) = TRUE;
add_name_attribute (unit_die, context_list);
file_index = lookup_filename (filename);
add_AT_unsigned (unit_die, DW_AT_decl_file, file_index);
add_pubname (context_list_decl, unit_die);
}
}
/* Write the debugging output for DECL. */
void


@@ -1379,6 +1379,19 @@ allocate_dynamic_stack_space (size, target, known_align)
return target;
}
/* A front end may want to override GCC's stack checking by providing a
run-time routine to call to check the stack, so provide a mechanism for
calling that routine. */
static rtx stack_check_libfunc;
void
set_stack_check_libfunc (libfunc)
rtx libfunc;
{
stack_check_libfunc = libfunc;
}
/* Emit one stack probe at ADDRESS, an address within the stack. */
static void
@@ -1412,9 +1425,19 @@ probe_stack_range (first, size)
HOST_WIDE_INT first;
rtx size;
{
/* First see if we have an insn to check the stack. Use it if so. */
/* First see if the front end has set up a function for us to call to
check the stack. */
if (stack_check_libfunc != 0)
emit_library_call (stack_check_libfunc, 0, VOIDmode, 1,
memory_address (QImode,
gen_rtx (STACK_GROW_OP, Pmode,
stack_pointer_rtx,
plus_constant (size, first))),
ptr_mode);
/* Next see if we have an insn to check the stack. Use it if so. */
#ifdef HAVE_check_stack
if (HAVE_check_stack)
else if (HAVE_check_stack)
{
insn_operand_predicate_fn pred;
rtx last_addr
@@ -1428,14 +1451,13 @@ probe_stack_range (first, size)
last_addr = copy_to_mode_reg (Pmode, last_addr);
emit_insn (gen_check_stack (last_addr));
return;
}
#endif
/* If we have to generate explicit probes, see if we have a constant
small number of them to generate. If so, that's the easy case. */
if (GET_CODE (size) == CONST_INT
&& INTVAL (size) < 10 * STACK_CHECK_PROBE_INTERVAL)
else if (GET_CODE (size) == CONST_INT
&& INTVAL (size) < 10 * STACK_CHECK_PROBE_INTERVAL)
{
HOST_WIDE_INT offset;


@@ -90,6 +90,9 @@ int do_preexpand_calls = 1;
infinite recursion. */
static int in_check_memory_usage;
/* Chain of pending expressions for PLACEHOLDER_EXPR to replace. */
static tree placeholder_list = 0;
/* This structure is used by move_by_pieces to describe the move to
be performed. */
struct move_by_pieces
@@ -153,6 +156,8 @@ static tree init_noncopied_parts PROTO((tree, tree));
static int safe_from_p PROTO((rtx, tree, int));
static int fixed_type_p PROTO((tree));
static rtx var_rtx PROTO((tree));
static int readonly_fields_p PROTO((tree));
static rtx expand_expr_unaligned PROTO((tree, int *));
static rtx expand_increment PROTO((tree, int, int));
static void preexpand_calls PROTO((tree));
static void do_jump_by_parts_greater PROTO((tree, int, rtx, rtx));
@@ -3492,13 +3497,20 @@ expand_assignment (to, from, want_value, suggest_reg)
}
/* Don't move directly into a return register. */
if (TREE_CODE (to) == RESULT_DECL && GET_CODE (to_rtx) == REG)
if (TREE_CODE (to) == RESULT_DECL
&& (GET_CODE (to_rtx) == REG || GET_CODE (to_rtx) == PARALLEL))
{
rtx temp;
push_temp_slots ();
temp = expand_expr (from, 0, GET_MODE (to_rtx), 0);
emit_move_insn (to_rtx, temp);
if (GET_CODE (to_rtx) == PARALLEL)
emit_group_load (to_rtx, temp, int_size_in_bytes (TREE_TYPE (from)),
TYPE_ALIGN (TREE_TYPE (from)) / BITS_PER_UNIT);
else
emit_move_insn (to_rtx, temp);
preserve_temp_slots (to_rtx);
free_temp_slots ();
pop_temp_slots ();
@@ -4142,7 +4154,11 @@ store_constructor (exp, target, align, cleared)
if (cleared && is_zeros_p (TREE_VALUE (elt)))
continue;
bitsize = TREE_INT_CST_LOW (DECL_SIZE (field));
if (TREE_CODE (DECL_SIZE (field)) == INTEGER_CST)
bitsize = TREE_INT_CST_LOW (DECL_SIZE (field));
else
bitsize = -1;
unsignedp = TREE_UNSIGNED (field);
mode = DECL_MODE (field);
if (DECL_BIT_FIELD (field))
@@ -4317,9 +4333,18 @@ store_constructor (exp, target, align, cleared)
if (cleared && is_zeros_p (value))
continue;
mode = TYPE_MODE (elttype);
bitsize = GET_MODE_BITSIZE (mode);
unsignedp = TREE_UNSIGNED (elttype);
mode = TYPE_MODE (elttype);
if (mode == BLKmode)
{
if (TREE_CODE (TYPE_SIZE (elttype)) == INTEGER_CST
&& TREE_INT_CST_HIGH (TYPE_SIZE (elttype)) == 0)
bitsize = TREE_INT_CST_LOW (TYPE_SIZE (elttype));
else
bitsize = -1;
}
else
bitsize = GET_MODE_BITSIZE (mode);
if (index != NULL_TREE && TREE_CODE (index) == RANGE_EXPR)
{
@@ -4709,9 +4734,19 @@ store_field (target, bitsize, bitpos, mode, exp, value_mode,
|| GET_CODE (target) == SUBREG
/* If the field isn't aligned enough to store as an ordinary memref,
store it as a bit field. */
|| (SLOW_UNALIGNED_ACCESS
&& align * BITS_PER_UNIT < GET_MODE_ALIGNMENT (mode))
|| (SLOW_UNALIGNED_ACCESS && bitpos % GET_MODE_ALIGNMENT (mode) != 0))
|| (mode != BLKmode && SLOW_UNALIGNED_ACCESS
&& (align * BITS_PER_UNIT < GET_MODE_ALIGNMENT (mode)
|| bitpos % GET_MODE_ALIGNMENT (mode)))
|| (mode == BLKmode && SLOW_UNALIGNED_ACCESS
&& (TYPE_ALIGN (TREE_TYPE (exp)) > align * BITS_PER_UNIT
|| bitpos % TYPE_ALIGN (TREE_TYPE (exp)) != 0))
/* If the RHS and field are a constant size and the size of the
RHS isn't the same size as the bitfield, we must use bitfield
operations. */
|| ((bitsize >= 0
&& TREE_CODE (TYPE_SIZE (TREE_TYPE (exp))) == INTEGER_CST)
&& (TREE_INT_CST_HIGH (TYPE_SIZE (TREE_TYPE (exp))) != 0
|| TREE_INT_CST_LOW (TYPE_SIZE (TREE_TYPE (exp))) != bitsize)))
{
rtx temp = expand_expr (exp, NULL_RTX, VOIDmode, 0);
@@ -4746,10 +4781,14 @@ store_field (target, bitsize, bitpos, mode, exp, value_mode,
plus_constant (XEXP (target, 0),
bitpos / BITS_PER_UNIT));
/* Find an alignment that is consistent with the bit position. */
while ((bitpos % (align * BITS_PER_UNIT)) != 0)
align >>= 1;
emit_block_move (target, temp,
GEN_INT ((bitsize + BITS_PER_UNIT - 1)
/ BITS_PER_UNIT),
1);
align);
return value_mode == VOIDmode ? const0_rtx : target;
}
@@ -4985,9 +5024,6 @@ get_inner_reference (exp, pbitsize, pbitpos, poffset, pmode,
else if (TREE_CODE (exp) != NON_LVALUE_EXPR
&& ! ((TREE_CODE (exp) == NOP_EXPR
|| TREE_CODE (exp) == CONVERT_EXPR)
&& ! (TREE_CODE (TREE_TYPE (exp)) == UNION_TYPE
&& (TREE_CODE (TREE_TYPE (TREE_OPERAND (exp, 0)))
!= UNION_TYPE))
&& (TYPE_MODE (TREE_TYPE (exp))
== TYPE_MODE (TREE_TYPE (TREE_OPERAND (exp, 0))))))
break;
@@ -5525,6 +5561,25 @@ check_max_integer_computation_mode (exp)
}
#endif
/* Utility function used by expand_expr to see if TYPE, a RECORD_TYPE,
has any readonly fields. If any of the fields have types that
contain readonly fields, return true as well. */
static int
readonly_fields_p (type)
tree type;
{
tree field;
for (field = TYPE_FIELDS (type); field != 0; field = TREE_CHAIN (field))
if (TREE_READONLY (field)
|| (TREE_CODE (TREE_TYPE (field)) == RECORD_TYPE
&& readonly_fields_p (TREE_TYPE (field))))
return 1;
return 0;
}
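The recursive field walk above can be sketched outside GCC's tree representation with a toy field list; every name here is illustrative, not GCC's, and the struct merely stands in for TYPE_FIELDS/TREE_CHAIN:

```c
#include <stddef.h>

/* Toy mirror of readonly_fields_p: each field may be marked readonly,
   and may itself have a record type with fields of its own.  */
struct toy_field {
    int readonly;               /* stands in for TREE_READONLY */
    struct toy_field *record;   /* non-NULL if the field's type is a record */
    struct toy_field *next;     /* stands in for TREE_CHAIN */
};

/* Return 1 if any field, at any nesting depth, is readonly.  */
int toy_readonly_fields_p(const struct toy_field *f)
{
    for (; f != NULL; f = f->next)
        if (f->readonly
            || (f->record && toy_readonly_fields_p(f->record)))
            return 1;
    return 0;
}
```

As in the real predicate, a record is flagged as soon as one readonly field is found at any depth.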
/* expand_expr: generate code for computing expression EXP.
An rtx for the computed value is returned. The value is never null.
@@ -5568,9 +5623,6 @@ expand_expr (exp, target, tmode, modifier)
enum machine_mode tmode;
enum expand_modifier modifier;
{
/* Chain of pending expressions for PLACEHOLDER_EXPR to replace.
This is static so it will be accessible to our recursive callees. */
static tree placeholder_list = 0;
register rtx op0, op1, temp;
tree type = TREE_TYPE (exp);
int unsignedp = TREE_UNSIGNED (type);
@@ -5629,10 +5681,12 @@ expand_expr (exp, target, tmode, modifier)
if (! TREE_SIDE_EFFECTS (exp))
return const0_rtx;
/* Ensure we reference a volatile object even if value is ignored. */
/* Ensure we reference a volatile object even if value is ignored, but
don't do this if all we are doing is taking its address. */
if (TREE_THIS_VOLATILE (exp)
&& TREE_CODE (exp) != FUNCTION_DECL
&& mode != VOIDmode && mode != BLKmode)
&& mode != VOIDmode && mode != BLKmode
&& modifier != EXPAND_CONST_ADDRESS)
{
temp = expand_expr (exp, NULL_RTX, VOIDmode, ro_modifier);
if (GET_CODE (temp) == MEM)
@@ -5640,11 +5694,12 @@ expand_expr (exp, target, tmode, modifier)
return const0_rtx;
}
if (TREE_CODE_CLASS (code) == '1')
if (TREE_CODE_CLASS (code) == '1' || code == COMPONENT_REF
|| code == INDIRECT_REF || code == BUFFER_REF)
return expand_expr (TREE_OPERAND (exp, 0), const0_rtx,
VOIDmode, ro_modifier);
else if (TREE_CODE_CLASS (code) == '2'
|| TREE_CODE_CLASS (code) == '<')
else if (TREE_CODE_CLASS (code) == '2' || TREE_CODE_CLASS (code) == '<'
|| code == ARRAY_REF)
{
expand_expr (TREE_OPERAND (exp, 0), const0_rtx, VOIDmode, ro_modifier);
expand_expr (TREE_OPERAND (exp, 1), const0_rtx, VOIDmode, ro_modifier);
@@ -5656,7 +5711,14 @@ expand_expr (exp, target, tmode, modifier)
the first. */
return expand_expr (TREE_OPERAND (exp, 0), const0_rtx,
VOIDmode, ro_modifier);
else if (code == BIT_FIELD_REF)
{
expand_expr (TREE_OPERAND (exp, 0), const0_rtx, VOIDmode, ro_modifier);
expand_expr (TREE_OPERAND (exp, 1), const0_rtx, VOIDmode, ro_modifier);
expand_expr (TREE_OPERAND (exp, 2), const0_rtx, VOIDmode, ro_modifier);
return const0_rtx;
}
;
target = 0;
}
@@ -6334,6 +6396,14 @@ expand_expr (exp, target, tmode, modifier)
never change. Languages where it can never change should
also set TREE_STATIC. */
RTX_UNCHANGING_P (temp) = TREE_READONLY (exp) & TREE_STATIC (exp);
/* If we are writing to this object and its type is a record with
readonly fields, we must mark it as readonly so it will
conflict with readonly references to those fields. */
if (modifier == EXPAND_MEMORY_USE_WO
&& TREE_CODE (type) == RECORD_TYPE && readonly_fields_p (type))
RTX_UNCHANGING_P (temp) = 1;
return temp;
}
@@ -6516,15 +6586,17 @@ expand_expr (exp, target, tmode, modifier)
!= INTEGER_CST)
? target : NULL_RTX),
VOIDmode,
modifier == EXPAND_INITIALIZER
(modifier == EXPAND_INITIALIZER
|| modifier == EXPAND_CONST_ADDRESS)
? modifier : EXPAND_NORMAL);
/* If this is a constant, put it into a register if it is a
legitimate constant and memory if it isn't. */
legitimate constant and OFFSET is 0 and memory if it isn't. */
if (CONSTANT_P (op0))
{
enum machine_mode mode = TYPE_MODE (TREE_TYPE (tem));
if (mode != BLKmode && LEGITIMATE_CONSTANT_P (op0))
if (mode != BLKmode && LEGITIMATE_CONSTANT_P (op0)
&& offset == 0)
op0 = force_reg (mode, op0);
else
op0 = validize_mem (force_const_mem (mode, op0));
@@ -6534,6 +6606,20 @@ expand_expr (exp, target, tmode, modifier)
{
rtx offset_rtx = expand_expr (offset, NULL_RTX, VOIDmode, 0);
/* If this object is in a register, put it into memory.
This case can't occur in C, but can in Ada if we have
unchecked conversion of an expression from a scalar type to
an array or record type. */
if (GET_CODE (op0) == REG || GET_CODE (op0) == SUBREG
|| GET_CODE (op0) == CONCAT || GET_CODE (op0) == ADDRESSOF)
{
rtx memloc = assign_temp (TREE_TYPE (tem), 1, 1, 1);
mark_temp_addr_taken (memloc);
emit_move_insn (memloc, op0);
op0 = memloc;
}
if (GET_CODE (op0) != MEM)
abort ();
@@ -6546,12 +6632,12 @@ expand_expr (exp, target, tmode, modifier)
#endif
}
/* A constant address in TO_RTX can have VOIDmode, we must not try
/* A constant address in OP0 can have VOIDmode, we must not try
to call force_reg for that case. Avoid that case. */
if (GET_CODE (op0) == MEM
&& GET_MODE (op0) == BLKmode
&& GET_MODE (XEXP (op0, 0)) != VOIDmode
&& bitsize
&& bitsize != 0
&& (bitpos % bitsize) == 0
&& (bitsize % GET_MODE_ALIGNMENT (mode1)) == 0
&& (alignment * BITS_PER_UNIT) == GET_MODE_ALIGNMENT (mode1))
@@ -6625,13 +6711,23 @@ expand_expr (exp, target, tmode, modifier)
&& GET_MODE_CLASS (mode) != MODE_COMPLEX_FLOAT)
/* If the field isn't aligned enough to fetch as a memref,
fetch it as a bit field. */
|| (SLOW_UNALIGNED_ACCESS
&& ((TYPE_ALIGN (TREE_TYPE (tem)) < (unsigned int) GET_MODE_ALIGNMENT (mode))
|| (bitpos % GET_MODE_ALIGNMENT (mode) != 0))))))
|| (mode1 != BLKmode && SLOW_UNALIGNED_ACCESS
&& ((TYPE_ALIGN (TREE_TYPE (tem))
< (unsigned int) GET_MODE_ALIGNMENT (mode))
|| (bitpos % GET_MODE_ALIGNMENT (mode) != 0)))))
|| (modifier != EXPAND_CONST_ADDRESS
&& modifier != EXPAND_INITIALIZER
&& mode == BLKmode
&& SLOW_UNALIGNED_ACCESS
&& (TYPE_ALIGN (type) > alignment * BITS_PER_UNIT
|| bitpos % TYPE_ALIGN (type) != 0)))
{
enum machine_mode ext_mode = mode;
if (ext_mode == BLKmode)
if (ext_mode == BLKmode
&& ! (target != 0 && GET_CODE (op0) == MEM
&& GET_CODE (target) == MEM
&& bitpos % BITS_PER_UNIT == 0))
ext_mode = mode_for_size (bitsize, MODE_INT, 1);
if (ext_mode == BLKmode)
@@ -6709,7 +6805,7 @@ expand_expr (exp, target, tmode, modifier)
if (GET_CODE (op0) == MEM)
MEM_ALIAS_SET (op0) = get_alias_set (exp);
if (GET_CODE (XEXP (op0, 0)) == REG)
mark_reg_pointer (XEXP (op0, 0), alignment);
@@ -6890,6 +6986,16 @@ expand_expr (exp, target, tmode, modifier)
if (TREE_CODE (type) == UNION_TYPE)
{
tree valtype = TREE_TYPE (TREE_OPERAND (exp, 0));
/* If both input and output are BLKmode, this conversion
isn't actually doing anything unless we need to make the
alignment stricter. */
if (mode == BLKmode && TYPE_MODE (valtype) == BLKmode
&& (TYPE_ALIGN (type) <= TYPE_ALIGN (valtype)
|| TYPE_ALIGN (type) >= BIGGEST_ALIGNMENT))
return expand_expr (TREE_OPERAND (exp, 0), target, tmode,
modifier);
if (target == 0)
{
if (mode != BLKmode)
@@ -6905,11 +7011,13 @@ expand_expr (exp, target, tmode, modifier)
else if (GET_CODE (target) == REG)
/* Store this field into a union of the proper type. */
store_field (target, GET_MODE_BITSIZE (TYPE_MODE (valtype)), 0,
TYPE_MODE (valtype), TREE_OPERAND (exp, 0),
VOIDmode, 0, 1,
int_size_in_bytes (TREE_TYPE (TREE_OPERAND (exp, 0))),
0);
store_field (target,
MIN ((int_size_in_bytes (TREE_TYPE
(TREE_OPERAND (exp, 0)))
* BITS_PER_UNIT),
GET_MODE_BITSIZE (mode)),
0, TYPE_MODE (valtype), TREE_OPERAND (exp, 0),
VOIDmode, 0, 1, int_size_in_bytes (type), 0);
else
abort ();
@@ -8306,6 +8414,263 @@ expand_expr (exp, target, tmode, modifier)
return temp;
}
/* Similar to expand_expr, except that we don't specify a target, target
mode, or modifier and we return the alignment of the inner type. This is
used in cases where it is not necessary to align the result to the
alignment of its type as long as we know the alignment of the result, for
example for comparisons of BLKmode values. */
static rtx
expand_expr_unaligned (exp, palign)
register tree exp;
int *palign;
{
register rtx op0;
tree type = TREE_TYPE (exp);
register enum machine_mode mode = TYPE_MODE (type);
/* Default the alignment we return to that of the type. */
*palign = TYPE_ALIGN (type);
/* The only case in which we do anything special is when the resulting mode
is BLKmode. */
if (mode != BLKmode)
return expand_expr (exp, NULL_RTX, VOIDmode, EXPAND_NORMAL);
switch (TREE_CODE (exp))
{
case CONVERT_EXPR:
case NOP_EXPR:
case NON_LVALUE_EXPR:
/* Conversions between BLKmode values don't change the underlying
alignment or value. */
if (TYPE_MODE (TREE_TYPE (TREE_OPERAND (exp, 0))) == BLKmode)
return expand_expr_unaligned (TREE_OPERAND (exp, 0), palign);
break;
case ARRAY_REF:
/* Much of the code for this case is copied directly from expand_expr.
We need to duplicate it here because we will do something different
in the fall-through case, so we need to handle the same exceptions
it does. */
{
tree array = TREE_OPERAND (exp, 0);
tree domain = TYPE_DOMAIN (TREE_TYPE (array));
tree low_bound = domain ? TYPE_MIN_VALUE (domain) : integer_zero_node;
tree index = TREE_OPERAND (exp, 1);
tree index_type = TREE_TYPE (index);
HOST_WIDE_INT i;
if (TREE_CODE (TREE_TYPE (TREE_OPERAND (exp, 0))) != ARRAY_TYPE)
abort ();
/* Optimize the special-case of a zero lower bound.
We convert the low_bound to sizetype to avoid some problems
with constant folding. (E.g. suppose the lower bound is 1,
and its mode is QI. Without the conversion, (ARRAY
+(INDEX-(unsigned char)1)) becomes ((ARRAY+(-(unsigned char)1))
+INDEX), which becomes (ARRAY+255+INDEX). Oops!)
But sizetype isn't quite right either (especially if
the lowbound is negative). FIXME */
if (! integer_zerop (low_bound))
index = fold (build (MINUS_EXPR, index_type, index,
convert (sizetype, low_bound)));
/* If this is a constant index into a constant array,
just get the value from the array. Handle both the cases when
we have an explicit constructor and when our operand is a variable
that was declared const. */
if (TREE_CODE (array) == CONSTRUCTOR && ! TREE_SIDE_EFFECTS (array))
{
if (TREE_CODE (index) == INTEGER_CST
&& TREE_INT_CST_HIGH (index) == 0)
{
tree elem = CONSTRUCTOR_ELTS (TREE_OPERAND (exp, 0));
i = TREE_INT_CST_LOW (index);
while (elem && i--)
elem = TREE_CHAIN (elem);
if (elem)
return expand_expr_unaligned (fold (TREE_VALUE (elem)),
palign);
}
}
else if (optimize >= 1
&& TREE_READONLY (array) && ! TREE_SIDE_EFFECTS (array)
&& TREE_CODE (array) == VAR_DECL && DECL_INITIAL (array)
&& TREE_CODE (DECL_INITIAL (array)) != ERROR_MARK)
{
if (TREE_CODE (index) == INTEGER_CST)
{
tree init = DECL_INITIAL (array);
i = TREE_INT_CST_LOW (index);
if (TREE_CODE (init) == CONSTRUCTOR)
{
tree elem = CONSTRUCTOR_ELTS (init);
while (elem
&& !tree_int_cst_equal (TREE_PURPOSE (elem), index))
elem = TREE_CHAIN (elem);
if (elem)
return expand_expr_unaligned (fold (TREE_VALUE (elem)),
palign);
}
}
}
}
/* ... fall through ... */
case COMPONENT_REF:
case BIT_FIELD_REF:
/* If the operand is a CONSTRUCTOR, we can just extract the
appropriate field if it is present. Don't do this if we have
already written the data since we want to refer to that copy
and varasm.c assumes that's what we'll do. */
if (TREE_CODE (exp) != ARRAY_REF
&& TREE_CODE (TREE_OPERAND (exp, 0)) == CONSTRUCTOR
&& TREE_CST_RTL (TREE_OPERAND (exp, 0)) == 0)
{
tree elt;
for (elt = CONSTRUCTOR_ELTS (TREE_OPERAND (exp, 0)); elt;
elt = TREE_CHAIN (elt))
if (TREE_PURPOSE (elt) == TREE_OPERAND (exp, 1))
/* Note that unlike the case in expand_expr, we know this is
BLKmode and hence not an integer. */
return expand_expr_unaligned (TREE_VALUE (elt), palign);
}
{
enum machine_mode mode1;
int bitsize;
int bitpos;
tree offset;
int volatilep = 0;
int alignment;
int unsignedp;
tree tem = get_inner_reference (exp, &bitsize, &bitpos, &offset,
&mode1, &unsignedp, &volatilep,
&alignment);
/* If we got back the original object, something is wrong. Perhaps
we are evaluating an expression too early. In any event, don't
infinitely recurse. */
if (tem == exp)
abort ();
op0 = expand_expr (tem, NULL_RTX, VOIDmode, EXPAND_NORMAL);
/* If this is a constant, put it into a register if it is a
legitimate constant and OFFSET is 0 and memory if it isn't. */
if (CONSTANT_P (op0))
{
enum machine_mode inner_mode = TYPE_MODE (TREE_TYPE (tem));
if (inner_mode != BLKmode && LEGITIMATE_CONSTANT_P (op0)
&& offset == 0)
op0 = force_reg (inner_mode, op0);
else
op0 = validize_mem (force_const_mem (inner_mode, op0));
}
if (offset != 0)
{
rtx offset_rtx = expand_expr (offset, NULL_RTX, VOIDmode, 0);
/* If this object is in a register, put it into memory.
This case can't occur in C, but can in Ada if we have
unchecked conversion of an expression from a scalar type to
an array or record type. */
if (GET_CODE (op0) == REG || GET_CODE (op0) == SUBREG
|| GET_CODE (op0) == CONCAT || GET_CODE (op0) == ADDRESSOF)
{
rtx memloc = assign_temp (TREE_TYPE (tem), 1, 1, 1);
mark_temp_addr_taken (memloc);
emit_move_insn (memloc, op0);
op0 = memloc;
}
if (GET_CODE (op0) != MEM)
abort ();
if (GET_MODE (offset_rtx) != ptr_mode)
{
#ifdef POINTERS_EXTEND_UNSIGNED
offset_rtx = convert_memory_address (ptr_mode, offset_rtx);
#else
offset_rtx = convert_to_mode (ptr_mode, offset_rtx, 0);
#endif
}
op0 = change_address (op0, VOIDmode,
gen_rtx_PLUS (ptr_mode, XEXP (op0, 0),
force_reg (ptr_mode,
offset_rtx)));
}
/* Don't forget about volatility even if this is a bitfield. */
if (GET_CODE (op0) == MEM && volatilep && ! MEM_VOLATILE_P (op0))
{
op0 = copy_rtx (op0);
MEM_VOLATILE_P (op0) = 1;
}
/* Check the access. */
if (current_function_check_memory_usage && GET_CODE (op0) == MEM)
{
rtx to;
int size;
to = plus_constant (XEXP (op0, 0), (bitpos / BITS_PER_UNIT));
size = (bitpos % BITS_PER_UNIT) + bitsize + BITS_PER_UNIT - 1;
/* Check the access right of the pointer. */
if (size > BITS_PER_UNIT)
emit_library_call (chkr_check_addr_libfunc, 1, VOIDmode, 3,
to, ptr_mode, GEN_INT (size / BITS_PER_UNIT),
TYPE_MODE (sizetype),
GEN_INT (MEMORY_USE_RO),
TYPE_MODE (integer_type_node));
}
/* Get a reference to just this component. */
op0 = change_address (op0, mode1,
plus_constant (XEXP (op0, 0),
(bitpos / BITS_PER_UNIT)));
MEM_ALIAS_SET (op0) = get_alias_set (exp);
/* Adjust the alignment in case the bit position is not
a multiple of the alignment of the inner object. */
while (bitpos % alignment != 0)
alignment >>= 1;
if (GET_CODE (XEXP (op0, 0)) == REG)
mark_reg_pointer (XEXP (op0, 0), alignment);
MEM_IN_STRUCT_P (op0) = 1;
MEM_VOLATILE_P (op0) |= volatilep;
*palign = alignment;
return op0;
}
default:
break;
}
return expand_expr (exp, NULL_RTX, VOIDmode, EXPAND_NORMAL);
}
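The alignment adjustment used in expand_expr_unaligned above (halve the alignment until it divides the bit position) is small enough to check in isolation. A sketch, assuming as GCC does that the incoming alignment is a nonzero power of two:

```c
/* Largest power-of-two alignment, starting from ALIGN, that divides
   BITPOS; mirrors the `while (bitpos % alignment != 0)' loop above.
   ALIGN must be a power of two >= 1, or the division is undefined.  */
unsigned int align_for_bitpos(unsigned int bitpos, unsigned int align)
{
    while (bitpos % align != 0)
        align >>= 1;    /* terminates: every bitpos is a multiple of 1 */
    return align;
}
```

For example, a bit position of 24 within a 32-aligned object yields an alignment of 8, the largest power of two dividing 24.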
/* Return the tree node and offset if a given argument corresponds to
a string constant. */
@@ -8771,6 +9136,15 @@ do_jump (exp, if_false_label, if_true_label)
do_jump (TREE_OPERAND (exp, 0), if_false_label, if_true_label);
break;
case WITH_RECORD_EXPR:
/* Put the object on the placeholder list, recurse through our first
operand, and pop the list. */
placeholder_list = tree_cons (TREE_OPERAND (exp, 1), NULL_TREE,
placeholder_list);
do_jump (TREE_OPERAND (exp, 0), if_false_label, if_true_label);
placeholder_list = TREE_CHAIN (placeholder_list);
break;
#if 0
/* This is never less insns than evaluating the PLUS_EXPR followed by
a test and can be longer if the test is eliminated. */
@@ -9424,6 +9798,7 @@ do_compare_and_jump (exp, signed_code, unsigned_code, if_false_label,
enum rtx_code signed_code, unsigned_code;
rtx if_false_label, if_true_label;
{
int align0, align1;
register rtx op0, op1;
register tree type;
register enum machine_mode mode;
@@ -9431,11 +9806,11 @@ do_compare_and_jump (exp, signed_code, unsigned_code, if_false_label,
enum rtx_code code;
/* Don't crash if the comparison was erroneous. */
op0 = expand_expr (TREE_OPERAND (exp, 0), NULL_RTX, VOIDmode, 0);
op0 = expand_expr_unaligned (TREE_OPERAND (exp, 0), &align0);
if (TREE_CODE (TREE_OPERAND (exp, 0)) == ERROR_MARK)
return;
op1 = expand_expr (TREE_OPERAND (exp, 1), NULL_RTX, VOIDmode, 0);
op1 = expand_expr_unaligned (TREE_OPERAND (exp, 1), &align1);
type = TREE_TYPE (TREE_OPERAND (exp, 0));
mode = TYPE_MODE (type);
unsignedp = TREE_UNSIGNED (type);
@@ -9473,7 +9848,7 @@ do_compare_and_jump (exp, signed_code, unsigned_code, if_false_label,
do_compare_rtx_and_jump (op0, op1, code, unsignedp, mode,
((mode == BLKmode)
? expr_size (TREE_OPERAND (exp, 0)) : NULL_RTX),
TYPE_ALIGN (TREE_TYPE (exp)) / BITS_PER_UNIT,
MIN (align0, align1) / BITS_PER_UNIT,
if_false_label, if_true_label);
}

Index: flags.h

@@ -404,6 +404,10 @@ extern int flag_exceptions;
extern int flag_new_exceptions;
/* Nonzero means generate frame unwind info table when supported */
extern int flag_unwind_tables;
/* Nonzero means don't place uninitialized global data in common storage
by default. */

Index: fold-const.c

@@ -92,6 +92,7 @@ static int merge_ranges PROTO((int *, tree *, tree *, int, tree, tree,
static tree fold_range_test PROTO((tree));
static tree unextend PROTO((tree, int, int, tree));
static tree fold_truthop PROTO((enum tree_code, tree, tree, tree));
static tree optimize_minmax_comparison PROTO((tree));
static tree strip_compound_expr PROTO((tree, tree));
static int multiple_of_p PROTO((tree, tree, tree));
static tree constant_boolean_node PROTO((int, tree));
@@ -2585,6 +2586,11 @@ invert_truthvalue (arg)
return build (COMPOUND_EXPR, type, TREE_OPERAND (arg, 0),
invert_truthvalue (TREE_OPERAND (arg, 1)));
case WITH_RECORD_EXPR:
return build (WITH_RECORD_EXPR, type,
invert_truthvalue (TREE_OPERAND (arg, 0)),
TREE_OPERAND (arg, 1));
case NON_LVALUE_EXPR:
return invert_truthvalue (TREE_OPERAND (arg, 0));
@@ -2728,11 +2734,13 @@ optimize_bit_field_compare (code, compare_type, lhs, rhs)
/* Get all the information about the extractions being done. If the bit size
is the same as the size of the underlying object, we aren't doing an
extraction at all and so can do nothing. */
extraction at all and so can do nothing. We also don't want to
do anything if the inner expression is a PLACEHOLDER_EXPR since we
then will no longer be able to replace it. */
linner = get_inner_reference (lhs, &lbitsize, &lbitpos, &offset, &lmode,
&lunsignedp, &lvolatilep, &alignment);
if (linner == lhs || lbitsize == GET_MODE_BITSIZE (lmode) || lbitsize < 0
|| offset != 0)
|| offset != 0 || TREE_CODE (linner) == PLACEHOLDER_EXPR)
return 0;
if (!const_p)
@@ -2743,7 +2751,8 @@ optimize_bit_field_compare (code, compare_type, lhs, rhs)
&runsignedp, &rvolatilep, &alignment);
if (rinner == rhs || lbitpos != rbitpos || lbitsize != rbitsize
|| lunsignedp != runsignedp || offset != 0)
|| lunsignedp != runsignedp || offset != 0
|| TREE_CODE (rinner) == PLACEHOLDER_EXPR)
return 0;
}
@@ -2936,7 +2945,8 @@ decode_field_reference (exp, pbitsize, pbitpos, pmode, punsignedp,
inner = get_inner_reference (exp, pbitsize, pbitpos, &offset, pmode,
punsignedp, pvolatilep, &alignment);
if ((inner == exp && and_mask == 0)
|| *pbitsize < 0 || offset != 0)
|| *pbitsize < 0 || offset != 0
|| TREE_CODE (inner) == PLACEHOLDER_EXPR)
return 0;
/* Compute the mask to access the bitfield. */
@@ -3305,15 +3315,11 @@ make_range (exp, pin_p, plow, phigh)
/* A range without an upper bound is, naturally, unbounded.
Since convert would have cropped a very large value, use
the max value for the destination type. */
the max value for the destination type. */
high_positive
= TYPE_MAX_VALUE (equiv_type) ? TYPE_MAX_VALUE (equiv_type)
: TYPE_MAX_VALUE (type);
high_positive = TYPE_MAX_VALUE (equiv_type);
if (!high_positive)
{
high_positive = TYPE_MAX_VALUE (type);
if (!high_positive)
abort();
}
high_positive = fold (build (RSHIFT_EXPR, type,
convert (type, high_positive),
convert (type, integer_one_node)));
@@ -3517,7 +3523,7 @@ merge_ranges (pin_p, plow, phigh, in0_p, low0, high0, in1_p, low1, high1)
end of the second. */
if (no_overlap)
in_p = 1, low = low1, high = high1;
else if (subset)
else if (subset || highequal)
in_p = 0, low = high = 0;
else
{
@@ -4051,6 +4057,103 @@ fold_truthop (code, truth_type, lhs, rhs)
const_binop (BIT_IOR_EXPR, l_const, r_const, 0));
}
/* Optimize T, which is a comparison of a MIN_EXPR or MAX_EXPR with a
constant. */
static tree
optimize_minmax_comparison (t)
tree t;
{
tree type = TREE_TYPE (t);
tree arg0 = TREE_OPERAND (t, 0);
enum tree_code op_code;
tree comp_const = TREE_OPERAND (t, 1);
tree minmax_const;
int consts_equal, consts_lt;
tree inner;
STRIP_SIGN_NOPS (arg0);
op_code = TREE_CODE (arg0);
minmax_const = TREE_OPERAND (arg0, 1);
consts_equal = tree_int_cst_equal (minmax_const, comp_const);
consts_lt = tree_int_cst_lt (minmax_const, comp_const);
inner = TREE_OPERAND (arg0, 0);
/* If something does not permit us to optimize, return the original tree. */
if ((op_code != MIN_EXPR && op_code != MAX_EXPR)
|| TREE_CODE (comp_const) != INTEGER_CST
|| TREE_CONSTANT_OVERFLOW (comp_const)
|| TREE_CODE (minmax_const) != INTEGER_CST
|| TREE_CONSTANT_OVERFLOW (minmax_const))
return t;
/* Now handle all the various comparison codes. We only handle EQ_EXPR
and GT_EXPR, doing the rest with recursive calls using logical
simplifications. */
switch (TREE_CODE (t))
{
case NE_EXPR: case LT_EXPR: case LE_EXPR:
return
invert_truthvalue (optimize_minmax_comparison (invert_truthvalue (t)));
case GE_EXPR:
return
fold (build (TRUTH_ORIF_EXPR, type,
optimize_minmax_comparison
(build (EQ_EXPR, type, arg0, comp_const)),
optimize_minmax_comparison
(build (GT_EXPR, type, arg0, comp_const))));
case EQ_EXPR:
if (op_code == MAX_EXPR && consts_equal)
/* MAX (X, 0) == 0 -> X <= 0 */
return fold (build (LE_EXPR, type, inner, comp_const));
else if (op_code == MAX_EXPR && consts_lt)
/* MAX (X, 0) == 5 -> X == 5 */
return fold (build (EQ_EXPR, type, inner, comp_const));
else if (op_code == MAX_EXPR)
/* MAX (X, 0) == -1 -> false */
return omit_one_operand (type, integer_zero_node, inner);
else if (consts_equal)
/* MIN (X, 0) == 0 -> X >= 0 */
return fold (build (GE_EXPR, type, inner, comp_const));
else if (consts_lt)
/* MIN (X, 0) == 5 -> false */
return omit_one_operand (type, integer_zero_node, inner);
else
/* MIN (X, 0) == -1 -> X == -1 */
return fold (build (EQ_EXPR, type, inner, comp_const));
case GT_EXPR:
if (op_code == MAX_EXPR && (consts_equal || consts_lt))
/* MAX (X, 0) > 0 -> X > 0
MAX (X, 0) > 5 -> X > 5 */
return fold (build (GT_EXPR, type, inner, comp_const));
else if (op_code == MAX_EXPR)
/* MAX (X, 0) > -1 -> true */
return omit_one_operand (type, integer_one_node, inner);
else if (op_code == MIN_EXPR && (consts_equal || consts_lt))
/* MIN (X, 0) > 0 -> false
MIN (X, 0) > 5 -> false */
return omit_one_operand (type, integer_zero_node, inner);
else
/* MIN (X, 0) > -1 -> X > -1 */
return fold (build (GT_EXPR, type, inner, comp_const));
default:
return t;
}
}
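The EQ and GT identities that optimize_minmax_comparison relies on can be brute-force checked on small integers. In this sketch (not GCC code), c plays the role of the MIN/MAX constant and k the constant compared against:

```c
static int imax(int a, int b) { return a > b ? a : b; }

/* Folded form of MAX (X, c) == k:
   k == c  ->  X <= c;   c < k  ->  X == k;   k < c  ->  false.  */
static int max_eq_folded(int x, int c, int k)
{
    return c == k ? x <= c : c < k ? x == k : 0;
}

/* Folded form of MAX (X, c) > k:
   c > k  ->  true;   otherwise  ->  X > k.  */
static int max_gt_folded(int x, int c, int k)
{
    return c > k ? 1 : x > k;
}

/* Exhaustively compare the folded forms against the direct
   computation over a small range; returns 1 if every case agrees.  */
int minmax_identities_hold(void)
{
    for (int x = -4; x <= 4; x++)
        for (int c = -4; c <= 4; c++)
            for (int k = -4; k <= 4; k++)
                if (max_eq_folded(x, c, k) != (imax(x, c) == k)
                    || max_gt_folded(x, c, k) != (imax(x, c) > k))
                    return 0;
    return 1;
}
```

The MIN cases in the function above follow by the same reasoning with the inequalities reversed, which is why the code handles only EQ_EXPR and GT_EXPR directly and derives the rest via invert_truthvalue.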
/* If T contains a COMPOUND_EXPR which was inserted merely to evaluate
S, a SAVE_EXPR, return the expression actually being evaluated. Note
that we may sometimes modify the tree. */
@@ -4182,7 +4285,7 @@ fold (expr)
/* Don't use STRIP_NOPS, because signedness of argument type matters. */
if (arg0 != 0)
STRIP_TYPE_NOPS (arg0);
STRIP_SIGN_NOPS (arg0);
if (arg0 != 0 && TREE_CODE (arg0) == COMPLEX_CST)
subop = TREE_REALPART (arg0);
@@ -4216,7 +4319,7 @@ fold (expr)
{
/* Signedness matters here. Perhaps we can refine this
later. */
STRIP_TYPE_NOPS (op);
STRIP_SIGN_NOPS (op);
}
else
{
@@ -5925,6 +6028,76 @@ fold (expr)
}
}
/* If this is an EQ or NE comparison of a constant with a PLUS_EXPR or
a MINUS_EXPR of a constant, we can convert it into a comparison with
a revised constant as long as no overflow occurs. */
if ((code == EQ_EXPR || code == NE_EXPR)
&& TREE_CODE (arg1) == INTEGER_CST
&& (TREE_CODE (arg0) == PLUS_EXPR
|| TREE_CODE (arg0) == MINUS_EXPR)
&& TREE_CODE (TREE_OPERAND (arg0, 1)) == INTEGER_CST
&& 0 != (tem = const_binop (TREE_CODE (arg0) == PLUS_EXPR
? MINUS_EXPR : PLUS_EXPR,
arg1, TREE_OPERAND (arg0, 1), 0))
&& ! TREE_CONSTANT_OVERFLOW (tem))
return fold (build (code, type, TREE_OPERAND (arg0, 0), tem));
/* Similarly for a NEGATE_EXPR. */
else if ((code == EQ_EXPR || code == NE_EXPR)
&& TREE_CODE (arg0) == NEGATE_EXPR
&& TREE_CODE (arg1) == INTEGER_CST
&& 0 != (tem = fold (build1 (NEGATE_EXPR, TREE_TYPE (arg1),
arg1)))
&& TREE_CODE (tem) == INTEGER_CST
&& ! TREE_CONSTANT_OVERFLOW (tem))
return fold (build (code, type, TREE_OPERAND (arg0, 0), tem));
/* If we have X - Y == 0, we can convert that to X == Y and similarly
for !=. Don't do this for ordered comparisons due to overflow. */
else if ((code == NE_EXPR || code == EQ_EXPR)
&& integer_zerop (arg1) && TREE_CODE (arg0) == MINUS_EXPR)
return fold (build (code, type,
TREE_OPERAND (arg0, 0), TREE_OPERAND (arg0, 1)));
/* If we are widening one operand of an integer comparison,
see if the other operand is similarly being widened. Perhaps we
can do the comparison in the narrower type. */
else if (TREE_CODE (TREE_TYPE (arg0)) == INTEGER_TYPE
&& TREE_CODE (arg0) == NOP_EXPR
&& (tem = get_unwidened (arg0, NULL_TREE)) != arg0
&& (t1 = get_unwidened (arg1, TREE_TYPE (tem))) != 0
&& (TREE_TYPE (t1) == TREE_TYPE (tem)
|| (TREE_CODE (t1) == INTEGER_CST
&& int_fits_type_p (t1, TREE_TYPE (tem)))))
return fold (build (code, type, tem, convert (TREE_TYPE (tem), t1)));
/* If this is comparing a constant with a MIN_EXPR or a MAX_EXPR of a
constant, we can simplify it. */
else if (TREE_CODE (arg1) == INTEGER_CST
&& (TREE_CODE (arg0) == MIN_EXPR
|| TREE_CODE (arg0) == MAX_EXPR)
&& TREE_CODE (TREE_OPERAND (arg0, 1)) == INTEGER_CST)
return optimize_minmax_comparison (t);
/* If we are comparing an ABS_EXPR with a constant, we can
convert all the cases into explicit comparisons, but they may
well not be faster than doing the ABS and one comparison.
But ABS (X) <= C is a range comparison, which becomes a subtraction
and a comparison, and is probably faster. */
else if (code == LE_EXPR && TREE_CODE (arg1) == INTEGER_CST
&& TREE_CODE (arg0) == ABS_EXPR
&& ! TREE_SIDE_EFFECTS (arg0))
{
tree inner = TREE_OPERAND (arg0, 0);
tem = fold (build1 (NEGATE_EXPR, TREE_TYPE (arg1), arg1));
if (TREE_CODE (tem) == INTEGER_CST
&& ! TREE_CONSTANT_OVERFLOW (tem))
return fold (build (TRUTH_ANDIF_EXPR, type,
build (GE_EXPR, type, inner, tem),
build (LE_EXPR, type, inner, arg1)));
}
/* If this is an EQ or NE comparison with zero and ARG0 is
(1 << foo) & bar, convert it to (bar >> foo) & 1. Both require
two operations, but the latter can be done in one less insn
@@ -6076,35 +6249,93 @@
}
}
/* An unsigned <= 0x7fffffff can be simplified. */
/* Comparisons with the highest or lowest possible integer of
the specified size will have known values and an unsigned
<= 0x7fffffff can be simplified. */
{
int width = TYPE_PRECISION (TREE_TYPE (arg1));
int width = GET_MODE_BITSIZE (TYPE_MODE (TREE_TYPE (arg1)));
if (TREE_CODE (arg1) == INTEGER_CST
&& ! TREE_CONSTANT_OVERFLOW (arg1)
&& width <= HOST_BITS_PER_WIDE_INT
&& TREE_INT_CST_LOW (arg1) == ((HOST_WIDE_INT) 1 << (width - 1)) - 1
&& TREE_INT_CST_HIGH (arg1) == 0
&& (INTEGRAL_TYPE_P (TREE_TYPE (arg1))
|| POINTER_TYPE_P (TREE_TYPE (arg1)))
&& TREE_UNSIGNED (TREE_TYPE (arg1)))
|| POINTER_TYPE_P (TREE_TYPE (arg1))))
{
switch (TREE_CODE (t))
{
case LE_EXPR:
return fold (build (GE_EXPR, type,
convert (signed_type (TREE_TYPE (arg0)),
arg0),
convert (signed_type (TREE_TYPE (arg1)),
integer_zero_node)));
case GT_EXPR:
return fold (build (LT_EXPR, type,
convert (signed_type (TREE_TYPE (arg0)),
arg0),
convert (signed_type (TREE_TYPE (arg1)),
integer_zero_node)));
default:
break;
}
if (TREE_INT_CST_HIGH (arg1) == 0
&& (TREE_INT_CST_LOW (arg1)
== ((HOST_WIDE_INT) 1 << (width - 1)) - 1)
&& ! TREE_UNSIGNED (TREE_TYPE (arg1)))
switch (TREE_CODE (t))
{
case GT_EXPR:
return omit_one_operand (type,
convert (type, integer_zero_node),
arg0);
case GE_EXPR:
TREE_SET_CODE (t, EQ_EXPR);
break;
case LE_EXPR:
return omit_one_operand (type,
convert (type, integer_one_node),
arg0);
case LT_EXPR:
TREE_SET_CODE (t, NE_EXPR);
break;
default:
break;
}
else if (TREE_INT_CST_HIGH (arg1) == -1
&& (- TREE_INT_CST_LOW (arg1)
== ((HOST_WIDE_INT) 1 << (width - 1)))
&& ! TREE_UNSIGNED (TREE_TYPE (arg1)))
switch (TREE_CODE (t))
{
case LT_EXPR:
return omit_one_operand (type,
convert (type, integer_zero_node),
arg0);
case LE_EXPR:
TREE_SET_CODE (t, EQ_EXPR);
break;
case GE_EXPR:
return omit_one_operand (type,
convert (type, integer_one_node),
arg0);
case GT_EXPR:
TREE_SET_CODE (t, NE_EXPR);
break;
default:
break;
}
else if (TREE_INT_CST_HIGH (arg1) == 0
&& (TREE_INT_CST_LOW (arg1)
== ((HOST_WIDE_INT) 1 << (width - 1)) - 1)
&& TREE_UNSIGNED (TREE_TYPE (arg1)))
switch (TREE_CODE (t))
{
case LE_EXPR:
return fold (build (GE_EXPR, type,
convert (signed_type (TREE_TYPE (arg0)),
arg0),
convert (signed_type (TREE_TYPE (arg1)),
integer_zero_node)));
case GT_EXPR:
return fold (build (LT_EXPR, type,
convert (signed_type (TREE_TYPE (arg0)),
arg0),
convert (signed_type (TREE_TYPE (arg1)),
integer_zero_node)));
default:
break;
}
}
}
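The last of the folds above turns an unsigned comparison against 0x7fffffff into a signed sign-bit test. That equivalence can be spot-checked at a fixed 32-bit width; the cast below relies on the usual two's-complement wrap of unsigned-to-signed conversion (implementation-defined in ISO C, but what GCC itself does):

```c
#include <stdint.h>

/* Original form: unsigned x <= 0x7fffffff.  */
int le_halfmax(uint32_t x)
{
    return x <= UINT32_C(0x7fffffff);
}

/* Folded form: (signed) x >= 0, i.e. the sign bit is clear.  */
int signbit_clear(uint32_t x)
{
    return (int32_t) x >= 0;
}
```

The two predicates agree everywhere: values up to 0x7fffffff have a clear sign bit, and 0x80000000 and above convert to negative signed values.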
@@ -6268,6 +6499,7 @@ fold (expr)
/* Note that it is safe to invert for real values here because we
will check below in the one case that it matters. */
t1 = NULL_TREE;
invert = 0;
if (code == NE_EXPR || code == GE_EXPR)
{

Index: function.c

@@ -940,6 +940,15 @@ find_temp_slot_from_address (x)
return p;
}
/* If we have a sum involving a register, see if it points to a temp
slot. */
if (GET_CODE (x) == PLUS && GET_CODE (XEXP (x, 0)) == REG
&& (p = find_temp_slot_from_address (XEXP (x, 0))) != 0)
return p;
else if (GET_CODE (x) == PLUS && GET_CODE (XEXP (x, 1)) == REG
&& (p = find_temp_slot_from_address (XEXP (x, 1))) != 0)
return p;
return 0;
}
@@ -950,11 +959,34 @@
update_temp_slot_address (old, new)
rtx old, new;
{
struct temp_slot *p = find_temp_slot_from_address (old);
struct temp_slot *p;
/* If none, return. Else add NEW as an alias. */
if (p == 0)
if (rtx_equal_p (old, new))
return;
p = find_temp_slot_from_address (old);
/* If we didn't find one, see if both OLD and NEW are a PLUS and if
there is a register in common between them. If so, try a recursive
call on those values. */
if (p == 0)
{
if (GET_CODE (old) != PLUS || GET_CODE (new) != PLUS)
return;
if (rtx_equal_p (XEXP (old, 0), XEXP (new, 0)))
update_temp_slot_address (XEXP (old, 1), XEXP (new, 1));
else if (rtx_equal_p (XEXP (old, 1), XEXP (new, 0)))
update_temp_slot_address (XEXP (old, 0), XEXP (new, 1));
else if (rtx_equal_p (XEXP (old, 0), XEXP (new, 1)))
update_temp_slot_address (XEXP (old, 1), XEXP (new, 0));
else if (rtx_equal_p (XEXP (old, 1), XEXP (new, 1)))
update_temp_slot_address (XEXP (old, 0), XEXP (new, 0));
return;
}
/* Otherwise add an alias for the temp's address. */
else if (p->address == 0)
p->address = new;
else
@@ -2665,9 +2697,11 @@ gen_mem_addressof (reg, decl)
tree type = TREE_TYPE (decl);
rtx r = gen_rtx_ADDRESSOF (Pmode, gen_reg_rtx (GET_MODE (reg)),
REGNO (reg), decl);
/* If the original REG was a user-variable, then so is the REG whose
address is being taken. */
address is being taken. Likewise for unchanging. */
REG_USERVAR_P (XEXP (r, 0)) = REG_USERVAR_P (reg);
RTX_UNCHANGING_P (XEXP (r, 0)) = RTX_UNCHANGING_P (reg);
PUT_CODE (reg, MEM);
PUT_MODE (reg, DECL_MODE (decl));
@@ -3422,17 +3456,20 @@ instantiate_virtual_regs_1 (loc, object, extra_insns)
if (new)
{
rtx src = SET_SRC (x);
instantiate_virtual_regs_1 (&src, NULL_RTX, 0);
/* The only valid sources here are PLUS or REG. Just do
the simplest possible thing to handle them. */
if (GET_CODE (SET_SRC (x)) != REG
&& GET_CODE (SET_SRC (x)) != PLUS)
if (GET_CODE (src) != REG && GET_CODE (src) != PLUS)
abort ();
start_sequence ();
if (GET_CODE (SET_SRC (x)) != REG)
temp = force_operand (SET_SRC (x), NULL_RTX);
if (GET_CODE (src) != REG)
temp = force_operand (src, NULL_RTX);
else
temp = SET_SRC (x);
temp = src;
temp = force_operand (plus_constant (temp, offset), NULL_RTX);
seq = get_insns ();
end_sequence ();

Index: gcc.c

@@ -100,6 +100,10 @@ static char dir_separator_str[] = {DIR_SEPARATOR, 0};
#define MIN_FATAL_STATUS 1
/* Flag saying to pass the greatest exit code returned by a sub-process
to the calling program. */
static int pass_exit_codes;
/* Flag saying to print the directories gcc will search through looking for
programs, libraries, etc. */
@@ -168,6 +172,10 @@ static char *cross_compile = "0";
run if this is non-zero. */
static int error_count = 0;
/* Greatest exit code of sub-processes that has been encountered up to
now. */
static int greatest_status = 1;
/* This is the obstack which we use to allocate many strings. */
static struct obstack obstack;
@@ -769,6 +777,7 @@ static const char *link_command_spec = "\
%{!A:%{!nostdlib:%{!nostartfiles:%S}}}\
%{static:} %{L*} %o\
%{!nostdlib:%{!nodefaultlibs:%G %L %G}}\
{"--pass-exit-codes", "-pass-exit-codes", 0},
%{!A:%{!nostdlib:%{!nostartfiles:%E}}}\
%{T*}\
\n }}}}}}";
@@ -2363,7 +2372,11 @@ execute ()
}
else if (WIFEXITED (status)
&& WEXITSTATUS (status) >= MIN_FATAL_STATUS)
ret_code = -1;
{
if (WEXITSTATUS (status) > greatest_status)
greatest_status = WEXITSTATUS (status);
ret_code = -1;
}
}
#ifdef HAVE_GETRUSAGE
if (report_times && ut + st != 0)
@@ -2490,6 +2503,7 @@ display_help ()
printf ("Usage: %s [options] file...\n", programname);
printf ("Options:\n");
printf (" -pass-exit-codes Exit with highest error code from a phase\n");
printf (" --help Display this information\n");
if (! verbose_flag)
printf (" (Use '-v --help' to display command line options of sub-processes)\n");
@@ -2793,6 +2807,11 @@ process_command (argc, argv)
add_assembler_option ("--help", 6);
add_linker_option ("--help", 6);
}
else if (! strcmp (argv[i], "-pass-exit-codes"))
{
pass_exit_codes = 1;
n_switches++;
}
else if (! strcmp (argv[i], "-print-search-dirs"))
print_search_dirs = 1;
else if (! strcmp (argv[i], "-print-libgcc-file-name"))
@@ -3086,6 +3105,8 @@ process_command (argc, argv)
/* Use 2 as fourth arg meaning try just the machine as a suffix,
as well as trying the machine and the version. */
#ifndef OS2
add_prefix (&exec_prefixes, standard_exec_prefix, "GCC",
0, 1, warn_std_ptr);
add_prefix (&exec_prefixes, standard_exec_prefix, "BINUTILS",
0, 2, warn_std_ptr);
add_prefix (&exec_prefixes, standard_exec_prefix_1, "BINUTILS",
@@ -3161,6 +3182,8 @@ process_command (argc, argv)
;
else if (! strncmp (argv[i], "-Wp,", 4))
;
else if (! strcmp (argv[i], "-pass-exit-codes"))
;
else if (! strcmp (argv[i], "-print-search-dirs"))
;
else if (! strcmp (argv[i], "-print-libgcc-file-name"))
@@ -5198,7 +5221,9 @@ main (argc, argv)
printf ("<URL:http://www.gnu.org/software/gcc/faq.html#bugreport>\n");
}
return (error_count > 0 ? (signal_count ? 2 : 1) : 0);
return (signal_count != 0 ? 2
: error_count > 0 ? (pass_exit_codes ? greatest_status : 1)
: 0);
}
/* Find the proper compilation spec for the file name NAME,


@@ -275,6 +275,9 @@ fnotice VPROTO ((FILE *file, const char *msgid, ...))
va_end (ap);
}
#ifndef DIR_SEPARATOR
#define DIR_SEPARATOR '/'
#endif
/* More 'friendly' abort that prints the line and file.
config.h can #define abort fancy_abort if you like that sort of thing. */
@@ -283,7 +286,7 @@ extern void fancy_abort PROTO ((void)) ATTRIBUTE_NORETURN;
void
fancy_abort ()
{
fnotice (stderr, "Internal gcc abort.\n");
fnotice (stderr, "Internal gcov abort.\n");
exit (FATAL_EXIT_CODE);
}
@@ -407,7 +410,7 @@ open_files ()
else
strcat (bbg_file_name, ".bbg");
bb_file = fopen (bb_file_name, "r");
bb_file = fopen (bb_file_name, "rb");
if (bb_file == NULL)
{
fnotice (stderr, "Could not open basic block file %s.\n", bb_file_name);
@@ -416,14 +419,14 @@ open_files ()
/* If none of the functions in the file were executed, then there won't
be a .da file. Just assume that all counts are zero in this case. */
da_file = fopen (da_file_name, "r");
da_file = fopen (da_file_name, "rb");
if (da_file == NULL)
{
fnotice (stderr, "Could not open data file %s.\n", da_file_name);
fnotice (stderr, "Assuming that all execution counts are zero.\n");
}
bbg_file = fopen (bbg_file_name, "r");
bbg_file = fopen (bbg_file_name, "rb");
if (bbg_file == NULL)
{
fnotice (stderr, "Could not open program flow graph file %s.\n",
@@ -1000,7 +1003,13 @@ output_data ()
{
/* If this is a relative file name, and an object directory has been
specified, then make it relative to the object directory name. */
if (*s_ptr->name != '/' && object_directory != 0
if (! (*s_ptr->name == '/' || *s_ptr->name == DIR_SEPARATOR
/* Check for disk name on MS-DOS-based systems. */
|| (DIR_SEPARATOR == '\\'
&& s_ptr->name[1] == ':'
&& (s_ptr->name[2] == DIR_SEPARATOR
|| s_ptr->name[2] == '/')))
&& object_directory != 0
&& *object_directory != '\0')
{
int objdir_count = strlen (object_directory);


@@ -1,22 +1,22 @@
/* Simple garbage collection for the GNU compiler.
Copyright (C) 1999 Free Software Foundation, Inc.
This file is part of GNU CC.
This file is part of GNU CC.
GNU CC is free software; you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation; either version 2, or (at your option)
any later version.
GNU CC is free software; you can redistribute it and/or modify it
under the terms of the GNU General Public License as published by the
Free Software Foundation; either version 2, or (at your option) any
later version.
GNU CC is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
GNU CC is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License
for more details.
You should have received a copy of the GNU General Public License
along with GNU CC; see the file COPYING. If not, write to
the Free Software Foundation, 59 Temple Place - Suite 330,
Boston, MA 02111-1307, USA. */
You should have received a copy of the GNU General Public License
along with GNU CC; see the file COPYING. If not, write to the Free
Software Foundation, 59 Temple Place - Suite 330, Boston, MA
02111-1307, USA. */
/* Generic garbage collection (GC) functions and data, not specific to
any particular GC implementation. */
@@ -63,7 +63,7 @@ static void
ggc_mark_rtx_ptr (elt)
void *elt;
{
ggc_mark_rtx (*(rtx *)elt);
ggc_mark_rtx (*(rtx *) elt);
}
/* Type-correct function to pass to ggc_add_root. It just forwards
@@ -73,7 +73,7 @@ static void
ggc_mark_tree_ptr (elt)
void *elt;
{
ggc_mark_tree (*(tree *)elt);
ggc_mark_tree (*(tree *) elt);
}
/* Type-correct function to pass to ggc_add_root. It just forwards
@@ -83,7 +83,7 @@ static void
ggc_mark_tree_varray_ptr (elt)
void *elt;
{
ggc_mark_tree_varray (*(varray_type *)elt);
ggc_mark_tree_varray (*(varray_type *) elt);
}
/* Type-correct function to pass to ggc_add_root. It just forwards
@@ -104,7 +104,7 @@ static void
ggc_mark_string_ptr (elt)
void *elt;
{
ggc_mark_string (*(char **)elt);
ggc_mark_string (*(char **) elt);
}
/* Add BASE as a new garbage collection root. It is an array of


@@ -69,7 +69,7 @@ static void integrate_parm_decls PROTO((tree, struct inline_remap *,
static tree integrate_decl_tree PROTO((tree,
struct inline_remap *));
static void subst_constants PROTO((rtx *, rtx,
struct inline_remap *));
struct inline_remap *, int));
static void set_block_origin_self PROTO((tree));
static void set_decl_origin_self PROTO((tree));
static void set_block_abstract_flags PROTO((tree, int));
@@ -153,7 +153,8 @@ function_cannot_inline_p (fndecl)
return N_("function with nested functions cannot be inline");
if (forced_labels)
return N_("function with label addresses used in initializers cannot inline");
return
N_("function with label addresses used in initializers cannot inline");
if (current_function_cannot_inline)
return current_function_cannot_inline;
@@ -769,11 +770,19 @@ expand_inline_function (fndecl, parms, target, ignore, type,
}
else if (GET_CODE (loc) == MEM)
{
/* This is the case of a parameter that lives in memory.
It will live in the block we allocate in the called routine's
/* This is the case of a parameter that lives in memory. It
will live in the block we allocate in the called routine's
frame that simulates the incoming argument area. Do nothing
now; we will call store_expr later. */
;
with the parameter now; we will call store_expr later. In
this case, however, we must ensure that the virtual stack and
incoming arg rtx values are expanded now so that we can be
sure we have enough slots in the const equiv map since the
store_expr call can easily blow the size estimate. */
if (DECL_FRAME_SIZE (fndecl) != 0)
copy_rtx_and_substitute (virtual_stack_vars_rtx, map, 0);
if (DECL_SAVED_INSNS (fndecl)->args_size != 0)
copy_rtx_and_substitute (virtual_incoming_args_rtx, map, 0);
}
else if (GET_CODE (loc) == REG)
process_reg_param (map, loc, copy);
@@ -816,8 +825,8 @@ expand_inline_function (fndecl, parms, target, ignore, type,
/* Compute the address in the area we reserved and store the
value there. */
temp = copy_rtx_and_substitute (loc, map);
subst_constants (&temp, NULL_RTX, map);
temp = copy_rtx_and_substitute (loc, map, 1);
subst_constants (&temp, NULL_RTX, map, 1);
apply_change_group ();
if (! memory_address_p (GET_MODE (temp), XEXP (temp, 0)))
temp = change_address (temp, VOIDmode, XEXP (temp, 0));
@@ -841,8 +850,8 @@ expand_inline_function (fndecl, parms, target, ignore, type,
{
if (GET_CODE (XEXP (loc, 0)) == ADDRESSOF)
{
temp = copy_rtx_and_substitute (loc, map);
subst_constants (&temp, NULL_RTX, map);
temp = copy_rtx_and_substitute (loc, map, 1);
subst_constants (&temp, NULL_RTX, map, 1);
apply_change_group ();
target = temp;
}
@@ -882,8 +891,8 @@ expand_inline_function (fndecl, parms, target, ignore, type,
}
else
{
temp = copy_rtx_and_substitute (loc, map);
subst_constants (&temp, NULL_RTX, map);
temp = copy_rtx_and_substitute (loc, map, 1);
subst_constants (&temp, NULL_RTX, map, 0);
apply_change_group ();
emit_move_insn (temp, structure_value_addr);
}
@@ -1033,7 +1042,7 @@ expand_inline_function (fndecl, parms, target, ignore, type,
/* If we must not delete the source,
load it into a new temporary. */
copy = emit_insn (copy_rtx_and_substitute (pattern, map));
copy = emit_insn (copy_rtx_and_substitute (pattern, map, 0));
new_set = single_set (copy);
if (new_set == 0)
@@ -1046,7 +1055,7 @@ expand_inline_function (fndecl, parms, target, ignore, type,
has a note on it, keep the insn. */
else if (rtx_equal_p (SET_DEST (set), SET_SRC (set))
&& REG_NOTES (insn) != 0)
copy = emit_insn (copy_rtx_and_substitute (pattern, map));
copy = emit_insn (copy_rtx_and_substitute (pattern, map, 0));
else
break;
}
@@ -1066,13 +1075,44 @@ expand_inline_function (fndecl, parms, target, ignore, type,
&& rtx_equal_p (SET_SRC (set),
static_chain_incoming_rtx))
{
rtx newdest = copy_rtx_and_substitute (SET_DEST (set), map);
rtx newdest = copy_rtx_and_substitute (SET_DEST (set), map, 1);
copy = emit_move_insn (newdest, static_chain_value);
static_chain_value = 0;
}
/* If this is setting the virtual stack vars register, this must
be the code at the handler for a builtin longjmp. The value
saved in the setjmp buffer will be the address of the frame
we've made for this inlined instance within our frame. But we
know the offset of that value so we can use it to reconstruct
our virtual stack vars register from that value. If we are
copying it from the stack pointer, leave it unchanged. */
else if (set != 0
&& rtx_equal_p (SET_DEST (set), virtual_stack_vars_rtx))
{
temp = map->reg_map[REGNO (SET_DEST (set))];
temp = VARRAY_CONST_EQUIV (map->const_equiv_varray,
REGNO (temp)).rtx;
if (GET_CODE (temp) != PLUS
|| ! rtx_equal_p (XEXP (temp, 0), virtual_stack_vars_rtx)
|| GET_CODE (XEXP (temp, 1)) != CONST_INT)
abort ();
if (rtx_equal_p (SET_SRC (set), stack_pointer_rtx))
temp = SET_SRC (set);
else
temp
= force_operand (plus_constant (SET_SRC (set),
- INTVAL (XEXP (temp, 1))),
NULL_RTX);
copy = emit_move_insn (SET_DEST (set), temp);
}
else
copy = emit_insn (copy_rtx_and_substitute (pattern, map));
copy = emit_insn (copy_rtx_and_substitute (pattern, map, 0));
/* REG_NOTES will be copied later. */
#ifdef HAVE_cc0
@@ -1113,7 +1153,7 @@ expand_inline_function (fndecl, parms, target, ignore, type,
pattern = gen_jump (local_return_label);
}
else
pattern = copy_rtx_and_substitute (PATTERN (insn), map);
pattern = copy_rtx_and_substitute (PATTERN (insn), map, 0);
copy = emit_jump_insn (pattern);
@@ -1149,13 +1189,14 @@ expand_inline_function (fndecl, parms, target, ignore, type,
break;
case CALL_INSN:
pattern = copy_rtx_and_substitute (PATTERN (insn), map);
pattern = copy_rtx_and_substitute (PATTERN (insn), map, 0);
copy = emit_call_insn (pattern);
/* Because the USAGE information potentially contains objects other
than hard registers, we need to copy it. */
CALL_INSN_FUNCTION_USAGE (copy)
= copy_rtx_and_substitute (CALL_INSN_FUNCTION_USAGE (insn), map);
= copy_rtx_and_substitute (CALL_INSN_FUNCTION_USAGE (insn),
map, 0);
#ifdef HAVE_cc0
if (cc0_insn)
@@ -1238,10 +1279,11 @@ expand_inline_function (fndecl, parms, target, ignore, type,
&& map->insn_map[INSN_UID (insn)]
&& REG_NOTES (insn))
{
rtx tem = copy_rtx_and_substitute (REG_NOTES (insn), map);
rtx tem = copy_rtx_and_substitute (REG_NOTES (insn), map, 0);
/* We must also do subst_constants, in case one of our parameters
has const type and constant value. */
subst_constants (&tem, NULL_RTX, map);
subst_constants (&tem, NULL_RTX, map, 0);
apply_change_group ();
REG_NOTES (map->insn_map[INSN_UID (insn)]) = tem;
}
@@ -1334,7 +1376,7 @@ integrate_parm_decls (args, map, arg_vector)
register tree decl = build_decl (VAR_DECL, DECL_NAME (tail),
TREE_TYPE (tail));
rtx new_decl_rtl
= copy_rtx_and_substitute (RTVEC_ELT (arg_vector, i), map);
= copy_rtx_and_substitute (RTVEC_ELT (arg_vector, i), map, 1);
DECL_ARG_TYPE (decl) = DECL_ARG_TYPE (tail);
/* We really should be setting DECL_INCOMING_RTL to something reasonable
@@ -1349,7 +1391,7 @@ integrate_parm_decls (args, map, arg_vector)
debugging information contains the actual register, instead of the
virtual register. Do this by not passing an insn to
subst_constants. */
subst_constants (&new_decl_rtl, NULL_RTX, map);
subst_constants (&new_decl_rtl, NULL_RTX, map, 1);
apply_change_group ();
DECL_RTL (decl) = new_decl_rtl;
}
@@ -1385,12 +1427,13 @@ integrate_decl_tree (let, map)
if (DECL_RTL (t) != 0)
{
DECL_RTL (d) = copy_rtx_and_substitute (DECL_RTL (t), map);
DECL_RTL (d) = copy_rtx_and_substitute (DECL_RTL (t), map, 1);
/* Fully instantiate the address with the equivalent form so that the
debugging information contains the actual register, instead of the
virtual register. Do this by not passing an insn to
subst_constants. */
subst_constants (&DECL_RTL (d), NULL_RTX, map);
subst_constants (&DECL_RTL (d), NULL_RTX, map, 1);
apply_change_group ();
}
/* These args would always appear unused, if not for this. */
@@ -1437,21 +1480,26 @@ integrate_decl_tree (let, map)
return new_block;
}
/* Create a new copy of an rtx.
Recursively copies the operands of the rtx,
/* Create a new copy of an rtx. Recursively copies the operands of the rtx,
except for those few rtx codes that are sharable.
We always return an rtx that is similar to that incoming rtx, with the
exception of possibly changing a REG to a SUBREG or vice versa. No
rtl is ever emitted.
If FOR_LHS is nonzero, it means we are processing something that will
be the LHS of a SET. In that case, we copy RTX_UNCHANGING_P even if
inlining since we need to be conservative in how it is set for
such cases.
Handle constants that need to be placed in the constant pool by
calling `force_const_mem'. */
rtx
copy_rtx_and_substitute (orig, map)
copy_rtx_and_substitute (orig, map, for_lhs)
register rtx orig;
struct inline_remap *map;
int for_lhs;
{
register rtx copy, temp;
register int i, j;
@@ -1593,7 +1641,7 @@ copy_rtx_and_substitute (orig, map)
return map->reg_map[regno];
case SUBREG:
copy = copy_rtx_and_substitute (SUBREG_REG (orig), map);
copy = copy_rtx_and_substitute (SUBREG_REG (orig), map, for_lhs);
/* SUBREG is ordinary, but don't make nested SUBREGs. */
if (GET_CODE (copy) == SUBREG)
return gen_rtx_SUBREG (GET_MODE (orig), SUBREG_REG (copy),
@@ -1616,7 +1664,8 @@ copy_rtx_and_substitute (orig, map)
case ADDRESSOF:
copy = gen_rtx_ADDRESSOF (mode,
copy_rtx_and_substitute (XEXP (orig, 0), map),
copy_rtx_and_substitute (XEXP (orig, 0),
map, for_lhs),
0, ADDRESSOF_DECL(orig));
regno = ADDRESSOF_REGNO (orig);
if (map->reg_map[regno])
@@ -1644,7 +1693,7 @@ copy_rtx_and_substitute (orig, map)
to (use foo) if the original insn didn't have a subreg.
Removing the subreg distorts the VAX movstrhi pattern
by changing the mode of an operand. */
copy = copy_rtx_and_substitute (XEXP (orig, 0), map);
copy = copy_rtx_and_substitute (XEXP (orig, 0), map, code == CLOBBER);
if (GET_CODE (copy) == SUBREG && GET_CODE (XEXP (orig, 0)) != SUBREG)
copy = SUBREG_REG (copy);
return gen_rtx_fmt_e (code, VOIDmode, copy);
@@ -1697,7 +1746,9 @@ copy_rtx_and_substitute (orig, map)
if (inlining)
{
rtx temp = force_const_mem (const_mode,
copy_rtx_and_substitute (constant, map));
copy_rtx_and_substitute (constant,
map, 0));
#if 0
/* Legitimizing the address here is incorrect.
@@ -1723,9 +1774,9 @@ copy_rtx_and_substitute (orig, map)
return temp;
}
else if (GET_CODE (constant) == LABEL_REF)
return XEXP (force_const_mem (GET_MODE (orig),
copy_rtx_and_substitute (constant,
map)),
return XEXP (force_const_mem
(GET_MODE (orig),
copy_rtx_and_substitute (constant, map, for_lhs)),
0);
}
else
@@ -1792,8 +1843,8 @@ copy_rtx_and_substitute (orig, map)
(GET_MODE (orig),
gen_rtx_MEM (GET_MODE (XEXP (orig, 0)),
copy_rtx_and_substitute (XEXP (XEXP (orig, 0), 0),
map)),
copy_rtx_and_substitute (XEXP (orig, 1), map));
map, 0)),
copy_rtx_and_substitute (XEXP (orig, 1), map, 0));
break;
#if 0
@@ -1815,19 +1866,25 @@ copy_rtx_and_substitute (orig, map)
rtx equiv_loc;
HOST_WIDE_INT loc_offset;
copy_rtx_and_substitute (SET_DEST (orig), map);
copy_rtx_and_substitute (SET_DEST (orig), map, for_lhs);
equiv_reg = map->reg_map[REGNO (SET_DEST (orig))];
equiv_loc = VARRAY_CONST_EQUIV (map->const_equiv_varray, REGNO (equiv_reg)).rtx;
equiv_loc = VARRAY_CONST_EQUIV (map->const_equiv_varray,
REGNO (equiv_reg)).rtx;
loc_offset
= GET_CODE (equiv_loc) == REG ? 0 : INTVAL (XEXP (equiv_loc, 1));
return gen_rtx_SET (VOIDmode, SET_DEST (orig),
force_operand
(plus_constant
(copy_rtx_and_substitute (SET_SRC (orig), map),
(copy_rtx_and_substitute (SET_SRC (orig),
map, 0),
- loc_offset),
NULL_RTX));
}
else
return gen_rtx_SET (VOIDmode,
copy_rtx_and_substitute (SET_DEST (orig), map, 1),
copy_rtx_and_substitute (SET_SRC (orig), map, 0));
break;
case MEM:
@@ -1835,9 +1892,13 @@ copy_rtx_and_substitute (orig, map)
&& GET_CODE (XEXP (orig, 0)) == SYMBOL_REF
&& CONSTANT_POOL_ADDRESS_P (XEXP (orig, 0)))
{
enum machine_mode const_mode = get_pool_mode_for_function (inlining, XEXP (orig, 0));
rtx constant = get_pool_constant_for_function (inlining, XEXP (orig, 0));
constant = copy_rtx_and_substitute (constant, map);
enum machine_mode const_mode
= get_pool_mode_for_function (inlining, XEXP (orig, 0));
rtx constant
= get_pool_constant_for_function (inlining, XEXP (orig, 0));
constant = copy_rtx_and_substitute (constant, map, 0);
/* If this was an address of a constant pool entry that itself
had to be placed in the constant pool, it might not be a
valid address. So the recursive call might have turned it
@@ -1846,22 +1907,16 @@ copy_rtx_and_substitute (orig, map)
MEM into a REG, but we'll assume that it is safe. */
if (! CONSTANT_P (constant))
return constant;
return validize_mem (force_const_mem (const_mode, constant));
}
copy = rtx_alloc (MEM);
PUT_MODE (copy, mode);
XEXP (copy, 0) = copy_rtx_and_substitute (XEXP (orig, 0), map);
XEXP (copy, 0) = copy_rtx_and_substitute (XEXP (orig, 0), map, 0);
MEM_COPY_ATTRIBUTES (copy, orig);
MEM_ALIAS_SET (copy) = MEM_ALIAS_SET (orig);
/* If doing function inlining, this MEM might not be const in the
function that it is being inlined into, and thus may not be
unchanging after function inlining. Constant pool references are
handled elsewhere, so this doesn't lose RTX_UNCHANGING_P bits
for them. */
if (! map->integrating)
RTX_UNCHANGING_P (copy) = RTX_UNCHANGING_P (orig);
RTX_UNCHANGING_P (copy) = RTX_UNCHANGING_P (orig);
return copy;
default:
@@ -1886,7 +1941,8 @@ copy_rtx_and_substitute (orig, map)
break;
case 'e':
XEXP (copy, i) = copy_rtx_and_substitute (XEXP (orig, i), map);
XEXP (copy, i)
= copy_rtx_and_substitute (XEXP (orig, i), map, for_lhs);
break;
case 'u':
@@ -1902,7 +1958,8 @@ copy_rtx_and_substitute (orig, map)
XVEC (copy, i) = rtvec_alloc (XVECLEN (orig, i));
for (j = 0; j < XVECLEN (copy, i); j++)
XVECEXP (copy, i, j)
= copy_rtx_and_substitute (XVECEXP (orig, i, j), map);
= copy_rtx_and_substitute (XVECEXP (orig, i, j),
map, for_lhs);
}
break;
@@ -1947,9 +2004,14 @@ try_constants (insn, map)
int i;
map->num_sets = 0;
subst_constants (&PATTERN (insn), insn, map);
/* Apply the changes if they are valid; otherwise discard them. */
/* First try just updating addresses, then other things. This is
important when we have something like the store of a constant
into memory and we can update the memory address but the machine
does not support a constant source. */
subst_constants (&PATTERN (insn), insn, map, 1);
apply_change_group ();
subst_constants (&PATTERN (insn), insn, map, 0);
apply_change_group ();
/* Show we don't know the value of anything stored or clobbered. */
@@ -1996,16 +2058,19 @@
into insns; cse will do the latter task better.
This function is also used to adjust address of items previously addressed
via the virtual stack variable or virtual incoming arguments registers. */
via the virtual stack variable or virtual incoming arguments registers.
If MEMONLY is nonzero, only make changes inside a MEM. */
static void
subst_constants (loc, insn, map)
subst_constants (loc, insn, map, memonly)
rtx *loc;
rtx insn;
struct inline_remap *map;
int memonly;
{
rtx x = *loc;
register int i;
register int i, j;
register enum rtx_code code;
register const char *format_ptr;
int num_changes = num_validated_changes ();
@@ -2027,7 +2092,8 @@ subst_constants (loc, insn, map)
#ifdef HAVE_cc0
case CC0:
validate_change (insn, loc, map->last_cc0_value, 1);
if (! memonly)
validate_change (insn, loc, map->last_cc0_value, 1);
return;
#endif
@@ -2036,24 +2102,25 @@ subst_constants (loc, insn, map)
/* The only thing we can do with a USE or CLOBBER is possibly do
some substitutions in a MEM within it. */
if (GET_CODE (XEXP (x, 0)) == MEM)
subst_constants (&XEXP (XEXP (x, 0), 0), insn, map);
subst_constants (&XEXP (XEXP (x, 0), 0), insn, map, 0);
return;
case REG:
/* Substitute for parms and known constants. Don't replace
hard regs used as user variables with constants. */
{
int regno = REGNO (x);
struct const_equiv_data *p;
if (! memonly)
{
int regno = REGNO (x);
struct const_equiv_data *p;
if (! (regno < FIRST_PSEUDO_REGISTER && REG_USERVAR_P (x))
&& (size_t) regno < VARRAY_SIZE (map->const_equiv_varray)
&& (p = &VARRAY_CONST_EQUIV (map->const_equiv_varray, regno),
p->rtx != 0)
&& p->age >= map->const_age)
validate_change (insn, loc, p->rtx, 1);
return;
}
if (! (regno < FIRST_PSEUDO_REGISTER && REG_USERVAR_P (x))
&& (size_t) regno < VARRAY_SIZE (map->const_equiv_varray)
&& (p = &VARRAY_CONST_EQUIV (map->const_equiv_varray, regno),
p->rtx != 0)
&& p->age >= map->const_age)
validate_change (insn, loc, p->rtx, 1);
}
return;
case SUBREG:
/* SUBREG applied to something other than a reg
@@ -2061,7 +2128,7 @@ subst_constants (loc, insn, map)
be a special hack and we don't know how to treat it specially.
Consider for example mulsidi3 in m68k.md.
Ordinary SUBREG of a REG needs this special treatment. */
if (GET_CODE (SUBREG_REG (x)) == REG)
if (! memonly && GET_CODE (SUBREG_REG (x)) == REG)
{
rtx inner = SUBREG_REG (x);
rtx new = 0;
@@ -2071,7 +2138,7 @@ subst_constants (loc, insn, map)
see what is inside, try to form the new SUBREG and see if that is
valid. We handle two cases: extracting a full word in an
integral mode and extracting the low part. */
subst_constants (&inner, NULL_RTX, map);
subst_constants (&inner, NULL_RTX, map, 0);
if (GET_MODE_CLASS (GET_MODE (x)) == MODE_INT
&& GET_MODE_SIZE (GET_MODE (x)) == UNITS_PER_WORD
@@ -2091,11 +2158,11 @@ subst_constants (loc, insn, map)
break;
case MEM:
subst_constants (&XEXP (x, 0), insn, map);
subst_constants (&XEXP (x, 0), insn, map, 0);
/* If a memory address got spoiled, change it back. */
if (insn != 0 && num_validated_changes () != num_changes
&& !memory_address_p (GET_MODE (x), XEXP (x, 0)))
if (! memonly && insn != 0 && num_validated_changes () != num_changes
&& ! memory_address_p (GET_MODE (x), XEXP (x, 0)))
cancel_changes (num_changes);
return;
@@ -2108,7 +2175,7 @@ subst_constants (loc, insn, map)
rtx dest = *dest_loc;
rtx src, tem;
subst_constants (&SET_SRC (x), insn, map);
subst_constants (&SET_SRC (x), insn, map, memonly);
src = SET_SRC (x);
while (GET_CODE (*dest_loc) == ZERO_EXTRACT
@@ -2117,15 +2184,15 @@ subst_constants (loc, insn, map)
{
if (GET_CODE (*dest_loc) == ZERO_EXTRACT)
{
subst_constants (&XEXP (*dest_loc, 1), insn, map);
subst_constants (&XEXP (*dest_loc, 2), insn, map);
subst_constants (&XEXP (*dest_loc, 1), insn, map, memonly);
subst_constants (&XEXP (*dest_loc, 2), insn, map, memonly);
}
dest_loc = &XEXP (*dest_loc, 0);
}
/* Do substitute in the address of a destination in memory. */
if (GET_CODE (*dest_loc) == MEM)
subst_constants (&XEXP (*dest_loc, 0), insn, map);
subst_constants (&XEXP (*dest_loc, 0), insn, map, 0);
/* Check for the case of DEST a SUBREG, both it and the underlying
register are less than one word, and the SUBREG has the wider mode.
@@ -2187,7 +2254,7 @@ subst_constants (loc, insn, map)
case 'e':
if (XEXP (x, i))
subst_constants (&XEXP (x, i), insn, map);
subst_constants (&XEXP (x, i), insn, map, memonly);
break;
case 'u':
@@ -2199,11 +2266,9 @@ subst_constants (loc, insn, map)
case 'E':
if (XVEC (x, i) != NULL && XVECLEN (x, i) != 0)
{
int j;
for (j = 0; j < XVECLEN (x, i); j++)
subst_constants (&XVECEXP (x, i, j), insn, map);
}
for (j = 0; j < XVECLEN (x, i); j++)
subst_constants (&XVECEXP (x, i, j), insn, map, memonly);
break;
default:
@@ -2213,7 +2278,8 @@ subst_constants (loc, insn, map)
/* If this is a commutative operation, move a constant to the second
operand unless the second operand is already a CONST_INT. */
if ((GET_RTX_CLASS (code) == 'c' || code == NE || code == EQ)
if (! memonly
&& (GET_RTX_CLASS (code) == 'c' || code == NE || code == EQ)
&& CONSTANT_P (XEXP (x, 0)) && GET_CODE (XEXP (x, 1)) != CONST_INT)
{
rtx tem = XEXP (x, 0);
@@ -2222,45 +2288,49 @@ subst_constants (loc, insn, map)
}
/* Simplify the expression in case we put in some constants. */
switch (GET_RTX_CLASS (code))
{
case '1':
if (op0_mode == MAX_MACHINE_MODE)
abort ();
new = simplify_unary_operation (code, GET_MODE (x),
XEXP (x, 0), op0_mode);
break;
case '<':
if (! memonly)
switch (GET_RTX_CLASS (code))
{
enum machine_mode op_mode = GET_MODE (XEXP (x, 0));
if (op_mode == VOIDmode)
op_mode = GET_MODE (XEXP (x, 1));
new = simplify_relational_operation (code, op_mode,
XEXP (x, 0), XEXP (x, 1));
case '1':
if (op0_mode == MAX_MACHINE_MODE)
abort ();
new = simplify_unary_operation (code, GET_MODE (x),
XEXP (x, 0), op0_mode);
break;
case '<':
{
enum machine_mode op_mode = GET_MODE (XEXP (x, 0));
if (op_mode == VOIDmode)
op_mode = GET_MODE (XEXP (x, 1));
new = simplify_relational_operation (code, op_mode,
XEXP (x, 0), XEXP (x, 1));
#ifdef FLOAT_STORE_FLAG_VALUE
if (new != 0 && GET_MODE_CLASS (GET_MODE (x)) == MODE_FLOAT)
if (new != 0 && GET_MODE_CLASS (GET_MODE (x)) == MODE_FLOAT)
new = ((new == const0_rtx) ? CONST0_RTX (GET_MODE (x))
: CONST_DOUBLE_FROM_REAL_VALUE (FLOAT_STORE_FLAG_VALUE,
GET_MODE (x)));
#endif
break;
break;
}
case '2':
case 'c':
new = simplify_binary_operation (code, GET_MODE (x),
XEXP (x, 0), XEXP (x, 1));
break;
case '2':
case 'c':
new = simplify_binary_operation (code, GET_MODE (x),
XEXP (x, 0), XEXP (x, 1));
break;
case 'b':
case '3':
if (op0_mode == MAX_MACHINE_MODE)
abort ();
new = simplify_ternary_operation (code, GET_MODE (x), op0_mode,
XEXP (x, 0), XEXP (x, 1), XEXP (x, 2));
break;
}
case 'b':
case '3':
if (op0_mode == MAX_MACHINE_MODE)
abort ();
new = simplify_ternary_operation (code, GET_MODE (x), op0_mode,
XEXP (x, 0), XEXP (x, 1),
XEXP (x, 2));
break;
}
if (new)
validate_change (insn, loc, new, 1);


@@ -114,7 +114,7 @@ struct inline_remap
/* Return a copy of an rtx (as needed), substituting pseudo-register,
labels, and frame-pointer offsets as necessary. */
extern rtx copy_rtx_and_substitute PROTO((rtx, struct inline_remap *));
extern rtx copy_rtx_and_substitute PROTO((rtx, struct inline_remap *, int));
extern void try_constants PROTO((rtx, struct inline_remap *));


@@ -88,7 +88,7 @@ in the following sections.
@item Overall Options
@xref{Overall Options,,Options Controlling the Kind of Output}.
@smallexample
-c -S -E -o @var{file} -pipe -v --help -x @var{language}
-c -S -E -o @var{file} -pipe -pass-exit-codes -v --help -x @var{language}
@end smallexample
@item C Language Options
@@ -428,7 +428,7 @@ in the following sections.
@xref{Code Gen Options,,Options for Code Generation Conventions}.
@smallexample
-fcall-saved-@var{reg} -fcall-used-@var{reg}
-fexceptions -ffixed-@var{reg} -finhibit-size-directive
-fexceptions -funwind-tables -ffixed-@var{reg} -finhibit-size-directive
-fcheck-memory-usage -fprefix-function-name
-fno-common -fno-ident -fno-gnu-linker
-fpcc-struct-return -fpic -fPIC
@@ -527,6 +527,13 @@ assembler assembler-with-cpp
Turn off any specification of a language, so that subsequent files are
handled according to their file name suffixes (as they are if @samp{-x}
has not been used at all).
@item -pass-exit-codes
Normally the @code{gcc} program will exit with the code of 1 if any
phase of the compiler returns a non-success return code. If you specify
@samp{-pass-exit-codes}, the @code{gcc} program will instead return with
the numerically highest error produced by any phase that returned an error
indication.
@end table
If you only want some of the stages of compilation, you can use
@@ -6693,6 +6700,12 @@ properly with exception handlers written in C++. You may also wish to
disable this option if you are compiling older C++ programs that don't
use exception handling.
@item -funwind-tables
Similar to @code{-fexceptions}, except that it will just generate any needed
static data, but will not affect the generated code in any other way.
You will normally not enable this option; instead, a language processor
that needs this handling would enable it on your behalf.
@item -fpcc-struct-return
Return ``short'' @code{struct} and @code{union} values in memory like
longer ones, rather than in registers. This convention is less


@@ -1028,26 +1028,52 @@ jump_optimize_1 (f, cross_jump, noop_moves, after_regscan, mark_labels_only)
need to remove the BARRIER if we succeed. We can only
have one such jump since there must be a label after
the BARRIER and it's either ours, in which case it's the
only one or some other, in which case we'd fail. */
only one or some other, in which case we'd fail.
Likewise if it's a CALL_INSN followed by a BARRIER. */
if (simplejump_p (temp1))
changed_jump = temp1;
if (simplejump_p (temp1)
|| (GET_CODE (temp1) == CALL_INSN
&& NEXT_INSN (temp1) != 0
&& GET_CODE (NEXT_INSN (temp1)) == BARRIER))
{
if (changed_jump == 0)
changed_jump = temp1;
else
changed_jump
= gen_rtx_INSN_LIST (VOIDmode, temp1, changed_jump);
}
/* See if we are allowed another insn and if this insn
is one we think we may be able to handle. */
if (++num_insns > BRANCH_COST
|| last_insn
|| (temp2 = single_set (temp1)) == 0
|| side_effects_p (SET_SRC (temp2))
|| may_trap_p (SET_SRC (temp2)))
failed = 1;
else
|| (((temp2 = single_set (temp1)) == 0
|| side_effects_p (SET_SRC (temp2))
|| may_trap_p (SET_SRC (temp2)))
&& GET_CODE (temp1) != CALL_INSN))
failed = 1;
else if (temp2 != 0)
validate_change (temp1, &SET_SRC (temp2),
gen_rtx_IF_THEN_ELSE
(GET_MODE (SET_DEST (temp2)),
copy_rtx (ourcond),
SET_SRC (temp2), SET_DEST (temp2)),
1);
else
{
/* This is a CALL_INSN that doesn't have a SET. */
rtx *call_loc = &PATTERN (temp1);
if (GET_CODE (*call_loc) == PARALLEL)
call_loc = &XVECEXP (*call_loc, 0, 0);
validate_change (temp1, call_loc,
gen_rtx_IF_THEN_ELSE
(VOIDmode, copy_rtx (ourcond),
*call_loc, const0_rtx),
1);
}
if (modified_in_p (ourcond, temp1))
last_insn = 1;
@@ -1073,10 +1099,13 @@ jump_optimize_1 (f, cross_jump, noop_moves, after_regscan, mark_labels_only)
if (changed_jump != 0)
{
if (GET_CODE (NEXT_INSN (changed_jump)) != BARRIER)
abort ();
while (GET_CODE (changed_jump) == INSN_LIST)
{
delete_barrier (NEXT_INSN (XEXP (changed_jump, 0)));
changed_jump = XEXP (changed_jump, 1);
}
delete_insn (NEXT_INSN (changed_jump));
delete_barrier (NEXT_INSN (changed_jump));
}
delete_insn (insn);
@ -4037,6 +4066,18 @@ delete_jump (insn)
delete_computation (insn);
}
/* Verify INSN is a BARRIER and delete it. */
void
delete_barrier (insn)
rtx insn;
{
if (GET_CODE (insn) != BARRIER)
abort ();
delete_insn (insn);
}
/* Recursively delete prior insns that compute the value (used only by INSN
which the caller is deleting) stored in the register mentioned by NOTE
which is a REG_DEAD note associated with INSN. */


@ -1514,7 +1514,7 @@ __bb_exit_func (void)
/* If the file exists, and the number of counts in it is the same,
then merge them in. */
if ((da_file = fopen (ptr->filename, "r")) != 0)
if ((da_file = fopen (ptr->filename, "rb")) != 0)
{
long n_counts = 0;
@ -1547,7 +1547,7 @@ __bb_exit_func (void)
fprintf (stderr, "arc profiling: Error closing output file %s.\n",
ptr->filename);
}
if ((da_file = fopen (ptr->filename, "w")) == 0)
if ((da_file = fopen (ptr->filename, "wb")) == 0)
{
fprintf (stderr, "arc profiling: Can't open output file %s.\n",
ptr->filename);


@ -197,6 +197,11 @@ static int loop_mems_allocated;
static int unknown_address_altered;
/* The above doesn't count any readonly memory locations that are stored.
This does. */
static int unknown_constant_address_altered;
/* Count of movable (i.e. invariant) instructions discovered in the loop. */
static int num_movables;
@ -2386,9 +2391,9 @@ constant_high_bytes (p, loop_start)
/* Scan a loop setting the elements `cont', `vtop', `loops_enclosed',
`has_call', `has_volatile', and `has_tablejump' within LOOP_INFO.
Set the global variables `unknown_address_altered' and
`num_mem_sets'. Also, fill in the array `loop_mems' and the list
`loop_store_mems'. */
Set the global variables `unknown_address_altered',
`unknown_constant_address_altered', and `num_mem_sets'. Also, fill
in the array `loop_mems' and the list `loop_store_mems'. */
static void
prescan_loop (start, end, loop_info)
@ -2414,6 +2419,7 @@ prescan_loop (start, end, loop_info)
loop_info->vtop = 0;
unknown_address_altered = 0;
unknown_constant_address_altered = 0;
loop_store_mems = NULL_RTX;
first_loop_store_insn = NULL_RTX;
loop_mems_idx = 0;
@ -3166,11 +3172,15 @@ note_addr_stored (x, y, data)
num_mem_sets++;
/* BLKmode MEM means all memory is clobbered. */
if (GET_MODE (x) == BLKmode)
unknown_address_altered = 1;
if (GET_MODE (x) == BLKmode)
{
if (RTX_UNCHANGING_P (x))
unknown_constant_address_altered = 1;
else
unknown_address_altered = 1;
if (unknown_address_altered)
return;
return;
}
loop_store_mems = gen_rtx_EXPR_LIST (VOIDmode, x, loop_store_mems);
}
@ -3275,20 +3285,12 @@ invariant_p (x)
return VARRAY_INT (set_in_loop, REGNO (x)) == 0;
case MEM:
/* Volatile memory references must be rejected. Do this before
checking for read-only items, so that volatile read-only items
will be rejected also. */
if (MEM_VOLATILE_P (x))
return 0;
/* Read-only items (such as constants in a constant pool) are
invariant if their address is. */
if (RTX_UNCHANGING_P (x))
break;
/* If we had a subroutine call, any location in memory could have been
clobbered. */
if (unknown_address_altered)
/* If we had a subroutine call, any location in memory could
have been clobbered. We used to test here for volatile and
readonly, but true_dependence knows how to do that better
than we do. */
if (RTX_UNCHANGING_P (x)
? unknown_constant_address_altered : unknown_address_altered)
return 0;
/* See if there is any dependence between a store and this load. */
@ -3298,6 +3300,7 @@ invariant_p (x)
if (true_dependence (XEXP (mem_list_entry, 0), VOIDmode,
x, rtx_varies_p))
return 0;
mem_list_entry = XEXP (mem_list_entry, 1);
}
@ -8013,6 +8016,7 @@ check_dbra_loop (loop_end, insn_count, loop_start, loop_info)
reversible_mem_store
= (! unknown_address_altered
&& ! unknown_constant_address_altered
&& ! invariant_p (XEXP (XEXP (loop_store_mems, 0), 0)));
/* If the store depends on a register that is set after the


@ -2263,10 +2263,29 @@ expand_abs (mode, op0, target, safe)
if (temp != 0)
return temp;
/* If we have a MAX insn, we can do this as MAX (x, -x). */
if (smax_optab->handlers[(int) mode].insn_code != CODE_FOR_nothing)
{
rtx last = get_last_insn ();
temp = expand_unop (mode, neg_optab, op0, NULL_RTX, 0);
if (temp != 0)
temp = expand_binop (mode, smax_optab, op0, temp, target, 0,
OPTAB_WIDEN);
if (temp != 0)
return temp;
delete_insns_since (last);
}
/* If this machine has expensive jumps, we can do integer absolute
value of X as (((signed) x >> (W-1)) ^ x) - ((signed) x >> (W-1)),
where W is the width of MODE. */
where W is the width of MODE. But don't do this if the machine has
conditional arithmetic since the branches will be converted into
a conditional negation insn. */
#ifndef HAVE_conditional_arithmetic
if (GET_MODE_CLASS (mode) == MODE_INT && BRANCH_COST >= 2)
{
rtx extended = expand_shift (RSHIFT_EXPR, mode, op0,
@ -2282,6 +2301,7 @@ expand_abs (mode, op0, target, safe)
if (temp != 0)
return temp;
}
#endif
/* If that does not win, use conditional jump and negate. */


@ -336,6 +336,17 @@ extern rtx peephole PROTO((rtx));
/* Write all the constants in the constant pool. */
extern void output_constant_pool PROTO((char *, tree));
/* Return nonzero if VALUE is a valid constant-valued expression
for use in initializing a static variable; one that can be an
element of a "constant" initializer.
Return null_pointer_node if the value is absolute;
if it is relocatable, return the variable that determines the relocation.
We assume that VALUE has been folded as much as possible;
therefore, we do not need to check for such things as
arithmetic-combinations of integers. */
extern tree initializer_constant_valid_p PROTO((tree, tree));
/* Output assembler code for constant EXP to FILE, with no label.
This includes the pseudo-op such as ".int" or ".byte", and a newline.
Assumes output_addressed_constants has been done on EXP already.


@ -1423,7 +1423,7 @@ init_branch_prob (filename)
strcpy (data_file, filename);
strip_off_ending (data_file, len);
strcat (data_file, ".bb");
if ((bb_file = fopen (data_file, "w")) == 0)
if ((bb_file = fopen (data_file, "wb")) == 0)
pfatal_with_name (data_file);
/* Open an output file for the program flow graph. */
@ -1432,7 +1432,7 @@ init_branch_prob (filename)
strcpy (bbg_file_name, filename);
strip_off_ending (bbg_file_name, len);
strcat (bbg_file_name, ".bbg");
if ((bbg_file = fopen (bbg_file_name, "w")) == 0)
if ((bbg_file = fopen (bbg_file_name, "wb")) == 0)
pfatal_with_name (bbg_file_name);
/* Initialize to zero, to ensure that the first file name will be
@ -1447,7 +1447,7 @@ init_branch_prob (filename)
strcpy (da_file_name, filename);
strip_off_ending (da_file_name, len);
strcat (da_file_name, ".da");
if ((da_file = fopen (da_file_name, "r")) == 0)
if ((da_file = fopen (da_file_name, "rb")) == 0)
warning ("file %s not found, execution counts assumed to be zero.",
da_file_name);


@ -1020,6 +1020,11 @@ register_operand (op, mode)
op = SUBREG_REG (op);
}
/* If we have an ADDRESSOF, consider it valid since it will be
converted into something that will not be a MEM. */
if (GET_CODE (op) == ADDRESSOF)
return 1;
/* We don't consider registers whose class is NO_REGS
to be a register operand. */
return (GET_CODE (op) == REG


@ -1436,14 +1436,19 @@ record_reg_classes (n_alts, n_ops, ops, modes, subreg_changes_size,
struct costs *pp = &this_op_costs[i];
for (class = 0; class < N_REG_CLASSES; class++)
pp->cost[class] = may_move_cost[class][(int) classes[i]];
pp->cost[class]
= (recog_data.operand_type[i] == OP_IN
? may_move_cost[class][(int) classes[i]]
: move_cost[(int) classes[i]][class]);
/* If the alternative actually allows memory, make things
a bit cheaper since we won't need an extra insn to
load it. */
pp->mem_cost = (MEMORY_MOVE_COST (mode, classes[i], 1)
- allows_mem);
pp->mem_cost
= (MEMORY_MOVE_COST (mode, classes[i],
recog_data.operand_type[i] == OP_IN)
- allows_mem);
/* If we have assigned a class to this register in our
first pass, add a cost to this alternative corresponding
@ -1452,7 +1457,8 @@ record_reg_classes (n_alts, n_ops, ops, modes, subreg_changes_size,
if (prefclass)
alt_cost
+= may_move_cost[(unsigned char)prefclass[REGNO (op)]][(int) classes[i]];
+= (may_move_cost[(unsigned char) prefclass[REGNO (op)]]
[(int) classes[i]]);
}
}


@ -2662,15 +2662,6 @@ find_reloads (insn, replace, ind_levels, live_known, reload_reg_p)
&& reg_alternate_class (REGNO (recog_data.operand[i])) == NO_REGS);
}
#ifdef HAVE_cc0
/* If we made any reloads for addresses, see if they violate a
"no input reloads" requirement for this insn. */
if (no_input_reloads)
for (i = 0; i < n_reloads; i++)
if (rld[i].in != 0)
abort ();
#endif
/* If this is simply a copy from operand 1 to operand 0, merge the
preferred classes for the operands. */
if (set != 0 && noperands >= 2 && recog_data.operand[0] == SET_DEST (set)
@ -4112,6 +4103,18 @@ find_reloads (insn, replace, ind_levels, live_known, reload_reg_p)
rld[j].in = 0;
}
#ifdef HAVE_cc0
/* If we made any reloads for addresses, see if they violate a
"no input reloads" requirement for this insn. But loads that we
do after the insn (such as for output addresses) are fine. */
if (no_input_reloads)
for (i = 0; i < n_reloads; i++)
if (reload_in[i] != 0
&& reload_when_needed[i] != RELOAD_FOR_OUTADDR_ADDRESS
&& reload_when_needed[i] != RELOAD_FOR_OUTPUT_ADDRESS)
abort ();
#endif
/* Set which reloads must use registers not used in any group. Start
with those that conflict with a group and then include ones that
conflict with ones that are already known to conflict with a group. */


@ -806,30 +806,28 @@ read_rtx (infile)
tmp_code = UNKNOWN;
for (i=0; i < NUM_RTX_CODE; i++) /* @@ might speed this search up */
{
if (!(strcmp (tmp_char, GET_RTX_NAME (i))))
{
tmp_code = (RTX_CODE) i; /* get value for name */
break;
}
}
for (i = 0; i < NUM_RTX_CODE; i++)
if (! strcmp (tmp_char, GET_RTX_NAME (i)))
{
tmp_code = (RTX_CODE) i; /* get value for name */
break;
}
if (tmp_code == UNKNOWN)
{
fprintf (stderr,
"Unknown rtx read in rtl.read_rtx(). Code name was %s .",
tmp_char);
}
fatal ("Unknown rtx read in rtl.read_rtx(). Code name was %s .", tmp_char);
/* (NIL) stands for an expression that isn't there. */
if (tmp_code == NIL)
{
/* Discard the closeparen. */
while ((c = getc (infile)) && c != ')');
while ((c = getc (infile)) && c != ')')
;
return 0;
}
return_rtx = rtx_alloc (tmp_code); /* if we end up with an insn expression
then we free this space below. */
/* If we end up with an insn expression then we free this space below. */
return_rtx = rtx_alloc (tmp_code);
format_ptr = GET_RTX_FORMAT (GET_CODE (return_rtx));
/* If what follows is `: mode ', read it and
@ -838,13 +836,16 @@ read_rtx (infile)
i = read_skip_spaces (infile);
if (i == ':')
{
register int k;
read_name (tmp_char, infile);
for (k = 0; k < NUM_MACHINE_MODES; k++)
if (!strcmp (GET_MODE_NAME (k), tmp_char))
for (j = 0; j < NUM_MACHINE_MODES; j++)
if (! strcmp (GET_MODE_NAME (j), tmp_char))
break;
PUT_MODE (return_rtx, (enum machine_mode) k );
if (j == MAX_MACHINE_MODE)
fatal ("Unknown mode read in rtl.read_rtx(). Mode name was %s.",
tmp_char);
PUT_MODE (return_rtx, (enum machine_mode) j);
}
else
ungetc (i, infile);


@ -1059,6 +1059,7 @@ extern rtx find_equiv_reg PROTO((rtx, rtx, enum reg_class, int, short *, int, e
extern rtx squeeze_notes PROTO((rtx, rtx));
extern rtx delete_insn PROTO((rtx));
extern void delete_jump PROTO((rtx));
extern void delete_barrier PROTO((rtx));
extern rtx get_label_before PROTO((rtx));
extern rtx get_label_after PROTO((rtx));
extern rtx follow_jumps PROTO((rtx));


@ -403,6 +403,7 @@ reg_referenced_p (x, body)
case CALL:
case USE:
case IF_THEN_ELSE:
return reg_overlap_mentioned_p (x, body);
case TRAP_IF:


@ -1543,17 +1543,15 @@ simplify_plus_minus (code, mode, op0, op1)
struct cfc_args
{
/* Input */
rtx op0, op1;
/* Output */
int equal, op0lt, op1lt;
rtx op0, op1; /* Input */
int equal, op0lt, op1lt; /* Output */
};
static void
check_fold_consts (data)
PTR data;
{
struct cfc_args * args = (struct cfc_args *) data;
struct cfc_args *args = (struct cfc_args *) data;
REAL_VALUE_TYPE d0, d1;
REAL_VALUE_FROM_CONST_DOUBLE (d0, args->op0);


@ -1115,25 +1115,18 @@ fixup_gotos (thisblock, stack_level, cleanup_list, first_insn, dont_jump_in)
{
register rtx cleanup_insns;
/* Get the first non-label after the label
this goto jumps to. If that's before this scope begins,
we don't have a jump into the scope. */
rtx after_label = f->target_rtl;
while (after_label != 0 && GET_CODE (after_label) == CODE_LABEL)
after_label = NEXT_INSN (after_label);
/* If this fixup jumped into this contour from before the beginning
of this contour, report an error. */
of this contour, report an error. This code used to use
the first non-label insn after f->target_rtl, but that's
wrong since such insns can be added by things like put_var_into_stack
and have INSN_UIDs that are out of the range of the block. */
/* ??? Bug: this does not detect jumping in through intermediate
blocks that have stack levels or cleanups.
It detects only a problem with the innermost block
around the label. */
if (f->target != 0
&& (dont_jump_in || stack_level || cleanup_list)
/* If AFTER_LABEL is 0, it means the jump goes to the end
of the rtl, which means it jumps into this scope. */
&& (after_label == 0
|| INSN_UID (first_insn) < INSN_UID (after_label))
&& INSN_UID (first_insn) < INSN_UID (f->target_rtl)
&& INSN_UID (first_insn) > INSN_UID (f->before_jump)
&& ! DECL_ERROR_ISSUED (f->target))
{
@ -1345,6 +1338,7 @@ expand_asm_operands (string, outputs, inputs, clobbers, vol, filename, line)
for (tail = clobbers; tail; tail = TREE_CHAIN (tail))
{
char *regname = TREE_STRING_POINTER (TREE_VALUE (tail));
i = decode_reg_name (regname);
if (i >= 0 || i == -4)
++nclobbers;
@ -1372,11 +1366,13 @@ expand_asm_operands (string, outputs, inputs, clobbers, vol, filename, line)
while (tmp)
{
char *constraint = TREE_STRING_POINTER (TREE_PURPOSE (tmp));
if (n_occurrences (',', constraint) != nalternatives)
{
error ("operand constraints for `asm' differ in number of alternatives");
return;
}
if (TREE_CHAIN (tmp))
tmp = TREE_CHAIN (tmp);
else
@ -1405,7 +1401,7 @@ expand_asm_operands (string, outputs, inputs, clobbers, vol, filename, line)
the worst that happens if we get it wrong is we issue an error
message. */
c_len = TREE_STRING_LENGTH (TREE_PURPOSE (tail)) - 1;
c_len = strlen (TREE_STRING_POINTER (TREE_PURPOSE (tail)));
constraint = TREE_STRING_POINTER (TREE_PURPOSE (tail));
/* Allow the `=' or `+' to not be at the beginning of the string,
@ -1585,7 +1581,7 @@ expand_asm_operands (string, outputs, inputs, clobbers, vol, filename, line)
return;
}
c_len = TREE_STRING_LENGTH (TREE_PURPOSE (tail)) - 1;
c_len = strlen (TREE_STRING_POINTER (TREE_PURPOSE (tail)));
constraint = TREE_STRING_POINTER (TREE_PURPOSE (tail));
orig_constraint = constraint;
@ -1597,7 +1593,8 @@ expand_asm_operands (string, outputs, inputs, clobbers, vol, filename, line)
case '+': case '=': case '&':
if (constraint == orig_constraint)
{
error ("input operand constraint contains `%c'", constraint[j]);
error ("input operand constraint contains `%c'",
constraint[j]);
return;
}
break;
@ -1645,10 +1642,11 @@ expand_asm_operands (string, outputs, inputs, clobbers, vol, filename, line)
|| (j == 1 && c_len == 2 && constraint[0] == '%'))
{
tree o = outputs;
for (j = constraint[j] - '0'; j > 0; --j)
o = TREE_CHAIN (o);
c_len = TREE_STRING_LENGTH (TREE_PURPOSE (o)) - 1;
c_len = strlen (TREE_STRING_POINTER (TREE_PURPOSE (o)));
constraint = TREE_STRING_POINTER (TREE_PURPOSE (o));
j = 0;
break;
@ -1691,6 +1689,7 @@ expand_asm_operands (string, outputs, inputs, clobbers, vol, filename, line)
emit_move_insn (memloc, op);
op = memloc;
}
else if (GET_CODE (op) == MEM && MEM_VOLATILE_P (op))
/* We won't recognize volatile memory as an available
memory_operand at this point. Ignore it. */
@ -1711,8 +1710,8 @@ expand_asm_operands (string, outputs, inputs, clobbers, vol, filename, line)
i++;
}
/* Protect all the operands from the queue,
now that they have all been evaluated. */
/* Protect all the operands from the queue now that they have all been
evaluated. */
for (i = 0; i < ninputs - ninout; i++)
XVECEXP (body, 3, i) = protect_from_queue (XVECEXP (body, 3, i), 0);
@ -1741,20 +1740,24 @@ expand_asm_operands (string, outputs, inputs, clobbers, vol, filename, line)
XSTR (body, 1) = TREE_STRING_POINTER (TREE_PURPOSE (outputs));
insn = emit_insn (gen_rtx_SET (VOIDmode, output_rtx[0], body));
}
else if (noutputs == 0 && nclobbers == 0)
{
/* No output operands: put in a raw ASM_OPERANDS rtx. */
insn = emit_insn (body);
}
else
{
rtx obody = body;
int num = noutputs;
if (num == 0) num = 1;
if (num == 0)
num = 1;
body = gen_rtx_PARALLEL (VOIDmode, rtvec_alloc (num + nclobbers));
/* For each output operand, store a SET. */
for (i = 0, tail = outputs; tail; tail = TREE_CHAIN (tail), i++)
{
XVECEXP (body, 0, i)
@ -2692,19 +2695,25 @@ expand_value_return (val)
#ifdef PROMOTE_FUNCTION_RETURN
tree type = TREE_TYPE (DECL_RESULT (current_function_decl));
int unsignedp = TREE_UNSIGNED (type);
enum machine_mode old_mode
= DECL_MODE (DECL_RESULT (current_function_decl));
enum machine_mode mode
= promote_mode (type, DECL_MODE (DECL_RESULT (current_function_decl)),
&unsignedp, 1);
= promote_mode (type, old_mode, &unsignedp, 1);
if (GET_MODE (val) != VOIDmode && GET_MODE (val) != mode)
convert_move (return_reg, val, unsignedp);
else
if (mode != old_mode)
val = convert_modes (mode, old_mode, val, unsignedp);
#endif
if (GET_CODE (return_reg) == PARALLEL)
emit_group_load (return_reg, val, int_size_in_bytes (type),
TYPE_ALIGN (type) / BITS_PER_UNIT);
else
emit_move_insn (return_reg, val);
}
if (GET_CODE (return_reg) == REG
&& REGNO (return_reg) < FIRST_PSEUDO_REGISTER)
emit_insn (gen_rtx_USE (VOIDmode, return_reg));
/* Handle calls that return values in multiple non-contiguous locations.
The Irix 6 ABI has examples of this. */
else if (GET_CODE (return_reg) == PARALLEL)
@ -2789,6 +2798,7 @@ expand_return (retval)
run destructors on variables that might be used in the subsequent
computation of the return value. */
rtx last_insn = 0;
rtx result_rtl = DECL_RTL (DECL_RESULT (current_function_decl));
register rtx val = 0;
register rtx op0;
tree retval_rhs;
@ -2941,7 +2951,7 @@ expand_return (retval)
if (retval_rhs != 0
&& TYPE_MODE (TREE_TYPE (retval_rhs)) == BLKmode
&& GET_CODE (DECL_RTL (DECL_RESULT (current_function_decl))) == REG)
&& GET_CODE (result_rtl) == REG)
{
int i, bitpos, xbitpos;
int big_endian_correction = 0;
@ -3017,7 +3027,7 @@ expand_return (retval)
if (tmpmode == VOIDmode)
abort ();
PUT_MODE (DECL_RTL (DECL_RESULT (current_function_decl)), tmpmode);
PUT_MODE (result_rtl, tmpmode);
if (GET_MODE_SIZE (tmpmode) < GET_MODE_SIZE (word_mode))
result_reg_mode = word_mode;
@ -3038,10 +3048,13 @@ expand_return (retval)
else if (cleanups
&& retval_rhs != 0
&& TREE_TYPE (retval_rhs) != void_type_node
&& GET_CODE (DECL_RTL (DECL_RESULT (current_function_decl))) == REG)
&& (GET_CODE (result_rtl) == REG
|| (GET_CODE (result_rtl) == PARALLEL)))
{
/* Calculate the return value into a pseudo reg. */
val = gen_reg_rtx (DECL_MODE (DECL_RESULT (current_function_decl)));
/* Calculate the return value into a temporary (usually a pseudo
reg). */
val = assign_temp (TREE_TYPE (DECL_RESULT (current_function_decl)),
0, 0, 1);
val = expand_expr (retval_rhs, val, GET_MODE (val), 0);
val = force_not_mem (val);
emit_queue ();
@ -3054,7 +3067,7 @@ expand_return (retval)
calculate value into hard return reg. */
expand_expr (retval, const0_rtx, VOIDmode, 0);
emit_queue ();
expand_value_return (DECL_RTL (DECL_RESULT (current_function_decl)));
expand_value_return (result_rtl);
}
}
@ -4533,17 +4546,19 @@ pushcase (value, converter, label, duplicate)
if (index_type == error_mark_node)
return 0;
/* Convert VALUE to the type in which the comparisons are nominally done. */
if (value != 0)
value = (*converter) (nominal_type, value);
check_seenlabel ();
/* Fail if this value is out of range for the actual type of the index
(which may be narrower than NOMINAL_TYPE). */
if (value != 0 && ! int_fits_type_p (value, index_type))
if (value != 0
&& (TREE_CONSTANT_OVERFLOW (value)
|| ! int_fits_type_p (value, index_type)))
return 3;
/* Convert VALUE to the type in which the comparisons are nominally done. */
if (value != 0)
value = (*converter) (nominal_type, value);
/* Fail if this is a duplicate or overlaps another entry. */
if (value == 0)
{
@ -4606,17 +4621,14 @@ pushcase_range (value1, value2, converter, label, duplicate)
/* Fail if the range is empty. Do this before any conversion since
we want to allow out-of-range empty ranges. */
if (value2 && tree_int_cst_lt (value2, value1))
if (value2 != 0 && tree_int_cst_lt (value2, value1))
return 4;
value1 = (*converter) (nominal_type, value1);
/* If the max was unbounded, use the max of the nominal_type we are
converting to. Do this after the < check above to suppress false
positives. */
if (!value2)
if (value2 == 0)
value2 = TYPE_MAX_VALUE (nominal_type);
value2 = (*converter) (nominal_type, value2);
/* Fail if these values are out of range. */
if (TREE_CONSTANT_OVERFLOW (value1)
@ -4627,6 +4639,9 @@ pushcase_range (value1, value2, converter, label, duplicate)
|| ! int_fits_type_p (value2, index_type))
return 3;
value1 = (*converter) (nominal_type, value1);
value2 = (*converter) (nominal_type, value2);
return add_case_node (value1, value2, label, duplicate);
}


@ -633,6 +633,10 @@ int flag_exceptions;
int flag_new_exceptions = 1;
/* Nonzero means generate frame unwind info table when supported */
int flag_unwind_tables = 0;
/* Nonzero means don't place uninitialized global data in common storage
by default. */
@ -937,6 +941,8 @@ lang_independent_options f_options[] =
"Enable exception handling" },
{"new-exceptions", &flag_new_exceptions, 1,
"Use the new model for exception handling" },
{"unwind-tables", &flag_unwind_tables, 1,
"Just generate unwind tables for exception handling" },
{"sjlj-exceptions", &exceptions_via_longjmp, 1,
"Use setjmp/longjmp to handle exceptions" },
{"asynchronous-exceptions", &asynchronous_exceptions, 1,
@ -3571,8 +3577,21 @@ rest_of_compilation (decl)
if (DECL_SAVED_INSNS (decl) == 0)
{
int inlinable = 0;
tree parent;
const char *lose;
/* If this is nested inside an inlined external function, pretend
it was only declared. Since we cannot inline such functions,
generating code for this one is not only not necessary but will
confuse some debugging output writers. */
for (parent = DECL_CONTEXT (current_function_decl);
parent != 0; parent = DECL_CONTEXT (parent))
if (DECL_INLINE (parent) && DECL_EXTERNAL (parent))
{
DECL_INITIAL (decl) = 0;
goto exit_rest_of_compilation;
}
/* If requested, consider whether to make this function inline. */
if (DECL_INLINE (decl) || flag_inline_functions)
TIMEVAR (integration_time,
@ -4929,6 +4948,7 @@ decode_W_option (arg)
/* Parse a -g... command line switch. ARG is the value after the -g.
It is safe to access 'ARG - 2' to generate the full switch name.
Return the number of strings consumed. */
static int
decode_g_option (arg)
const char * arg;
@ -5012,8 +5032,7 @@ ignoring option `%s' due to invalid debug level specification",
}
if (type == NO_DEBUG)
warning ("`%s' not supported by this configuration of GCC",
arg - 2);
warning ("`%s': unknown or unsupported -g option", arg - 2);
/* Does it conflict with an already selected type? */
if (type_explicitly_set_p
@ -5046,7 +5065,7 @@ ignoring option `%s' due to invalid debug level specification",
}
if (! da->arg)
warning ("`%s' not supported by this configuration of GCC", arg - 2);
warning ("`%s': unknown or unsupported -g option", arg - 2);
return 1;
}


@ -3359,12 +3359,9 @@ build_type_attribute_variant (ttype, attribute)
if ( ! attribute_list_equal (TYPE_ATTRIBUTES (ttype), attribute))
{
register int hashcode;
register struct obstack *ambient_obstack = current_obstack;
tree ntype;
if (ambient_obstack != &permanent_obstack)
current_obstack = TYPE_OBSTACK (ttype);
push_obstacks (TYPE_OBSTACK (ttype), TYPE_OBSTACK (ttype));
ntype = copy_node (ttype);
TYPE_POINTER_TO (ntype) = 0;
@ -3400,12 +3397,7 @@ build_type_attribute_variant (ttype, attribute)
ntype = type_hash_canon (hashcode, ntype);
ttype = build_qualified_type (ntype, TYPE_QUALS (ttype));
/* We must restore the current obstack after the type_hash_canon call,
because type_hash_canon calls type_hash_add for permanent types, and
then type_hash_add calls oballoc expecting to get something permanent
back. */
current_obstack = ambient_obstack;
pop_obstacks ();
}
return ttype;


@ -402,6 +402,18 @@ extern void tree_class_check_failed PROTO((const tree, char,
== TYPE_MODE (TREE_TYPE (TREE_OPERAND (EXP, 0))))) \
(EXP) = TREE_OPERAND (EXP, 0);
/* Like STRIP_NOPS, but don't let the signedness change either. */
#define STRIP_SIGN_NOPS(EXP) \
while ((TREE_CODE (EXP) == NOP_EXPR \
|| TREE_CODE (EXP) == CONVERT_EXPR \
|| TREE_CODE (EXP) == NON_LVALUE_EXPR) \
&& (TYPE_MODE (TREE_TYPE (EXP)) \
== TYPE_MODE (TREE_TYPE (TREE_OPERAND (EXP, 0)))) \
&& (TREE_UNSIGNED (TREE_TYPE (EXP)) \
== TREE_UNSIGNED (TREE_TYPE (TREE_OPERAND (EXP, 0))))) \
(EXP) = TREE_OPERAND (EXP, 0);
/* Like STRIP_NOPS, but don't alter the TREE_TYPE either. */
#define STRIP_TYPE_NOPS(EXP) \
@ -861,9 +873,9 @@ struct tree_block
/* The set of type qualifiers for this type. */
#define TYPE_QUALS(NODE) \
((TYPE_READONLY(NODE) * TYPE_QUAL_CONST) | \
(TYPE_VOLATILE(NODE) * TYPE_QUAL_VOLATILE) | \
(TYPE_RESTRICT(NODE) * TYPE_QUAL_RESTRICT))
((TYPE_READONLY(NODE) * TYPE_QUAL_CONST) \
| (TYPE_VOLATILE(NODE) * TYPE_QUAL_VOLATILE) \
| (TYPE_RESTRICT(NODE) * TYPE_QUAL_RESTRICT))
/* These flags are available for each language front end to use internally. */
#define TYPE_LANG_FLAG_0(NODE) (TYPE_CHECK (NODE)->type.lang_flag_0)


@ -1616,7 +1616,7 @@ initial_reg_note_copy (notes, map)
PUT_MODE (copy, GET_MODE (notes));
if (GET_CODE (notes) == EXPR_LIST)
XEXP (copy, 0) = copy_rtx_and_substitute (XEXP (notes, 0), map);
XEXP (copy, 0) = copy_rtx_and_substitute (XEXP (notes, 0), map, 0);
else if (GET_CODE (notes) == INSN_LIST)
/* Don't substitute for these yet. */
XEXP (copy, 0) = XEXP (notes, 0);
@ -1927,7 +1927,7 @@ copy_loop_body (copy_start, copy_end, map, exit_label, last_iteration,
}
else
{
pattern = copy_rtx_and_substitute (pattern, map);
pattern = copy_rtx_and_substitute (pattern, map, 0);
copy = emit_insn (pattern);
}
REG_NOTES (copy) = initial_reg_note_copy (REG_NOTES (insn), map);
@ -1974,7 +1974,7 @@ copy_loop_body (copy_start, copy_end, map, exit_label, last_iteration,
break;
case JUMP_INSN:
pattern = copy_rtx_and_substitute (PATTERN (insn), map);
pattern = copy_rtx_and_substitute (PATTERN (insn), map, 0);
copy = emit_jump_insn (pattern);
REG_NOTES (copy) = initial_reg_note_copy (REG_NOTES (insn), map);
@ -2107,14 +2107,15 @@ copy_loop_body (copy_start, copy_end, map, exit_label, last_iteration,
break;
case CALL_INSN:
pattern = copy_rtx_and_substitute (PATTERN (insn), map);
pattern = copy_rtx_and_substitute (PATTERN (insn), map, 0);
copy = emit_call_insn (pattern);
REG_NOTES (copy) = initial_reg_note_copy (REG_NOTES (insn), map);
/* Because the USAGE information potentially contains objects other
than hard registers, we need to copy it. */
CALL_INSN_FUNCTION_USAGE (copy)
= copy_rtx_and_substitute (CALL_INSN_FUNCTION_USAGE (insn), map);
= copy_rtx_and_substitute (CALL_INSN_FUNCTION_USAGE (insn),
map, 0);
#ifdef HAVE_cc0
if (cc0_insn)


@ -2343,6 +2343,7 @@ struct constant_descriptor
{
struct constant_descriptor *next;
char *label;
rtx rtl;
char contents[1];
};
@ -2361,6 +2362,7 @@ mark_const_hash_entry (ptr)
while (desc)
{
ggc_mark_string (desc->label);
ggc_mark_rtx (desc->rtl);
desc = desc->next;
}
}
@ -2576,6 +2578,7 @@ compare_constant_1 (exp, p)
register tree link;
int length = list_length (CONSTRUCTOR_ELTS (exp));
tree type;
enum machine_mode mode = TYPE_MODE (TREE_TYPE (exp));
int have_purpose = 0;
for (link = CONSTRUCTOR_ELTS (exp); link; link = TREE_CHAIN (link))
@ -2599,6 +2602,14 @@ compare_constant_1 (exp, p)
if (bcmp ((char *) &type, p, sizeof type))
return 0;
if (TREE_CODE (TREE_TYPE (exp)) == ARRAY_TYPE)
{
if (bcmp ((char *) &mode, p, sizeof mode))
return 0;
p += sizeof mode;
}
p += sizeof type;
if (bcmp ((char *) &have_purpose, p, sizeof have_purpose))
@ -2715,12 +2726,14 @@ record_constant (exp)
{
struct constant_descriptor *next = 0;
char *label = 0;
rtx rtl = 0;
/* Make a struct constant_descriptor. The first two pointers will
/* Make a struct constant_descriptor. The first three pointers will
be filled in later. Here we just leave space for them. */
obstack_grow (&permanent_obstack, (char *) &next, sizeof next);
obstack_grow (&permanent_obstack, (char *) &label, sizeof label);
obstack_grow (&permanent_obstack, (char *) &rtl, sizeof rtl);
record_constant_1 (exp);
return (struct constant_descriptor *) obstack_finish (&permanent_obstack);
}
@ -2785,6 +2798,7 @@ record_constant_1 (exp)
{
register tree link;
int length = list_length (CONSTRUCTOR_ELTS (exp));
enum machine_mode mode = TYPE_MODE (TREE_TYPE (exp));
tree type;
int have_purpose = 0;
@ -2795,14 +2809,18 @@ record_constant_1 (exp)
obstack_grow (&permanent_obstack, (char *) &length, sizeof length);
/* For record constructors, insist that the types match.
For arrays, just verify both constructors are for arrays.
Then insist that either both or none have any TREE_PURPOSE
values. */
For arrays, just verify both constructors are for arrays
of the same mode. Then insist that either both or none
have any TREE_PURPOSE values. */
if (TREE_CODE (TREE_TYPE (exp)) == RECORD_TYPE)
type = TREE_TYPE (exp);
else
type = 0;
obstack_grow (&permanent_obstack, (char *) &type, sizeof type);
if (TREE_CODE (TREE_TYPE (exp)) == ARRAY_TYPE)
obstack_grow (&permanent_obstack, &mode, sizeof mode);
obstack_grow (&permanent_obstack, (char *) &have_purpose,
sizeof have_purpose);
@ -3027,9 +3045,8 @@ output_constant_def (exp)
register int hash;
register struct constant_descriptor *desc;
char label[256];
char *found = 0;
int reloc;
register rtx def;
int found = 1;
if (TREE_CST_RTL (exp))
return TREE_CST_RTL (exp);
@ -3047,12 +3064,9 @@ output_constant_def (exp)
for (desc = const_hash_table[hash]; desc; desc = desc->next)
if (compare_constant (exp, desc))
{
found = desc->label;
break;
}
break;
if (found == 0)
if (desc == 0)
{
/* No constant equal to EXP is known to have been output.
Make a constant descriptor to enter EXP in the hash table.
@@ -3066,23 +3080,30 @@ output_constant_def (exp)
       desc->next = const_hash_table[hash];
       desc->label = ggc_alloc_string (label, -1);
       const_hash_table[hash] = desc;
-    }
 
-  /* We have a symbol name; construct the SYMBOL_REF and the MEM.  */
+      /* We have a symbol name; construct the SYMBOL_REF and the MEM
+	 in the permanent obstack.  We could also construct this in the
+	 obstack of EXP and put it into TREE_CST_RTL, but we have no way
+	 of knowing what obstack it is (e.g., it might be in a function
+	 obstack of a function we are nested inside).  */
 
-  push_obstacks_nochange ();
-  if (TREE_PERMANENT (exp))
-    end_temporary_allocation ();
+      push_obstacks_nochange ();
+      end_temporary_allocation ();
 
-  def = gen_rtx_SYMBOL_REF (Pmode, desc->label);
-  TREE_CST_RTL (exp)
-    = gen_rtx_MEM (TYPE_MODE (TREE_TYPE (exp)), def);
-  RTX_UNCHANGING_P (TREE_CST_RTL (exp)) = 1;
-  if (AGGREGATE_TYPE_P (TREE_TYPE (exp)))
-    MEM_SET_IN_STRUCT_P (TREE_CST_RTL (exp), 1);
+      desc->rtl
+	= gen_rtx_MEM (TYPE_MODE (TREE_TYPE (exp)),
+		       gen_rtx_SYMBOL_REF (Pmode, desc->label));
 
-  pop_obstacks ();
+      RTX_UNCHANGING_P (desc->rtl) = 1;
+      if (AGGREGATE_TYPE_P (TREE_TYPE (exp)))
+	MEM_SET_IN_STRUCT_P (desc->rtl, 1);
+
+      pop_obstacks ();
+
+      found = 0;
+    }
+
+  TREE_CST_RTL (exp) = desc->rtl;
 
   /* Optionally set flags or add text to the name to record information
      such as that it is a function name.  If the name is changed, the macro
@@ -3093,7 +3114,7 @@ output_constant_def (exp)
 
   /* If this is the first time we've seen this particular constant,
      output it (or defer its output for later).  */
-  if (found == 0)
+  if (! found)
     {
       int after_function = 0;
@@ -3482,6 +3503,7 @@ record_constant_rtx (mode, x)
 {
   struct constant_descriptor *ptr;
   char *label;
+  rtx rtl;
   struct rtx_const value;
 
   decode_rtx_const (mode, x, &value);
@@ -3491,6 +3513,7 @@ record_constant_rtx (mode, x)
      memory allocated from function_obstack (current_obstack).  */
   obstack_grow (saveable_obstack, &ptr, sizeof ptr);
   obstack_grow (saveable_obstack, &label, sizeof label);
+  obstack_grow (saveable_obstack, &rtl, sizeof rtl);
 
   /* Record constant contents.  */
   obstack_grow (saveable_obstack, &value, sizeof value);
@@ -3988,6 +4011,161 @@ output_addressed_constants (exp)
   return reloc;
 }
 
+/* Return nonzero if VALUE is a valid constant-valued expression
+   for use in initializing a static variable; one that can be an
+   element of a "constant" initializer.
+
+   Return null_pointer_node if the value is absolute;
+   if it is relocatable, return the variable that determines the relocation.
+   We assume that VALUE has been folded as much as possible;
+   therefore, we do not need to check for such things as
+   arithmetic-combinations of integers.  */
+
+tree
+initializer_constant_valid_p (value, endtype)
+     tree value;
+     tree endtype;
+{
+  switch (TREE_CODE (value))
+    {
+    case CONSTRUCTOR:
+      if ((TREE_CODE (TREE_TYPE (value)) == UNION_TYPE
+	   || TREE_CODE (TREE_TYPE (value)) == RECORD_TYPE)
+	  && TREE_CONSTANT (value)
+	  && CONSTRUCTOR_ELTS (value))
+	return
+	  initializer_constant_valid_p (TREE_VALUE (CONSTRUCTOR_ELTS (value)),
+					endtype);
+
+      return TREE_STATIC (value) ? null_pointer_node : 0;
+
+    case INTEGER_CST:
+    case REAL_CST:
+    case STRING_CST:
+    case COMPLEX_CST:
+      return null_pointer_node;
+
+    case ADDR_EXPR:
+      return TREE_OPERAND (value, 0);
+
+    case NON_LVALUE_EXPR:
+      return initializer_constant_valid_p (TREE_OPERAND (value, 0), endtype);
+
+    case CONVERT_EXPR:
+    case NOP_EXPR:
+      /* Allow conversions between pointer types.  */
+      if (POINTER_TYPE_P (TREE_TYPE (value))
+	  && POINTER_TYPE_P (TREE_TYPE (TREE_OPERAND (value, 0))))
+	return initializer_constant_valid_p (TREE_OPERAND (value, 0), endtype);
+
+      /* Allow conversions between real types.  */
+      if (FLOAT_TYPE_P (TREE_TYPE (value))
+	  && FLOAT_TYPE_P (TREE_TYPE (TREE_OPERAND (value, 0))))
+	return initializer_constant_valid_p (TREE_OPERAND (value, 0), endtype);
+
+      /* Allow length-preserving conversions between integer types.  */
+      if (INTEGRAL_TYPE_P (TREE_TYPE (value))
+	  && INTEGRAL_TYPE_P (TREE_TYPE (TREE_OPERAND (value, 0)))
+	  && (TYPE_PRECISION (TREE_TYPE (value))
+	      == TYPE_PRECISION (TREE_TYPE (TREE_OPERAND (value, 0)))))
+	return initializer_constant_valid_p (TREE_OPERAND (value, 0), endtype);
+
+      /* Allow conversions between other integer types only if
+	 explicit value.  */
+      if (INTEGRAL_TYPE_P (TREE_TYPE (value))
+	  && INTEGRAL_TYPE_P (TREE_TYPE (TREE_OPERAND (value, 0))))
+	{
+	  tree inner = initializer_constant_valid_p (TREE_OPERAND (value, 0),
+						     endtype);
+	  if (inner == null_pointer_node)
+	    return null_pointer_node;
+	  break;
+	}
+
+      /* Allow (int) &foo provided int is as wide as a pointer.  */
+      if (INTEGRAL_TYPE_P (TREE_TYPE (value))
+	  && POINTER_TYPE_P (TREE_TYPE (TREE_OPERAND (value, 0)))
+	  && (TYPE_PRECISION (TREE_TYPE (value))
+	      >= TYPE_PRECISION (TREE_TYPE (TREE_OPERAND (value, 0)))))
+	return initializer_constant_valid_p (TREE_OPERAND (value, 0),
+					     endtype);
+
+      /* Likewise conversions from int to pointers, but also allow
+	 conversions from 0.  */
+      if (POINTER_TYPE_P (TREE_TYPE (value))
+	  && INTEGRAL_TYPE_P (TREE_TYPE (TREE_OPERAND (value, 0))))
+	{
+	  if (integer_zerop (TREE_OPERAND (value, 0)))
+	    return null_pointer_node;
+	  else if (TYPE_PRECISION (TREE_TYPE (value))
+		   <= TYPE_PRECISION (TREE_TYPE (TREE_OPERAND (value, 0))))
+	    return initializer_constant_valid_p (TREE_OPERAND (value, 0),
+						 endtype);
+	}
+
+      /* Allow conversions to union types if the value inside is okay.  */
+      if (TREE_CODE (TREE_TYPE (value)) == UNION_TYPE)
+	return initializer_constant_valid_p (TREE_OPERAND (value, 0),
+					     endtype);
+      break;
+
+    case PLUS_EXPR:
+      if (! INTEGRAL_TYPE_P (endtype)
+	  || TYPE_PRECISION (endtype) >= POINTER_SIZE)
+	{
+	  tree valid0 = initializer_constant_valid_p (TREE_OPERAND (value, 0),
+						      endtype);
+	  tree valid1 = initializer_constant_valid_p (TREE_OPERAND (value, 1),
+						      endtype);
+	  /* If either term is absolute, use the other term's relocation.  */
+	  if (valid0 == null_pointer_node)
+	    return valid1;
+	  if (valid1 == null_pointer_node)
+	    return valid0;
+	}
+      break;
+
+    case MINUS_EXPR:
+      if (! INTEGRAL_TYPE_P (endtype)
+	  || TYPE_PRECISION (endtype) >= POINTER_SIZE)
+	{
+	  tree valid0 = initializer_constant_valid_p (TREE_OPERAND (value, 0),
+						      endtype);
+	  tree valid1 = initializer_constant_valid_p (TREE_OPERAND (value, 1),
+						      endtype);
+	  /* Win if second argument is absolute.  */
+	  if (valid1 == null_pointer_node)
+	    return valid0;
+	  /* Win if both arguments have the same relocation.
+	     Then the value is absolute.  */
+	  if (valid0 == valid1 && valid0 != 0)
+	    return null_pointer_node;
+	}
+
+      /* Support differences between labels.  */
+      if (INTEGRAL_TYPE_P (endtype))
+	{
+	  tree op0, op1;
+	  op0 = TREE_OPERAND (value, 0);
+	  op1 = TREE_OPERAND (value, 1);
+	  STRIP_NOPS (op0);
+	  STRIP_NOPS (op1);
+
+	  if (TREE_CODE (op0) == ADDR_EXPR
+	      && TREE_CODE (TREE_OPERAND (op0, 0)) == LABEL_DECL
+	      && TREE_CODE (op1) == ADDR_EXPR
+	      && TREE_CODE (TREE_OPERAND (op1, 0)) == LABEL_DECL)
+	    return null_pointer_node;
+	}
+      break;
+
+    default:
+      break;
+    }
+
+  return 0;
+}
+
 /* Output assembler code for constant EXP to FILE, with no label.
    This includes the pseudo-op such as ".int" or ".byte", and a newline.
    Assumes output_addressed_constants has been done on EXP already.
@@ -4608,7 +4786,7 @@ make_decl_one_only (decl)
 void
 init_varasm_once ()
 {
-  ggc_add_root (const_hash_table, MAX_HASH_TABLE, sizeof(const_hash_table[0]),
+  ggc_add_root (const_hash_table, MAX_HASH_TABLE, sizeof const_hash_table[0],
		mark_const_hash_entry);
   ggc_add_string_root (&in_named_name, 1);
 }