linux/arch/arm64/lib
Mark Rutland c47d6a04e6 arm64: klib: bitops: fix unpredictable stxr usage
We're currently relying on unpredictable behaviour in our testops
(test_and_*_bit), as stxr is unpredictable when the status register and
the source register are the same.

This patch reallocates the status register so as to bring us back into
the realm of predictable behaviour. Boot tested on an AEMv8 model.
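For illustration, a minimal sketch of the problematic pattern and the fix;
the register choices below are illustrative and not the exact hunk from
bitops.S:

    // Before: w2 serves as the store-exclusive status register while
    // x2 is the data being stored, so status and source overlap,
    // which the architecture leaves UNPREDICTABLE.
    1:	ldxr	x2, [x1]
    	orr	x2, x2, x4
    	stxr	w2, x2, [x1]
    	cbnz	w2, 1b

    // After: a separate register (w5 here) holds only the status.
    2:	ldxr	x2, [x1]
    	orr	x2, x2, x4
    	stxr	w5, x2, [x1]
    	cbnz	w5, 2b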

Signed-off-by: Mark Rutland <mark.rutland@arm.com>
Signed-off-by: Catalin Marinas <catalin.marinas@arm.com>
2013-04-30 15:53:01 +01:00
bitops.S arm64: klib: bitops: fix unpredictable stxr usage 2013-04-30 15:53:01 +01:00
clear_page.S
clear_user.S
copy_from_user.S
copy_in_user.S
copy_page.S
copy_to_user.S
delay.c
Makefile arm64: klib: Optimised string functions 2013-03-21 17:39:30 +00:00
memchr.S arm64: klib: Optimised memory functions 2013-03-21 17:39:29 +00:00
memcpy.S arm64: klib: Optimised memory functions 2013-03-21 17:39:29 +00:00
memmove.S arm64: klib: Optimised memory functions 2013-03-21 17:39:29 +00:00
memset.S arm64: klib: Optimised memory functions 2013-03-21 17:39:29 +00:00
strchr.S arm64: klib: Optimised string functions 2013-03-21 17:39:30 +00:00
strncpy_from_user.S
strnlen_user.S
strrchr.S arm64: klib: Optimised string functions 2013-03-21 17:39:30 +00:00