From: Jerin Jacob
Subject: [PATCH v4 14/29] eal/arm64: change barrier definitions to macros
Date: Tue, 17 Jan 2017 12:43:49 +0530
Message-ID: <1484637244-7548-15-git-send-email-jerin.jacob@caviumnetworks.com>
In-Reply-To: <1484637244-7548-1-git-send-email-jerin.jacob@caviumnetworks.com>
References: <1484212646-10338-1-git-send-email-jerin.jacob@caviumnetworks.com> <1484637244-7548-1-git-send-email-jerin.jacob@caviumnetworks.com>
List-Id: DPDK patches and discussions

Change the rte_mb, rte_wmb and rte_rmb definitions to macros in order to
keep them consistent with the other barrier definitions in the file.

Suggested-by: Jianbo Liu
Signed-off-by: Jerin Jacob
---
 .../common/include/arch/arm/rte_atomic_64.h | 36 ++--------------------
 1 file changed, 3 insertions(+), 33 deletions(-)

diff --git a/lib/librte_eal/common/include/arch/arm/rte_atomic_64.h b/lib/librte_eal/common/include/arch/arm/rte_atomic_64.h
index ef0efc7..dc3a0f3 100644
--- a/lib/librte_eal/common/include/arch/arm/rte_atomic_64.h
+++ b/lib/librte_eal/common/include/arch/arm/rte_atomic_64.h
@@ -46,41 +46,11 @@ extern "C" {
 #define dsb(opt) { asm volatile("dsb " #opt : : : "memory"); }
 #define dmb(opt) { asm volatile("dmb " #opt : : : "memory"); }
 
-/**
- * General memory barrier.
- *
- * Guarantees that the LOAD and STORE operations generated before the
- * barrier occur before the LOAD and STORE operations generated after.
- * This function is architecture dependent.
- */
-static inline void rte_mb(void)
-{
-	dsb(sy);
-}
+#define rte_mb() dsb(sy)
 
-/**
- * Write memory barrier.
- *
- * Guarantees that the STORE operations generated before the barrier
- * occur before the STORE operations generated after.
- * This function is architecture dependent.
- */
-static inline void rte_wmb(void)
-{
-	dsb(st);
-}
+#define rte_wmb() dsb(st)
 
-/**
- * Read memory barrier.
- *
- * Guarantees that the LOAD operations generated before the barrier
- * occur before the LOAD operations generated after.
- * This function is architecture dependent.
- */
-static inline void rte_rmb(void)
-{
-	dsb(ld);
-}
+#define rte_rmb() dsb(ld)
 
 #define rte_smp_mb() dmb(ish)
-- 
2.5.5
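
For context (not part of the patch): a minimal sketch of how the barriers
defined above are typically used in a single-producer/single-consumer
handoff. The msg structure and the producer()/consumer() helpers below are
hypothetical; only rte_wmb() and rte_rmb() come from this header.

#include <stdint.h>
#include <rte_atomic.h>

struct msg {                     /* hypothetical shared buffer */
	uint64_t payload;
	volatile uint32_t ready;
};

static void producer(struct msg *m, uint64_t value)
{
	m->payload = value;      /* write the data ...                    */
	rte_wmb();               /* ... and make the store visible before */
	m->ready = 1;            /* the flag that publishes it            */
}

static uint64_t consumer(struct msg *m)
{
	while (m->ready == 0)
		;                /* spin until the flag is observed       */
	rte_rmb();               /* order the flag load before the data   */
	return m->payload;
}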