linux-crypto.vger.kernel.org archive mirror
* [v2 PATCH 00/11] crypto: lib - Add partial block helper
@ 2025-04-27  0:59 Herbert Xu
  2025-04-27  0:59 ` [v2 PATCH 01/11] crypto: lib/sha256 - Move partial block handling out Herbert Xu
                   ` (10 more replies)
  0 siblings, 11 replies; 13+ messages in thread
From: Herbert Xu @ 2025-04-27  0:59 UTC (permalink / raw)
  To: Linux Crypto Mailing List

v2:
- Remove the polyval patches.
- Rename one-block key to raw_key.
- Rename poly1305_block to poly1305_blocks.
- Fix the arch/generic if clause in poly1305_blocks.
- Rewrite crypto/chacha20poly1305 to use lib/crypto poly1305.
- Remove shash poly1305.

This is based on

	https://patchwork.kernel.org/project/linux-crypto/list/?series=955753
	https://patchwork.kernel.org/project/linux-crypto/list/?series=957401

This series introduces a partial block helper for lib/crypto hash
algorithms based on the one from sha256_base.

It then uses the helper in poly1305 to eliminate duplication between
architectures.  Instead of each architecture providing a complete
update function, it now only needs to provide a block function; the
partial block handling is done by the generic library layer.
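
Concretely, each architecture then only needs to export the following
three entry points (these are the declarations added in patch 2); the
buffering of partial blocks lives entirely in lib/crypto:

	void poly1305_block_init_arch(struct poly1305_block_state *state,
				      const u8 raw_key[POLY1305_BLOCK_SIZE]);
	void poly1305_blocks_arch(struct poly1305_block_state *state,
				  const u8 *src, unsigned int len, u32 padbit);
	void poly1305_emit_arch(const struct poly1305_state *state,
				u8 digest[POLY1305_DIGEST_SIZE],
				const u32 nonce[4]);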

The poly1305 implementation was anomalous due to the inability
to call setkey in softirq context.  It also has just a single user,
chacha20poly1305, which is hard-coded to use poly1305.  Replace the
gratuitous use of ahash in chacha20poly1305 with the lib/crypto
poly1305 interface instead.

This then allows the shash poly1305 to be removed.

Herbert Xu (11):
  crypto: lib/sha256 - Move partial block handling out
  crypto: lib/poly1305 - Add block-only interface
  crypto: arm/poly1305 - Add block-only interface
  crypto: arm64/poly1305 - Add block-only interface
  crypto: mips/poly1305 - Add block-only interface
  crypto: powerpc/poly1305 - Add block-only interface
  crypto: x86/poly1305 - Add block-only interface
  crypto: chacha20poly1305 - Use lib/crypto poly1305
  crypto: testmgr - Remove poly1305
  crypto: poly1305 - Remove algorithm
  crypto: lib/poly1305 - Use block-only interface

 arch/arm/lib/crypto/poly1305-armv4.pl         |   4 +-
 arch/arm/lib/crypto/poly1305-glue.c           | 113 ++----
 arch/arm64/lib/crypto/Makefile                |   3 +-
 arch/arm64/lib/crypto/poly1305-glue.c         | 105 ++----
 arch/mips/lib/crypto/poly1305-glue.c          |  75 +---
 arch/mips/lib/crypto/poly1305-mips.pl         |  12 +-
 arch/powerpc/lib/crypto/poly1305-p10-glue.c   | 109 ++----
 .../lib/crypto/poly1305-x86_64-cryptogams.pl  |  33 +-
 arch/x86/lib/crypto/poly1305_glue.c           | 169 +++------
 crypto/Kconfig                                |  14 +-
 crypto/Makefile                               |   2 -
 crypto/chacha20poly1305.c                     | 323 ++++--------------
 crypto/poly1305.c                             | 152 ---------
 crypto/testmgr.c                              |   6 -
 crypto/testmgr.h                              | 288 ----------------
 include/crypto/internal/blockhash.h           |  52 +++
 include/crypto/internal/poly1305.h            |  28 +-
 include/crypto/poly1305.h                     |  60 +---
 include/crypto/sha2.h                         |   9 +-
 include/crypto/sha256_base.h                  |  38 +--
 lib/crypto/poly1305.c                         |  83 ++---
 21 files changed, 396 insertions(+), 1282 deletions(-)
 delete mode 100644 crypto/poly1305.c
 create mode 100644 include/crypto/internal/blockhash.h

-- 
2.39.5


^ permalink raw reply	[flat|nested] 13+ messages in thread

* [v2 PATCH 01/11] crypto: lib/sha256 - Move partial block handling out
  2025-04-27  0:59 [v2 PATCH 00/11] crypto: lib - Add partial block helper Herbert Xu
@ 2025-04-27  0:59 ` Herbert Xu
  2025-04-27  1:24   ` Eric Biggers
  2025-04-27  1:00 ` [v2 PATCH 02/11] crypto: lib/poly1305 - Add block-only interface Herbert Xu
                   ` (9 subsequent siblings)
  10 siblings, 1 reply; 13+ messages in thread
From: Herbert Xu @ 2025-04-27  0:59 UTC (permalink / raw)
  To: Linux Crypto Mailing List

Extract the common partial block handling into a helper macro
that can be reused by other library code.

Also delete the unused sha256_base_do_finalize function.
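
For illustration only (not part of this patch, hypothetical toy_*
names): a library hash with a 64-byte block size would drive the
helper from <crypto/internal/blockhash.h> roughly as follows.  The
helper feeds whole blocks to the block function and carries any
remainder in buf/buflen across calls:

	struct toy_state {
		u32 h[8];
	};

	struct toy_ctx {
		struct toy_state st;
		u8 buf[64];
		unsigned int buflen;
	};

	static void toy_block(struct toy_state *state, const u8 *data,
			      int blocks)
	{
		/* compress 'blocks' consecutive 64-byte blocks into *state */
	}

	static void toy_update(struct toy_ctx *ctx, const u8 *data,
			       unsigned int len)
	{
		ctx->buflen = BLOCK_HASH_UPDATE_BLOCKS(&toy_block, &ctx->st,
						       data, len, 64,
						       ctx->buf, ctx->buflen);
	}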

Signed-off-by: Herbert Xu <herbert@gondor.apana.org.au>
---
 include/crypto/internal/blockhash.h | 52 +++++++++++++++++++++++++++++
 include/crypto/sha2.h               |  9 +++--
 include/crypto/sha256_base.h        | 38 ++-------------------
 3 files changed, 62 insertions(+), 37 deletions(-)
 create mode 100644 include/crypto/internal/blockhash.h

diff --git a/include/crypto/internal/blockhash.h b/include/crypto/internal/blockhash.h
new file mode 100644
index 000000000000..4184e2337d68
--- /dev/null
+++ b/include/crypto/internal/blockhash.h
@@ -0,0 +1,52 @@
+/* SPDX-License-Identifier: GPL-2.0-or-later */
+/*
+ * Handle partial blocks for block hash.
+ *
+ * Copyright (c) 2015 Linaro Ltd <ard.biesheuvel@linaro.org>
+ * Copyright (c) 2025 Herbert Xu <herbert@gondor.apana.org.au>
+ */
+
+#ifndef _CRYPTO_INTERNAL_BLOCKHASH_H
+#define _CRYPTO_INTERNAL_BLOCKHASH_H
+
+#include <linux/string.h>
+#include <linux/types.h>
+
+#define BLOCK_HASH_UPDATE_BASE(block, state, src, nbytes, bs, dv, buf,	\
+			       buflen)					\
+	({								\
+		unsigned int _nbytes = (nbytes);			\
+		unsigned int _buflen = (buflen);			\
+		typeof(block) _block = (block);				\
+		typeof(state) _state = (state); 			\
+		unsigned int _bs = (bs);				\
+		unsigned int _dv = (dv);				\
+		const u8 *_src = (src);					\
+		u8 *_buf = (buf);					\
+		while ((_buflen + _nbytes) >= _bs) {			\
+			unsigned int len = _nbytes;			\
+			const u8 *data = _src;				\
+			int blocks, remain;				\
+			if (_buflen) {					\
+				remain = _bs - _buflen;			\
+				memcpy(_buf + _buflen, _src, remain);	\
+				data = _buf;				\
+				len = _bs;				\
+			}						\
+			remain = len % _bs;				\
+			blocks = (len - remain) / _dv;			\
+			_block(_state, data, blocks);			\
+			_src += len - remain - _buflen;			\
+			_nbytes -= len - remain - _buflen;		\
+			_buflen = 0;					\
+		}							\
+		memcpy(_buf + _buflen, _src, _nbytes);			\
+		_buflen += _nbytes;					\
+	})
+
+#define BLOCK_HASH_UPDATE(block, state, src, nbytes, bs, buf, buflen) \
+	BLOCK_HASH_UPDATE_BASE(block, state, src, nbytes, bs, 1, buf, buflen)
+#define BLOCK_HASH_UPDATE_BLOCKS(block, state, src, nbytes, bs, buf, buflen) \
+	BLOCK_HASH_UPDATE_BASE(block, state, src, nbytes, bs, bs, buf, buflen)
+
+#endif	/* _CRYPTO_INTERNAL_BLOCKHASH_H */
diff --git a/include/crypto/sha2.h b/include/crypto/sha2.h
index abbd882f7849..f873c2207b1e 100644
--- a/include/crypto/sha2.h
+++ b/include/crypto/sha2.h
@@ -71,8 +71,13 @@ struct crypto_sha256_state {
 };
 
 struct sha256_state {
-	u32 state[SHA256_DIGEST_SIZE / 4];
-	u64 count;
+	union {
+		struct crypto_sha256_state ctx;
+		struct {
+			u32 state[SHA256_DIGEST_SIZE / 4];
+			u64 count;
+		};
+	};
 	u8 buf[SHA256_BLOCK_SIZE];
 };
 
diff --git a/include/crypto/sha256_base.h b/include/crypto/sha256_base.h
index 08cd5e41d4fd..9f284bed5a51 100644
--- a/include/crypto/sha256_base.h
+++ b/include/crypto/sha256_base.h
@@ -8,6 +8,7 @@
 #ifndef _CRYPTO_SHA256_BASE_H
 #define _CRYPTO_SHA256_BASE_H
 
+#include <crypto/internal/blockhash.h>
 #include <crypto/internal/hash.h>
 #include <crypto/sha2.h>
 #include <linux/math.h>
@@ -40,35 +41,10 @@ static inline int lib_sha256_base_do_update(struct sha256_state *sctx,
 					    sha256_block_fn *block_fn)
 {
 	unsigned int partial = sctx->count % SHA256_BLOCK_SIZE;
-	struct crypto_sha256_state *state = (void *)sctx;
 
 	sctx->count += len;
-
-	if (unlikely((partial + len) >= SHA256_BLOCK_SIZE)) {
-		int blocks;
-
-		if (partial) {
-			int p = SHA256_BLOCK_SIZE - partial;
-
-			memcpy(sctx->buf + partial, data, p);
-			data += p;
-			len -= p;
-
-			block_fn(state, sctx->buf, 1);
-		}
-
-		blocks = len / SHA256_BLOCK_SIZE;
-		len %= SHA256_BLOCK_SIZE;
-
-		if (blocks) {
-			block_fn(state, data, blocks);
-			data += blocks * SHA256_BLOCK_SIZE;
-		}
-		partial = 0;
-	}
-	if (len)
-		memcpy(sctx->buf + partial, data, len);
-
+	BLOCK_HASH_UPDATE_BLOCKS(block_fn, &sctx->ctx, data, len,
+				 SHA256_BLOCK_SIZE, sctx->buf, partial);
 	return 0;
 }
 
@@ -140,14 +116,6 @@ static inline int lib_sha256_base_do_finalize(struct sha256_state *sctx,
 	return lib_sha256_base_do_finup(state, sctx->buf, partial, block_fn);
 }
 
-static inline int sha256_base_do_finalize(struct shash_desc *desc,
-					  sha256_block_fn *block_fn)
-{
-	struct sha256_state *sctx = shash_desc_ctx(desc);
-
-	return lib_sha256_base_do_finalize(sctx, block_fn);
-}
-
 static inline int __sha256_base_finish(u32 state[SHA256_DIGEST_SIZE / 4],
 				       u8 *out, unsigned int digest_size)
 {
-- 
2.39.5


^ permalink raw reply related	[flat|nested] 13+ messages in thread

* [v2 PATCH 02/11] crypto: lib/poly1305 - Add block-only interface
  2025-04-27  0:59 [v2 PATCH 00/11] crypto: lib - Add partial block helper Herbert Xu
  2025-04-27  0:59 ` [v2 PATCH 01/11] crypto: lib/sha256 - Move partial block handling out Herbert Xu
@ 2025-04-27  1:00 ` Herbert Xu
  2025-04-27  1:00 ` [v2 PATCH 03/11] crypto: arm/poly1305 " Herbert Xu
                   ` (8 subsequent siblings)
  10 siblings, 0 replies; 13+ messages in thread
From: Herbert Xu @ 2025-04-27  1:00 UTC (permalink / raw)
  To: Linux Crypto Mailing List

Add a block-only interface for poly1305.  Implement the generic
code first.

Also use the generic partial block helper.
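
As a rough sketch (illustration only; key, data, len and digest are
assumed caller-provided and len is a whole number of 16-byte blocks),
the pieces combine as follows:

	struct poly1305_block_state state;
	u32 nonce[4];

	/* The first 16 key bytes (r) key the block state ... */
	poly1305_block_init_generic(&state, key);
	/* ... and the last 16 (s) become the finalisation nonce. */
	nonce[0] = get_unaligned_le32(key + 16);
	nonce[1] = get_unaligned_le32(key + 20);
	nonce[2] = get_unaligned_le32(key + 24);
	nonce[3] = get_unaligned_le32(key + 28);

	poly1305_blocks_generic(&state, data, len, 1);
	poly1305_emit_generic(&state.h, digest, nonce);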

Signed-off-by: Herbert Xu <herbert@gondor.apana.org.au>
---
 include/crypto/internal/poly1305.h | 28 ++++++++++++++--
 include/crypto/poly1305.h          | 25 ++++++++++----
 lib/crypto/poly1305.c              | 54 +++++++++++++-----------------
 3 files changed, 68 insertions(+), 39 deletions(-)

diff --git a/include/crypto/internal/poly1305.h b/include/crypto/internal/poly1305.h
index e614594f88c1..c60315f47562 100644
--- a/include/crypto/internal/poly1305.h
+++ b/include/crypto/internal/poly1305.h
@@ -6,9 +6,8 @@
 #ifndef _CRYPTO_INTERNAL_POLY1305_H
 #define _CRYPTO_INTERNAL_POLY1305_H
 
-#include <linux/unaligned.h>
-#include <linux/types.h>
 #include <crypto/poly1305.h>
+#include <linux/types.h>
 
 /*
  * Poly1305 core functions.  These only accept whole blocks; the caller must
@@ -31,4 +30,29 @@ void poly1305_core_blocks(struct poly1305_state *state,
 void poly1305_core_emit(const struct poly1305_state *state, const u32 nonce[4],
 			void *dst);
 
+void poly1305_block_init_arch(struct poly1305_block_state *state,
+			      const u8 raw_key[POLY1305_BLOCK_SIZE]);
+void poly1305_block_init_generic(struct poly1305_block_state *state,
+				 const u8 raw_key[POLY1305_BLOCK_SIZE]);
+void poly1305_blocks_arch(struct poly1305_block_state *state, const u8 *src,
+			  unsigned int len, u32 padbit);
+
+static inline void poly1305_blocks_generic(struct poly1305_block_state *state,
+					   const u8 *src, unsigned int len,
+					   u32 padbit)
+{
+	poly1305_core_blocks(&state->h, &state->core_r, src,
+			     len / POLY1305_BLOCK_SIZE, padbit);
+}
+
+void poly1305_emit_arch(const struct poly1305_state *state,
+			u8 digest[POLY1305_DIGEST_SIZE], const u32 nonce[4]);
+
+static inline void poly1305_emit_generic(const struct poly1305_state *state,
+					 u8 digest[POLY1305_DIGEST_SIZE],
+					 const u32 nonce[4])
+{
+	poly1305_core_emit(state, nonce, digest);
+}
+
 #endif
diff --git a/include/crypto/poly1305.h b/include/crypto/poly1305.h
index 6e21ec2d1dc2..027d74842cd5 100644
--- a/include/crypto/poly1305.h
+++ b/include/crypto/poly1305.h
@@ -7,7 +7,6 @@
 #define _CRYPTO_POLY1305_H
 
 #include <linux/types.h>
-#include <linux/crypto.h>
 
 #define POLY1305_BLOCK_SIZE	16
 #define POLY1305_KEY_SIZE	32
@@ -38,6 +37,17 @@ struct poly1305_state {
 	};
 };
 
+/* Combined state for block function. */
+struct poly1305_block_state {
+	/* accumulator */
+	struct poly1305_state h;
+	/* key */
+	union {
+		struct poly1305_key opaque_r[CONFIG_CRYPTO_LIB_POLY1305_RSIZE];
+		struct poly1305_core_key core_r;
+	};
+};
+
 struct poly1305_desc_ctx {
 	/* partial buffer */
 	u8 buf[POLY1305_BLOCK_SIZE];
@@ -45,12 +55,15 @@ struct poly1305_desc_ctx {
 	unsigned int buflen;
 	/* finalize key */
 	u32 s[4];
-	/* accumulator */
-	struct poly1305_state h;
-	/* key */
 	union {
-		struct poly1305_key opaque_r[CONFIG_CRYPTO_LIB_POLY1305_RSIZE];
-		struct poly1305_core_key core_r;
+		struct {
+			struct poly1305_state h;
+			union {
+				struct poly1305_key opaque_r[CONFIG_CRYPTO_LIB_POLY1305_RSIZE];
+				struct poly1305_core_key core_r;
+			};
+		};
+		struct poly1305_block_state state;
 	};
 };
 
diff --git a/lib/crypto/poly1305.c b/lib/crypto/poly1305.c
index b633b043f0f6..f35692376acf 100644
--- a/lib/crypto/poly1305.c
+++ b/lib/crypto/poly1305.c
@@ -7,54 +7,45 @@
  * Based on public domain code by Andrew Moon and Daniel J. Bernstein.
  */
 
+#include <crypto/internal/blockhash.h>
 #include <crypto/internal/poly1305.h>
 #include <linux/kernel.h>
 #include <linux/module.h>
+#include <linux/string.h>
 #include <linux/unaligned.h>
 
+void poly1305_block_init_generic(struct poly1305_block_state *desc,
+				 const u8 raw_key[POLY1305_BLOCK_SIZE])
+{
+	poly1305_core_init(&desc->h);
+	poly1305_core_setkey(&desc->core_r, raw_key);
+}
+EXPORT_SYMBOL_GPL(poly1305_block_init_generic);
+
 void poly1305_init_generic(struct poly1305_desc_ctx *desc,
 			   const u8 key[POLY1305_KEY_SIZE])
 {
-	poly1305_core_setkey(&desc->core_r, key);
 	desc->s[0] = get_unaligned_le32(key + 16);
 	desc->s[1] = get_unaligned_le32(key + 20);
 	desc->s[2] = get_unaligned_le32(key + 24);
 	desc->s[3] = get_unaligned_le32(key + 28);
-	poly1305_core_init(&desc->h);
 	desc->buflen = 0;
+	poly1305_block_init_generic(&desc->state, key);
 }
 EXPORT_SYMBOL_GPL(poly1305_init_generic);
 
+static inline void poly1305_blocks(struct poly1305_block_state *state,
+				   const u8 *src, unsigned int len)
+{
+	poly1305_blocks_generic(state, src, len, 1);
+}
+
 void poly1305_update_generic(struct poly1305_desc_ctx *desc, const u8 *src,
 			     unsigned int nbytes)
 {
-	unsigned int bytes;
-
-	if (unlikely(desc->buflen)) {
-		bytes = min(nbytes, POLY1305_BLOCK_SIZE - desc->buflen);
-		memcpy(desc->buf + desc->buflen, src, bytes);
-		src += bytes;
-		nbytes -= bytes;
-		desc->buflen += bytes;
-
-		if (desc->buflen == POLY1305_BLOCK_SIZE) {
-			poly1305_core_blocks(&desc->h, &desc->core_r, desc->buf,
-					     1, 1);
-			desc->buflen = 0;
-		}
-	}
-
-	if (likely(nbytes >= POLY1305_BLOCK_SIZE)) {
-		poly1305_core_blocks(&desc->h, &desc->core_r, src,
-				     nbytes / POLY1305_BLOCK_SIZE, 1);
-		src += nbytes - (nbytes % POLY1305_BLOCK_SIZE);
-		nbytes %= POLY1305_BLOCK_SIZE;
-	}
-
-	if (unlikely(nbytes)) {
-		desc->buflen = nbytes;
-		memcpy(desc->buf, src, nbytes);
-	}
+	desc->buflen = BLOCK_HASH_UPDATE(&poly1305_blocks, &desc->state,
+					 src, nbytes, POLY1305_BLOCK_SIZE,
+					 desc->buf, desc->buflen);
 }
 EXPORT_SYMBOL_GPL(poly1305_update_generic);
 
@@ -64,10 +55,11 @@ void poly1305_final_generic(struct poly1305_desc_ctx *desc, u8 *dst)
 		desc->buf[desc->buflen++] = 1;
 		memset(desc->buf + desc->buflen, 0,
 		       POLY1305_BLOCK_SIZE - desc->buflen);
-		poly1305_core_blocks(&desc->h, &desc->core_r, desc->buf, 1, 0);
+		poly1305_blocks_generic(&desc->state, desc->buf,
+					POLY1305_BLOCK_SIZE, 0);
 	}
 
-	poly1305_core_emit(&desc->h, desc->s, dst);
+	poly1305_emit_generic(&desc->h, dst, desc->s);
 	*desc = (struct poly1305_desc_ctx){};
 }
 EXPORT_SYMBOL_GPL(poly1305_final_generic);
-- 
2.39.5


^ permalink raw reply related	[flat|nested] 13+ messages in thread

* [v2 PATCH 03/11] crypto: arm/poly1305 - Add block-only interface
  2025-04-27  0:59 [v2 PATCH 00/11] crypto: lib - Add partial block helper Herbert Xu
  2025-04-27  0:59 ` [v2 PATCH 01/11] crypto: lib/sha256 - Move partial block handling out Herbert Xu
  2025-04-27  1:00 ` [v2 PATCH 02/11] crypto: lib/poly1305 - Add block-only interface Herbert Xu
@ 2025-04-27  1:00 ` Herbert Xu
  2025-04-27  1:00 ` [v2 PATCH 04/11] crypto: arm64/poly1305 " Herbert Xu
                   ` (7 subsequent siblings)
  10 siblings, 0 replies; 13+ messages in thread
From: Herbert Xu @ 2025-04-27  1:00 UTC (permalink / raw)
  To: Linux Crypto Mailing List

Add block-only interface.

Also remove the unnecessary SIMD fallback path.

Signed-off-by: Herbert Xu <herbert@gondor.apana.org.au>
---
 arch/arm/lib/crypto/poly1305-armv4.pl |  4 +-
 arch/arm/lib/crypto/poly1305-glue.c   | 78 +++++++++++++++------------
 2 files changed, 47 insertions(+), 35 deletions(-)

diff --git a/arch/arm/lib/crypto/poly1305-armv4.pl b/arch/arm/lib/crypto/poly1305-armv4.pl
index 6d79498d3115..d57c6e2fc84a 100644
--- a/arch/arm/lib/crypto/poly1305-armv4.pl
+++ b/arch/arm/lib/crypto/poly1305-armv4.pl
@@ -43,9 +43,9 @@ $code.=<<___;
 #else
 # define __ARM_ARCH__ __LINUX_ARM_ARCH__
 # define __ARM_MAX_ARCH__ __LINUX_ARM_ARCH__
-# define poly1305_init   poly1305_init_arm
+# define poly1305_init   poly1305_block_init_arch
 # define poly1305_blocks poly1305_blocks_arm
-# define poly1305_emit   poly1305_emit_arm
+# define poly1305_emit   poly1305_emit_arch
 .globl	poly1305_blocks_neon
 #endif
 
diff --git a/arch/arm/lib/crypto/poly1305-glue.c b/arch/arm/lib/crypto/poly1305-glue.c
index 42d0ebde1ae1..3ee16048ec7c 100644
--- a/arch/arm/lib/crypto/poly1305-glue.c
+++ b/arch/arm/lib/crypto/poly1305-glue.c
@@ -7,20 +7,29 @@
 
 #include <asm/hwcap.h>
 #include <asm/neon.h>
-#include <asm/simd.h>
-#include <crypto/poly1305.h>
-#include <crypto/internal/simd.h>
+#include <crypto/internal/poly1305.h>
 #include <linux/cpufeature.h>
 #include <linux/jump_label.h>
+#include <linux/kernel.h>
 #include <linux/module.h>
+#include <linux/string.h>
 #include <linux/unaligned.h>
 
-void poly1305_init_arm(void *state, const u8 *key);
-void poly1305_blocks_arm(void *state, const u8 *src, u32 len, u32 hibit);
-void poly1305_blocks_neon(void *state, const u8 *src, u32 len, u32 hibit);
-void poly1305_emit_arm(void *state, u8 *digest, const u32 *nonce);
+asmlinkage void poly1305_block_init_arch(
+	struct poly1305_block_state *state,
+	const u8 raw_key[POLY1305_BLOCK_SIZE]);
+EXPORT_SYMBOL_GPL(poly1305_block_init_arch);
+asmlinkage void poly1305_blocks_arm(struct poly1305_block_state *state,
+				    const u8 *src, u32 len, u32 hibit);
+asmlinkage void poly1305_blocks_neon(struct poly1305_block_state *state,
+				     const u8 *src, u32 len, u32 hibit);
+asmlinkage void poly1305_emit_arch(const struct poly1305_state *state,
+				   u8 digest[POLY1305_DIGEST_SIZE],
+				   const u32 nonce[4]);
+EXPORT_SYMBOL_GPL(poly1305_emit_arch);
 
-void __weak poly1305_blocks_neon(void *state, const u8 *src, u32 len, u32 hibit)
+void __weak poly1305_blocks_neon(struct poly1305_block_state *state,
+				 const u8 *src, u32 len, u32 hibit)
 {
 }
 
@@ -28,21 +37,39 @@ static __ro_after_init DEFINE_STATIC_KEY_FALSE(have_neon);
 
 void poly1305_init_arch(struct poly1305_desc_ctx *dctx, const u8 key[POLY1305_KEY_SIZE])
 {
-	poly1305_init_arm(&dctx->h, key);
 	dctx->s[0] = get_unaligned_le32(key + 16);
 	dctx->s[1] = get_unaligned_le32(key + 20);
 	dctx->s[2] = get_unaligned_le32(key + 24);
 	dctx->s[3] = get_unaligned_le32(key + 28);
 	dctx->buflen = 0;
+	poly1305_block_init_arch(&dctx->state, key);
 }
 EXPORT_SYMBOL(poly1305_init_arch);
 
+void poly1305_blocks_arch(struct poly1305_block_state *state, const u8 *src,
+			  unsigned int len, u32 padbit)
+{
+	len = round_down(len, POLY1305_BLOCK_SIZE);
+	if (IS_ENABLED(CONFIG_KERNEL_MODE_NEON) &&
+	    static_branch_likely(&have_neon)) {
+		do {
+			unsigned int todo = min_t(unsigned int, len, SZ_4K);
+
+			kernel_neon_begin();
+			poly1305_blocks_neon(state, src, todo, padbit);
+			kernel_neon_end();
+
+			len -= todo;
+			src += todo;
+		} while (len);
+	} else
+		poly1305_blocks_arm(state, src, len, padbit);
+}
+EXPORT_SYMBOL_GPL(poly1305_blocks_arch);
+
 void poly1305_update_arch(struct poly1305_desc_ctx *dctx, const u8 *src,
 			  unsigned int nbytes)
 {
-	bool do_neon = IS_ENABLED(CONFIG_KERNEL_MODE_NEON) &&
-		       crypto_simd_usable();
-
 	if (unlikely(dctx->buflen)) {
 		u32 bytes = min(nbytes, POLY1305_BLOCK_SIZE - dctx->buflen);
 
@@ -52,30 +79,15 @@ void poly1305_update_arch(struct poly1305_desc_ctx *dctx, const u8 *src,
 		dctx->buflen += bytes;
 
 		if (dctx->buflen == POLY1305_BLOCK_SIZE) {
-			poly1305_blocks_arm(&dctx->h, dctx->buf,
-					    POLY1305_BLOCK_SIZE, 1);
+			poly1305_blocks_arch(&dctx->state, dctx->buf,
+					     POLY1305_BLOCK_SIZE, 1);
 			dctx->buflen = 0;
 		}
 	}
 
 	if (likely(nbytes >= POLY1305_BLOCK_SIZE)) {
-		unsigned int len = round_down(nbytes, POLY1305_BLOCK_SIZE);
-
-		if (static_branch_likely(&have_neon) && do_neon) {
-			do {
-				unsigned int todo = min_t(unsigned int, len, SZ_4K);
-
-				kernel_neon_begin();
-				poly1305_blocks_neon(&dctx->h, src, todo, 1);
-				kernel_neon_end();
-
-				len -= todo;
-				src += todo;
-			} while (len);
-		} else {
-			poly1305_blocks_arm(&dctx->h, src, len, 1);
-			src += len;
-		}
+		poly1305_blocks_arch(&dctx->state, src, nbytes, 1);
+		src += round_down(nbytes, POLY1305_BLOCK_SIZE);
 		nbytes %= POLY1305_BLOCK_SIZE;
 	}
 
@@ -92,10 +104,10 @@ void poly1305_final_arch(struct poly1305_desc_ctx *dctx, u8 *dst)
 		dctx->buf[dctx->buflen++] = 1;
 		memset(dctx->buf + dctx->buflen, 0,
 		       POLY1305_BLOCK_SIZE - dctx->buflen);
-		poly1305_blocks_arm(&dctx->h, dctx->buf, POLY1305_BLOCK_SIZE, 0);
+		poly1305_blocks_arch(&dctx->state, dctx->buf, POLY1305_BLOCK_SIZE, 0);
 	}
 
-	poly1305_emit_arm(&dctx->h, dst, dctx->s);
+	poly1305_emit_arch(&dctx->h, dst, dctx->s);
 	*dctx = (struct poly1305_desc_ctx){};
 }
 EXPORT_SYMBOL(poly1305_final_arch);
-- 
2.39.5


^ permalink raw reply related	[flat|nested] 13+ messages in thread

* [v2 PATCH 04/11] crypto: arm64/poly1305 - Add block-only interface
  2025-04-27  0:59 [v2 PATCH 00/11] crypto: lib - Add partial block helper Herbert Xu
                   ` (2 preceding siblings ...)
  2025-04-27  1:00 ` [v2 PATCH 03/11] crypto: arm/poly1305 " Herbert Xu
@ 2025-04-27  1:00 ` Herbert Xu
  2025-04-27  1:00 ` [v2 PATCH 05/11] crypto: mips/poly1305 " Herbert Xu
                   ` (6 subsequent siblings)
  10 siblings, 0 replies; 13+ messages in thread
From: Herbert Xu @ 2025-04-27  1:00 UTC (permalink / raw)
  To: Linux Crypto Mailing List

Add block-only interface.

Also remove the unnecessary SIMD fallback path.

Signed-off-by: Herbert Xu <herbert@gondor.apana.org.au>
---
 arch/arm64/lib/crypto/Makefile        |  3 +-
 arch/arm64/lib/crypto/poly1305-glue.c | 71 ++++++++++++++++-----------
 2 files changed, 45 insertions(+), 29 deletions(-)

diff --git a/arch/arm64/lib/crypto/Makefile b/arch/arm64/lib/crypto/Makefile
index ac624c3effda..6207088397a7 100644
--- a/arch/arm64/lib/crypto/Makefile
+++ b/arch/arm64/lib/crypto/Makefile
@@ -5,7 +5,8 @@ chacha-neon-y := chacha-neon-core.o chacha-neon-glue.o
 
 obj-$(CONFIG_CRYPTO_POLY1305_NEON) += poly1305-neon.o
 poly1305-neon-y := poly1305-core.o poly1305-glue.o
-AFLAGS_poly1305-core.o += -Dpoly1305_init=poly1305_init_arm64
+AFLAGS_poly1305-core.o += -Dpoly1305_init=poly1305_block_init_arch
+AFLAGS_poly1305-core.o += -Dpoly1305_emit=poly1305_emit_arch
 
 quiet_cmd_perlasm = PERLASM $@
       cmd_perlasm = $(PERL) $(<) void $(@)
diff --git a/arch/arm64/lib/crypto/poly1305-glue.c b/arch/arm64/lib/crypto/poly1305-glue.c
index 906970dd5373..d66a820e32d5 100644
--- a/arch/arm64/lib/crypto/poly1305-glue.c
+++ b/arch/arm64/lib/crypto/poly1305-glue.c
@@ -7,32 +7,60 @@
 
 #include <asm/hwcap.h>
 #include <asm/neon.h>
-#include <asm/simd.h>
-#include <crypto/poly1305.h>
-#include <crypto/internal/simd.h>
+#include <crypto/internal/poly1305.h>
 #include <linux/cpufeature.h>
 #include <linux/jump_label.h>
+#include <linux/kernel.h>
 #include <linux/module.h>
+#include <linux/string.h>
 #include <linux/unaligned.h>
 
-asmlinkage void poly1305_init_arm64(void *state, const u8 *key);
-asmlinkage void poly1305_blocks(void *state, const u8 *src, u32 len, u32 hibit);
-asmlinkage void poly1305_blocks_neon(void *state, const u8 *src, u32 len, u32 hibit);
-asmlinkage void poly1305_emit(void *state, u8 *digest, const u32 *nonce);
+asmlinkage void poly1305_block_init_arch(
+	struct poly1305_block_state *state,
+	const u8 raw_key[POLY1305_BLOCK_SIZE]);
+EXPORT_SYMBOL_GPL(poly1305_block_init_arch);
+asmlinkage void poly1305_blocks(struct poly1305_block_state *state,
+				const u8 *src, u32 len, u32 hibit);
+asmlinkage void poly1305_blocks_neon(struct poly1305_block_state *state,
+				     const u8 *src, u32 len, u32 hibit);
+asmlinkage void poly1305_emit_arch(const struct poly1305_state *state,
+				   u8 digest[POLY1305_DIGEST_SIZE],
+				   const u32 nonce[4]);
+EXPORT_SYMBOL_GPL(poly1305_emit_arch);
 
 static __ro_after_init DEFINE_STATIC_KEY_FALSE(have_neon);
 
 void poly1305_init_arch(struct poly1305_desc_ctx *dctx, const u8 key[POLY1305_KEY_SIZE])
 {
-	poly1305_init_arm64(&dctx->h, key);
 	dctx->s[0] = get_unaligned_le32(key + 16);
 	dctx->s[1] = get_unaligned_le32(key + 20);
 	dctx->s[2] = get_unaligned_le32(key + 24);
 	dctx->s[3] = get_unaligned_le32(key + 28);
 	dctx->buflen = 0;
+	poly1305_block_init_arch(&dctx->state, key);
 }
 EXPORT_SYMBOL(poly1305_init_arch);
 
+void poly1305_blocks_arch(struct poly1305_block_state *state, const u8 *src,
+			  unsigned int len, u32 padbit)
+{
+	len = round_down(len, POLY1305_BLOCK_SIZE);
+	if (static_branch_likely(&have_neon)) {
+		do {
+			unsigned int todo = min_t(unsigned int, len, SZ_4K);
+
+			kernel_neon_begin();
+			poly1305_blocks_neon(state, src, todo, 1);
+			kernel_neon_end();
+
+			len -= todo;
+			src += todo;
+		} while (len);
+	} else
+		poly1305_blocks(state, src, len, 1);
+}
+EXPORT_SYMBOL_GPL(poly1305_blocks_arch);
+
 void poly1305_update_arch(struct poly1305_desc_ctx *dctx, const u8 *src,
 			  unsigned int nbytes)
 {
@@ -45,29 +73,15 @@ void poly1305_update_arch(struct poly1305_desc_ctx *dctx, const u8 *src,
 		dctx->buflen += bytes;
 
 		if (dctx->buflen == POLY1305_BLOCK_SIZE) {
-			poly1305_blocks(&dctx->h, dctx->buf, POLY1305_BLOCK_SIZE, 1);
+			poly1305_blocks_arch(&dctx->state, dctx->buf,
+					     POLY1305_BLOCK_SIZE, 1);
 			dctx->buflen = 0;
 		}
 	}
 
 	if (likely(nbytes >= POLY1305_BLOCK_SIZE)) {
-		unsigned int len = round_down(nbytes, POLY1305_BLOCK_SIZE);
-
-		if (static_branch_likely(&have_neon) && crypto_simd_usable()) {
-			do {
-				unsigned int todo = min_t(unsigned int, len, SZ_4K);
-
-				kernel_neon_begin();
-				poly1305_blocks_neon(&dctx->h, src, todo, 1);
-				kernel_neon_end();
-
-				len -= todo;
-				src += todo;
-			} while (len);
-		} else {
-			poly1305_blocks(&dctx->h, src, len, 1);
-			src += len;
-		}
+		poly1305_blocks_arch(&dctx->state, src, nbytes, 1);
+		src += round_down(nbytes, POLY1305_BLOCK_SIZE);
 		nbytes %= POLY1305_BLOCK_SIZE;
 	}
 
@@ -84,10 +98,11 @@ void poly1305_final_arch(struct poly1305_desc_ctx *dctx, u8 *dst)
 		dctx->buf[dctx->buflen++] = 1;
 		memset(dctx->buf + dctx->buflen, 0,
 		       POLY1305_BLOCK_SIZE - dctx->buflen);
-		poly1305_blocks(&dctx->h, dctx->buf, POLY1305_BLOCK_SIZE, 0);
+		poly1305_blocks_arch(&dctx->state, dctx->buf,
+				     POLY1305_BLOCK_SIZE, 0);
 	}
 
-	poly1305_emit(&dctx->h, dst, dctx->s);
+	poly1305_emit_arch(&dctx->h, dst, dctx->s);
 	memzero_explicit(dctx, sizeof(*dctx));
 }
 EXPORT_SYMBOL(poly1305_final_arch);
-- 
2.39.5


^ permalink raw reply related	[flat|nested] 13+ messages in thread

* [v2 PATCH 05/11] crypto: mips/poly1305 - Add block-only interface
  2025-04-27  0:59 [v2 PATCH 00/11] crypto: lib - Add partial block helper Herbert Xu
                   ` (3 preceding siblings ...)
  2025-04-27  1:00 ` [v2 PATCH 04/11] crypto: arm64/poly1305 " Herbert Xu
@ 2025-04-27  1:00 ` Herbert Xu
  2025-04-27  1:00 ` [v2 PATCH 06/11] crypto: powerpc/poly1305 " Herbert Xu
                   ` (5 subsequent siblings)
  10 siblings, 0 replies; 13+ messages in thread
From: Herbert Xu @ 2025-04-27  1:00 UTC (permalink / raw)
  To: Linux Crypto Mailing List

Add block-only interface.

Signed-off-by: Herbert Xu <herbert@gondor.apana.org.au>
---
 arch/mips/lib/crypto/poly1305-glue.c  | 29 ++++++++++++++++++---------
 arch/mips/lib/crypto/poly1305-mips.pl | 12 +++++------
 2 files changed, 26 insertions(+), 15 deletions(-)

diff --git a/arch/mips/lib/crypto/poly1305-glue.c b/arch/mips/lib/crypto/poly1305-glue.c
index 576e7a58e0b1..2fea4cacfe27 100644
--- a/arch/mips/lib/crypto/poly1305-glue.c
+++ b/arch/mips/lib/crypto/poly1305-glue.c
@@ -5,23 +5,33 @@
  * Copyright (C) 2019 Linaro Ltd. <ard.biesheuvel@linaro.org>
  */
 
-#include <crypto/poly1305.h>
+#include <crypto/internal/poly1305.h>
 #include <linux/cpufeature.h>
+#include <linux/kernel.h>
 #include <linux/module.h>
+#include <linux/string.h>
 #include <linux/unaligned.h>
 
-asmlinkage void poly1305_init_mips(void *state, const u8 *key);
-asmlinkage void poly1305_blocks_mips(void *state, const u8 *src, u32 len, u32 hibit);
-asmlinkage void poly1305_emit_mips(void *state, u8 *digest, const u32 *nonce);
+asmlinkage void poly1305_block_init_arch(
+	struct poly1305_block_state *state,
+	const u8 raw_key[POLY1305_BLOCK_SIZE]);
+EXPORT_SYMBOL_GPL(poly1305_block_init_arch);
+asmlinkage void poly1305_blocks_arch(struct poly1305_block_state *state,
+				     const u8 *src, u32 len, u32 hibit);
+EXPORT_SYMBOL_GPL(poly1305_blocks_arch);
+asmlinkage void poly1305_emit_arch(const struct poly1305_state *state,
+				   u8 digest[POLY1305_DIGEST_SIZE],
+				   const u32 nonce[4]);
+EXPORT_SYMBOL_GPL(poly1305_emit_arch);
 
 void poly1305_init_arch(struct poly1305_desc_ctx *dctx, const u8 key[POLY1305_KEY_SIZE])
 {
-	poly1305_init_mips(&dctx->h, key);
 	dctx->s[0] = get_unaligned_le32(key + 16);
 	dctx->s[1] = get_unaligned_le32(key + 20);
 	dctx->s[2] = get_unaligned_le32(key + 24);
 	dctx->s[3] = get_unaligned_le32(key + 28);
 	dctx->buflen = 0;
+	poly1305_block_init_arch(&dctx->state, key);
 }
 EXPORT_SYMBOL(poly1305_init_arch);
 
@@ -37,7 +47,7 @@ void poly1305_update_arch(struct poly1305_desc_ctx *dctx, const u8 *src,
 		dctx->buflen += bytes;
 
 		if (dctx->buflen == POLY1305_BLOCK_SIZE) {
-			poly1305_blocks_mips(&dctx->h, dctx->buf,
+			poly1305_blocks_arch(&dctx->state, dctx->buf,
 					     POLY1305_BLOCK_SIZE, 1);
 			dctx->buflen = 0;
 		}
@@ -46,7 +56,7 @@ void poly1305_update_arch(struct poly1305_desc_ctx *dctx, const u8 *src,
 	if (likely(nbytes >= POLY1305_BLOCK_SIZE)) {
 		unsigned int len = round_down(nbytes, POLY1305_BLOCK_SIZE);
 
-		poly1305_blocks_mips(&dctx->h, src, len, 1);
+		poly1305_blocks_arch(&dctx->state, src, len, 1);
 		src += len;
 		nbytes %= POLY1305_BLOCK_SIZE;
 	}
@@ -64,10 +74,11 @@ void poly1305_final_arch(struct poly1305_desc_ctx *dctx, u8 *dst)
 		dctx->buf[dctx->buflen++] = 1;
 		memset(dctx->buf + dctx->buflen, 0,
 		       POLY1305_BLOCK_SIZE - dctx->buflen);
-		poly1305_blocks_mips(&dctx->h, dctx->buf, POLY1305_BLOCK_SIZE, 0);
+		poly1305_blocks_arch(&dctx->state, dctx->buf,
+				     POLY1305_BLOCK_SIZE, 0);
 	}
 
-	poly1305_emit_mips(&dctx->h, dst, dctx->s);
+	poly1305_emit_arch(&dctx->h, dst, dctx->s);
 	*dctx = (struct poly1305_desc_ctx){};
 }
 EXPORT_SYMBOL(poly1305_final_arch);
diff --git a/arch/mips/lib/crypto/poly1305-mips.pl b/arch/mips/lib/crypto/poly1305-mips.pl
index b05bab884ed2..399f10c3e385 100644
--- a/arch/mips/lib/crypto/poly1305-mips.pl
+++ b/arch/mips/lib/crypto/poly1305-mips.pl
@@ -93,9 +93,9 @@ $code.=<<___;
 #endif
 
 #ifdef	__KERNEL__
-# define poly1305_init   poly1305_init_mips
-# define poly1305_blocks poly1305_blocks_mips
-# define poly1305_emit   poly1305_emit_mips
+# define poly1305_init   poly1305_block_init_arch
+# define poly1305_blocks poly1305_blocks_arch
+# define poly1305_emit   poly1305_emit_arch
 #endif
 
 #if defined(__MIPSEB__) && !defined(MIPSEB)
@@ -565,9 +565,9 @@ $code.=<<___;
 #endif
 
 #ifdef	__KERNEL__
-# define poly1305_init   poly1305_init_mips
-# define poly1305_blocks poly1305_blocks_mips
-# define poly1305_emit   poly1305_emit_mips
+# define poly1305_init   poly1305_block_init_arch
+# define poly1305_blocks poly1305_blocks_arch
+# define poly1305_emit   poly1305_emit_arch
 #endif
 
 #if defined(__MIPSEB__) && !defined(MIPSEB)
-- 
2.39.5


^ permalink raw reply related	[flat|nested] 13+ messages in thread

* [v2 PATCH 06/11] crypto: powerpc/poly1305 - Add block-only interface
  2025-04-27  0:59 [v2 PATCH 00/11] crypto: lib - Add partial block helper Herbert Xu
                   ` (4 preceding siblings ...)
  2025-04-27  1:00 ` [v2 PATCH 05/11] crypto: mips/poly1305 " Herbert Xu
@ 2025-04-27  1:00 ` Herbert Xu
  2025-04-27  1:00 ` [v2 PATCH 07/11] crypto: x86/poly1305 " Herbert Xu
                   ` (4 subsequent siblings)
  10 siblings, 0 replies; 13+ messages in thread
From: Herbert Xu @ 2025-04-27  1:00 UTC (permalink / raw)
  To: Linux Crypto Mailing List

Add block-only interface.

Also remove the unnecessary SIMD fallback path.

Signed-off-by: Herbert Xu <herbert@gondor.apana.org.au>
---
 arch/powerpc/lib/crypto/poly1305-p10-glue.c | 84 ++++++++++++---------
 1 file changed, 49 insertions(+), 35 deletions(-)

diff --git a/arch/powerpc/lib/crypto/poly1305-p10-glue.c b/arch/powerpc/lib/crypto/poly1305-p10-glue.c
index 00617f4c58e6..708435beaba6 100644
--- a/arch/powerpc/lib/crypto/poly1305-p10-glue.c
+++ b/arch/powerpc/lib/crypto/poly1305-p10-glue.c
@@ -4,19 +4,20 @@
  *
  * Copyright 2023- IBM Corp. All rights reserved.
  */
+#include <asm/switch_to.h>
+#include <crypto/internal/poly1305.h>
+#include <linux/cpufeature.h>
+#include <linux/jump_label.h>
 #include <linux/kernel.h>
 #include <linux/module.h>
-#include <linux/jump_label.h>
-#include <crypto/internal/simd.h>
-#include <crypto/poly1305.h>
-#include <linux/cpufeature.h>
+#include <linux/string.h>
 #include <linux/unaligned.h>
-#include <asm/simd.h>
-#include <asm/switch_to.h>
 
-asmlinkage void poly1305_p10le_4blocks(void *h, const u8 *m, u32 mlen);
-asmlinkage void poly1305_64s(void *h, const u8 *m, u32 mlen, int highbit);
-asmlinkage void poly1305_emit_64(void *h, void *s, u8 *dst);
+asmlinkage void poly1305_p10le_4blocks(struct poly1305_block_state *state, const u8 *m, u32 mlen);
+asmlinkage void poly1305_64s(struct poly1305_block_state *state, const u8 *m, u32 mlen, int highbit);
+asmlinkage void poly1305_emit_arch(const struct poly1305_state *state,
+				   u8 digest[POLY1305_DIGEST_SIZE],
+				   const u32 nonce[4]);
 
 static __ro_after_init DEFINE_STATIC_KEY_FALSE(have_p10);
 
@@ -32,22 +33,49 @@ static void vsx_end(void)
 	preempt_enable();
 }
 
-void poly1305_init_arch(struct poly1305_desc_ctx *dctx, const u8 key[POLY1305_KEY_SIZE])
+void poly1305_block_init_arch(struct poly1305_block_state *dctx,
+			      const u8 raw_key[POLY1305_BLOCK_SIZE])
 {
 	if (!static_key_enabled(&have_p10))
-		return poly1305_init_generic(dctx, key);
+		return poly1305_block_init_generic(dctx, raw_key);
 
 	dctx->h = (struct poly1305_state){};
-	dctx->core_r.key.r64[0] = get_unaligned_le64(key + 0);
-	dctx->core_r.key.r64[1] = get_unaligned_le64(key + 8);
+	dctx->core_r.key.r64[0] = get_unaligned_le64(raw_key + 0);
+	dctx->core_r.key.r64[1] = get_unaligned_le64(raw_key + 8);
+}
+EXPORT_SYMBOL_GPL(poly1305_block_init_arch);
+
+void poly1305_init_arch(struct poly1305_desc_ctx *dctx, const u8 key[POLY1305_KEY_SIZE])
+{
 	dctx->s[0] = get_unaligned_le32(key + 16);
 	dctx->s[1] = get_unaligned_le32(key + 20);
 	dctx->s[2] = get_unaligned_le32(key + 24);
 	dctx->s[3] = get_unaligned_le32(key + 28);
 	dctx->buflen = 0;
+	poly1305_block_init_arch(&dctx->state, key);
 }
 EXPORT_SYMBOL(poly1305_init_arch);
 
+void poly1305_blocks_arch(struct poly1305_block_state *state, const u8 *src,
+			  unsigned int len, u32 padbit)
+{
+	if (!static_key_enabled(&have_p10))
+		return poly1305_blocks_generic(state, src, len, padbit);
+	vsx_begin();
+	if (len >= POLY1305_BLOCK_SIZE * 4) {
+		poly1305_p10le_4blocks(state, src, len);
+		src += len - (len % (POLY1305_BLOCK_SIZE * 4));
+		len %= POLY1305_BLOCK_SIZE * 4;
+	}
+	while (len >= POLY1305_BLOCK_SIZE) {
+		poly1305_64s(state, src, POLY1305_BLOCK_SIZE, padbit);
+		len -= POLY1305_BLOCK_SIZE;
+		src += POLY1305_BLOCK_SIZE;
+	}
+	vsx_end();
+}
+EXPORT_SYMBOL_GPL(poly1305_blocks_arch);
+
 void poly1305_update_arch(struct poly1305_desc_ctx *dctx,
 			  const u8 *src, unsigned int srclen)
 {
@@ -64,28 +92,15 @@ void poly1305_update_arch(struct poly1305_desc_ctx *dctx,
 		dctx->buflen += bytes;
 		if (dctx->buflen < POLY1305_BLOCK_SIZE)
 			return;
-		vsx_begin();
-		poly1305_64s(&dctx->h, dctx->buf, POLY1305_BLOCK_SIZE, 1);
-		vsx_end();
+		poly1305_blocks_arch(&dctx->state, dctx->buf,
+				     POLY1305_BLOCK_SIZE, 1);
 		dctx->buflen = 0;
 	}
 
 	if (likely(srclen >= POLY1305_BLOCK_SIZE)) {
-		bytes = round_down(srclen, POLY1305_BLOCK_SIZE);
-		if (crypto_simd_usable() && (srclen >= POLY1305_BLOCK_SIZE*4)) {
-			vsx_begin();
-			poly1305_p10le_4blocks(&dctx->h, src, srclen);
-			vsx_end();
-			src += srclen - (srclen % (POLY1305_BLOCK_SIZE * 4));
-			srclen %= POLY1305_BLOCK_SIZE * 4;
-		}
-		while (srclen >= POLY1305_BLOCK_SIZE) {
-			vsx_begin();
-			poly1305_64s(&dctx->h, src, POLY1305_BLOCK_SIZE, 1);
-			vsx_end();
-			srclen -= POLY1305_BLOCK_SIZE;
-			src += POLY1305_BLOCK_SIZE;
-		}
+		poly1305_blocks_arch(&dctx->state, src, srclen, 1);
+		src += srclen - (srclen % POLY1305_BLOCK_SIZE);
+		srclen %= POLY1305_BLOCK_SIZE;
 	}
 
 	if (unlikely(srclen)) {
@@ -104,12 +119,11 @@ void poly1305_final_arch(struct poly1305_desc_ctx *dctx, u8 *dst)
 		dctx->buf[dctx->buflen++] = 1;
 		memset(dctx->buf + dctx->buflen, 0,
 		       POLY1305_BLOCK_SIZE - dctx->buflen);
-		vsx_begin();
-		poly1305_64s(&dctx->h, dctx->buf, POLY1305_BLOCK_SIZE, 0);
-		vsx_end();
+		poly1305_blocks_arch(&dctx->state, dctx->buf,
+				     POLY1305_BLOCK_SIZE, 0);
 	}
 
-	poly1305_emit_64(&dctx->h, &dctx->s, dst);
+	poly1305_emit_arch(&dctx->h, dst, dctx->s);
 }
 EXPORT_SYMBOL(poly1305_final_arch);
 
-- 
2.39.5


^ permalink raw reply related	[flat|nested] 13+ messages in thread

* [v2 PATCH 07/11] crypto: x86/poly1305 - Add block-only interface
  2025-04-27  0:59 [v2 PATCH 00/11] crypto: lib - Add partial block helper Herbert Xu
                   ` (5 preceding siblings ...)
  2025-04-27  1:00 ` [v2 PATCH 06/11] crypto: powerpc/poly1305 " Herbert Xu
@ 2025-04-27  1:00 ` Herbert Xu
  2025-04-27  1:00 ` [v2 PATCH 08/11] crypto: chacha20poly1305 - Use lib/crypto poly1305 Herbert Xu
                   ` (3 subsequent siblings)
  10 siblings, 0 replies; 13+ messages in thread
From: Herbert Xu @ 2025-04-27  1:00 UTC (permalink / raw)
  To: Linux Crypto Mailing List

Add block-only interface.

Also remove the unnecessary SIMD fallback path.

Signed-off-by: Herbert Xu <herbert@gondor.apana.org.au>
---
 .../lib/crypto/poly1305-x86_64-cryptogams.pl  |  33 +++--
 arch/x86/lib/crypto/poly1305_glue.c           | 125 +++++++-----------
 2 files changed, 71 insertions(+), 87 deletions(-)

diff --git a/arch/x86/lib/crypto/poly1305-x86_64-cryptogams.pl b/arch/x86/lib/crypto/poly1305-x86_64-cryptogams.pl
index 409ec6955733..501827254fed 100644
--- a/arch/x86/lib/crypto/poly1305-x86_64-cryptogams.pl
+++ b/arch/x86/lib/crypto/poly1305-x86_64-cryptogams.pl
@@ -118,6 +118,19 @@ sub declare_function() {
 	}
 }
 
+sub declare_typed_function() {
+	my ($name, $align, $nargs) = @_;
+	if($kernel) {
+		$code .= "SYM_TYPED_FUNC_START($name)\n";
+		$code .= ".L$name:\n";
+	} else {
+		$code .= ".globl	$name\n";
+		$code .= ".type	$name,\@function,$nargs\n";
+		$code .= ".align	$align\n";
+		$code .= "$name:\n";
+	}
+}
+
 sub end_function() {
 	my ($name) = @_;
 	if($kernel) {
@@ -128,7 +141,7 @@ sub end_function() {
 }
 
 $code.=<<___ if $kernel;
-#include <linux/linkage.h>
+#include <linux/cfi_types.h>
 ___
 
 if ($avx) {
@@ -236,14 +249,14 @@ ___
 $code.=<<___ if (!$kernel);
 .extern	OPENSSL_ia32cap_P
 
-.globl	poly1305_init_x86_64
-.hidden	poly1305_init_x86_64
+.globl	poly1305_block_init_arch
+.hidden	poly1305_block_init_arch
 .globl	poly1305_blocks_x86_64
 .hidden	poly1305_blocks_x86_64
 .globl	poly1305_emit_x86_64
 .hidden	poly1305_emit_x86_64
 ___
-&declare_function("poly1305_init_x86_64", 32, 3);
+&declare_typed_function("poly1305_block_init_arch", 32, 3);
 $code.=<<___;
 	xor	%eax,%eax
 	mov	%rax,0($ctx)		# initialize hash value
@@ -298,7 +311,7 @@ $code.=<<___;
 .Lno_key:
 	RET
 ___
-&end_function("poly1305_init_x86_64");
+&end_function("poly1305_block_init_arch");
 
 &declare_function("poly1305_blocks_x86_64", 32, 4);
 $code.=<<___;
@@ -4105,9 +4118,9 @@ avx_handler:
 
 .section	.pdata
 .align	4
-	.rva	.LSEH_begin_poly1305_init_x86_64
-	.rva	.LSEH_end_poly1305_init_x86_64
-	.rva	.LSEH_info_poly1305_init_x86_64
+	.rva	.LSEH_begin_poly1305_block_init_arch
+	.rva	.LSEH_end_poly1305_block_init_arch
+	.rva	.LSEH_info_poly1305_block_init_arch
 
 	.rva	.LSEH_begin_poly1305_blocks_x86_64
 	.rva	.LSEH_end_poly1305_blocks_x86_64
@@ -4155,10 +4168,10 @@ ___
 $code.=<<___;
 .section	.xdata
 .align	8
-.LSEH_info_poly1305_init_x86_64:
+.LSEH_info_poly1305_block_init_arch:
 	.byte	9,0,0,0
 	.rva	se_handler
-	.rva	.LSEH_begin_poly1305_init_x86_64,.LSEH_begin_poly1305_init_x86_64
+	.rva	.LSEH_begin_poly1305_block_init_arch,.LSEH_begin_poly1305_block_init_arch
 
 .LSEH_info_poly1305_blocks_x86_64:
 	.byte	9,0,0,0
diff --git a/arch/x86/lib/crypto/poly1305_glue.c b/arch/x86/lib/crypto/poly1305_glue.c
index cff35ca5822a..d98764ec3b47 100644
--- a/arch/x86/lib/crypto/poly1305_glue.c
+++ b/arch/x86/lib/crypto/poly1305_glue.c
@@ -3,34 +3,15 @@
  * Copyright (C) 2015-2019 Jason A. Donenfeld <Jason@zx2c4.com>. All Rights Reserved.
  */
 
-#include <crypto/internal/simd.h>
-#include <crypto/poly1305.h>
+#include <asm/cpu_device_id.h>
+#include <asm/fpu/api.h>
+#include <crypto/internal/poly1305.h>
 #include <linux/jump_label.h>
 #include <linux/kernel.h>
 #include <linux/module.h>
 #include <linux/sizes.h>
+#include <linux/string.h>
 #include <linux/unaligned.h>
-#include <asm/cpu_device_id.h>
-#include <asm/simd.h>
-
-asmlinkage void poly1305_init_x86_64(void *ctx,
-				     const u8 key[POLY1305_BLOCK_SIZE]);
-asmlinkage void poly1305_blocks_x86_64(void *ctx, const u8 *inp,
-				       const size_t len, const u32 padbit);
-asmlinkage void poly1305_emit_x86_64(void *ctx, u8 mac[POLY1305_DIGEST_SIZE],
-				     const u32 nonce[4]);
-asmlinkage void poly1305_emit_avx(void *ctx, u8 mac[POLY1305_DIGEST_SIZE],
-				  const u32 nonce[4]);
-asmlinkage void poly1305_blocks_avx(void *ctx, const u8 *inp, const size_t len,
-				    const u32 padbit);
-asmlinkage void poly1305_blocks_avx2(void *ctx, const u8 *inp, const size_t len,
-				     const u32 padbit);
-asmlinkage void poly1305_blocks_avx512(void *ctx, const u8 *inp,
-				       const size_t len, const u32 padbit);
-
-static __ro_after_init DEFINE_STATIC_KEY_FALSE(poly1305_use_avx);
-static __ro_after_init DEFINE_STATIC_KEY_FALSE(poly1305_use_avx2);
-static __ro_after_init DEFINE_STATIC_KEY_FALSE(poly1305_use_avx512);
 
 struct poly1305_arch_internal {
 	union {
@@ -45,64 +26,50 @@ struct poly1305_arch_internal {
 	struct { u32 r2, r1, r4, r3; } rn[9];
 };
 
-/* The AVX code uses base 2^26, while the scalar code uses base 2^64. If we hit
- * the unfortunate situation of using AVX and then having to go back to scalar
- * -- because the user is silly and has called the update function from two
- * separate contexts -- then we need to convert back to the original base before
- * proceeding. It is possible to reason that the initial reduction below is
- * sufficient given the implementation invariants. However, for an avoidance of
- * doubt and because this is not performance critical, we do the full reduction
- * anyway. Z3 proof of below function: https://xn--4db.cc/ltPtHCKN/py
- */
-static void convert_to_base2_64(void *ctx)
+asmlinkage void poly1305_block_init_arch(
+	struct poly1305_block_state *state,
+	const u8 raw_key[POLY1305_BLOCK_SIZE]);
+EXPORT_SYMBOL_GPL(poly1305_block_init_arch);
+asmlinkage void poly1305_blocks_x86_64(struct poly1305_arch_internal *ctx,
+				       const u8 *inp,
+				       const size_t len, const u32 padbit);
+asmlinkage void poly1305_emit_x86_64(const struct poly1305_state *ctx,
+				     u8 mac[POLY1305_DIGEST_SIZE],
+				     const u32 nonce[4]);
+asmlinkage void poly1305_emit_avx(const struct poly1305_state *ctx,
+				  u8 mac[POLY1305_DIGEST_SIZE],
+				  const u32 nonce[4]);
+asmlinkage void poly1305_blocks_avx(struct poly1305_arch_internal *ctx,
+				    const u8 *inp, const size_t len,
+				    const u32 padbit);
+asmlinkage void poly1305_blocks_avx2(struct poly1305_arch_internal *ctx,
+				     const u8 *inp, const size_t len,
+				     const u32 padbit);
+asmlinkage void poly1305_blocks_avx512(struct poly1305_arch_internal *ctx,
+				       const u8 *inp,
+				       const size_t len, const u32 padbit);
+
+static __ro_after_init DEFINE_STATIC_KEY_FALSE(poly1305_use_avx);
+static __ro_after_init DEFINE_STATIC_KEY_FALSE(poly1305_use_avx2);
+static __ro_after_init DEFINE_STATIC_KEY_FALSE(poly1305_use_avx512);
+
+void poly1305_blocks_arch(struct poly1305_block_state *state, const u8 *inp,
+			  unsigned int len, u32 padbit)
 {
-	struct poly1305_arch_internal *state = ctx;
-	u32 cy;
-
-	if (!state->is_base2_26)
-		return;
-
-	cy = state->h[0] >> 26; state->h[0] &= 0x3ffffff; state->h[1] += cy;
-	cy = state->h[1] >> 26; state->h[1] &= 0x3ffffff; state->h[2] += cy;
-	cy = state->h[2] >> 26; state->h[2] &= 0x3ffffff; state->h[3] += cy;
-	cy = state->h[3] >> 26; state->h[3] &= 0x3ffffff; state->h[4] += cy;
-	state->hs[0] = ((u64)state->h[2] << 52) | ((u64)state->h[1] << 26) | state->h[0];
-	state->hs[1] = ((u64)state->h[4] << 40) | ((u64)state->h[3] << 14) | (state->h[2] >> 12);
-	state->hs[2] = state->h[4] >> 24;
-#define ULT(a, b) ((a ^ ((a ^ b) | ((a - b) ^ b))) >> (sizeof(a) * 8 - 1))
-	cy = (state->hs[2] >> 2) + (state->hs[2] & ~3ULL);
-	state->hs[2] &= 3;
-	state->hs[0] += cy;
-	state->hs[1] += (cy = ULT(state->hs[0], cy));
-	state->hs[2] += ULT(state->hs[1], cy);
-#undef ULT
-	state->is_base2_26 = 0;
-}
-
-static void poly1305_simd_init(void *ctx, const u8 key[POLY1305_BLOCK_SIZE])
-{
-	poly1305_init_x86_64(ctx, key);
-}
-
-static void poly1305_simd_blocks(void *ctx, const u8 *inp, size_t len,
-				 const u32 padbit)
-{
-	struct poly1305_arch_internal *state = ctx;
+	struct poly1305_arch_internal *ctx =
+		container_of(&state->h.h, struct poly1305_arch_internal, h);
 
 	/* SIMD disables preemption, so relax after processing each page. */
 	BUILD_BUG_ON(SZ_4K < POLY1305_BLOCK_SIZE ||
 		     SZ_4K % POLY1305_BLOCK_SIZE);
 
-	if (!static_branch_likely(&poly1305_use_avx) ||
-	    (len < (POLY1305_BLOCK_SIZE * 18) && !state->is_base2_26) ||
-	    !crypto_simd_usable()) {
-		convert_to_base2_64(ctx);
+	if (!static_branch_likely(&poly1305_use_avx)) {
 		poly1305_blocks_x86_64(ctx, inp, len, padbit);
 		return;
 	}
 
 	do {
-		const size_t bytes = min_t(size_t, len, SZ_4K);
+		const unsigned int bytes = min(len, SZ_4K);
 
 		kernel_fpu_begin();
 		if (static_branch_likely(&poly1305_use_avx512))
@@ -117,24 +84,26 @@ static void poly1305_simd_blocks(void *ctx, const u8 *inp, size_t len,
 		inp += bytes;
 	} while (len);
 }
+EXPORT_SYMBOL_GPL(poly1305_blocks_arch);
 
-static void poly1305_simd_emit(void *ctx, u8 mac[POLY1305_DIGEST_SIZE],
-			       const u32 nonce[4])
+void poly1305_emit_arch(const struct poly1305_state *ctx,
+			u8 mac[POLY1305_DIGEST_SIZE], const u32 nonce[4])
 {
 	if (!static_branch_likely(&poly1305_use_avx))
 		poly1305_emit_x86_64(ctx, mac, nonce);
 	else
 		poly1305_emit_avx(ctx, mac, nonce);
 }
+EXPORT_SYMBOL_GPL(poly1305_emit_arch);
 
 void poly1305_init_arch(struct poly1305_desc_ctx *dctx, const u8 key[POLY1305_KEY_SIZE])
 {
-	poly1305_simd_init(&dctx->h, key);
 	dctx->s[0] = get_unaligned_le32(&key[16]);
 	dctx->s[1] = get_unaligned_le32(&key[20]);
 	dctx->s[2] = get_unaligned_le32(&key[24]);
 	dctx->s[3] = get_unaligned_le32(&key[28]);
 	dctx->buflen = 0;
+	poly1305_block_init_arch(&dctx->state, key);
 }
 EXPORT_SYMBOL(poly1305_init_arch);
 
@@ -151,14 +120,15 @@ void poly1305_update_arch(struct poly1305_desc_ctx *dctx, const u8 *src,
 		dctx->buflen += bytes;
 
 		if (dctx->buflen == POLY1305_BLOCK_SIZE) {
-			poly1305_simd_blocks(&dctx->h, dctx->buf, POLY1305_BLOCK_SIZE, 1);
+			poly1305_blocks_arch(&dctx->state, dctx->buf,
+					     POLY1305_BLOCK_SIZE, 1);
 			dctx->buflen = 0;
 		}
 	}
 
 	if (likely(srclen >= POLY1305_BLOCK_SIZE)) {
 		bytes = round_down(srclen, POLY1305_BLOCK_SIZE);
-		poly1305_simd_blocks(&dctx->h, src, bytes, 1);
+		poly1305_blocks_arch(&dctx->state, src, bytes, 1);
 		src += bytes;
 		srclen -= bytes;
 	}
@@ -176,10 +146,11 @@ void poly1305_final_arch(struct poly1305_desc_ctx *dctx, u8 *dst)
 		dctx->buf[dctx->buflen++] = 1;
 		memset(dctx->buf + dctx->buflen, 0,
 		       POLY1305_BLOCK_SIZE - dctx->buflen);
-		poly1305_simd_blocks(&dctx->h, dctx->buf, POLY1305_BLOCK_SIZE, 0);
+		poly1305_blocks_arch(&dctx->state, dctx->buf,
+				     POLY1305_BLOCK_SIZE, 0);
 	}
 
-	poly1305_simd_emit(&dctx->h, dst, dctx->s);
+	poly1305_emit_arch(&dctx->h, dst, dctx->s);
 	memzero_explicit(dctx, sizeof(*dctx));
 }
 EXPORT_SYMBOL(poly1305_final_arch);
-- 
2.39.5


^ permalink raw reply related	[flat|nested] 13+ messages in thread

* [v2 PATCH 08/11] crypto: chacha20poly1305 - Use lib/crypto poly1305
  2025-04-27  0:59 [v2 PATCH 00/11] crypto: lib - Add partial block helper Herbert Xu
                   ` (6 preceding siblings ...)
  2025-04-27  1:00 ` [v2 PATCH 07/11] crypto: x86/poly1305 " Herbert Xu
@ 2025-04-27  1:00 ` Herbert Xu
  2025-04-27  1:00 ` [v2 PATCH 09/11] crypto: testmgr - Remove poly1305 Herbert Xu
                   ` (2 subsequent siblings)
  10 siblings, 0 replies; 13+ messages in thread
From: Herbert Xu @ 2025-04-27  1:00 UTC (permalink / raw)
  To: Linux Crypto Mailing List

Since the poly1305 algorithm is fixed, there is no point in going
through the Crypto API for it.  Use the lib/crypto poly1305 interface
instead.

For compatibility, keep the poly1305 parameter in the algorithm name.
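
For reference (illustration only, over flat buffers rather than the
scatterlists the code actually walks, with a hypothetical poly_mac
helper), the authentication pass now reduces to plain library calls:

	static void poly_mac(u8 tag[POLY1305_DIGEST_SIZE],
			     const u8 key[POLY1305_KEY_SIZE],
			     const u8 *ad, unsigned int adlen,
			     const u8 *ct, unsigned int ctlen)
	{
		static const u8 zeroes[POLY1305_BLOCK_SIZE];
		struct poly1305_desc_ctx desc;
		__le64 lens[2];

		poly1305_init(&desc, key);
		poly1305_update(&desc, ad, adlen);
		poly1305_update(&desc, zeroes, -adlen % POLY1305_BLOCK_SIZE);
		poly1305_update(&desc, ct, ctlen);
		poly1305_update(&desc, zeroes, -ctlen % POLY1305_BLOCK_SIZE);
		lens[0] = cpu_to_le64(adlen);
		lens[1] = cpu_to_le64(ctlen);
		poly1305_update(&desc, (const u8 *)lens, sizeof(lens));
		poly1305_final(&desc, tag);
	}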

Signed-off-by: Herbert Xu <herbert@gondor.apana.org.au>
---
 crypto/Kconfig            |   2 +-
 crypto/chacha20poly1305.c | 323 ++++++++------------------------------
 2 files changed, 67 insertions(+), 258 deletions(-)

diff --git a/crypto/Kconfig b/crypto/Kconfig
index 9878286d1d68..f87e2a26d2dd 100644
--- a/crypto/Kconfig
+++ b/crypto/Kconfig
@@ -784,8 +784,8 @@ config CRYPTO_AEGIS128_SIMD
 config CRYPTO_CHACHA20POLY1305
 	tristate "ChaCha20-Poly1305"
 	select CRYPTO_CHACHA20
-	select CRYPTO_POLY1305
 	select CRYPTO_AEAD
+	select CRYPTO_LIB_POLY1305
 	select CRYPTO_MANAGER
 	help
 	  ChaCha20 stream cipher and Poly1305 authenticator combined
diff --git a/crypto/chacha20poly1305.c b/crypto/chacha20poly1305.c
index d740849f1c19..b29f66ba1e2f 100644
--- a/crypto/chacha20poly1305.c
+++ b/crypto/chacha20poly1305.c
@@ -12,36 +12,23 @@
 #include <crypto/chacha.h>
 #include <crypto/poly1305.h>
 #include <linux/err.h>
-#include <linux/init.h>
 #include <linux/kernel.h>
+#include <linux/mm.h>
 #include <linux/module.h>
+#include <linux/string.h>
 
 struct chachapoly_instance_ctx {
 	struct crypto_skcipher_spawn chacha;
-	struct crypto_ahash_spawn poly;
 	unsigned int saltlen;
 };
 
 struct chachapoly_ctx {
 	struct crypto_skcipher *chacha;
-	struct crypto_ahash *poly;
 	/* key bytes we use for the ChaCha20 IV */
 	unsigned int saltlen;
 	u8 salt[] __counted_by(saltlen);
 };
 
-struct poly_req {
-	/* zero byte padding for AD/ciphertext, as needed */
-	u8 pad[POLY1305_BLOCK_SIZE];
-	/* tail data with AD/ciphertext lengths */
-	struct {
-		__le64 assoclen;
-		__le64 cryptlen;
-	} tail;
-	struct scatterlist src[1];
-	struct ahash_request req; /* must be last member */
-};
-
 struct chacha_req {
 	u8 iv[CHACHA_IV_SIZE];
 	struct scatterlist src[1];
@@ -62,7 +49,6 @@ struct chachapoly_req_ctx {
 	/* request flags, with MAY_SLEEP cleared if needed */
 	u32 flags;
 	union {
-		struct poly_req poly;
 		struct chacha_req chacha;
 	} u;
 };
@@ -105,16 +91,6 @@ static int poly_verify_tag(struct aead_request *req)
 	return 0;
 }
 
-static int poly_copy_tag(struct aead_request *req)
-{
-	struct chachapoly_req_ctx *rctx = aead_request_ctx(req);
-
-	scatterwalk_map_and_copy(rctx->tag, req->dst,
-				 req->assoclen + rctx->cryptlen,
-				 sizeof(rctx->tag), 1);
-	return 0;
-}
-
 static void chacha_decrypt_done(void *data, int err)
 {
 	async_done_continue(data, err, poly_verify_tag);
@@ -151,210 +127,76 @@ static int chacha_decrypt(struct aead_request *req)
 	return poly_verify_tag(req);
 }
 
-static int poly_tail_continue(struct aead_request *req)
+static int poly_hash(struct aead_request *req)
 {
 	struct chachapoly_req_ctx *rctx = aead_request_ctx(req);
+	const void *zp = page_address(ZERO_PAGE(0));
+	struct scatterlist *sg = req->src;
+	struct poly1305_desc_ctx desc;
+	struct scatter_walk walk;
+	struct {
+		union {
+			struct {
+				__le64 assoclen;
+				__le64 cryptlen;
+			};
+			u8 u8[16];
+		};
+	} tail;
+	unsigned int padlen;
+	unsigned int total;
+
+	if (sg != req->dst)
+		memcpy_sglist(req->dst, sg, req->assoclen);
 
 	if (rctx->cryptlen == req->cryptlen) /* encrypting */
-		return poly_copy_tag(req);
+		sg = req->dst;
 
-	return chacha_decrypt(req);
-}
+	poly1305_init(&desc, rctx->key);
+	scatterwalk_start(&walk, sg);
 
-static void poly_tail_done(void *data, int err)
-{
-	async_done_continue(data, err, poly_tail_continue);
-}
+	total = rctx->assoclen;
+	while (total) {
+		unsigned int n = scatterwalk_next(&walk, total);
 
-static int poly_tail(struct aead_request *req)
-{
-	struct crypto_aead *tfm = crypto_aead_reqtfm(req);
-	struct chachapoly_ctx *ctx = crypto_aead_ctx(tfm);
-	struct chachapoly_req_ctx *rctx = aead_request_ctx(req);
-	struct poly_req *preq = &rctx->u.poly;
-	int err;
-
-	preq->tail.assoclen = cpu_to_le64(rctx->assoclen);
-	preq->tail.cryptlen = cpu_to_le64(rctx->cryptlen);
-	sg_init_one(preq->src, &preq->tail, sizeof(preq->tail));
-
-	ahash_request_set_callback(&preq->req, rctx->flags,
-				   poly_tail_done, req);
-	ahash_request_set_tfm(&preq->req, ctx->poly);
-	ahash_request_set_crypt(&preq->req, preq->src,
-				rctx->tag, sizeof(preq->tail));
-
-	err = crypto_ahash_finup(&preq->req);
-	if (err)
-		return err;
-
-	return poly_tail_continue(req);
-}
-
-static void poly_cipherpad_done(void *data, int err)
-{
-	async_done_continue(data, err, poly_tail);
-}
-
-static int poly_cipherpad(struct aead_request *req)
-{
-	struct chachapoly_ctx *ctx = crypto_aead_ctx(crypto_aead_reqtfm(req));
-	struct chachapoly_req_ctx *rctx = aead_request_ctx(req);
-	struct poly_req *preq = &rctx->u.poly;
-	unsigned int padlen;
-	int err;
-
-	padlen = -rctx->cryptlen % POLY1305_BLOCK_SIZE;
-	memset(preq->pad, 0, sizeof(preq->pad));
-	sg_init_one(preq->src, preq->pad, padlen);
-
-	ahash_request_set_callback(&preq->req, rctx->flags,
-				   poly_cipherpad_done, req);
-	ahash_request_set_tfm(&preq->req, ctx->poly);
-	ahash_request_set_crypt(&preq->req, preq->src, NULL, padlen);
-
-	err = crypto_ahash_update(&preq->req);
-	if (err)
-		return err;
-
-	return poly_tail(req);
-}
-
-static void poly_cipher_done(void *data, int err)
-{
-	async_done_continue(data, err, poly_cipherpad);
-}
-
-static int poly_cipher(struct aead_request *req)
-{
-	struct chachapoly_ctx *ctx = crypto_aead_ctx(crypto_aead_reqtfm(req));
-	struct chachapoly_req_ctx *rctx = aead_request_ctx(req);
-	struct poly_req *preq = &rctx->u.poly;
-	struct scatterlist *crypt = req->src;
-	int err;
-
-	if (rctx->cryptlen == req->cryptlen) /* encrypting */
-		crypt = req->dst;
-
-	crypt = scatterwalk_ffwd(rctx->src, crypt, req->assoclen);
-
-	ahash_request_set_callback(&preq->req, rctx->flags,
-				   poly_cipher_done, req);
-	ahash_request_set_tfm(&preq->req, ctx->poly);
-	ahash_request_set_crypt(&preq->req, crypt, NULL, rctx->cryptlen);
-
-	err = crypto_ahash_update(&preq->req);
-	if (err)
-		return err;
-
-	return poly_cipherpad(req);
-}
-
-static void poly_adpad_done(void *data, int err)
-{
-	async_done_continue(data, err, poly_cipher);
-}
-
-static int poly_adpad(struct aead_request *req)
-{
-	struct chachapoly_ctx *ctx = crypto_aead_ctx(crypto_aead_reqtfm(req));
-	struct chachapoly_req_ctx *rctx = aead_request_ctx(req);
-	struct poly_req *preq = &rctx->u.poly;
-	unsigned int padlen;
-	int err;
+		poly1305_update(&desc, walk.addr, n);
+		scatterwalk_done_src(&walk, n);
+		total -= n;
+	}
 
 	padlen = -rctx->assoclen % POLY1305_BLOCK_SIZE;
-	memset(preq->pad, 0, sizeof(preq->pad));
-	sg_init_one(preq->src, preq->pad, padlen);
+	poly1305_update(&desc, zp, padlen);
 
-	ahash_request_set_callback(&preq->req, rctx->flags,
-				   poly_adpad_done, req);
-	ahash_request_set_tfm(&preq->req, ctx->poly);
-	ahash_request_set_crypt(&preq->req, preq->src, NULL, padlen);
+	scatterwalk_skip(&walk, req->assoclen - rctx->assoclen);
 
-	err = crypto_ahash_update(&preq->req);
-	if (err)
-		return err;
+	total = rctx->cryptlen;
+	while (total) {
+		unsigned int n = scatterwalk_next(&walk, total);
 
-	return poly_cipher(req);
-}
+		poly1305_update(&desc, walk.addr, n);
+		scatterwalk_done_src(&walk, n);
+		total -= n;
+	}
 
-static void poly_ad_done(void *data, int err)
-{
-	async_done_continue(data, err, poly_adpad);
-}
+	padlen = -rctx->cryptlen % POLY1305_BLOCK_SIZE;
+	poly1305_update(&desc, zp, padlen);
 
-static int poly_ad(struct aead_request *req)
-{
-	struct chachapoly_ctx *ctx = crypto_aead_ctx(crypto_aead_reqtfm(req));
-	struct chachapoly_req_ctx *rctx = aead_request_ctx(req);
-	struct poly_req *preq = &rctx->u.poly;
-	int err;
+	tail.assoclen = cpu_to_le64(rctx->assoclen);
+	tail.cryptlen = cpu_to_le64(rctx->cryptlen);
+	poly1305_update(&desc, tail.u8, sizeof(tail));
+	memzero_explicit(&tail, sizeof(tail));
+	poly1305_final(&desc, rctx->tag);
 
-	ahash_request_set_callback(&preq->req, rctx->flags,
-				   poly_ad_done, req);
-	ahash_request_set_tfm(&preq->req, ctx->poly);
-	ahash_request_set_crypt(&preq->req, req->src, NULL, rctx->assoclen);
+	if (rctx->cryptlen != req->cryptlen)
+		return chacha_decrypt(req);
 
-	err = crypto_ahash_update(&preq->req);
-	if (err)
-		return err;
-
-	return poly_adpad(req);
-}
-
-static void poly_setkey_done(void *data, int err)
-{
-	async_done_continue(data, err, poly_ad);
-}
-
-static int poly_setkey(struct aead_request *req)
-{
-	struct chachapoly_ctx *ctx = crypto_aead_ctx(crypto_aead_reqtfm(req));
-	struct chachapoly_req_ctx *rctx = aead_request_ctx(req);
-	struct poly_req *preq = &rctx->u.poly;
-	int err;
-
-	sg_init_one(preq->src, rctx->key, sizeof(rctx->key));
-
-	ahash_request_set_callback(&preq->req, rctx->flags,
-				   poly_setkey_done, req);
-	ahash_request_set_tfm(&preq->req, ctx->poly);
-	ahash_request_set_crypt(&preq->req, preq->src, NULL, sizeof(rctx->key));
-
-	err = crypto_ahash_update(&preq->req);
-	if (err)
-		return err;
-
-	return poly_ad(req);
-}
-
-static void poly_init_done(void *data, int err)
-{
-	async_done_continue(data, err, poly_setkey);
-}
-
-static int poly_init(struct aead_request *req)
-{
-	struct chachapoly_ctx *ctx = crypto_aead_ctx(crypto_aead_reqtfm(req));
-	struct chachapoly_req_ctx *rctx = aead_request_ctx(req);
-	struct poly_req *preq = &rctx->u.poly;
-	int err;
-
-	ahash_request_set_callback(&preq->req, rctx->flags,
-				   poly_init_done, req);
-	ahash_request_set_tfm(&preq->req, ctx->poly);
-
-	err = crypto_ahash_init(&preq->req);
-	if (err)
-		return err;
-
-	return poly_setkey(req);
+	memcpy_to_scatterwalk(&walk, rctx->tag, sizeof(rctx->tag));
+	return 0;
 }
 
 static void poly_genkey_done(void *data, int err)
 {
-	async_done_continue(data, err, poly_init);
+	async_done_continue(data, err, poly_hash);
 }
 
 static int poly_genkey(struct aead_request *req)
@@ -388,7 +230,7 @@ static int poly_genkey(struct aead_request *req)
 	if (err)
 		return err;
 
-	return poly_init(req);
+	return poly_hash(req);
 }
 
 static void chacha_encrypt_done(void *data, int err)
@@ -437,14 +279,7 @@ static int chachapoly_encrypt(struct aead_request *req)
 	/* encrypt call chain:
 	 * - chacha_encrypt/done()
 	 * - poly_genkey/done()
-	 * - poly_init/done()
-	 * - poly_setkey/done()
-	 * - poly_ad/done()
-	 * - poly_adpad/done()
-	 * - poly_cipher/done()
-	 * - poly_cipherpad/done()
-	 * - poly_tail/done/continue()
-	 * - poly_copy_tag()
+	 * - poly_hash()
 	 */
 	return chacha_encrypt(req);
 }
@@ -458,13 +293,7 @@ static int chachapoly_decrypt(struct aead_request *req)
 
 	/* decrypt call chain:
 	 * - poly_genkey/done()
-	 * - poly_init/done()
-	 * - poly_setkey/done()
-	 * - poly_ad/done()
-	 * - poly_adpad/done()
-	 * - poly_cipher/done()
-	 * - poly_cipherpad/done()
-	 * - poly_tail/done/continue()
+	 * - poly_hash()
 	 * - chacha_decrypt/done()
 	 * - poly_verify_tag()
 	 */
@@ -503,21 +332,13 @@ static int chachapoly_init(struct crypto_aead *tfm)
 	struct chachapoly_instance_ctx *ictx = aead_instance_ctx(inst);
 	struct chachapoly_ctx *ctx = crypto_aead_ctx(tfm);
 	struct crypto_skcipher *chacha;
-	struct crypto_ahash *poly;
 	unsigned long align;
 
-	poly = crypto_spawn_ahash(&ictx->poly);
-	if (IS_ERR(poly))
-		return PTR_ERR(poly);
-
 	chacha = crypto_spawn_skcipher(&ictx->chacha);
-	if (IS_ERR(chacha)) {
-		crypto_free_ahash(poly);
+	if (IS_ERR(chacha))
 		return PTR_ERR(chacha);
-	}
 
 	ctx->chacha = chacha;
-	ctx->poly = poly;
 	ctx->saltlen = ictx->saltlen;
 
 	align = crypto_aead_alignmask(tfm);
@@ -525,12 +346,9 @@ static int chachapoly_init(struct crypto_aead *tfm)
 	crypto_aead_set_reqsize(
 		tfm,
 		align + offsetof(struct chachapoly_req_ctx, u) +
-		max(offsetof(struct chacha_req, req) +
-		    sizeof(struct skcipher_request) +
-		    crypto_skcipher_reqsize(chacha),
-		    offsetof(struct poly_req, req) +
-		    sizeof(struct ahash_request) +
-		    crypto_ahash_reqsize(poly)));
+		offsetof(struct chacha_req, req) +
+		sizeof(struct skcipher_request) +
+		crypto_skcipher_reqsize(chacha));
 
 	return 0;
 }
@@ -539,7 +357,6 @@ static void chachapoly_exit(struct crypto_aead *tfm)
 {
 	struct chachapoly_ctx *ctx = crypto_aead_ctx(tfm);
 
-	crypto_free_ahash(ctx->poly);
 	crypto_free_skcipher(ctx->chacha);
 }
 
@@ -548,7 +365,6 @@ static void chachapoly_free(struct aead_instance *inst)
 	struct chachapoly_instance_ctx *ctx = aead_instance_ctx(inst);
 
 	crypto_drop_skcipher(&ctx->chacha);
-	crypto_drop_ahash(&ctx->poly);
 	kfree(inst);
 }
 
@@ -559,7 +375,6 @@ static int chachapoly_create(struct crypto_template *tmpl, struct rtattr **tb,
 	struct aead_instance *inst;
 	struct chachapoly_instance_ctx *ctx;
 	struct skcipher_alg_common *chacha;
-	struct hash_alg_common *poly;
 	int err;
 
 	if (ivsize > CHACHAPOLY_IV_SIZE)
@@ -581,14 +396,9 @@ static int chachapoly_create(struct crypto_template *tmpl, struct rtattr **tb,
 		goto err_free_inst;
 	chacha = crypto_spawn_skcipher_alg_common(&ctx->chacha);
 
-	err = crypto_grab_ahash(&ctx->poly, aead_crypto_instance(inst),
-				crypto_attr_alg_name(tb[2]), 0, mask);
-	if (err)
-		goto err_free_inst;
-	poly = crypto_spawn_ahash_alg(&ctx->poly);
-
 	err = -EINVAL;
-	if (poly->digestsize != POLY1305_DIGEST_SIZE)
+	if (strcmp(crypto_attr_alg_name(tb[2]), "poly1305") &&
+	    strcmp(crypto_attr_alg_name(tb[2]), "poly1305-generic"))
 		goto err_free_inst;
 	/* Need 16-byte IV size, including Initial Block Counter value */
 	if (chacha->ivsize != CHACHA_IV_SIZE)
@@ -599,16 +409,15 @@ static int chachapoly_create(struct crypto_template *tmpl, struct rtattr **tb,
 
 	err = -ENAMETOOLONG;
 	if (snprintf(inst->alg.base.cra_name, CRYPTO_MAX_ALG_NAME,
-		     "%s(%s,%s)", name, chacha->base.cra_name,
-		     poly->base.cra_name) >= CRYPTO_MAX_ALG_NAME)
+		     "%s(%s,poly1305)", name,
+		     chacha->base.cra_name) >= CRYPTO_MAX_ALG_NAME)
 		goto err_free_inst;
 	if (snprintf(inst->alg.base.cra_driver_name, CRYPTO_MAX_ALG_NAME,
-		     "%s(%s,%s)", name, chacha->base.cra_driver_name,
-		     poly->base.cra_driver_name) >= CRYPTO_MAX_ALG_NAME)
+		     "%s(%s,poly1305-generic)", name,
+		     chacha->base.cra_driver_name) >= CRYPTO_MAX_ALG_NAME)
 		goto err_free_inst;
 
-	inst->alg.base.cra_priority = (chacha->base.cra_priority +
-				       poly->base.cra_priority) / 2;
+	inst->alg.base.cra_priority = chacha->base.cra_priority;
 	inst->alg.base.cra_blocksize = 1;
 	inst->alg.base.cra_alignmask = chacha->base.cra_alignmask;
 	inst->alg.base.cra_ctxsize = sizeof(struct chachapoly_ctx) +
-- 
2.39.5


^ permalink raw reply related	[flat|nested] 13+ messages in thread

* [v2 PATCH 09/11] crypto: testmgr - Remove poly1305
  2025-04-27  0:59 [v2 PATCH 00/11] crypto: lib - Add partial block helper Herbert Xu
                   ` (7 preceding siblings ...)
  2025-04-27  1:00 ` [v2 PATCH 08/11] crypto: chacha20poly1305 - Use lib/crypto poly1305 Herbert Xu
@ 2025-04-27  1:00 ` Herbert Xu
  2025-04-27  1:00 ` [v2 PATCH 10/11] crypto: poly1305 - Remove algorithm Herbert Xu
  2025-04-27  1:00 ` [v2 PATCH 11/11] crypto: lib/poly1305 - Use block-only interface Herbert Xu
  10 siblings, 0 replies; 13+ messages in thread
From: Herbert Xu @ 2025-04-27  1:00 UTC (permalink / raw)
  To: Linux Crypto Mailing List

As the Crypto API poly1305 no longer has any in-kernel users, remove
its tests.

Signed-off-by: Herbert Xu <herbert@gondor.apana.org.au>
---
 crypto/testmgr.c |   6 -
 crypto/testmgr.h | 288 -----------------------------------------------
 2 files changed, 294 deletions(-)

diff --git a/crypto/testmgr.c b/crypto/testmgr.c
index 82977ea25db3..f100be516f52 100644
--- a/crypto/testmgr.c
+++ b/crypto/testmgr.c
@@ -5406,12 +5406,6 @@ static const struct alg_test_desc alg_test_descs[] = {
 		.alg = "pkcs1pad(rsa)",
 		.test = alg_test_null,
 		.fips_allowed = 1,
-	}, {
-		.alg = "poly1305",
-		.test = alg_test_hash,
-		.suite = {
-			.hash = __VECS(poly1305_tv_template)
-		}
 	}, {
 		.alg = "polyval",
 		.test = alg_test_hash,
diff --git a/crypto/testmgr.h b/crypto/testmgr.h
index afc10af59b0a..32d099ac9e73 100644
--- a/crypto/testmgr.h
+++ b/crypto/testmgr.h
@@ -8836,294 +8836,6 @@ static const struct hash_testvec hmac_sha3_512_tv_template[] = {
 	},
 };
 
-/*
- * Poly1305 test vectors from RFC7539 A.3.
- */
-
-static const struct hash_testvec poly1305_tv_template[] = {
-	{ /* Test Vector #1 */
-		.plaintext	= "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00",
-		.psize		= 96,
-		.digest		= "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00",
-	}, { /* Test Vector #2 */
-		.plaintext	= "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\x36\xe5\xf6\xb5\xc5\xe0\x60\x70"
-				  "\xf0\xef\xca\x96\x22\x7a\x86\x3e"
-				  "\x41\x6e\x79\x20\x73\x75\x62\x6d"
-				  "\x69\x73\x73\x69\x6f\x6e\x20\x74"
-				  "\x6f\x20\x74\x68\x65\x20\x49\x45"
-				  "\x54\x46\x20\x69\x6e\x74\x65\x6e"
-				  "\x64\x65\x64\x20\x62\x79\x20\x74"
-				  "\x68\x65\x20\x43\x6f\x6e\x74\x72"
-				  "\x69\x62\x75\x74\x6f\x72\x20\x66"
-				  "\x6f\x72\x20\x70\x75\x62\x6c\x69"
-				  "\x63\x61\x74\x69\x6f\x6e\x20\x61"
-				  "\x73\x20\x61\x6c\x6c\x20\x6f\x72"
-				  "\x20\x70\x61\x72\x74\x20\x6f\x66"
-				  "\x20\x61\x6e\x20\x49\x45\x54\x46"
-				  "\x20\x49\x6e\x74\x65\x72\x6e\x65"
-				  "\x74\x2d\x44\x72\x61\x66\x74\x20"
-				  "\x6f\x72\x20\x52\x46\x43\x20\x61"
-				  "\x6e\x64\x20\x61\x6e\x79\x20\x73"
-				  "\x74\x61\x74\x65\x6d\x65\x6e\x74"
-				  "\x20\x6d\x61\x64\x65\x20\x77\x69"
-				  "\x74\x68\x69\x6e\x20\x74\x68\x65"
-				  "\x20\x63\x6f\x6e\x74\x65\x78\x74"
-				  "\x20\x6f\x66\x20\x61\x6e\x20\x49"
-				  "\x45\x54\x46\x20\x61\x63\x74\x69"
-				  "\x76\x69\x74\x79\x20\x69\x73\x20"
-				  "\x63\x6f\x6e\x73\x69\x64\x65\x72"
-				  "\x65\x64\x20\x61\x6e\x20\x22\x49"
-				  "\x45\x54\x46\x20\x43\x6f\x6e\x74"
-				  "\x72\x69\x62\x75\x74\x69\x6f\x6e"
-				  "\x22\x2e\x20\x53\x75\x63\x68\x20"
-				  "\x73\x74\x61\x74\x65\x6d\x65\x6e"
-				  "\x74\x73\x20\x69\x6e\x63\x6c\x75"
-				  "\x64\x65\x20\x6f\x72\x61\x6c\x20"
-				  "\x73\x74\x61\x74\x65\x6d\x65\x6e"
-				  "\x74\x73\x20\x69\x6e\x20\x49\x45"
-				  "\x54\x46\x20\x73\x65\x73\x73\x69"
-				  "\x6f\x6e\x73\x2c\x20\x61\x73\x20"
-				  "\x77\x65\x6c\x6c\x20\x61\x73\x20"
-				  "\x77\x72\x69\x74\x74\x65\x6e\x20"
-				  "\x61\x6e\x64\x20\x65\x6c\x65\x63"
-				  "\x74\x72\x6f\x6e\x69\x63\x20\x63"
-				  "\x6f\x6d\x6d\x75\x6e\x69\x63\x61"
-				  "\x74\x69\x6f\x6e\x73\x20\x6d\x61"
-				  "\x64\x65\x20\x61\x74\x20\x61\x6e"
-				  "\x79\x20\x74\x69\x6d\x65\x20\x6f"
-				  "\x72\x20\x70\x6c\x61\x63\x65\x2c"
-				  "\x20\x77\x68\x69\x63\x68\x20\x61"
-				  "\x72\x65\x20\x61\x64\x64\x72\x65"
-				  "\x73\x73\x65\x64\x20\x74\x6f",
-		.psize		= 407,
-		.digest		= "\x36\xe5\xf6\xb5\xc5\xe0\x60\x70"
-				  "\xf0\xef\xca\x96\x22\x7a\x86\x3e",
-	}, { /* Test Vector #3 */
-		.plaintext	= "\x36\xe5\xf6\xb5\xc5\xe0\x60\x70"
-				  "\xf0\xef\xca\x96\x22\x7a\x86\x3e"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\x41\x6e\x79\x20\x73\x75\x62\x6d"
-				  "\x69\x73\x73\x69\x6f\x6e\x20\x74"
-				  "\x6f\x20\x74\x68\x65\x20\x49\x45"
-				  "\x54\x46\x20\x69\x6e\x74\x65\x6e"
-				  "\x64\x65\x64\x20\x62\x79\x20\x74"
-				  "\x68\x65\x20\x43\x6f\x6e\x74\x72"
-				  "\x69\x62\x75\x74\x6f\x72\x20\x66"
-				  "\x6f\x72\x20\x70\x75\x62\x6c\x69"
-				  "\x63\x61\x74\x69\x6f\x6e\x20\x61"
-				  "\x73\x20\x61\x6c\x6c\x20\x6f\x72"
-				  "\x20\x70\x61\x72\x74\x20\x6f\x66"
-				  "\x20\x61\x6e\x20\x49\x45\x54\x46"
-				  "\x20\x49\x6e\x74\x65\x72\x6e\x65"
-				  "\x74\x2d\x44\x72\x61\x66\x74\x20"
-				  "\x6f\x72\x20\x52\x46\x43\x20\x61"
-				  "\x6e\x64\x20\x61\x6e\x79\x20\x73"
-				  "\x74\x61\x74\x65\x6d\x65\x6e\x74"
-				  "\x20\x6d\x61\x64\x65\x20\x77\x69"
-				  "\x74\x68\x69\x6e\x20\x74\x68\x65"
-				  "\x20\x63\x6f\x6e\x74\x65\x78\x74"
-				  "\x20\x6f\x66\x20\x61\x6e\x20\x49"
-				  "\x45\x54\x46\x20\x61\x63\x74\x69"
-				  "\x76\x69\x74\x79\x20\x69\x73\x20"
-				  "\x63\x6f\x6e\x73\x69\x64\x65\x72"
-				  "\x65\x64\x20\x61\x6e\x20\x22\x49"
-				  "\x45\x54\x46\x20\x43\x6f\x6e\x74"
-				  "\x72\x69\x62\x75\x74\x69\x6f\x6e"
-				  "\x22\x2e\x20\x53\x75\x63\x68\x20"
-				  "\x73\x74\x61\x74\x65\x6d\x65\x6e"
-				  "\x74\x73\x20\x69\x6e\x63\x6c\x75"
-				  "\x64\x65\x20\x6f\x72\x61\x6c\x20"
-				  "\x73\x74\x61\x74\x65\x6d\x65\x6e"
-				  "\x74\x73\x20\x69\x6e\x20\x49\x45"
-				  "\x54\x46\x20\x73\x65\x73\x73\x69"
-				  "\x6f\x6e\x73\x2c\x20\x61\x73\x20"
-				  "\x77\x65\x6c\x6c\x20\x61\x73\x20"
-				  "\x77\x72\x69\x74\x74\x65\x6e\x20"
-				  "\x61\x6e\x64\x20\x65\x6c\x65\x63"
-				  "\x74\x72\x6f\x6e\x69\x63\x20\x63"
-				  "\x6f\x6d\x6d\x75\x6e\x69\x63\x61"
-				  "\x74\x69\x6f\x6e\x73\x20\x6d\x61"
-				  "\x64\x65\x20\x61\x74\x20\x61\x6e"
-				  "\x79\x20\x74\x69\x6d\x65\x20\x6f"
-				  "\x72\x20\x70\x6c\x61\x63\x65\x2c"
-				  "\x20\x77\x68\x69\x63\x68\x20\x61"
-				  "\x72\x65\x20\x61\x64\x64\x72\x65"
-				  "\x73\x73\x65\x64\x20\x74\x6f",
-		.psize		= 407,
-		.digest		= "\xf3\x47\x7e\x7c\xd9\x54\x17\xaf"
-				  "\x89\xa6\xb8\x79\x4c\x31\x0c\xf0",
-	}, { /* Test Vector #4 */
-		.plaintext	= "\x1c\x92\x40\xa5\xeb\x55\xd3\x8a"
-				  "\xf3\x33\x88\x86\x04\xf6\xb5\xf0"
-				  "\x47\x39\x17\xc1\x40\x2b\x80\x09"
-				  "\x9d\xca\x5c\xbc\x20\x70\x75\xc0"
-				  "\x27\x54\x77\x61\x73\x20\x62\x72"
-				  "\x69\x6c\x6c\x69\x67\x2c\x20\x61"
-				  "\x6e\x64\x20\x74\x68\x65\x20\x73"
-				  "\x6c\x69\x74\x68\x79\x20\x74\x6f"
-				  "\x76\x65\x73\x0a\x44\x69\x64\x20"
-				  "\x67\x79\x72\x65\x20\x61\x6e\x64"
-				  "\x20\x67\x69\x6d\x62\x6c\x65\x20"
-				  "\x69\x6e\x20\x74\x68\x65\x20\x77"
-				  "\x61\x62\x65\x3a\x0a\x41\x6c\x6c"
-				  "\x20\x6d\x69\x6d\x73\x79\x20\x77"
-				  "\x65\x72\x65\x20\x74\x68\x65\x20"
-				  "\x62\x6f\x72\x6f\x67\x6f\x76\x65"
-				  "\x73\x2c\x0a\x41\x6e\x64\x20\x74"
-				  "\x68\x65\x20\x6d\x6f\x6d\x65\x20"
-				  "\x72\x61\x74\x68\x73\x20\x6f\x75"
-				  "\x74\x67\x72\x61\x62\x65\x2e",
-		.psize		= 159,
-		.digest		= "\x45\x41\x66\x9a\x7e\xaa\xee\x61"
-				  "\xe7\x08\xdc\x7c\xbc\xc5\xeb\x62",
-	}, { /* Test Vector #5 */
-		.plaintext	= "\x02\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff",
-		.psize		= 48,
-		.digest		= "\x03\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00",
-	}, { /* Test Vector #6 */
-		.plaintext	= "\x02\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\x02\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00",
-		.psize		= 48,
-		.digest		= "\x03\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00",
-	}, { /* Test Vector #7 */
-		.plaintext	= "\x01\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xf0\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\x11\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00",
-		.psize		= 80,
-		.digest		= "\x05\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00",
-	}, { /* Test Vector #8 */
-		.plaintext	= "\x01\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xfb\xfe\xfe\xfe\xfe\xfe\xfe\xfe"
-				  "\xfe\xfe\xfe\xfe\xfe\xfe\xfe\xfe"
-				  "\x01\x01\x01\x01\x01\x01\x01\x01"
-				  "\x01\x01\x01\x01\x01\x01\x01\x01",
-		.psize		= 80,
-		.digest		= "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00",
-	}, { /* Test Vector #9 */
-		.plaintext	= "\x02\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\xfd\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff",
-		.psize		= 48,
-		.digest		= "\xfa\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff",
-	}, { /* Test Vector #10 */
-		.plaintext	= "\x01\x00\x00\x00\x00\x00\x00\x00"
-				  "\x04\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\xe3\x35\x94\xd7\x50\x5e\x43\xb9"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\x33\x94\xd7\x50\x5e\x43\x79\xcd"
-				  "\x01\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\x01\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00",
-		.psize		= 96,
-		.digest		= "\x14\x00\x00\x00\x00\x00\x00\x00"
-				  "\x55\x00\x00\x00\x00\x00\x00\x00",
-	}, { /* Test Vector #11 */
-		.plaintext	= "\x01\x00\x00\x00\x00\x00\x00\x00"
-				  "\x04\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\xe3\x35\x94\xd7\x50\x5e\x43\xb9"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\x33\x94\xd7\x50\x5e\x43\x79\xcd"
-				  "\x01\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00",
-		.psize		= 80,
-		.digest		= "\x13\x00\x00\x00\x00\x00\x00\x00"
-				  "\x00\x00\x00\x00\x00\x00\x00\x00",
-	}, { /* Regression test for overflow in AVX2 implementation */
-		.plaintext	= "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff\xff\xff\xff\xff"
-				  "\xff\xff\xff\xff",
-		.psize		= 300,
-		.digest		= "\xfb\x5e\x96\xd8\x61\xd5\xc7\xc8"
-				  "\x78\xe5\x87\xcc\x2d\x5a\x22\xe1",
-	}
-};
-
 /* NHPoly1305 test vectors from https://github.com/google/adiantum */
 static const struct hash_testvec nhpoly1305_tv_template[] = {
 	{
-- 
2.39.5


^ permalink raw reply related	[flat|nested] 13+ messages in thread

* [v2 PATCH 10/11] crypto: poly1305 - Remove algorithm
  2025-04-27  0:59 [v2 PATCH 00/11] crypto: lib - Add partial block helper Herbert Xu
                   ` (8 preceding siblings ...)
  2025-04-27  1:00 ` [v2 PATCH 09/11] crypto: testmgr - Remove poly1305 Herbert Xu
@ 2025-04-27  1:00 ` Herbert Xu
  2025-04-27  1:00 ` [v2 PATCH 11/11] crypto: lib/poly1305 - Use block-only interface Herbert Xu
  10 siblings, 0 replies; 13+ messages in thread
From: Herbert Xu @ 2025-04-27  1:00 UTC (permalink / raw)
  To: Linux Crypto Mailing List

As there are no in-kernel users of the Crypto API poly1305 left,
remove it.

Signed-off-by: Herbert Xu <herbert@gondor.apana.org.au>
---
 crypto/Kconfig    |  12 ----
 crypto/Makefile   |   2 -
 crypto/poly1305.c | 152 ----------------------------------------------
 3 files changed, 166 deletions(-)
 delete mode 100644 crypto/poly1305.c

diff --git a/crypto/Kconfig b/crypto/Kconfig
index f87e2a26d2dd..3cb5563dc4ab 100644
--- a/crypto/Kconfig
+++ b/crypto/Kconfig
@@ -953,18 +953,6 @@ config CRYPTO_POLYVAL
 	  This is used in HCTR2.  It is not a general-purpose
 	  cryptographic hash function.
 
-config CRYPTO_POLY1305
-	tristate "Poly1305"
-	select CRYPTO_HASH
-	select CRYPTO_LIB_POLY1305
-	select CRYPTO_LIB_POLY1305_GENERIC
-	help
-	  Poly1305 authenticator algorithm (RFC7539)
-
-	  Poly1305 is an authenticator algorithm designed by Daniel J. Bernstein.
-	  It is used for the ChaCha20-Poly1305 AEAD, specified in RFC7539 for use
-	  in IETF protocols. This is the portable C implementation of Poly1305.
-
 config CRYPTO_RMD160
 	tristate "RIPEMD-160"
 	select CRYPTO_HASH
diff --git a/crypto/Makefile b/crypto/Makefile
index 5d2f2a28d8a0..587bc74b6d74 100644
--- a/crypto/Makefile
+++ b/crypto/Makefile
@@ -149,8 +149,6 @@ obj-$(CONFIG_CRYPTO_SEED) += seed.o
 obj-$(CONFIG_CRYPTO_ARIA) += aria_generic.o
 obj-$(CONFIG_CRYPTO_CHACHA20) += chacha.o
 CFLAGS_chacha.o += -DARCH=$(ARCH)
-obj-$(CONFIG_CRYPTO_POLY1305) += poly1305.o
-CFLAGS_poly1305.o += -DARCH=$(ARCH)
 obj-$(CONFIG_CRYPTO_DEFLATE) += deflate.o
 obj-$(CONFIG_CRYPTO_MICHAEL_MIC) += michael_mic.o
 obj-$(CONFIG_CRYPTO_CRC32C) += crc32c_generic.o
diff --git a/crypto/poly1305.c b/crypto/poly1305.c
deleted file mode 100644
index e0436bdc462b..000000000000
--- a/crypto/poly1305.c
+++ /dev/null
@@ -1,152 +0,0 @@
-/*
- * Crypto API wrapper for the Poly1305 library functions
- *
- * Copyright (C) 2015 Martin Willi
- *
- * This program is free software; you can redistribute it and/or modify
- * it under the terms of the GNU General Public License as published by
- * the Free Software Foundation; either version 2 of the License, or
- * (at your option) any later version.
- */
-
-#include <crypto/algapi.h>
-#include <crypto/internal/hash.h>
-#include <crypto/internal/poly1305.h>
-#include <linux/crypto.h>
-#include <linux/kernel.h>
-#include <linux/module.h>
-
-struct crypto_poly1305_desc_ctx {
-	struct poly1305_desc_ctx base;
-	u8 key[POLY1305_KEY_SIZE];
-	unsigned int keysize;
-};
-
-static int crypto_poly1305_init(struct shash_desc *desc)
-{
-	struct crypto_poly1305_desc_ctx *dctx = shash_desc_ctx(desc);
-
-	dctx->keysize = 0;
-	return 0;
-}
-
-static int crypto_poly1305_update(struct shash_desc *desc,
-				  const u8 *src, unsigned int srclen, bool arch)
-{
-	struct crypto_poly1305_desc_ctx *dctx = shash_desc_ctx(desc);
-	unsigned int bytes;
-
-	/*
-	 * The key is passed as the first 32 "data" bytes.  The actual
-	 * poly1305_init() can be called only once the full key is available.
-	 */
-	if (dctx->keysize < POLY1305_KEY_SIZE) {
-		bytes = min(srclen, POLY1305_KEY_SIZE - dctx->keysize);
-		memcpy(&dctx->key[dctx->keysize], src, bytes);
-		dctx->keysize += bytes;
-		if (dctx->keysize < POLY1305_KEY_SIZE)
-			return 0;
-		if (arch)
-			poly1305_init(&dctx->base, dctx->key);
-		else
-			poly1305_init_generic(&dctx->base, dctx->key);
-		src += bytes;
-		srclen -= bytes;
-	}
-
-	if (arch)
-		poly1305_update(&dctx->base, src, srclen);
-	else
-		poly1305_update_generic(&dctx->base, src, srclen);
-
-	return 0;
-}
-
-static int crypto_poly1305_update_generic(struct shash_desc *desc,
-					  const u8 *src, unsigned int srclen)
-{
-	return crypto_poly1305_update(desc, src, srclen, false);
-}
-
-static int crypto_poly1305_update_arch(struct shash_desc *desc,
-				       const u8 *src, unsigned int srclen)
-{
-	return crypto_poly1305_update(desc, src, srclen, true);
-}
-
-static int crypto_poly1305_final(struct shash_desc *desc, u8 *dst, bool arch)
-{
-	struct crypto_poly1305_desc_ctx *dctx = shash_desc_ctx(desc);
-
-	if (unlikely(dctx->keysize != POLY1305_KEY_SIZE))
-		return -ENOKEY;
-
-	if (arch)
-		poly1305_final(&dctx->base, dst);
-	else
-		poly1305_final_generic(&dctx->base, dst);
-	memzero_explicit(&dctx->key, sizeof(dctx->key));
-	return 0;
-}
-
-static int crypto_poly1305_final_generic(struct shash_desc *desc, u8 *dst)
-{
-	return crypto_poly1305_final(desc, dst, false);
-}
-
-static int crypto_poly1305_final_arch(struct shash_desc *desc, u8 *dst)
-{
-	return crypto_poly1305_final(desc, dst, true);
-}
-
-static struct shash_alg poly1305_algs[] = {
-	{
-		.base.cra_name		= "poly1305",
-		.base.cra_driver_name	= "poly1305-generic",
-		.base.cra_priority	= 100,
-		.base.cra_blocksize	= POLY1305_BLOCK_SIZE,
-		.base.cra_module	= THIS_MODULE,
-		.digestsize		= POLY1305_DIGEST_SIZE,
-		.init			= crypto_poly1305_init,
-		.update			= crypto_poly1305_update_generic,
-		.final			= crypto_poly1305_final_generic,
-		.descsize		= sizeof(struct crypto_poly1305_desc_ctx),
-	},
-	{
-		.base.cra_name		= "poly1305",
-		.base.cra_driver_name	= "poly1305-" __stringify(ARCH),
-		.base.cra_priority	= 300,
-		.base.cra_blocksize	= POLY1305_BLOCK_SIZE,
-		.base.cra_module	= THIS_MODULE,
-		.digestsize		= POLY1305_DIGEST_SIZE,
-		.init			= crypto_poly1305_init,
-		.update			= crypto_poly1305_update_arch,
-		.final			= crypto_poly1305_final_arch,
-		.descsize		= sizeof(struct crypto_poly1305_desc_ctx),
-	},
-};
-
-static int num_algs;
-
-static int __init poly1305_mod_init(void)
-{
-	/* register the arch flavours only if they differ from generic */
-	num_algs = poly1305_is_arch_optimized() ? 2 : 1;
-
-	return crypto_register_shashes(poly1305_algs, num_algs);
-}
-
-static void __exit poly1305_mod_exit(void)
-{
-	crypto_unregister_shashes(poly1305_algs, num_algs);
-}
-
-subsys_initcall(poly1305_mod_init);
-module_exit(poly1305_mod_exit);
-
-MODULE_LICENSE("GPL");
-MODULE_AUTHOR("Martin Willi <martin@strongswan.org>");
-MODULE_DESCRIPTION("Crypto API wrapper for the Poly1305 library functions");
-MODULE_ALIAS_CRYPTO("poly1305");
-MODULE_ALIAS_CRYPTO("poly1305-generic");
-MODULE_ALIAS_CRYPTO("poly1305-" __stringify(ARCH));
-- 
2.39.5


^ permalink raw reply related	[flat|nested] 13+ messages in thread

* [v2 PATCH 11/11] crypto: lib/poly1305 - Use block-only interface
  2025-04-27  0:59 [v2 PATCH 00/11] crypto: lib - Add partial block helper Herbert Xu
                   ` (9 preceding siblings ...)
  2025-04-27  1:00 ` [v2 PATCH 10/11] crypto: poly1305 - Remove algorithm Herbert Xu
@ 2025-04-27  1:00 ` Herbert Xu
  10 siblings, 0 replies; 13+ messages in thread
From: Herbert Xu @ 2025-04-27  1:00 UTC (permalink / raw)
  To: Linux Crypto Mailing List

Now that every architecture provides a block function, use it to
implement lib/poly1305 and remove the old per-arch code.
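
For reference, a minimal usage sketch of the resulting library
interface; poly1305_mac_demo is a hypothetical caller, while
poly1305_init/update/final and the POLY1305_* constants are the ones
exported here:

	#include <crypto/poly1305.h>

	/* One-shot MAC over a contiguous buffer; Poly1305 requires the
	 * key to be used for a single message only. */
	static void poly1305_mac_demo(const u8 key[POLY1305_KEY_SIZE],
				      const u8 *msg, unsigned int len,
				      u8 mac[POLY1305_DIGEST_SIZE])
	{
		struct poly1305_desc_ctx desc;

		poly1305_init(&desc, key);        /* arch or generic block init */
		poly1305_update(&desc, msg, len); /* partial blocks buffered in desc */
		poly1305_final(&desc, mac);       /* emits the tag and wipes desc */
	}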

Signed-off-by: Herbert Xu <herbert@gondor.apana.org.au>
---
 arch/arm/lib/crypto/poly1305-glue.c         | 57 -------------------
 arch/arm64/lib/crypto/poly1305-glue.c       | 58 -------------------
 arch/mips/lib/crypto/poly1305-glue.c        | 60 --------------------
 arch/powerpc/lib/crypto/poly1305-p10-glue.c | 63 ---------------------
 arch/x86/lib/crypto/poly1305_glue.c         | 60 --------------------
 include/crypto/poly1305.h                   | 53 ++---------------
 lib/crypto/poly1305.c                       | 39 ++++++++-----
 7 files changed, 32 insertions(+), 358 deletions(-)

diff --git a/arch/arm/lib/crypto/poly1305-glue.c b/arch/arm/lib/crypto/poly1305-glue.c
index 3ee16048ec7c..91da42b26d9c 100644
--- a/arch/arm/lib/crypto/poly1305-glue.c
+++ b/arch/arm/lib/crypto/poly1305-glue.c
@@ -12,7 +12,6 @@
 #include <linux/jump_label.h>
 #include <linux/kernel.h>
 #include <linux/module.h>
-#include <linux/string.h>
 #include <linux/unaligned.h>
 
 asmlinkage void poly1305_block_init_arch(
@@ -35,17 +34,6 @@ void __weak poly1305_blocks_neon(struct poly1305_block_state *state,
 
 static __ro_after_init DEFINE_STATIC_KEY_FALSE(have_neon);
 
-void poly1305_init_arch(struct poly1305_desc_ctx *dctx, const u8 key[POLY1305_KEY_SIZE])
-{
-	dctx->s[0] = get_unaligned_le32(key + 16);
-	dctx->s[1] = get_unaligned_le32(key + 20);
-	dctx->s[2] = get_unaligned_le32(key + 24);
-	dctx->s[3] = get_unaligned_le32(key + 28);
-	dctx->buflen = 0;
-	poly1305_block_init_arch(&dctx->state, key);
-}
-EXPORT_SYMBOL(poly1305_init_arch);
-
 void poly1305_blocks_arch(struct poly1305_block_state *state, const u8 *src,
 			  unsigned int len, u32 padbit)
 {
@@ -67,51 +55,6 @@ void poly1305_blocks_arch(struct poly1305_block_state *state, const u8 *src,
 }
 EXPORT_SYMBOL_GPL(poly1305_blocks_arch);
 
-void poly1305_update_arch(struct poly1305_desc_ctx *dctx, const u8 *src,
-			  unsigned int nbytes)
-{
-	if (unlikely(dctx->buflen)) {
-		u32 bytes = min(nbytes, POLY1305_BLOCK_SIZE - dctx->buflen);
-
-		memcpy(dctx->buf + dctx->buflen, src, bytes);
-		src += bytes;
-		nbytes -= bytes;
-		dctx->buflen += bytes;
-
-		if (dctx->buflen == POLY1305_BLOCK_SIZE) {
-			poly1305_blocks_arch(&dctx->state, dctx->buf,
-					     POLY1305_BLOCK_SIZE, 1);
-			dctx->buflen = 0;
-		}
-	}
-
-	if (likely(nbytes >= POLY1305_BLOCK_SIZE)) {
-		poly1305_blocks_arch(&dctx->state, src, nbytes, 1);
-		src += round_down(nbytes, POLY1305_BLOCK_SIZE);
-		nbytes %= POLY1305_BLOCK_SIZE;
-	}
-
-	if (unlikely(nbytes)) {
-		dctx->buflen = nbytes;
-		memcpy(dctx->buf, src, nbytes);
-	}
-}
-EXPORT_SYMBOL(poly1305_update_arch);
-
-void poly1305_final_arch(struct poly1305_desc_ctx *dctx, u8 *dst)
-{
-	if (unlikely(dctx->buflen)) {
-		dctx->buf[dctx->buflen++] = 1;
-		memset(dctx->buf + dctx->buflen, 0,
-		       POLY1305_BLOCK_SIZE - dctx->buflen);
-		poly1305_blocks_arch(&dctx->state, dctx->buf, POLY1305_BLOCK_SIZE, 0);
-	}
-
-	poly1305_emit_arch(&dctx->h, dst, dctx->s);
-	*dctx = (struct poly1305_desc_ctx){};
-}
-EXPORT_SYMBOL(poly1305_final_arch);
-
 bool poly1305_is_arch_optimized(void)
 {
 	/* We always can use at least the ARM scalar implementation. */
diff --git a/arch/arm64/lib/crypto/poly1305-glue.c b/arch/arm64/lib/crypto/poly1305-glue.c
index d66a820e32d5..681c26557336 100644
--- a/arch/arm64/lib/crypto/poly1305-glue.c
+++ b/arch/arm64/lib/crypto/poly1305-glue.c
@@ -12,7 +12,6 @@
 #include <linux/jump_label.h>
 #include <linux/kernel.h>
 #include <linux/module.h>
-#include <linux/string.h>
 #include <linux/unaligned.h>
 
 asmlinkage void poly1305_block_init_arch(
@@ -30,17 +29,6 @@ EXPORT_SYMBOL_GPL(poly1305_emit_arch);
 
 static __ro_after_init DEFINE_STATIC_KEY_FALSE(have_neon);
 
-void poly1305_init_arch(struct poly1305_desc_ctx *dctx, const u8 key[POLY1305_KEY_SIZE])
-{
-	dctx->s[0] = get_unaligned_le32(key + 16);
-	dctx->s[1] = get_unaligned_le32(key + 20);
-	dctx->s[2] = get_unaligned_le32(key + 24);
-	dctx->s[3] = get_unaligned_le32(key + 28);
-	dctx->buflen = 0;
-	poly1305_block_init_arch(&dctx->state, key);
-}
-EXPORT_SYMBOL(poly1305_init_arch);
-
 void poly1305_blocks_arch(struct poly1305_block_state *state, const u8 *src,
 			  unsigned int len, u32 padbit)
 {
@@ -61,52 +49,6 @@ void poly1305_blocks_arch(struct poly1305_block_state *state, const u8 *src,
 }
 EXPORT_SYMBOL_GPL(poly1305_blocks_arch);
 
-void poly1305_update_arch(struct poly1305_desc_ctx *dctx, const u8 *src,
-			  unsigned int nbytes)
-{
-	if (unlikely(dctx->buflen)) {
-		u32 bytes = min(nbytes, POLY1305_BLOCK_SIZE - dctx->buflen);
-
-		memcpy(dctx->buf + dctx->buflen, src, bytes);
-		src += bytes;
-		nbytes -= bytes;
-		dctx->buflen += bytes;
-
-		if (dctx->buflen == POLY1305_BLOCK_SIZE) {
-			poly1305_blocks_arch(&dctx->state, dctx->buf,
-					     POLY1305_BLOCK_SIZE, 1);
-			dctx->buflen = 0;
-		}
-	}
-
-	if (likely(nbytes >= POLY1305_BLOCK_SIZE)) {
-		poly1305_blocks_arch(&dctx->state, src, nbytes, 1);
-		src += round_down(nbytes, POLY1305_BLOCK_SIZE);
-		nbytes %= POLY1305_BLOCK_SIZE;
-	}
-
-	if (unlikely(nbytes)) {
-		dctx->buflen = nbytes;
-		memcpy(dctx->buf, src, nbytes);
-	}
-}
-EXPORT_SYMBOL(poly1305_update_arch);
-
-void poly1305_final_arch(struct poly1305_desc_ctx *dctx, u8 *dst)
-{
-	if (unlikely(dctx->buflen)) {
-		dctx->buf[dctx->buflen++] = 1;
-		memset(dctx->buf + dctx->buflen, 0,
-		       POLY1305_BLOCK_SIZE - dctx->buflen);
-		poly1305_blocks_arch(&dctx->state, dctx->buf,
-				     POLY1305_BLOCK_SIZE, 0);
-	}
-
-	poly1305_emit_arch(&dctx->h, dst, dctx->s);
-	memzero_explicit(dctx, sizeof(*dctx));
-}
-EXPORT_SYMBOL(poly1305_final_arch);
-
 bool poly1305_is_arch_optimized(void)
 {
 	/* We always can use at least the ARM64 scalar implementation. */
diff --git a/arch/mips/lib/crypto/poly1305-glue.c b/arch/mips/lib/crypto/poly1305-glue.c
index 2fea4cacfe27..764a38a65200 100644
--- a/arch/mips/lib/crypto/poly1305-glue.c
+++ b/arch/mips/lib/crypto/poly1305-glue.c
@@ -9,7 +9,6 @@
 #include <linux/cpufeature.h>
 #include <linux/kernel.h>
 #include <linux/module.h>
-#include <linux/string.h>
 #include <linux/unaligned.h>
 
 asmlinkage void poly1305_block_init_arch(
@@ -24,65 +23,6 @@ asmlinkage void poly1305_emit_arch(const struct poly1305_state *state,
 				   const u32 nonce[4]);
 EXPORT_SYMBOL_GPL(poly1305_emit_arch);
 
-void poly1305_init_arch(struct poly1305_desc_ctx *dctx, const u8 key[POLY1305_KEY_SIZE])
-{
-	dctx->s[0] = get_unaligned_le32(key + 16);
-	dctx->s[1] = get_unaligned_le32(key + 20);
-	dctx->s[2] = get_unaligned_le32(key + 24);
-	dctx->s[3] = get_unaligned_le32(key + 28);
-	dctx->buflen = 0;
-	poly1305_block_init_arch(&dctx->state, key);
-}
-EXPORT_SYMBOL(poly1305_init_arch);
-
-void poly1305_update_arch(struct poly1305_desc_ctx *dctx, const u8 *src,
-			  unsigned int nbytes)
-{
-	if (unlikely(dctx->buflen)) {
-		u32 bytes = min(nbytes, POLY1305_BLOCK_SIZE - dctx->buflen);
-
-		memcpy(dctx->buf + dctx->buflen, src, bytes);
-		src += bytes;
-		nbytes -= bytes;
-		dctx->buflen += bytes;
-
-		if (dctx->buflen == POLY1305_BLOCK_SIZE) {
-			poly1305_blocks_arch(&dctx->state, dctx->buf,
-					     POLY1305_BLOCK_SIZE, 1);
-			dctx->buflen = 0;
-		}
-	}
-
-	if (likely(nbytes >= POLY1305_BLOCK_SIZE)) {
-		unsigned int len = round_down(nbytes, POLY1305_BLOCK_SIZE);
-
-		poly1305_blocks_arch(&dctx->state, src, len, 1);
-		src += len;
-		nbytes %= POLY1305_BLOCK_SIZE;
-	}
-
-	if (unlikely(nbytes)) {
-		dctx->buflen = nbytes;
-		memcpy(dctx->buf, src, nbytes);
-	}
-}
-EXPORT_SYMBOL(poly1305_update_arch);
-
-void poly1305_final_arch(struct poly1305_desc_ctx *dctx, u8 *dst)
-{
-	if (unlikely(dctx->buflen)) {
-		dctx->buf[dctx->buflen++] = 1;
-		memset(dctx->buf + dctx->buflen, 0,
-		       POLY1305_BLOCK_SIZE - dctx->buflen);
-		poly1305_blocks_arch(&dctx->state, dctx->buf,
-				     POLY1305_BLOCK_SIZE, 0);
-	}
-
-	poly1305_emit_arch(&dctx->h, dst, dctx->s);
-	*dctx = (struct poly1305_desc_ctx){};
-}
-EXPORT_SYMBOL(poly1305_final_arch);
-
 bool poly1305_is_arch_optimized(void)
 {
 	return true;
diff --git a/arch/powerpc/lib/crypto/poly1305-p10-glue.c b/arch/powerpc/lib/crypto/poly1305-p10-glue.c
index 708435beaba6..50ac802220e0 100644
--- a/arch/powerpc/lib/crypto/poly1305-p10-glue.c
+++ b/arch/powerpc/lib/crypto/poly1305-p10-glue.c
@@ -10,7 +10,6 @@
 #include <linux/jump_label.h>
 #include <linux/kernel.h>
 #include <linux/module.h>
-#include <linux/string.h>
 #include <linux/unaligned.h>
 
 asmlinkage void poly1305_p10le_4blocks(struct poly1305_block_state *state, const u8 *m, u32 mlen);
@@ -45,17 +44,6 @@ void poly1305_block_init_arch(struct poly1305_block_state *dctx,
 }
 EXPORT_SYMBOL_GPL(poly1305_block_init_arch);
 
-void poly1305_init_arch(struct poly1305_desc_ctx *dctx, const u8 key[POLY1305_KEY_SIZE])
-{
-	dctx->s[0] = get_unaligned_le32(key + 16);
-	dctx->s[1] = get_unaligned_le32(key + 20);
-	dctx->s[2] = get_unaligned_le32(key + 24);
-	dctx->s[3] = get_unaligned_le32(key + 28);
-	dctx->buflen = 0;
-	poly1305_block_init_arch(&dctx->state, key);
-}
-EXPORT_SYMBOL(poly1305_init_arch);
-
 void poly1305_blocks_arch(struct poly1305_block_state *state, const u8 *src,
 			  unsigned int len, u32 padbit)
 {
@@ -76,57 +64,6 @@ void poly1305_blocks_arch(struct poly1305_block_state *state, const u8 *src,
 }
 EXPORT_SYMBOL_GPL(poly1305_blocks_arch);
 
-void poly1305_update_arch(struct poly1305_desc_ctx *dctx,
-			  const u8 *src, unsigned int srclen)
-{
-	unsigned int bytes;
-
-	if (!static_key_enabled(&have_p10))
-		return poly1305_update_generic(dctx, src, srclen);
-
-	if (unlikely(dctx->buflen)) {
-		bytes = min(srclen, POLY1305_BLOCK_SIZE - dctx->buflen);
-		memcpy(dctx->buf + dctx->buflen, src, bytes);
-		src += bytes;
-		srclen -= bytes;
-		dctx->buflen += bytes;
-		if (dctx->buflen < POLY1305_BLOCK_SIZE)
-			return;
-		poly1305_blocks_arch(&dctx->state, dctx->buf,
-				     POLY1305_BLOCK_SIZE, 1);
-		dctx->buflen = 0;
-	}
-
-	if (likely(srclen >= POLY1305_BLOCK_SIZE)) {
-		poly1305_blocks_arch(&dctx->state, src, srclen, 1);
-		src += srclen - (srclen % POLY1305_BLOCK_SIZE);
-		srclen %= POLY1305_BLOCK_SIZE;
-	}
-
-	if (unlikely(srclen)) {
-		dctx->buflen = srclen;
-		memcpy(dctx->buf, src, srclen);
-	}
-}
-EXPORT_SYMBOL(poly1305_update_arch);
-
-void poly1305_final_arch(struct poly1305_desc_ctx *dctx, u8 *dst)
-{
-	if (!static_key_enabled(&have_p10))
-		return poly1305_final_generic(dctx, dst);
-
-	if (dctx->buflen) {
-		dctx->buf[dctx->buflen++] = 1;
-		memset(dctx->buf + dctx->buflen, 0,
-		       POLY1305_BLOCK_SIZE - dctx->buflen);
-		poly1305_blocks_arch(&dctx->state, dctx->buf,
-				     POLY1305_BLOCK_SIZE, 0);
-	}
-
-	poly1305_emit_arch(&dctx->h, dst, dctx->s);
-}
-EXPORT_SYMBOL(poly1305_final_arch);
-
 bool poly1305_is_arch_optimized(void)
 {
 	return static_key_enabled(&have_p10);
diff --git a/arch/x86/lib/crypto/poly1305_glue.c b/arch/x86/lib/crypto/poly1305_glue.c
index d98764ec3b47..f799828c5809 100644
--- a/arch/x86/lib/crypto/poly1305_glue.c
+++ b/arch/x86/lib/crypto/poly1305_glue.c
@@ -10,7 +10,6 @@
 #include <linux/kernel.h>
 #include <linux/module.h>
 #include <linux/sizes.h>
-#include <linux/string.h>
 #include <linux/unaligned.h>
 
 struct poly1305_arch_internal {
@@ -96,65 +95,6 @@ void poly1305_emit_arch(const struct poly1305_state *ctx,
 }
 EXPORT_SYMBOL_GPL(poly1305_emit_arch);
 
-void poly1305_init_arch(struct poly1305_desc_ctx *dctx, const u8 key[POLY1305_KEY_SIZE])
-{
-	dctx->s[0] = get_unaligned_le32(&key[16]);
-	dctx->s[1] = get_unaligned_le32(&key[20]);
-	dctx->s[2] = get_unaligned_le32(&key[24]);
-	dctx->s[3] = get_unaligned_le32(&key[28]);
-	dctx->buflen = 0;
-	poly1305_block_init_arch(&dctx->state, key);
-}
-EXPORT_SYMBOL(poly1305_init_arch);
-
-void poly1305_update_arch(struct poly1305_desc_ctx *dctx, const u8 *src,
-			  unsigned int srclen)
-{
-	unsigned int bytes;
-
-	if (unlikely(dctx->buflen)) {
-		bytes = min(srclen, POLY1305_BLOCK_SIZE - dctx->buflen);
-		memcpy(dctx->buf + dctx->buflen, src, bytes);
-		src += bytes;
-		srclen -= bytes;
-		dctx->buflen += bytes;
-
-		if (dctx->buflen == POLY1305_BLOCK_SIZE) {
-			poly1305_blocks_arch(&dctx->state, dctx->buf,
-					     POLY1305_BLOCK_SIZE, 1);
-			dctx->buflen = 0;
-		}
-	}
-
-	if (likely(srclen >= POLY1305_BLOCK_SIZE)) {
-		bytes = round_down(srclen, POLY1305_BLOCK_SIZE);
-		poly1305_blocks_arch(&dctx->state, src, bytes, 1);
-		src += bytes;
-		srclen -= bytes;
-	}
-
-	if (unlikely(srclen)) {
-		dctx->buflen = srclen;
-		memcpy(dctx->buf, src, srclen);
-	}
-}
-EXPORT_SYMBOL(poly1305_update_arch);
-
-void poly1305_final_arch(struct poly1305_desc_ctx *dctx, u8 *dst)
-{
-	if (unlikely(dctx->buflen)) {
-		dctx->buf[dctx->buflen++] = 1;
-		memset(dctx->buf + dctx->buflen, 0,
-		       POLY1305_BLOCK_SIZE - dctx->buflen);
-		poly1305_blocks_arch(&dctx->state, dctx->buf,
-				     POLY1305_BLOCK_SIZE, 0);
-	}
-
-	poly1305_emit_arch(&dctx->h, dst, dctx->s);
-	memzero_explicit(dctx, sizeof(*dctx));
-}
-EXPORT_SYMBOL(poly1305_final_arch);
-
 bool poly1305_is_arch_optimized(void)
 {
 	return static_key_enabled(&poly1305_use_avx);
diff --git a/include/crypto/poly1305.h b/include/crypto/poly1305.h
index 027d74842cd5..e54abda8cfe9 100644
--- a/include/crypto/poly1305.h
+++ b/include/crypto/poly1305.h
@@ -55,55 +55,14 @@ struct poly1305_desc_ctx {
 	unsigned int buflen;
 	/* finalize key */
 	u32 s[4];
-	union {
-		struct {
-			struct poly1305_state h;
-			union {
-				struct poly1305_key opaque_r[CONFIG_CRYPTO_LIB_POLY1305_RSIZE];
-				struct poly1305_core_key core_r;
-			};
-		};
-		struct poly1305_block_state state;
-	};
+	struct poly1305_block_state state;
 };
 
-void poly1305_init_arch(struct poly1305_desc_ctx *desc,
-			const u8 key[POLY1305_KEY_SIZE]);
-void poly1305_init_generic(struct poly1305_desc_ctx *desc,
-			   const u8 key[POLY1305_KEY_SIZE]);
-
-static inline void poly1305_init(struct poly1305_desc_ctx *desc, const u8 *key)
-{
-	if (IS_ENABLED(CONFIG_CRYPTO_ARCH_HAVE_LIB_POLY1305))
-		poly1305_init_arch(desc, key);
-	else
-		poly1305_init_generic(desc, key);
-}
-
-void poly1305_update_arch(struct poly1305_desc_ctx *desc, const u8 *src,
-			  unsigned int nbytes);
-void poly1305_update_generic(struct poly1305_desc_ctx *desc, const u8 *src,
-			     unsigned int nbytes);
-
-static inline void poly1305_update(struct poly1305_desc_ctx *desc,
-				   const u8 *src, unsigned int nbytes)
-{
-	if (IS_ENABLED(CONFIG_CRYPTO_ARCH_HAVE_LIB_POLY1305))
-		poly1305_update_arch(desc, src, nbytes);
-	else
-		poly1305_update_generic(desc, src, nbytes);
-}
-
-void poly1305_final_arch(struct poly1305_desc_ctx *desc, u8 *digest);
-void poly1305_final_generic(struct poly1305_desc_ctx *desc, u8 *digest);
-
-static inline void poly1305_final(struct poly1305_desc_ctx *desc, u8 *digest)
-{
-	if (IS_ENABLED(CONFIG_CRYPTO_ARCH_HAVE_LIB_POLY1305))
-		poly1305_final_arch(desc, digest);
-	else
-		poly1305_final_generic(desc, digest);
-}
+void poly1305_init(struct poly1305_desc_ctx *desc,
+		   const u8 key[POLY1305_KEY_SIZE]);
+void poly1305_update(struct poly1305_desc_ctx *desc,
+		     const u8 *src, unsigned int nbytes);
+void poly1305_final(struct poly1305_desc_ctx *desc, u8 *digest);
 
 #if IS_ENABLED(CONFIG_CRYPTO_ARCH_HAVE_LIB_POLY1305)
 bool poly1305_is_arch_optimized(void);
diff --git a/lib/crypto/poly1305.c b/lib/crypto/poly1305.c
index f35692376acf..e2bcce281330 100644
--- a/lib/crypto/poly1305.c
+++ b/lib/crypto/poly1305.c
@@ -22,47 +22,60 @@ void poly1305_block_init_generic(struct poly1305_block_state *desc,
 }
 EXPORT_SYMBOL_GPL(poly1305_block_init_generic);
 
-void poly1305_init_generic(struct poly1305_desc_ctx *desc,
-			   const u8 key[POLY1305_KEY_SIZE])
+void poly1305_init(struct poly1305_desc_ctx *desc,
+		   const u8 key[POLY1305_KEY_SIZE])
 {
 	desc->s[0] = get_unaligned_le32(key + 16);
 	desc->s[1] = get_unaligned_le32(key + 20);
 	desc->s[2] = get_unaligned_le32(key + 24);
 	desc->s[3] = get_unaligned_le32(key + 28);
 	desc->buflen = 0;
-	poly1305_block_init_generic(&desc->state, key);
+	if (IS_ENABLED(CONFIG_CRYPTO_ARCH_HAVE_LIB_POLY1305))
+		poly1305_block_init_arch(&desc->state, key);
+	else
+		poly1305_block_init_generic(&desc->state, key);
 }
-EXPORT_SYMBOL_GPL(poly1305_init_generic);
+EXPORT_SYMBOL(poly1305_init);
 
 static inline void poly1305_blocks(struct poly1305_block_state *state,
 				   const u8 *src, unsigned int len)
 {
-	poly1305_blocks_generic(state, src, len, 1);
+	if (IS_ENABLED(CONFIG_CRYPTO_ARCH_HAVE_LIB_POLY1305))
+		poly1305_blocks_arch(state, src, len, 1);
+	else
+		poly1305_blocks_generic(state, src, len, 1);
 }
 
-void poly1305_update_generic(struct poly1305_desc_ctx *desc, const u8 *src,
-			     unsigned int nbytes)
+void poly1305_update(struct poly1305_desc_ctx *desc,
+		     const u8 *src, unsigned int nbytes)
 {
 	desc->buflen = BLOCK_HASH_UPDATE(&poly1305_blocks, &desc->state,
 					 src, nbytes, POLY1305_BLOCK_SIZE,
 					 desc->buf, desc->buflen);
 }
-EXPORT_SYMBOL_GPL(poly1305_update_generic);
+EXPORT_SYMBOL(poly1305_update);
 
-void poly1305_final_generic(struct poly1305_desc_ctx *desc, u8 *dst)
+void poly1305_final(struct poly1305_desc_ctx *desc, u8 *dst)
 {
 	if (unlikely(desc->buflen)) {
 		desc->buf[desc->buflen++] = 1;
 		memset(desc->buf + desc->buflen, 0,
 		       POLY1305_BLOCK_SIZE - desc->buflen);
-		poly1305_blocks_generic(&desc->state, desc->buf,
-					POLY1305_BLOCK_SIZE, 0);
+		if (IS_ENABLED(CONFIG_CRYPTO_ARCH_HAVE_LIB_POLY1305))
+			poly1305_blocks_arch(&desc->state, desc->buf,
+					     POLY1305_BLOCK_SIZE, 0);
+		else
+			poly1305_blocks_generic(&desc->state, desc->buf,
+						POLY1305_BLOCK_SIZE, 0);
 	}
 
-	poly1305_emit_generic(&desc->h, dst, desc->s);
+	if (IS_ENABLED(CONFIG_CRYPTO_ARCH_HAVE_LIB_POLY1305))
+		poly1305_emit_arch(&desc->state.h, dst, desc->s);
+	else
+		poly1305_emit_generic(&desc->state.h, dst, desc->s);
 	*desc = (struct poly1305_desc_ctx){};
 }
-EXPORT_SYMBOL_GPL(poly1305_final_generic);
+EXPORT_SYMBOL(poly1305_final);
 
 MODULE_LICENSE("GPL");
 MODULE_AUTHOR("Martin Willi <martin@strongswan.org>");
-- 
2.39.5


^ permalink raw reply related	[flat|nested] 13+ messages in thread

* Re: [v2 PATCH 01/11] crypto: lib/sha256 - Move partial block handling out
  2025-04-27  0:59 ` [v2 PATCH 01/11] crypto: lib/sha256 - Move partial block handling out Herbert Xu
@ 2025-04-27  1:24   ` Eric Biggers
  0 siblings, 0 replies; 13+ messages in thread
From: Eric Biggers @ 2025-04-27  1:24 UTC (permalink / raw)
  To: Herbert Xu; +Cc: Linux Crypto Mailing List

On Sun, Apr 27, 2025 at 08:59:59AM +0800, Herbert Xu wrote:
> Extract the common partial block handling into a helper macro
> that can be reused by other library code.
> 
> Also delete the unused sha256_base_do_finalize function.
> 
> Signed-off-by: Herbert Xu <herbert@gondor.apana.org.au>
> ---
>  include/crypto/internal/blockhash.h | 52 +++++++++++++++++++++++++++++
>  include/crypto/sha2.h               |  9 +++--
>  include/crypto/sha256_base.h        | 38 ++-------------------
>  3 files changed, 62 insertions(+), 37 deletions(-)
>  create mode 100644 include/crypto/internal/blockhash.h
> 
> diff --git a/include/crypto/internal/blockhash.h b/include/crypto/internal/blockhash.h
> new file mode 100644
> index 000000000000..4184e2337d68
> --- /dev/null
> +++ b/include/crypto/internal/blockhash.h
> @@ -0,0 +1,52 @@
> +/* SPDX-License-Identifier: GPL-2.0-or-later */
> +/*
> + * Handle partial blocks for block hash.
> + *
> + * Copyright (c) 2015 Linaro Ltd <ard.biesheuvel@linaro.org>
> + * Copyright (c) 2025 Herbert Xu <herbert@gondor.apana.org.au>
> + */
> +
> +#ifndef _CRYPTO_INTERNAL_BLOCKHASH_H
> +#define _CRYPTO_INTERNAL_BLOCKHASH_H
> +
> +#include <linux/string.h>
> +#include <linux/types.h>
> +
> +#define BLOCK_HASH_UPDATE_BASE(block, state, src, nbytes, bs, dv, buf,	\
> +			       buflen)					\
> +	({								\
> +		unsigned int _nbytes = (nbytes);			\
> +		unsigned int _buflen = (buflen);			\
> +		typeof(block) _block = (block);				\
> +		typeof(state) _state = (state); 			\
> +		unsigned int _bs = (bs);				\
> +		unsigned int _dv = (dv);				\
> +		const u8 *_src = (src);					\
> +		u8 *_buf = (buf);					\
> +		while ((_buflen + _nbytes) >= _bs) {			\
> +			unsigned int len = _nbytes;			\
> +			const u8 *data = _src;				\
> +			int blocks, remain;				\
> +			if (_buflen) {					\
> +				remain = _bs - _buflen;			\
> +				memcpy(_buf + _buflen, _src, remain);	\
> +				data = _buf;				\
> +				len = _bs;				\
> +			}						\
> +			remain = len % bs;				\
> +			blocks = (len - remain) / _dv;			\
> +			_block(_state, data, blocks);			\
> +			_src += len - remain - _buflen;			\
> +			_nbytes -= len - remain - _buflen;		\
> +			_buflen = 0;					\
> +		}							\
> +		memcpy(_buf + _buflen, _src, _nbytes);			\
> +		_buflen += _nbytes;					\
> +	})
> +
> +#define BLOCK_HASH_UPDATE(block, state, src, nbytes, bs, buf, buflen) \
> +	BLOCK_HASH_UPDATE_BASE(block, state, src, nbytes, bs, 1, buf, buflen)
> +#define BLOCK_HASH_UPDATE_BLOCKS(block, state, src, nbytes, bs, buf, buflen) \
> +	BLOCK_HASH_UPDATE_BASE(block, state, src, nbytes, bs, bs, buf, buflen)

Again, these pointless macros just obfuscate things.  And there's no reason to
still be futzing around with SHA-256 when my patchset reworks it anyway.

- Eric

^ permalink raw reply	[flat|nested] 13+ messages in thread
