rust-for-linux.vger.kernel.org archive mirror
* [PATCH 00/10] rust: xarray: add entry API with preloading
From: Andreas Hindborg @ 2025-12-03 22:26 UTC (permalink / raw)
  To: Tamir Duberstein, Miguel Ojeda, Alex Gaynor, Boqun Feng, Gary Guo,
	Björn Roy Baron, Benno Lossin, Alice Ryhl, Trevor Gross,
	Danilo Krummrich
  Cc: Daniel Gomez, rust-for-linux, linux-kernel, Andreas Hindborg

This patch series is a mashup of cleanups, bug fixes, and feature additions
for the Rust XArray abstractions.

 - Patch 1 starts by fixing minor formatting issues and bringing use
   statements up to date with the new coding guidelines.

 - Patches 2-3 add some minor convenience functionality.

 - Patch 4 adds an abstraction for the C `xa_state` structure. This is a
   prerequisite for all the subsequent patches.

 - Patch 5 fixes a double lock error in `xarray::Guard::load`.

 - Patch 6 is a simplifying refactor of `xarray::Guard::load`.

 - Patch 7 adds two new methods for finding the next item starting from a
   given index.

 - Patches 8-9 add the new entry API with preloading (see the usage sketch
   below).

 - Patch 10 fixes a lockdep bug caused by reusing the same static key.

I am sorry for the mashup, but I think these changes are easier to handle
in a single series rather than individually.

The feature additions in this series are dependencies for the Rust null
block driver, most of which is still downstream.
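
For reference, the intended use of the entry API with preloading, pieced
together from the doctests in patches 8 and 9 (a sketch rather than final
documentation), looks roughly like this:

    // Allocate everything that needs GFP_KERNEL before taking the lock.
    let xa = KBox::pin_init(
        XArray::<KBox<u32>>::new(AllocKind::Alloc),
        GFP_KERNEL,
    )?;
    let mut buffer = XArrayPreloadBuffer::new(1)?;
    buffer.preload(GFP_KERNEL)?;
    let value = KBox::new(0x1337u32, GFP_KERNEL)?;

    // Under the lock, the insertion draws internal nodes from `buffer`
    // instead of allocating.
    let mut guard = xa.lock();
    let entry = guard.insert_entry(42, value, Some(&mut buffer))?;
    assert_eq!(entry.index(), 42);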

Best regards,
Andreas

Signed-off-by: Andreas Hindborg <a.hindborg@kernel.org>
---
Andreas Hindborg (10):
      rust: xarray: minor formatting fixes
      rust: xarray: add debug format for `StoreError`
      rust: xarray: add `contains_index` method
      rust: xarray: add `XArrayState`
      rust: xarray: use `xas_load` instead of `xa_load` in `Guard::load`
      rust: xarray: simplify `Guard::load`
      rust: xarray: add `find_next` and `find_next_mut`
      rust: xarray: add entry API
      rust: xarray: add preload API
      rust: xarray: fix false positive lockdep warnings

 MAINTAINERS                     |   1 +
 rust/bindings/bindings_helper.h |   4 +
 rust/helpers/xarray.c           |  22 ++-
 rust/kernel/xarray.rs           | 408 ++++++++++++++++++++++++++++++++++++---
 rust/kernel/xarray/entry.rs     | 415 ++++++++++++++++++++++++++++++++++++++++
 rust/kernel/xarray/preload.rs   | 217 +++++++++++++++++++++
 6 files changed, 1034 insertions(+), 33 deletions(-)
---
base-commit: 7d0a66e4bb9081d75c82ec4957c50034cb0ea449
change-id: 20251203-xarray-entry-send-00230f0744e6

Best regards,
-- 
Andreas Hindborg <a.hindborg@kernel.org>




* [PATCH 01/10] rust: xarray: minor formatting fixes
From: Andreas Hindborg @ 2025-12-03 22:26 UTC (permalink / raw)
  To: Tamir Duberstein, Miguel Ojeda, Alex Gaynor, Boqun Feng, Gary Guo,
	Björn Roy Baron, Benno Lossin, Alice Ryhl, Trevor Gross,
	Danilo Krummrich
  Cc: Daniel Gomez, rust-for-linux, linux-kernel, Andreas Hindborg

Fix formatting in xarray module to comply with kernel coding
guidelines:

- Update use clauses to use vertical layout with each import on its
  own line.
- Add trailing empty comments to preserve formatting and prevent
  rustfmt from collapsing imports.
- Break long assert_eq! statement in documentation across multiple
  lines for better readability.

Signed-off-by: Andreas Hindborg <a.hindborg@kernel.org>
---
 rust/kernel/xarray.rs | 36 +++++++++++++++++++++++++++++-------
 1 file changed, 29 insertions(+), 7 deletions(-)

diff --git a/rust/kernel/xarray.rs b/rust/kernel/xarray.rs
index a49d6db288458..88625c9abf4ef 100644
--- a/rust/kernel/xarray.rs
+++ b/rust/kernel/xarray.rs
@@ -4,14 +4,33 @@
 //!
 //! C header: [`include/linux/xarray.h`](srctree/include/linux/xarray.h)
 
-use crate::{
-    alloc, bindings, build_assert,
-    error::{Error, Result},
+use core::{
+    iter,
+    marker::PhantomData,
+    pin::Pin,
+    ptr::NonNull, //
+};
+use kernel::{
+    alloc,
+    bindings,
+    build_assert, //
+    error::{
+        Error,
+        Result, //
+    },
     ffi::c_void,
-    types::{ForeignOwnable, NotThreadSafe, Opaque},
+    types::{
+        ForeignOwnable,
+        NotThreadSafe,
+        Opaque, //
+    },
+};
+use pin_init::{
+    pin_data,
+    pin_init,
+    pinned_drop,
+    PinInit, //
 };
-use core::{iter, marker::PhantomData, pin::Pin, ptr::NonNull};
-use pin_init::{pin_data, pin_init, pinned_drop, PinInit};
 
 /// An array which efficiently maps sparse integer indices to owned objects.
 ///
@@ -44,7 +63,10 @@
 /// *guard.get_mut(0).unwrap() = 0xffff;
 /// assert_eq!(guard.get(0).copied(), Some(0xffff));
 ///
-/// assert_eq!(guard.store(0, beef, GFP_KERNEL)?.as_deref().copied(), Some(0xffff));
+/// assert_eq!(
+///     guard.store(0, beef, GFP_KERNEL)?.as_deref().copied(),
+///     Some(0xffff)
+/// );
 /// assert_eq!(guard.get(0).copied(), Some(0xbeef));
 ///
 /// guard.remove(0);

-- 
2.51.2




* [PATCH 02/10] rust: xarray: add debug format for `StoreError`
From: Andreas Hindborg @ 2025-12-03 22:26 UTC (permalink / raw)
  To: Tamir Duberstein, Miguel Ojeda, Alex Gaynor, Boqun Feng, Gary Guo,
	Björn Roy Baron, Benno Lossin, Alice Ryhl, Trevor Gross,
	Danilo Krummrich
  Cc: Daniel Gomez, rust-for-linux, linux-kernel, Andreas Hindborg

Add a `Debug` implementation for `StoreError<T>` to enable better error
reporting and debugging. The implementation only displays the `error`
field and omits the `value` field, as `T` may not implement `Debug`.
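
For example, a caller can now log a failed store (an illustrative sketch,
assuming a `guard` and `value` from the surrounding code; the exact
`Debug` output of `Error` is not specified here):

    if let Err(e) = guard.store(10, value, GFP_KERNEL) {
        // Only the `error` field appears in the output; the stored value
        // is omitted because `T` may not implement `Debug`.
        pr_warn!("xarray store failed: {:?}\n", e);
    }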

Signed-off-by: Andreas Hindborg <a.hindborg@kernel.org>
---
 rust/kernel/xarray.rs | 8 ++++++++
 1 file changed, 8 insertions(+)

diff --git a/rust/kernel/xarray.rs b/rust/kernel/xarray.rs
index 88625c9abf4ef..d9762c6bef19c 100644
--- a/rust/kernel/xarray.rs
+++ b/rust/kernel/xarray.rs
@@ -193,6 +193,14 @@ pub struct StoreError<T> {
     pub value: T,
 }
 
+impl<T> core::fmt::Debug for StoreError<T> {
+    fn fmt(&self, f: &mut core::fmt::Formatter<'_>) -> core::fmt::Result {
+        f.debug_struct("StoreError")
+            .field("error", &self.error)
+            .finish()
+    }
+}
+
 impl<T> From<StoreError<T>> for Error {
     fn from(value: StoreError<T>) -> Self {
         value.error

-- 
2.51.2




* [PATCH 03/10] rust: xarray: add `contains_index` method
From: Andreas Hindborg @ 2025-12-03 22:26 UTC (permalink / raw)
  To: Tamir Duberstein, Miguel Ojeda, Alex Gaynor, Boqun Feng, Gary Guo,
	Björn Roy Baron, Benno Lossin, Alice Ryhl, Trevor Gross,
	Danilo Krummrich
  Cc: Daniel Gomez, rust-for-linux, linux-kernel, Andreas Hindborg

Add a convenience method `contains_index` to check whether an element
exists at a given index in the XArray. This is more ergonomic than
calling `get` and checking the result for `Some`.

Signed-off-by: Andreas Hindborg <a.hindborg@kernel.org>
---
 rust/kernel/xarray.rs | 21 +++++++++++++++++++++
 1 file changed, 21 insertions(+)

diff --git a/rust/kernel/xarray.rs b/rust/kernel/xarray.rs
index d9762c6bef19c..ede48b5e1dba3 100644
--- a/rust/kernel/xarray.rs
+++ b/rust/kernel/xarray.rs
@@ -218,6 +218,27 @@ fn load<F, U>(&self, index: usize, f: F) -> Option<U>
         Some(f(ptr))
     }
 
+    /// Checks if the XArray contains an element at the specified index.
+    ///
+    /// # Examples
+    ///
+    /// ```
+    /// # use kernel::{alloc::{flags::GFP_KERNEL, kbox::KBox}, xarray::{AllocKind, XArray}};
+    /// let xa = KBox::pin_init(XArray::new(AllocKind::Alloc), GFP_KERNEL)?;
+    ///
+    /// let mut guard = xa.lock();
+    /// assert_eq!(guard.contains_index(42), false);
+    ///
+    /// guard.store(42, KBox::new(0u32, GFP_KERNEL)?, GFP_KERNEL)?;
+    ///
+    /// assert_eq!(guard.contains_index(42), true);
+    ///
+    /// # Ok::<(), kernel::error::Error>(())
+    /// ```
+    pub fn contains_index(&self, index: usize) -> bool {
+        self.get(index).is_some()
+    }
+
     /// Provides a reference to the element at the given index.
     pub fn get(&self, index: usize) -> Option<T::Borrowed<'_>> {
         self.load(index, |ptr| {

-- 
2.51.2




* [PATCH 04/10] rust: xarray: add `XArrayState`
From: Andreas Hindborg @ 2025-12-03 22:26 UTC (permalink / raw)
  To: Tamir Duberstein, Miguel Ojeda, Alex Gaynor, Boqun Feng, Gary Guo,
	Björn Roy Baron, Benno Lossin, Alice Ryhl, Trevor Gross,
	Danilo Krummrich
  Cc: Daniel Gomez, rust-for-linux, linux-kernel, Andreas Hindborg

Add `XArrayState` as internal state for XArray iteration and entry
operations. This struct wraps the C `xa_state` structure and holds a
reference to a `Guard` to ensure the XArray lock stays held for the
lifetime of the state object.

The `XAS_RESTART` constant is also exposed through the bindings helper
to properly initialize the `xa_node` field.

The struct and its constructor are marked with `#[expect(dead_code)]` as
there are no users yet. We will remove this annotation in a later patch.
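
For illustration, the crate-internal usage introduced by the following
patches looks roughly like this (matching the final shape of
`Guard::load` after patch 6):

    fn load(&self, index: usize) -> Option<NonNull<c_void>> {
        // The guard holds the XArray lock, so the `xas_*` API, which
        // expects the lock to be held, may operate on `state.state`.
        let mut state = XArrayState::new(self, index);
        // SAFETY: `state.state` is a valid `xa_state` by the type invariant.
        let ptr = unsafe { bindings::xas_load(&raw mut state.state) };
        NonNull::new(ptr)
    }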

Signed-off-by: Andreas Hindborg <a.hindborg@kernel.org>
---
 rust/bindings/bindings_helper.h |  1 +
 rust/kernel/xarray.rs           | 41 ++++++++++++++++++++++++++++++++++++++++-
 2 files changed, 41 insertions(+), 1 deletion(-)

diff --git a/rust/bindings/bindings_helper.h b/rust/bindings/bindings_helper.h
index 2e43c66635a2c..86bca946faff0 100644
--- a/rust/bindings/bindings_helper.h
+++ b/rust/bindings/bindings_helper.h
@@ -109,6 +109,7 @@ const xa_mark_t RUST_CONST_HELPER_XA_PRESENT = XA_PRESENT;
 const gfp_t RUST_CONST_HELPER_XA_FLAGS_ALLOC = XA_FLAGS_ALLOC;
 const gfp_t RUST_CONST_HELPER_XA_FLAGS_ALLOC1 = XA_FLAGS_ALLOC1;
 const vm_flags_t RUST_CONST_HELPER_VM_MERGEABLE = VM_MERGEABLE;
+const size_t RUST_CONST_HELPER_XAS_RESTART = (size_t)XAS_RESTART;
 
 #if IS_ENABLED(CONFIG_ANDROID_BINDER_IPC_RUST)
 #include "../../drivers/android/binder/rust_binder.h"
diff --git a/rust/kernel/xarray.rs b/rust/kernel/xarray.rs
index ede48b5e1dba3..f6e610b059625 100644
--- a/rust/kernel/xarray.rs
+++ b/rust/kernel/xarray.rs
@@ -8,7 +8,10 @@
     iter,
     marker::PhantomData,
     pin::Pin,
-    ptr::NonNull, //
+    ptr::{
+        null_mut,
+        NonNull, //
+    },
 };
 use kernel::{
     alloc,
@@ -319,6 +322,42 @@ pub fn store(
     }
 }
 
+/// Internal state for XArray iteration and entry operations.
+///
+/// # Invariants
+///
+/// - `state` is always a valid `bindings::xa_state`.
+#[expect(dead_code)]
+pub(crate) struct XArrayState<'a, 'b, T: ForeignOwnable> {
+    /// Holds a reference to the lock guard to ensure the lock is not dropped
+    /// while `Self` is live.
+    _access: &'b Guard<'a, T>,
+    state: bindings::xa_state,
+}
+
+impl<'a, 'b, T: ForeignOwnable> XArrayState<'a, 'b, T> {
+    #[expect(dead_code)]
+    fn new(access: &'b Guard<'a, T>, index: usize) -> Self {
+        let ptr = access.xa.xa.get();
+        // INVARIANT: We initialize `self.state` to a valid value below.
+        Self {
+            _access: access,
+            state: bindings::xa_state {
+                xa: ptr,
+                xa_index: index,
+                xa_shift: 0,
+                xa_sibs: 0,
+                xa_offset: 0,
+                xa_pad: 0,
+                xa_node: bindings::XAS_RESTART as *mut bindings::xa_node,
+                xa_alloc: null_mut(),
+                xa_update: None,
+                xa_lru: null_mut(),
+            },
+        }
+    }
+}
+
 // SAFETY: `XArray<T>` has no shared mutable state so it is `Send` iff `T` is `Send`.
 unsafe impl<T: ForeignOwnable + Send> Send for XArray<T> {}
 

-- 
2.51.2




* [PATCH 05/10] rust: xarray: use `xas_load` instead of `xa_load` in `Guard::load`
From: Andreas Hindborg @ 2025-12-03 22:26 UTC (permalink / raw)
  To: Tamir Duberstein, Miguel Ojeda, Alex Gaynor, Boqun Feng, Gary Guo,
	Björn Roy Baron, Benno Lossin, Alice Ryhl, Trevor Gross,
	Danilo Krummrich
  Cc: Daniel Gomez, rust-for-linux, linux-kernel, Andreas Hindborg

Replace the call to `xa_load` with `xas_load` in `Guard::load`. The
`xa_load` function takes the XArray lock internally, which would cause
a double lock since the `Guard` already holds the lock. The `xas_load`
function operates on XArray state and assumes the lock is already held,
which is the correct API to use when holding a `Guard`.

This change also removes the `#[expect(dead_code)]` annotations from
`XArrayState` and its constructor, as both are now in use.

Signed-off-by: Andreas Hindborg <a.hindborg@kernel.org>
---
 rust/kernel/xarray.rs | 8 ++++----
 1 file changed, 4 insertions(+), 4 deletions(-)

diff --git a/rust/kernel/xarray.rs b/rust/kernel/xarray.rs
index f6e610b059625..0f69a523b72bf 100644
--- a/rust/kernel/xarray.rs
+++ b/rust/kernel/xarray.rs
@@ -215,8 +215,10 @@ fn load<F, U>(&self, index: usize, f: F) -> Option<U>
     where
         F: FnOnce(NonNull<c_void>) -> U,
     {
-        // SAFETY: `self.xa.xa` is always valid by the type invariant.
-        let ptr = unsafe { bindings::xa_load(self.xa.xa.get(), index) };
+        let mut state = XArrayState::new(self, index);
+        // SAFETY: `state.state` is always valid by the type invariant of
+        // `XArrayState`.
+        let ptr = unsafe { bindings::xas_load(&raw mut state.state) };
         let ptr = NonNull::new(ptr.cast())?;
         Some(f(ptr))
     }
@@ -327,7 +329,6 @@ pub fn store(
 /// # Invariants
 ///
 /// - `state` is always a valid `bindings::xa_state`.
-#[expect(dead_code)]
 pub(crate) struct XArrayState<'a, 'b, T: ForeignOwnable> {
     /// Holds a reference to the lock guard to ensure the lock is not dropped
     /// while `Self` is live.
@@ -336,7 +337,6 @@ pub(crate) struct XArrayState<'a, 'b, T: ForeignOwnable> {
 }
 
 impl<'a, 'b, T: ForeignOwnable> XArrayState<'a, 'b, T> {
-    #[expect(dead_code)]
     fn new(access: &'b Guard<'a, T>, index: usize) -> Self {
         let ptr = access.xa.xa.get();
         // INVARIANT: We initialize `self.state` to a valid value below.

-- 
2.51.2




* [PATCH 06/10] rust: xarray: simplify `Guard::load`
From: Andreas Hindborg @ 2025-12-03 22:26 UTC (permalink / raw)
  To: Tamir Duberstein, Miguel Ojeda, Alex Gaynor, Boqun Feng, Gary Guo,
	Björn Roy Baron, Benno Lossin, Alice Ryhl, Trevor Gross,
	Danilo Krummrich
  Cc: Daniel Gomez, rust-for-linux, linux-kernel, Andreas Hindborg

Simplify the implementation by removing the closure-based API from
`Guard::load` in favor of returning `Option<NonNull<c_void>>` directly.

Signed-off-by: Andreas Hindborg <a.hindborg@kernel.org>
---
 rust/kernel/xarray.rs | 23 +++++++++--------------
 1 file changed, 9 insertions(+), 14 deletions(-)

diff --git a/rust/kernel/xarray.rs b/rust/kernel/xarray.rs
index 0f69a523b72bf..ca97134ba2bd0 100644
--- a/rust/kernel/xarray.rs
+++ b/rust/kernel/xarray.rs
@@ -211,16 +211,12 @@ fn from(value: StoreError<T>) -> Self {
 }
 
 impl<'a, T: ForeignOwnable> Guard<'a, T> {
-    fn load<F, U>(&self, index: usize, f: F) -> Option<U>
-    where
-        F: FnOnce(NonNull<c_void>) -> U,
-    {
+    fn load(&self, index: usize) -> Option<NonNull<c_void>> {
         let mut state = XArrayState::new(self, index);
         // SAFETY: `state.state` is always valid by the type invariant of
         // `XArrayState`.
         let ptr = unsafe { bindings::xas_load(&raw mut state.state) };
-        let ptr = NonNull::new(ptr.cast())?;
-        Some(f(ptr))
+        NonNull::new(ptr)
     }
 
     /// Checks if the XArray contains an element at the specified index.
@@ -246,18 +242,17 @@ pub fn contains_index(&self, index: usize) -> bool {
 
     /// Provides a reference to the element at the given index.
     pub fn get(&self, index: usize) -> Option<T::Borrowed<'_>> {
-        self.load(index, |ptr| {
-            // SAFETY: `ptr` came from `T::into_foreign`.
-            unsafe { T::borrow(ptr.as_ptr()) }
-        })
+        let ptr = self.load(index)?;
+        // SAFETY: `ptr` came from `T::into_foreign`.
+        Some(unsafe { T::borrow(ptr.as_ptr()) })
     }
 
     /// Provides a mutable reference to the element at the given index.
     pub fn get_mut(&mut self, index: usize) -> Option<T::BorrowedMut<'_>> {
-        self.load(index, |ptr| {
-            // SAFETY: `ptr` came from `T::into_foreign`.
-            unsafe { T::borrow_mut(ptr.as_ptr()) }
-        })
+        let ptr = self.load(index)?;
+
+        // SAFETY: `ptr` came from `T::into_foreign`.
+        Some(unsafe { T::borrow_mut(ptr.as_ptr()) })
     }
 
     /// Removes and returns the element at the given index.

-- 
2.51.2




* [PATCH 07/10] rust: xarray: add `find_next` and `find_next_mut`
From: Andreas Hindborg @ 2025-12-03 22:26 UTC (permalink / raw)
  To: Tamir Duberstein, Miguel Ojeda, Alex Gaynor, Boqun Feng, Gary Guo,
	Björn Roy Baron, Benno Lossin, Alice Ryhl, Trevor Gross,
	Danilo Krummrich
  Cc: Daniel Gomez, rust-for-linux, linux-kernel, Andreas Hindborg

Add methods to find the next element in an XArray starting from a
given index. The methods return a tuple containing the index where the
element was found and a reference to the element.

The implementation uses the XArray state API via `xas_find` to avoid taking
the xarray lock that is already held by `Guard`.
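
Besides single lookups, the new methods make it easy to walk every
element in index order. A sketch (assuming, as with `xas_find` starting
from `XAS_RESTART`, that `find_next` is inclusive of the starting index):

    let mut index = 0;
    while let Some((i, value)) = guard.find_next(index) {
        pr_info!("index {}: {}\n", i, *value);
        // Continue the search after the element we just visited.
        index = match i.checked_add(1) {
            Some(next) => next,
            None => break,
        };
    }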

Signed-off-by: Andreas Hindborg <a.hindborg@kernel.org>
---
 rust/kernel/xarray.rs | 65 +++++++++++++++++++++++++++++++++++++++++++++++++++
 1 file changed, 65 insertions(+)

diff --git a/rust/kernel/xarray.rs b/rust/kernel/xarray.rs
index ca97134ba2bd0..9d4589979fd1d 100644
--- a/rust/kernel/xarray.rs
+++ b/rust/kernel/xarray.rs
@@ -255,6 +255,71 @@ pub fn get_mut(&mut self, index: usize) -> Option<T::BorrowedMut<'_>> {
         Some(unsafe { T::borrow_mut(ptr.as_ptr()) })
     }
 
+    fn load_next(&self, index: usize) -> Option<(usize, NonNull<c_void>)> {
+        let mut state = XArrayState::new(self, index);
+        // SAFETY: `state.state` is always valid by the type invariant of
+        // `XArrayState` and the caller holds the lock.
+        let ptr = unsafe { bindings::xas_find(&raw mut state.state, usize::MAX) };
+        NonNull::new(ptr).map(|ptr| (state.state.xa_index, ptr))
+    }
+
+    /// Finds the next element starting from the given index.
+    ///
+    /// # Examples
+    ///
+    /// ```
+    /// # use kernel::{prelude::*, xarray::{AllocKind, XArray}};
+    /// let mut xa = KBox::pin_init(XArray::<KBox<u32>>::new(AllocKind::Alloc), GFP_KERNEL)?;
+    /// let mut guard = xa.lock();
+    ///
+    /// guard.store(10, KBox::new(10u32, GFP_KERNEL)?, GFP_KERNEL)?;
+    /// guard.store(20, KBox::new(20u32, GFP_KERNEL)?, GFP_KERNEL)?;
+    ///
+    /// if let Some((found_index, value)) = guard.find_next(11) {
+    ///     assert_eq!(found_index, 20);
+    ///     assert_eq!(*value, 20);
+    /// }
+    ///
+    /// if let Some((found_index, value)) = guard.find_next(5) {
+    ///     assert_eq!(found_index, 10);
+    ///     assert_eq!(*value, 10);
+    /// }
+    ///
+    /// # Ok::<(), kernel::error::Error>(())
+    /// ```
+    pub fn find_next(&self, index: usize) -> Option<(usize, T::Borrowed<'_>)> {
+        self.load_next(index)
+            // SAFETY: `ptr` came from `T::into_foreign`.
+            .map(|(index, ptr)| (index, unsafe { T::borrow(ptr.as_ptr()) }))
+    }
+
+    /// Finds the next element starting from the given index, returning a mutable reference.
+    ///
+    /// # Examples
+    ///
+    /// ```
+    /// # use kernel::{prelude::*, xarray::{AllocKind, XArray}};
+    /// let mut xa = KBox::pin_init(XArray::<KBox<u32>>::new(AllocKind::Alloc), GFP_KERNEL)?;
+    /// let mut guard = xa.lock();
+    ///
+    /// guard.store(10, KBox::new(10u32, GFP_KERNEL)?, GFP_KERNEL)?;
+    /// guard.store(20, KBox::new(20u32, GFP_KERNEL)?, GFP_KERNEL)?;
+    ///
+    /// if let Some((found_index, mut_value)) = guard.find_next_mut(5) {
+    ///     assert_eq!(found_index, 10);
+    ///     *mut_value = 0x99;
+    /// }
+    ///
+    /// assert_eq!(guard.get(10).copied(), Some(0x99));
+    ///
+    /// # Ok::<(), kernel::error::Error>(())
+    /// ```
+    pub fn find_next_mut(&mut self, index: usize) -> Option<(usize, T::BorrowedMut<'_>)> {
+        self.load_next(index)
+            // SAFETY: `ptr` came from `T::into_foreign`.
+            .map(move |(index, ptr)| (index, unsafe { T::borrow_mut(ptr.as_ptr()) }))
+    }
+
     /// Removes and returns the element at the given index.
     pub fn remove(&mut self, index: usize) -> Option<T> {
         // SAFETY:

-- 
2.51.2




* [PATCH 08/10] rust: xarray: add entry API
From: Andreas Hindborg @ 2025-12-03 22:26 UTC (permalink / raw)
  To: Tamir Duberstein, Miguel Ojeda, Alex Gaynor, Boqun Feng, Gary Guo,
	Björn Roy Baron, Benno Lossin, Alice Ryhl, Trevor Gross,
	Danilo Krummrich
  Cc: Daniel Gomez, rust-for-linux, linux-kernel, Andreas Hindborg

Add an entry API for `XArray` that provides ergonomic access to array
slots that may be vacant or occupied. The API follows the pattern of the
Rust standard library's `HashMap` entry API, allowing efficient
conditional insertion and modification of entries.
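
The typical pattern is a single lookup followed by either insertion or
in-place modification. A sketch, using `KBox<u32>` as in the doctests
below and a `guard` obtained from `XArray::lock`:

    match guard.get_entry(42) {
        Entry::Vacant(entry) => {
            // The slot is free; insert a new value.
            entry.insert(KBox::new(0u32, GFP_KERNEL)?)?;
        }
        Entry::Occupied(mut entry) => {
            // The slot is occupied; update the existing value in place.
            *entry += 1;
        }
    }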

The implementation uses the XArray state API (`xas_*` functions) for
efficient operations without requiring multiple lookups. Helper
functions are added to rust/helpers/xarray.c to wrap C macros and inline
functions that are not directly callable from Rust.

Also update MAINTAINERS to cover the new files under rust/kernel/xarray/.

Signed-off-by: Andreas Hindborg <a.hindborg@kernel.org>
---
 MAINTAINERS                 |   1 +
 rust/helpers/xarray.c       |  17 ++
 rust/kernel/xarray.rs       | 112 +++++++++++++
 rust/kernel/xarray/entry.rs | 376 ++++++++++++++++++++++++++++++++++++++++++++
 4 files changed, 506 insertions(+)

diff --git a/MAINTAINERS b/MAINTAINERS
index e8f06145fb54c..79d4c9c9b2b63 100644
--- a/MAINTAINERS
+++ b/MAINTAINERS
@@ -27909,6 +27909,7 @@ B:	https://github.com/Rust-for-Linux/linux/issues
 C:	https://rust-for-linux.zulipchat.com
 T:	git https://github.com/Rust-for-Linux/linux.git xarray-next
 F:	rust/kernel/xarray.rs
+F:	rust/kernel/xarray/
 
 XBOX DVD IR REMOTE
 M:	Benjamin Valentin <benpicco@googlemail.com>
diff --git a/rust/helpers/xarray.c b/rust/helpers/xarray.c
index 60b299f11451d..425a6cc494734 100644
--- a/rust/helpers/xarray.c
+++ b/rust/helpers/xarray.c
@@ -26,3 +26,20 @@ void rust_helper_xa_unlock(struct xarray *xa)
 {
 	return xa_unlock(xa);
 }
+
+void *rust_helper_xas_result(struct xa_state *xas, void *curr)
+{
+	if (xa_err(xas->xa_node))
+		curr = xas->xa_node;
+	return curr;
+}
+
+void *rust_helper_xa_zero_to_null(void *entry)
+{
+	return xa_is_zero(entry) ? NULL : entry;
+}
+
+int rust_helper_xas_error(const struct xa_state *xas)
+{
+	return xas_error(xas);
+}
diff --git a/rust/kernel/xarray.rs b/rust/kernel/xarray.rs
index 9d4589979fd1d..2b8d56c81e36b 100644
--- a/rust/kernel/xarray.rs
+++ b/rust/kernel/xarray.rs
@@ -13,11 +13,17 @@
         NonNull, //
     },
 };
+pub use entry::{
+    Entry,
+    OccupiedEntry,
+    VacantEntry, //
+};
 use kernel::{
     alloc,
     bindings,
     build_assert, //
     error::{
+        to_result,
         Error,
         Result, //
     },
@@ -255,6 +261,35 @@ pub fn get_mut(&mut self, index: usize) -> Option<T::BorrowedMut<'_>> {
         Some(unsafe { T::borrow_mut(ptr.as_ptr()) })
     }
 
+    /// Gets an entry for the specified index, which can be vacant or occupied.
+    ///
+    /// # Examples
+    ///
+    /// ```
+    /// # use kernel::{prelude::*, xarray::{AllocKind, XArray, Entry}};
+    /// let mut xa = KBox::pin_init(XArray::<KBox<u32>>::new(AllocKind::Alloc), GFP_KERNEL)?;
+    /// let mut guard = xa.lock();
+    ///
+    /// assert_eq!(guard.contains_index(42), false);
+    ///
+    /// match guard.get_entry(42) {
+    ///     Entry::Vacant(entry) => {
+    ///         entry.insert(KBox::new(0x1337u32, GFP_KERNEL)?)?;
+    ///     }
+    ///     Entry::Occupied(_) => unreachable!("We did not insert an entry yet"),
+    /// }
+    ///
+    /// assert_eq!(guard.get(42), Some(&0x1337));
+    ///
+    /// # Ok::<(), kernel::error::Error>(())
+    /// ```
+    pub fn get_entry<'b>(&'b mut self, index: usize) -> Entry<'a, 'b, T> {
+        match self.load(index) {
+            None => Entry::Vacant(VacantEntry::new(self, index)),
+            Some(ptr) => Entry::Occupied(OccupiedEntry::new(self, index, ptr)),
+        }
+    }
+
     fn load_next(&self, index: usize) -> Option<(usize, NonNull<c_void>)> {
         let mut state = XArrayState::new(self, index);
         // SAFETY: `state.state` is always valid by the type invariant of
@@ -320,6 +355,76 @@ pub fn find_next_mut(&mut self, index: usize) -> Option<(usize, T::BorrowedMut<'
             .map(move |(index, ptr)| (index, unsafe { T::borrow_mut(ptr.as_ptr()) }))
     }
 
+    /// Finds the next occupied entry starting from the given index.
+    ///
+    /// # Examples
+    ///
+    /// ```
+    /// # use kernel::{prelude::*, xarray::{AllocKind, XArray}};
+    /// let mut xa = KBox::pin_init(XArray::<KBox<u32>>::new(AllocKind::Alloc), GFP_KERNEL)?;
+    /// let mut guard = xa.lock();
+    ///
+    /// guard.store(10, KBox::new(10u32, GFP_KERNEL)?, GFP_KERNEL)?;
+    /// guard.store(20, KBox::new(20u32, GFP_KERNEL)?, GFP_KERNEL)?;
+    ///
+    /// if let Some(entry) = guard.find_next_entry(5) {
+    ///     assert_eq!(entry.index(), 10);
+    ///     let value = entry.remove();
+    ///     assert_eq!(*value, 10);
+    /// }
+    ///
+    /// assert_eq!(guard.get(10), None);
+    ///
+    /// # Ok::<(), kernel::error::Error>(())
+    /// ```
+    pub fn find_next_entry<'b>(&'b mut self, index: usize) -> Option<OccupiedEntry<'a, 'b, T>> {
+        let mut state = XArrayState::new(self, index);
+
+        // SAFETY: `state.state` is properly initialized by XArrayState::new and the caller holds
+        // the lock.
+        let ptr = NonNull::new(unsafe { bindings::xas_find(&mut state.state, usize::MAX) })?;
+
+        Some(OccupiedEntry { state, ptr })
+    }
+
+    /// Finds the next occupied entry starting at the given index, wrapping around.
+    ///
+    /// Searches for an entry starting at `index` up to the maximum index. If no entry
+    /// is found, wraps around and searches from index 0 up to `index`.
+    ///
+    /// # Examples
+    ///
+    /// ```
+    /// # use kernel::{prelude::*, xarray::{AllocKind, XArray}};
+    /// let mut xa = KBox::pin_init(XArray::<KBox<u32>>::new(AllocKind::Alloc), GFP_KERNEL)?;
+    /// let mut guard = xa.lock();
+    ///
+    /// guard.store(100, KBox::new(42u32, GFP_KERNEL)?, GFP_KERNEL)?;
+    /// let entry = guard.find_next_entry_circular(101);
+    /// assert_eq!(entry.map(|e| e.index()), Some(100));
+    ///
+    /// # Ok::<(), kernel::error::Error>(())
+    /// ```
+    pub fn find_next_entry_circular<'b>(
+        &'b mut self,
+        index: usize,
+    ) -> Option<OccupiedEntry<'a, 'b, T>> {
+        let mut state = XArrayState::new(self, index);
+
+        // SAFETY: `state.state` is properly initialized by XArrayState::new and the caller holds
+        // the lock.
+        let ptr = NonNull::new(unsafe { bindings::xas_find(&mut state.state, usize::MAX) })
+            .or_else(|| {
+                state.state.xa_node = bindings::XAS_RESTART as *mut bindings::xa_node;
+                state.state.xa_index = 0;
+                // SAFETY: `state.state` is properly initialized and by type invariant, we hold the
+                // xarray lock.
+                NonNull::new(unsafe { bindings::xas_find(&mut state.state, index) })
+            })?;
+
+        Some(OccupiedEntry { state, ptr })
+    }
+
     /// Removes and returns the element at the given index.
     pub fn remove(&mut self, index: usize) -> Option<T> {
         // SAFETY:
@@ -416,8 +521,15 @@ fn new(access: &'b Guard<'a, T>, index: usize) -> Self {
             },
         }
     }
+
+    fn status(&self) -> Result {
+        // SAFETY: `self.state` is properly initialized and valid.
+        to_result(unsafe { bindings::xas_error(&self.state) })
+    }
 }
 
+mod entry;
+
 // SAFETY: `XArray<T>` has no shared mutable state so it is `Send` iff `T` is `Send`.
 unsafe impl<T: ForeignOwnable + Send> Send for XArray<T> {}
 
diff --git a/rust/kernel/xarray/entry.rs b/rust/kernel/xarray/entry.rs
new file mode 100644
index 0000000000000..1268dc35bac58
--- /dev/null
+++ b/rust/kernel/xarray/entry.rs
@@ -0,0 +1,376 @@
+// SPDX-License-Identifier: GPL-2.0
+
+use super::{
+    Guard,
+    StoreError,
+    XArrayState, //
+};
+use core::ptr::NonNull;
+use kernel::{
+    prelude::*,
+    types::ForeignOwnable, //
+};
+
+/// Represents either a vacant or occupied entry in an XArray.
+pub enum Entry<'a, 'b, T: ForeignOwnable> {
+    /// A vacant entry that can have a value inserted.
+    Vacant(VacantEntry<'a, 'b, T>),
+    /// An occupied entry containing a value.
+    Occupied(OccupiedEntry<'a, 'b, T>),
+}
+
+impl<T: ForeignOwnable> Entry<'_, '_, T> {
+    /// Returns true if this entry is occupied.
+    ///
+    /// # Examples
+    ///
+    /// ```
+    /// # use kernel::{prelude::*, xarray::{AllocKind, XArray, Entry}};
+    /// let mut xa = KBox::pin_init(XArray::<KBox<u32>>::new(AllocKind::Alloc), GFP_KERNEL)?;
+    /// let mut guard = xa.lock();
+    ///
+    ///
+    /// let entry = guard.get_entry(42);
+    /// assert_eq!(entry.is_occupied(), false);
+    ///
+    /// guard.store(42, KBox::new(0x1337u32, GFP_KERNEL)?, GFP_KERNEL)?;
+    /// let entry = guard.get_entry(42);
+    /// assert_eq!(entry.is_occupied(), true);
+    ///
+    /// # Ok::<(), kernel::error::Error>(())
+    /// ```
+    pub fn is_occupied(&self) -> bool {
+        matches!(self, Entry::Occupied(_))
+    }
+}
+
+/// A view into a vacant entry in an XArray.
+pub struct VacantEntry<'a, 'b, T: ForeignOwnable> {
+    state: XArrayState<'a, 'b, T>,
+}
+
+impl<'a, 'b, T> VacantEntry<'a, 'b, T>
+where
+    T: ForeignOwnable,
+{
+    pub(crate) fn new(guard: &'b mut Guard<'a, T>, index: usize) -> Self {
+        Self {
+            state: XArrayState::new(guard, index),
+        }
+    }
+
+    fn insert_internal(&mut self, value: T) -> Result<*mut c_void, StoreError<T>> {
+        let new = T::into_foreign(value).cast();
+
+        // SAFETY: `self.state.state` is properly initialized and `new` came from `T::into_foreign`.
+        // We hold the xarray lock.
+        unsafe { bindings::xas_store(&mut self.state.state, new) };
+
+        self.state.status().map(|()| new).map_err(|error| {
+            // SAFETY: `new` came from `T::into_foreign` and `xas_store` does not take ownership of
+            // the value on error.
+            let value = unsafe { T::from_foreign(new) };
+            StoreError { value, error }
+        })
+    }
+
+    /// Inserts a value into this vacant entry.
+    ///
+    /// Returns a reference to the newly inserted value.
+    ///
+    /// - This method will fail if the nodes on the path to the index
+    ///   represented by this entry are not present in the XArray.
+    /// - This method will not drop the XArray lock.
+    ///
+    ///
+    /// # Examples
+    ///
+    /// ```
+    /// # use kernel::{prelude::*, xarray::{AllocKind, XArray, Entry}};
+    /// let mut xa = KBox::pin_init(XArray::<KBox<u32>>::new(AllocKind::Alloc), GFP_KERNEL)?;
+    /// let mut guard = xa.lock();
+    ///
+    /// assert_eq!(guard.get(42), None);
+    ///
+    /// if let Entry::Vacant(entry) = guard.get_entry(42) {
+    ///     let value = KBox::new(0x1337u32, GFP_KERNEL)?;
+    ///     let borrowed = entry.insert(value)?;
+    ///     assert_eq!(*borrowed, 0x1337);
+    /// }
+    ///
+    /// assert_eq!(guard.get(42).copied(), Some(0x1337));
+    ///
+    /// # Ok::<(), kernel::error::Error>(())
+    /// ```
+    pub fn insert(mut self, value: T) -> Result<T::BorrowedMut<'b>, StoreError<T>> {
+        let new = self.insert_internal(value)?;
+
+        // SAFETY: `new` came from `T::into_foreign`. The entry has exclusive
+        // ownership of `new` as it holds a mutable reference to `Guard`.
+        Ok(unsafe { T::borrow_mut(new) })
+    }
+
+    /// Inserts a value and returns an occupied entry representing the newly inserted value.
+    ///
+    /// - This method will fail if the nodes on the path to the index
+    ///   represented by this entry are not present in the XArray.
+    /// - This method will not drop the XArray lock.
+    ///
+    /// # Examples
+    ///
+    /// ```
+    /// # use kernel::{prelude::*, xarray::{AllocKind, XArray, Entry}};
+    /// let mut xa = KBox::pin_init(XArray::<KBox<u32>>::new(AllocKind::Alloc), GFP_KERNEL)?;
+    /// let mut guard = xa.lock();
+    ///
+    /// assert_eq!(guard.get(42), None);
+    ///
+    /// if let Entry::Vacant(entry) = guard.get_entry(42) {
+    ///     let value = KBox::new(0x1337u32, GFP_KERNEL)?;
+    ///     let occupied = entry.insert_entry(value)?;
+    ///     assert_eq!(occupied.index(), 42);
+    /// }
+    ///
+    /// assert_eq!(guard.get(42).copied(), Some(0x1337));
+    ///
+    /// # Ok::<(), kernel::error::Error>(())
+    /// ```
+    pub fn insert_entry(mut self, value: T) -> Result<OccupiedEntry<'a, 'b, T>, StoreError<T>> {
+        let new = self.insert_internal(value)?;
+
+        Ok(OccupiedEntry::<'a, 'b, T> {
+            state: self.state,
+            // SAFETY: `new` came from `T::into_foreign` and is guaranteed non-null.
+            ptr: unsafe { core::ptr::NonNull::new_unchecked(new) },
+        })
+    }
+
+    /// Returns the index of this vacant entry.
+    ///
+    /// # Examples
+    ///
+    /// ```
+    /// # use kernel::{prelude::*, xarray::{AllocKind, XArray, Entry}};
+    /// let mut xa = KBox::pin_init(XArray::<KBox<u32>>::new(AllocKind::Alloc), GFP_KERNEL)?;
+    /// let mut guard = xa.lock();
+    ///
+    /// assert_eq!(guard.get(42), None);
+    ///
+    /// if let Entry::Vacant(entry) = guard.get_entry(42) {
+    ///     assert_eq!(entry.index(), 42);
+    /// }
+    ///
+    /// # Ok::<(), kernel::error::Error>(())
+    /// ```
+    pub fn index(&self) -> usize {
+        self.state.state.xa_index
+    }
+}
+
+/// A view into an occupied entry in an XArray.
+pub struct OccupiedEntry<'a, 'b, T: ForeignOwnable> {
+    pub(crate) state: XArrayState<'a, 'b, T>,
+    pub(crate) ptr: NonNull<c_void>,
+}
+
+impl<'a, 'b, T> OccupiedEntry<'a, 'b, T>
+where
+    T: ForeignOwnable,
+{
+    pub(crate) fn new(guard: &'b mut Guard<'a, T>, index: usize, ptr: NonNull<c_void>) -> Self {
+        Self {
+            state: XArrayState::new(guard, index),
+            ptr,
+        }
+    }
+
+    /// Removes the value from this occupied entry and returns it, consuming the entry.
+    ///
+    /// # Examples
+    ///
+    /// ```
+    /// # use kernel::{prelude::*, xarray::{AllocKind, XArray, Entry}};
+    /// let mut xa = KBox::pin_init(XArray::<KBox<u32>>::new(AllocKind::Alloc), GFP_KERNEL)?;
+    /// let mut guard = xa.lock();
+    ///
+    /// guard.store(42, KBox::new(0x1337u32, GFP_KERNEL)?, GFP_KERNEL)?;
+    /// assert_eq!(guard.get(42).copied(), Some(0x1337));
+    ///
+    /// if let Entry::Occupied(entry) = guard.get_entry(42) {
+    ///     let value = entry.remove();
+    ///     assert_eq!(*value, 0x1337);
+    /// }
+    ///
+    /// assert_eq!(guard.get(42), None);
+    ///
+    /// # Ok::<(), kernel::error::Error>(())
+    /// ```
+    pub fn remove(mut self) -> T {
+        // NOTE: Storing NULL to an occupied slot never fails.
+        // SAFETY: `self.state.state` is properly initialized and valid for XAS operations.
+        let ptr = unsafe {
+            bindings::xas_result(
+                &mut self.state.state,
+                bindings::xa_zero_to_null(bindings::xas_store(
+                    &mut self.state.state,
+                    core::ptr::null_mut(),
+                )),
+            )
+        };
+
+        // SAFETY: `ptr` is a valid return value from xas_result.
+        let errno = unsafe { bindings::xa_err(ptr) };
+        debug_assert!(errno == 0);
+
+        // SAFETY:
+        // - `ptr` came from `T::into_foreign`.
+        // - As this method takes self by value, the lifetimes of any [`T::Borrowed`] and
+        //   [`T::BorrowedMut`] we have created must have ended.
+        unsafe { T::from_foreign(ptr.cast()) }
+    }
+
+    /// Returns the index of this occupied entry.
+    ///
+    /// # Examples
+    ///
+    /// ```
+    /// # use kernel::{prelude::*, xarray::{AllocKind, XArray, Entry}};
+    /// let mut xa = KBox::pin_init(XArray::<KBox<u32>>::new(AllocKind::Alloc), GFP_KERNEL)?;
+    /// let mut guard = xa.lock();
+    ///
+    /// guard.store(42, KBox::new(0x1337u32, GFP_KERNEL)?, GFP_KERNEL)?;
+    ///
+    /// if let Entry::Occupied(entry) = guard.get_entry(42) {
+    ///     assert_eq!(entry.index(), 42);
+    /// }
+    ///
+    /// # Ok::<(), kernel::error::Error>(())
+    /// ```
+    pub fn index(&self) -> usize {
+        self.state.state.xa_index
+    }
+
+    /// Replaces the value in this occupied entry and returns the old value.
+    ///
+    /// # Examples
+    ///
+    /// ```
+    /// # use kernel::{prelude::*, xarray::{AllocKind, XArray, Entry}};
+    /// let mut xa = KBox::pin_init(XArray::<KBox<u32>>::new(AllocKind::Alloc), GFP_KERNEL)?;
+    /// let mut guard = xa.lock();
+    ///
+    /// guard.store(42, KBox::new(0x1337u32, GFP_KERNEL)?, GFP_KERNEL)?;
+    ///
+    /// if let Entry::Occupied(mut entry) = guard.get_entry(42) {
+    ///     let new_value = KBox::new(0x9999u32, GFP_KERNEL)?;
+    ///     let old_value = entry.insert(new_value);
+    ///     assert_eq!(*old_value, 0x1337);
+    /// }
+    ///
+    /// assert_eq!(guard.get(42).copied(), Some(0x9999));
+    ///
+    /// # Ok::<(), kernel::error::Error>(())
+    /// ```
+    pub fn insert(&mut self, value: T) -> T {
+        // NOTE: Storing to an occupied slot never fails.
+        let new = T::into_foreign(value).cast();
+        // SAFETY: `new` came from `T::into_foreign` and is guaranteed non-null.
+        self.ptr = unsafe { NonNull::new_unchecked(new) };
+
+        // SAFETY: `self.state.state` is properly initialized and valid for XAS operations.
+        let old = unsafe {
+            bindings::xas_result(
+                &mut self.state.state,
+                bindings::xa_zero_to_null(bindings::xas_store(&mut self.state.state, new)),
+            )
+        };
+
+        // SAFETY: `old` is a valid return value from xas_result.
+        let errno = unsafe { bindings::xa_err(old) };
+        debug_assert!(errno == 0);
+
+        // SAFETY:
+        // - `old` came from `T::into_foreign`.
+        // - As this method takes `self` by mutable reference, the lifetimes of any
+        //   [`T::Borrowed`] and [`T::BorrowedMut`] created from `old` must have ended.
+        unsafe { T::from_foreign(old) }
+    }
+
+    /// Converts this occupied entry into a mutable reference to the value in the slot represented
+    /// by the entry.
+    ///
+    /// # Examples
+    ///
+    /// ```
+    /// # use kernel::{prelude::*, xarray::{AllocKind, XArray, Entry}};
+    /// let mut xa = KBox::pin_init(XArray::<KBox<u32>>::new(AllocKind::Alloc), GFP_KERNEL)?;
+    /// let mut guard = xa.lock();
+    ///
+    /// guard.store(42, KBox::new(0x1337u32, GFP_KERNEL)?, GFP_KERNEL)?;
+    ///
+    /// if let Entry::Occupied(entry) = guard.get_entry(42) {
+    ///     let value_ref = entry.into_mut();
+    ///     *value_ref = 0x9999;
+    /// }
+    ///
+    /// assert_eq!(guard.get(42).copied(), Some(0x9999));
+    ///
+    /// # Ok::<(), kernel::error::Error>(())
+    /// ```
+    pub fn into_mut(self) -> T::BorrowedMut<'b> {
+        // SAFETY: `ptr` came from `T::into_foreign`.
+        unsafe { T::borrow_mut(self.ptr.as_ptr()) }
+    }
+
+    /// Swaps the value in this entry with the provided value.
+    ///
+    /// Returns the old value that was in the entry.
+    ///
+    /// # Examples
+    ///
+    /// ```
+    /// # use kernel::{prelude::*, xarray::{AllocKind, XArray, Entry}};
+    /// let mut xa = KBox::pin_init(XArray::<KBox<u32>>::new(AllocKind::Alloc), GFP_KERNEL)?;
+    /// let mut guard = xa.lock();
+    ///
+    /// guard.store(42, KBox::new(100u32, GFP_KERNEL)?, GFP_KERNEL)?;
+    ///
+    /// if let Entry::Occupied(mut entry) = guard.get_entry(42) {
+    ///     let old_value = entry.swap(200u32);
+    ///     assert_eq!(old_value, 100);
+    ///     assert_eq!(*entry, 200);
+    /// }
+    ///
+    /// # Ok::<(), kernel::error::Error>(())
+    /// ```
+    pub fn swap<U>(&mut self, mut other: U) -> U
+    where
+        T: for<'c> ForeignOwnable<Borrowed<'c> = &'c U, BorrowedMut<'c> = &'c mut U>,
+    {
+        use core::ops::DerefMut;
+        core::mem::swap(self.deref_mut(), &mut other);
+        other
+    }
+}
+
+impl<T, U> core::ops::Deref for OccupiedEntry<'_, '_, T>
+where
+    T: for<'a> ForeignOwnable<Borrowed<'a> = &'a U, BorrowedMut<'a> = &'a mut U>,
+{
+    type Target = U;
+
+    fn deref(&self) -> &Self::Target {
+        // SAFETY: `ptr` came from `T::into_foreign`.
+        unsafe { T::borrow(self.ptr.as_ptr()) }
+    }
+}
+
+impl<T, U> core::ops::DerefMut for OccupiedEntry<'_, '_, T>
+where
+    T: for<'a> ForeignOwnable<Borrowed<'a> = &'a U, BorrowedMut<'a> = &'a mut U>,
+{
+    fn deref_mut(&mut self) -> &mut Self::Target {
+        // SAFETY: `ptr` came from `T::into_foreign`.
+        unsafe { T::borrow_mut(self.ptr.as_ptr()) }
+    }
+}

-- 
2.51.2




* [PATCH 09/10] rust: xarray: add preload API
From: Andreas Hindborg @ 2025-12-03 22:26 UTC (permalink / raw)
  To: Tamir Duberstein, Miguel Ojeda, Alex Gaynor, Boqun Feng, Gary Guo,
	Björn Roy Baron, Benno Lossin, Alice Ryhl, Trevor Gross,
	Danilo Krummrich
  Cc: Daniel Gomez, rust-for-linux, linux-kernel, Andreas Hindborg

Add a preload API that allows preallocating memory for XArray
insertions. This lets insertions proceed without risking allocation
failure in contexts where memory allocation is not desirable, such as
atomic context, or where reliability is critical.

The API includes:

- `XArrayPreloadBuffer` for managing a pool of preallocated nodes.
- `XArrayPreloadNode` representing a single preallocated node.
- Integration with the entry API, allowing `VacantEntry::insert` and
  `VacantEntry::insert_entry` to accept an optional preload buffer.
- A new `Guard::insert_entry` method for inserting with preload support.

The implementation uses a circular buffer to manage the preallocated
nodes. When an insertion fails with ENOMEM, the insert path pulls a
preallocated node from the buffer (if one is available), hands it to the
XArray state via `xa_alloc`, and retries the store.
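
A minimal sketch of the intended flow, assuming an `XArray<KBox<u32>>`
named `xa` and that all allocation should happen before the XArray lock
is taken:

    // Outside the lock: size the buffer for one leaf insertion and fill it.
    let mut buffer = XArrayPreloadBuffer::new(1)?;
    buffer.preload(GFP_KERNEL)?;
    let value = KBox::new(42u32, GFP_KERNEL)?;

    // Under the lock: internal nodes required by the store are taken from
    // `buffer` rather than allocated.
    let mut guard = xa.lock();
    if let Entry::Vacant(entry) = guard.get_entry(10) {
        entry.insert(value, Some(&mut buffer))?;
    }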

Signed-off-by: Andreas Hindborg <a.hindborg@kernel.org>
---
 rust/bindings/bindings_helper.h |   3 +
 rust/kernel/xarray.rs           |  58 ++++++++++-
 rust/kernel/xarray/entry.rs     |  64 +++++++++---
 rust/kernel/xarray/preload.rs   | 217 ++++++++++++++++++++++++++++++++++++++++
 4 files changed, 324 insertions(+), 18 deletions(-)

diff --git a/rust/bindings/bindings_helper.h b/rust/bindings/bindings_helper.h
index 86bca946faff0..8e9f8762d5e6e 100644
--- a/rust/bindings/bindings_helper.h
+++ b/rust/bindings/bindings_helper.h
@@ -110,6 +110,9 @@ const gfp_t RUST_CONST_HELPER_XA_FLAGS_ALLOC = XA_FLAGS_ALLOC;
 const gfp_t RUST_CONST_HELPER_XA_FLAGS_ALLOC1 = XA_FLAGS_ALLOC1;
 const vm_flags_t RUST_CONST_HELPER_VM_MERGEABLE = VM_MERGEABLE;
 const size_t RUST_CONST_HELPER_XAS_RESTART = (size_t)XAS_RESTART;
+const size_t RUST_CONST_HELPER_XA_CHUNK_SHIFT = XA_CHUNK_SHIFT;
+const size_t RUST_CONST_HELPER_XA_CHUNK_SIZE = XA_CHUNK_SIZE;
+extern struct kmem_cache *radix_tree_node_cachep;
 
 #if IS_ENABLED(CONFIG_ANDROID_BINDER_IPC_RUST)
 #include "../../drivers/android/binder/rust_binder.h"
diff --git a/rust/kernel/xarray.rs b/rust/kernel/xarray.rs
index 2b8d56c81e36b..a405f2b6fdcad 100644
--- a/rust/kernel/xarray.rs
+++ b/rust/kernel/xarray.rs
@@ -23,6 +23,7 @@
     bindings,
     build_assert, //
     error::{
+        code::*,
         to_result,
         Error,
         Result, //
@@ -40,6 +41,12 @@
     pinned_drop,
     PinInit, //
 };
+pub use preload::{
+    XArrayPreloadBuffer,
+    XArrayPreloadNode, //
+};
+
+mod preload;
 
 /// An array which efficiently maps sparse integer indices to owned objects.
 ///
@@ -166,7 +173,6 @@ pub fn try_lock(&self) -> Option<Guard<'_, T>> {
     pub fn lock(&self) -> Guard<'_, T> {
         // SAFETY: `self.xa` is always valid by the type invariant.
         unsafe { bindings::xa_lock(self.xa.get()) };
-
         Guard {
             xa: self,
             _not_send: NotThreadSafe,
@@ -274,7 +280,7 @@ pub fn get_mut(&mut self, index: usize) -> Option<T::BorrowedMut<'_>> {
     ///
     /// match guard.get_entry(42) {
     ///     Entry::Vacant(entry) => {
-    ///         entry.insert(KBox::new(0x1337u32, GFP_KERNEL)?)?;
+    ///         entry.insert(KBox::new(0x1337u32, GFP_KERNEL)?, None)?;
     ///     }
     ///     Entry::Occupied(_) => unreachable!("We did not insert an entry yet"),
     /// }
@@ -487,6 +493,45 @@ pub fn store(
             Ok(unsafe { T::try_from_foreign(old) })
         }
     }
+
+    /// Inserts a value and returns an occupied entry for further operations.
+    ///
+    /// If a value is already present, the operation fails.
+    ///
+    /// This method will not drop the XArray lock. If memory allocation is
+    /// required for the operation to succeed, the user should supply memory
+    /// through the `preload` argument.
+    ///
+    /// # Examples
+    ///
+    /// ```
+    /// # use kernel::{prelude::*, xarray::{AllocKind, XArray}};
+    /// let mut xa = KBox::pin_init(XArray::<KBox<u32>>::new(AllocKind::Alloc), GFP_KERNEL)?;
+    /// let mut guard = xa.lock();
+    ///
+    /// assert_eq!(guard.get(42), None);
+    ///
+    /// let value = KBox::new(0x1337u32, GFP_KERNEL)?;
+    /// let entry = guard.insert_entry(42, value, None)?;
+    /// let borrowed = entry.into_mut();
+    /// assert_eq!(borrowed, &0x1337);
+    ///
+    /// # Ok::<(), kernel::error::Error>(())
+    /// ```
+    pub fn insert_entry<'b>(
+        &'b mut self,
+        index: usize,
+        value: T,
+        preload: Option<&mut XArrayPreloadBuffer>,
+    ) -> Result<OccupiedEntry<'a, 'b, T>, StoreError<T>> {
+        match self.get_entry(index) {
+            Entry::Vacant(entry) => entry.insert_entry(value, preload),
+            Entry::Occupied(_) => Err(StoreError {
+                error: EBUSY,
+                value,
+            }),
+        }
+    }
 }
 
 /// Internal state for XArray iteration and entry operations.
@@ -501,6 +546,15 @@ pub(crate) struct XArrayState<'a, 'b, T: ForeignOwnable> {
     state: bindings::xa_state,
 }
 
+impl<'a, 'b, T: ForeignOwnable> Drop for XArrayState<'a, 'b, T> {
+    fn drop(&mut self) {
+        if !self.state.xa_alloc.is_null() {
+            // SAFETY: `xa_alloc` is a valid pointer to a preallocated node when non-null.
+            drop(unsafe { XArrayPreloadNode::from_raw(self.state.xa_alloc) })
+        }
+    }
+}
+
 impl<'a, 'b, T: ForeignOwnable> XArrayState<'a, 'b, T> {
     fn new(access: &'b Guard<'a, T>, index: usize) -> Self {
         let ptr = access.xa.xa.get();
diff --git a/rust/kernel/xarray/entry.rs b/rust/kernel/xarray/entry.rs
index 1268dc35bac58..2d6ef4781f47d 100644
--- a/rust/kernel/xarray/entry.rs
+++ b/rust/kernel/xarray/entry.rs
@@ -3,6 +3,7 @@
 use super::{
     Guard,
     StoreError,
+    XArrayPreloadBuffer,
     XArrayState, //
 };
 use core::ptr::NonNull;
@@ -29,9 +30,9 @@ impl<T: ForeignOwnable> Entry<'_, '_, T> {
     /// let mut xa = KBox::pin_init(XArray::<KBox<u32>>::new(AllocKind::Alloc), GFP_KERNEL)?;
     /// let mut guard = xa.lock();
     ///
-    ///
     /// let entry = guard.get_entry(42);
     /// assert_eq!(entry.is_occupied(), false);
+    /// drop(entry);
     ///
     /// guard.store(42, KBox::new(0x1337u32, GFP_KERNEL)?, GFP_KERNEL)?;
     /// let entry = guard.get_entry(42);
@@ -59,16 +60,37 @@ pub(crate) fn new(guard: &'b mut Guard<'a, T>, index: usize) -> Self {
         }
     }
 
-    fn insert_internal(&mut self, value: T) -> Result<*mut c_void, StoreError<T>> {
+    fn insert_internal(
+        &mut self,
+        value: T,
+        mut preload: Option<&mut XArrayPreloadBuffer>,
+    ) -> Result<*mut c_void, StoreError<T>> {
         let new = T::into_foreign(value).cast();
 
-        // SAFETY: `self.state.state` is properly initialized and `new` came from `T::into_foreign`.
-        // We hold the xarray lock.
-        unsafe { bindings::xas_store(&mut self.state.state, new) };
+        loop {
+            // SAFETY: `self.state.state` is properly initialized and `new` came from
+            // `T::into_foreign`. We hold the xarray lock.
+            unsafe { bindings::xas_store(&mut self.state.state, new) };
+
+            match self.state.status() {
+                Ok(()) => break Ok(new),
+                Err(ENOMEM) => {
+                    debug_assert!(self.state.state.xa_alloc.is_null());
+                    let node = match preload.as_mut().map(|node| node.take_one().ok_or(ENOMEM)) {
+                        None => break Err(ENOMEM),
+                        Some(Err(e)) => break Err(e),
+                        Some(Ok(node)) => node,
+                    };
 
-        self.state.status().map(|()| new).map_err(|error| {
-            // SAFETY: `new` came from `T::into_foreign` and `xas_store` does not take ownership of
-            // the value on error.
+                    self.state.state.xa_alloc = node.into_raw();
+                    continue;
+                }
+                Err(e) => break Err(e),
+            }
+        }
+        .map_err(|error| {
+            // SAFETY: `new` came from `T::into_foreign` and `xas_store` does not take
+            // ownership of the value on error.
             let value = unsafe { T::from_foreign(new) };
             StoreError { value, error }
         })
@@ -79,7 +101,8 @@ fn insert_internal(&mut self, value: T) -> Result<*mut c_void, StoreError<T>> {
     /// Returns a reference to the newly inserted value.
     ///
     /// - This method will fail if the nodes on the path to the index
-    ///   represented by this entry are not present in the XArray.
+    ///   represented by this entry are not present in the XArray and no memory
+    ///   is available via the `preload` argument.
     /// - This method will not drop the XArray lock.
     ///
     ///
@@ -94,7 +117,7 @@ fn insert_internal(&mut self, value: T) -> Result<*mut c_void, StoreError<T>> {
     ///
     /// if let Entry::Vacant(entry) = guard.get_entry(42) {
     ///     let value = KBox::new(0x1337u32, GFP_KERNEL)?;
-    ///     let borrowed = entry.insert(value)?;
+    ///     let borrowed = entry.insert(value, None)?;
     ///     assert_eq!(*borrowed, 0x1337);
     /// }
     ///
@@ -102,8 +125,12 @@ fn insert_internal(&mut self, value: T) -> Result<*mut c_void, StoreError<T>> {
     ///
     /// # Ok::<(), kernel::error::Error>(())
     /// ```
-    pub fn insert(mut self, value: T) -> Result<T::BorrowedMut<'b>, StoreError<T>> {
-        let new = self.insert_internal(value)?;
+    pub fn insert(
+        mut self,
+        value: T,
+        preload: Option<&mut XArrayPreloadBuffer>,
+    ) -> Result<T::BorrowedMut<'b>, StoreError<T>> {
+        let new = self.insert_internal(value, preload)?;
 
         // SAFETY: `new` came from `T::into_foreign`. The entry has exclusive
         // ownership of `new` as it holds a mutable reference to `Guard`.
@@ -113,7 +140,8 @@ pub fn insert(mut self, value: T) -> Result<T::BorrowedMut<'b>, StoreError<T>> {
     /// Inserts a value and returns an occupied entry representing the newly inserted value.
     ///
     /// - This method will fail if the nodes on the path to the index
-    ///   represented by this entry are not present in the XArray.
+    ///   represented by this entry are not present in the XArray and no memory
+    ///   is available via the `preload` argument.
     /// - This method will not drop the XArray lock.
     ///
     /// # Examples
@@ -127,7 +155,7 @@ pub fn insert(mut self, value: T) -> Result<T::BorrowedMut<'b>, StoreError<T>> {
     ///
     /// if let Entry::Vacant(entry) = guard.get_entry(42) {
     ///     let value = KBox::new(0x1337u32, GFP_KERNEL)?;
-    ///     let occupied = entry.insert_entry(value)?;
+    ///     let occupied = entry.insert_entry(value, None)?;
     ///     assert_eq!(occupied.index(), 42);
     /// }
     ///
@@ -135,8 +163,12 @@ pub fn insert(mut self, value: T) -> Result<T::BorrowedMut<'b>, StoreError<T>> {
     ///
     /// # Ok::<(), kernel::error::Error>(())
     /// ```
-    pub fn insert_entry(mut self, value: T) -> Result<OccupiedEntry<'a, 'b, T>, StoreError<T>> {
-        let new = self.insert_internal(value)?;
+    pub fn insert_entry(
+        mut self,
+        value: T,
+        preload: Option<&mut XArrayPreloadBuffer>,
+    ) -> Result<OccupiedEntry<'a, 'b, T>, StoreError<T>> {
+        let new = self.insert_internal(value, preload)?;
 
         Ok(OccupiedEntry::<'a, 'b, T> {
             state: self.state,
diff --git a/rust/kernel/xarray/preload.rs b/rust/kernel/xarray/preload.rs
new file mode 100644
index 0000000000000..964b16a0e6199
--- /dev/null
+++ b/rust/kernel/xarray/preload.rs
@@ -0,0 +1,217 @@
+// SPDX-License-Identifier: GPL-2.0
+
+use kernel::prelude::*;
+
+/// A buffer for preallocating XArray nodes.
+///
+/// This structure allows preallocating memory for XArray insertions to avoid
+/// allocation failures during operations where allocation is not desirable.
+pub struct XArrayPreloadBuffer {
+    nodes: KVec<*mut bindings::xa_node>,
+    size: usize,
+    head: usize,
+    tail: usize,
+}
+
+impl XArrayPreloadBuffer {
+    /// Creates a new preload buffer with capacity for the given number of leaf values.
+    ///
+    /// Inserting a leaf value into an [`XArray`] may require allocating a
+    /// number of internal nodes. The constructor computes an upper bound on
+    /// the number of internal nodes needed to insert `entry_count` leaf
+    /// values and sizes the buffer accordingly.
+    ///
+    /// # Examples
+    ///
+    /// ```
+    /// # use kernel::{prelude::*, xarray::XArrayPreloadBuffer};
+    /// let buffer = XArrayPreloadBuffer::new(10)?;
+    /// # Ok::<(), kernel::error::Error>(())
+    /// ```
+    /// [`XArray`]: super::XArray
+    pub fn new(entry_count: usize) -> Result<Self> {
+        let node_count = entry_count
+            * ((usize::BITS as usize / bindings::XA_CHUNK_SHIFT)
+                + if (usize::BITS as usize % bindings::XA_CHUNK_SHIFT) == 0 {
+                    0
+                } else {
+                    1
+                });
+
+        let mut this = Self {
+            nodes: KVec::new(),
+            size: node_count + 1,
+            head: 0,
+            tail: 0,
+        };
+
+        for _ in 0..this.size {
+            this.nodes.push(core::ptr::null_mut(), GFP_KERNEL)?;
+        }
+
+        Ok(this)
+    }
+
+    /// Allocates internal nodes until the buffer is full.
+    pub fn preload(&mut self, flags: kernel::alloc::Flags) -> Result {
+        while !self.full() {
+            self.alloc(flags)?
+        }
+        Ok(())
+    }
+
+    /// Fills the buffer with preallocated nodes from the given vector.
+    ///
+    /// Nodes are moved from the vector into the buffer until the buffer is full
+    /// or the vector is empty.
+    ///
+    /// # Examples
+    ///
+    /// ```
+    /// # use kernel::{prelude::*, xarray::{XArrayPreloadBuffer, XArrayPreloadNode}};
+    /// let mut buffer = XArrayPreloadBuffer::new(5)?;
+    /// let mut nodes = KVec::new();
+    /// nodes.push(XArrayPreloadNode::new(GFP_KERNEL)?, GFP_KERNEL)?;
+    /// buffer.preload_with(&mut nodes)?;
+    /// # Ok::<(), kernel::error::Error>(())
+    /// ```
+    pub fn preload_with(&mut self, nodes: &mut KVec<XArrayPreloadNode>) -> Result {
+        while !self.full() {
+            if let Some(node) = nodes.pop() {
+                self.push(node)?
+            } else {
+                break;
+            }
+        }
+
+        Ok(())
+    }
+
+    /// Returns `true` if the buffer is full and cannot accept more nodes.
+    ///
+    /// # Examples
+    ///
+    /// ```
+    /// # use kernel::{prelude::*, xarray::{XArrayPreloadBuffer, XArrayPreloadNode}};
+    /// let mut buffer = XArrayPreloadBuffer::new(1)?;
+    /// if !buffer.full() {
+    ///     let count = buffer.free_count();
+    ///     let mut nodes = KVec::new();
+    ///     for _ in 0..count {
+    ///         nodes.push(XArrayPreloadNode::new(GFP_KERNEL)?, GFP_KERNEL)?;
+    ///     }
+    ///     buffer.preload_with(&mut nodes)?;
+    /// }
+    /// # Ok::<(), kernel::error::Error>(())
+    /// ```
+    pub fn full(&self) -> bool {
+        (self.head + 1) % self.size == self.tail
+    }
+
+    fn empty(&self) -> bool {
+        self.head == self.tail
+    }
+
+    /// Returns the number of available slots in the buffer.
+    pub fn free_count(&self) -> usize {
+        (if self.head >= self.tail {
+            self.size - (self.head - self.tail)
+        } else {
+            self.tail - self.head
+        } - 1)
+    }
+
+    fn alloc(&mut self, flags: kernel::alloc::Flags) -> Result {
+        if self.full() {
+            return Err(ENOSPC);
+        }
+
+        self.push(XArrayPreloadNode::new(flags)?)?;
+
+        Ok(())
+    }
+
+    fn push(&mut self, node: XArrayPreloadNode) -> Result {
+        if self.full() {
+            return Err(ENOSPC);
+        }
+
+        self.nodes[self.head] = node.into_raw();
+        self.head = (self.head + 1) % self.size;
+
+        Ok(())
+    }
+
+    /// Removes and returns one preallocated node from the buffer.
+    ///
+    /// Returns `None` if the buffer is empty.
+    pub(crate) fn take_one(&mut self) -> Option<XArrayPreloadNode> {
+        if self.empty() {
+            return None;
+        }
+
+        let node = self.nodes[self.tail];
+        self.tail = (self.tail + 1) % self.size;
+
+        Some(XArrayPreloadNode(node))
+    }
+}
+
+impl Drop for XArrayPreloadBuffer {
+    fn drop(&mut self) {
+        while !self.empty() {
+            drop(self.take_one().expect("Not empty"));
+        }
+    }
+}
+
+/// A preallocated XArray node.
+///
+/// This represents a single preallocated internal node for an XArray.
+/// Nodes can be stored in an [`XArrayPreloadBuffer`] for later use.
+pub struct XArrayPreloadNode(*mut bindings::xa_node);
+
+impl XArrayPreloadNode {
+    /// Allocates a new XArray node.
+    ///
+    /// # Examples
+    ///
+    /// ```
+    /// # use kernel::{prelude::*, xarray::XArrayPreloadNode};
+    /// let node = XArrayPreloadNode::new(GFP_KERNEL)?;
+    /// # Ok::<(), kernel::error::Error>(())
+    /// ```
+    pub fn new(flags: kernel::alloc::Flags) -> Result<Self> {
+        // SAFETY: `radix_tree_node_cachep` is a valid kmem cache for XArray nodes.
+        let ptr = unsafe {
+            bindings::kmem_cache_alloc_noprof(bindings::radix_tree_node_cachep, flags.as_raw())
+        };
+
+        if ptr.is_null() {
+            return Err(ENOMEM);
+        }
+
+        // SAFETY: `ptr` is non-null and was allocated from `radix_tree_node_cachep`.
+        Ok(unsafe { XArrayPreloadNode::from_raw(ptr.cast()) })
+    }
+
+    pub(crate) fn into_raw(self) -> *mut bindings::xa_node {
+        self.0
+    }
+
+    /// Creates an `XArrayPreloadNode` from a raw pointer.
+    ///
+    /// # Safety
+    ///
+    /// `ptr` must be a valid pointer to an XArray node allocated from `radix_tree_node_cachep`.
+    pub(crate) unsafe fn from_raw(ptr: *mut bindings::xa_node) -> Self {
+        Self(ptr)
+    }
+}
+
+impl Drop for XArrayPreloadNode {
+    fn drop(&mut self) {
+        // SAFETY: `self.0` is a valid pointer allocated from `radix_tree_node_cachep`.
+        unsafe { bindings::kmem_cache_free(bindings::radix_tree_node_cachep, self.0.cast()) }
+    }
+}
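
A minimal sketch of the intended workflow, using only the APIs added in this
series: preallocate internal nodes up front, then hand the buffer to `insert`
so that internal nodes do not have to be allocated while the XArray lock is
held. On a typical 64-bit configuration where `XA_CHUNK_SHIFT` is 6,
`XArrayPreloadBuffer::new(1)` reserves room for up to 64 / 6 rounded up,
i.e. 11, internal nodes.

    use kernel::prelude::*;
    use kernel::xarray::{AllocKind, Entry, XArray, XArrayPreloadBuffer};

    let mut xa = KBox::pin_init(XArray::<KBox<u32>>::new(AllocKind::Alloc), GFP_KERNEL)?;

    // Fill the buffer before taking the XArray lock.
    let mut preload = XArrayPreloadBuffer::new(1)?;
    preload.preload(GFP_KERNEL)?;

    let mut guard = xa.lock();
    if let Entry::Vacant(entry) = guard.get_entry(42) {
        let value = KBox::new(0x1337u32, GFP_KERNEL)?;
        // On ENOMEM, `insert` draws preallocated nodes from the buffer
        // instead of failing.
        let borrowed = entry.insert(value, Some(&mut preload))?;
        assert_eq!(*borrowed, 0x1337);
    }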

-- 
2.51.2



^ permalink raw reply related	[flat|nested] 11+ messages in thread

* [PATCH 10/10] rust: xarray: fix false positive lockdep warnings
  2025-12-03 22:26 [PATCH 00/10] rust: xarray: add entry API with preloading Andreas Hindborg
                   ` (8 preceding siblings ...)
  2025-12-03 22:26 ` [PATCH 09/10] rust: xarray: add preload API Andreas Hindborg
@ 2025-12-03 22:26 ` Andreas Hindborg
  9 siblings, 0 replies; 11+ messages in thread
From: Andreas Hindborg @ 2025-12-03 22:26 UTC (permalink / raw)
  To: Tamir Duberstein, Miguel Ojeda, Alex Gaynor, Boqun Feng, Gary Guo,
	Björn Roy Baron, Benno Lossin, Alice Ryhl, Trevor Gross,
	Danilo Krummrich
  Cc: Daniel Gomez, rust-for-linux, linux-kernel, Andreas Hindborg

Replace the `xa_init_flags` helper with direct initialization of XArray
structures using `__spin_lock_init`. This gives each XArray instantiation
site its own lock class key for lockdep, preventing the false positive
lock ordering warnings that result from every XArray sharing a single
static lock class key.

Add a `new_xarray!` macro that automatically generates a unique lock
class key for each XArray instantiation site. The macro accepts an
optional name parameter and uses the `optional_name!` and
`static_lock_class!` macros to generate appropriate names and lock
classes.

Update the `XArray::new` method signature to accept explicit name and
lock class key parameters. This enables proper lockdep tracking while
maintaining flexibility for advanced use cases.

Remove `rust_helper_xa_init_flags` as it is now dead code.
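
A minimal usage sketch, mirroring the documentation examples updated by
this patch:

    use kernel::{prelude::*, xarray::{new_xarray, AllocKind, XArray}};

    // Each `new_xarray!` call site gets its own static lock class key; a
    // name literal may be passed as an optional second argument.
    let xa: Pin<KBox<XArray<KBox<u32>>>> =
        KBox::pin_init(new_xarray!(AllocKind::Alloc), GFP_KERNEL)?;

    let mut guard = xa.lock();
    guard.store(42, KBox::new(0x1337u32, GFP_KERNEL)?, GFP_KERNEL)?;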

Signed-off-by: Andreas Hindborg <a.hindborg@kernel.org>
---
 rust/helpers/xarray.c       |  5 ---
 rust/kernel/xarray.rs       | 74 +++++++++++++++++++++++++++++++++------------
 rust/kernel/xarray/entry.rs | 55 ++++++++++++++++++---------------
 3 files changed, 86 insertions(+), 48 deletions(-)

diff --git a/rust/helpers/xarray.c b/rust/helpers/xarray.c
index 425a6cc494734..2852f63388eea 100644
--- a/rust/helpers/xarray.c
+++ b/rust/helpers/xarray.c
@@ -7,11 +7,6 @@ int rust_helper_xa_err(void *entry)
 	return xa_err(entry);
 }
 
-void rust_helper_xa_init_flags(struct xarray *xa, gfp_t flags)
-{
-	return xa_init_flags(xa, flags);
-}
-
 int rust_helper_xa_trylock(struct xarray *xa)
 {
 	return xa_trylock(xa);
diff --git a/rust/kernel/xarray.rs b/rust/kernel/xarray.rs
index a405f2b6fdcad..c393d4c01af2a 100644
--- a/rust/kernel/xarray.rs
+++ b/rust/kernel/xarray.rs
@@ -29,6 +29,8 @@
         Result, //
     },
     ffi::c_void,
+    str::CStr,
+    sync::LockClassKey,
     types::{
         ForeignOwnable,
         NotThreadSafe,
@@ -48,6 +50,19 @@
 
 mod preload;
 
+/// Creates an [`XArray`] initialiser with the given name and a newly-created lock class.
+///
+/// It uses the name if one is given, otherwise it generates one based on the file name and line
+/// number.
+#[macro_export]
+macro_rules! new_xarray {
+    ($kind:expr $(, $name:literal)? $(,)?) => {
+        $crate::xarray::XArray::new(
+            $kind, $crate::optional_name!($($name)?), $crate::static_lock_class!())
+    };
+}
+pub use new_xarray;
+
 /// An array which efficiently maps sparse integer indices to owned objects.
 ///
 /// This is similar to a [`crate::alloc::kvec::Vec<Option<T>>`], but more efficient when there are
@@ -62,9 +77,10 @@
 ///
 /// ```rust
 /// use kernel::alloc::KBox;
-/// use kernel::xarray::{AllocKind, XArray};
+/// use kernel::xarray::{new_xarray, AllocKind, XArray};
 ///
-/// let xa = KBox::pin_init(XArray::new(AllocKind::Alloc1), GFP_KERNEL)?;
+/// let xa: Pin<KBox<XArray<KBox<u32>>>> =
+///     KBox::pin_init(new_xarray!(AllocKind::Alloc1), GFP_KERNEL)?;
 ///
 /// let dead = KBox::new(0xdead, GFP_KERNEL)?;
 /// let beef = KBox::new(0xbeef, GFP_KERNEL)?;
@@ -124,7 +140,11 @@ pub enum AllocKind {
 
 impl<T: ForeignOwnable> XArray<T> {
     /// Creates a new initializer for this type.
-    pub fn new(kind: AllocKind) -> impl PinInit<Self> {
+    pub fn new(
+        kind: AllocKind,
+        name: &'static CStr,
+        key: Pin<&'static LockClassKey>,
+    ) -> impl PinInit<Self> {
         let flags = match kind {
             AllocKind::Alloc => bindings::XA_FLAGS_ALLOC,
             AllocKind::Alloc1 => bindings::XA_FLAGS_ALLOC1,
@@ -133,8 +153,14 @@ pub fn new(kind: AllocKind) -> impl PinInit<Self> {
             // SAFETY: `xa` is valid while the closure is called.
             //
             // INVARIANT: `xa` is initialized here to an empty, valid [`bindings::xarray`].
-            xa <- Opaque::ffi_init(|xa| unsafe {
-                bindings::xa_init_flags(xa, flags)
+            xa <- Opaque::ffi_init(|xa: *mut bindings::xarray| unsafe {
+                bindings::__spin_lock_init(
+                    &raw mut (*xa).xa_lock,
+                    name.as_ptr().cast(),
+                    key.as_ptr(),
+                );
+                (*xa).xa_flags = flags;
+                (*xa).xa_head = null_mut();
             }),
             _p: PhantomData,
         })
@@ -236,8 +262,12 @@ fn load(&self, index: usize) -> Option<NonNull<c_void>> {
     /// # Examples
     ///
     /// ```
-    /// # use kernel::{alloc::{flags::GFP_KERNEL, kbox::KBox}, xarray::{AllocKind, XArray}};
-    /// let xa = KBox::pin_init(XArray::new(AllocKind::Alloc), GFP_KERNEL)?;
+    /// # use kernel::{
+    /// #     alloc::{flags::GFP_KERNEL, kbox::KBox},
+    /// #     xarray::{new_xarray, AllocKind, XArray},
+    /// # };
+    /// let xa: Pin<KBox<XArray<KBox<u32>>>> =
+    ///     KBox::pin_init(new_xarray!(AllocKind::Alloc), GFP_KERNEL)?;
     ///
     /// let mut guard = xa.lock();
     /// assert_eq!(guard.contains_index(42), false);
@@ -272,8 +302,9 @@ pub fn get_mut(&mut self, index: usize) -> Option<T::BorrowedMut<'_>> {
     /// # Examples
     ///
     /// ```
-    /// # use kernel::{prelude::*, xarray::{AllocKind, XArray, Entry}};
-    /// let mut xa = KBox::pin_init(XArray::<KBox<u32>>::new(AllocKind::Alloc), GFP_KERNEL)?;
+    /// # use kernel::{prelude::*, xarray::{new_xarray, AllocKind, XArray, Entry}};
+    /// let mut xa: Pin<KBox<XArray<KBox<u32>>>> =
+    ///     KBox::pin_init(new_xarray!(AllocKind::Alloc), GFP_KERNEL)?;
     /// let mut guard = xa.lock();
     ///
     /// assert_eq!(guard.contains_index(42), false);
@@ -309,8 +340,9 @@ fn load_next(&self, index: usize) -> Option<(usize, NonNull<c_void>)> {
     /// # Examples
     ///
     /// ```
-    /// # use kernel::{prelude::*, xarray::{AllocKind, XArray}};
-    /// let mut xa = KBox::pin_init(XArray::<KBox<u32>>::new(AllocKind::Alloc), GFP_KERNEL)?;
+    /// # use kernel::{prelude::*, xarray::{new_xarray, AllocKind, XArray}};
+    /// let mut xa: Pin<KBox<XArray<KBox<u32>>>> =
+    ///     KBox::pin_init(new_xarray!(AllocKind::Alloc), GFP_KERNEL)?;
     /// let mut guard = xa.lock();
     ///
     /// guard.store(10, KBox::new(10u32, GFP_KERNEL)?, GFP_KERNEL)?;
@@ -339,8 +371,9 @@ pub fn find_next(&self, index: usize) -> Option<(usize, T::Borrowed<'_>)> {
     /// # Examples
     ///
     /// ```
-    /// # use kernel::{prelude::*, xarray::{AllocKind, XArray}};
-    /// let mut xa = KBox::pin_init(XArray::<KBox<u32>>::new(AllocKind::Alloc), GFP_KERNEL)?;
+    /// # use kernel::{prelude::*, xarray::{new_xarray, AllocKind, XArray}};
+    /// let mut xa: Pin<KBox<XArray<KBox<u32>>>> =
+    ///     KBox::pin_init(new_xarray!(AllocKind::Alloc), GFP_KERNEL)?;
     /// let mut guard = xa.lock();
     ///
     /// guard.store(10, KBox::new(10u32, GFP_KERNEL)?, GFP_KERNEL)?;
@@ -366,8 +399,9 @@ pub fn find_next_mut(&mut self, index: usize) -> Option<(usize, T::BorrowedMut<'
     /// # Examples
     ///
     /// ```
-    /// # use kernel::{prelude::*, xarray::{AllocKind, XArray}};
-    /// let mut xa = KBox::pin_init(XArray::<KBox<u32>>::new(AllocKind::Alloc), GFP_KERNEL)?;
+    /// # use kernel::{prelude::*, xarray::{new_xarray, AllocKind, XArray}};
+    /// let mut xa: Pin<KBox<XArray<KBox<u32>>>> =
+    ///     KBox::pin_init(new_xarray!(AllocKind::Alloc), GFP_KERNEL)?;
     /// let mut guard = xa.lock();
     ///
     /// guard.store(10, KBox::new(10u32, GFP_KERNEL)?, GFP_KERNEL)?;
@@ -401,8 +435,9 @@ pub fn find_next_entry<'b>(&'b mut self, index: usize) -> Option<OccupiedEntry<'
     /// # Examples
     ///
     /// ```
-    /// # use kernel::{prelude::*, xarray::{AllocKind, XArray}};
-    /// let mut xa = KBox::pin_init(XArray::<KBox<u32>>::new(AllocKind::Alloc), GFP_KERNEL)?;
+    /// # use kernel::{prelude::*, xarray::{new_xarray, AllocKind, XArray}};
+    /// let mut xa: Pin<KBox<XArray<KBox<u32>>>> =
+    ///     KBox::pin_init(new_xarray!(AllocKind::Alloc), GFP_KERNEL)?;
     /// let mut guard = xa.lock();
     ///
     /// guard.store(100, KBox::new(42u32, GFP_KERNEL)?, GFP_KERNEL)?;
@@ -505,8 +540,9 @@ pub fn store(
     /// # Examples
     ///
     /// ```
-    /// # use kernel::{prelude::*, xarray::{AllocKind, XArray}};
-    /// let mut xa = KBox::pin_init(XArray::<KBox<u32>>::new(AllocKind::Alloc), GFP_KERNEL)?;
+    /// # use kernel::{prelude::*, xarray::{new_xarray, AllocKind, XArray}};
+    /// let mut xa: Pin<KBox<XArray<KBox<u32>>>> =
+    ///     KBox::pin_init(new_xarray!(AllocKind::Alloc), GFP_KERNEL)?;
     /// let mut guard = xa.lock();
     ///
     /// assert_eq!(guard.get(42), None);
diff --git a/rust/kernel/xarray/entry.rs b/rust/kernel/xarray/entry.rs
index 2d6ef4781f47d..6e370252309fc 100644
--- a/rust/kernel/xarray/entry.rs
+++ b/rust/kernel/xarray/entry.rs
@@ -26,8 +26,9 @@ impl<T: ForeignOwnable> Entry<'_, '_, T> {
     /// # Examples
     ///
     /// ```
-    /// # use kernel::{prelude::*, xarray::{AllocKind, XArray, Entry}};
-    /// let mut xa = KBox::pin_init(XArray::<KBox<u32>>::new(AllocKind::Alloc), GFP_KERNEL)?;
+    /// # use kernel::{prelude::*, xarray::{new_xarray, AllocKind, XArray, Entry}};
+    /// let mut xa: Pin<KBox<XArray<KBox<u32>>>> =
+    ///     KBox::pin_init(new_xarray!(AllocKind::Alloc), GFP_KERNEL)?;
     /// let mut guard = xa.lock();
     ///
     /// let entry = guard.get_entry(42);
@@ -100,17 +101,17 @@ fn insert_internal(
     ///
     /// Returns a reference to the newly inserted value.
     ///
-    /// - This method will fail if the nodes on the path to the index
-    ///   represented by this entry are not present in the XArray and no memory
-    ///   is available via the `preload` argument.
+    /// - This method will fail if the nodes on the path to the index represented by this entry are
+    ///   not present in the XArray and no memory is available via the `preload` argument.
     /// - This method will not drop the XArray lock.
     ///
     ///
     /// # Examples
     ///
     /// ```
-    /// # use kernel::{prelude::*, xarray::{AllocKind, XArray, Entry}};
-    /// let mut xa = KBox::pin_init(XArray::<KBox<u32>>::new(AllocKind::Alloc), GFP_KERNEL)?;
+    /// # use kernel::{prelude::*, xarray::{new_xarray, AllocKind, XArray, Entry}};
+    /// let mut xa: Pin<KBox<XArray<KBox<u32>>>> =
+    ///     KBox::pin_init(new_xarray!(AllocKind::Alloc), GFP_KERNEL)?;
     /// let mut guard = xa.lock();
     ///
     /// assert_eq!(guard.get(42), None);
@@ -139,16 +140,16 @@ pub fn insert(
 
     /// Inserts a value and returns an occupied entry representing the newly inserted value.
     ///
-    /// - This method will fail if the nodes on the path to the index
-    ///   represented by this entry are not present in the XArray and no memory
-    ///   is available via the `preload` argument.
+    /// - This method will fail if the nodes on the path to the index represented by this entry are
+    ///   not present in the XArray and no memory is available via the `preload` argument.
     /// - This method will not drop the XArray lock.
     ///
     /// # Examples
     ///
     /// ```
-    /// # use kernel::{prelude::*, xarray::{AllocKind, XArray, Entry}};
-    /// let mut xa = KBox::pin_init(XArray::<KBox<u32>>::new(AllocKind::Alloc), GFP_KERNEL)?;
+    /// # use kernel::{prelude::*, xarray::{new_xarray, AllocKind, XArray, Entry}};
+    /// let mut xa: Pin<KBox<XArray<KBox<u32>>>> =
+    ///     KBox::pin_init(new_xarray!(AllocKind::Alloc), GFP_KERNEL)?;
     /// let mut guard = xa.lock();
     ///
     /// assert_eq!(guard.get(42), None);
@@ -182,8 +183,9 @@ pub fn insert_entry(
     /// # Examples
     ///
     /// ```
-    /// # use kernel::{prelude::*, xarray::{AllocKind, XArray, Entry}};
-    /// let mut xa = KBox::pin_init(XArray::<KBox<u32>>::new(AllocKind::Alloc), GFP_KERNEL)?;
+    /// # use kernel::{prelude::*, xarray::{new_xarray, AllocKind, XArray, Entry}};
+    /// let mut xa: Pin<KBox<XArray<KBox<u32>>>> =
+    ///     KBox::pin_init(new_xarray!(AllocKind::Alloc), GFP_KERNEL)?;
     /// let mut guard = xa.lock();
     ///
     /// assert_eq!(guard.get(42), None);
@@ -221,8 +223,9 @@ pub(crate) fn new(guard: &'b mut Guard<'a, T>, index: usize, ptr: NonNull<c_void
     /// # Examples
     ///
     /// ```
-    /// # use kernel::{prelude::*, xarray::{AllocKind, XArray, Entry}};
-    /// let mut xa = KBox::pin_init(XArray::<KBox<u32>>::new(AllocKind::Alloc), GFP_KERNEL)?;
+    /// # use kernel::{prelude::*, xarray::{new_xarray, AllocKind, XArray, Entry}};
+    /// let mut xa: Pin<KBox<XArray<KBox<u32>>>> =
+    ///     KBox::pin_init(new_xarray!(AllocKind::Alloc), GFP_KERNEL)?;
     /// let mut guard = xa.lock();
     ///
     /// guard.store(42, KBox::new(0x1337u32, GFP_KERNEL)?, GFP_KERNEL)?;
@@ -266,8 +269,9 @@ pub fn remove(mut self) -> T {
     /// # Examples
     ///
     /// ```
-    /// # use kernel::{prelude::*, xarray::{AllocKind, XArray, Entry}};
-    /// let mut xa = KBox::pin_init(XArray::<KBox<u32>>::new(AllocKind::Alloc), GFP_KERNEL)?;
+    /// # use kernel::{prelude::*, xarray::{new_xarray, AllocKind, XArray, Entry}};
+    /// let mut xa: Pin<KBox<XArray<KBox<u32>>>> =
+    ///     KBox::pin_init(new_xarray!(AllocKind::Alloc), GFP_KERNEL)?;
     /// let mut guard = xa.lock();
     ///
     /// guard.store(42, KBox::new(0x1337u32, GFP_KERNEL)?, GFP_KERNEL)?;
@@ -287,8 +291,9 @@ pub fn index(&self) -> usize {
     /// # Examples
     ///
     /// ```
-    /// # use kernel::{prelude::*, xarray::{AllocKind, XArray, Entry}};
-    /// let mut xa = KBox::pin_init(XArray::<KBox<u32>>::new(AllocKind::Alloc), GFP_KERNEL)?;
+    /// # use kernel::{prelude::*, xarray::{new_xarray, AllocKind, XArray, Entry}};
+    /// let mut xa: Pin<KBox<XArray<KBox<u32>>>> =
+    ///     KBox::pin_init(new_xarray!(AllocKind::Alloc), GFP_KERNEL)?;
     /// let mut guard = xa.lock();
     ///
     /// guard.store(42, KBox::new(0x1337u32, GFP_KERNEL)?, GFP_KERNEL)?;
@@ -334,8 +339,9 @@ pub fn insert(&mut self, value: T) -> T {
     /// # Examples
     ///
     /// ```
-    /// # use kernel::{prelude::*, xarray::{AllocKind, XArray, Entry}};
-    /// let mut xa = KBox::pin_init(XArray::<KBox<u32>>::new(AllocKind::Alloc), GFP_KERNEL)?;
+    /// # use kernel::{prelude::*, xarray::{new_xarray, AllocKind, XArray, Entry}};
+    /// let mut xa: Pin<KBox<XArray<KBox<u32>>>> =
+    ///     KBox::pin_init(new_xarray!(AllocKind::Alloc), GFP_KERNEL)?;
     /// let mut guard = xa.lock();
     ///
     /// guard.store(42, KBox::new(0x1337u32, GFP_KERNEL)?, GFP_KERNEL)?;
@@ -361,8 +367,9 @@ pub fn into_mut(self) -> T::BorrowedMut<'b> {
     /// # Examples
     ///
     /// ```
-    /// # use kernel::{prelude::*, xarray::{AllocKind, XArray, Entry}};
-    /// let mut xa = KBox::pin_init(XArray::<KBox<u32>>::new(AllocKind::Alloc), GFP_KERNEL)?;
+    /// # use kernel::{prelude::*, xarray::{new_xarray, AllocKind, XArray, Entry}};
+    /// let mut xa: Pin<KBox<XArray<KBox<u32>>>> =
+    ///     KBox::pin_init(new_xarray!(AllocKind::Alloc), GFP_KERNEL)?;
     /// let mut guard = xa.lock();
     ///
     /// guard.store(42, KBox::new(100u32, GFP_KERNEL)?, GFP_KERNEL)?;

-- 
2.51.2



^ permalink raw reply related	[flat|nested] 11+ messages in thread

end of thread, other threads:[~2025-12-03 22:28 UTC | newest]

Thread overview: 11+ messages (download: mbox.gz follow: Atom feed
-- links below jump to the message on this page --
2025-12-03 22:26 [PATCH 00/10] rust: xarray: add entry API with preloading Andreas Hindborg
2025-12-03 22:26 ` [PATCH 01/10] rust: xarray: minor formatting fixes Andreas Hindborg
2025-12-03 22:26 ` [PATCH 02/10] rust: xarray: add debug format for `StoreError` Andreas Hindborg
2025-12-03 22:26 ` [PATCH 03/10] rust: xarray: add `contains_index` method Andreas Hindborg
2025-12-03 22:26 ` [PATCH 04/10] rust: xarray: add `XArrayState` Andreas Hindborg
2025-12-03 22:26 ` [PATCH 05/10] rust: xarray: use `xas_load` instead of `xa_load` in `Guard::load` Andreas Hindborg
2025-12-03 22:26 ` [PATCH 06/10] rust: xarray: simplify `Guard::load` Andreas Hindborg
2025-12-03 22:26 ` [PATCH 07/10] rust: xarray: add `find_next` and `find_next_mut` Andreas Hindborg
2025-12-03 22:26 ` [PATCH 08/10] rust: xarray: add entry API Andreas Hindborg
2025-12-03 22:26 ` [PATCH 09/10] rust: xarray: add preload API Andreas Hindborg
2025-12-03 22:26 ` [PATCH 10/10] rust: xarray: fix false positive lockdep warnings Andreas Hindborg

This is a public inbox, see mirroring instructions
for how to clone and mirror all data and code used for this inbox;
as well as URLs for NNTP newsgroup(s).