linux-doc.vger.kernel.org archive mirror
* [PATCH RESEND v3 00/15] Split sphinx call logic from docs Makefile
@ 2025-09-01 15:33 Mauro Carvalho Chehab
  2025-09-01 15:33 ` [PATCH RESEND v3 01/15] scripts/jobserver-exec: move the code to a class Mauro Carvalho Chehab
                   ` (14 more replies)
  0 siblings, 15 replies; 16+ messages in thread
From: Mauro Carvalho Chehab @ 2025-09-01 15:33 UTC (permalink / raw)
  To: Jonathan Corbet, Linux Doc Mailing List
  Cc: Mauro Carvalho Chehab, Mauro Carvalho Chehab, linux-kernel

Hi Jon,

Sorry for resending this one. Two e-mails ended up being encoded
using base64 (*). The content of this series is identical. The only
difference is that all patches, including 05 and 15, are now in
plain text (using 8bit encoding).

This series does a major cleanup of the docs Makefile by moving the
actual doc build logic to a helper script (tools/docs/sphinx-build-wrapper).

The script was written in a way that it can be called either
directly or via a makefile. When run via a makefile, it uses the
GNU make jobserver to ensure that, when sphinx-build is called,
the number of jobs will not exceed what is specified by the
"-j" parameter.

The first 3 patches clean up scripts/jobserver-exec and move the
actual code to a library. That library is used by both the
jobserver-exec command line tool and by sphinx-build-wrapper.

The change also gets rid of parallel-wrapper.sh, whose
functions are now part of the wrapper code.

I added two patches at the end that add an extra target: "mandocs".
They came from a separate two-patch series I sent earlier.

---

(*) I just converted my internal mailbomb script to Python, but
EmailMessage is really tricky - I spent most of my time scratching
my head to get it right - it has a confusing set of "policy" rules
and two functions to generate messages. It turns out that just
using a good policy is not enough: one needs to use as_bytes() to
avoid base64 encoding.
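
For the record, a minimal sketch of what ended up working (not the
actual script; subject and body below are made up):

    from email.message import EmailMessage
    from email.policy import SMTP

    # The policy's cte_type="8bit" is only honored by the bytes
    # serializer, i.e. as_bytes(); as_string() falls back to a
    # 7bit-clean encoding (typically base64) for non-ASCII bodies.
    msg = EmailMessage(policy=SMTP)
    msg["Subject"] = "[PATCH] example"
    msg.set_content("Signed-off-by: Björn ...\n", cte="8bit")

    raw = msg.as_bytes()        # keeps the 8bit transfer encoding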

---

v3:
- rebased on top of docs-next;
- added two patches to build man files that were in a separate
  patch series.

v2:

- there's no generic exception handler anymore;
- it moves sphinx-pre-install to tools/docs;
- the logic which ensures a minimal Python version was moved
  to a library, which is now used by both pre-install and the wrapper;
- The first wrapper patch (05/13) doesn't contain comments (except for
  the shebang and SPDX). The goal is to help show the size increase
  when moving from Makefile to Python. Some size increase is
  unavoidable, as Makefile is more compact: no includes, multiple
  statements per line, no argparse, etc;
- The second patch adds docstrings and comments. They amount to
  almost the same size as the code itself;
- I moved the venv logic to a third wrapper patch;
- I fixed an issue in the parallel build logic;
- There are no generic except blocks anymore.

Mauro Carvalho Chehab (15):
  scripts/jobserver-exec: move the code to a class
  scripts/jobserver-exec: move its class to the lib directory
  scripts/jobserver-exec: add a help message
  scripts: sphinx-pre-install: move it to tools/docs
  tools/docs: sphinx-pre-install: move Python version handling to lib
  tools/docs: sphinx-build-wrapper: add a wrapper for sphinx-build
  tools/docs: sphinx-build-wrapper: add comments and blank lines
  tools/docs: sphinx-build-wrapper: add support to run inside venv
  docs: parallel-wrapper.sh: remove script
  docs: Makefile: document latex/PDF PAPER= parameter
  tools/docs: sphinx-build-wrapper: add an argument for LaTeX
    interactive mode
  tools/docs,scripts: sphinx-*: prevent sphinx-build crashes
  tools/docs: sphinx-build-wrapper: allow building PDF files in parallel
  docs: add support to build manpages from kerneldoc output
  tools: kernel-doc: add a see also section at man pages

 Documentation/Makefile                     | 134 +---
 Documentation/doc-guide/kernel-doc.rst     |  29 +-
 Documentation/sphinx/parallel-wrapper.sh   |  33 -
 Makefile                                   |   2 +-
 scripts/jobserver-exec                     |  88 +--
 scripts/lib/jobserver.py                   | 149 +++++
 scripts/lib/kdoc/kdoc_files.py             |   5 +-
 scripts/lib/kdoc/kdoc_output.py            |  84 ++-
 scripts/split-man.pl                       |  28 -
 tools/docs/lib/python_version.py           | 133 ++++
 tools/docs/sphinx-build-wrapper            | 731 +++++++++++++++++++++
 {scripts => tools/docs}/sphinx-pre-install | 134 +---
 12 files changed, 1194 insertions(+), 356 deletions(-)
 delete mode 100644 Documentation/sphinx/parallel-wrapper.sh
 create mode 100755 scripts/lib/jobserver.py
 delete mode 100755 scripts/split-man.pl
 create mode 100644 tools/docs/lib/python_version.py
 create mode 100755 tools/docs/sphinx-build-wrapper
 rename {scripts => tools/docs}/sphinx-pre-install (93%)

-- 
2.51.0


* [PATCH RESEND v3 01/15] scripts/jobserver-exec: move the code to a class
  2025-09-01 15:33 [PATCH RESEND v3 00/15] Split sphinx call logic from docs Makefile Mauro Carvalho Chehab
@ 2025-09-01 15:33 ` Mauro Carvalho Chehab
  2025-09-01 15:33 ` [PATCH RESEND v3 02/15] scripts/jobserver-exec: move its class to the lib directory Mauro Carvalho Chehab
                   ` (13 subsequent siblings)
  14 siblings, 0 replies; 16+ messages in thread
From: Mauro Carvalho Chehab @ 2025-09-01 15:33 UTC (permalink / raw)
  To: Jonathan Corbet, Linux Doc Mailing List
  Cc: Mauro Carvalho Chehab, Mauro Carvalho Chehab, linux-kernel

Convert the code inside jobserver-exec to a class and
properly document it.

Using a class allows reusing the jobserver logic in other
scripts.

While the main code remains unchanged, staying compatible with
Python 2.6 and 3.0+, its coding style now follows a more
modern standard, with tabs replaced by a 4-space
indent, passing autopep8, black and pylint.

The code now allows using a Pythonic way to
enter/exit the jobserver context, e.g. it now supports:

	with JobserverExec() as jobserver:
	    jobserver.run(sys.argv[1:])

With the new code, the __exit__() method should ensure
that the claimed jobserver slots are released at the end, even if
something bad happens somewhere.
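
For reference, assuming the JobserverExec class from this patch is in
scope, the context manager form above is roughly equivalent to:

	jobserver = JobserverExec()
	jobserver.open()            # claim all free jobserver slots
	try:
	    jobserver.run(sys.argv[1:])
	finally:
	    jobserver.close()       # return the slots, even if run() fails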

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 scripts/jobserver-exec | 218 ++++++++++++++++++++++++++++-------------
 1 file changed, 151 insertions(+), 67 deletions(-)

diff --git a/scripts/jobserver-exec b/scripts/jobserver-exec
index 7eca035472d3..b386b1a845de 100755
--- a/scripts/jobserver-exec
+++ b/scripts/jobserver-exec
@@ -1,77 +1,161 @@
 #!/usr/bin/env python3
 # SPDX-License-Identifier: GPL-2.0+
 #
+# pylint: disable=C0103,C0209
+#
 # This determines how many parallel tasks "make" is expecting, as it is
 # not exposed via an special variables, reserves them all, runs a subprocess
 # with PARALLELISM environment variable set, and releases the jobs back again.
 #
 # https://www.gnu.org/software/make/manual/html_node/POSIX-Jobserver.html#POSIX-Jobserver
-from __future__ import print_function
-import os, sys, errno
+
+"""
+Interacts with the POSIX jobserver during the Kernel build time.
+
+A "normal" jobserver task, like the one initiated by a make subrocess would do:
+
+    - open read/write file descriptors to communicate with the job server;
+    - ask for one slot by calling:
+        claim = os.read(reader, 1)
+    - when the job finshes, call:
+        os.write(writer, b"+")  # os.write(writer, claim)
+
+Here, the goal is different: This script aims to get the remaining number
+of slots available, using all of them to run a command which handle tasks in
+parallel. To to that, it has a loop that ends only after there are no
+slots left. It then increments the number by one, in order to allow a
+call equivalent to make -j$((claim+1)), e.g. having a parent make creating
+$claim child to do the actual work.
+
+The end goal here is to keep the total number of build tasks under the
+limit established by the initial make -j$n_proc call.
+"""
+
+import errno
+import os
 import subprocess
+import sys
 
-# Extract and prepare jobserver file descriptors from environment.
-claim = 0
-jobs = b""
-try:
-	# Fetch the make environment options.
-	flags = os.environ['MAKEFLAGS']
-
-	# Look for "--jobserver=R,W"
-	# Note that GNU Make has used --jobserver-fds and --jobserver-auth
-	# so this handles all of them.
-	opts = [x for x in flags.split(" ") if x.startswith("--jobserver")]
-
-	# Parse out R,W file descriptor numbers and set them nonblocking.
-	# If the MAKEFLAGS variable contains multiple instances of the
-	# --jobserver-auth= option, the last one is relevant.
-	fds = opts[-1].split("=", 1)[1]
-
-	# Starting with GNU Make 4.4, named pipes are used for reader and writer.
-	# Example argument: --jobserver-auth=fifo:/tmp/GMfifo8134
-	_, _, path = fds.partition('fifo:')
-
-	if path:
-		reader = os.open(path, os.O_RDONLY | os.O_NONBLOCK)
-		writer = os.open(path, os.O_WRONLY)
-	else:
-		reader, writer = [int(x) for x in fds.split(",", 1)]
-		# Open a private copy of reader to avoid setting nonblocking
-		# on an unexpecting process with the same reader fd.
-		reader = os.open("/proc/self/fd/%d" % (reader),
-				 os.O_RDONLY | os.O_NONBLOCK)
-
-	# Read out as many jobserver slots as possible.
-	while True:
-		try:
-			slot = os.read(reader, 8)
-			jobs += slot
-		except (OSError, IOError) as e:
-			if e.errno == errno.EWOULDBLOCK:
-				# Stop at the end of the jobserver queue.
-				break
-			# If something went wrong, give back the jobs.
-			if len(jobs):
-				os.write(writer, jobs)
-			raise e
-	# Add a bump for our caller's reserveration, since we're just going
-	# to sit here blocked on our child.
-	claim = len(jobs) + 1
-except (KeyError, IndexError, ValueError, OSError, IOError) as e:
-	# Any missing environment strings or bad fds should result in just
-	# not being parallel.
-	pass
-
-# We can only claim parallelism if there was a jobserver (i.e. a top-level
-# "-jN" argument) and there were no other failures. Otherwise leave out the
-# environment variable and let the child figure out what is best.
-if claim > 0:
-	os.environ['PARALLELISM'] = '%d' % (claim)
-
-rc = subprocess.call(sys.argv[1:])
-
-# Return all the reserved slots.
-if len(jobs):
-	os.write(writer, jobs)
-
-sys.exit(rc)
+
+class JobserverExec:
+    """
+    Claim all slots from make using POSIX Jobserver.
+
+    The main methods here are:
+    - open(): reserves all slots;
+    - close(): method returns all used slots back to make;
+    - run(): executes a command setting PARALLELISM=<available slots jobs + 1>
+    """
+
+    def __init__(self):
+        """Initialize internal vars"""
+        self.claim = 0
+        self.jobs = b""
+        self.reader = None
+        self.writer = None
+        self.is_open = False
+
+    def open(self):
+        """Reserve all available slots to be claimed later on"""
+
+        if self.is_open:
+            return
+
+        try:
+            # Fetch the make environment options.
+            flags = os.environ["MAKEFLAGS"]
+            # Look for "--jobserver=R,W"
+            # Note that GNU Make has used --jobserver-fds and --jobserver-auth
+            # so this handles all of them.
+            opts = [x for x in flags.split(" ") if x.startswith("--jobserver")]
+
+            # Parse out R,W file descriptor numbers and set them nonblocking.
+            # If the MAKEFLAGS variable contains multiple instances of the
+            # --jobserver-auth= option, the last one is relevant.
+            fds = opts[-1].split("=", 1)[1]
+
+            # Starting with GNU Make 4.4, named pipes are used for reader
+            # and writer.
+            # Example argument: --jobserver-auth=fifo:/tmp/GMfifo8134
+            _, _, path = fds.partition("fifo:")
+
+            if path:
+                self.reader = os.open(path, os.O_RDONLY | os.O_NONBLOCK)
+                self.writer = os.open(path, os.O_WRONLY)
+            else:
+                self.reader, self.writer = [int(x) for x in fds.split(",", 1)]
+                # Open a private copy of reader to avoid setting nonblocking
+                # on an unexpecting process with the same reader fd.
+                self.reader = os.open("/proc/self/fd/%d" % (self.reader),
+                                      os.O_RDONLY | os.O_NONBLOCK)
+
+            # Read out as many jobserver slots as possible
+            while True:
+                try:
+                    slot = os.read(self.reader, 8)
+                    self.jobs += slot
+                except (OSError, IOError) as e:
+                    if e.errno == errno.EWOULDBLOCK:
+                        # Stop at the end of the jobserver queue.
+                        break
+                    # If something went wrong, give back the jobs.
+                    if self.jobs:
+                        os.write(self.writer, self.jobs)
+                    raise e
+
+            # Add a bump for our caller's reserveration, since we're just going
+            # to sit here blocked on our child.
+            self.claim = len(self.jobs) + 1
+
+        except (KeyError, IndexError, ValueError, OSError, IOError):
+            # Any missing environment strings or bad fds should result in just
+            # not being parallel.
+            self.claim = None
+
+        self.is_open = True
+
+    def close(self):
+        """Return all reserved slots to Jobserver"""
+
+        if not self.is_open:
+            return
+
+        # Return all the reserved slots.
+        if len(self.jobs):
+            os.write(self.writer, self.jobs)
+
+        self.is_open = False
+
+    def __enter__(self):
+        self.open()
+        return self
+
+    def __exit__(self, exc_type, exc_value, exc_traceback):
+        self.close()
+
+    def run(self, cmd):
+        """
+        Run a command setting PARALLELISM env variable to the number of
+        available job slots (claim) + 1, e.g. it will reserve claim slots
+        to do the actual build work, plus one to monitor its childs.
+        """
+        self.open()             # Ensure that self.claim is set
+
+        # We can only claim parallelism if there was a jobserver (i.e. a
+        # top-level "-jN" argument) and there were no other failures. Otherwise
+        # leave out the environment variable and let the child figure out what
+        # is best.
+        if self.claim:
+            os.environ["PARALLELISM"] = str(self.claim)
+
+        return subprocess.call(cmd)
+
+
+def main():
+    """Main program"""
+    with JobserverExec() as jobserver:
+        jobserver.run(sys.argv[1:])
+
+
+if __name__ == "__main__":
+    main()
-- 
2.51.0


* [PATCH RESEND v3 02/15] scripts/jobserver-exec: move its class to the lib directory
  2025-09-01 15:33 [PATCH RESEND v3 00/15] Split sphinx call logic from docs Makefile Mauro Carvalho Chehab
  2025-09-01 15:33 ` [PATCH RESEND v3 01/15] scripts/jobserver-exec: move the code to a class Mauro Carvalho Chehab
@ 2025-09-01 15:33 ` Mauro Carvalho Chehab
  2025-09-01 15:33 ` [PATCH RESEND v3 03/15] scripts/jobserver-exec: add a help message Mauro Carvalho Chehab
                   ` (12 subsequent siblings)
  14 siblings, 0 replies; 16+ messages in thread
From: Mauro Carvalho Chehab @ 2025-09-01 15:33 UTC (permalink / raw)
  To: Jonathan Corbet, Linux Doc Mailing List
  Cc: Mauro Carvalho Chehab, Mauro Carvalho Chehab, linux-kernel

To make it easier to reuse, move the JobserverExec class
to the library directory.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 scripts/jobserver-exec   | 152 +++------------------------------------
 scripts/lib/jobserver.py | 149 ++++++++++++++++++++++++++++++++++++++
 2 files changed, 160 insertions(+), 141 deletions(-)
 create mode 100755 scripts/lib/jobserver.py

diff --git a/scripts/jobserver-exec b/scripts/jobserver-exec
index b386b1a845de..40a0f0058733 100755
--- a/scripts/jobserver-exec
+++ b/scripts/jobserver-exec
@@ -1,155 +1,25 @@
 #!/usr/bin/env python3
 # SPDX-License-Identifier: GPL-2.0+
-#
-# pylint: disable=C0103,C0209
-#
-# This determines how many parallel tasks "make" is expecting, as it is
-# not exposed via an special variables, reserves them all, runs a subprocess
-# with PARALLELISM environment variable set, and releases the jobs back again.
-#
-# https://www.gnu.org/software/make/manual/html_node/POSIX-Jobserver.html#POSIX-Jobserver
 
-"""
-Interacts with the POSIX jobserver during the Kernel build time.
-
-A "normal" jobserver task, like the one initiated by a make subrocess would do:
-
-    - open read/write file descriptors to communicate with the job server;
-    - ask for one slot by calling:
-        claim = os.read(reader, 1)
-    - when the job finshes, call:
-        os.write(writer, b"+")  # os.write(writer, claim)
-
-Here, the goal is different: This script aims to get the remaining number
-of slots available, using all of them to run a command which handle tasks in
-parallel. To to that, it has a loop that ends only after there are no
-slots left. It then increments the number by one, in order to allow a
-call equivalent to make -j$((claim+1)), e.g. having a parent make creating
-$claim child to do the actual work.
-
-The end goal here is to keep the total number of build tasks under the
-limit established by the initial make -j$n_proc call.
-"""
-
-import errno
 import os
-import subprocess
 import sys
 
+LIB_DIR = "lib"
+SRC_DIR = os.path.dirname(os.path.realpath(__file__))
 
-class JobserverExec:
-    """
-    Claim all slots from make using POSIX Jobserver.
+sys.path.insert(0, os.path.join(SRC_DIR, LIB_DIR))
 
-    The main methods here are:
-    - open(): reserves all slots;
-    - close(): method returns all used slots back to make;
-    - run(): executes a command setting PARALLELISM=<available slots jobs + 1>
-    """
+from jobserver import JobserverExec                  # pylint: disable=C0415
 
-    def __init__(self):
-        """Initialize internal vars"""
-        self.claim = 0
-        self.jobs = b""
-        self.reader = None
-        self.writer = None
-        self.is_open = False
 
-    def open(self):
-        """Reserve all available slots to be claimed later on"""
-
-        if self.is_open:
-            return
-
-        try:
-            # Fetch the make environment options.
-            flags = os.environ["MAKEFLAGS"]
-            # Look for "--jobserver=R,W"
-            # Note that GNU Make has used --jobserver-fds and --jobserver-auth
-            # so this handles all of them.
-            opts = [x for x in flags.split(" ") if x.startswith("--jobserver")]
-
-            # Parse out R,W file descriptor numbers and set them nonblocking.
-            # If the MAKEFLAGS variable contains multiple instances of the
-            # --jobserver-auth= option, the last one is relevant.
-            fds = opts[-1].split("=", 1)[1]
-
-            # Starting with GNU Make 4.4, named pipes are used for reader
-            # and writer.
-            # Example argument: --jobserver-auth=fifo:/tmp/GMfifo8134
-            _, _, path = fds.partition("fifo:")
-
-            if path:
-                self.reader = os.open(path, os.O_RDONLY | os.O_NONBLOCK)
-                self.writer = os.open(path, os.O_WRONLY)
-            else:
-                self.reader, self.writer = [int(x) for x in fds.split(",", 1)]
-                # Open a private copy of reader to avoid setting nonblocking
-                # on an unexpecting process with the same reader fd.
-                self.reader = os.open("/proc/self/fd/%d" % (self.reader),
-                                      os.O_RDONLY | os.O_NONBLOCK)
-
-            # Read out as many jobserver slots as possible
-            while True:
-                try:
-                    slot = os.read(self.reader, 8)
-                    self.jobs += slot
-                except (OSError, IOError) as e:
-                    if e.errno == errno.EWOULDBLOCK:
-                        # Stop at the end of the jobserver queue.
-                        break
-                    # If something went wrong, give back the jobs.
-                    if self.jobs:
-                        os.write(self.writer, self.jobs)
-                    raise e
-
-            # Add a bump for our caller's reserveration, since we're just going
-            # to sit here blocked on our child.
-            self.claim = len(self.jobs) + 1
-
-        except (KeyError, IndexError, ValueError, OSError, IOError):
-            # Any missing environment strings or bad fds should result in just
-            # not being parallel.
-            self.claim = None
-
-        self.is_open = True
-
-    def close(self):
-        """Return all reserved slots to Jobserver"""
-
-        if not self.is_open:
-            return
-
-        # Return all the reserved slots.
-        if len(self.jobs):
-            os.write(self.writer, self.jobs)
-
-        self.is_open = False
-
-    def __enter__(self):
-        self.open()
-        return self
-
-    def __exit__(self, exc_type, exc_value, exc_traceback):
-        self.close()
-
-    def run(self, cmd):
-        """
-        Run a command setting PARALLELISM env variable to the number of
-        available job slots (claim) + 1, e.g. it will reserve claim slots
-        to do the actual build work, plus one to monitor its childs.
-        """
-        self.open()             # Ensure that self.claim is set
-
-        # We can only claim parallelism if there was a jobserver (i.e. a
-        # top-level "-jN" argument) and there were no other failures. Otherwise
-        # leave out the environment variable and let the child figure out what
-        # is best.
-        if self.claim:
-            os.environ["PARALLELISM"] = str(self.claim)
-
-        return subprocess.call(cmd)
+"""
+Determines how many parallel tasks "make" is expecting, as it is
+not exposed via an special variables, reserves them all, runs a subprocess
+with PARALLELISM environment variable set, and releases the jobs back again.
 
+See:
+    https://www.gnu.org/software/make/manual/html_node/POSIX-Jobserver.html#POSIX-Jobserver
+"""
 
 def main():
     """Main program"""
diff --git a/scripts/lib/jobserver.py b/scripts/lib/jobserver.py
new file mode 100755
index 000000000000..98d8b0ff0c89
--- /dev/null
+++ b/scripts/lib/jobserver.py
@@ -0,0 +1,149 @@
+#!/usr/bin/env python3
+# SPDX-License-Identifier: GPL-2.0+
+#
+# pylint: disable=C0103,C0209
+#
+#
+
+"""
+Interacts with the POSIX jobserver during the Kernel build time.
+
+A "normal" jobserver task, like the one initiated by a make subrocess would do:
+
+    - open read/write file descriptors to communicate with the job server;
+    - ask for one slot by calling:
+        claim = os.read(reader, 1)
+    - when the job finshes, call:
+        os.write(writer, b"+")  # os.write(writer, claim)
+
+Here, the goal is different: This script aims to get the remaining number
+of slots available, using all of them to run a command which handle tasks in
+parallel. To to that, it has a loop that ends only after there are no
+slots left. It then increments the number by one, in order to allow a
+call equivalent to make -j$((claim+1)), e.g. having a parent make creating
+$claim child to do the actual work.
+
+The end goal here is to keep the total number of build tasks under the
+limit established by the initial make -j$n_proc call.
+
+See:
+    https://www.gnu.org/software/make/manual/html_node/POSIX-Jobserver.html#POSIX-Jobserver
+"""
+
+import errno
+import os
+import subprocess
+import sys
+
+class JobserverExec:
+    """
+    Claim all slots from make using POSIX Jobserver.
+
+    The main methods here are:
+    - open(): reserves all slots;
+    - close(): method returns all used slots back to make;
+    - run(): executes a command setting PARALLELISM=<available slots jobs + 1>
+    """
+
+    def __init__(self):
+        """Initialize internal vars"""
+        self.claim = 0
+        self.jobs = b""
+        self.reader = None
+        self.writer = None
+        self.is_open = False
+
+    def open(self):
+        """Reserve all available slots to be claimed later on"""
+
+        if self.is_open:
+            return
+
+        try:
+            # Fetch the make environment options.
+            flags = os.environ["MAKEFLAGS"]
+            # Look for "--jobserver=R,W"
+            # Note that GNU Make has used --jobserver-fds and --jobserver-auth
+            # so this handles all of them.
+            opts = [x for x in flags.split(" ") if x.startswith("--jobserver")]
+
+            # Parse out R,W file descriptor numbers and set them nonblocking.
+            # If the MAKEFLAGS variable contains multiple instances of the
+            # --jobserver-auth= option, the last one is relevant.
+            fds = opts[-1].split("=", 1)[1]
+
+            # Starting with GNU Make 4.4, named pipes are used for reader
+            # and writer.
+            # Example argument: --jobserver-auth=fifo:/tmp/GMfifo8134
+            _, _, path = fds.partition("fifo:")
+
+            if path:
+                self.reader = os.open(path, os.O_RDONLY | os.O_NONBLOCK)
+                self.writer = os.open(path, os.O_WRONLY)
+            else:
+                self.reader, self.writer = [int(x) for x in fds.split(",", 1)]
+                # Open a private copy of reader to avoid setting nonblocking
+                # on an unexpecting process with the same reader fd.
+                self.reader = os.open("/proc/self/fd/%d" % (self.reader),
+                                      os.O_RDONLY | os.O_NONBLOCK)
+
+            # Read out as many jobserver slots as possible
+            while True:
+                try:
+                    slot = os.read(self.reader, 8)
+                    self.jobs += slot
+                except (OSError, IOError) as e:
+                    if e.errno == errno.EWOULDBLOCK:
+                        # Stop at the end of the jobserver queue.
+                        break
+                    # If something went wrong, give back the jobs.
+                    if self.jobs:
+                        os.write(self.writer, self.jobs)
+                    raise e
+
+            # Add a bump for our caller's reserveration, since we're just going
+            # to sit here blocked on our child.
+            self.claim = len(self.jobs) + 1
+
+        except (KeyError, IndexError, ValueError, OSError, IOError):
+            # Any missing environment strings or bad fds should result in just
+            # not being parallel.
+            self.claim = None
+
+        self.is_open = True
+
+    def close(self):
+        """Return all reserved slots to Jobserver"""
+
+        if not self.is_open:
+            return
+
+        # Return all the reserved slots.
+        if len(self.jobs):
+            os.write(self.writer, self.jobs)
+
+        self.is_open = False
+
+    def __enter__(self):
+        self.open()
+        return self
+
+    def __exit__(self, exc_type, exc_value, exc_traceback):
+        self.close()
+
+    def run(self, cmd, *args, **pwargs):
+        """
+        Run a command setting PARALLELISM env variable to the number of
+        available job slots (claim) + 1, e.g. it will reserve claim slots
+        to do the actual build work, plus one to monitor its childs.
+        """
+        self.open()             # Ensure that self.claim is set
+
+        # We can only claim parallelism if there was a jobserver (i.e. a
+        # top-level "-jN" argument) and there were no other failures. Otherwise
+        # leave out the environment variable and let the child figure out what
+        # is best.
+        if self.claim:
+            os.environ["PARALLELISM"] = str(self.claim)
+
+        return subprocess.call(cmd, *args, **pwargs)
-- 
2.51.0


* [PATCH RESEND v3 03/15] scripts/jobserver-exec: add a help message
  2025-09-01 15:33 [PATCH RESEND v3 00/15] Split sphinx call logic from docs Makefile Mauro Carvalho Chehab
  2025-09-01 15:33 ` [PATCH RESEND v3 01/15] scripts/jobserver-exec: move the code to a class Mauro Carvalho Chehab
  2025-09-01 15:33 ` [PATCH RESEND v3 02/15] scripts/jobserver-exec: move its class to the lib directory Mauro Carvalho Chehab
@ 2025-09-01 15:33 ` Mauro Carvalho Chehab
  2025-09-01 15:33 ` [PATCH RESEND v3 04/15] scripts: sphinx-pre-install: move it to tools/docs Mauro Carvalho Chehab
                   ` (11 subsequent siblings)
  14 siblings, 0 replies; 16+ messages in thread
From: Mauro Carvalho Chehab @ 2025-09-01 15:33 UTC (permalink / raw)
  To: Jonathan Corbet, Linux Doc Mailing List
  Cc: Mauro Carvalho Chehab, Mauro Carvalho Chehab, linux-kernel

Currently, calling it without an argument shows an ugly error
message. Instead, print a usage message, using the module's
docstring as the description.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 scripts/jobserver-exec | 22 +++++++++++++---------
 1 file changed, 13 insertions(+), 9 deletions(-)

diff --git a/scripts/jobserver-exec b/scripts/jobserver-exec
index 40a0f0058733..ae23afd344ec 100755
--- a/scripts/jobserver-exec
+++ b/scripts/jobserver-exec
@@ -1,6 +1,15 @@
 #!/usr/bin/env python3
 # SPDX-License-Identifier: GPL-2.0+
 
+"""
+Determines how many parallel tasks "make" is expecting, as it is
+not exposed via any special variables, reserves them all, runs a subprocess
+with PARALLELISM environment variable set, and releases the jobs back again.
+
+See:
+    https://www.gnu.org/software/make/manual/html_node/POSIX-Jobserver.html#POSIX-Jobserver
+"""
+
 import os
 import sys
 
@@ -12,17 +21,12 @@ sys.path.insert(0, os.path.join(SRC_DIR, LIB_DIR))
 from jobserver import JobserverExec                  # pylint: disable=C0415
 
 
-"""
-Determines how many parallel tasks "make" is expecting, as it is
-not exposed via an special variables, reserves them all, runs a subprocess
-with PARALLELISM environment variable set, and releases the jobs back again.
-
-See:
-    https://www.gnu.org/software/make/manual/html_node/POSIX-Jobserver.html#POSIX-Jobserver
-"""
-
 def main():
     """Main program"""
+    if len(sys.argv) < 2:
+        name = os.path.basename(__file__)
+        sys.exit("usage: " + name +" command [args ...]\n" + __doc__)
+
     with JobserverExec() as jobserver:
         jobserver.run(sys.argv[1:])
 
-- 
2.51.0


* [PATCH RESEND v3 04/15] scripts: sphinx-pre-install: move it to tools/docs
  2025-09-01 15:33 [PATCH RESEND v3 00/15] Split sphinx call logic from docs Makefile Mauro Carvalho Chehab
                   ` (2 preceding siblings ...)
  2025-09-01 15:33 ` [PATCH RESEND v3 03/15] scripts/jobserver-exec: add a help message Mauro Carvalho Chehab
@ 2025-09-01 15:33 ` Mauro Carvalho Chehab
  2025-09-01 15:33 ` [PATCH RESEND v3 05/15] tools/docs: sphinx-pre-install: move Python version handling to lib Mauro Carvalho Chehab
                   ` (10 subsequent siblings)
  14 siblings, 0 replies; 16+ messages in thread
From: Mauro Carvalho Chehab @ 2025-09-01 15:33 UTC (permalink / raw)
  To: Jonathan Corbet, Linux Doc Mailing List
  Cc: Mauro Carvalho Chehab, Mauro Carvalho Chehab, linux-kernel

As we're reorganizing the place where doc scripts are located,
move this one to tools/docs.

No functional changes.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 Documentation/Makefile                     | 14 +++++++-------
 {scripts => tools/docs}/sphinx-pre-install |  0
 2 files changed, 7 insertions(+), 7 deletions(-)
 rename {scripts => tools/docs}/sphinx-pre-install (100%)

diff --git a/Documentation/Makefile b/Documentation/Makefile
index 5c20c68be89a..deb2029228ed 100644
--- a/Documentation/Makefile
+++ b/Documentation/Makefile
@@ -46,7 +46,7 @@ ifeq ($(HAVE_SPHINX),0)
 .DEFAULT:
 	$(warning The '$(SPHINXBUILD)' command was not found. Make sure you have Sphinx installed and in PATH, or set the SPHINXBUILD make variable to point to the full path of the '$(SPHINXBUILD)' executable.)
 	@echo
-	@$(srctree)/scripts/sphinx-pre-install
+	@$(srctree)/tools/docs/sphinx-pre-install
 	@echo "  SKIP    Sphinx $@ target."
 
 else # HAVE_SPHINX
@@ -105,7 +105,7 @@ quiet_cmd_sphinx = SPHINX  $@ --> file://$(abspath $(BUILDDIR)/$3/$4)
 	fi
 
 htmldocs:
-	@$(srctree)/scripts/sphinx-pre-install --version-check
+	@$(srctree)/tools/docs/sphinx-pre-install --version-check
 	@+$(foreach var,$(SPHINXDIRS),$(call loop_cmd,sphinx,html,$(var),,$(var)))
 
 # If Rust support is available and .config exists, add rustdoc generated contents.
@@ -119,7 +119,7 @@ endif
 endif
 
 texinfodocs:
-	@$(srctree)/scripts/sphinx-pre-install --version-check
+	@$(srctree)/tools/docs/sphinx-pre-install --version-check
 	@+$(foreach var,$(SPHINXDIRS),$(call loop_cmd,sphinx,texinfo,$(var),texinfo,$(var)))
 
 # Note: the 'info' Make target is generated by sphinx itself when
@@ -131,7 +131,7 @@ linkcheckdocs:
 	@$(foreach var,$(SPHINXDIRS),$(call loop_cmd,sphinx,linkcheck,$(var),,$(var)))
 
 latexdocs:
-	@$(srctree)/scripts/sphinx-pre-install --version-check
+	@$(srctree)/tools/docs/sphinx-pre-install --version-check
 	@+$(foreach var,$(SPHINXDIRS),$(call loop_cmd,sphinx,latex,$(var),latex,$(var)))
 
 ifeq ($(HAVE_PDFLATEX),0)
@@ -144,7 +144,7 @@ else # HAVE_PDFLATEX
 
 pdfdocs: DENY_VF = XDG_CONFIG_HOME=$(FONTS_CONF_DENY_VF)
 pdfdocs: latexdocs
-	@$(srctree)/scripts/sphinx-pre-install --version-check
+	@$(srctree)/tools/docs/sphinx-pre-install --version-check
 	$(foreach var,$(SPHINXDIRS), \
 	   $(MAKE) PDFLATEX="$(PDFLATEX)" LATEXOPTS="$(LATEXOPTS)" $(DENY_VF) -C $(BUILDDIR)/$(var)/latex || sh $(srctree)/scripts/check-variable-fonts.sh || exit; \
 	   mkdir -p $(BUILDDIR)/$(var)/pdf; \
@@ -154,11 +154,11 @@ pdfdocs: latexdocs
 endif # HAVE_PDFLATEX
 
 epubdocs:
-	@$(srctree)/scripts/sphinx-pre-install --version-check
+	@$(srctree)/tools/docs/sphinx-pre-install --version-check
 	@+$(foreach var,$(SPHINXDIRS),$(call loop_cmd,sphinx,epub,$(var),epub,$(var)))
 
 xmldocs:
-	@$(srctree)/scripts/sphinx-pre-install --version-check
+	@$(srctree)/tools/docs/sphinx-pre-install --version-check
 	@+$(foreach var,$(SPHINXDIRS),$(call loop_cmd,sphinx,xml,$(var),xml,$(var)))
 
 endif # HAVE_SPHINX
diff --git a/scripts/sphinx-pre-install b/tools/docs/sphinx-pre-install
similarity index 100%
rename from scripts/sphinx-pre-install
rename to tools/docs/sphinx-pre-install
-- 
2.51.0


* [PATCH RESEND v3 05/15] tools/docs: sphinx-pre-install: move Python version handling to lib
  2025-09-01 15:33 [PATCH RESEND v3 00/15] Split sphinx call logic from docs Makefile Mauro Carvalho Chehab
                   ` (3 preceding siblings ...)
  2025-09-01 15:33 ` [PATCH RESEND v3 04/15] scripts: sphinx-pre-install: move it to tools/docs Mauro Carvalho Chehab
@ 2025-09-01 15:33 ` Mauro Carvalho Chehab
  2025-09-01 15:33 ` [PATCH RESEND v3 06/15] tools/docs: sphinx-build-wrapper: add a wrapper for sphinx-build Mauro Carvalho Chehab
                   ` (9 subsequent siblings)
  14 siblings, 0 replies; 16+ messages in thread
From: Mauro Carvalho Chehab @ 2025-09-01 15:33 UTC (permalink / raw)
  To: Jonathan Corbet, Linux Doc Mailing List
  Cc: Mauro Carvalho Chehab, Mauro Carvalho Chehab, linux-kernel

The sphinx-pre-install code has some logic to deal with the Python
version, ensuring that a minimal version is enforced for the
documentation build logic.

Move it to a separate library to allow reusing its code.

No functional changes.
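
With that, another script can enforce the same minimal version with
something along these lines (illustrative usage of the new library):

    from lib.python_version import PythonVersion

    MIN_PYTHON_VERSION = PythonVersion("3.7").version

    # Prints the current version if it is new enough; otherwise it looks
    # for a newer python3.x in PATH and re-executes the script with it.
    PythonVersion.check_python(MIN_PYTHON_VERSION)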

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 tools/docs/lib/python_version.py | 133 +++++++++++++++++++++++++++++++
 tools/docs/sphinx-pre-install    | 120 +++-------------------------
 2 files changed, 146 insertions(+), 107 deletions(-)
 create mode 100644 tools/docs/lib/python_version.py

diff --git a/tools/docs/lib/python_version.py b/tools/docs/lib/python_version.py
new file mode 100644
index 000000000000..0519d524e547
--- /dev/null
+++ b/tools/docs/lib/python_version.py
@@ -0,0 +1,133 @@
+#!/usr/bin/env python3
+# SPDX-License-Identifier: GPL-2.0-or-later
+# Copyright (c) 2017-2025 Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
+
+"""
+Handle Python version check logic.
+
+Not all Python versions are supported by scripts. Yet, on some cases,
+like during documentation build, a newer version of python could be
+available.
+
+This class allows checking if the minimal requirements are followed.
+
+Better than that, PythonVersion.check_python() not only checks the minimal
+requirements, but it automatically switches to a the newest available
+Python version if present.
+
+"""
+
+import os
+import re
+import subprocess
+import sys
+
+from glob import glob
+
+class PythonVersion:
+    """
+    Ancillary methods that checks for missing dependencies for different
+    types of types, like binaries, python modules, rpm deps, etc.
+    """
+
+    def __init__(self, version):
+        """Initialize self.version tuple from a version string"""
+        self.version = self.parse_version(version)
+
+    @staticmethod
+    def parse_version(version):
+        """Convert a major.minor.patch version into a tuple"""
+        return tuple(int(x) for x in version.split("."))
+
+    @staticmethod
+    def ver_str(version):
+        """Returns a version tuple as major.minor.patch"""
+        return ".".join([str(x) for x in version])
+
+    def __str__(self):
+        """Returns a version tuple as major.minor.patch from self.version"""
+        return self.ver_str(self.version)
+
+    @staticmethod
+    def get_python_version(cmd):
+        """
+        Get python version from a Python binary. As we need to detect if
+        are out there newer python binaries, we can't rely on sys.release here.
+        """
+
+        kwargs = {}
+        if sys.version_info < (3, 7):
+            kwargs['universal_newlines'] = True
+        else:
+            kwargs['text'] = True
+
+        result = subprocess.run([cmd, "--version"],
+                                stdout = subprocess.PIPE,
+                                stderr = subprocess.PIPE,
+                                **kwargs, check=False)
+
+        version = result.stdout.strip()
+
+        match = re.search(r"(\d+\.\d+\.\d+)", version)
+        if match:
+            return PythonVersion.parse_version(match.group(1))
+
+        print(f"Can't parse version {version}")
+        return (0, 0, 0)
+
+    @staticmethod
+    def find_python(min_version):
+        """
+        Detect if are out there any python 3.xy version newer than the
+        current one.
+
+        Note: this routine is limited to up to 2 digits for python3. We
+        may need to update it one day, hopefully on a distant future.
+        """
+        patterns = [
+            "python3.[0-9]",
+            "python3.[0-9][0-9]",
+        ]
+
+        # Seek for a python binary newer than min_version
+        for path in os.getenv("PATH", "").split(":"):
+            for pattern in patterns:
+                for cmd in glob(os.path.join(path, pattern)):
+                    if os.path.isfile(cmd) and os.access(cmd, os.X_OK):
+                        version = PythonVersion.get_python_version(cmd)
+                        if version >= min_version:
+                            return cmd
+
+        return None
+
+    @staticmethod
+    def check_python(min_version):
+        """
+        Check if the current python binary satisfies our minimal requirement
+        for Sphinx build. If not, re-run with a newer version if found.
+        """
+        cur_ver = sys.version_info[:3]
+        if cur_ver >= min_version:
+            ver = PythonVersion.ver_str(cur_ver)
+            print(f"Python version: {ver}")
+
+            return
+
+        python_ver = PythonVersion.ver_str(cur_ver)
+
+        new_python_cmd = PythonVersion.find_python(min_version)
+        if not new_python_cmd:
+            print(f"ERROR: Python version {python_ver} is not spported anymore\n")
+            print("       Can't find a new version. This script may fail")
+            return
+
+        # Restart script using the newer version
+        script_path = os.path.abspath(sys.argv[0])
+        args = [new_python_cmd, script_path] + sys.argv[1:]
+
+        print(f"Python {python_ver} not supported. Changing to {new_python_cmd}")
+
+        try:
+            os.execv(new_python_cmd, args)
+        except OSError as e:
+            sys.exit(f"Failed to restart with {new_python_cmd}: {e}")
diff --git a/tools/docs/sphinx-pre-install b/tools/docs/sphinx-pre-install
index 954ed3dc0645..d6d673b7945c 100755
--- a/tools/docs/sphinx-pre-install
+++ b/tools/docs/sphinx-pre-install
@@ -32,20 +32,10 @@ import subprocess
 import sys
 from glob import glob
 
+from lib.python_version import PythonVersion
 
-def parse_version(version):
-    """Convert a major.minor.patch version into a tuple"""
-    return tuple(int(x) for x in version.split("."))
-
-
-def ver_str(version):
-    """Returns a version tuple as major.minor.patch"""
-
-    return ".".join([str(x) for x in version])
-
-
-RECOMMENDED_VERSION = parse_version("3.4.3")
-MIN_PYTHON_VERSION = parse_version("3.7")
+RECOMMENDED_VERSION = PythonVersion("3.4.3").version
+MIN_PYTHON_VERSION = PythonVersion("3.7").version
 
 
 class DepManager:
@@ -235,95 +225,11 @@ class AncillaryMethods:
 
         return None
 
-    @staticmethod
-    def get_python_version(cmd):
-        """
-        Get python version from a Python binary. As we need to detect if
-        are out there newer python binaries, we can't rely on sys.release here.
-        """
-
-        result = SphinxDependencyChecker.run([cmd, "--version"],
-                                            capture_output=True, text=True)
-        version = result.stdout.strip()
-
-        match = re.search(r"(\d+\.\d+\.\d+)", version)
-        if match:
-            return parse_version(match.group(1))
-
-        print(f"Can't parse version {version}")
-        return (0, 0, 0)
-
-    @staticmethod
-    def find_python():
-        """
-        Detect if are out there any python 3.xy version newer than the
-        current one.
-
-        Note: this routine is limited to up to 2 digits for python3. We
-        may need to update it one day, hopefully on a distant future.
-        """
-        patterns = [
-            "python3.[0-9]",
-            "python3.[0-9][0-9]",
-        ]
-
-        # Seek for a python binary newer than MIN_PYTHON_VERSION
-        for path in os.getenv("PATH", "").split(":"):
-            for pattern in patterns:
-                for cmd in glob(os.path.join(path, pattern)):
-                    if os.path.isfile(cmd) and os.access(cmd, os.X_OK):
-                        version = SphinxDependencyChecker.get_python_version(cmd)
-                        if version >= MIN_PYTHON_VERSION:
-                            return cmd
-
-    @staticmethod
-    def check_python():
-        """
-        Check if the current python binary satisfies our minimal requirement
-        for Sphinx build. If not, re-run with a newer version if found.
-        """
-        cur_ver = sys.version_info[:3]
-        if cur_ver >= MIN_PYTHON_VERSION:
-            ver = ver_str(cur_ver)
-            print(f"Python version: {ver}")
-
-            # This could be useful for debugging purposes
-            if SphinxDependencyChecker.which("docutils"):
-                result = SphinxDependencyChecker.run(["docutils", "--version"],
-                                                    capture_output=True, text=True)
-                ver = result.stdout.strip()
-                match = re.search(r"(\d+\.\d+\.\d+)", ver)
-                if match:
-                    ver = match.group(1)
-
-                print(f"Docutils version: {ver}")
-
-            return
-
-        python_ver = ver_str(cur_ver)
-
-        new_python_cmd = SphinxDependencyChecker.find_python()
-        if not new_python_cmd:
-            print(f"ERROR: Python version {python_ver} is not spported anymore\n")
-            print("       Can't find a new version. This script may fail")
-            return
-
-        # Restart script using the newer version
-        script_path = os.path.abspath(sys.argv[0])
-        args = [new_python_cmd, script_path] + sys.argv[1:]
-
-        print(f"Python {python_ver} not supported. Changing to {new_python_cmd}")
-
-        try:
-            os.execv(new_python_cmd, args)
-        except OSError as e:
-            sys.exit(f"Failed to restart with {new_python_cmd}: {e}")
-
     @staticmethod
     def run(*args, **kwargs):
         """
         Excecute a command, hiding its output by default.
-        Preserve comatibility with older Python versions.
+        Preserve compatibility with older Python versions.
         """
 
         capture_output = kwargs.pop('capture_output', False)
@@ -527,11 +433,11 @@ class MissingCheckers(AncillaryMethods):
         for line in result.stdout.split("\n"):
             match = re.match(r"^sphinx-build\s+([\d\.]+)(?:\+(?:/[\da-f]+)|b\d+)?\s*$", line)
             if match:
-                return parse_version(match.group(1))
+                return PythonVersion.parse_version(match.group(1))
 
             match = re.match(r"^Sphinx.*\s+([\d\.]+)\s*$", line)
             if match:
-                return parse_version(match.group(1))
+                return PythonVersion.parse_version(match.group(1))
 
     def check_sphinx(self, conf):
         """
@@ -542,7 +448,7 @@ class MissingCheckers(AncillaryMethods):
                 for line in f:
                     match = re.match(r"^\s*needs_sphinx\s*=\s*[\'\"]([\d\.]+)[\'\"]", line)
                     if match:
-                        self.min_version = parse_version(match.group(1))
+                        self.min_version = PythonVersion.parse_version(match.group(1))
                         break
         except IOError:
             sys.exit(f"Can't open {conf}")
@@ -562,8 +468,8 @@ class MissingCheckers(AncillaryMethods):
             sys.exit(f"{sphinx} didn't return its version")
 
         if self.cur_version < self.min_version:
-            curver = ver_str(self.cur_version)
-            minver = ver_str(self.min_version)
+            curver = PythonVersion.ver_str(self.cur_version)
+            minver = PythonVersion.ver_str(self.min_version)
 
             print(f"ERROR: Sphinx version is {curver}. It should be >= {minver}")
             self.need_sphinx = 1
@@ -1304,7 +1210,7 @@ class SphinxDependencyChecker(MissingCheckers):
             else:
                 if self.need_sphinx and ver >= self.min_version:
                     return (f, ver)
-                elif parse_version(ver) > self.cur_version:
+                elif PythonVersion.parse_version(ver) > self.cur_version:
                     return (f, ver)
 
         return ("", ver)
@@ -1411,7 +1317,7 @@ class SphinxDependencyChecker(MissingCheckers):
             return
 
         if self.latest_avail_ver:
-            latest_avail_ver = ver_str(self.latest_avail_ver)
+            latest_avail_ver = PythonVersion.ver_str(self.latest_avail_ver)
 
         if not self.need_sphinx:
             # sphinx-build is present and its version is >= $min_version
@@ -1507,7 +1413,7 @@ class SphinxDependencyChecker(MissingCheckers):
         else:
             print("Unknown OS")
         if self.cur_version != (0, 0, 0):
-            ver = ver_str(self.cur_version)
+            ver = PythonVersion.ver_str(self.cur_version)
             print(f"Sphinx version: {ver}\n")
 
         # Check the type of virtual env, depending on Python version
@@ -1613,7 +1519,7 @@ def main():
 
     checker = SphinxDependencyChecker(args)
 
-    checker.check_python()
+    PythonVersion.check_python(MIN_PYTHON_VERSION)
     checker.check_needs()
 
 # Call main if not used as module
-- 
2.51.0


* [PATCH RESEND v3 06/15] tools/docs: sphinx-build-wrapper: add a wrapper for sphinx-build
  2025-09-01 15:33 [PATCH RESEND v3 00/15] Split sphinx call logic from docs Makefile Mauro Carvalho Chehab
                   ` (4 preceding siblings ...)
  2025-09-01 15:33 ` [PATCH RESEND v3 05/15] tools/docs: sphinx-pre-install: move Python version handling to lib Mauro Carvalho Chehab
@ 2025-09-01 15:33 ` Mauro Carvalho Chehab
  2025-09-01 15:33 ` [PATCH RESEND v3 07/15] tools/docs: sphinx-build-wrapper: add comments and blank lines Mauro Carvalho Chehab
                   ` (8 subsequent siblings)
  14 siblings, 0 replies; 16+ messages in thread
From: Mauro Carvalho Chehab @ 2025-09-01 15:33 UTC (permalink / raw)
  To: Jonathan Corbet, Linux Doc Mailing List
  Cc: Mauro Carvalho Chehab, Björn Roy Baron, Alex Gaynor,
	Alice Ryhl, Andreas Hindborg, Benno Lossin, Boqun Feng,
	Danilo Krummrich, Gary Guo, Mauro Carvalho Chehab, Miguel Ojeda,
	Trevor Gross, linux-kernel, rust-for-linux

There is too much magic inside the docs Makefile to properly run
sphinx-build. Create an ancillary script that contains all the
kernel-related sphinx-build call logic currently in the Makefile.

The script is designed to work both as a standalone command
and as part of a Makefile. As such, it properly handles the POSIX
jobserver used by GNU make.

On a side note, there was a line count increase due to the
conversion:

$ git show|diffstat -p1
 Documentation/Makefile          |  129 +++-------------
 tools/docs/sphinx-build-wrapper |  288 +++++++++++++++++++++++++++++++++++++
 2 files changed, 316 insertions(+), 101 deletions(-)

This is because some things are more verbose in Python and because
it requires reading env vars from the Makefile. Besides that, this
script has some extra features that don't exist in the Makefile:

- It can be called directly from the command line;
- It properly returns PDF build errors.

When running the script standalone, it will only handle sphinx-build
targets. In other words, it won't run make rustdoc after building the
HTML files, nor will it run the extra check scripts.
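
As a rough illustration, for a single SPHINXDIRS entry the standalone
htmldocs case boils down to something like the snippet below (heavily
simplified; the real script also handles the jobserver, theme/CSS
overrides, LaTeX/PDF post-processing, etc.):

    import subprocess

    builder = "html"    # picked from the TARGETS table in the script
    subprocess.call(["sphinx-build", "-b", builder,
                     "-c", "Documentation",
                     "-d", "Documentation/output/.doctrees",
                     "Documentation", "Documentation/output"])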

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 Documentation/Makefile          | 129 ++++----------
 tools/docs/sphinx-build-wrapper | 288 ++++++++++++++++++++++++++++++++
 2 files changed, 316 insertions(+), 101 deletions(-)
 create mode 100755 tools/docs/sphinx-build-wrapper

diff --git a/Documentation/Makefile b/Documentation/Makefile
index deb2029228ed..2b0ed8cd5ea8 100644
--- a/Documentation/Makefile
+++ b/Documentation/Makefile
@@ -23,21 +23,22 @@ SPHINXOPTS    =
 SPHINXDIRS    = .
 DOCS_THEME    =
 DOCS_CSS      =
-_SPHINXDIRS   = $(sort $(patsubst $(srctree)/Documentation/%/index.rst,%,$(wildcard $(srctree)/Documentation/*/index.rst)))
 SPHINX_CONF   = conf.py
 PAPER         =
 BUILDDIR      = $(obj)/output
 PDFLATEX      = xelatex
 LATEXOPTS     = -interaction=batchmode -no-shell-escape
 
+PYTHONPYCACHEPREFIX ?= $(abspath $(BUILDDIR)/__pycache__)
+
+# Wrapper for sphinx-build
+
+BUILD_WRAPPER = $(srctree)/tools/docs/sphinx-build-wrapper
+
 # For denylisting "variable font" files
 # Can be overridden by setting as an env variable
 FONTS_CONF_DENY_VF ?= $(HOME)/deny-vf
 
-ifeq ($(findstring 1, $(KBUILD_VERBOSE)),)
-SPHINXOPTS    += "-q"
-endif
-
 # User-friendly check for sphinx-build
 HAVE_SPHINX := $(shell if which $(SPHINXBUILD) >/dev/null 2>&1; then echo 1; else echo 0; fi)
 
@@ -51,63 +52,29 @@ ifeq ($(HAVE_SPHINX),0)
 
 else # HAVE_SPHINX
 
-# User-friendly check for pdflatex and latexmk
-HAVE_PDFLATEX := $(shell if which $(PDFLATEX) >/dev/null 2>&1; then echo 1; else echo 0; fi)
-HAVE_LATEXMK := $(shell if which latexmk >/dev/null 2>&1; then echo 1; else echo 0; fi)
+# Common documentation targets
+infodocs texinfodocs latexdocs epubdocs xmldocs pdfdocs linkcheckdocs:
+	$(Q)@$(srctree)/tools/docs/sphinx-pre-install --version-check
+	+$(Q)$(PYTHON3) $(BUILD_WRAPPER) $@ \
+		--sphinxdirs="$(SPHINXDIRS)" --conf=$(SPHINX_CONF) \
+		--theme=$(DOCS_THEME) --css=$(DOCS_CSS) --paper=$(PAPER)
 
-ifeq ($(HAVE_LATEXMK),1)
-	PDFLATEX := latexmk -$(PDFLATEX)
-endif #HAVE_LATEXMK
-
-# Internal variables.
-PAPEROPT_a4     = -D latex_elements.papersize=a4paper
-PAPEROPT_letter = -D latex_elements.papersize=letterpaper
-ALLSPHINXOPTS   = -D kerneldoc_srctree=$(srctree) -D kerneldoc_bin=$(KERNELDOC)
-ALLSPHINXOPTS   += $(PAPEROPT_$(PAPER)) $(SPHINXOPTS)
-ifneq ($(wildcard $(srctree)/.config),)
-ifeq ($(CONFIG_RUST),y)
-	# Let Sphinx know we will include rustdoc
-	ALLSPHINXOPTS   +=  -t rustdoc
-endif
+# Special handling for pdfdocs
+ifeq ($(shell which $(PDFLATEX) >/dev/null 2>&1; echo $$?),0)
+pdfdocs: DENY_VF = XDG_CONFIG_HOME=$(FONTS_CONF_DENY_VF)
+else
+pdfdocs:
+	$(warning The '$(PDFLATEX)' command was not found. Make sure you have it installed and in PATH to produce PDF output.)
+	@echo "  SKIP    Sphinx $@ target."
 endif
-# the i18n builder cannot share the environment and doctrees with the others
-I18NSPHINXOPTS  = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
-
-# commands; the 'cmd' from scripts/Kbuild.include is not *loopable*
-loop_cmd = $(echo-cmd) $(cmd_$(1)) || exit;
-
-# $2 sphinx builder e.g. "html"
-# $3 name of the build subfolder / e.g. "userspace-api/media", used as:
-#    * dest folder relative to $(BUILDDIR) and
-#    * cache folder relative to $(BUILDDIR)/.doctrees
-# $4 dest subfolder e.g. "man" for man pages at userspace-api/media/man
-# $5 reST source folder relative to $(src),
-#    e.g. "userspace-api/media" for the linux-tv book-set at ./Documentation/userspace-api/media
-
-PYTHONPYCACHEPREFIX ?= $(abspath $(BUILDDIR)/__pycache__)
-
-quiet_cmd_sphinx = SPHINX  $@ --> file://$(abspath $(BUILDDIR)/$3/$4)
-      cmd_sphinx = \
-	PYTHONPYCACHEPREFIX="$(PYTHONPYCACHEPREFIX)" \
-	BUILDDIR=$(abspath $(BUILDDIR)) SPHINX_CONF=$(abspath $(src)/$5/$(SPHINX_CONF)) \
-	$(PYTHON3) $(srctree)/scripts/jobserver-exec \
-	$(CONFIG_SHELL) $(srctree)/Documentation/sphinx/parallel-wrapper.sh \
-	$(SPHINXBUILD) \
-	-b $2 \
-	-c $(abspath $(src)) \
-	-d $(abspath $(BUILDDIR)/.doctrees/$3) \
-	-D version=$(KERNELVERSION) -D release=$(KERNELRELEASE) \
-	$(ALLSPHINXOPTS) \
-	$(abspath $(src)/$5) \
-	$(abspath $(BUILDDIR)/$3/$4) && \
-	if [ "x$(DOCS_CSS)" != "x" ]; then \
-		cp $(if $(patsubst /%,,$(DOCS_CSS)),$(abspath $(srctree)/$(DOCS_CSS)),$(DOCS_CSS)) $(BUILDDIR)/$3/_static/; \
-	fi
 
+# HTML main logic is identical to other targets. However, if rust is enabled,
+# an extra step at the end is required to generate rustdoc.
 htmldocs:
-	@$(srctree)/tools/docs/sphinx-pre-install --version-check
-	@+$(foreach var,$(SPHINXDIRS),$(call loop_cmd,sphinx,html,$(var),,$(var)))
-
+	$(Q)@$(srctree)/tools/docs/sphinx-pre-install --version-check
+	+$(Q)$(PYTHON3) $(BUILD_WRAPPER) $@ \
+		--sphinxdirs="$(SPHINXDIRS)" --conf=$(SPHINX_CONF) \
+		--theme=$(DOCS_THEME) --css=$(DOCS_CSS) --paper=$(PAPER)
 # If Rust support is available and .config exists, add rustdoc generated contents.
 # If there are any, the errors from this make rustdoc will be displayed but
 # won't stop the execution of htmldocs
@@ -118,49 +85,6 @@ ifeq ($(CONFIG_RUST),y)
 endif
 endif
 
-texinfodocs:
-	@$(srctree)/tools/docs/sphinx-pre-install --version-check
-	@+$(foreach var,$(SPHINXDIRS),$(call loop_cmd,sphinx,texinfo,$(var),texinfo,$(var)))
-
-# Note: the 'info' Make target is generated by sphinx itself when
-# running the texinfodocs target define above.
-infodocs: texinfodocs
-	$(MAKE) -C $(BUILDDIR)/texinfo info
-
-linkcheckdocs:
-	@$(foreach var,$(SPHINXDIRS),$(call loop_cmd,sphinx,linkcheck,$(var),,$(var)))
-
-latexdocs:
-	@$(srctree)/tools/docs/sphinx-pre-install --version-check
-	@+$(foreach var,$(SPHINXDIRS),$(call loop_cmd,sphinx,latex,$(var),latex,$(var)))
-
-ifeq ($(HAVE_PDFLATEX),0)
-
-pdfdocs:
-	$(warning The '$(PDFLATEX)' command was not found. Make sure you have it installed and in PATH to produce PDF output.)
-	@echo "  SKIP    Sphinx $@ target."
-
-else # HAVE_PDFLATEX
-
-pdfdocs: DENY_VF = XDG_CONFIG_HOME=$(FONTS_CONF_DENY_VF)
-pdfdocs: latexdocs
-	@$(srctree)/tools/docs/sphinx-pre-install --version-check
-	$(foreach var,$(SPHINXDIRS), \
-	   $(MAKE) PDFLATEX="$(PDFLATEX)" LATEXOPTS="$(LATEXOPTS)" $(DENY_VF) -C $(BUILDDIR)/$(var)/latex || sh $(srctree)/scripts/check-variable-fonts.sh || exit; \
-	   mkdir -p $(BUILDDIR)/$(var)/pdf; \
-	   mv $(subst .tex,.pdf,$(wildcard $(BUILDDIR)/$(var)/latex/*.tex)) $(BUILDDIR)/$(var)/pdf/; \
-	)
-
-endif # HAVE_PDFLATEX
-
-epubdocs:
-	@$(srctree)/tools/docs/sphinx-pre-install --version-check
-	@+$(foreach var,$(SPHINXDIRS),$(call loop_cmd,sphinx,epub,$(var),epub,$(var)))
-
-xmldocs:
-	@$(srctree)/tools/docs/sphinx-pre-install --version-check
-	@+$(foreach var,$(SPHINXDIRS),$(call loop_cmd,sphinx,xml,$(var),xml,$(var)))
-
 endif # HAVE_SPHINX
 
 # The following targets are independent of HAVE_SPHINX, and the rules should
@@ -172,6 +96,9 @@ refcheckdocs:
 cleandocs:
 	$(Q)rm -rf $(BUILDDIR)
 
+# Used only by the help target
+_SPHINXDIRS   = $(sort $(patsubst $(srctree)/Documentation/%/index.rst,%,$(wildcard $(srctree)/Documentation/*/index.rst)))
+
 dochelp:
 	@echo  ' Linux kernel internal documentation in different formats from ReST:'
 	@echo  '  htmldocs        - HTML'
diff --git a/tools/docs/sphinx-build-wrapper b/tools/docs/sphinx-build-wrapper
new file mode 100755
index 000000000000..df469af8a4ef
--- /dev/null
+++ b/tools/docs/sphinx-build-wrapper
@@ -0,0 +1,288 @@
+#!/usr/bin/env python3
+# SPDX-License-Identifier: GPL-2.0
+import argparse
+import os
+import shlex
+import shutil
+import subprocess
+import sys
+from lib.python_version import PythonVersion
+
+LIB_DIR = "../../scripts/lib"
+SRC_DIR = os.path.dirname(os.path.realpath(__file__))
+sys.path.insert(0, os.path.join(SRC_DIR, LIB_DIR))
+
+from jobserver import JobserverExec
+
+MIN_PYTHON_VERSION = PythonVersion("3.7").version
+PAPER = ["", "a4", "letter"]
+TARGETS = {
+    "cleandocs":     { "builder": "clean" },
+    "linkcheckdocs": { "builder": "linkcheck" },
+    "htmldocs":      { "builder": "html" },
+    "epubdocs":      { "builder": "epub",    "out_dir": "epub" },
+    "texinfodocs":   { "builder": "texinfo", "out_dir": "texinfo" },
+    "infodocs":      { "builder": "texinfo", "out_dir": "texinfo" },
+    "latexdocs":     { "builder": "latex",   "out_dir": "latex" },
+    "pdfdocs":       { "builder": "latex",   "out_dir": "latex" },
+    "xmldocs":       { "builder": "xml",     "out_dir": "xml" },
+}
+
+class SphinxBuilder:
+    def is_rust_enabled(self):
+        config_path = os.path.join(self.srctree, ".config")
+        if os.path.isfile(config_path):
+            with open(config_path, "r", encoding="utf-8") as f:
+                return "CONFIG_RUST=y" in f.read()
+        return False
+
+    def get_path(self, path, abs_path=False):
+        path = os.path.expanduser(path)
+        if not path.startswith("/"):
+            path = os.path.join(self.srctree, path)
+        if abs_path:
+            return os.path.abspath(path)
+        return path
+
+    def __init__(self, verbose=False, n_jobs=None):
+        self.verbose = None
+        self.kernelversion = os.environ.get("KERNELVERSION", "unknown")
+        self.kernelrelease = os.environ.get("KERNELRELEASE", "unknown")
+        self.pdflatex = os.environ.get("PDFLATEX", "xelatex")
+        self.latexopts = os.environ.get("LATEXOPTS", "-interaction=batchmode -no-shell-escape")
+        if not verbose:
+            verbose = bool(os.environ.get("KBUILD_VERBOSE", "") != "")
+        if verbose is not None:
+            self.verbose = verbose
+        parser = argparse.ArgumentParser()
+        parser.add_argument('-j', '--jobs', type=int)
+        parser.add_argument('-q', '--quiet', type=int)
+        sphinxopts = shlex.split(os.environ.get("SPHINXOPTS", ""))
+        sphinx_args, self.sphinxopts = parser.parse_known_args(sphinxopts)
+        if sphinx_args.quiet is True:
+            self.verbose = False
+        if sphinx_args.jobs:
+            self.n_jobs = sphinx_args.jobs
+        self.n_jobs = n_jobs
+        self.srctree = os.environ.get("srctree")
+        if not self.srctree:
+            self.srctree = "."
+            os.environ["srctree"] = self.srctree
+        self.sphinxbuild = os.environ.get("SPHINXBUILD", "sphinx-build")
+        self.kerneldoc = self.get_path(os.environ.get("KERNELDOC",
+                                                      "scripts/kernel-doc.py"))
+        self.obj = os.environ.get("obj", "Documentation")
+        self.builddir = self.get_path(os.path.join(self.obj, "output"),
+                                      abs_path=True)
+
+        self.config_rust = self.is_rust_enabled()
+
+        self.pdflatex_cmd = shutil.which(self.pdflatex)
+        self.latexmk_cmd = shutil.which("latexmk")
+
+        self.env = os.environ.copy()
+
+    def run_sphinx(self, sphinx_build, build_args, *args, **pwargs):
+        with JobserverExec() as jobserver:
+            if jobserver.claim:
+                n_jobs = str(jobserver.claim)
+            else:
+                n_jobs = "auto" # Supported since Sphinx 1.7
+            cmd = []
+            cmd.append(sys.executable)
+            cmd.append(sphinx_build)
+            if self.n_jobs:
+                n_jobs = str(self.n_jobs)
+
+            if n_jobs:
+                cmd += [f"-j{n_jobs}"]
+
+            if not self.verbose:
+                cmd.append("-q")
+            cmd += self.sphinxopts
+            cmd += build_args
+            if self.verbose:
+                print(" ".join(cmd))
+            return subprocess.call(cmd, *args, **pwargs)
+
+    def handle_html(self, css, output_dir):
+        if not css:
+            return
+        css = os.path.expanduser(css)
+        if not css.startswith("/"):
+            css = os.path.join(self.srctree, css)
+        static_dir = os.path.join(output_dir, "_static")
+        os.makedirs(static_dir, exist_ok=True)
+        try:
+            shutil.copy2(css, static_dir)
+        except (OSError, IOError) as e:
+            print(f"Warning: Failed to copy CSS: {e}", file=sys.stderr)
+
+    def handle_pdf(self, output_dirs):
+        builds = {}
+        max_len = 0
+        for from_dir in output_dirs:
+            pdf_dir = os.path.join(from_dir, "../pdf")
+            os.makedirs(pdf_dir, exist_ok=True)
+            if self.latexmk_cmd:
+                latex_cmd = [self.latexmk_cmd, f"-{self.pdflatex}"]
+            else:
+                latex_cmd = [self.pdflatex]
+            latex_cmd.extend(shlex.split(self.latexopts))
+            tex_suffix = ".tex"
+            has_tex = False
+            build_failed = False
+            with os.scandir(from_dir) as it:
+                for entry in it:
+                    if not entry.name.endswith(tex_suffix):
+                        continue
+                    name = entry.name[:-len(tex_suffix)]
+                    has_tex = True
+                    try:
+                        subprocess.run(latex_cmd + [entry.path],
+                                       cwd=from_dir, check=True)
+                    except subprocess.CalledProcessError:
+                        pass
+                    pdf_name = name + ".pdf"
+                    pdf_from = os.path.join(from_dir, pdf_name)
+                    pdf_to = os.path.join(pdf_dir, pdf_name)
+                    if os.path.exists(pdf_from):
+                        os.rename(pdf_from, pdf_to)
+                        builds[name] = os.path.relpath(pdf_to, self.builddir)
+                    else:
+                        builds[name] = "FAILED"
+                        build_failed = True
+                    name = entry.name.removesuffix(".tex")
+                    max_len = max(max_len, len(name))
+
+            if not has_tex:
+                name = os.path.basename(from_dir)
+                max_len = max(max_len, len(name))
+                builds[name] = "FAILED (no .tex)"
+                build_failed = True
+        msg = "Summary"
+        msg += "\n" + "=" * len(msg)
+        print()
+        print(msg)
+        for pdf_name, pdf_file in builds.items():
+            print(f"{pdf_name:<{max_len}}: {pdf_file}")
+        print()
+        if build_failed:
+            sys.exit("PDF build failed: not all PDF files were created.")
+        else:
+            print("All PDF files were built.")
+
+    def handle_info(self, output_dirs):
+        for output_dir in output_dirs:
+            try:
+                subprocess.run(["make", "info"], cwd=output_dir, check=True)
+            except subprocess.CalledProcessError as e:
+                sys.exit(f"Error generating info docs: {e}")
+
+    def cleandocs(self, builder):
+        shutil.rmtree(self.builddir, ignore_errors=True)
+
+    def build(self, target, sphinxdirs=None, conf="conf.py",
+              theme=None, css=None, paper=None):
+        builder = TARGETS[target]["builder"]
+        out_dir = TARGETS[target].get("out_dir", "")
+        if target == "cleandocs":
+            self.cleandocs(builder)
+            return
+        if theme:
+                os.environ["DOCS_THEME"] = theme
+        sphinxbuild = shutil.which(self.sphinxbuild, path=self.env["PATH"])
+        if not sphinxbuild:
+            sys.exit(f"Error: {self.sphinxbuild} not found in PATH.\n")
+        if builder == "latex":
+            if not self.pdflatex_cmd and not self.latexmk_cmd:
+                sys.exit("Error: pdflatex or latexmk required for PDF generation")
+        docs_dir = os.path.abspath(os.path.join(self.srctree, "Documentation"))
+        kerneldoc = self.kerneldoc
+        if kerneldoc.startswith(self.srctree):
+            kerneldoc = os.path.relpath(kerneldoc, self.srctree)
+        args = [ "-b", builder, "-c", docs_dir ]
+        if builder == "latex":
+            if not paper:
+                paper = PAPER[1]
+            args.extend(["-D", f"latex_elements.papersize={paper}paper"])
+        if self.config_rust:
+            args.extend(["-t", "rustdoc"])
+        if conf:
+            self.env["SPHINX_CONF"] = self.get_path(conf, abs_path=True)
+        if not sphinxdirs:
+            sphinxdirs = os.environ.get("SPHINXDIRS", ".")
+        sphinxdirs_list = []
+        for sphinxdir in sphinxdirs:
+            if isinstance(sphinxdir, list):
+                sphinxdirs_list += sphinxdir
+            else:
+                for name in sphinxdir.split(" "):
+                    sphinxdirs_list.append(name)
+        output_dirs = []
+        for sphinxdir in sphinxdirs_list:
+            src_dir = os.path.join(docs_dir, sphinxdir)
+            doctree_dir = os.path.join(self.builddir, ".doctrees")
+            output_dir = os.path.join(self.builddir, sphinxdir, out_dir)
+            src_dir = os.path.normpath(src_dir)
+            doctree_dir = os.path.normpath(doctree_dir)
+            output_dir = os.path.normpath(output_dir)
+            os.makedirs(doctree_dir, exist_ok=True)
+            os.makedirs(output_dir, exist_ok=True)
+            output_dirs.append(output_dir)
+            build_args = args + [
+                "-d", doctree_dir,
+                "-D", f"kerneldoc_bin={kerneldoc}",
+                "-D", f"version={self.kernelversion}",
+                "-D", f"release={self.kernelrelease}",
+                "-D", f"kerneldoc_srctree={self.srctree}",
+                src_dir,
+                output_dir,
+            ]
+            try:
+                self.run_sphinx(sphinxbuild, build_args, env=self.env)
+            except (OSError, ValueError, subprocess.SubprocessError) as e:
+                sys.exit(f"Build failed: {repr(e)}")
+            if target in ["htmldocs", "epubdocs"]:
+                self.handle_html(css, output_dir)
+        if target == "pdfdocs":
+            self.handle_pdf(output_dirs)
+        elif target == "infodocs":
+            self.handle_info(output_dirs)
+
+def jobs_type(value):
+    if value is None:
+        return None
+    if value.lower() == 'auto':
+        return value.lower()
+    try:
+        if int(value) >= 1:
+            return value
+        raise argparse.ArgumentTypeError(f"Minimum jobs is 1, got {value}")
+    except ValueError:
+        raise argparse.ArgumentTypeError(f"Must be 'auto' or positive integer, got {value}")
+
+def main():
+    parser = argparse.ArgumentParser(description="Kernel documentation builder")
+    parser.add_argument("target", choices=list(TARGETS.keys()),
+                        help="Documentation target to build")
+    parser.add_argument("--sphinxdirs", nargs="+",
+                        help="Specific directories to build")
+    parser.add_argument("--conf", default="conf.py",
+                        help="Sphinx configuration file")
+    parser.add_argument("--theme", help="Sphinx theme to use")
+    parser.add_argument("--css", help="Custom CSS file for HTML/EPUB")
+    parser.add_argument("--paper", choices=PAPER, default=PAPER[0],
+                        help="Paper size for LaTeX/PDF output")
+    parser.add_argument("-v", "--verbose", action='store_true',
+                        help="place build in verbose mode")
+    parser.add_argument('-j', '--jobs', type=jobs_type,
+                        help="Sets number of jobs to use with sphinx-build")
+    args = parser.parse_args()
+    PythonVersion.check_python(MIN_PYTHON_VERSION)
+    builder = SphinxBuilder(verbose=args.verbose, n_jobs=args.jobs)
+    builder.build(args.target, sphinxdirs=args.sphinxdirs, conf=args.conf,
+                  theme=args.theme, css=args.css, paper=args.paper)
+
+if __name__ == "__main__":
+    main()
-- 
2.51.0


^ permalink raw reply related	[flat|nested] 16+ messages in thread

* [PATCH RESEND v3 07/15] tools/docs: sphinx-build-wrapper: add comments and blank lines
  2025-09-01 15:33 [PATCH RESEND v3 00/15] Split sphinx call logic from docs Makefile Mauro Carvalho Chehab
                   ` (5 preceding siblings ...)
  2025-09-01 15:33 ` [PATCH RESEND v3 06/15] tools/docs: sphinx-build-wrapper: add a wrapper for sphinx-build Mauro Carvalho Chehab
@ 2025-09-01 15:33 ` Mauro Carvalho Chehab
  2025-09-01 15:33 ` [PATCH RESEND v3 08/15] tools/docs: sphinx-build-wrapper: add support to run inside venv Mauro Carvalho Chehab
                   ` (7 subsequent siblings)
  14 siblings, 0 replies; 16+ messages in thread
From: Mauro Carvalho Chehab @ 2025-09-01 15:33 UTC (permalink / raw)
  To: Jonathan Corbet, Linux Doc Mailing List
  Cc: Mauro Carvalho Chehab, Björn Roy Baron, Alex Gaynor,
	Alice Ryhl, Andreas Hindborg, Benno Lossin, Boqun Feng,
	Danilo Krummrich, Gary Guo, Mauro Carvalho Chehab, Miguel Ojeda,
	Trevor Gross, linux-kernel, rust-for-linux

To help show the actual size of the script when it was added,
I opted to strip out all comments from the original script.

Add them here:

 tools/docs/sphinx-build-wrapper | 261 +++++++++++++++++++++++++++++++-
 1 file changed, 257 insertions(+), 4 deletions(-)

As the script itself has 288 lines of code, this means that
about half of the script is comments.

Also ensure pylint won't report any warnings.

No functional changes.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 tools/docs/sphinx-build-wrapper | 261 +++++++++++++++++++++++++++++++-
 1 file changed, 257 insertions(+), 4 deletions(-)

diff --git a/tools/docs/sphinx-build-wrapper b/tools/docs/sphinx-build-wrapper
index df469af8a4ef..e9c522794fbe 100755
--- a/tools/docs/sphinx-build-wrapper
+++ b/tools/docs/sphinx-build-wrapper
@@ -1,21 +1,71 @@
 #!/usr/bin/env python3
 # SPDX-License-Identifier: GPL-2.0
+# Copyright (C) 2025 Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
+#
+# pylint: disable=R0902, R0912, R0913, R0914, R0915, R0917, C0103
+#
+# Converted from docs Makefile and parallel-wrapper.sh, both under
+# GPLv2, copyrighted since 2008 by the following authors:
+#
+#    Akira Yokosawa <akiyks@gmail.com>
+#    Arnd Bergmann <arnd@arndb.de>
+#    Breno Leitao <leitao@debian.org>
+#    Carlos Bilbao <carlos.bilbao@amd.com>
+#    Dave Young <dyoung@redhat.com>
+#    Donald Hunter <donald.hunter@gmail.com>
+#    Geert Uytterhoeven <geert+renesas@glider.be>
+#    Jani Nikula <jani.nikula@intel.com>
+#    Jan Stancek <jstancek@redhat.com>
+#    Jonathan Corbet <corbet@lwn.net>
+#    Joshua Clayton <stillcompiling@gmail.com>
+#    Kees Cook <keescook@chromium.org>
+#    Linus Torvalds <torvalds@linux-foundation.org>
+#    Magnus Damm <damm+renesas@opensource.se>
+#    Masahiro Yamada <masahiroy@kernel.org>
+#    Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
+#    Maxim Cournoyer <maxim.cournoyer@gmail.com>
+#    Peter Foley <pefoley2@pefoley.com>
+#    Randy Dunlap <rdunlap@infradead.org>
+#    Rob Herring <robh@kernel.org>
+#    Shuah Khan <shuahkh@osg.samsung.com>
+#    Thorsten Blum <thorsten.blum@toblux.com>
+#    Tomas Winkler <tomas.winkler@intel.com>
+
+
+"""
+Sphinx build wrapper that handles Kernel-specific business rules:
+
+- it gets the Kernel build environment vars;
+- it determines what's the best parallelism;
+- it handles SPHINXDIRS
+
+This tool ensures that MIN_PYTHON_VERSION is satisfied. If the current
+version is below that, it looks for a newer Python interpreter. If found,
+it re-runs itself using the newer version.
+"""
+
 import argparse
 import os
 import shlex
 import shutil
 import subprocess
 import sys
+
 from lib.python_version import PythonVersion
 
 LIB_DIR = "../../scripts/lib"
 SRC_DIR = os.path.dirname(os.path.realpath(__file__))
+
 sys.path.insert(0, os.path.join(SRC_DIR, LIB_DIR))
 
-from jobserver import JobserverExec
+from jobserver import JobserverExec         # pylint: disable=C0413,C0411,E0401
 
+#
+#  Some constants
+#
 MIN_PYTHON_VERSION = PythonVersion("3.7").version
 PAPER = ["", "a4", "letter"]
+
 TARGETS = {
     "cleandocs":     { "builder": "clean" },
     "linkcheckdocs": { "builder": "linkcheck" },
@@ -28,8 +78,19 @@ TARGETS = {
     "xmldocs":       { "builder": "xml",     "out_dir": "xml" },
 }
 
+
+#
+# SphinxBuilder class
+#
+
 class SphinxBuilder:
+    """
+    Handles a sphinx-build target, adding needed arguments to build
+    with the Kernel.
+    """
+
     def is_rust_enabled(self):
+        """Check if rust is enabled at .config"""
         config_path = os.path.join(self.srctree, ".config")
         if os.path.isfile(config_path):
             with open(config_path, "r", encoding="utf-8") as f:
@@ -37,37 +98,83 @@ class SphinxBuilder:
         return False
 
     def get_path(self, path, abs_path=False):
+        """
+        Ancillary routine to handle paths the right way, as a shell does.
+
+        It first expands "~" and "~user". Then, if the path is not absolute,
+        it joins self.srctree. Finally, if requested, it converts the result
+        to an absolute path.
+        """
+
         path = os.path.expanduser(path)
         if not path.startswith("/"):
             path = os.path.join(self.srctree, path)
+
         if abs_path:
             return os.path.abspath(path)
+
         return path
 
     def __init__(self, verbose=False, n_jobs=None):
+        """Initialize internal variables"""
         self.verbose = None
+
+        #
+        # Normal variables passed from Kernel's makefile
+        #
         self.kernelversion = os.environ.get("KERNELVERSION", "unknown")
         self.kernelrelease = os.environ.get("KERNELRELEASE", "unknown")
         self.pdflatex = os.environ.get("PDFLATEX", "xelatex")
         self.latexopts = os.environ.get("LATEXOPTS", "-interaction=batchmode -no-shell-escape")
+
         if not verbose:
             verbose = bool(os.environ.get("KBUILD_VERBOSE", "") != "")
+
         if verbose is not None:
             self.verbose = verbose
+
+        #
+        # As we handle the number of jobs and quiet separately, we need to
+        # parse both the same way sphinx-build would, optionally accepting
+        # whitespace between option and value. So let's use argparse to
+        # handle argument expansion
+        #
         parser = argparse.ArgumentParser()
         parser.add_argument('-j', '--jobs', type=int)
         parser.add_argument('-q', '--quiet', type=int)
+
+        #
+        # Other sphinx-build arguments go as-is, so place them
+        # at self.sphinxopts, using shell parser
+        #
         sphinxopts = shlex.split(os.environ.get("SPHINXOPTS", ""))
+
+        #
+        # Build a list of sphinx args
+        #
         sphinx_args, self.sphinxopts = parser.parse_known_args(sphinxopts)
         if sphinx_args.quiet is True:
             self.verbose = False
+
         if sphinx_args.jobs:
             self.n_jobs = sphinx_args.jobs
+
+        #
+        # If the command line argument "-j" is used, override SPHINXOPTS
+        #
+
         self.n_jobs = n_jobs
+
+        #
+        # Source tree directory. This needs to be at os.environ, as
+        # Sphinx extensions use it
+        #
         self.srctree = os.environ.get("srctree")
         if not self.srctree:
             self.srctree = "."
             os.environ["srctree"] = self.srctree
+
+        #
+        # Now that we can expand srctree, get other directories as well
+        #
         self.sphinxbuild = os.environ.get("SPHINXBUILD", "sphinx-build")
         self.kerneldoc = self.get_path(os.environ.get("KERNELDOC",
                                                       "scripts/kernel-doc.py"))
@@ -77,20 +184,36 @@ class SphinxBuilder:
 
         self.config_rust = self.is_rust_enabled()
 
+        #
+        # Get directory locations for LaTeX build toolchain
+        #
         self.pdflatex_cmd = shutil.which(self.pdflatex)
         self.latexmk_cmd = shutil.which("latexmk")
 
         self.env = os.environ.copy()
 
     def run_sphinx(self, sphinx_build, build_args, *args, **pwargs):
+        """
+        Executes sphinx-build using the current python3 interpreter, setting
+        the -j parameter, if possible, to run the build in parallel.
+        """
+
         with JobserverExec() as jobserver:
             if jobserver.claim:
                 n_jobs = str(jobserver.claim)
             else:
                 n_jobs = "auto" # Supported since Sphinx 1.7
+
             cmd = []
+
             cmd.append(sys.executable)
+
             cmd.append(sphinx_build)
+
+            #
+            # Override auto setting, if explicitly passed from command line
+            # or via SPHINXOPTS
+            #
             if self.n_jobs:
                 n_jobs = str(self.n_jobs)
 
@@ -99,59 +222,100 @@ class SphinxBuilder:
 
             if not self.verbose:
                 cmd.append("-q")
+
             cmd += self.sphinxopts
             cmd += build_args
+
             if self.verbose:
                 print(" ".join(cmd))
             return subprocess.call(cmd, *args, **pwargs)
 
     def handle_html(self, css, output_dir):
+        """
+        Extra steps for HTML and epub output.
+
+        For such targets, we need to ensure that CSS will be properly
+        copied to the output _static directory
+        """
+
         if not css:
             return
+
         css = os.path.expanduser(css)
         if not css.startswith("/"):
             css = os.path.join(self.srctree, css)
+
         static_dir = os.path.join(output_dir, "_static")
         os.makedirs(static_dir, exist_ok=True)
+
         try:
             shutil.copy2(css, static_dir)
         except (OSError, IOError) as e:
             print(f"Warning: Failed to copy CSS: {e}", file=sys.stderr)
 
     def handle_pdf(self, output_dirs):
+        """
+        Extra steps for PDF output.
+
+        As PDF is handled via a LaTeX output, after building the .tex file,
+        a new build is needed to create the PDF output from the latex
+        directory.
+        """
         builds = {}
         max_len = 0
+
         for from_dir in output_dirs:
             pdf_dir = os.path.join(from_dir, "../pdf")
             os.makedirs(pdf_dir, exist_ok=True)
+
             if self.latexmk_cmd:
                 latex_cmd = [self.latexmk_cmd, f"-{self.pdflatex}"]
             else:
                 latex_cmd = [self.pdflatex]
+
             latex_cmd.extend(shlex.split(self.latexopts))
+
             tex_suffix = ".tex"
+
+            #
+            # Process each .tex file
+            #
+
             has_tex = False
             build_failed = False
             with os.scandir(from_dir) as it:
                 for entry in it:
                     if not entry.name.endswith(tex_suffix):
                         continue
+
                     name = entry.name[:-len(tex_suffix)]
                     has_tex = True
+
+                    #
+                    # The LaTeX PDF error code is almost useless for us:
+                    # any warning makes it non-zero. For kernel doc builds it
+                    # always returns non-zero even when the build succeeds.
+                    # So, let's do the next best thing: check if all PDF
+                    # files were built. If they were, print a summary and
+                    # return 0 at the end of this function
+                    #
                     try:
                         subprocess.run(latex_cmd + [entry.path],
                                        cwd=from_dir, check=True)
                     except subprocess.CalledProcessError:
                         pass
+
                     pdf_name = name + ".pdf"
                     pdf_from = os.path.join(from_dir, pdf_name)
                     pdf_to = os.path.join(pdf_dir, pdf_name)
+
                     if os.path.exists(pdf_from):
                         os.rename(pdf_from, pdf_to)
                         builds[name] = os.path.relpath(pdf_to, self.builddir)
                     else:
                         builds[name] = "FAILED"
                         build_failed = True
+
                     name = entry.name.removesuffix(".tex")
                     max_len = max(max_len, len(name))
 
@@ -160,58 +324,100 @@ class SphinxBuilder:
                 max_len = max(max_len, len(name))
                 builds[name] = "FAILED (no .tex)"
                 build_failed = True
+
         msg = "Summary"
         msg += "\n" + "=" * len(msg)
         print()
         print(msg)
+
         for pdf_name, pdf_file in builds.items():
             print(f"{pdf_name:<{max_len}}: {pdf_file}")
+
         print()
+
         if build_failed:
             sys.exit("PDF build failed: not all PDF files were created.")
         else:
             print("All PDF files were built.")
 
     def handle_info(self, output_dirs):
+        """
+        Extra steps for Info output.
+
+        For texinfo generation, an additional make is needed from the
+        texinfo directory.
+        """
+
         for output_dir in output_dirs:
             try:
                 subprocess.run(["make", "info"], cwd=output_dir, check=True)
             except subprocess.CalledProcessError as e:
                 sys.exit(f"Error generating info docs: {e}")
 
-    def cleandocs(self, builder):
+    def cleandocs(self, builder):           # pylint: disable=W0613
+        """Remove documentation output directory"""
         shutil.rmtree(self.builddir, ignore_errors=True)
 
     def build(self, target, sphinxdirs=None, conf="conf.py",
               theme=None, css=None, paper=None):
+        """
+        Build documentation using Sphinx. This is the core function of this
+        module. It prepares all arguments required by sphinx-build.
+        """
+
         builder = TARGETS[target]["builder"]
         out_dir = TARGETS[target].get("out_dir", "")
+
+        #
+        # Cleandocs doesn't require sphinx-build
+        #
         if target == "cleandocs":
             self.cleandocs(builder)
             return
+
         if theme:
-                os.environ["DOCS_THEME"] = theme
+            os.environ["DOCS_THEME"] = theme
+
+        #
+        # Other targets require sphinx-build, so check if it exists
+        #
         sphinxbuild = shutil.which(self.sphinxbuild, path=self.env["PATH"])
         if not sphinxbuild:
             sys.exit(f"Error: {self.sphinxbuild} not found in PATH.\n")
+
         if builder == "latex":
             if not self.pdflatex_cmd and not self.latexmk_cmd:
                 sys.exit("Error: pdflatex or latexmk required for PDF generation")
+
         docs_dir = os.path.abspath(os.path.join(self.srctree, "Documentation"))
+
+        #
+        # Fill in base arguments for Sphinx build
+        #
         kerneldoc = self.kerneldoc
         if kerneldoc.startswith(self.srctree):
             kerneldoc = os.path.relpath(kerneldoc, self.srctree)
+
         args = [ "-b", builder, "-c", docs_dir ]
+
         if builder == "latex":
             if not paper:
                 paper = PAPER[1]
+
             args.extend(["-D", f"latex_elements.papersize={paper}paper"])
+
         if self.config_rust:
             args.extend(["-t", "rustdoc"])
+
         if conf:
             self.env["SPHINX_CONF"] = self.get_path(conf, abs_path=True)
+
         if not sphinxdirs:
             sphinxdirs = os.environ.get("SPHINXDIRS", ".")
+
+        #
+        # sphinxdirs can be a list or a whitespace-separated string
+        #
         sphinxdirs_list = []
         for sphinxdir in sphinxdirs:
             if isinstance(sphinxdir, list):
@@ -219,17 +425,32 @@ class SphinxBuilder:
             else:
                 for name in sphinxdir.split(" "):
                     sphinxdirs_list.append(name)
+
+        #
+        # Step 1:  Build each directory in separate.
+        #
+        # This is not the best way of handling it, as cross-references between
+        # them will be broken, but this is what we've been doing since
+        # the beginning.
+        #
         output_dirs = []
         for sphinxdir in sphinxdirs_list:
             src_dir = os.path.join(docs_dir, sphinxdir)
             doctree_dir = os.path.join(self.builddir, ".doctrees")
             output_dir = os.path.join(self.builddir, sphinxdir, out_dir)
+
+            #
+            # Make directory names canonical
+            #
             src_dir = os.path.normpath(src_dir)
             doctree_dir = os.path.normpath(doctree_dir)
             output_dir = os.path.normpath(output_dir)
+
             os.makedirs(doctree_dir, exist_ok=True)
             os.makedirs(output_dir, exist_ok=True)
+
             output_dirs.append(output_dir)
+
             build_args = args + [
                 "-d", doctree_dir,
                 "-D", f"kerneldoc_bin={kerneldoc}",
@@ -239,48 +460,80 @@ class SphinxBuilder:
                 src_dir,
                 output_dir,
             ]
+
             try:
                 self.run_sphinx(sphinxbuild, build_args, env=self.env)
             except (OSError, ValueError, subprocess.SubprocessError) as e:
                 sys.exit(f"Build failed: {repr(e)}")
+
+            #
+            # Ensure that each html/epub output will have needed static files
+            #
             if target in ["htmldocs", "epubdocs"]:
                 self.handle_html(css, output_dir)
+
+        #
+        # Step 2: Some targets (PDF and info) require an extra step once
+        #         sphinx-build finishes
+        #
         if target == "pdfdocs":
             self.handle_pdf(output_dirs)
         elif target == "infodocs":
             self.handle_info(output_dirs)
 
 def jobs_type(value):
+    """
+    Handle valid values for -j. Accepts Sphinx "-jauto", plus a number
+    equal to or greater than one.
+    """
     if value is None:
         return None
+
     if value.lower() == 'auto':
         return value.lower()
+
     try:
         if int(value) >= 1:
             return value
+
         raise argparse.ArgumentTypeError(f"Minimum jobs is 1, got {value}")
     except ValueError:
-        raise argparse.ArgumentTypeError(f"Must be 'auto' or positive integer, got {value}")
+        raise argparse.ArgumentTypeError(f"Must be 'auto' or positive integer, got {value}")  # pylint: disable=W0707
 
 def main():
+    """
+    Main function. The only mandatory argument is the target. The other
+    arguments fall back to default values when not specified via
+    os.environ.
+    """
     parser = argparse.ArgumentParser(description="Kernel documentation builder")
+
     parser.add_argument("target", choices=list(TARGETS.keys()),
                         help="Documentation target to build")
     parser.add_argument("--sphinxdirs", nargs="+",
                         help="Specific directories to build")
     parser.add_argument("--conf", default="conf.py",
                         help="Sphinx configuration file")
+
     parser.add_argument("--theme", help="Sphinx theme to use")
+
     parser.add_argument("--css", help="Custom CSS file for HTML/EPUB")
+
     parser.add_argument("--paper", choices=PAPER, default=PAPER[0],
                         help="Paper size for LaTeX/PDF output")
+
     parser.add_argument("-v", "--verbose", action='store_true',
                         help="place build in verbose mode")
+
     parser.add_argument('-j', '--jobs', type=jobs_type,
                         help="Sets number of jobs to use with sphinx-build")
+
     args = parser.parse_args()
+
     PythonVersion.check_python(MIN_PYTHON_VERSION)
+
     builder = SphinxBuilder(verbose=args.verbose, n_jobs=args.jobs)
+
     builder.build(args.target, sphinxdirs=args.sphinxdirs, conf=args.conf,
                   theme=args.theme, css=args.css, paper=args.paper)
 
-- 
2.51.0


^ permalink raw reply related	[flat|nested] 16+ messages in thread

* [PATCH RESEND v3 08/15] tools/docs: sphinx-build-wrapper: add support to run inside venv
  2025-09-01 15:33 [PATCH RESEND v3 00/15] Split sphinx call logic from docs Makefile Mauro Carvalho Chehab
                   ` (6 preceding siblings ...)
  2025-09-01 15:33 ` [PATCH RESEND v3 07/15] tools/docs: sphinx-build-wrapper: add comments and blank lines Mauro Carvalho Chehab
@ 2025-09-01 15:33 ` Mauro Carvalho Chehab
  2025-09-01 15:33 ` [PATCH RESEND v3 09/15] docs: parallel-wrapper.sh: remove script Mauro Carvalho Chehab
                   ` (6 subsequent siblings)
  14 siblings, 0 replies; 16+ messages in thread
From: Mauro Carvalho Chehab @ 2025-09-01 15:33 UTC (permalink / raw)
  To: Jonathan Corbet, Linux Doc Mailing List
  Cc: Mauro Carvalho Chehab, Mauro Carvalho Chehab, linux-kernel

Sometimes it is desirable to run Sphinx from a virtual environment.
Add a command line parameter to automatically run Sphinx from
such an environment.
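
For instance, assuming a venv previously created and populated with
Sphinx (the directory name and install step below are just an
illustration; sphinx-pre-install can be used to check what is actually
needed):

	$ python3 -m venv sphinx_latest
	$ ./sphinx_latest/bin/pip install Sphinx
	$ ./tools/docs/sphinx-build-wrapper htmldocs --venv sphinx_latest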

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 tools/docs/sphinx-build-wrapper | 31 ++++++++++++++++++++++++++++---
 1 file changed, 28 insertions(+), 3 deletions(-)

diff --git a/tools/docs/sphinx-build-wrapper b/tools/docs/sphinx-build-wrapper
index e9c522794fbe..9c5df50fb99d 100755
--- a/tools/docs/sphinx-build-wrapper
+++ b/tools/docs/sphinx-build-wrapper
@@ -63,6 +63,7 @@ from jobserver import JobserverExec         # pylint: disable=C0413,C0411,E0401
 #
 #  Some constants
 #
+VENV_DEFAULT = "sphinx_latest"
 MIN_PYTHON_VERSION = PythonVersion("3.7").version
 PAPER = ["", "a4", "letter"]
 
@@ -114,8 +115,9 @@ class SphinxBuilder:
 
         return path
 
-    def __init__(self, verbose=False, n_jobs=None):
+    def __init__(self, venv=None, verbose=False, n_jobs=None):
         """Initialize internal variables"""
+        self.venv = venv
         self.verbose = None
 
         #
@@ -192,6 +194,21 @@ class SphinxBuilder:
 
         self.env = os.environ.copy()
 
+        #
+        # If venv command line argument is specified, run Sphinx from venv
+        #
+        if venv:
+            bin_dir = os.path.join(venv, "bin")
+            if not os.path.isfile(os.path.join(bin_dir, "activate")):
+                sys.exit(f"Venv {venv} not found.")
+
+            # "activate" virtual env
+            self.env["PATH"] = bin_dir + ":" + self.env["PATH"]
+            self.env["VIRTUAL_ENV"] = venv
+            if "PYTHONHOME" in self.env:
+                del self.env["PYTHONHOME"]
+            print(f"Setting venv to {venv}")
+
     def run_sphinx(self, sphinx_build, build_args, *args, **pwargs):
         """
         Executes sphinx-build using current python3 command and setting
@@ -206,7 +223,10 @@ class SphinxBuilder:
 
             cmd = []
 
-            cmd.append(sys.executable)
+            if self.venv:
+                cmd.append("python")
+            else:
+                cmd.append(sys.executable)
 
             cmd.append(sphinx_build)
 
@@ -528,11 +548,16 @@ def main():
     parser.add_argument('-j', '--jobs', type=jobs_type,
                         help="Sets number of jobs to use with sphinx-build")
 
+    parser.add_argument("-V", "--venv", nargs='?', const=f'{VENV_DEFAULT}',
+                        default=None,
+                        help=f'If used, run Sphinx from a venv dir (default dir: {VENV_DEFAULT})')
+
     args = parser.parse_args()
 
     PythonVersion.check_python(MIN_PYTHON_VERSION)
 
-    builder = SphinxBuilder(verbose=args.verbose, n_jobs=args.jobs)
+    builder = SphinxBuilder(venv=args.venv, verbose=args.verbose,
+                            n_jobs=args.jobs)
 
     builder.build(args.target, sphinxdirs=args.sphinxdirs, conf=args.conf,
                   theme=args.theme, css=args.css, paper=args.paper)
-- 
2.51.0


^ permalink raw reply related	[flat|nested] 16+ messages in thread

* [PATCH RESEND v3 09/15] docs: parallel-wrapper.sh: remove script
  2025-09-01 15:33 [PATCH RESEND v3 00/15] Split sphinx call logic from docs Makefile Mauro Carvalho Chehab
                   ` (7 preceding siblings ...)
  2025-09-01 15:33 ` [PATCH RESEND v3 08/15] tools/docs: sphinx-build-wrapper: add support to run inside venv Mauro Carvalho Chehab
@ 2025-09-01 15:33 ` Mauro Carvalho Chehab
  2025-09-01 15:33 ` [PATCH RESEND v3 10/15] docs: Makefile: document latex/PDF PAPER= parameter Mauro Carvalho Chehab
                   ` (5 subsequent siblings)
  14 siblings, 0 replies; 16+ messages in thread
From: Mauro Carvalho Chehab @ 2025-09-01 15:33 UTC (permalink / raw)
  To: Jonathan Corbet, Linux Doc Mailing List
  Cc: Mauro Carvalho Chehab, Mauro Carvalho Chehab, linux-kernel

The only user of this script was the docs Makefile. Now that
it uses the new sphinx-build-wrapper, which carries the logic
from parallel-wrapper.sh, we can drop this script.
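
For reference, the equivalent logic now lives in the wrapper's
run_sphinx(); roughly (a simplified sketch, with sphinx_build and
build_args standing in for the full command construction done there):

	with JobserverExec() as jobserver:
	    if jobserver.claim:
	        n_jobs = str(jobserver.claim)
	    else:
	        n_jobs = "auto"   # supported since Sphinx 1.7
	    subprocess.call([sys.executable, sphinx_build, f"-j{n_jobs}"]
	                    + build_args)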

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 Documentation/sphinx/parallel-wrapper.sh | 33 ------------------------
 1 file changed, 33 deletions(-)
 delete mode 100644 Documentation/sphinx/parallel-wrapper.sh

diff --git a/Documentation/sphinx/parallel-wrapper.sh b/Documentation/sphinx/parallel-wrapper.sh
deleted file mode 100644
index e54c44ce117d..000000000000
--- a/Documentation/sphinx/parallel-wrapper.sh
+++ /dev/null
@@ -1,33 +0,0 @@
-#!/bin/sh
-# SPDX-License-Identifier: GPL-2.0+
-#
-# Figure out if we should follow a specific parallelism from the make
-# environment (as exported by scripts/jobserver-exec), or fall back to
-# the "auto" parallelism when "-jN" is not specified at the top-level
-# "make" invocation.
-
-sphinx="$1"
-shift || true
-
-parallel="$PARALLELISM"
-if [ -z "$parallel" ] ; then
-	# If no parallelism is specified at the top-level make, then
-	# fall back to the expected "-jauto" mode that the "htmldocs"
-	# target has had.
-	auto=$(perl -e 'open IN,"'"$sphinx"' --version 2>&1 |";
-			while (<IN>) {
-				if (m/([\d\.]+)/) {
-					print "auto" if ($1 >= "1.7")
-				}
-			}
-			close IN')
-	if [ -n "$auto" ] ; then
-		parallel="$auto"
-	fi
-fi
-# Only if some parallelism has been determined do we add the -jN option.
-if [ -n "$parallel" ] ; then
-	parallel="-j$parallel"
-fi
-
-exec "$sphinx" $parallel "$@"
-- 
2.51.0


^ permalink raw reply related	[flat|nested] 16+ messages in thread

* [PATCH RESEND v3 10/15] docs: Makefile: document latex/PDF PAPER= parameter
  2025-09-01 15:33 [PATCH RESEND v3 00/15] Split sphinx call logic from docs Makefile Mauro Carvalho Chehab
                   ` (8 preceding siblings ...)
  2025-09-01 15:33 ` [PATCH RESEND v3 09/15] docs: parallel-wrapper.sh: remove script Mauro Carvalho Chehab
@ 2025-09-01 15:33 ` Mauro Carvalho Chehab
  2025-09-01 15:33 ` [PATCH RESEND v3 11/15] tools/docs: sphinx-build-wrapper: add an argument for LaTeX interactive mode Mauro Carvalho Chehab
                   ` (4 subsequent siblings)
  14 siblings, 0 replies; 16+ messages in thread
From: Mauro Carvalho Chehab @ 2025-09-01 15:33 UTC (permalink / raw)
  To: Jonathan Corbet, Linux Doc Mailing List
  Cc: Mauro Carvalho Chehab, Mauro Carvalho Chehab, linux-kernel

While the build system has supported this for a long time, it was
never documented. Add documentation for it.
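
For example (just an illustration of the parameter being documented):

	$ make PAPER=a4 pdfdocs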

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 Documentation/Makefile | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/Documentation/Makefile b/Documentation/Makefile
index 2b0ed8cd5ea8..3e1cb44a5fbb 100644
--- a/Documentation/Makefile
+++ b/Documentation/Makefile
@@ -124,4 +124,6 @@ dochelp:
 	@echo
 	@echo  '  make DOCS_CSS={a .css file} adds a DOCS_CSS override file for html/epub output.'
 	@echo
+	@echo  '  make PAPER={a4|letter} Specifies the paper size used for LaTeX/PDF output.'
+	@echo
 	@echo  '  Default location for the generated documents is Documentation/output'
-- 
2.51.0


^ permalink raw reply related	[flat|nested] 16+ messages in thread

* [PATCH RESEND v3 11/15] tools/docs: sphinx-build-wrapper: add an argument for LaTeX interactive mode
  2025-09-01 15:33 [PATCH RESEND v3 00/15] Split sphinx call logic from docs Makefile Mauro Carvalho Chehab
                   ` (9 preceding siblings ...)
  2025-09-01 15:33 ` [PATCH RESEND v3 10/15] docs: Makefile: document latex/PDF PAPER= parameter Mauro Carvalho Chehab
@ 2025-09-01 15:33 ` Mauro Carvalho Chehab
  2025-09-01 15:33 ` [PATCH RESEND v3 12/15] tools/docs,scripts: sphinx-*: prevent sphinx-build crashes Mauro Carvalho Chehab
                   ` (3 subsequent siblings)
  14 siblings, 0 replies; 16+ messages in thread
From: Mauro Carvalho Chehab @ 2025-09-01 15:33 UTC (permalink / raw)
  To: Jonathan Corbet, Linux Doc Mailing List
  Cc: Mauro Carvalho Chehab, Mauro Carvalho Chehab, linux-kernel

By default, we use LaTeX batch mode to build docs. This way, when
an error happens, the build fails. This is good for normal builds,
but when debugging problems with PDF generation, it is best to
use interactive mode.

We already support it via LATEXOPTS, but having a command line
argument makes it easier:

Interactive mode:
	./scripts/sphinx-build-wrapper pdfdocs --sphinxdirs peci -v -i
	...
	Running 'xelatex --no-pdf  -no-pdf -recorder  ".../Documentation/output/peci/latex/peci.tex"'
	...

Default batch mode:
        ./scripts/sphinx-build-wrapper pdfdocs --sphinxdirs peci -v
	...
	Running 'xelatex --no-pdf  -no-pdf -interaction=batchmode -no-shell-escape -recorder  ".../Documentation/output/peci/latex/peci.tex"'
	...

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 tools/docs/sphinx-build-wrapper | 13 ++++++++++---
 1 file changed, 10 insertions(+), 3 deletions(-)

diff --git a/tools/docs/sphinx-build-wrapper b/tools/docs/sphinx-build-wrapper
index 9c5df50fb99d..4e87584a92cd 100755
--- a/tools/docs/sphinx-build-wrapper
+++ b/tools/docs/sphinx-build-wrapper
@@ -115,7 +115,7 @@ class SphinxBuilder:
 
         return path
 
-    def __init__(self, venv=None, verbose=False, n_jobs=None):
+    def __init__(self, venv=None, verbose=False, n_jobs=None, interactive=None):
         """Initialize internal variables"""
         self.venv = venv
         self.verbose = None
@@ -126,7 +126,11 @@ class SphinxBuilder:
         self.kernelversion = os.environ.get("KERNELVERSION", "unknown")
         self.kernelrelease = os.environ.get("KERNELRELEASE", "unknown")
         self.pdflatex = os.environ.get("PDFLATEX", "xelatex")
-        self.latexopts = os.environ.get("LATEXOPTS", "-interaction=batchmode -no-shell-escape")
+
+        if not interactive:
+            self.latexopts = os.environ.get("LATEXOPTS", "-interaction=batchmode -no-shell-escape")
+        else:
+            self.latexopts = os.environ.get("LATEXOPTS", "")
 
         if not verbose:
             verbose = bool(os.environ.get("KBUILD_VERBOSE", "") != "")
@@ -548,6 +552,9 @@ def main():
     parser.add_argument('-j', '--jobs', type=jobs_type,
                         help="Sets number of jobs to use with sphinx-build")
 
+    parser.add_argument('-i', '--interactive', action='store_true',
+                        help="Change latex default to run in interactive mode")
+
     parser.add_argument("-V", "--venv", nargs='?', const=f'{VENV_DEFAULT}',
                         default=None,
                         help=f'If used, run Sphinx from a venv dir (default dir: {VENV_DEFAULT})')
@@ -557,7 +564,7 @@ def main():
     PythonVersion.check_python(MIN_PYTHON_VERSION)
 
     builder = SphinxBuilder(venv=args.venv, verbose=args.verbose,
-                            n_jobs=args.jobs)
+                            n_jobs=args.jobs, interactive=args.interactive)
 
     builder.build(args.target, sphinxdirs=args.sphinxdirs, conf=args.conf,
                   theme=args.theme, css=args.css, paper=args.paper)
-- 
2.51.0


^ permalink raw reply related	[flat|nested] 16+ messages in thread

* [PATCH RESEND v3 12/15] tools/docs,scripts: sphinx-*: prevent sphinx-build crashes
  2025-09-01 15:33 [PATCH RESEND v3 00/15] Split sphinx call logic from docs Makefile Mauro Carvalho Chehab
                   ` (10 preceding siblings ...)
  2025-09-01 15:33 ` [PATCH RESEND v3 11/15] tools/docs: sphinx-build-wrapper: add an argument for LaTeX interactive mode Mauro Carvalho Chehab
@ 2025-09-01 15:33 ` Mauro Carvalho Chehab
  2025-09-01 15:33 ` [PATCH RESEND v3 13/15] tools/docs: sphinx-build-wrapper: allow building PDF files in parallel Mauro Carvalho Chehab
                   ` (2 subsequent siblings)
  14 siblings, 0 replies; 16+ messages in thread
From: Mauro Carvalho Chehab @ 2025-09-01 15:33 UTC (permalink / raw)
  To: Jonathan Corbet, Linux Doc Mailing List
  Cc: Mauro Carvalho Chehab, Mauro Carvalho Chehab, linux-kernel

On a properly set up system, LANG and LC_ALL are always defined.
However, some distros, like Debian, Gentoo and their variants,
start with them undefined.

When Sphinx tries to set a locale with:

	locale.setlocale(locale.LC_ALL, '')

it raises an exception, making Sphinx fail. This is more likely
to happen in test containers.

Add logic to detect and work around the issue by setting the
locale to C.
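
The workaround boils down to the following pattern (a sketch; the
actual code is in the diff below):

	try:
	    locale.setlocale(locale.LC_ALL, '')
	except locale.Error:
	    env["LC_ALL"] = "C"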

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 tools/docs/sphinx-build-wrapper | 11 +++++++++++
 tools/docs/sphinx-pre-install   | 14 +++++++++++++-
 2 files changed, 24 insertions(+), 1 deletion(-)

diff --git a/tools/docs/sphinx-build-wrapper b/tools/docs/sphinx-build-wrapper
index 4e87584a92cd..580582a76c93 100755
--- a/tools/docs/sphinx-build-wrapper
+++ b/tools/docs/sphinx-build-wrapper
@@ -45,6 +45,7 @@ the newer version.
 """
 
 import argparse
+import locale
 import os
 import shlex
 import shutil
@@ -439,6 +440,16 @@ class SphinxBuilder:
         if not sphinxdirs:
             sphinxdirs = os.environ.get("SPHINXDIRS", ".")
 
+        #
+        # The sphinx-build tool has a bug: internally, it tries to set
+        # locale with locale.setlocale(locale.LC_ALL, ''). This causes a
+        # crash if language is not set. Detect and fix it.
+        #
+        try:
+            locale.setlocale(locale.LC_ALL, '')
+        except locale.Error:
+            self.env["LC_ALL"] = "C"
+
         #
         # sphinxdirs can be a list or a whitespace-separated string
         #
diff --git a/tools/docs/sphinx-pre-install b/tools/docs/sphinx-pre-install
index d6d673b7945c..663d4e2a3f57 100755
--- a/tools/docs/sphinx-pre-install
+++ b/tools/docs/sphinx-pre-install
@@ -26,6 +26,7 @@ system pacage install is recommended.
 """
 
 import argparse
+import locale
 import os
 import re
 import subprocess
@@ -422,8 +423,19 @@ class MissingCheckers(AncillaryMethods):
         """
         Gets sphinx-build version.
         """
+        env = os.environ.copy()
+
+        # The sphinx-build tool has a bug: internally, it tries to set
+        # locale with locale.setlocale(locale.LC_ALL, ''). This causes a
+        # crash if language is not set. Detect and fix it.
         try:
-            result = self.run([cmd, "--version"],
+            locale.setlocale(locale.LC_ALL, '')
+        except Exception:
+            env["LC_ALL"] = "C"
+            env["LANG"] = "C"
+
+        try:
+            result = self.run([cmd, "--version"], env=env,
                               stdout=subprocess.PIPE,
                               stderr=subprocess.STDOUT,
                               text=True, check=True)
-- 
2.51.0


^ permalink raw reply related	[flat|nested] 16+ messages in thread

* [PATCH RESEND v3 13/15] tools/docs: sphinx-build-wrapper: allow building PDF files in parallel
  2025-09-01 15:33 [PATCH RESEND v3 00/15] Split sphinx call logic from docs Makefile Mauro Carvalho Chehab
                   ` (11 preceding siblings ...)
  2025-09-01 15:33 ` [PATCH RESEND v3 12/15] tools/docs,scripts: sphinx-*: prevent sphinx-build crashes Mauro Carvalho Chehab
@ 2025-09-01 15:33 ` Mauro Carvalho Chehab
  2025-09-01 15:33 ` [PATCH RESEND v3 14/15] docs: add support to build manpages from kerneldoc output Mauro Carvalho Chehab
  2025-09-01 15:33 ` [PATCH RESEND v3 15/15] tools: kernel-doc: add a see also section at man pages Mauro Carvalho Chehab
  14 siblings, 0 replies; 16+ messages in thread
From: Mauro Carvalho Chehab @ 2025-09-01 15:33 UTC (permalink / raw)
  To: Jonathan Corbet, Linux Doc Mailing List
  Cc: Mauro Carvalho Chehab, Mauro Carvalho Chehab, linux-kernel

Use the POSIX jobserver when available, or -j<number>, to run PDF
builds in parallel, restoring PDF build performance. Yet, running
builds in parallel while debugging problems is a bad idea, so, when
called directly from the command line, the build is serialized
unless "-j" is explicitly requested.

With this change, a PDF docs build now takes around 5 minutes
on a Ryzen 9 machine with 32 CPU threads:

	# Explicitly parallelize both Sphinx and LaTeX PDF builds
	$ make cleandocs; time scripts/sphinx-build-wrapper pdfdocs -j 33

	real	5m17.901s
	user	15m1.499s
	sys	2m31.482s

	# Use POSIX jobserver to parallelize both sphinx-build and LaTeX
	$ make cleandocs; time make pdfdocs

	real	5m22.369s
	user	15m9.076s
	sys	2m31.419s

	# Serializes the PDF build, while keeping Sphinx parallelized.
	# It is equivalent to passing -jauto via the command line
	$ make cleandocs; time scripts/sphinx-build-wrapper pdfdocs

	real	11m20.901s
	user	13m2.910s
	sys	1m44.553s
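
The parallelism itself is implemented with a thread pool, one LaTeX run
per .tex file. A minimal sketch of the pattern (assuming tex_files,
latex_cmd and n_jobs are already set up; the real code below also moves
the resulting PDFs and prints a summary):

	from concurrent import futures
	import subprocess

	def build_one(latex_cmd, cwd, path):
	    # LaTeX exit codes are unreliable; success is checked via the PDF later
	    return subprocess.run(latex_cmd + [path], cwd=cwd, check=False)

	with futures.ThreadPoolExecutor(max_workers=n_jobs) as executor:
	    jobs = {executor.submit(build_one, latex_cmd, d, f): f
	            for d, f in tex_files}
	    for job in futures.as_completed(jobs):
	        job.result()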

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 tools/docs/sphinx-build-wrapper | 158 +++++++++++++++++++++++---------
 1 file changed, 117 insertions(+), 41 deletions(-)

diff --git a/tools/docs/sphinx-build-wrapper b/tools/docs/sphinx-build-wrapper
index 580582a76c93..c884022ad733 100755
--- a/tools/docs/sphinx-build-wrapper
+++ b/tools/docs/sphinx-build-wrapper
@@ -52,6 +52,8 @@ import shutil
 import subprocess
 import sys
 
+from concurrent import futures
+
 from lib.python_version import PythonVersion
 
 LIB_DIR = "../../scripts/lib"
@@ -278,6 +280,82 @@ class SphinxBuilder:
         except (OSError, IOError) as e:
             print(f"Warning: Failed to copy CSS: {e}", file=sys.stderr)
 
+    def build_pdf_file(self, latex_cmd, from_dir, path):
+        """Builds a single pdf file using latex_cmd"""
+        try:
+            subprocess.run(latex_cmd + [path],
+                            cwd=from_dir, check=True)
+
+            return True
+        except subprocess.CalledProcessError:
+            return False
+
+    def pdf_parallel_build(self, tex_suffix, latex_cmd, tex_files, n_jobs):
+        """Build PDF files in parallel if possible"""
+        builds = {}
+        build_failed = False
+        max_len = 0
+        has_tex = False
+
+        #
+        # LaTeX PDF error code is almost useless for us:
+        # any warning makes it non-zero. For kernel doc builds it always returns
+        # non-zero even when the build succeeds. So, let's do the next best thing:
+        # Ignore build errors. At the end, check if all PDF files were built,
+        # printing a summary with the built ones and returning 0 if all of
+        # them were actually built.
+        #
+        with futures.ThreadPoolExecutor(max_workers=n_jobs) as executor:
+            jobs = {}
+
+            for from_dir, pdf_dir, entry in tex_files:
+                name = entry.name
+
+                if not name.endswith(tex_suffix):
+                    continue
+
+                name = name[:-len(tex_suffix)]
+
+                max_len = max(max_len, len(name))
+
+                has_tex = True
+
+                future = executor.submit(self.build_pdf_file, latex_cmd,
+                                         from_dir, entry.path)
+                jobs[future] = (from_dir, pdf_dir, name)
+
+            for future in futures.as_completed(jobs):
+                from_dir, pdf_dir, name = jobs[future]
+
+                pdf_name = name + ".pdf"
+                pdf_from = os.path.join(from_dir, pdf_name)
+
+                try:
+                    success = future.result()
+
+                    if success and os.path.exists(pdf_from):
+                        pdf_to = os.path.join(pdf_dir, pdf_name)
+
+                        os.rename(pdf_from, pdf_to)
+                        builds[name] = os.path.relpath(pdf_to, self.builddir)
+                    else:
+                        builds[name] = "FAILED"
+                        build_failed = True
+                except futures.Error as e:
+                    builds[name] = f"FAILED ({repr(e)})"
+                    build_failed = True
+
+        #
+        # Handle case where no .tex files were found
+        #
+        if not has_tex:
+            name = "Sphinx LaTeX builder"
+            max_len = max(max_len, len(name))
+            builds[name] = "FAILED (no .tex file was generated)"
+            build_failed = True
+
+        return builds, build_failed, max_len
+
     def handle_pdf(self, output_dirs):
         """
         Extra steps for PDF output.
@@ -288,7 +366,9 @@ class SphinxBuilder:
         """
         builds = {}
         max_len = 0
+        tex_suffix = ".tex"
 
+        tex_files = []
         for from_dir in output_dirs:
             pdf_dir = os.path.join(from_dir, "../pdf")
             os.makedirs(pdf_dir, exist_ok=True)
@@ -300,55 +380,51 @@ class SphinxBuilder:
 
             latex_cmd.extend(shlex.split(self.latexopts))
 
-            tex_suffix = ".tex"
-
-            #
-            # Process each .tex file
-            #
-
-            has_tex = False
-            build_failed = False
+            # Get a list of tex files to process
             with os.scandir(from_dir) as it:
                 for entry in it:
-                    if not entry.name.endswith(tex_suffix):
-                        continue
+                    if entry.name.endswith(tex_suffix):
+                        tex_files.append((from_dir, pdf_dir, entry))
 
-                    name = entry.name[:-len(tex_suffix)]
-                    has_tex = True
+        #
+        # When using make, this won't be used, as the number of jobs comes
+        # from the POSIX jobserver. So, this covers the case where the build
+        # comes from the command line. In that case, serialize by default,
+        # unless the user explicitly sets the number of jobs.
+        #
+        n_jobs = 1
 
-                    #
-                    # LaTeX PDF error code is almost useless for us:
-                    # any warning makes it non-zero. For kernel doc builds it
-                    # always return non-zero even when build succeeds.
-                    # So, let's do the best next thing: check if all PDF
-                    # files were built. If they're, print a summary and
-                    # return 0 at the end of this function
-                    #
-                    try:
-                        subprocess.run(latex_cmd + [entry.path],
-                                       cwd=from_dir, check=True)
-                    except subprocess.CalledProcessError:
-                        pass
+        # n_jobs is either an integer or "auto". Only use it if it is a number
+        if self.n_jobs:
+            try:
+                n_jobs = int(self.n_jobs)
+            except ValueError:
+                pass
 
-                    pdf_name = name + ".pdf"
-                    pdf_from = os.path.join(from_dir, pdf_name)
-                    pdf_to = os.path.join(pdf_dir, pdf_name)
+        #
+        # When using make, jobserver.claim is the number of jobs that were
+        # used with "-j" and that aren't used by other make targets
+        #
+        with JobserverExec() as jobserver:
+            n_jobs = 1
 
-                    if os.path.exists(pdf_from):
-                        os.rename(pdf_from, pdf_to)
-                        builds[name] = os.path.relpath(pdf_to, self.builddir)
-                    else:
-                        builds[name] = "FAILED"
-                        build_failed = True
+            #
+            # Handle the case when a parameter is passed via command line,
+            # using it as default, if jobserver doesn't claim anything
+            #
+            if self.n_jobs:
+                try:
+                    n_jobs = int(self.n_jobs)
+                except ValueError:
+                    pass
 
-                    name = entry.name.removesuffix(".tex")
-                    max_len = max(max_len, len(name))
+            if jobserver.claim:
+                n_jobs = jobserver.claim
 
-            if not has_tex:
-                name = os.path.basename(from_dir)
-                max_len = max(max_len, len(name))
-                builds[name] = "FAILED (no .tex)"
-                build_failed = True
+            builds, build_failed, max_len = self.pdf_parallel_build(tex_suffix,
+                                                                    latex_cmd,
+                                                                    tex_files,
+                                                                    n_jobs)
 
         msg = "Summary"
         msg += "\n" + "=" * len(msg)
-- 
2.51.0
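
For readers following along, the job-count selection above reduces to:
serialize by default, honor a numeric -j passed on the command line, and
let whatever a parent make's jobserver claims override both. Below is a
minimal, self-contained sketch of that logic plus a possible futures-based
dispatch; pdf_parallel_build() itself is not part of this hunk, so
pick_n_jobs(), run_latex(), the "claimed" parameter and the tex_files
tuples are illustrative names, not the wrapper's API:

  # Sketch only, under the assumptions stated above; "claimed" stands in
  # for jobserver.claim from the series' JobserverExec helper.
  import subprocess
  from concurrent import futures

  def pick_n_jobs(n_jobs_arg, claimed):
      n_jobs = 1                        # serialize by default (command line)
      if n_jobs_arg:
          try:
              n_jobs = int(n_jobs_arg)  # honor an explicit numeric value
          except ValueError:
              pass                      # "auto" keeps the default
      if claimed:                       # slots claimed from a parent make -j
          n_jobs = claimed
      return n_jobs

  def run_latex(latex_cmd, from_dir, tex_path):
      # The LaTeX exit code is ignored, as in the wrapper: success is
      # judged later by whether the expected PDF file exists.
      try:
          subprocess.run(latex_cmd + [tex_path], cwd=from_dir, check=True)
      except subprocess.CalledProcessError:
          pass

  def build_pdfs(latex_cmd, tex_files, n_jobs_arg=None, claimed=0):
      n_jobs = pick_n_jobs(n_jobs_arg, claimed)
      with futures.ThreadPoolExecutor(max_workers=n_jobs) as pool:
          for from_dir, _pdf_dir, entry in tex_files:
              # Moving the resulting PDFs into _pdf_dir is omitted here
              pool.submit(run_latex, latex_cmd, from_dir, entry.path)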



* [PATCH RESEND v3 14/15] docs: add support to build manpages from kerneldoc output
  2025-09-01 15:33 [PATCH RESEND v3 00/15] Split sphinx call logic from docs Makefile Mauro Carvalho Chehab
                   ` (12 preceding siblings ...)
  2025-09-01 15:33 ` [PATCH RESEND v3 13/15] tools/docs: sphinx-build-wrapper: allow building PDF files in parallel Mauro Carvalho Chehab
@ 2025-09-01 15:33 ` Mauro Carvalho Chehab
  2025-09-01 15:33 ` [PATCH RESEND v3 15/15] tools: kernel-doc: add a see also section at man pages Mauro Carvalho Chehab
  14 siblings, 0 replies; 16+ messages in thread
From: Mauro Carvalho Chehab @ 2025-09-01 15:33 UTC (permalink / raw)
  To: Jonathan Corbet, Linux Doc Mailing List
  Cc: Mauro Carvalho Chehab, Thomas Weißschuh, Alice Ryhl,
	Masahiro Yamada, Mauro Carvalho Chehab, Miguel Ojeda,
	Nathan Chancellor, Nicolas Schier, Randy Dunlap, Tamir Duberstein,
	linux-kbuild, linux-kernel

Generating man pages currently requires running a separate
script, and the target doesn't appear in the docs Makefile.

Add support for mandocs in the Makefile, adding the build
logic inside sphinx-build-wrapper, updating the documentation
and dropping the ancillary script.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 Documentation/Makefile                 |  3 +-
 Documentation/doc-guide/kernel-doc.rst | 29 ++++-----
 Makefile                               |  2 +-
 scripts/split-man.pl                   | 28 ---------
 tools/docs/sphinx-build-wrapper        | 81 ++++++++++++++++++++++++--
 5 files changed, 95 insertions(+), 48 deletions(-)
 delete mode 100755 scripts/split-man.pl

diff --git a/Documentation/Makefile b/Documentation/Makefile
index 3e1cb44a5fbb..22e39e5ed07d 100644
--- a/Documentation/Makefile
+++ b/Documentation/Makefile
@@ -53,7 +53,7 @@ ifeq ($(HAVE_SPHINX),0)
 else # HAVE_SPHINX
 
 # Common documentation targets
-infodocs texinfodocs latexdocs epubdocs xmldocs pdfdocs linkcheckdocs:
+mandocs infodocs texinfodocs latexdocs epubdocs xmldocs pdfdocs linkcheckdocs:
 	$(Q)@$(srctree)/tools/docs/sphinx-pre-install --version-check
 	+$(Q)$(PYTHON3) $(BUILD_WRAPPER) $@ \
 		--sphinxdirs="$(SPHINXDIRS)" --conf=$(SPHINX_CONF) \
@@ -104,6 +104,7 @@ dochelp:
 	@echo  '  htmldocs        - HTML'
 	@echo  '  texinfodocs     - Texinfo'
 	@echo  '  infodocs        - Info'
+	@echo  '  mandocs         - Man pages'
 	@echo  '  latexdocs       - LaTeX'
 	@echo  '  pdfdocs         - PDF'
 	@echo  '  epubdocs        - EPUB'
diff --git a/Documentation/doc-guide/kernel-doc.rst b/Documentation/doc-guide/kernel-doc.rst
index af9697e60165..4370cc8fbcf5 100644
--- a/Documentation/doc-guide/kernel-doc.rst
+++ b/Documentation/doc-guide/kernel-doc.rst
@@ -579,20 +579,23 @@ source.
 How to use kernel-doc to generate man pages
 -------------------------------------------
 
-If you just want to use kernel-doc to generate man pages you can do this
-from the kernel git tree::
+To generate man pages for all files that contain kernel-doc markups, run::
 
-  $ scripts/kernel-doc -man \
-    $(git grep -l '/\*\*' -- :^Documentation :^tools) \
-    | scripts/split-man.pl /tmp/man
+  $ make mandocs
 
-Some older versions of git do not support some of the variants of syntax for
-path exclusion.  One of the following commands may work for those versions::
+Or by calling ``sphinx-build-wrapper`` directly::
 
-  $ scripts/kernel-doc -man \
-    $(git grep -l '/\*\*' -- . ':!Documentation' ':!tools') \
-    | scripts/split-man.pl /tmp/man
+  $ ./tools/docs/sphinx-build-wrapper mandocs
 
-  $ scripts/kernel-doc -man \
-    $(git grep -l '/\*\*' -- . ":(exclude)Documentation" ":(exclude)tools") \
-    | scripts/split-man.pl /tmp/man
+The output will be in the ``man`` directory inside the output directory
+(by default: ``Documentation/output``).
+
+Optionally, it is possible to generate a partial set of man pages by
+using SPHINXDIRS::
+
+  $ make SPHINXDIRS=driver-api/media mandocs
+
+.. note::
+
+   When SPHINXDIRS={subdir} is used, man pages are only generated for
+   source files explicitly referenced from ``Documentation/{subdir}/.../*.rst``.
diff --git a/Makefile b/Makefile
index 6bfe776bf3c5..9bd44afeda26 100644
--- a/Makefile
+++ b/Makefile
@@ -1800,7 +1800,7 @@ $(help-board-dirs): help-%:
 # Documentation targets
 # ---------------------------------------------------------------------------
 DOC_TARGETS := xmldocs latexdocs pdfdocs htmldocs epubdocs cleandocs \
-	       linkcheckdocs dochelp refcheckdocs texinfodocs infodocs
+	       linkcheckdocs dochelp refcheckdocs texinfodocs infodocs mandocs
 PHONY += $(DOC_TARGETS)
 $(DOC_TARGETS):
 	$(Q)$(MAKE) $(build)=Documentation $@
diff --git a/scripts/split-man.pl b/scripts/split-man.pl
deleted file mode 100755
index 96bd99dc977a..000000000000
--- a/scripts/split-man.pl
+++ /dev/null
@@ -1,28 +0,0 @@
-#!/usr/bin/env perl
-# SPDX-License-Identifier: GPL-2.0
-#
-# Author: Mauro Carvalho Chehab <mchehab+samsung@kernel.org>
-#
-# Produce manpages from kernel-doc.
-# See Documentation/doc-guide/kernel-doc.rst for instructions
-
-if ($#ARGV < 0) {
-   die "where do I put the results?\n";
-}
-
-mkdir $ARGV[0],0777;
-$state = 0;
-while (<STDIN>) {
-    if (/^\.TH \"[^\"]*\" 9 \"([^\"]*)\"/) {
-	if ($state == 1) { close OUT }
-	$state = 1;
-	$fn = "$ARGV[0]/$1.9";
-	print STDERR "Creating $fn\n";
-	open OUT, ">$fn" or die "can't open $fn: $!\n";
-	print OUT $_;
-    } elsif ($state != 0) {
-	print OUT $_;
-    }
-}
-
-close OUT;
diff --git a/tools/docs/sphinx-build-wrapper b/tools/docs/sphinx-build-wrapper
index c884022ad733..932b1b675274 100755
--- a/tools/docs/sphinx-build-wrapper
+++ b/tools/docs/sphinx-build-wrapper
@@ -47,6 +47,7 @@ the newer version.
 import argparse
 import locale
 import os
+import re
 import shlex
 import shutil
 import subprocess
@@ -55,6 +56,7 @@ import sys
 from concurrent import futures
 
 from lib.python_version import PythonVersion
+from glob import glob
 
 LIB_DIR = "../../scripts/lib"
 SRC_DIR = os.path.dirname(os.path.realpath(__file__))
@@ -77,6 +79,7 @@ TARGETS = {
     "epubdocs":      { "builder": "epub",    "out_dir": "epub" },
     "texinfodocs":   { "builder": "texinfo", "out_dir": "texinfo" },
     "infodocs":      { "builder": "texinfo", "out_dir": "texinfo" },
+    "mandocs":       { "builder": "man",     "out_dir": "man" },
     "latexdocs":     { "builder": "latex",   "out_dir": "latex" },
     "pdfdocs":       { "builder": "latex",   "out_dir": "latex" },
     "xmldocs":       { "builder": "xml",     "out_dir": "xml" },
@@ -455,6 +458,71 @@ class SphinxBuilder:
             except subprocess.CalledProcessError as e:
                 sys.exit(f"Error generating info docs: {e}")
 
+    def handle_man(self, kerneldoc, docs_dir, src_dir, output_dir):
+        """
+        Create man pages from kernel-doc output
+        """
+
+        re_kernel_doc = re.compile(r"^\.\.\s+kernel-doc::\s*(\S+)")
+        re_man = re.compile(r'^\.TH "[^"]*" (\d+) "([^"]*)"')
+
+        if docs_dir == src_dir:
+            #
+            # Pick the entire set of kernel-doc markups from the entire tree
+            #
+            kdoc_files = set([self.srctree])
+        else:
+            kdoc_files = set()
+
+            for fname in glob(os.path.join(src_dir, "**"), recursive=True):
+                if os.path.isfile(fname) and fname.endswith(".rst"):
+                    with open(fname, "r", encoding="utf-8") as in_fp:
+                        data = in_fp.read()
+
+                    for line in data.split("\n"):
+                        match = re_kernel_doc.match(line)
+                        if match:
+                            if os.path.isfile(match.group(1)):
+                                kdoc_files.add(match.group(1))
+
+        if not kdoc_files:
+            sys.exit(f"Directory {src_dir} doesn't contain kernel-doc tags")
+
+        cmd = [kerneldoc, "-m"] + sorted(kdoc_files)
+        try:
+            if self.verbose:
+                print(" ".join(cmd))
+
+            result = subprocess.run(cmd, stdout=subprocess.PIPE, text=True)
+
+            if result.returncode:
+                print(f"Warning: kernel-doc returned {result.returncode} warnings")
+
+        except (OSError, ValueError, subprocess.SubprocessError) as e:
+            sys.exit(f"Failed to create man pages for {src_dir}: {repr(e)}")
+
+        fp = None
+        try:
+            for line in result.stdout.split("\n"):
+                match = re_man.match(line)
+                if not match:
+                    if fp:
+                        fp.write(line + '\n')
+                    continue
+
+                if fp:
+                    fp.close()
+
+                fname = f"{output_dir}/{match.group(2)}.{match.group(1)}"
+
+                if self.verbose:
+                    print(f"Creating {fname}")
+                fp = open(fname, "w", encoding="utf-8")
+                fp.write(line + '\n')
+        finally:
+            if fp:
+                fp.close()
+
     def cleandocs(self, builder):           # pylint: disable=W0613
         """Remove documentation output directory"""
         shutil.rmtree(self.builddir, ignore_errors=True)
@@ -483,7 +551,7 @@ class SphinxBuilder:
         # Other targets require sphinx-build, so check if it exists
         #
         sphinxbuild = shutil.which(self.sphinxbuild, path=self.env["PATH"])
-        if not sphinxbuild:
+        if not sphinxbuild and target != "mandocs":
             sys.exit(f"Error: {self.sphinxbuild} not found in PATH.\n")
 
         if builder == "latex":
@@ -572,10 +640,13 @@ class SphinxBuilder:
                 output_dir,
             ]
 
-            try:
-                self.run_sphinx(sphinxbuild, build_args, env=self.env)
-            except (OSError, ValueError, subprocess.SubprocessError) as e:
-                sys.exit(f"Build failed: {repr(e)}")
+            if target == "mandocs":
+                self.handle_man(kerneldoc, docs_dir, src_dir, output_dir)
+            else:
+                try:
+                    self.run_sphinx(sphinxbuild, build_args, env=self.env)
+                except (OSError, ValueError, subprocess.SubprocessError) as e:
+                    sys.exit(f"Build failed: {repr(e)}")
 
             #
             # Ensure that each html/epub output will have needed static files
-- 
2.51.0
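
As an aside on how handle_man() splits its input: kernel-doc -m emits all
man pages as one roff stream, and each page starts with a .TH line whose
quoted fields carry the page name and section, which is what re_man
captures above. A reduced, self-contained sketch of that splitting step
(the sample input is fabricated, /tmp/man is only an example path, and
kernel-doc itself is not invoked here):

  # Sketch of the .TH-based split done by handle_man(); not the wrapper code.
  import os
  import re

  re_man = re.compile(r'^\.TH "[^"]*" (\d+) "([^"]*)"')

  sample = "\n".join([
      '.TH "edac" 9 "edac_pci_add_device" "August 2025" "API Manual" LINUX',
      '.SH NAME',
      'edac_pci_add_device \\- add a device',
      '.TH "edac" 9 "edac_pci_del_device" "August 2025" "API Manual" LINUX',
      '.SH NAME',
      'edac_pci_del_device \\- remove a device',
  ])

  def split_man(stream, output_dir):
      os.makedirs(output_dir, exist_ok=True)
      fp = None
      for line in stream.split("\n"):
          match = re_man.match(line)
          if match:                 # a new page: open <name>.<section>
              if fp:
                  fp.close()
              fp = open(os.path.join(output_dir,
                                     f"{match.group(2)}.{match.group(1)}"),
                        "w", encoding="utf-8")
          if fp:                    # anything before the first .TH is dropped
              fp.write(line + "\n")
      if fp:
          fp.close()

  split_man(sample, "/tmp/man")     # creates edac_pci_add_device.9 and
                                    # edac_pci_del_device.9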



* [PATCH RESEND v3 15/15] tools: kernel-doc: add a see also section at man pages
  2025-09-01 15:33 [PATCH RESEND v3 00/15] Split sphinx call logic from docs Makefile Mauro Carvalho Chehab
                   ` (13 preceding siblings ...)
  2025-09-01 15:33 ` [PATCH RESEND v3 14/15] docs: add support to build manpages from kerneldoc output Mauro Carvalho Chehab
@ 2025-09-01 15:33 ` Mauro Carvalho Chehab
  14 siblings, 0 replies; 16+ messages in thread
From: Mauro Carvalho Chehab @ 2025-09-01 15:33 UTC (permalink / raw)
  To: Jonathan Corbet, Linux Doc Mailing List
  Cc: Mauro Carvalho Chehab, Mauro Carvalho Chehab, linux-kernel

While cross-references are complex, as related symbols can be in
different files, we can at least correlate the ones that belong
to the same file, adding a SEE ALSO section for them.

The result is not bad. See for instance:

	$ tools/docs/sphinx-build-wrapper --sphinxdirs driver-api/media -- mandocs
	$ man Documentation/output/driver-api/man/edac_pci_add_device.9

	edac_pci_add_device(9)  Kernel Hacker's Manual  edac_pci_add_device(9)

	NAME
	       edac_pci_add_device  - Insert the 'edac_dev' structure into the
	       edac_pci global list and create sysfs entries  associated  with
	       edac_pci structure.

	SYNOPSIS
	       int  edac_pci_add_device  (struct  edac_pci_ctl_info *pci , int
	       edac_idx );

	ARGUMENTS
	       pci         pointer to the edac_device structure to be added to
	                   the list

	       edac_idx    A unique numeric identifier to be assigned to the

	RETURN
	       0 on Success, or an error code on failure

	SEE ALSO
	       edac_pci_alloc_ctl_info(9),          edac_pci_free_ctl_info(9),
	       edac_pci_alloc_index(9),  edac_pci_del_device(9), edac_pci_cre‐
	       ate_generic_ctl(9),            edac_pci_release_generic_ctl(9),
	       edac_pci_create_sysfs(9), edac_pci_remove_sysfs(9)

	August 2025               edac_pci_add_device   edac_pci_add_device(9)

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 scripts/lib/kdoc/kdoc_files.py  |  5 +-
 scripts/lib/kdoc/kdoc_output.py | 84 +++++++++++++++++++++++++++++++--
 2 files changed, 83 insertions(+), 6 deletions(-)

diff --git a/scripts/lib/kdoc/kdoc_files.py b/scripts/lib/kdoc/kdoc_files.py
index 9e09b45b02fa..061c033f32da 100644
--- a/scripts/lib/kdoc/kdoc_files.py
+++ b/scripts/lib/kdoc/kdoc_files.py
@@ -275,7 +275,10 @@ class KernelFiles():
                 self.config.log.warning("No kernel-doc for file %s", fname)
                 continue
 
-            for arg in self.results[fname]:
+            symbols = self.results[fname]
+            self.out_style.set_symbols(symbols)
+
+            for arg in symbols:
                 m = self.out_msg(fname, arg.name, arg)
 
                 if m is None:
diff --git a/scripts/lib/kdoc/kdoc_output.py b/scripts/lib/kdoc/kdoc_output.py
index ea8914537ba0..1eca9a918558 100644
--- a/scripts/lib/kdoc/kdoc_output.py
+++ b/scripts/lib/kdoc/kdoc_output.py
@@ -215,6 +215,9 @@ class OutputFormat:
 
     # Virtual methods to be overridden by inherited classes
     # At the base class, those do nothing.
+    def set_symbols(self, symbols):
+        """Get a list of all symbols from kernel_doc"""
+
     def out_doc(self, fname, name, args):
         """Outputs a DOC block"""
 
@@ -577,6 +580,7 @@ class ManFormat(OutputFormat):
 
         super().__init__()
         self.modulename = modulename
+        self.symbols = []
 
         dt = None
         tstamp = os.environ.get("KBUILD_BUILD_TIMESTAMP")
@@ -593,6 +597,68 @@ class ManFormat(OutputFormat):
 
         self.man_date = dt.strftime("%B %Y")
 
+    def arg_name(self, args, name):
+        """
+        Return the name that will be used for the man page.
+
+        As we may have the same name in different namespaces,
+        prepend the data type for all types except functions and typedefs.
+
+        The doc section is special: it uses the modulename.
+        """
+
+        dtype = args.type
+
+        if dtype == "doc":
+            return self.modulename
+
+        if dtype in ["function", "typedef"]:
+            return name
+
+        return f"{dtype} {name}"
+
+    def set_symbols(self, symbols):
+        """
+        Receive the list of all symbols from the kernel_doc parser.
+
+        Man pages use it to add a SEE ALSO section with the other
+        symbols from the same file.
+        """
+        self.symbols = symbols
+
+    def out_tail(self, fname, name, args):
+        """Adds a tail for all man pages"""
+
+        # SEE ALSO section
+        if len(self.symbols) >= 2:
+            cur_name = self.arg_name(args, name)
+
+            self.data += f'.SH "SEE ALSO"' + "\n.PP\n"
+            related = []
+            for arg in self.symbols:
+                out_name = self.arg_name(arg, arg.name)
+
+                if cur_name == out_name:
+                    continue
+
+                related.append(f"\\fB{out_name}\\fR(9)")
+
+            self.data += ",\n".join(related) + "\n"
+
+        # TODO: does it make sense to add other sections? Maybe
+        # REPORTING ISSUES? LICENSE?
+
+    def msg(self, fname, name, args):
+        """
+        Handles a single entry from kernel-doc parser.
+
+        Add a tail at the end of man pages output.
+        """
+        super().msg(fname, name, args)
+        self.out_tail(fname, name, args)
+
+        return self.data
+
     def output_highlight(self, block):
         """
         Outputs a C symbol that may require being highlighted with
@@ -618,7 +684,9 @@ class ManFormat(OutputFormat):
         if not self.check_doc(name, args):
             return
 
-        self.data += f'.TH "{self.modulename}" 9 "{self.modulename}" "{self.man_date}" "API Manual" LINUX' + "\n"
+        out_name = self.arg_name(args, name)
+
+        self.data += f'.TH "{self.modulename}" 9 "{out_name}" "{self.man_date}" "API Manual" LINUX' + "\n"
 
         for section, text in args.sections.items():
             self.data += f'.SH "{section}"' + "\n"
@@ -627,7 +695,9 @@ class ManFormat(OutputFormat):
     def out_function(self, fname, name, args):
         """output function in man"""
 
-        self.data += f'.TH "{name}" 9 "{name}" "{self.man_date}" "Kernel Hacker\'s Manual" LINUX' + "\n"
+        out_name = self.arg_name(args, name)
+
+        self.data += f'.TH "{name}" 9 "{out_name}" "{self.man_date}" "Kernel Hacker\'s Manual" LINUX' + "\n"
 
         self.data += ".SH NAME\n"
         self.data += f"{name} \\- {args['purpose']}\n"
@@ -671,7 +741,9 @@ class ManFormat(OutputFormat):
             self.output_highlight(text)
 
     def out_enum(self, fname, name, args):
-        self.data += f'.TH "{self.modulename}" 9 "enum {name}" "{self.man_date}" "API Manual" LINUX' + "\n"
+        out_name = self.arg_name(args, name)
+
+        self.data += f'.TH "{self.modulename}" 9 "{out_name}" "{self.man_date}" "API Manual" LINUX' + "\n"
 
         self.data += ".SH NAME\n"
         self.data += f"enum {name} \\- {args['purpose']}\n"
@@ -703,8 +775,9 @@ class ManFormat(OutputFormat):
     def out_typedef(self, fname, name, args):
         module = self.modulename
         purpose = args.get('purpose')
+        out_name = self.arg_name(args, name)
 
-        self.data += f'.TH "{module}" 9 "{name}" "{self.man_date}" "API Manual" LINUX' + "\n"
+        self.data += f'.TH "{module}" 9 "{out_name}" "{self.man_date}" "API Manual" LINUX' + "\n"
 
         self.data += ".SH NAME\n"
         self.data += f"typedef {name} \\- {purpose}\n"
@@ -717,8 +790,9 @@ class ManFormat(OutputFormat):
         module = self.modulename
         purpose = args.get('purpose')
         definition = args.get('definition')
+        out_name = self.arg_name(args, name)
 
-        self.data += f'.TH "{module}" 9 "{args.type} {name}" "{self.man_date}" "API Manual" LINUX' + "\n"
+        self.data += f'.TH "{module}" 9 "{out_name}" "{self.man_date}" "API Manual" LINUX' + "\n"
 
         self.data += ".SH NAME\n"
         self.data += f"{args.type} {name} \\- {purpose}\n"
-- 
2.51.0
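
To make the new flow concrete: set_symbols() hands ManFormat the full list
of entries parsed from one source file, and out_tail() then appends a SEE
ALSO section naming every other entry the way arg_name() would render it.
A reduced sketch with plain (type, name) tuples standing in for the
parser's entry objects (the real code appends to self.data, and the
"edac" symbols below are only an example):

  # Illustration of the SEE ALSO logic; (dtype, name) tuples are stand-ins
  # for kernel-doc parser entries, "edac" is a made-up module name.
  def arg_name(dtype, name, modulename):
      if dtype == "doc":
          return modulename
      if dtype in ["function", "typedef"]:
          return name
      return f"{dtype} {name}"      # e.g. "struct foo", to disambiguate

  def see_also(symbols, cur, modulename):
      if len(symbols) < 2:          # emitted only when a file has 2+ symbols
          return ""
      cur_name = arg_name(*cur, modulename)
      related = [f"\\fB{arg_name(d, n, modulename)}\\fR(9)"
                 for d, n in symbols
                 if arg_name(d, n, modulename) != cur_name]
      return '.SH "SEE ALSO"\n.PP\n' + ",\n".join(related) + "\n"

  symbols = [("function", "edac_pci_add_device"),
             ("function", "edac_pci_del_device"),
             ("struct", "edac_pci_ctl_info")]
  print(see_also(symbols, ("function", "edac_pci_add_device"), "edac"))
  # .SH "SEE ALSO"
  # .PP
  # \fBedac_pci_del_device\fR(9),
  # \fBstruct edac_pci_ctl_info\fR(9)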


