* [PATCH v3 00/15] Split sphinx call logic from docs Makefile
@ 2025-09-01 14:42 Mauro Carvalho Chehab
  2025-09-01 14:42 ` [PATCH v3 01/15] scripts/jobserver-exec: move the code to a class Mauro Carvalho Chehab
                   ` (14 more replies)
  0 siblings, 15 replies; 16+ messages in thread
From: Mauro Carvalho Chehab @ 2025-09-01 14:42 UTC (permalink / raw)
  To: Jonathan Corbet, Linux Doc Mailing List
  Cc: Mauro Carvalho Chehab, Mauro Carvalho Chehab, linux-kernel

Hi Jon,

This series does a major cleanup of the docs Makefile by moving the
actual doc build logic to a helper script (tools/docs/sphinx-build-wrapper).

The script was written in a way that it can be called either
directly or via a makefile. When running via the makefile, it
uses the GNU make jobserver to ensure that, when sphinx-build
is called, the number of jobs will not exceed what is specified
by the "-j" parameter.
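
For instance, assuming an 8-CPU machine, a build started with
something like:

	make -j8 SPHINXDIRS="process" htmldocs

lets the wrapper claim the free jobserver slots from make, so the
total number of build jobs stays within that limit.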

The first 3 patches clean up scripts/jobserver-exec and move the
actual code to a library. That library is used by both the
jobserver-exec command line tool and by sphinx-build-wrapper.
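
Reusing it from another script then boils down to something like
this rough sketch (assuming scripts/lib is in sys.path):

	from jobserver import JobserverExec

	with JobserverExec() as jobserver:
	    # cmd is whatever should run in parallel, e.g. sphinx-build
	    jobserver.run(cmd)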

The change also gets rid of parallel-wrapper.sh, whose
functions are now part of the wrapper code.

I added two patches at the end, adding an extra target: "mandocs".
They came from a 2-patch series that I had sent separately.

---

v3:
- rebased on top of docs-next;
- added two patches, from a separate patch series, to build man
  pages.

v2:

- there's no generic exception handler anymore;
- it moves sphinx-pre-install to tools/docs;
- the logic which ensures a minimal Python version got moved
  to a library, which is now used by both pre-install and wrapper;
- The first wrapper patch (05/13) doesn't contain comments (except for
  the shebang and SPDX). The goal is to help show the size increase
  when moving from Makefile to Python. Some size increase is
  unavoidable, as Makefile is more compact: no includes, multiple
  statements per line, no argparse, etc;
- The second wrapper patch adds docstrings and comments, which have
  almost the same size as the code itself;
- I moved the venv logic to a third wrapper patch;
- I fixed an issue in the parallel build logic;
- There are no generic except blocks anymore.

Mauro Carvalho Chehab (15):
  scripts/jobserver-exec: move the code to a class
  scripts/jobserver-exec: move its class to the lib directory
  scripts/jobserver-exec: add a help message
  scripts: sphinx-pre-install: move it to tools/docs
  tools/docs: sphinx-pre-install: move Python version handling to lib
  tools/docs: sphinx-build-wrapper: add a wrapper for sphinx-build
  tools/docs: sphinx-build-wrapper: add comments and blank lines
  tools/docs: sphinx-build-wrapper: add support to run inside venv
  docs: parallel-wrapper.sh: remove script
  docs: Makefile: document latex/PDF PAPER= parameter
  tools/docs: sphinx-build-wrapper: add an argument for LaTeX
    interactive mode
  tools/docs,scripts: sphinx-*: prevent sphinx-build crashes
  tools/docs: sphinx-build-wrapper: allow building PDF files in parallel
  docs: add support to build manpages from kerneldoc output
  tools: kernel-doc: add a see also section at man pages

 Documentation/Makefile                     | 134 +---
 Documentation/doc-guide/kernel-doc.rst     |  29 +-
 Documentation/sphinx/parallel-wrapper.sh   |  33 -
 Makefile                                   |   2 +-
 scripts/jobserver-exec                     |  88 +--
 scripts/lib/jobserver.py                   | 149 +++++
 scripts/lib/kdoc/kdoc_files.py             |   5 +-
 scripts/lib/kdoc/kdoc_output.py            |  84 ++-
 scripts/split-man.pl                       |  28 -
 tools/docs/lib/python_version.py           | 133 ++++
 tools/docs/sphinx-build-wrapper            | 731 +++++++++++++++++++++
 {scripts => tools/docs}/sphinx-pre-install | 134 +---
 12 files changed, 1194 insertions(+), 356 deletions(-)
 delete mode 100644 Documentation/sphinx/parallel-wrapper.sh
 create mode 100755 scripts/lib/jobserver.py
 delete mode 100755 scripts/split-man.pl
 create mode 100644 tools/docs/lib/python_version.py
 create mode 100755 tools/docs/sphinx-build-wrapper
 rename {scripts => tools/docs}/sphinx-pre-install (93%)

-- 
2.51.0



* [PATCH v3 01/15] scripts/jobserver-exec: move the code to a class
  2025-09-01 14:42 [PATCH v3 00/15] Split sphinx call logic from docs Makefile Mauro Carvalho Chehab
@ 2025-09-01 14:42 ` Mauro Carvalho Chehab
  2025-09-01 14:42 ` [PATCH v3 02/15] scripts/jobserver-exec: move its class to the lib directory Mauro Carvalho Chehab
                   ` (13 subsequent siblings)
  14 siblings, 0 replies; 16+ messages in thread
From: Mauro Carvalho Chehab @ 2025-09-01 14:42 UTC (permalink / raw)
  To: Jonathan Corbet, Linux Doc Mailing List
  Cc: Mauro Carvalho Chehab, Mauro Carvalho Chehab, linux-kernel

Convert the code inside jobserver-exec to a class and
properly document it.

Using a class allows reusing the jobserver logic in other
scripts.

While the main code remains unchanged, staying compatible with
Python 2.6 and 3.0+, its coding style now follows a more modern
standard: tabs were replaced by a 4-space indent, and the code
now passes autopep8, black and pylint.

The code now allows using a Pythonic way to enter/exit the
jobserver context, e.g. it now supports:

	with JobserverExec() as jobserver:
	    jobserver.run(sys.argv[1:])

With the new code, the __exit__() method ensures that the claimed
jobserver slots are returned at the end, even if something bad
happens somewhere.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 scripts/jobserver-exec | 218 ++++++++++++++++++++++++++++-------------
 1 file changed, 151 insertions(+), 67 deletions(-)

diff --git a/scripts/jobserver-exec b/scripts/jobserver-exec
index 7eca035472d3..b386b1a845de 100755
--- a/scripts/jobserver-exec
+++ b/scripts/jobserver-exec
@@ -1,77 +1,161 @@
 #!/usr/bin/env python3
 # SPDX-License-Identifier: GPL-2.0+
 #
+# pylint: disable=C0103,C0209
+#
 # This determines how many parallel tasks "make" is expecting, as it is
 # not exposed via an special variables, reserves them all, runs a subprocess
 # with PARALLELISM environment variable set, and releases the jobs back again.
 #
 # https://www.gnu.org/software/make/manual/html_node/POSIX-Jobserver.html#POSIX-Jobserver
-from __future__ import print_function
-import os, sys, errno
+
+"""
+Interacts with the POSIX jobserver during the Kernel build time.
+
+A "normal" jobserver task, like the one initiated by a make subprocess would do:
+
+    - open read/write file descriptors to communicate with the job server;
+    - ask for one slot by calling:
+        claim = os.read(reader, 1)
+    - when the job finishes, call:
+        os.write(writer, b"+")  # os.write(writer, claim)
+
+Here, the goal is different: This script aims to get the remaining number
+of slots available, using all of them to run a command which handles tasks in
+parallel. To do that, it has a loop that ends only after there are no
+slots left. It then increments the number by one, in order to allow a
+call equivalent to make -j$((claim+1)), e.g. having a parent make creating
+$claim children to do the actual work.
+
+The end goal here is to keep the total number of build tasks under the
+limit established by the initial make -j$n_proc call.
+"""
+
+import errno
+import os
 import subprocess
+import sys
 
-# Extract and prepare jobserver file descriptors from environment.
-claim = 0
-jobs = b""
-try:
-	# Fetch the make environment options.
-	flags = os.environ['MAKEFLAGS']
-
-	# Look for "--jobserver=R,W"
-	# Note that GNU Make has used --jobserver-fds and --jobserver-auth
-	# so this handles all of them.
-	opts = [x for x in flags.split(" ") if x.startswith("--jobserver")]
-
-	# Parse out R,W file descriptor numbers and set them nonblocking.
-	# If the MAKEFLAGS variable contains multiple instances of the
-	# --jobserver-auth= option, the last one is relevant.
-	fds = opts[-1].split("=", 1)[1]
-
-	# Starting with GNU Make 4.4, named pipes are used for reader and writer.
-	# Example argument: --jobserver-auth=fifo:/tmp/GMfifo8134
-	_, _, path = fds.partition('fifo:')
-
-	if path:
-		reader = os.open(path, os.O_RDONLY | os.O_NONBLOCK)
-		writer = os.open(path, os.O_WRONLY)
-	else:
-		reader, writer = [int(x) for x in fds.split(",", 1)]
-		# Open a private copy of reader to avoid setting nonblocking
-		# on an unexpecting process with the same reader fd.
-		reader = os.open("/proc/self/fd/%d" % (reader),
-				 os.O_RDONLY | os.O_NONBLOCK)
-
-	# Read out as many jobserver slots as possible.
-	while True:
-		try:
-			slot = os.read(reader, 8)
-			jobs += slot
-		except (OSError, IOError) as e:
-			if e.errno == errno.EWOULDBLOCK:
-				# Stop at the end of the jobserver queue.
-				break
-			# If something went wrong, give back the jobs.
-			if len(jobs):
-				os.write(writer, jobs)
-			raise e
-	# Add a bump for our caller's reserveration, since we're just going
-	# to sit here blocked on our child.
-	claim = len(jobs) + 1
-except (KeyError, IndexError, ValueError, OSError, IOError) as e:
-	# Any missing environment strings or bad fds should result in just
-	# not being parallel.
-	pass
-
-# We can only claim parallelism if there was a jobserver (i.e. a top-level
-# "-jN" argument) and there were no other failures. Otherwise leave out the
-# environment variable and let the child figure out what is best.
-if claim > 0:
-	os.environ['PARALLELISM'] = '%d' % (claim)
-
-rc = subprocess.call(sys.argv[1:])
-
-# Return all the reserved slots.
-if len(jobs):
-	os.write(writer, jobs)
-
-sys.exit(rc)
+
+class JobserverExec:
+    """
+    Claim all slots from make using POSIX Jobserver.
+
+    The main methods here are:
+    - open(): reserves all slots;
+    - close(): method returns all used slots back to make;
+    - run(): executes a command setting PARALLELISM=<available slots jobs + 1>
+    """
+
+    def __init__(self):
+        """Initialize internal vars"""
+        self.claim = 0
+        self.jobs = b""
+        self.reader = None
+        self.writer = None
+        self.is_open = False
+
+    def open(self):
+        """Reserve all available slots to be claimed later on"""
+
+        if self.is_open:
+            return
+
+        try:
+            # Fetch the make environment options.
+            flags = os.environ["MAKEFLAGS"]
+            # Look for "--jobserver=R,W"
+            # Note that GNU Make has used --jobserver-fds and --jobserver-auth
+            # so this handles all of them.
+            opts = [x for x in flags.split(" ") if x.startswith("--jobserver")]
+
+            # Parse out R,W file descriptor numbers and set them nonblocking.
+            # If the MAKEFLAGS variable contains multiple instances of the
+            # --jobserver-auth= option, the last one is relevant.
+            fds = opts[-1].split("=", 1)[1]
+
+            # Starting with GNU Make 4.4, named pipes are used for reader
+            # and writer.
+            # Example argument: --jobserver-auth=fifo:/tmp/GMfifo8134
+            _, _, path = fds.partition("fifo:")
+
+            if path:
+                self.reader = os.open(path, os.O_RDONLY | os.O_NONBLOCK)
+                self.writer = os.open(path, os.O_WRONLY)
+            else:
+                self.reader, self.writer = [int(x) for x in fds.split(",", 1)]
+                # Open a private copy of reader to avoid setting nonblocking
+                # on an unexpecting process with the same reader fd.
+                self.reader = os.open("/proc/self/fd/%d" % (self.reader),
+                                      os.O_RDONLY | os.O_NONBLOCK)
+
+            # Read out as many jobserver slots as possible
+            while True:
+                try:
+                    slot = os.read(self.reader, 8)
+                    self.jobs += slot
+                except (OSError, IOError) as e:
+                    if e.errno == errno.EWOULDBLOCK:
+                        # Stop at the end of the jobserver queue.
+                        break
+                    # If something went wrong, give back the jobs.
+                    if self.jobs:
+                        os.write(self.writer, self.jobs)
+                    raise e
+
+            # Add a bump for our caller's reservation, since we're just going
+            # to sit here blocked on our child.
+            self.claim = len(self.jobs) + 1
+
+        except (KeyError, IndexError, ValueError, OSError, IOError):
+            # Any missing environment strings or bad fds should result in just
+            # not being parallel.
+            self.claim = None
+
+        self.is_open = True
+
+    def close(self):
+        """Return all reserved slots to Jobserver"""
+
+        if not self.is_open:
+            return
+
+        # Return all the reserved slots.
+        if len(self.jobs):
+            os.write(self.writer, self.jobs)
+
+        self.is_open = False
+
+    def __enter__(self):
+        self.open()
+        return self
+
+    def __exit__(self, exc_type, exc_value, exc_traceback):
+        self.close()
+
+    def run(self, cmd):
+        """
+        Run a command setting PARALLELISM env variable to the number of
+        available job slots (claim) + 1, e.g. it will reserve claim slots
+        to do the actual build work, plus one to monitor its children.
+        """
+        self.open()             # Ensure that self.claim is set
+
+        # We can only claim parallelism if there was a jobserver (i.e. a
+        # top-level "-jN" argument) and there were no other failures. Otherwise
+        # leave out the environment variable and let the child figure out what
+        # is best.
+        if self.claim:
+            os.environ["PARALLELISM"] = str(self.claim)
+
+        return subprocess.call(cmd)
+
+
+def main():
+    """Main program"""
+    with JobserverExec() as jobserver:
+        jobserver.run(sys.argv[1:])
+
+
+if __name__ == "__main__":
+    main()
-- 
2.51.0



* [PATCH v3 02/15] scripts/jobserver-exec: move its class to the lib directory
  2025-09-01 14:42 [PATCH v3 00/15] Split sphinx call logic from docs Makefile Mauro Carvalho Chehab
  2025-09-01 14:42 ` [PATCH v3 01/15] scripts/jobserver-exec: move the code to a class Mauro Carvalho Chehab
@ 2025-09-01 14:42 ` Mauro Carvalho Chehab
  2025-09-01 14:42 ` [PATCH v3 03/15] scripts/jobserver-exec: add a help message Mauro Carvalho Chehab
                   ` (12 subsequent siblings)
  14 siblings, 0 replies; 16+ messages in thread
From: Mauro Carvalho Chehab @ 2025-09-01 14:42 UTC (permalink / raw)
  To: Jonathan Corbet, Linux Doc Mailing List
  Cc: Mauro Carvalho Chehab, Mauro Carvalho Chehab, linux-kernel

To make it easier to be re-used, move the JobserverExec class
to the library directory.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 scripts/jobserver-exec   | 152 +++------------------------------------
 scripts/lib/jobserver.py | 149 ++++++++++++++++++++++++++++++++++++++
 2 files changed, 160 insertions(+), 141 deletions(-)
 create mode 100755 scripts/lib/jobserver.py

diff --git a/scripts/jobserver-exec b/scripts/jobserver-exec
index b386b1a845de..40a0f0058733 100755
--- a/scripts/jobserver-exec
+++ b/scripts/jobserver-exec
@@ -1,155 +1,25 @@
 #!/usr/bin/env python3
 # SPDX-License-Identifier: GPL-2.0+
-#
-# pylint: disable=C0103,C0209
-#
-# This determines how many parallel tasks "make" is expecting, as it is
-# not exposed via an special variables, reserves them all, runs a subprocess
-# with PARALLELISM environment variable set, and releases the jobs back again.
-#
-# https://www.gnu.org/software/make/manual/html_node/POSIX-Jobserver.html#POSIX-Jobserver
 
-"""
-Interacts with the POSIX jobserver during the Kernel build time.
-
-A "normal" jobserver task, like the one initiated by a make subprocess would do:
-
-    - open read/write file descriptors to communicate with the job server;
-    - ask for one slot by calling:
-        claim = os.read(reader, 1)
-    - when the job finishes, call:
-        os.write(writer, b"+")  # os.write(writer, claim)
-
-Here, the goal is different: This script aims to get the remaining number
-of slots available, using all of them to run a command which handles tasks in
-parallel. To do that, it has a loop that ends only after there are no
-slots left. It then increments the number by one, in order to allow a
-call equivalent to make -j$((claim+1)), e.g. having a parent make creating
-$claim children to do the actual work.
-
-The end goal here is to keep the total number of build tasks under the
-limit established by the initial make -j$n_proc call.
-"""
-
-import errno
 import os
-import subprocess
 import sys
 
+LIB_DIR = "lib"
+SRC_DIR = os.path.dirname(os.path.realpath(__file__))
 
-class JobserverExec:
-    """
-    Claim all slots from make using POSIX Jobserver.
+sys.path.insert(0, os.path.join(SRC_DIR, LIB_DIR))
 
-    The main methods here are:
-    - open(): reserves all slots;
-    - close(): method returns all used slots back to make;
-    - run(): executes a command setting PARALLELISM=<available slots jobs + 1>
-    """
+from jobserver import JobserverExec                  # pylint: disable=C0415
 
-    def __init__(self):
-        """Initialize internal vars"""
-        self.claim = 0
-        self.jobs = b""
-        self.reader = None
-        self.writer = None
-        self.is_open = False
 
-    def open(self):
-        """Reserve all available slots to be claimed later on"""
-
-        if self.is_open:
-            return
-
-        try:
-            # Fetch the make environment options.
-            flags = os.environ["MAKEFLAGS"]
-            # Look for "--jobserver=R,W"
-            # Note that GNU Make has used --jobserver-fds and --jobserver-auth
-            # so this handles all of them.
-            opts = [x for x in flags.split(" ") if x.startswith("--jobserver")]
-
-            # Parse out R,W file descriptor numbers and set them nonblocking.
-            # If the MAKEFLAGS variable contains multiple instances of the
-            # --jobserver-auth= option, the last one is relevant.
-            fds = opts[-1].split("=", 1)[1]
-
-            # Starting with GNU Make 4.4, named pipes are used for reader
-            # and writer.
-            # Example argument: --jobserver-auth=fifo:/tmp/GMfifo8134
-            _, _, path = fds.partition("fifo:")
-
-            if path:
-                self.reader = os.open(path, os.O_RDONLY | os.O_NONBLOCK)
-                self.writer = os.open(path, os.O_WRONLY)
-            else:
-                self.reader, self.writer = [int(x) for x in fds.split(",", 1)]
-                # Open a private copy of reader to avoid setting nonblocking
-                # on an unexpecting process with the same reader fd.
-                self.reader = os.open("/proc/self/fd/%d" % (self.reader),
-                                      os.O_RDONLY | os.O_NONBLOCK)
-
-            # Read out as many jobserver slots as possible
-            while True:
-                try:
-                    slot = os.read(self.reader, 8)
-                    self.jobs += slot
-                except (OSError, IOError) as e:
-                    if e.errno == errno.EWOULDBLOCK:
-                        # Stop at the end of the jobserver queue.
-                        break
-                    # If something went wrong, give back the jobs.
-                    if self.jobs:
-                        os.write(self.writer, self.jobs)
-                    raise e
-
-            # Add a bump for our caller's reservation, since we're just going
-            # to sit here blocked on our child.
-            self.claim = len(self.jobs) + 1
-
-        except (KeyError, IndexError, ValueError, OSError, IOError):
-            # Any missing environment strings or bad fds should result in just
-            # not being parallel.
-            self.claim = None
-
-        self.is_open = True
-
-    def close(self):
-        """Return all reserved slots to Jobserver"""
-
-        if not self.is_open:
-            return
-
-        # Return all the reserved slots.
-        if len(self.jobs):
-            os.write(self.writer, self.jobs)
-
-        self.is_open = False
-
-    def __enter__(self):
-        self.open()
-        return self
-
-    def __exit__(self, exc_type, exc_value, exc_traceback):
-        self.close()
-
-    def run(self, cmd):
-        """
-        Run a command setting PARALLELISM env variable to the number of
-        available job slots (claim) + 1, e.g. it will reserve claim slots
-        to do the actual build work, plus one to monitor its children.
-        """
-        self.open()             # Ensure that self.claim is set
-
-        # We can only claim parallelism if there was a jobserver (i.e. a
-        # top-level "-jN" argument) and there were no other failures. Otherwise
-        # leave out the environment variable and let the child figure out what
-        # is best.
-        if self.claim:
-            os.environ["PARALLELISM"] = str(self.claim)
-
-        return subprocess.call(cmd)
+"""
+Determines how many parallel tasks "make" is expecting, as it is
+not exposed via an special variables, reserves them all, runs a subprocess
+with PARALLELISM environment variable set, and releases the jobs back again.
 
+See:
+    https://www.gnu.org/software/make/manual/html_node/POSIX-Jobserver.html#POSIX-Jobserver
+"""
 
 def main():
     """Main program"""
diff --git a/scripts/lib/jobserver.py b/scripts/lib/jobserver.py
new file mode 100755
index 000000000000..98d8b0ff0c89
--- /dev/null
+++ b/scripts/lib/jobserver.py
@@ -0,0 +1,149 @@
+#!/usr/bin/env python3
+# SPDX-License-Identifier: GPL-2.0+
+#
+# pylint: disable=C0103,C0209
+#
+#
+
+"""
+Interacts with the POSIX jobserver during the Kernel build time.
+
+A "normal" jobserver task, like the one initiated by a make subprocess would do:
+
+    - open read/write file descriptors to communicate with the job server;
+    - ask for one slot by calling:
+        claim = os.read(reader, 1)
+    - when the job finishes, call:
+        os.write(writer, b"+")  # os.write(writer, claim)
+
+Here, the goal is different: This script aims to get the remaining number
+of slots available, using all of them to run a command which handles tasks in
+parallel. To do that, it has a loop that ends only after there are no
+slots left. It then increments the number by one, in order to allow a
+call equivalent to make -j$((claim+1)), e.g. having a parent make creating
+$claim children to do the actual work.
+
+The end goal here is to keep the total number of build tasks under the
+limit established by the initial make -j$n_proc call.
+
+See:
+    https://www.gnu.org/software/make/manual/html_node/POSIX-Jobserver.html#POSIX-Jobserver
+"""
+
+import errno
+import os
+import subprocess
+import sys
+
+class JobserverExec:
+    """
+    Claim all slots from make using POSIX Jobserver.
+
+    The main methods here are:
+    - open(): reserves all slots;
+    - close(): method returns all used slots back to make;
+    - run(): executes a command setting PARALLELISM=<available slots jobs + 1>
+    """
+
+    def __init__(self):
+        """Initialize internal vars"""
+        self.claim = 0
+        self.jobs = b""
+        self.reader = None
+        self.writer = None
+        self.is_open = False
+
+    def open(self):
+        """Reserve all available slots to be claimed later on"""
+
+        if self.is_open:
+            return
+
+        try:
+            # Fetch the make environment options.
+            flags = os.environ["MAKEFLAGS"]
+            # Look for "--jobserver=R,W"
+            # Note that GNU Make has used --jobserver-fds and --jobserver-auth
+            # so this handles all of them.
+            opts = [x for x in flags.split(" ") if x.startswith("--jobserver")]
+
+            # Parse out R,W file descriptor numbers and set them nonblocking.
+            # If the MAKEFLAGS variable contains multiple instances of the
+            # --jobserver-auth= option, the last one is relevant.
+            fds = opts[-1].split("=", 1)[1]
+
+            # Starting with GNU Make 4.4, named pipes are used for reader
+            # and writer.
+            # Example argument: --jobserver-auth=fifo:/tmp/GMfifo8134
+            _, _, path = fds.partition("fifo:")
+
+            if path:
+                self.reader = os.open(path, os.O_RDONLY | os.O_NONBLOCK)
+                self.writer = os.open(path, os.O_WRONLY)
+            else:
+                self.reader, self.writer = [int(x) for x in fds.split(",", 1)]
+                # Open a private copy of reader to avoid setting nonblocking
+                # on an unexpecting process with the same reader fd.
+                self.reader = os.open("/proc/self/fd/%d" % (self.reader),
+                                      os.O_RDONLY | os.O_NONBLOCK)
+
+            # Read out as many jobserver slots as possible
+            while True:
+                try:
+                    slot = os.read(self.reader, 8)
+                    self.jobs += slot
+                except (OSError, IOError) as e:
+                    if e.errno == errno.EWOULDBLOCK:
+                        # Stop at the end of the jobserver queue.
+                        break
+                    # If something went wrong, give back the jobs.
+                    if self.jobs:
+                        os.write(self.writer, self.jobs)
+                    raise e
+
+            # Add a bump for our caller's reservation, since we're just going
+            # to sit here blocked on our child.
+            self.claim = len(self.jobs) + 1
+
+        except (KeyError, IndexError, ValueError, OSError, IOError):
+            # Any missing environment strings or bad fds should result in just
+            # not being parallel.
+            self.claim = None
+
+        self.is_open = True
+
+    def close(self):
+        """Return all reserved slots to Jobserver"""
+
+        if not self.is_open:
+            return
+
+        # Return all the reserved slots.
+        if len(self.jobs):
+            os.write(self.writer, self.jobs)
+
+        self.is_open = False
+
+    def __enter__(self):
+        self.open()
+        return self
+
+    def __exit__(self, exc_type, exc_value, exc_traceback):
+        self.close()
+
+    def run(self, cmd, *args, **kwargs):
+        """
+        Run a command setting PARALLELISM env variable to the number of
+        available job slots (claim) + 1, e.g. it will reserve claim slots
+        to do the actual build work, plus one to monitor its children.
+        """
+        self.open()             # Ensure that self.claim is set
+
+        # We can only claim parallelism if there was a jobserver (i.e. a
+        # top-level "-jN" argument) and there were no other failures. Otherwise
+        # leave out the environment variable and let the child figure out what
+        # is best.
+        if self.claim:
+            os.environ["PARALLELISM"] = str(self.claim)
+
+        return subprocess.call(cmd, *args, **kwargs)
-- 
2.51.0



* [PATCH v3 03/15] scripts/jobserver-exec: add a help message
  2025-09-01 14:42 [PATCH v3 00/15] Split sphinx call logic from docs Makefile Mauro Carvalho Chehab
  2025-09-01 14:42 ` [PATCH v3 01/15] scripts/jobserver-exec: move the code to a class Mauro Carvalho Chehab
  2025-09-01 14:42 ` [PATCH v3 02/15] scripts/jobserver-exec: move its class to the lib directory Mauro Carvalho Chehab
@ 2025-09-01 14:42 ` Mauro Carvalho Chehab
  2025-09-01 14:42 ` [PATCH v3 04/15] scripts: sphinx-pre-install: move it to tools/docs Mauro Carvalho Chehab
                   ` (11 subsequent siblings)
  14 siblings, 0 replies; 16+ messages in thread
From: Mauro Carvalho Chehab @ 2025-09-01 14:42 UTC (permalink / raw)
  To: Jonathan Corbet, Linux Doc Mailing List
  Cc: Mauro Carvalho Chehab, Mauro Carvalho Chehab, linux-kernel

Currently, calling it without arguments shows an ugly error
message. Instead, print a usage message, using the Python docstring
as the description.
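
With this change, running it with no arguments should print
something like:

	usage: jobserver-exec command [args ...]

followed by the module docstring.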

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 scripts/jobserver-exec | 22 +++++++++++++---------
 1 file changed, 13 insertions(+), 9 deletions(-)

diff --git a/scripts/jobserver-exec b/scripts/jobserver-exec
index 40a0f0058733..ae23afd344ec 100755
--- a/scripts/jobserver-exec
+++ b/scripts/jobserver-exec
@@ -1,6 +1,15 @@
 #!/usr/bin/env python3
 # SPDX-License-Identifier: GPL-2.0+
 
+"""
+Determines how many parallel tasks "make" is expecting, as it is
+not exposed via any special variables, reserves them all, runs a subprocess
+with PARALLELISM environment variable set, and releases the jobs back again.
+
+See:
+    https://www.gnu.org/software/make/manual/html_node/POSIX-Jobserver.html#POSIX-Jobserver
+"""
+
 import os
 import sys
 
@@ -12,17 +21,12 @@ sys.path.insert(0, os.path.join(SRC_DIR, LIB_DIR))
 from jobserver import JobserverExec                  # pylint: disable=C0415
 
 
-"""
-Determines how many parallel tasks "make" is expecting, as it is
-not exposed via an special variables, reserves them all, runs a subprocess
-with PARALLELISM environment variable set, and releases the jobs back again.
-
-See:
-    https://www.gnu.org/software/make/manual/html_node/POSIX-Jobserver.html#POSIX-Jobserver
-"""
-
 def main():
     """Main program"""
+    if len(sys.argv) < 2:
+        name = os.path.basename(__file__)
+        sys.exit("usage: " + name + " command [args ...]\n" + __doc__)
+
     with JobserverExec() as jobserver:
         jobserver.run(sys.argv[1:])
 
-- 
2.51.0



* [PATCH v3 04/15] scripts: sphinx-pre-install: move it to tools/docs
  2025-09-01 14:42 [PATCH v3 00/15] Split sphinx call logic from docs Makefile Mauro Carvalho Chehab
                   ` (2 preceding siblings ...)
  2025-09-01 14:42 ` [PATCH v3 03/15] scripts/jobserver-exec: add a help message Mauro Carvalho Chehab
@ 2025-09-01 14:42 ` Mauro Carvalho Chehab
  2025-09-01 14:42 ` [PATCH v3 05/15] tools/docs: sphinx-pre-install: move Python version handling to lib Mauro Carvalho Chehab
                   ` (10 subsequent siblings)
  14 siblings, 0 replies; 16+ messages in thread
From: Mauro Carvalho Chehab @ 2025-09-01 14:42 UTC (permalink / raw)
  To: Jonathan Corbet, Linux Doc Mailing List
  Cc: Mauro Carvalho Chehab, Mauro Carvalho Chehab, linux-kernel

As we're reorganizing the place where doc scripts are located,
move this one to tools/docs.

No functional changes.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 Documentation/Makefile                     | 14 +++++++-------
 {scripts => tools/docs}/sphinx-pre-install |  0
 2 files changed, 7 insertions(+), 7 deletions(-)
 rename {scripts => tools/docs}/sphinx-pre-install (100%)

diff --git a/Documentation/Makefile b/Documentation/Makefile
index 5c20c68be89a..deb2029228ed 100644
--- a/Documentation/Makefile
+++ b/Documentation/Makefile
@@ -46,7 +46,7 @@ ifeq ($(HAVE_SPHINX),0)
 .DEFAULT:
 	$(warning The '$(SPHINXBUILD)' command was not found. Make sure you have Sphinx installed and in PATH, or set the SPHINXBUILD make variable to point to the full path of the '$(SPHINXBUILD)' executable.)
 	@echo
-	@$(srctree)/scripts/sphinx-pre-install
+	@$(srctree)/tools/docs/sphinx-pre-install
 	@echo "  SKIP    Sphinx $@ target."
 
 else # HAVE_SPHINX
@@ -105,7 +105,7 @@ quiet_cmd_sphinx = SPHINX  $@ --> file://$(abspath $(BUILDDIR)/$3/$4)
 	fi
 
 htmldocs:
-	@$(srctree)/scripts/sphinx-pre-install --version-check
+	@$(srctree)/tools/docs/sphinx-pre-install --version-check
 	@+$(foreach var,$(SPHINXDIRS),$(call loop_cmd,sphinx,html,$(var),,$(var)))
 
 # If Rust support is available and .config exists, add rustdoc generated contents.
@@ -119,7 +119,7 @@ endif
 endif
 
 texinfodocs:
-	@$(srctree)/scripts/sphinx-pre-install --version-check
+	@$(srctree)/tools/docs/sphinx-pre-install --version-check
 	@+$(foreach var,$(SPHINXDIRS),$(call loop_cmd,sphinx,texinfo,$(var),texinfo,$(var)))
 
 # Note: the 'info' Make target is generated by sphinx itself when
@@ -131,7 +131,7 @@ linkcheckdocs:
 	@$(foreach var,$(SPHINXDIRS),$(call loop_cmd,sphinx,linkcheck,$(var),,$(var)))
 
 latexdocs:
-	@$(srctree)/scripts/sphinx-pre-install --version-check
+	@$(srctree)/tools/docs/sphinx-pre-install --version-check
 	@+$(foreach var,$(SPHINXDIRS),$(call loop_cmd,sphinx,latex,$(var),latex,$(var)))
 
 ifeq ($(HAVE_PDFLATEX),0)
@@ -144,7 +144,7 @@ else # HAVE_PDFLATEX
 
 pdfdocs: DENY_VF = XDG_CONFIG_HOME=$(FONTS_CONF_DENY_VF)
 pdfdocs: latexdocs
-	@$(srctree)/scripts/sphinx-pre-install --version-check
+	@$(srctree)/tools/docs/sphinx-pre-install --version-check
 	$(foreach var,$(SPHINXDIRS), \
 	   $(MAKE) PDFLATEX="$(PDFLATEX)" LATEXOPTS="$(LATEXOPTS)" $(DENY_VF) -C $(BUILDDIR)/$(var)/latex || sh $(srctree)/scripts/check-variable-fonts.sh || exit; \
 	   mkdir -p $(BUILDDIR)/$(var)/pdf; \
@@ -154,11 +154,11 @@ pdfdocs: latexdocs
 endif # HAVE_PDFLATEX
 
 epubdocs:
-	@$(srctree)/scripts/sphinx-pre-install --version-check
+	@$(srctree)/tools/docs/sphinx-pre-install --version-check
 	@+$(foreach var,$(SPHINXDIRS),$(call loop_cmd,sphinx,epub,$(var),epub,$(var)))
 
 xmldocs:
-	@$(srctree)/scripts/sphinx-pre-install --version-check
+	@$(srctree)/tools/docs/sphinx-pre-install --version-check
 	@+$(foreach var,$(SPHINXDIRS),$(call loop_cmd,sphinx,xml,$(var),xml,$(var)))
 
 endif # HAVE_SPHINX
diff --git a/scripts/sphinx-pre-install b/tools/docs/sphinx-pre-install
similarity index 100%
rename from scripts/sphinx-pre-install
rename to tools/docs/sphinx-pre-install
-- 
2.51.0



* [PATCH v3 05/15] tools/docs: sphinx-pre-install: move Python version handling to lib
  2025-09-01 14:42 [PATCH v3 00/15] Split sphinx call logic from docs Makefile Mauro Carvalho Chehab
                   ` (3 preceding siblings ...)
  2025-09-01 14:42 ` [PATCH v3 04/15] scripts: sphinx-pre-install: move it to tools/docs Mauro Carvalho Chehab
@ 2025-09-01 14:42 ` Mauro Carvalho Chehab
  2025-09-01 14:42 ` [PATCH v3 06/15] tools/docs: sphinx-build-wrapper: add a wrapper for sphinx-build Mauro Carvalho Chehab
                   ` (9 subsequent siblings)
  14 siblings, 0 replies; 16+ messages in thread
From: Mauro Carvalho Chehab @ 2025-09-01 14:42 UTC (permalink / raw)
  To: Jonathan Corbet, Linux Doc Mailing List
  Cc: Mauro Carvalho Chehab, Mauro Carvalho Chehab, linux-kernel

VGhlIHNwaGlueC1wcmUtaW5zdGFsbCBjb2RlIGhhcyBzb21lIGxvZ2ljIHRvIGRlYWwgd2l0aCBQ
eXRob24KdmVyc2lvbiwgd2hpY2ggZW5zdXJlcyB0aGF0IGEgbWluaW1hbCB2ZXJzaW9uIHdpbGwg
YmUgZW5mb3JjZWQKZm9yIGRvY3VtZW50YXRpb24gYnVpbGQgbG9naWMuCgpNb3ZlIGl0IHRvIGEg
c2VwYXJhdGUgbGlicmFyeSB0byBhbGxvdyByZS11c2luZyBpdHMgY29kZS4KCk5vIGZ1bmN0aW9u
YWwgY2hhbmdlcy4KClNpZ25lZC1vZmYtYnk6IE1hdXJvIENhcnZhbGhvIENoZWhhYiA8bWNoZWhh
YitodWF3ZWlAa2VybmVsLm9yZz4KLS0tCiB0b29scy9kb2NzL2xpYi9weXRob25fdmVyc2lvbi5w
eSB8IDEzMyArKysrKysrKysrKysrKysrKysrKysrKysrKysrKysrCiB0b29scy9kb2NzL3NwaGlu
eC1wcmUtaW5zdGFsbCAgICB8IDEyMCArKystLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tCiAyIGZp
bGVzIGNoYW5nZWQsIDE0NiBpbnNlcnRpb25zKCspLCAxMDcgZGVsZXRpb25zKC0pCiBjcmVhdGUg
bW9kZSAxMDA2NDQgdG9vbHMvZG9jcy9saWIvcHl0aG9uX3ZlcnNpb24ucHkKCmRpZmYgLS1naXQg
YS90b29scy9kb2NzL2xpYi9weXRob25fdmVyc2lvbi5weSBiL3Rvb2xzL2RvY3MvbGliL3B5dGhv
bl92ZXJzaW9uLnB5Cm5ldyBmaWxlIG1vZGUgMTAwNjQ0CmluZGV4IDAwMDAwMDAwMDAwMC4uMDUx
OWQ1MjRlNTQ3Ci0tLSAvZGV2L251bGwKKysrIGIvdG9vbHMvZG9jcy9saWIvcHl0aG9uX3ZlcnNp
b24ucHkKQEAgLTAsMCArMSwxMzMgQEAKKyMhL3Vzci9iaW4vZW52IHB5dGhvbjMKKyMgU1BEWC1M
aWNlbnNlLUlkZW50aWZpZXI6IEdQTC0yLjAtb3ItbGF0ZXIKKyMgQ29weXJpZ2h0IChjKSAyMDE3
LTIwMjUgTWF1cm8gQ2FydmFsaG8gQ2hlaGFiIDxtY2hlaGFiK2h1YXdlaUBrZXJuZWwub3JnPgor
CisiIiIKK0hhbmRsZSBQeXRob24gdmVyc2lvbiBjaGVjayBsb2dpYy4KKworTm90IGFsbCBQeXRo
b24gdmVyc2lvbnMgYXJlIHN1cHBvcnRlZCBieSBzY3JpcHRzLiBZZXQsIG9uIHNvbWUgY2FzZXMs
CitsaWtlIGR1cmluZyBkb2N1bWVudGF0aW9uIGJ1aWxkLCBhIG5ld2VyIHZlcnNpb24gb2YgcHl0
aG9uIGNvdWxkIGJlCithdmFpbGFibGUuCisKK1RoaXMgY2xhc3MgYWxsb3dzIGNoZWNraW5nIGlm
IHRoZSBtaW5pbWFsIHJlcXVpcmVtZW50cyBhcmUgZm9sbG93ZWQuCisKK0JldHRlciB0aGFuIHRo
YXQsIFB5dGhvblZlcnNpb24uY2hlY2tfcHl0aG9uKCkgbm90IG9ubHkgY2hlY2tzIHRoZSBtaW5p
bWFsCityZXF1aXJlbWVudHMsIGJ1dCBpdCBhdXRvbWF0aWNhbGx5IHN3aXRjaGVzIHRvIGEgdGhl
IG5ld2VzdCBhdmFpbGFibGUKK1B5dGhvbiB2ZXJzaW9uIGlmIHByZXNlbnQuCisKKyIiIgorCitp
bXBvcnQgb3MKK2ltcG9ydCByZQoraW1wb3J0IHN1YnByb2Nlc3MKK2ltcG9ydCBzeXMKKworZnJv
bSBnbG9iIGltcG9ydCBnbG9iCisKK2NsYXNzIFB5dGhvblZlcnNpb246CisgICAgIiIiCisgICAg
QW5jaWxsYXJ5IG1ldGhvZHMgdGhhdCBjaGVja3MgZm9yIG1pc3NpbmcgZGVwZW5kZW5jaWVzIGZv
ciBkaWZmZXJlbnQKKyAgICB0eXBlcyBvZiB0eXBlcywgbGlrZSBiaW5hcmllcywgcHl0aG9uIG1v
ZHVsZXMsIHJwbSBkZXBzLCBldGMuCisgICAgIiIiCisKKyAgICBkZWYgX19pbml0X18oc2VsZiwg
dmVyc2lvbik6CisgICAgICAgICIiIu+/ve+/vW5pdGlhbGl6ZSBzZWxmLnZlcnNpb24gdHVwbGUg
ZnJvbSBhIHZlcnNpb24gc3RyaW5nIiIiCisgICAgICAgIHNlbGYudmVyc2lvbiA9IHNlbGYucGFy
c2VfdmVyc2lvbih2ZXJzaW9uKQorCisgICAgQHN0YXRpY21ldGhvZAorICAgIGRlZiBwYXJzZV92
ZXJzaW9uKHZlcnNpb24pOgorICAgICAgICAiIiJDb252ZXJ0IGEgbWFqb3IubWlub3IucGF0Y2gg
dmVyc2lvbiBpbnRvIGEgdHVwbGUiIiIKKyAgICAgICAgcmV0dXJuIHR1cGxlKGludCh4KSBmb3Ig
eCBpbiB2ZXJzaW9uLnNwbGl0KCIuIikpCisKKyAgICBAc3RhdGljbWV0aG9kCisgICAgZGVmIHZl
cl9zdHIodmVyc2lvbik6CisgICAgICAgICIiIlJldHVybnMgYSB2ZXJzaW9uIHR1cGxlIGFzIG1h
am9yLm1pbm9yLnBhdGNoIiIiCisgICAgICAgIHJldHVybiAiLiIuam9pbihbc3RyKHgpIGZvciB4
IGluIHZlcnNpb25dKQorCisgICAgZGVmIF9fc3RyX18oc2VsZik6CisgICAgICAgICIiIlJldHVy
bnMgYSB2ZXJzaW9uIHR1cGxlIGFzIG1ham9yLm1pbm9yLnBhdGNoIGZyb20gc2VsZi52ZXJzaW9u
IiIiCisgICAgICAgIHJldHVybiBzZWxmLnZlcl9zdHIoc2VsZi52ZXJzaW9uKQorCisgICAgQHN0
YXRpY21ldGhvZAorICAgIGRlZiBnZXRfcHl0aG9uX3ZlcnNpb24oY21kKToKKyAgICAgICAgIiIi
CisgICAgICAgIEdldCBweXRob24gdmVyc2lvbiBmcm9tIGEgUHl0aG9uIGJpbmFyeS4gQXMgd2Ug
bmVlZCB0byBkZXRlY3QgaWYKKyAgICAgICAgYXJlIG91dCB0aGVyZSBuZXdlciBweXRob24gYmlu
YXJpZXMsIHdlIGNhbid0IHJlbHkgb24gc3lzLnJlbGVhc2UgaGVyZS4KKyAgICAgICAgIiIiCisK
KyAgICAgICAga3dhcmdzID0ge30KKyAgICAgICAgaWYgc3lzLnZlcnNpb25faW5mbyA8ICgzLCA3
KToKKyAgICAgICAgICAgIGt3YXJnc1sndW5pdmVyc2FsX25ld2xpbmVzJ10gPSBUcnVlCisgICAg
ICAgIGVsc2U6CisgICAgICAgICAgICBrd2FyZ3NbJ3RleHQnXSA9IFRydWUKKworICAgICAgICBy
ZXN1bHQgPSBzdWJwcm9jZXNzLnJ1bihbY21kLCAiLS12ZXJzaW9uIl0sCisgICAgICAgICAgICAg
ICAgICAgICAgICAgICAgICAgIHN0ZG91dCA9IHN1YnByb2Nlc3MuUElQRSwKKyAgICAgICAgICAg
ICAgICAgICAgICAgICAgICAgICAgc3RkZXJyID0gc3VicHJvY2Vzcy5QSVBFLAorICAgICAgICAg
ICAgICAgICAgICAgICAgICAgICAgICAqKmt3YXJncywgY2hlY2s9RmFsc2UpCisKKyAgICAgICAg
dmVyc2lvbiA9IHJlc3VsdC5zdGRvdXQuc3RyaXAoKQorCisgICAgICAgIG1hdGNoID0gcmUuc2Vh
cmNoKHIiKFxkK1wuXGQrXC5cZCspIiwgdmVyc2lvbikKKyAgICAgICAgaWYgbWF0Y2g6CisgICAg
ICAgICAgICByZXR1cm4gUHl0aG9uVmVyc2lvbi5wYXJzZV92ZXJzaW9uKG1hdGNoLmdyb3VwKDEp
KQorCisgICAgICAgIHByaW50KGYiQ2FuJ3QgcGFyc2UgdmVyc2lvbiB7dmVyc2lvbn0iKQorICAg
ICAgICByZXR1cm4gKDAsIDAsIDApCisKKyAgICBAc3RhdGljbWV0aG9kCisgICAgZGVmIGZpbmRf
cHl0aG9uKG1pbl92ZXJzaW9uKToKKyAgICAgICAgIiIiCisgICAgICAgIERldGVjdCBpZiBhcmUg
b3V0IHRoZXJlIGFueSBweXRob24gMy54eSB2ZXJzaW9uIG5ld2VyIHRoYW4gdGhlCisgICAgICAg
IGN1cnJlbnQgb25lLgorCisgICAgICAgIE5vdGU6IHRoaXMgcm91dGluZSBpcyBsaW1pdGVkIHRv
IHVwIHRvIDIgZGlnaXRzIGZvciBweXRob24zLiBXZQorICAgICAgICBtYXkgbmVlZCB0byB1cGRh
dGUgaXQgb25lIGRheSwgaG9wZWZ1bGx5IG9uIGEgZGlzdGFudCBmdXR1cmUuCisgICAgICAgICIi
IgorICAgICAgICBwYXR0ZXJucyA9IFsKKyAgICAgICAgICAgICJweXRob24zLlswLTldIiwKKyAg
ICAgICAgICAgICJweXRob24zLlswLTldWzAtOV0iLAorICAgICAgICBdCisKKyAgICAgICAgIyBT
ZWVrIGZvciBhIHB5dGhvbiBiaW5hcnkgbmV3ZXIgdGhhbiBtaW5fdmVyc2lvbgorICAgICAgICBm
b3IgcGF0aCBpbiBvcy5nZXRlbnYoIlBBVEgiLCAiIikuc3BsaXQoIjoiKToKKyAgICAgICAgICAg
IGZvciBwYXR0ZXJuIGluIHBhdHRlcm5zOgorICAgICAgICAgICAgICAgIGZvciBjbWQgaW4gZ2xv
Yihvcy5wYXRoLmpvaW4ocGF0aCwgcGF0dGVybikpOgorICAgICAgICAgICAgICAgICAgICBpZiBv
cy5wYXRoLmlzZmlsZShjbWQpIGFuZCBvcy5hY2Nlc3MoY21kLCBvcy5YX09LKToKKyAgICAgICAg
ICAgICAgICAgICAgICAgIHZlcnNpb24gPSBQeXRob25WZXJzaW9uLmdldF9weXRob25fdmVyc2lv
bihjbWQpCisgICAgICAgICAgICAgICAgICAgICAgICBpZiB2ZXJzaW9uID49IG1pbl92ZXJzaW9u
OgorICAgICAgICAgICAgICAgICAgICAgICAgICAgIHJldHVybiBjbWQKKworICAgICAgICByZXR1
cm4gTm9uZQorCisgICAgQHN0YXRpY21ldGhvZAorICAgIGRlZiBjaGVja19weXRob24obWluX3Zl
cnNpb24pOgorICAgICAgICAiIiIKKyAgICAgICAgQ2hlY2sgaWYgdGhlIGN1cnJlbnQgcHl0aG9u
IGJpbmFyeSBzYXRpc2ZpZXMgb3VyIG1pbmltYWwgcmVxdWlyZW1lbnQKKyAgICAgICAgZm9yIFNw
aGlueCBidWlsZC4gSWYgbm90LCByZS1ydW4gd2l0aCBhIG5ld2VyIHZlcnNpb24gaWYgZm91bmQu
CisgICAgICAgICIiIgorICAgICAgICBjdXJfdmVyID0gc3lzLnZlcnNpb25faW5mb1s6M10KKyAg
ICAgICAgaWYgY3VyX3ZlciA+PSBtaW5fdmVyc2lvbjoKKyAgICAgICAgICAgIHZlciA9IFB5dGhv
blZlcnNpb24udmVyX3N0cihjdXJfdmVyKQorICAgICAgICAgICAgcHJpbnQoZiJQeXRob24gdmVy
c2lvbjoge3Zlcn0iKQorCisgICAgICAgICAgICByZXR1cm4KKworICAgICAgICBweXRob25fdmVy
ID0gUHl0aG9uVmVyc2lvbi52ZXJfc3RyKGN1cl92ZXIpCisKKyAgICAgICAgbmV3X3B5dGhvbl9j
bWQgPSBQeXRob25WZXJzaW9uLmZpbmRfcHl0aG9uKG1pbl92ZXJzaW9uKQorICAgICAgICBpZiBu
b3QgbmV3X3B5dGhvbl9jbWQ6CisgICAgICAgICAgICBwcmludChmIkVSUk9SOiBQeXRob24gdmVy
c2lvbiB7cHl0aG9uX3Zlcn0gaXMgbm90IHNwcG9ydGVkIGFueW1vcmVcbiIpCisgICAgICAgICAg
ICBwcmludCgiICAgICAgIENhbid0IGZpbmQgYSBuZXcgdmVyc2lvbi4gVGhpcyBzY3JpcHQgbWF5
IGZhaWwiKQorICAgICAgICAgICAgcmV0dXJuCisKKyAgICAgICAgIyBSZXN0YXJ0IHNjcmlwdCB1
c2luZyB0aGUgbmV3ZXIgdmVyc2lvbgorICAgICAgICBzY3JpcHRfcGF0aCA9IG9zLnBhdGguYWJz
cGF0aChzeXMuYXJndlswXSkKKyAgICAgICAgYXJncyA9IFtuZXdfcHl0aG9uX2NtZCwgc2NyaXB0
X3BhdGhdICsgc3lzLmFyZ3ZbMTpdCisKKyAgICAgICAgcHJpbnQoZiJQeXRob24ge3B5dGhvbl92
ZXJ9IG5vdCBzdXBwb3J0ZWQuIENoYW5naW5nIHRvIHtuZXdfcHl0aG9uX2NtZH0iKQorCisgICAg
ICAgIHRyeToKKyAgICAgICAgICAgIG9zLmV4ZWN2KG5ld19weXRob25fY21kLCBhcmdzKQorICAg
ICAgICBleGNlcHQgT1NFcnJvciBhcyBlOgorICAgICAgICAgICAgc3lzLmV4aXQoZiJGYWlsZWQg
dG8gcmVzdGFydCB3aXRoIHtuZXdfcHl0aG9uX2NtZH06IHtlfSIpCmRpZmYgLS1naXQgYS90b29s
cy9kb2NzL3NwaGlueC1wcmUtaW5zdGFsbCBiL3Rvb2xzL2RvY3Mvc3BoaW54LXByZS1pbnN0YWxs
CmluZGV4IDk1NGVkM2RjMDY0NS4uZDZkNjczYjc5NDVjIDEwMDc1NQotLS0gYS90b29scy9kb2Nz
L3NwaGlueC1wcmUtaW5zdGFsbAorKysgYi90b29scy9kb2NzL3NwaGlueC1wcmUtaW5zdGFsbApA
QCAtMzIsMjAgKzMyLDEwIEBAIGltcG9ydCBzdWJwcm9jZXNzCiBpbXBvcnQgc3lzCiBmcm9tIGds
b2IgaW1wb3J0IGdsb2IKIAorZnJvbSBsaWIucHl0aG9uX3ZlcnNpb24gaW1wb3J0IFB5dGhvblZl
cnNpb24KIAotZGVmIHBhcnNlX3ZlcnNpb24odmVyc2lvbik6Ci0gICAgIiIiQ29udmVydCBhIG1h
am9yLm1pbm9yLnBhdGNoIHZlcnNpb24gaW50byBhIHR1cGxlIiIiCi0gICAgcmV0dXJuIHR1cGxl
KGludCh4KSBmb3IgeCBpbiB2ZXJzaW9uLnNwbGl0KCIuIikpCi0KLQotZGVmIHZlcl9zdHIodmVy
c2lvbik6Ci0gICAgIiIiUmV0dXJucyBhIHZlcnNpb24gdHVwbGUgYXMgbWFqb3IubWlub3IucGF0
Y2giIiIKLQotICAgIHJldHVybiAiLiIuam9pbihbc3RyKHgpIGZvciB4IGluIHZlcnNpb25dKQot
Ci0KLVJFQ09NTUVOREVEX1ZFUlNJT04gPSBwYXJzZV92ZXJzaW9uKCIzLjQuMyIpCi1NSU5fUFlU
SE9OX1ZFUlNJT04gPSBwYXJzZV92ZXJzaW9uKCIzLjciKQorUkVDT01NRU5ERURfVkVSU0lPTiA9
IFB5dGhvblZlcnNpb24oIjMuNC4zIikudmVyc2lvbgorTUlOX1BZVEhPTl9WRVJTSU9OID0gUHl0
aG9uVmVyc2lvbigiMy43IikudmVyc2lvbgogCiAKIGNsYXNzIERlcE1hbmFnZXI6CkBAIC0yMzUs
OTUgKzIyNSwxMSBAQCBjbGFzcyBBbmNpbGxhcnlNZXRob2RzOgogCiAgICAgICAgIHJldHVybiBO
b25lCiAKLSAgICBAc3RhdGljbWV0aG9kCi0gICAgZGVmIGdldF9weXRob25fdmVyc2lvbihjbWQp
OgotICAgICAgICAiIiIKLSAgICAgICAgR2V0IHB5dGhvbiB2ZXJzaW9uIGZyb20gYSBQeXRob24g
YmluYXJ5LiBBcyB3ZSBuZWVkIHRvIGRldGVjdCBpZgotICAgICAgICBhcmUgb3V0IHRoZXJlIG5l
d2VyIHB5dGhvbiBiaW5hcmllcywgd2UgY2FuJ3QgcmVseSBvbiBzeXMucmVsZWFzZSBoZXJlLgot
ICAgICAgICAiIiIKLQotICAgICAgICByZXN1bHQgPSBTcGhpbnhEZXBlbmRlbmN5Q2hlY2tlci5y
dW4oW2NtZCwgIi0tdmVyc2lvbiJdLAotICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAg
ICAgICAgICAgICBjYXB0dXJlX291dHB1dD1UcnVlLCB0ZXh0PVRydWUpCi0gICAgICAgIHZlcnNp
b24gPSByZXN1bHQuc3Rkb3V0LnN0cmlwKCkKLQotICAgICAgICBtYXRjaCA9IHJlLnNlYXJjaChy
IihcZCtcLlxkK1wuXGQrKSIsIHZlcnNpb24pCi0gICAgICAgIGlmIG1hdGNoOgotICAgICAgICAg
ICAgcmV0dXJuIHBhcnNlX3ZlcnNpb24obWF0Y2guZ3JvdXAoMSkpCi0KLSAgICAgICAgcHJpbnQo
ZiJDYW4ndCBwYXJzZSB2ZXJzaW9uIHt2ZXJzaW9ufSIpCi0gICAgICAgIHJldHVybiAoMCwgMCwg
MCkKLQotICAgIEBzdGF0aWNtZXRob2QKLSAgICBkZWYgZmluZF9weXRob24oKToKLSAgICAgICAg
IiIiCi0gICAgICAgIERldGVjdCBpZiBhcmUgb3V0IHRoZXJlIGFueSBweXRob24gMy54eSB2ZXJz
aW9uIG5ld2VyIHRoYW4gdGhlCi0gICAgICAgIGN1cnJlbnQgb25lLgotCi0gICAgICAgIE5vdGU6
IHRoaXMgcm91dGluZSBpcyBsaW1pdGVkIHRvIHVwIHRvIDIgZGlnaXRzIGZvciBweXRob24zLiBX
ZQotICAgICAgICBtYXkgbmVlZCB0byB1cGRhdGUgaXQgb25lIGRheSwgaG9wZWZ1bGx5IG9uIGEg
ZGlzdGFudCBmdXR1cmUuCi0gICAgICAgICIiIgotICAgICAgICBwYXR0ZXJucyA9IFsKLSAgICAg
ICAgICAgICJweXRob24zLlswLTldIiwKLSAgICAgICAgICAgICJweXRob24zLlswLTldWzAtOV0i
LAotICAgICAgICBdCi0KLSAgICAgICAgIyBTZWVrIGZvciBhIHB5dGhvbiBiaW5hcnkgbmV3ZXIg
dGhhbiBNSU5fUFlUSE9OX1ZFUlNJT04KLSAgICAgICAgZm9yIHBhdGggaW4gb3MuZ2V0ZW52KCJQ
QVRIIiwgIiIpLnNwbGl0KCI6Iik6Ci0gICAgICAgICAgICBmb3IgcGF0dGVybiBpbiBwYXR0ZXJu
czoKLSAgICAgICAgICAgICAgICBmb3IgY21kIGluIGdsb2Iob3MucGF0aC5qb2luKHBhdGgsIHBh
dHRlcm4pKToKLSAgICAgICAgICAgICAgICAgICAgaWYgb3MucGF0aC5pc2ZpbGUoY21kKSBhbmQg
b3MuYWNjZXNzKGNtZCwgb3MuWF9PSyk6Ci0gICAgICAgICAgICAgICAgICAgICAgICB2ZXJzaW9u
ID0gU3BoaW54RGVwZW5kZW5jeUNoZWNrZXIuZ2V0X3B5dGhvbl92ZXJzaW9uKGNtZCkKLSAgICAg
ICAgICAgICAgICAgICAgICAgIGlmIHZlcnNpb24gPj0gTUlOX1BZVEhPTl9WRVJTSU9OOgotICAg
ICAgICAgICAgICAgICAgICAgICAgICAgIHJldHVybiBjbWQKLQotICAgIEBzdGF0aWNtZXRob2QK
LSAgICBkZWYgY2hlY2tfcHl0aG9uKCk6Ci0gICAgICAgICIiIgotICAgICAgICBDaGVjayBpZiB0
aGUgY3VycmVudCBweXRob24gYmluYXJ5IHNhdGlzZmllcyBvdXIgbWluaW1hbCByZXF1aXJlbWVu
dAotICAgICAgICBmb3IgU3BoaW54IGJ1aWxkLiBJZiBub3QsIHJlLXJ1biB3aXRoIGEgbmV3ZXIg
dmVyc2lvbiBpZiBmb3VuZC4KLSAgICAgICAgIiIiCi0gICAgICAgIGN1cl92ZXIgPSBzeXMudmVy
c2lvbl9pbmZvWzozXQotICAgICAgICBpZiBjdXJfdmVyID49IE1JTl9QWVRIT05fVkVSU0lPTjoK
LSAgICAgICAgICAgIHZlciA9IHZlcl9zdHIoY3VyX3ZlcikKLSAgICAgICAgICAgIHByaW50KGYi
UHl0aG9uIHZlcnNpb246IHt2ZXJ9IikKLQotICAgICAgICAgICAgIyBUaGlzIGNvdWxkIGJlIHVz
ZWZ1bCBmb3IgZGVidWdnaW5nIHB1cnBvc2VzCi0gICAgICAgICAgICBpZiBTcGhpbnhEZXBlbmRl
bmN5Q2hlY2tlci53aGljaCgiZG9jdXRpbHMiKToKLSAgICAgICAgICAgICAgICByZXN1bHQgPSBT
cGhpbnhEZXBlbmRlbmN5Q2hlY2tlci5ydW4oWyJkb2N1dGlscyIsICItLXZlcnNpb24iXSwKLSAg
ICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICBjYXB0dXJl
X291dHB1dD1UcnVlLCB0ZXh0PVRydWUpCi0gICAgICAgICAgICAgICAgdmVyID0gcmVzdWx0LnN0
ZG91dC5zdHJpcCgpCi0gICAgICAgICAgICAgICAgbWF0Y2ggPSByZS5zZWFyY2gociIoXGQrXC5c
ZCtcLlxkKykiLCB2ZXIpCi0gICAgICAgICAgICAgICAgaWYgbWF0Y2g6Ci0gICAgICAgICAgICAg
ICAgICAgIHZlciA9IG1hdGNoLmdyb3VwKDEpCi0KLSAgICAgICAgICAgICAgICBwcmludChmIkRv
Y3V0aWxzIHZlcnNpb246IHt2ZXJ9IikKLQotICAgICAgICAgICAgcmV0dXJuCi0KLSAgICAgICAg
cHl0aG9uX3ZlciA9IHZlcl9zdHIoY3VyX3ZlcikKLQotICAgICAgICBuZXdfcHl0aG9uX2NtZCA9
IFNwaGlueERlcGVuZGVuY3lDaGVja2VyLmZpbmRfcHl0aG9uKCkKLSAgICAgICAgaWYgbm90IG5l
d19weXRob25fY21kOgotICAgICAgICAgICAgcHJpbnQoZiJFUlJPUjogUHl0aG9uIHZlcnNpb24g
e3B5dGhvbl92ZXJ9IGlzIG5vdCBzcHBvcnRlZCBhbnltb3JlXG4iKQotICAgICAgICAgICAgcHJp
bnQoIiAgICAgICBDYW4ndCBmaW5kIGEgbmV3IHZlcnNpb24uIFRoaXMgc2NyaXB0IG1heSBmYWls
IikKLSAgICAgICAgICAgIHJldHVybgotCi0gICAgICAgICMgUmVzdGFydCBzY3JpcHQgdXNpbmcg
dGhlIG5ld2VyIHZlcnNpb24KLSAgICAgICAgc2NyaXB0X3BhdGggPSBvcy5wYXRoLmFic3BhdGgo
c3lzLmFyZ3ZbMF0pCi0gICAgICAgIGFyZ3MgPSBbbmV3X3B5dGhvbl9jbWQsIHNjcmlwdF9wYXRo
XSArIHN5cy5hcmd2WzE6XQotCi0gICAgICAgIHByaW50KGYiUHl0aG9uIHtweXRob25fdmVyfSBu
b3Qgc3VwcG9ydGVkLiBDaGFuZ2luZyB0byB7bmV3X3B5dGhvbl9jbWR9IikKLQotICAgICAgICB0
cnk6Ci0gICAgICAgICAgICBvcy5leGVjdihuZXdfcHl0aG9uX2NtZCwgYXJncykKLSAgICAgICAg
ZXhjZXB0IE9TRXJyb3IgYXMgZToKLSAgICAgICAgICAgIHN5cy5leGl0KGYiRmFpbGVkIHRvIHJl
c3RhcnQgd2l0aCB7bmV3X3B5dGhvbl9jbWR9OiB7ZX0iKQotCiAgICAgQHN0YXRpY21ldGhvZAog
ICAgIGRlZiBydW4oKmFyZ3MsICoqa3dhcmdzKToKICAgICAgICAgIiIiCiAgICAgICAgIEV4Y2Vj
dXRlIGEgY29tbWFuZCwgaGlkaW5nIGl0cyBvdXRwdXQgYnkgZGVmYXVsdC4KLSAgICAgICAgUHJl
c2VydmUgY29tYXRpYmlsaXR5IHdpdGggb2xkZXIgUHl0aG9uIHZlcnNpb25zLgorICAgICAgICBQ
cmVzZXJ2ZSBjb21wYXRpYmlsaXR5IHdpdGggb2xkZXIgUHl0aG9uIHZlcnNpb25zLgogICAgICAg
ICAiIiIKIAogICAgICAgICBjYXB0dXJlX291dHB1dCA9IGt3YXJncy5wb3AoJ2NhcHR1cmVfb3V0
cHV0JywgRmFsc2UpCkBAIC01MjcsMTEgKzQzMywxMSBAQCBjbGFzcyBNaXNzaW5nQ2hlY2tlcnMo
QW5jaWxsYXJ5TWV0aG9kcyk6CiAgICAgICAgIGZvciBsaW5lIGluIHJlc3VsdC5zdGRvdXQuc3Bs
aXQoIlxuIik6CiAgICAgICAgICAgICBtYXRjaCA9IHJlLm1hdGNoKHIiXnNwaGlueC1idWlsZFxz
KyhbXGRcLl0rKSg/OlwrKD86L1tcZGEtZl0rKXxiXGQrKT9ccyokIiwgbGluZSkKICAgICAgICAg
ICAgIGlmIG1hdGNoOgotICAgICAgICAgICAgICAgIHJldHVybiBwYXJzZV92ZXJzaW9uKG1hdGNo
Lmdyb3VwKDEpKQorICAgICAgICAgICAgICAgIHJldHVybiBQeXRob25WZXJzaW9uLnBhcnNlX3Zl
cnNpb24obWF0Y2guZ3JvdXAoMSkpCiAKICAgICAgICAgICAgIG1hdGNoID0gcmUubWF0Y2gociJe
U3BoaW54LipccysoW1xkXC5dKylccyokIiwgbGluZSkKICAgICAgICAgICAgIGlmIG1hdGNoOgot
ICAgICAgICAgICAgICAgIHJldHVybiBwYXJzZV92ZXJzaW9uKG1hdGNoLmdyb3VwKDEpKQorICAg
ICAgICAgICAgICAgIHJldHVybiBQeXRob25WZXJzaW9uLnBhcnNlX3ZlcnNpb24obWF0Y2guZ3Jv
dXAoMSkpCiAKICAgICBkZWYgY2hlY2tfc3BoaW54KHNlbGYsIGNvbmYpOgogICAgICAgICAiIiIK
QEAgLTU0Miw3ICs0NDgsNyBAQCBjbGFzcyBNaXNzaW5nQ2hlY2tlcnMoQW5jaWxsYXJ5TWV0aG9k
cyk6CiAgICAgICAgICAgICAgICAgZm9yIGxpbmUgaW4gZjoKICAgICAgICAgICAgICAgICAgICAg
bWF0Y2ggPSByZS5tYXRjaChyIl5ccypuZWVkc19zcGhpbnhccyo9XHMqW1wnXCJdKFtcZFwuXSsp
W1wnXCJdIiwgbGluZSkKICAgICAgICAgICAgICAgICAgICAgaWYgbWF0Y2g6Ci0gICAgICAgICAg
ICAgICAgICAgICAgICBzZWxmLm1pbl92ZXJzaW9uID0gcGFyc2VfdmVyc2lvbihtYXRjaC5ncm91
cCgxKSkKKyAgICAgICAgICAgICAgICAgICAgICAgIHNlbGYubWluX3ZlcnNpb24gPSBQeXRob25W
ZXJzaW9uLnBhcnNlX3ZlcnNpb24obWF0Y2guZ3JvdXAoMSkpCiAgICAgICAgICAgICAgICAgICAg
ICAgICBicmVhawogICAgICAgICBleGNlcHQgSU9FcnJvcjoKICAgICAgICAgICAgIHN5cy5leGl0
KGYiQ2FuJ3Qgb3BlbiB7Y29uZn0iKQpAQCAtNTYyLDggKzQ2OCw4IEBAIGNsYXNzIE1pc3NpbmdD
aGVja2VycyhBbmNpbGxhcnlNZXRob2RzKToKICAgICAgICAgICAgIHN5cy5leGl0KGYie3NwaGlu
eH0gZGlkbid0IHJldHVybiBpdHMgdmVyc2lvbiIpCiAKICAgICAgICAgaWYgc2VsZi5jdXJfdmVy
c2lvbiA8IHNlbGYubWluX3ZlcnNpb246Ci0gICAgICAgICAgICBjdXJ2ZXIgPSB2ZXJfc3RyKHNl
bGYuY3VyX3ZlcnNpb24pCi0gICAgICAgICAgICBtaW52ZXIgPSB2ZXJfc3RyKHNlbGYubWluX3Zl
cnNpb24pCisgICAgICAgICAgICBjdXJ2ZXIgPSBQeXRob25WZXJzaW9uLnZlcl9zdHIoc2VsZi5j
dXJfdmVyc2lvbikKKyAgICAgICAgICAgIG1pbnZlciA9IFB5dGhvblZlcnNpb24udmVyX3N0cihz
ZWxmLm1pbl92ZXJzaW9uKQogCiAgICAgICAgICAgICBwcmludChmIkVSUk9SOiBTcGhpbnggdmVy
c2lvbiBpcyB7Y3VydmVyfS4gSXQgc2hvdWxkIGJlID49IHttaW52ZXJ9IikKICAgICAgICAgICAg
IHNlbGYubmVlZF9zcGhpbnggPSAxCkBAIC0xMzA0LDcgKzEyMTAsNyBAQCBjbGFzcyBTcGhpbnhE
ZXBlbmRlbmN5Q2hlY2tlcihNaXNzaW5nQ2hlY2tlcnMpOgogICAgICAgICAgICAgZWxzZToKICAg
ICAgICAgICAgICAgICBpZiBzZWxmLm5lZWRfc3BoaW54IGFuZCB2ZXIgPj0gc2VsZi5taW5fdmVy
c2lvbjoKICAgICAgICAgICAgICAgICAgICAgcmV0dXJuIChmLCB2ZXIpCi0gICAgICAgICAgICAg
ICAgZWxpZiBwYXJzZV92ZXJzaW9uKHZlcikgPiBzZWxmLmN1cl92ZXJzaW9uOgorICAgICAgICAg
ICAgICAgIGVsaWYgUHl0aG9uVmVyc2lvbi5wYXJzZV92ZXJzaW9uKHZlcikgPiBzZWxmLmN1cl92
ZXJzaW9uOgogICAgICAgICAgICAgICAgICAgICByZXR1cm4gKGYsIHZlcikKIAogICAgICAgICBy
ZXR1cm4gKCIiLCB2ZXIpCkBAIC0xNDExLDcgKzEzMTcsNyBAQCBjbGFzcyBTcGhpbnhEZXBlbmRl
bmN5Q2hlY2tlcihNaXNzaW5nQ2hlY2tlcnMpOgogICAgICAgICAgICAgcmV0dXJuCiAKICAgICAg
ICAgaWYgc2VsZi5sYXRlc3RfYXZhaWxfdmVyOgotICAgICAgICAgICAgbGF0ZXN0X2F2YWlsX3Zl
ciA9IHZlcl9zdHIoc2VsZi5sYXRlc3RfYXZhaWxfdmVyKQorICAgICAgICAgICAgbGF0ZXN0X2F2
YWlsX3ZlciA9IFB5dGhvblZlcnNpb24udmVyX3N0cihzZWxmLmxhdGVzdF9hdmFpbF92ZXIpCiAK
ICAgICAgICAgaWYgbm90IHNlbGYubmVlZF9zcGhpbng6CiAgICAgICAgICAgICAjIHNwaGlueC1i
dWlsZCBpcyBwcmVzZW50IGFuZCBpdHMgdmVyc2lvbiBpcyA+PSAkbWluX3ZlcnNpb24KQEAgLTE1
MDcsNyArMTQxMyw3IEBAIGNsYXNzIFNwaGlueERlcGVuZGVuY3lDaGVja2VyKE1pc3NpbmdDaGVj
a2Vycyk6CiAgICAgICAgIGVsc2U6CiAgICAgICAgICAgICBwcmludCgiVW5rbm93biBPUyIpCiAg
ICAgICAgIGlmIHNlbGYuY3VyX3ZlcnNpb24gIT0gKDAsIDAsIDApOgotICAgICAgICAgICAgdmVy
ID0gdmVyX3N0cihzZWxmLmN1cl92ZXJzaW9uKQorICAgICAgICAgICAgdmVyID0gUHl0aG9uVmVy
c2lvbi52ZXJfc3RyKHNlbGYuY3VyX3ZlcnNpb24pCiAgICAgICAgICAgICBwcmludChmIlNwaGlu
eCB2ZXJzaW9uOiB7dmVyfVxuIikKIAogICAgICAgICAjIENoZWNrIHRoZSB0eXBlIG9mIHZpcnR1
YWwgZW52LCBkZXBlbmRpbmcgb24gUHl0aG9uIHZlcnNpb24KQEAgLTE2MTMsNyArMTUxOSw3IEBA
IGRlZiBtYWluKCk6CiAKICAgICBjaGVja2VyID0gU3BoaW54RGVwZW5kZW5jeUNoZWNrZXIoYXJn
cykKIAotICAgIGNoZWNrZXIuY2hlY2tfcHl0aG9uKCkKKyAgICBQeXRob25WZXJzaW9uLmNoZWNr
X3B5dGhvbihNSU5fUFlUSE9OX1ZFUlNJT04pCiAgICAgY2hlY2tlci5jaGVja19uZWVkcygpCiAK
ICMgQ2FsbCBtYWluIGlmIG5vdCB1c2VkIGFzIG1vZHVsZQotLSAKMi41MS4wCgo=


* [PATCH v3 06/15] tools/docs: sphinx-build-wrapper: add a wrapper for sphinx-build
  2025-09-01 14:42 [PATCH v3 00/15] Split sphinx call logic from docs Makefile Mauro Carvalho Chehab
                   ` (4 preceding siblings ...)
  2025-09-01 14:42 ` [PATCH v3 05/15] tools/docs: sphinx-pre-install: move Python version handling to lib Mauro Carvalho Chehab
@ 2025-09-01 14:42 ` Mauro Carvalho Chehab
  2025-09-01 14:42 ` [PATCH v3 07/15] tools/docs: sphinx-build-wrapper: add comments and blank lines Mauro Carvalho Chehab
                   ` (8 subsequent siblings)
  14 siblings, 0 replies; 16+ messages in thread
From: Mauro Carvalho Chehab @ 2025-09-01 14:42 UTC (permalink / raw)
  To: Jonathan Corbet, Linux Doc Mailing List
  Cc: Mauro Carvalho Chehab, Björn Roy Baron, Alex Gaynor,
	Alice Ryhl, Andreas Hindborg, Benno Lossin, Boqun Feng,
	Danilo Krummrich, Gary Guo, Mauro Carvalho Chehab, Miguel Ojeda,
	Trevor Gross, linux-kernel, rust-for-linux

There is too much magic inside the docs Makefile to properly run
sphinx-build. Create an ancillary script that contains all the
kernel-related sphinx-build call logic currently in the Makefile.

The script is designed to work both as a standalone command and
as part of a Makefile. As such, it properly handles the POSIX
jobserver used by GNU make.

On a side note, there was an increase in line count due to the
conversion:

$ git show|diffstat -p1
 Documentation/Makefile          |  129 +++-------------
 tools/docs/sphinx-build-wrapper |  288 +++++++++++++++++++++++++++++++++++++
 2 files changed, 316 insertions(+), 101 deletions(-)

This is because some things are more verbose in Python and because
it requires reading env vars from the Makefile. Besides that, this
script has some extra features that don't exist in the Makefile:

- It can be called directly from command line;
- It properly returns PDF build errors.

When running the script alone, it will only handle sphinx-build
targets. In other words, it won't run make rustdoc after building
the HTML files, nor will it run the extra check scripts.
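
For example, a standalone call would look something like:

	./tools/docs/sphinx-build-wrapper htmldocs --sphinxdirs="process"

accepting the same options the Makefile passes (--sphinxdirs, --conf,
--theme, --css and --paper).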

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 Documentation/Makefile          | 129 ++++----------
 tools/docs/sphinx-build-wrapper | 288 ++++++++++++++++++++++++++++++++
 2 files changed, 316 insertions(+), 101 deletions(-)
 create mode 100755 tools/docs/sphinx-build-wrapper

diff --git a/Documentation/Makefile b/Documentation/Makefile
index deb2029228ed..2b0ed8cd5ea8 100644
--- a/Documentation/Makefile
+++ b/Documentation/Makefile
@@ -23,21 +23,22 @@ SPHINXOPTS    =
 SPHINXDIRS    = .
 DOCS_THEME    =
 DOCS_CSS      =
-_SPHINXDIRS   = $(sort $(patsubst $(srctree)/Documentation/%/index.rst,%,$(wildcard $(srctree)/Documentation/*/index.rst)))
 SPHINX_CONF   = conf.py
 PAPER         =
 BUILDDIR      = $(obj)/output
 PDFLATEX      = xelatex
 LATEXOPTS     = -interaction=batchmode -no-shell-escape
 
+PYTHONPYCACHEPREFIX ?= $(abspath $(BUILDDIR)/__pycache__)
+
+# Wrapper for sphinx-build
+
+BUILD_WRAPPER = $(srctree)/tools/docs/sphinx-build-wrapper
+
 # For denylisting "variable font" files
 # Can be overridden by setting as an env variable
 FONTS_CONF_DENY_VF ?= $(HOME)/deny-vf
 
-ifeq ($(findstring 1, $(KBUILD_VERBOSE)),)
-SPHINXOPTS    += "-q"
-endif
-
 # User-friendly check for sphinx-build
 HAVE_SPHINX := $(shell if which $(SPHINXBUILD) >/dev/null 2>&1; then echo 1; else echo 0; fi)
 
@@ -51,63 +52,29 @@ ifeq ($(HAVE_SPHINX),0)
 
 else # HAVE_SPHINX
 
-# User-friendly check for pdflatex and latexmk
-HAVE_PDFLATEX := $(shell if which $(PDFLATEX) >/dev/null 2>&1; then echo 1; else echo 0; fi)
-HAVE_LATEXMK := $(shell if which latexmk >/dev/null 2>&1; then echo 1; else echo 0; fi)
+# Common documentation targets
+infodocs texinfodocs latexdocs epubdocs xmldocs pdfdocs linkcheckdocs:
+	$(Q)@$(srctree)/tools/docs/sphinx-pre-install --version-check
+	+$(Q)$(PYTHON3) $(BUILD_WRAPPER) $@ \
+		--sphinxdirs="$(SPHINXDIRS)" --conf=$(SPHINX_CONF) \
+		--theme=$(DOCS_THEME) --css=$(DOCS_CSS) --paper=$(PAPER)
 
-ifeq ($(HAVE_LATEXMK),1)
-	PDFLATEX := latexmk -$(PDFLATEX)
-endif #HAVE_LATEXMK
-
-# Internal variables.
-PAPEROPT_a4     = -D latex_elements.papersize=a4paper
-PAPEROPT_letter = -D latex_elements.papersize=letterpaper
-ALLSPHINXOPTS   = -D kerneldoc_srctree=$(srctree) -D kerneldoc_bin=$(KERNELDOC)
-ALLSPHINXOPTS   += $(PAPEROPT_$(PAPER)) $(SPHINXOPTS)
-ifneq ($(wildcard $(srctree)/.config),)
-ifeq ($(CONFIG_RUST),y)
-	# Let Sphinx know we will include rustdoc
-	ALLSPHINXOPTS   +=  -t rustdoc
-endif
+# Special handling for pdfdocs
+ifeq ($(shell which $(PDFLATEX) >/dev/null 2>&1; echo $$?),0)
+pdfdocs: DENY_VF = XDG_CONFIG_HOME=$(FONTS_CONF_DENY_VF)
+else
+pdfdocs:
+	$(warning The '$(PDFLATEX)' command was not found. Make sure you have it installed and in PATH to produce PDF output.)
+	@echo "  SKIP    Sphinx $@ target."
 endif
-# the i18n builder cannot share the environment and doctrees with the others
-I18NSPHINXOPTS  = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
-
-# commands; the 'cmd' from scripts/Kbuild.include is not *loopable*
-loop_cmd = $(echo-cmd) $(cmd_$(1)) || exit;
-
-# $2 sphinx builder e.g. "html"
-# $3 name of the build subfolder / e.g. "userspace-api/media", used as:
-#    * dest folder relative to $(BUILDDIR) and
-#    * cache folder relative to $(BUILDDIR)/.doctrees
-# $4 dest subfolder e.g. "man" for man pages at userspace-api/media/man
-# $5 reST source folder relative to $(src),
-#    e.g. "userspace-api/media" for the linux-tv book-set at ./Documentation/userspace-api/media
-
-PYTHONPYCACHEPREFIX ?= $(abspath $(BUILDDIR)/__pycache__)
-
-quiet_cmd_sphinx = SPHINX  $@ --> file://$(abspath $(BUILDDIR)/$3/$4)
-      cmd_sphinx = \
-	PYTHONPYCACHEPREFIX="$(PYTHONPYCACHEPREFIX)" \
-	BUILDDIR=$(abspath $(BUILDDIR)) SPHINX_CONF=$(abspath $(src)/$5/$(SPHINX_CONF)) \
-	$(PYTHON3) $(srctree)/scripts/jobserver-exec \
-	$(CONFIG_SHELL) $(srctree)/Documentation/sphinx/parallel-wrapper.sh \
-	$(SPHINXBUILD) \
-	-b $2 \
-	-c $(abspath $(src)) \
-	-d $(abspath $(BUILDDIR)/.doctrees/$3) \
-	-D version=$(KERNELVERSION) -D release=$(KERNELRELEASE) \
-	$(ALLSPHINXOPTS) \
-	$(abspath $(src)/$5) \
-	$(abspath $(BUILDDIR)/$3/$4) && \
-	if [ "x$(DOCS_CSS)" != "x" ]; then \
-		cp $(if $(patsubst /%,,$(DOCS_CSS)),$(abspath $(srctree)/$(DOCS_CSS)),$(DOCS_CSS)) $(BUILDDIR)/$3/_static/; \
-	fi
 
+# HTML main logic is identical to other targets. However, if rust is enabled,
+# an extra step at the end is required to generate rustdoc.
 htmldocs:
-	@$(srctree)/tools/docs/sphinx-pre-install --version-check
-	@+$(foreach var,$(SPHINXDIRS),$(call loop_cmd,sphinx,html,$(var),,$(var)))
-
+	$(Q)@$(srctree)/tools/docs/sphinx-pre-install --version-check
+	+$(Q)$(PYTHON3) $(BUILD_WRAPPER) $@ \
+		--sphinxdirs="$(SPHINXDIRS)" --conf=$(SPHINX_CONF) \
+		--theme=$(DOCS_THEME) --css=$(DOCS_CSS) --paper=$(PAPER)
 # If Rust support is available and .config exists, add rustdoc generated contents.
 # If there are any, the errors from this make rustdoc will be displayed but
 # won't stop the execution of htmldocs
@@ -118,49 +85,6 @@ ifeq ($(CONFIG_RUST),y)
 endif
 endif
 
-texinfodocs:
-	@$(srctree)/tools/docs/sphinx-pre-install --version-check
-	@+$(foreach var,$(SPHINXDIRS),$(call loop_cmd,sphinx,texinfo,$(var),texinfo,$(var)))
-
-# Note: the 'info' Make target is generated by sphinx itself when
-# running the texinfodocs target define above.
-infodocs: texinfodocs
-	$(MAKE) -C $(BUILDDIR)/texinfo info
-
-linkcheckdocs:
-	@$(foreach var,$(SPHINXDIRS),$(call loop_cmd,sphinx,linkcheck,$(var),,$(var)))
-
-latexdocs:
-	@$(srctree)/tools/docs/sphinx-pre-install --version-check
-	@+$(foreach var,$(SPHINXDIRS),$(call loop_cmd,sphinx,latex,$(var),latex,$(var)))
-
-ifeq ($(HAVE_PDFLATEX),0)
-
-pdfdocs:
-	$(warning The '$(PDFLATEX)' command was not found. Make sure you have it installed and in PATH to produce PDF output.)
-	@echo "  SKIP    Sphinx $@ target."
-
-else # HAVE_PDFLATEX
-
-pdfdocs: DENY_VF = XDG_CONFIG_HOME=$(FONTS_CONF_DENY_VF)
-pdfdocs: latexdocs
-	@$(srctree)/tools/docs/sphinx-pre-install --version-check
-	$(foreach var,$(SPHINXDIRS), \
-	   $(MAKE) PDFLATEX="$(PDFLATEX)" LATEXOPTS="$(LATEXOPTS)" $(DENY_VF) -C $(BUILDDIR)/$(var)/latex || sh $(srctree)/scripts/check-variable-fonts.sh || exit; \
-	   mkdir -p $(BUILDDIR)/$(var)/pdf; \
-	   mv $(subst .tex,.pdf,$(wildcard $(BUILDDIR)/$(var)/latex/*.tex)) $(BUILDDIR)/$(var)/pdf/; \
-	)
-
-endif # HAVE_PDFLATEX
-
-epubdocs:
-	@$(srctree)/tools/docs/sphinx-pre-install --version-check
-	@+$(foreach var,$(SPHINXDIRS),$(call loop_cmd,sphinx,epub,$(var),epub,$(var)))
-
-xmldocs:
-	@$(srctree)/tools/docs/sphinx-pre-install --version-check
-	@+$(foreach var,$(SPHINXDIRS),$(call loop_cmd,sphinx,xml,$(var),xml,$(var)))
-
 endif # HAVE_SPHINX
 
 # The following targets are independent of HAVE_SPHINX, and the rules should
@@ -172,6 +96,9 @@ refcheckdocs:
 cleandocs:
 	$(Q)rm -rf $(BUILDDIR)
 
+# Used only on help
+_SPHINXDIRS   = $(sort $(patsubst $(srctree)/Documentation/%/index.rst,%,$(wildcard $(srctree)/Documentation/*/index.rst)))
+
 dochelp:
 	@echo  ' Linux kernel internal documentation in different formats from ReST:'
 	@echo  '  htmldocs        - HTML'
diff --git a/tools/docs/sphinx-build-wrapper b/tools/docs/sphinx-build-wrapper
new file mode 100755
index 000000000000..df469af8a4ef
--- /dev/null
+++ b/tools/docs/sphinx-build-wrapper
@@ -0,0 +1,288 @@
+#!/usr/bin/env python3
+# SPDX-License-Identifier: GPL-2.0
+import argparse
+import os
+import shlex
+import shutil
+import subprocess
+import sys
+from lib.python_version import PythonVersion
+
+LIB_DIR = "../../scripts/lib"
+SRC_DIR = os.path.dirname(os.path.realpath(__file__))
+sys.path.insert(0, os.path.join(SRC_DIR, LIB_DIR))
+
+from jobserver import JobserverExec
+
+MIN_PYTHON_VERSION = PythonVersion("3.7").version
+PAPER = ["", "a4", "letter"]
+TARGETS = {
+    "cleandocs":     { "builder": "clean" },
+    "linkcheckdocs": { "builder": "linkcheck" },
+    "htmldocs":      { "builder": "html" },
+    "epubdocs":      { "builder": "epub",    "out_dir": "epub" },
+    "texinfodocs":   { "builder": "texinfo", "out_dir": "texinfo" },
+    "infodocs":      { "builder": "texinfo", "out_dir": "texinfo" },
+    "latexdocs":     { "builder": "latex",   "out_dir": "latex" },
+    "pdfdocs":       { "builder": "latex",   "out_dir": "latex" },
+    "xmldocs":       { "builder": "xml",     "out_dir": "xml" },
+}
+
+class SphinxBuilder:
+    def is_rust_enabled(self):
+        config_path = os.path.join(self.srctree, ".config")
+        if os.path.isfile(config_path):
+            with open(config_path, "r", encoding="utf-8") as f:
+                return "CONFIG_RUST=y" in f.read()
+        return False
+
+    def get_path(self, path, abs_path=False):
+        path = os.path.expanduser(path)
+        if not path.startswith("/"):
+            path = os.path.join(self.srctree, path)
+        if abs_path:
+            return os.path.abspath(path)
+        return path
+
+    def __init__(self, verbose=False, n_jobs=None):
+        self.verbose = None
+        self.kernelversion = os.environ.get("KERNELVERSION", "unknown")
+        self.kernelrelease = os.environ.get("KERNELRELEASE", "unknown")
+        self.pdflatex = os.environ.get("PDFLATEX", "xelatex")
+        self.latexopts = os.environ.get("LATEXOPTS", "-interaction=batchmode -no-shell-escape")
+        if not verbose:
+            verbose = bool(os.environ.get("KBUILD_VERBOSE", "") != "")
+        if verbose is not None:
+            self.verbose = verbose
+        parser = argparse.ArgumentParser()
+        parser.add_argument('-j', '--jobs', type=int)
+        parser.add_argument('-q', '--quiet', type=int)
+        sphinxopts = shlex.split(os.environ.get("SPHINXOPTS", ""))
+        sphinx_args, self.sphinxopts = parser.parse_known_args(sphinxopts)
+        if sphinx_args.quiet is True:
+            self.verbose = False
+        if sphinx_args.jobs:
+            self.n_jobs = sphinx_args.jobs
+        self.n_jobs = n_jobs
+        self.srctree = os.environ.get("srctree")
+        if not self.srctree:
+            self.srctree = "."
+            os.environ["srctree"] = self.srctree
+        self.sphinxbuild = os.environ.get("SPHINXBUILD", "sphinx-build")
+        self.kerneldoc = self.get_path(os.environ.get("KERNELDOC",
+                                                      "scripts/kernel-doc.py"))
+        self.obj = os.environ.get("obj", "Documentation")
+        self.builddir = self.get_path(os.path.join(self.obj, "output"),
+                                      abs_path=True)
+
+        self.config_rust = self.is_rust_enabled()
+
+        self.pdflatex_cmd = shutil.which(self.pdflatex)
+        self.latexmk_cmd = shutil.which("latexmk")
+
+        self.env = os.environ.copy()
+
+    def run_sphinx(self, sphinx_build, build_args, *args, **pwargs):
+        with JobserverExec() as jobserver:
+            if jobserver.claim:
+                n_jobs = str(jobserver.claim)
+            else:
+                n_jobs = "auto" # Supported since Sphinx 1.7
+            cmd = []
+            cmd.append(sys.executable)
+            cmd.append(sphinx_build)
+            if self.n_jobs:
+                n_jobs = str(self.n_jobs)
+
+            if n_jobs:
+                cmd += [f"-j{n_jobs}"]
+
+            if not self.verbose:
+                cmd.append("-q")
+            cmd += self.sphinxopts
+            cmd += build_args
+            if self.verbose:
+                print(" ".join(cmd))
+            return subprocess.call(cmd, *args, **pwargs)
+
+    def handle_html(self, css, output_dir):
+        if not css:
+            return
+        css = os.path.expanduser(css)
+        if not css.startswith("/"):
+            css = os.path.join(self.srctree, css)
+        static_dir = os.path.join(output_dir, "_static")
+        os.makedirs(static_dir, exist_ok=True)
+        try:
+            shutil.copy2(css, static_dir)
+        except (OSError, IOError) as e:
+            print(f"Warning: Failed to copy CSS: {e}", file=sys.stderr)
+
+    def handle_pdf(self, output_dirs):
+        builds = {}
+        max_len = 0
+        for from_dir in output_dirs:
+            pdf_dir = os.path.join(from_dir, "../pdf")
+            os.makedirs(pdf_dir, exist_ok=True)
+            if self.latexmk_cmd:
+                latex_cmd = [self.latexmk_cmd, f"-{self.pdflatex}"]
+            else:
+                latex_cmd = [self.pdflatex]
+            latex_cmd.extend(shlex.split(self.latexopts))
+            tex_suffix = ".tex"
+            has_tex = False
+            build_failed = False
+            with os.scandir(from_dir) as it:
+                for entry in it:
+                    if not entry.name.endswith(tex_suffix):
+                        continue
+                    name = entry.name[:-len(tex_suffix)]
+                    has_tex = True
+                    try:
+                        subprocess.run(latex_cmd + [entry.path],
+                                       cwd=from_dir, check=True)
+                    except subprocess.CalledProcessError:
+                        pass
+                    pdf_name = name + ".pdf"
+                    pdf_from = os.path.join(from_dir, pdf_name)
+                    pdf_to = os.path.join(pdf_dir, pdf_name)
+                    if os.path.exists(pdf_from):
+                        os.rename(pdf_from, pdf_to)
+                        builds[name] = os.path.relpath(pdf_to, self.builddir)
+                    else:
+                        builds[name] = "FAILED"
+                        build_failed = True
+                    name = entry.name.removesuffix(".tex")
+                    max_len = max(max_len, len(name))
+
+            if not has_tex:
+                name = os.path.basename(from_dir)
+                max_len = max(max_len, len(name))
+                builds[name] = "FAILED (no .tex)"
+                build_failed = True
+        msg = "Summary"
+        msg += "\n" + "=" * len(msg)
+        print()
+        print(msg)
+        for pdf_name, pdf_file in builds.items():
+            print(f"{pdf_name:<{max_len}}: {pdf_file}")
+        print()
+        if build_failed:
+            sys.exit("PDF build failed: not all PDF files were created.")
+        else:
+            print("All PDF files were built.")
+
+    def handle_info(self, output_dirs):
+        for output_dir in output_dirs:
+            try:
+                subprocess.run(["make", "info"], cwd=output_dir, check=True)
+            except subprocess.CalledProcessError as e:
+                sys.exit(f"Error generating info docs: {e}")
+
+    def cleandocs(self, builder):
+        shutil.rmtree(self.builddir, ignore_errors=True)
+
+    def build(self, target, sphinxdirs=None, conf="conf.py",
+              theme=None, css=None, paper=None):
+        builder = TARGETS[target]["builder"]
+        out_dir = TARGETS[target].get("out_dir", "")
+        if target == "cleandocs":
+            self.cleandocs(builder)
+            return
+        if theme:
+                os.environ["DOCS_THEME"] = theme
+        sphinxbuild = shutil.which(self.sphinxbuild, path=self.env["PATH"])
+        if not sphinxbuild:
+            sys.exit(f"Error: {self.sphinxbuild} not found in PATH.\n")
+        if builder == "latex":
+            if not self.pdflatex_cmd and not self.latexmk_cmd:
+                sys.exit("Error: pdflatex or latexmk required for PDF generation")
+        docs_dir = os.path.abspath(os.path.join(self.srctree, "Documentation"))
+        kerneldoc = self.kerneldoc
+        if kerneldoc.startswith(self.srctree):
+            kerneldoc = os.path.relpath(kerneldoc, self.srctree)
+        args = [ "-b", builder, "-c", docs_dir ]
+        if builder == "latex":
+            if not paper:
+                paper = PAPER[1]
+            args.extend(["-D", f"latex_elements.papersize={paper}paper"])
+        if self.config_rust:
+            args.extend(["-t", "rustdoc"])
+        if conf:
+            self.env["SPHINX_CONF"] = self.get_path(conf, abs_path=True)
+        if not sphinxdirs:
+            sphinxdirs = os.environ.get("SPHINXDIRS", ".")
+        sphinxdirs_list = []
+        for sphinxdir in sphinxdirs:
+            if isinstance(sphinxdir, list):
+                sphinxdirs_list += sphinxdir
+            else:
+                for name in sphinxdir.split(" "):
+                    sphinxdirs_list.append(name)
+        output_dirs = []
+        for sphinxdir in sphinxdirs_list:
+            src_dir = os.path.join(docs_dir, sphinxdir)
+            doctree_dir = os.path.join(self.builddir, ".doctrees")
+            output_dir = os.path.join(self.builddir, sphinxdir, out_dir)
+            src_dir = os.path.normpath(src_dir)
+            doctree_dir = os.path.normpath(doctree_dir)
+            output_dir = os.path.normpath(output_dir)
+            os.makedirs(doctree_dir, exist_ok=True)
+            os.makedirs(output_dir, exist_ok=True)
+            output_dirs.append(output_dir)
+            build_args = args + [
+                "-d", doctree_dir,
+                "-D", f"kerneldoc_bin={kerneldoc}",
+                "-D", f"version={self.kernelversion}",
+                "-D", f"release={self.kernelrelease}",
+                "-D", f"kerneldoc_srctree={self.srctree}",
+                src_dir,
+                output_dir,
+            ]
+            try:
+                self.run_sphinx(sphinxbuild, build_args, env=self.env)
+            except (OSError, ValueError, subprocess.SubprocessError) as e:
+                sys.exit(f"Build failed: {repr(e)}")
+            if target in ["htmldocs", "epubdocs"]:
+                self.handle_html(css, output_dir)
+        if target == "pdfdocs":
+            self.handle_pdf(output_dirs)
+        elif target == "infodocs":
+            self.handle_info(output_dirs)
+
+def jobs_type(value):
+    if value is None:
+        return None
+    if value.lower() == 'auto':
+        return value.lower()
+    try:
+        if int(value) >= 1:
+            return value
+        raise argparse.ArgumentTypeError(f"Minimum jobs is 1, got {value}")
+    except ValueError:
+        raise argparse.ArgumentTypeError(f"Must be 'auto' or positive integer, got {value}")
+
+def main():
+    parser = argparse.ArgumentParser(description="Kernel documentation builder")
+    parser.add_argument("target", choices=list(TARGETS.keys()),
+                        help="Documentation target to build")
+    parser.add_argument("--sphinxdirs", nargs="+",
+                        help="Specific directories to build")
+    parser.add_argument("--conf", default="conf.py",
+                        help="Sphinx configuration file")
+    parser.add_argument("--theme", help="Sphinx theme to use")
+    parser.add_argument("--css", help="Custom CSS file for HTML/EPUB")
+    parser.add_argument("--paper", choices=PAPER, default=PAPER[0],
+                        help="Paper size for LaTeX/PDF output")
+    parser.add_argument("-v", "--verbose", action='store_true',
+                        help="place build in verbose mode")
+    parser.add_argument('-j', '--jobs', type=jobs_type,
+                        help="Sets number of jobs to use with sphinx-build")
+    args = parser.parse_args()
+    PythonVersion.check_python(MIN_PYTHON_VERSION)
+    builder = SphinxBuilder(verbose=args.verbose, n_jobs=args.jobs)
+    builder.build(args.target, sphinxdirs=args.sphinxdirs, conf=args.conf,
+                  theme=args.theme, css=args.css, paper=args.paper)
+
+if __name__ == "__main__":
+    main()
-- 
2.51.0


^ permalink raw reply related	[flat|nested] 16+ messages in thread

* [PATCH v3 07/15] tools/docs: sphinx-build-wrapper: add comments and blank lines
  2025-09-01 14:42 [PATCH v3 00/15] Split sphinx call logic from docs Makefile Mauro Carvalho Chehab
                   ` (5 preceding siblings ...)
  2025-09-01 14:42 ` [PATCH v3 06/15] tools/docs: sphinx-build-wrapper: add a wrapper for sphinx-build Mauro Carvalho Chehab
@ 2025-09-01 14:42 ` Mauro Carvalho Chehab
  2025-09-01 14:42 ` [PATCH v3 08/15] tools/docs: sphinx-build-wrapper: add support to run inside venv Mauro Carvalho Chehab
                   ` (7 subsequent siblings)
  14 siblings, 0 replies; 16+ messages in thread
From: Mauro Carvalho Chehab @ 2025-09-01 14:42 UTC (permalink / raw)
  To: Jonathan Corbet, Linux Doc Mailing List
  Cc: Mauro Carvalho Chehab, Björn Roy Baron, Alex Gaynor,
	Alice Ryhl, Andreas Hindborg, Benno Lossin, Boqun Feng,
	Danilo Krummrich, Gary Guo, Mauro Carvalho Chehab, Miguel Ojeda,
	Trevor Gross, linux-kernel, rust-for-linux

To help see the actual size of the script when it was added,
I opted to strip out all comments from the original script.

Add them here:

 tools/docs/sphinx-build-wrapper | 261 +++++++++++++++++++++++++++++++-
 1 file changed, 257 insertions(+), 4 deletions(-)

As the script itself has 288 lines of code, this means that
about half of the script is comments.

Also ensure pylint won't report any warnings.

No functional changes.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 tools/docs/sphinx-build-wrapper | 261 +++++++++++++++++++++++++++++++-
 1 file changed, 257 insertions(+), 4 deletions(-)

diff --git a/tools/docs/sphinx-build-wrapper b/tools/docs/sphinx-build-wrapper
index df469af8a4ef..e9c522794fbe 100755
--- a/tools/docs/sphinx-build-wrapper
+++ b/tools/docs/sphinx-build-wrapper
@@ -1,21 +1,71 @@
 #!/usr/bin/env python3
 # SPDX-License-Identifier: GPL-2.0
+# Copyright (C) 2025 Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
+#
+# pylint: disable=R0902, R0912, R0913, R0914, R0915, R0917, C0103
+#
+# Converted from docs Makefile and parallel-wrapper.sh, both under
+# GPLv2, copyrighted since 2008 by the following authors:
+#
+#    Akira Yokosawa <akiyks@gmail.com>
+#    Arnd Bergmann <arnd@arndb.de>
+#    Breno Leitao <leitao@debian.org>
+#    Carlos Bilbao <carlos.bilbao@amd.com>
+#    Dave Young <dyoung@redhat.com>
+#    Donald Hunter <donald.hunter@gmail.com>
+#    Geert Uytterhoeven <geert+renesas@glider.be>
+#    Jani Nikula <jani.nikula@intel.com>
+#    Jan Stancek <jstancek@redhat.com>
+#    Jonathan Corbet <corbet@lwn.net>
+#    Joshua Clayton <stillcompiling@gmail.com>
+#    Kees Cook <keescook@chromium.org>
+#    Linus Torvalds <torvalds@linux-foundation.org>
+#    Magnus Damm <damm+renesas@opensource.se>
+#    Masahiro Yamada <masahiroy@kernel.org>
+#    Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
+#    Maxim Cournoyer <maxim.cournoyer@gmail.com>
+#    Peter Foley <pefoley2@pefoley.com>
+#    Randy Dunlap <rdunlap@infradead.org>
+#    Rob Herring <robh@kernel.org>
+#    Shuah Khan <shuahkh@osg.samsung.com>
+#    Thorsten Blum <thorsten.blum@toblux.com>
+#    Tomas Winkler <tomas.winkler@intel.com>
+
+
+"""
+Sphinx build wrapper that handles Kernel-specific business rules:
+
+- it gets the Kernel build environment vars;
+- it determines what's the best parallelism;
+- it handles SPHINXDIRS
+
+This tool ensures that MIN_PYTHON_VERSION is satisfied. If the version is
+below that, it looks for a newer Python version. If found, it re-runs using
+the newer version.
+"""
+
 import argparse
 import os
 import shlex
 import shutil
 import subprocess
 import sys
+
 from lib.python_version import PythonVersion
 
 LIB_DIR = "../../scripts/lib"
 SRC_DIR = os.path.dirname(os.path.realpath(__file__))
+
 sys.path.insert(0, os.path.join(SRC_DIR, LIB_DIR))
 
-from jobserver import JobserverExec
+from jobserver import JobserverExec         # pylint: disable=C0413,C0411,E0401
 
+#
+#  Some constants
+#
 MIN_PYTHON_VERSION = PythonVersion("3.7").version
 PAPER = ["", "a4", "letter"]
+
 TARGETS = {
     "cleandocs":     { "builder": "clean" },
     "linkcheckdocs": { "builder": "linkcheck" },
@@ -28,8 +78,19 @@ TARGETS = {
     "xmldocs":       { "builder": "xml",     "out_dir": "xml" },
 }
 
+
+#
+# SphinxBuilder class
+#
+
 class SphinxBuilder:
+    """
+    Handles a sphinx-build target, adding needed arguments to build
+    with the Kernel.
+    """
+
     def is_rust_enabled(self):
+        """Check if rust is enabled at .config"""
         config_path = os.path.join(self.srctree, ".config")
         if os.path.isfile(config_path):
             with open(config_path, "r", encoding="utf-8") as f:
@@ -37,37 +98,83 @@ class SphinxBuilder:
         return False
 
     def get_path(self, path, abs_path=False):
+        """
+        Ancillary routine to handle paths the right way, as a shell does.
+
+        It first expands "~" and "~user". Then, if the path is not absolute,
+        join self.srctree. Finally, if requested, convert to abspath.
+        """
+
         path = os.path.expanduser(path)
         if not path.startswith("/"):
             path = os.path.join(self.srctree, path)
+
         if abs_path:
             return os.path.abspath(path)
+
         return path
 
     def __init__(self, verbose=False, n_jobs=None):
+        """Initialize internal variables"""
         self.verbose = None
+
+        #
+        # Normal variables passed from Kernel's makefile
+        #
         self.kernelversion = os.environ.get("KERNELVERSION", "unknown")
         self.kernelrelease = os.environ.get("KERNELRELEASE", "unknown")
         self.pdflatex = os.environ.get("PDFLATEX", "xelatex")
         self.latexopts = os.environ.get("LATEXOPTS", "-interaction=batchmode -no-shell-escape")
+
         if not verbose:
             verbose = bool(os.environ.get("KBUILD_VERBOSE", "") != "")
+
         if verbose is not None:
             self.verbose = verbose
+
+        #
+        # As we handle the number of jobs and quiet separately, we need to pick
+        # both the same way as sphinx-build would, optionally accepting
+        # whitespace. So let's use argparse to handle argument expansion
+        #
         parser = argparse.ArgumentParser()
         parser.add_argument('-j', '--jobs', type=int)
         parser.add_argument('-q', '--quiet', type=int)
+
+        #
+        # Other sphinx-build arguments go as-is, so place them
+        # at self.sphinxopts, using shell parser
+        #
         sphinxopts = shlex.split(os.environ.get("SPHINXOPTS", ""))
+
+        #
+        # Build a list of sphinx args
+        #
         sphinx_args, self.sphinxopts = parser.parse_known_args(sphinxopts)
         if sphinx_args.quiet is True:
             self.verbose = False
+
         if sphinx_args.jobs:
             self.n_jobs = sphinx_args.jobs
+
+        #
+        # If the command line argument "-j" is used, override SPHINXOPTS
+        #
+
         self.n_jobs = n_jobs
+
+        #
+        # Source tree directory. This needs to be at os.environ, as
+        # Sphinx extensions use it
+        #
         self.srctree = os.environ.get("srctree")
         if not self.srctree:
             self.srctree = "."
             os.environ["srctree"] = self.srctree
+
+        #
+        # Now that we can expand srctree, get other directories as well
+        #
         self.sphinxbuild = os.environ.get("SPHINXBUILD", "sphinx-build")
         self.kerneldoc = self.get_path(os.environ.get("KERNELDOC",
                                                       "scripts/kernel-doc.py"))
@@ -77,20 +184,36 @@ class SphinxBuilder:
 
         self.config_rust = self.is_rust_enabled()
 
+        #
+        # Get directory locations for LaTeX build toolchain
+        #
         self.pdflatex_cmd = shutil.which(self.pdflatex)
         self.latexmk_cmd = shutil.which("latexmk")
 
         self.env = os.environ.copy()
 
     def run_sphinx(self, sphinx_build, build_args, *args, **pwargs):
+        """
+        Executes sphinx-build using current python3 command and setting
+        -j parameter if possible to run the build in parallel.
+        """
+
         with JobserverExec() as jobserver:
             if jobserver.claim:
                 n_jobs = str(jobserver.claim)
             else:
                 n_jobs = "auto" # Supported since Sphinx 1.7
+
             cmd = []
+
             cmd.append(sys.executable)
+
             cmd.append(sphinx_build)
+
+            #
+            # Override auto setting, if explicitly passed from command line
+            # or via SPHINXOPTS
+            #
             if self.n_jobs:
                 n_jobs = str(self.n_jobs)
 
@@ -99,59 +222,100 @@ class SphinxBuilder:
 
             if not self.verbose:
                 cmd.append("-q")
+
             cmd += self.sphinxopts
             cmd += build_args
+
             if self.verbose:
                 print(" ".join(cmd))
             return subprocess.call(cmd, *args, **pwargs)
 
     def handle_html(self, css, output_dir):
+        """
+        Extra steps for HTML and epub output.
+
+        For such targets, we need to ensure that CSS will be properly
+        copied to the output _static directory
+        """
+
         if not css:
             return
+
         css = os.path.expanduser(css)
         if not css.startswith("/"):
             css = os.path.join(self.srctree, css)
+
         static_dir = os.path.join(output_dir, "_static")
         os.makedirs(static_dir, exist_ok=True)
+
         try:
             shutil.copy2(css, static_dir)
         except (OSError, IOError) as e:
             print(f"Warning: Failed to copy CSS: {e}", file=sys.stderr)
 
     def handle_pdf(self, output_dirs):
+        """
+        Extra steps for PDF output.
+
+        As PDF is handled via a LaTeX output, after building the .tex file,
+        a new build is needed to create the PDF output from the latex
+        directory.
+        """
         builds = {}
         max_len = 0
+
         for from_dir in output_dirs:
             pdf_dir = os.path.join(from_dir, "../pdf")
             os.makedirs(pdf_dir, exist_ok=True)
+
             if self.latexmk_cmd:
                 latex_cmd = [self.latexmk_cmd, f"-{self.pdflatex}"]
             else:
                 latex_cmd = [self.pdflatex]
+
             latex_cmd.extend(shlex.split(self.latexopts))
+
             tex_suffix = ".tex"
+
+            #
+            # Process each .tex file
+            #
+
             has_tex = False
             build_failed = False
             with os.scandir(from_dir) as it:
                 for entry in it:
                     if not entry.name.endswith(tex_suffix):
                         continue
+
                     name = entry.name[:-len(tex_suffix)]
                     has_tex = True
+
+                    #
+                    # LaTeX PDF error code is almost useless for us:
+                    # any warning makes it non-zero. For kernel doc builds it
+                    # always return non-zero even when build succeeds.
+                    # So, let's do the best next thing: check if all PDF
+                    # files were built. If they're, print a summary and
+                    # return 0 at the end of this function
+                    #
                     try:
                         subprocess.run(latex_cmd + [entry.path],
                                        cwd=from_dir, check=True)
                     except subprocess.CalledProcessError:
                         pass
+
                     pdf_name = name + ".pdf"
                     pdf_from = os.path.join(from_dir, pdf_name)
                     pdf_to = os.path.join(pdf_dir, pdf_name)
+
                     if os.path.exists(pdf_from):
                         os.rename(pdf_from, pdf_to)
                         builds[name] = os.path.relpath(pdf_to, self.builddir)
                     else:
                         builds[name] = "FAILED"
                         build_failed = True
+
                     name = entry.name.removesuffix(".tex")
                     max_len = max(max_len, len(name))
 
@@ -160,58 +324,100 @@ class SphinxBuilder:
                 max_len = max(max_len, len(name))
                 builds[name] = "FAILED (no .tex)"
                 build_failed = True
+
         msg = "Summary"
         msg += "\n" + "=" * len(msg)
         print()
         print(msg)
+
         for pdf_name, pdf_file in builds.items():
             print(f"{pdf_name:<{max_len}}: {pdf_file}")
+
         print()
+
         if build_failed:
             sys.exit("PDF build failed: not all PDF files were created.")
         else:
             print("All PDF files were built.")
 
     def handle_info(self, output_dirs):
+        """
+        Extra steps for Info output.
+
+        For texinfo generation, an additional make is needed from the
+        texinfo directory.
+        """
+
         for output_dir in output_dirs:
             try:
                 subprocess.run(["make", "info"], cwd=output_dir, check=True)
             except subprocess.CalledProcessError as e:
                 sys.exit(f"Error generating info docs: {e}")
 
-    def cleandocs(self, builder):
+    def cleandocs(self, builder):           # pylint: disable=W0613
+        """Remove documentation output directory"""
         shutil.rmtree(self.builddir, ignore_errors=True)
 
     def build(self, target, sphinxdirs=None, conf="conf.py",
               theme=None, css=None, paper=None):
+        """
+        Build documentation using Sphinx. This is the core function of this
+        module. It prepares all arguments required by sphinx-build.
+        """
+
         builder = TARGETS[target]["builder"]
         out_dir = TARGETS[target].get("out_dir", "")
+
+        #
+        # Cleandocs doesn't require sphinx-build
+        #
         if target == "cleandocs":
             self.cleandocs(builder)
             return
+
         if theme:
-                os.environ["DOCS_THEME"] = theme
+            os.environ["DOCS_THEME"] = theme
+
+        #
+        # Other targets require sphinx-build, so check if it exists
+        #
         sphinxbuild = shutil.which(self.sphinxbuild, path=self.env["PATH"])
         if not sphinxbuild:
             sys.exit(f"Error: {self.sphinxbuild} not found in PATH.\n")
+
         if builder == "latex":
             if not self.pdflatex_cmd and not self.latexmk_cmd:
                 sys.exit("Error: pdflatex or latexmk required for PDF generation")
+
         docs_dir = os.path.abspath(os.path.join(self.srctree, "Documentation"))
+
+        #
+        # Fill in base arguments for Sphinx build
+        #
         kerneldoc = self.kerneldoc
         if kerneldoc.startswith(self.srctree):
             kerneldoc = os.path.relpath(kerneldoc, self.srctree)
+
         args = [ "-b", builder, "-c", docs_dir ]
+
         if builder == "latex":
             if not paper:
                 paper = PAPER[1]
+
             args.extend(["-D", f"latex_elements.papersize={paper}paper"])
+
         if self.config_rust:
             args.extend(["-t", "rustdoc"])
+
         if conf:
             self.env["SPHINX_CONF"] = self.get_path(conf, abs_path=True)
+
         if not sphinxdirs:
             sphinxdirs = os.environ.get("SPHINXDIRS", ".")
+
+        #
+        # sphinxdirs can be a list or a whitespace-separated string
+        #
         sphinxdirs_list = []
         for sphinxdir in sphinxdirs:
             if isinstance(sphinxdir, list):
@@ -219,17 +425,32 @@ class SphinxBuilder:
             else:
                 for name in sphinxdir.split(" "):
                     sphinxdirs_list.append(name)
+
+        #
+        # Step 1:  Build each directory in separate.
+        #
+        # This is not the best way of handling it, as cross-references between
+        # them will be broken, but this is what we've been doing since
+        # the beginning.
+        #
         output_dirs = []
         for sphinxdir in sphinxdirs_list:
             src_dir = os.path.join(docs_dir, sphinxdir)
             doctree_dir = os.path.join(self.builddir, ".doctrees")
             output_dir = os.path.join(self.builddir, sphinxdir, out_dir)
+
+            #
+            # Make directory names canonical
+            #
             src_dir = os.path.normpath(src_dir)
             doctree_dir = os.path.normpath(doctree_dir)
             output_dir = os.path.normpath(output_dir)
+
             os.makedirs(doctree_dir, exist_ok=True)
             os.makedirs(output_dir, exist_ok=True)
+
             output_dirs.append(output_dir)
+
             build_args = args + [
                 "-d", doctree_dir,
                 "-D", f"kerneldoc_bin={kerneldoc}",
@@ -239,48 +460,80 @@ class SphinxBuilder:
                 src_dir,
                 output_dir,
             ]
+
             try:
                 self.run_sphinx(sphinxbuild, build_args, env=self.env)
             except (OSError, ValueError, subprocess.SubprocessError) as e:
                 sys.exit(f"Build failed: {repr(e)}")
+
+            #
+            # Ensure that each html/epub output will have needed static files
+            #
             if target in ["htmldocs", "epubdocs"]:
                 self.handle_html(css, output_dir)
+
+        #
+        # Step 2: Some targets (PDF and info) require an extra step once
+        #         sphinx-build finishes
+        #
         if target == "pdfdocs":
             self.handle_pdf(output_dirs)
         elif target == "infodocs":
             self.handle_info(output_dirs)
 
 def jobs_type(value):
+    """
+    Handle valid values for -j. Accepts Sphinx "-jauto", plus a number
+    equal or bigger than one.
+    """
     if value is None:
         return None
+
     if value.lower() == 'auto':
         return value.lower()
+
     try:
         if int(value) >= 1:
             return value
+
         raise argparse.ArgumentTypeError(f"Minimum jobs is 1, got {value}")
     except ValueError:
-        raise argparse.ArgumentTypeError(f"Must be 'auto' or positive integer, got {value}")
+        raise argparse.ArgumentTypeError(f"Must be 'auto' or positive integer, got {value}")  # pylint: disable=W0707
 
 def main():
+    """
+    Main function. The only mandatory argument is the target. If not
+    specified, the other arguments will use default values if not
+    specified at os.environ.
+    """
     parser = argparse.ArgumentParser(description="Kernel documentation builder")
+
     parser.add_argument("target", choices=list(TARGETS.keys()),
                         help="Documentation target to build")
     parser.add_argument("--sphinxdirs", nargs="+",
                         help="Specific directories to build")
     parser.add_argument("--conf", default="conf.py",
                         help="Sphinx configuration file")
+
     parser.add_argument("--theme", help="Sphinx theme to use")
+
     parser.add_argument("--css", help="Custom CSS file for HTML/EPUB")
+
     parser.add_argument("--paper", choices=PAPER, default=PAPER[0],
                         help="Paper size for LaTeX/PDF output")
+
     parser.add_argument("-v", "--verbose", action='store_true',
                         help="place build in verbose mode")
+
     parser.add_argument('-j', '--jobs', type=jobs_type,
                         help="Sets number of jobs to use with sphinx-build")
+
     args = parser.parse_args()
+
     PythonVersion.check_python(MIN_PYTHON_VERSION)
+
     builder = SphinxBuilder(verbose=args.verbose, n_jobs=args.jobs)
+
     builder.build(args.target, sphinxdirs=args.sphinxdirs, conf=args.conf,
                   theme=args.theme, css=args.css, paper=args.paper)
 
-- 
2.51.0


^ permalink raw reply related	[flat|nested] 16+ messages in thread

* [PATCH v3 08/15] tools/docs: sphinx-build-wrapper: add support to run inside venv
  2025-09-01 14:42 [PATCH v3 00/15] Split sphinx call logic from docs Makefile Mauro Carvalho Chehab
                   ` (6 preceding siblings ...)
  2025-09-01 14:42 ` [PATCH v3 07/15] tools/docs: sphinx-build-wrapper: add comments and blank lines Mauro Carvalho Chehab
@ 2025-09-01 14:42 ` Mauro Carvalho Chehab
  2025-09-01 14:42 ` [PATCH v3 09/15] docs: parallel-wrapper.sh: remove script Mauro Carvalho Chehab
                   ` (6 subsequent siblings)
  14 siblings, 0 replies; 16+ messages in thread
From: Mauro Carvalho Chehab @ 2025-09-01 14:42 UTC (permalink / raw)
  To: Jonathan Corbet, Linux Doc Mailing List
  Cc: Mauro Carvalho Chehab, Mauro Carvalho Chehab, linux-kernel

Sometimes, it is desired to run Sphinx from a virtual environment.
Add a command line parameter to automatically run sphinx-build from
such an environment.
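
For instance (illustrative example; the venv location and the way it
was populated are just assumptions):

	$ python3 -m venv sphinx_latest
	$ ./sphinx_latest/bin/pip install sphinx
	$ ./tools/docs/sphinx-build-wrapper htmldocs --venv sphinx_latest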

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 tools/docs/sphinx-build-wrapper | 31 ++++++++++++++++++++++++++++---
 1 file changed, 28 insertions(+), 3 deletions(-)

diff --git a/tools/docs/sphinx-build-wrapper b/tools/docs/sphinx-build-wrapper
index e9c522794fbe..9c5df50fb99d 100755
--- a/tools/docs/sphinx-build-wrapper
+++ b/tools/docs/sphinx-build-wrapper
@@ -63,6 +63,7 @@ from jobserver import JobserverExec         # pylint: disable=C0413,C0411,E0401
 #
 #  Some constants
 #
+VENV_DEFAULT = "sphinx_latest"
 MIN_PYTHON_VERSION = PythonVersion("3.7").version
 PAPER = ["", "a4", "letter"]
 
@@ -114,8 +115,9 @@ class SphinxBuilder:
 
         return path
 
-    def __init__(self, verbose=False, n_jobs=None):
+    def __init__(self, venv=None, verbose=False, n_jobs=None):
         """Initialize internal variables"""
+        self.venv = venv
         self.verbose = None
 
         #
@@ -192,6 +194,21 @@ class SphinxBuilder:
 
         self.env = os.environ.copy()
 
+        #
+        # If venv command line argument is specified, run Sphinx from venv
+        #
+        if venv:
+            bin_dir = os.path.join(venv, "bin")
+            if not os.path.isfile(os.path.join(bin_dir, "activate")):
+                sys.exit(f"Venv {venv} not found.")
+
+            # "activate" virtual env
+            self.env["PATH"] = bin_dir + ":" + self.env["PATH"]
+            self.env["VIRTUAL_ENV"] = venv
+            if "PYTHONHOME" in self.env:
+                del self.env["PYTHONHOME"]
+            print(f"Setting venv to {venv}")
+
     def run_sphinx(self, sphinx_build, build_args, *args, **pwargs):
         """
         Executes sphinx-build using current python3 command and setting
@@ -206,7 +223,10 @@ class SphinxBuilder:
 
             cmd = []
 
-            cmd.append(sys.executable)
+            if self.venv:
+                cmd.append("python")
+            else:
+                cmd.append(sys.executable)
 
             cmd.append(sphinx_build)
 
@@ -528,11 +548,16 @@ def main():
     parser.add_argument('-j', '--jobs', type=jobs_type,
                         help="Sets number of jobs to use with sphinx-build")
 
+    parser.add_argument("-V", "--venv", nargs='?', const=f'{VENV_DEFAULT}',
+                        default=None,
+                        help=f'If used, run Sphinx from a venv dir (default dir: {VENV_DEFAULT})')
+
     args = parser.parse_args()
 
     PythonVersion.check_python(MIN_PYTHON_VERSION)
 
-    builder = SphinxBuilder(verbose=args.verbose, n_jobs=args.jobs)
+    builder = SphinxBuilder(venv=args.venv, verbose=args.verbose,
+                            n_jobs=args.jobs)
 
     builder.build(args.target, sphinxdirs=args.sphinxdirs, conf=args.conf,
                   theme=args.theme, css=args.css, paper=args.paper)
-- 
2.51.0


^ permalink raw reply related	[flat|nested] 16+ messages in thread

* [PATCH v3 09/15] docs: parallel-wrapper.sh: remove script
  2025-09-01 14:42 [PATCH v3 00/15] Split sphinx call logic from docs Makefile Mauro Carvalho Chehab
                   ` (7 preceding siblings ...)
  2025-09-01 14:42 ` [PATCH v3 08/15] tools/docs: sphinx-build-wrapper: add support to run inside venv Mauro Carvalho Chehab
@ 2025-09-01 14:42 ` Mauro Carvalho Chehab
  2025-09-01 14:42 ` [PATCH v3 10/15] docs: Makefile: document latex/PDF PAPER= parameter Mauro Carvalho Chehab
                   ` (5 subsequent siblings)
  14 siblings, 0 replies; 16+ messages in thread
From: Mauro Carvalho Chehab @ 2025-09-01 14:42 UTC (permalink / raw)
  To: Jonathan Corbet, Linux Doc Mailing List
  Cc: Mauro Carvalho Chehab, Mauro Carvalho Chehab, linux-kernel

The only user of this script was the docs Makefile. Now that
it uses the new sphinx-build-wrapper, which incorporates
the code from parallel-wrapper.sh, we can drop this script.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 Documentation/sphinx/parallel-wrapper.sh | 33 ------------------------
 1 file changed, 33 deletions(-)
 delete mode 100644 Documentation/sphinx/parallel-wrapper.sh

diff --git a/Documentation/sphinx/parallel-wrapper.sh b/Documentation/sphinx/parallel-wrapper.sh
deleted file mode 100644
index e54c44ce117d..000000000000
--- a/Documentation/sphinx/parallel-wrapper.sh
+++ /dev/null
@@ -1,33 +0,0 @@
-#!/bin/sh
-# SPDX-License-Identifier: GPL-2.0+
-#
-# Figure out if we should follow a specific parallelism from the make
-# environment (as exported by scripts/jobserver-exec), or fall back to
-# the "auto" parallelism when "-jN" is not specified at the top-level
-# "make" invocation.
-
-sphinx="$1"
-shift || true
-
-parallel="$PARALLELISM"
-if [ -z "$parallel" ] ; then
-	# If no parallelism is specified at the top-level make, then
-	# fall back to the expected "-jauto" mode that the "htmldocs"
-	# target has had.
-	auto=$(perl -e 'open IN,"'"$sphinx"' --version 2>&1 |";
-			while (<IN>) {
-				if (m/([\d\.]+)/) {
-					print "auto" if ($1 >= "1.7")
-				}
-			}
-			close IN')
-	if [ -n "$auto" ] ; then
-		parallel="$auto"
-	fi
-fi
-# Only if some parallelism has been determined do we add the -jN option.
-if [ -n "$parallel" ] ; then
-	parallel="-j$parallel"
-fi
-
-exec "$sphinx" $parallel "$@"
-- 
2.51.0


^ permalink raw reply related	[flat|nested] 16+ messages in thread

* [PATCH v3 10/15] docs: Makefile: document latex/PDF PAPER= parameter
  2025-09-01 14:42 [PATCH v3 00/15] Split sphinx call logic from docs Makefile Mauro Carvalho Chehab
                   ` (8 preceding siblings ...)
  2025-09-01 14:42 ` [PATCH v3 09/15] docs: parallel-wrapper.sh: remove script Mauro Carvalho Chehab
@ 2025-09-01 14:42 ` Mauro Carvalho Chehab
  2025-09-01 14:42 ` [PATCH v3 11/15] tools/docs: sphinx-build-wrapper: add an argument for LaTeX interactive mode Mauro Carvalho Chehab
                   ` (4 subsequent siblings)
  14 siblings, 0 replies; 16+ messages in thread
From: Mauro Carvalho Chehab @ 2025-09-01 14:42 UTC (permalink / raw)
  To: Jonathan Corbet, Linux Doc Mailing List
  Cc: Mauro Carvalho Chehab, Mauro Carvalho Chehab, linux-kernel

While the build system has supported this for a long time, it was
never documented. Add documentation for it.
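
Usage example (assuming a working LaTeX toolchain is installed):

	$ make PAPER=a4 pdfdocs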

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 Documentation/Makefile | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/Documentation/Makefile b/Documentation/Makefile
index 2b0ed8cd5ea8..3e1cb44a5fbb 100644
--- a/Documentation/Makefile
+++ b/Documentation/Makefile
@@ -124,4 +124,6 @@ dochelp:
 	@echo
 	@echo  '  make DOCS_CSS={a .css file} adds a DOCS_CSS override file for html/epub output.'
 	@echo
+	@echo  '  make PAPER={a4|letter} Specifies the paper size used for LaTeX/PDF output.'
+	@echo
 	@echo  '  Default location for the generated documents is Documentation/output'
-- 
2.51.0


^ permalink raw reply related	[flat|nested] 16+ messages in thread

* [PATCH v3 11/15] tools/docs: sphinx-build-wrapper: add an argument for LaTeX interactive mode
  2025-09-01 14:42 [PATCH v3 00/15] Split sphinx call logic from docs Makefile Mauro Carvalho Chehab
                   ` (9 preceding siblings ...)
  2025-09-01 14:42 ` [PATCH v3 10/15] docs: Makefile: document latex/PDF PAPER= parameter Mauro Carvalho Chehab
@ 2025-09-01 14:42 ` Mauro Carvalho Chehab
  2025-09-01 14:42 ` [PATCH v3 12/15] tools/docs,scripts: sphinx-*: prevent sphinx-build crashes Mauro Carvalho Chehab
                   ` (3 subsequent siblings)
  14 siblings, 0 replies; 16+ messages in thread
From: Mauro Carvalho Chehab @ 2025-09-01 14:42 UTC (permalink / raw)
  To: Jonathan Corbet, Linux Doc Mailing List
  Cc: Mauro Carvalho Chehab, Mauro Carvalho Chehab, linux-kernel

By default, we use LaTeX batch mode to build docs. This way, when
an error happens, the build fails. This is good for normal builds,
but when debugging problems with PDF generation, it is best to
use interactive mode.

We already support it via LATEXOPTS, but having a command line
argument makes it easier:

Interactive mode:
	./scripts/sphinx-build-wrapper pdfdocs --sphinxdirs peci -v -i
	...
	Running 'xelatex --no-pdf  -no-pdf -recorder  ".../Documentation/output/peci/latex/peci.tex"'
	...

Default batch mode:
        ./scripts/sphinx-build-wrapper pdfdocs --sphinxdirs peci -v
	...
	Running 'xelatex --no-pdf  -no-pdf -interaction=batchmode -no-shell-escape -recorder  ".../Documentation/output/peci/latex/peci.tex"'
	...

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 tools/docs/sphinx-build-wrapper | 13 ++++++++++---
 1 file changed, 10 insertions(+), 3 deletions(-)

diff --git a/tools/docs/sphinx-build-wrapper b/tools/docs/sphinx-build-wrapper
index 9c5df50fb99d..4e87584a92cd 100755
--- a/tools/docs/sphinx-build-wrapper
+++ b/tools/docs/sphinx-build-wrapper
@@ -115,7 +115,7 @@ class SphinxBuilder:
 
         return path
 
-    def __init__(self, venv=None, verbose=False, n_jobs=None):
+    def __init__(self, venv=None, verbose=False, n_jobs=None, interactive=None):
         """Initialize internal variables"""
         self.venv = venv
         self.verbose = None
@@ -126,7 +126,11 @@ class SphinxBuilder:
         self.kernelversion = os.environ.get("KERNELVERSION", "unknown")
         self.kernelrelease = os.environ.get("KERNELRELEASE", "unknown")
         self.pdflatex = os.environ.get("PDFLATEX", "xelatex")
-        self.latexopts = os.environ.get("LATEXOPTS", "-interaction=batchmode -no-shell-escape")
+
+        if not interactive:
+            self.latexopts = os.environ.get("LATEXOPTS", "-interaction=batchmode -no-shell-escape")
+        else:
+            self.latexopts = os.environ.get("LATEXOPTS", "")
 
         if not verbose:
             verbose = bool(os.environ.get("KBUILD_VERBOSE", "") != "")
@@ -548,6 +552,9 @@ def main():
     parser.add_argument('-j', '--jobs', type=jobs_type,
                         help="Sets number of jobs to use with sphinx-build")
 
+    parser.add_argument('-i', '--interactive', action='store_true',
+                        help="Change latex default to run in interactive mode")
+
     parser.add_argument("-V", "--venv", nargs='?', const=f'{VENV_DEFAULT}',
                         default=None,
                         help=f'If used, run Sphinx from a venv dir (default dir: {VENV_DEFAULT})')
@@ -557,7 +564,7 @@ def main():
     PythonVersion.check_python(MIN_PYTHON_VERSION)
 
     builder = SphinxBuilder(venv=args.venv, verbose=args.verbose,
-                            n_jobs=args.jobs)
+                            n_jobs=args.jobs, interactive=args.interactive)
 
     builder.build(args.target, sphinxdirs=args.sphinxdirs, conf=args.conf,
                   theme=args.theme, css=args.css, paper=args.paper)
-- 
2.51.0


^ permalink raw reply related	[flat|nested] 16+ messages in thread

* [PATCH v3 12/15] tools/docs,scripts: sphinx-*: prevent sphinx-build crashes
  2025-09-01 14:42 [PATCH v3 00/15] Split sphinx call logic from docs Makefile Mauro Carvalho Chehab
                   ` (10 preceding siblings ...)
  2025-09-01 14:42 ` [PATCH v3 11/15] tools/docs: sphinx-build-wrapper: add an argument for LaTeX interactive mode Mauro Carvalho Chehab
@ 2025-09-01 14:42 ` Mauro Carvalho Chehab
  2025-09-01 14:42 ` [PATCH v3 13/15] tools/docs: sphinx-build-wrapper: allow building PDF files in parallel Mauro Carvalho Chehab
                   ` (2 subsequent siblings)
  14 siblings, 0 replies; 16+ messages in thread
From: Mauro Carvalho Chehab @ 2025-09-01 14:42 UTC (permalink / raw)
  To: Jonathan Corbet, Linux Doc Mailing List
  Cc: Mauro Carvalho Chehab, Mauro Carvalho Chehab, linux-kernel

On a properly set up system, LANG and LC_ALL are always defined.
However, some distros like Debian, Gentoo and their variants
start with those undefined.

When Sphinx tries to set a locale with:

	locale.setlocale(locale.LC_ALL, '')

it raises an exception, making Sphinx fail. This is more likely
to happen with test containers.

Add logic to detect and work around such an issue by setting
the locale to C.
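
The workaround boils down to this pattern (minimal sketch; the actual
changes are in the diff below):

	import locale
	import os

	env = os.environ.copy()
	try:
	    # This is what sphinx-build does internally; it raises
	    # locale.Error when no usable locale is configured
	    locale.setlocale(locale.LC_ALL, '')
	except locale.Error:
	    # Fall back to the portable "C" locale for the child process
	    env["LC_ALL"] = "C"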

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 tools/docs/sphinx-build-wrapper | 11 +++++++++++
 tools/docs/sphinx-pre-install   | 14 +++++++++++++-
 2 files changed, 24 insertions(+), 1 deletion(-)

diff --git a/tools/docs/sphinx-build-wrapper b/tools/docs/sphinx-build-wrapper
index 4e87584a92cd..580582a76c93 100755
--- a/tools/docs/sphinx-build-wrapper
+++ b/tools/docs/sphinx-build-wrapper
@@ -45,6 +45,7 @@ the newer version.
 """
 
 import argparse
+import locale
 import os
 import shlex
 import shutil
@@ -439,6 +440,16 @@ class SphinxBuilder:
         if not sphinxdirs:
             sphinxdirs = os.environ.get("SPHINXDIRS", ".")
 
+        #
+        # The sphinx-build tool has a bug: internally, it tries to set
+        # locale with locale.setlocale(locale.LC_ALL, ''). This causes a
+        # crash if language is not set. Detect and fix it.
+        #
+        try:
+            locale.setlocale(locale.LC_ALL, '')
+        except locale.Error:
+            self.env["LC_ALL"] = "C"
+
         #
         # sphinxdirs can be a list or a whitespace-separated string
         #
diff --git a/tools/docs/sphinx-pre-install b/tools/docs/sphinx-pre-install
index d6d673b7945c..663d4e2a3f57 100755
--- a/tools/docs/sphinx-pre-install
+++ b/tools/docs/sphinx-pre-install
@@ -26,6 +26,7 @@ system pacage install is recommended.
 """
 
 import argparse
+import locale
 import os
 import re
 import subprocess
@@ -422,8 +423,19 @@ class MissingCheckers(AncillaryMethods):
         """
         Gets sphinx-build version.
         """
+        env = os.environ.copy()
+
+        # The sphinx-build tool has a bug: internally, it tries to set
+        # locale with locale.setlocale(locale.LC_ALL, ''). This causes a
+        # crash if language is not set. Detect and fix it.
         try:
-            result = self.run([cmd, "--version"],
+            locale.setlocale(locale.LC_ALL, '')
+        except Exception:
+            env["LC_ALL"] = "C"
+            env["LANG"] = "C"
+
+        try:
+            result = self.run([cmd, "--version"], env=env,
                               stdout=subprocess.PIPE,
                               stderr=subprocess.STDOUT,
                               text=True, check=True)
-- 
2.51.0


^ permalink raw reply related	[flat|nested] 16+ messages in thread

* [PATCH v3 13/15] tools/docs: sphinx-build-wrapper: allow building PDF files in parallel
  2025-09-01 14:42 [PATCH v3 00/15] Split sphinx call logic from docs Makefile Mauro Carvalho Chehab
                   ` (11 preceding siblings ...)
  2025-09-01 14:42 ` [PATCH v3 12/15] tools/docs,scripts: sphinx-*: prevent sphinx-build crashes Mauro Carvalho Chehab
@ 2025-09-01 14:42 ` Mauro Carvalho Chehab
  2025-09-01 14:42 ` [PATCH v3 14/15] docs: add support to build manpages from kerneldoc output Mauro Carvalho Chehab
  2025-09-01 14:42 ` [PATCH v3 15/15] tools: kernel-doc: add a see also section at man pages Mauro Carvalho Chehab
  14 siblings, 0 replies; 16+ messages in thread
From: Mauro Carvalho Chehab @ 2025-09-01 14:42 UTC (permalink / raw)
  To: Jonathan Corbet, Linux Doc Mailing List
  Cc: Mauro Carvalho Chehab, Mauro Carvalho Chehab, linux-kernel

Use the POSIX jobserver when available, or -j<number>, to run PDF
builds in parallel, restoring PDF build performance. Yet, running
in parallel while debugging build troubles is a bad idea, so, when
calling the wrapper directly via the command line, it will serialize
the build unless "-j" is explicitly requested.

With such a change, a full PDF doc build now takes around 5 minutes
on a Ryzen 9 machine with 32 CPU threads:

	# Explicitly parallelize both Sphinx and LaTeX PDF builds
	$ make cleandocs; time scripts/sphinx-build-wrapper pdfdocs -j 33

	real	5m17.901s
	user	15m1.499s
	sys	2m31.482s

	# Use POSIX jobserver to parallelize both sphinx-build and LaTeX
	$ make cleandocs; time make pdfdocs

	real	5m22.369s
	user	15m9.076s
	sys	2m31.419s

	# Serializes the PDF build, while keeping Sphinx parallelized.
	# It is equivalent to passing -jauto via the command line
	$ make cleandocs; time scripts/sphinx-build-wrapper pdfdocs

	real	11m20.901s
	user	13m2.910s
	sys	1m44.553s

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 tools/docs/sphinx-build-wrapper | 158 +++++++++++++++++++++++---------
 1 file changed, 117 insertions(+), 41 deletions(-)

diff --git a/tools/docs/sphinx-build-wrapper b/tools/docs/sphinx-build-wrapper
index 580582a76c93..c884022ad733 100755
--- a/tools/docs/sphinx-build-wrapper
+++ b/tools/docs/sphinx-build-wrapper
@@ -52,6 +52,8 @@ import shutil
 import subprocess
 import sys
 
+from concurrent import futures
+
 from lib.python_version import PythonVersion
 
 LIB_DIR = "../../scripts/lib"
@@ -278,6 +280,82 @@ class SphinxBuilder:
         except (OSError, IOError) as e:
             print(f"Warning: Failed to copy CSS: {e}", file=sys.stderr)
 
+    def build_pdf_file(self, latex_cmd, from_dir, path):
+        """Builds a single pdf file using latex_cmd"""
+        try:
+            subprocess.run(latex_cmd + [path],
+                            cwd=from_dir, check=True)
+
+            return True
+        except subprocess.CalledProcessError:
+            return False
+
+    def pdf_parallel_build(self, tex_suffix, latex_cmd, tex_files, n_jobs):
+        """Build PDF files in parallel if possible"""
+        builds = {}
+        build_failed = False
+        max_len = 0
+        has_tex = False
+
+        #
+        # LaTeX PDF error code is almost useless for us:
+        # any warning makes it non-zero. For kernel doc builds it always returns
+        # non-zero even when the build succeeds. So, let's do the best next thing:
+        # Ignore build errors. At the end, check if all PDF files were built,
+        # printing a summary with the built ones and returning 0 if all of
+        # them were actually built.
+        #
+        with futures.ThreadPoolExecutor(max_workers=n_jobs) as executor:
+            jobs = {}
+
+            for from_dir, pdf_dir, entry in tex_files:
+                name = entry.name
+
+                if not name.endswith(tex_suffix):
+                    continue
+
+                name = name[:-len(tex_suffix)]
+
+                max_len = max(max_len, len(name))
+
+                has_tex = True
+
+                future = executor.submit(self.build_pdf_file, latex_cmd,
+                                         from_dir, entry.path)
+                jobs[future] = (from_dir, pdf_dir, name)
+
+            for future in futures.as_completed(jobs):
+                from_dir, pdf_dir, name = jobs[future]
+
+                pdf_name = name + ".pdf"
+                pdf_from = os.path.join(from_dir, pdf_name)
+
+                try:
+                    success = future.result()
+
+                    if success and os.path.exists(pdf_from):
+                        pdf_to = os.path.join(pdf_dir, pdf_name)
+
+                        os.rename(pdf_from, pdf_to)
+                        builds[name] = os.path.relpath(pdf_to, self.builddir)
+                    else:
+                        builds[name] = "FAILED"
+                        build_failed = True
+                except futures.Error as e:
+                    builds[name] = f"FAILED ({repr(e)})"
+                    build_failed = True
+
+        #
+        # Handle case where no .tex files were found
+        #
+        if not has_tex:
+            name = "Sphinx LaTeX builder"
+            max_len = max(max_len, len(name))
+            builds[name] = "FAILED (no .tex file was generated)"
+            build_failed = True
+
+        return builds, build_failed, max_len
+
     def handle_pdf(self, output_dirs):
         """
         Extra steps for PDF output.
@@ -288,7 +366,9 @@ class SphinxBuilder:
         """
         builds = {}
         max_len = 0
+        tex_suffix = ".tex"
 
+        tex_files = []
         for from_dir in output_dirs:
             pdf_dir = os.path.join(from_dir, "../pdf")
             os.makedirs(pdf_dir, exist_ok=True)
@@ -300,55 +380,51 @@ class SphinxBuilder:
 
             latex_cmd.extend(shlex.split(self.latexopts))
 
-            tex_suffix = ".tex"
-
-            #
-            # Process each .tex file
-            #
-
-            has_tex = False
-            build_failed = False
+            # Get a list of tex files to process
             with os.scandir(from_dir) as it:
                 for entry in it:
-                    if not entry.name.endswith(tex_suffix):
-                        continue
+                    if entry.name.endswith(tex_suffix):
+                        tex_files.append((from_dir, pdf_dir, entry))
 
-                    name = entry.name[:-len(tex_suffix)]
-                    has_tex = True
+        #
+        # When using make, this value won't be used, as the number of jobs
+        # comes from the POSIX jobserver. So, this covers the case where the
+        # build is started from the command line. In that case, serialize by
+        # default, unless the user explicitly sets the number of jobs.
+        #
+        n_jobs = 1
 
-                    #
-                    # LaTeX PDF error code is almost useless for us:
-                    # any warning makes it non-zero. For kernel doc builds it
-                    # always return non-zero even when build succeeds.
-                    # So, let's do the best next thing: check if all PDF
-                    # files were built. If they're, print a summary and
-                    # return 0 at the end of this function
-                    #
-                    try:
-                        subprocess.run(latex_cmd + [entry.path],
-                                       cwd=from_dir, check=True)
-                    except subprocess.CalledProcessError:
-                        pass
+        # n_jobs is either an integer or "auto". Only use it if it is a number
+        if self.n_jobs:
+            try:
+                n_jobs = int(self.n_jobs)
+            except ValueError:
+                pass
 
-                    pdf_name = name + ".pdf"
-                    pdf_from = os.path.join(from_dir, pdf_name)
-                    pdf_to = os.path.join(pdf_dir, pdf_name)
+        #
+        # When using make, jobserver.claim is the number of job slots that
+        # were requested with "-j" and that aren't in use by other make targets
+        #
+        with JobserverExec() as jobserver:
+            n_jobs = 1
 
-                    if os.path.exists(pdf_from):
-                        os.rename(pdf_from, pdf_to)
-                        builds[name] = os.path.relpath(pdf_to, self.builddir)
-                    else:
-                        builds[name] = "FAILED"
-                        build_failed = True
+            #
+            # Handle the case where a parameter was passed on the command
+            # line, using it as the default if the jobserver claims nothing
+            #
+            if self.n_jobs:
+                try:
+                    n_jobs = int(self.n_jobs)
+                except ValueError:
+                    pass
 
-                    name = entry.name.removesuffix(".tex")
-                    max_len = max(max_len, len(name))
+            if jobserver.claim:
+                n_jobs = jobserver.claim
 
-            if not has_tex:
-                name = os.path.basename(from_dir)
-                max_len = max(max_len, len(name))
-                builds[name] = "FAILED (no .tex)"
-                build_failed = True
+            builds, build_failed, max_len = self.pdf_parallel_build(tex_suffix,
+                                                                    latex_cmd,
+                                                                    tex_files,
+                                                                    n_jobs)
 
         msg = "Summary"
         msg += "\n" + "=" * len(msg)
-- 
2.51.0


^ permalink raw reply related	[flat|nested] 16+ messages in thread

* [PATCH v3 14/15] docs: add support to build manpages from kerneldoc output
  2025-09-01 14:42 [PATCH v3 00/15] Split sphinx call logic from docs Makefile Mauro Carvalho Chehab
                   ` (12 preceding siblings ...)
  2025-09-01 14:42 ` [PATCH v3 13/15] tools/docs: sphinx-build-wrapper: allow building PDF files in parallel Mauro Carvalho Chehab
@ 2025-09-01 14:42 ` Mauro Carvalho Chehab
  2025-09-01 14:42 ` [PATCH v3 15/15] tools: kernel-doc: add a see also section at man pages Mauro Carvalho Chehab
  14 siblings, 0 replies; 16+ messages in thread
From: Mauro Carvalho Chehab @ 2025-09-01 14:42 UTC (permalink / raw)
  To: Jonathan Corbet, Linux Doc Mailing List
  Cc: Mauro Carvalho Chehab, Thomas Weißschuh, Alice Ryhl,
	Masahiro Yamada, Mauro Carvalho Chehab, Miguel Ojeda,
	Nathan Chancellor, Nicolas Schier, Randy Dunlap, Tamir Duberstein,
	linux-kbuild, linux-kernel

Generating man pages currently requires running a separate
script, and there is no corresponding target in the docs Makefile.

Add a "mandocs" target to the Makefile, adding the build
logic inside sphinx-build-wrapper, updating the documentation
and dropping the ancillary script.
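
The new logic boils down to running "kernel-doc -m" over the relevant sources
and splitting its output into one file per page, keyed on the ".TH" macro that
starts each man page. A condensed, illustrative sketch of that approach (the
actual implementation is the handle_man() method added below; error handling
and the SPHINXDIRS filtering are omitted):

	import re
	import subprocess

	def split_man_pages(kerneldoc, sources, out_dir):
	    # kernel-doc -m writes every man page to stdout; split the stream
	    # on the .TH header and write one <name>.<section> file per symbol.
	    re_th = re.compile(r'^\.TH "[^"]*" (\d+) "([^"]*)"')
	    result = subprocess.run([kerneldoc, "-m"] + sorted(sources),
	                            stdout=subprocess.PIPE, text=True, check=False)
	    fp = None
	    for line in result.stdout.splitlines():
	        match = re_th.match(line)
	        if match:
	            if fp:
	                fp.close()
	            fp = open(f"{out_dir}/{match.group(2)}.{match.group(1)}", "w",
	                      encoding="utf-8")
	        if fp:
	            fp.write(line + "\n")
	    if fp:
	        fp.close()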

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 Documentation/Makefile                 |  3 +-
 Documentation/doc-guide/kernel-doc.rst | 29 ++++-----
 Makefile                               |  2 +-
 scripts/split-man.pl                   | 28 ---------
 tools/docs/sphinx-build-wrapper        | 81 ++++++++++++++++++++++++--
 5 files changed, 95 insertions(+), 48 deletions(-)
 delete mode 100755 scripts/split-man.pl

diff --git a/Documentation/Makefile b/Documentation/Makefile
index 3e1cb44a5fbb..22e39e5ed07d 100644
--- a/Documentation/Makefile
+++ b/Documentation/Makefile
@@ -53,7 +53,7 @@ ifeq ($(HAVE_SPHINX),0)
 else # HAVE_SPHINX
 
 # Common documentation targets
-infodocs texinfodocs latexdocs epubdocs xmldocs pdfdocs linkcheckdocs:
+mandocs infodocs texinfodocs latexdocs epubdocs xmldocs pdfdocs linkcheckdocs:
 	$(Q)@$(srctree)/tools/docs/sphinx-pre-install --version-check
 	+$(Q)$(PYTHON3) $(BUILD_WRAPPER) $@ \
 		--sphinxdirs="$(SPHINXDIRS)" --conf=$(SPHINX_CONF) \
@@ -104,6 +104,7 @@ dochelp:
 	@echo  '  htmldocs        - HTML'
 	@echo  '  texinfodocs     - Texinfo'
 	@echo  '  infodocs        - Info'
+	@echo  '  mandocs         - Man pages'
 	@echo  '  latexdocs       - LaTeX'
 	@echo  '  pdfdocs         - PDF'
 	@echo  '  epubdocs        - EPUB'
diff --git a/Documentation/doc-guide/kernel-doc.rst b/Documentation/doc-guide/kernel-doc.rst
index af9697e60165..4370cc8fbcf5 100644
--- a/Documentation/doc-guide/kernel-doc.rst
+++ b/Documentation/doc-guide/kernel-doc.rst
@@ -579,20 +579,23 @@ source.
 How to use kernel-doc to generate man pages
 -------------------------------------------
 
-If you just want to use kernel-doc to generate man pages you can do this
-from the kernel git tree::
+To generate man pages for all files that contain kernel-doc markups, run::
 
-  $ scripts/kernel-doc -man \
-    $(git grep -l '/\*\*' -- :^Documentation :^tools) \
-    | scripts/split-man.pl /tmp/man
+  $ make mandocs
 
-Some older versions of git do not support some of the variants of syntax for
-path exclusion.  One of the following commands may work for those versions::
+Or by calling ``sphinx-build-wrapper`` directly::
 
-  $ scripts/kernel-doc -man \
-    $(git grep -l '/\*\*' -- . ':!Documentation' ':!tools') \
-    | scripts/split-man.pl /tmp/man
+  $ ./tools/docs/sphinx-build-wrapper mandocs
 
-  $ scripts/kernel-doc -man \
-    $(git grep -l '/\*\*' -- . ":(exclude)Documentation" ":(exclude)tools") \
-    | scripts/split-man.pl /tmp/man
+The output will be placed in a ``man`` directory inside the output directory
+(by default: ``Documentation/output``).
+
+Optionally, it is possible to generate a partial set of man pages by
+using SPHINXDIRS::
+
+  $ make SPHINXDIRS=driver-api/media mandocs
+
+.. note::
+
+   When SPHINXDIRS={subdir} is used, man pages are only generated for sources
+   explicitly referenced from ``Documentation/{subdir}/.../*.rst`` files.
diff --git a/Makefile b/Makefile
index 6bfe776bf3c5..9bd44afeda26 100644
--- a/Makefile
+++ b/Makefile
@@ -1800,7 +1800,7 @@ $(help-board-dirs): help-%:
 # Documentation targets
 # ---------------------------------------------------------------------------
 DOC_TARGETS := xmldocs latexdocs pdfdocs htmldocs epubdocs cleandocs \
-	       linkcheckdocs dochelp refcheckdocs texinfodocs infodocs
+	       linkcheckdocs dochelp refcheckdocs texinfodocs infodocs mandocs
 PHONY += $(DOC_TARGETS)
 $(DOC_TARGETS):
 	$(Q)$(MAKE) $(build)=Documentation $@
diff --git a/scripts/split-man.pl b/scripts/split-man.pl
deleted file mode 100755
index 96bd99dc977a..000000000000
--- a/scripts/split-man.pl
+++ /dev/null
@@ -1,28 +0,0 @@
-#!/usr/bin/env perl
-# SPDX-License-Identifier: GPL-2.0
-#
-# Author: Mauro Carvalho Chehab <mchehab+samsung@kernel.org>
-#
-# Produce manpages from kernel-doc.
-# See Documentation/doc-guide/kernel-doc.rst for instructions
-
-if ($#ARGV < 0) {
-   die "where do I put the results?\n";
-}
-
-mkdir $ARGV[0],0777;
-$state = 0;
-while (<STDIN>) {
-    if (/^\.TH \"[^\"]*\" 9 \"([^\"]*)\"/) {
-	if ($state == 1) { close OUT }
-	$state = 1;
-	$fn = "$ARGV[0]/$1.9";
-	print STDERR "Creating $fn\n";
-	open OUT, ">$fn" or die "can't open $fn: $!\n";
-	print OUT $_;
-    } elsif ($state != 0) {
-	print OUT $_;
-    }
-}
-
-close OUT;
diff --git a/tools/docs/sphinx-build-wrapper b/tools/docs/sphinx-build-wrapper
index c884022ad733..932b1b675274 100755
--- a/tools/docs/sphinx-build-wrapper
+++ b/tools/docs/sphinx-build-wrapper
@@ -47,6 +47,7 @@ the newer version.
 import argparse
 import locale
 import os
+import re
 import shlex
 import shutil
 import subprocess
@@ -55,6 +56,7 @@ import sys
 from concurrent import futures
 
 from lib.python_version import PythonVersion
+from glob import glob
 
 LIB_DIR = "../../scripts/lib"
 SRC_DIR = os.path.dirname(os.path.realpath(__file__))
@@ -77,6 +79,7 @@ TARGETS = {
     "epubdocs":      { "builder": "epub",    "out_dir": "epub" },
     "texinfodocs":   { "builder": "texinfo", "out_dir": "texinfo" },
     "infodocs":      { "builder": "texinfo", "out_dir": "texinfo" },
+    "mandocs":       { "builder": "man",     "out_dir": "man" },
     "latexdocs":     { "builder": "latex",   "out_dir": "latex" },
     "pdfdocs":       { "builder": "latex",   "out_dir": "latex" },
     "xmldocs":       { "builder": "xml",     "out_dir": "xml" },
@@ -455,6 +458,71 @@ class SphinxBuilder:
             except subprocess.CalledProcessError as e:
                 sys.exit(f"Error generating info docs: {e}")
 
+    def handle_man(self, kerneldoc, docs_dir, src_dir, output_dir):
+        """
+        Create man pages from kernel-doc output
+        """
+
+        re_kernel_doc = re.compile(r"^\.\.\s+kernel-doc::\s*(\S+)")
+        re_man = re.compile(r'^\.TH "[^"]*" (\d+) "([^"]*)"')
+
+        if docs_dir == src_dir:
+            #
+            # Pick up all kernel-doc markups from the entire source tree
+            #
+            kdoc_files = set([self.srctree])
+        else:
+            kdoc_files = set()
+
+            for fname in glob(os.path.join(src_dir, "**"), recursive=True):
+                if os.path.isfile(fname) and fname.endswith(".rst"):
+                    with open(fname, "r", encoding="utf-8") as in_fp:
+                        data = in_fp.read()
+
+                    for line in data.split("\n"):
+                        match = re_kernel_doc.match(line)
+                        if match:
+                            if os.path.isfile(match.group(1)):
+                                kdoc_files.add(match.group(1))
+
+        if not kdoc_files:
+            sys.exit(f"Directory {src_dir} doesn't contain kernel-doc tags")
+
+        cmd = [kerneldoc, "-m"] + sorted(kdoc_files)
+        try:
+            if self.verbose:
+                print(" ".join(cmd))
+
+            result = subprocess.run(cmd, stdout=subprocess.PIPE, text=True)
+
+            if result.returncode:
+                print(f"Warning: kernel-doc returned {result.returncode} warnings")
+
+        except (OSError, ValueError, subprocess.SubprocessError) as e:
+            sys.exit(f"Failed to create man pages for {src_dir}: {repr(e)}")
+
+        fp = None
+        try:
+            for line in result.stdout.split("\n"):
+                match = re_man.match(line)
+                if not match:
+                    if fp:
+                        fp.write(line + '\n')
+                    continue
+
+                if fp:
+                    fp.close()
+
+                fname = f"{output_dir}/{match.group(2)}.{match.group(1)}"
+
+                if self.verbose:
+                    print(f"Creating {fname}")
+                fp = open(fname, "w", encoding="utf-8")
+                fp.write(line + '\n')
+        finally:
+            if fp:
+                fp.close()
+
     def cleandocs(self, builder):           # pylint: disable=W0613
         """Remove documentation output directory"""
         shutil.rmtree(self.builddir, ignore_errors=True)
@@ -483,7 +551,7 @@ class SphinxBuilder:
         # Other targets require sphinx-build, so check if it exists
         #
         sphinxbuild = shutil.which(self.sphinxbuild, path=self.env["PATH"])
-        if not sphinxbuild:
+        if not sphinxbuild and target != "mandocs":
             sys.exit(f"Error: {self.sphinxbuild} not found in PATH.\n")
 
         if builder == "latex":
@@ -572,10 +640,13 @@ class SphinxBuilder:
                 output_dir,
             ]
 
-            try:
-                self.run_sphinx(sphinxbuild, build_args, env=self.env)
-            except (OSError, ValueError, subprocess.SubprocessError) as e:
-                sys.exit(f"Build failed: {repr(e)}")
+            if target == "mandocs":
+                self.handle_man(kerneldoc, docs_dir, src_dir, output_dir)
+            else:
+                try:
+                    self.run_sphinx(sphinxbuild, build_args, env=self.env)
+                except (OSError, ValueError, subprocess.SubprocessError) as e:
+                    sys.exit(f"Build failed: {repr(e)}")
 
             #
             # Ensure that each html/epub output will have needed static files
-- 
2.51.0


^ permalink raw reply related	[flat|nested] 16+ messages in thread

* [PATCH v3 15/15] tools: kernel-doc: add a see also section at man pages
  2025-09-01 14:42 [PATCH v3 00/15] Split sphinx call logic from docs Makefile Mauro Carvalho Chehab
                   ` (13 preceding siblings ...)
  2025-09-01 14:42 ` [PATCH v3 14/15] docs: add support to build manpages from kerneldoc output Mauro Carvalho Chehab
@ 2025-09-01 14:42 ` Mauro Carvalho Chehab
  14 siblings, 0 replies; 16+ messages in thread
From: Mauro Carvalho Chehab @ 2025-09-01 14:42 UTC (permalink / raw)
  To: Jonathan Corbet, Linux Doc Mailing List
  Cc: Mauro Carvalho Chehab, Mauro Carvalho Chehab, linux-kernel

V2hpbGUgY3Jvc3MtcmVmZXJlbmNlcyBhcmUgY29tcGxleCwgYXMgcmVsYXRlZCBvbmVzIGNhbiBi
ZSBvbgpkaWZmZXJlbnQgZmlsZXMsIHdlIGNhbiBhdCBsZWFzdCBjb3JyZWxhdGUgdGhlIG9uZXMg
dGhhdCBiZWxvbmcKdG8gdGhlIHNhbWUgZmlsZSwgYWRkaW5nIGEgU0VFIEFMU08gc2VjdGlvbiBm
b3IgdGhlbS4KClRoZSByZXN1bHQgaXMgbm90IGJhZC4gU2VlIGZvciBpbnN0YW5jZToKCgkkIHRv
b2xzL2RvY3Mvc3BoaW54LWJ1aWxkLXdyYXBwZXIgLS1zcGhpbnhkaXJzIGRyaXZlci1hcGkvbWVk
aWEgLS0gbWFuZG9jcwoJJCBtYW4gRG9jdW1lbnRhdGlvbi9vdXRwdXQvZHJpdmVyLWFwaS9tYW4v
ZWRhY19wY2lfYWRkX2RldmljZS45CgoJZWRhY19wY2lfYWRkX2RldmljZSg5KSAgS2VybmVsIEhh
Y2tlcidzIE1hbnVhbCAgZWRhY19wY2lfYWRkX2RldmljZSg5KQoKCU5BTUUKCSAgICAgICBlZGFj
X3BjaV9hZGRfZGV2aWNlICAtIEluc2VydCB0aGUgJ2VkYWNfZGV2JyBzdHJ1Y3R1cmUgaW50byB0
aGUKCSAgICAgICBlZGFjX3BjaSBnbG9iYWwgbGlzdCBhbmQgY3JlYXRlIHN5c2ZzIGVudHJpZXMg
IGFzc29jaWF0ZWQgIHdpdGgKCSAgICAgICBlZGFjX3BjaSBzdHJ1Y3R1cmUuCgoJU1lOT1BTSVMK
CSAgICAgICBpbnQgIGVkYWNfcGNpX2FkZF9kZXZpY2UgIChzdHJ1Y3QgIGVkYWNfcGNpX2N0bF9p
bmZvICpwY2kgLCBpbnQKCSAgICAgICBlZGFjX2lkeCApOwoKCUFSR1VNRU5UUwoJICAgICAgIHBj
aSAgICAgICAgIHBvaW50ZXIgdG8gdGhlIGVkYWNfZGV2aWNlIHN0cnVjdHVyZSB0byBiZSBhZGRl
ZCB0bwoJICAgICAgICAgICAgICAgICAgIHRoZSBsaXN0CgoJICAgICAgIGVkYWNfaWR4ICAgIEEg
dW5pcXVlIG51bWVyaWMgaWRlbnRpZmllciB0byBiZSBhc3NpZ25lZCB0byB0aGUKCglSRVRVUk4K
CSAgICAgICAwIG9uIFN1Y2Nlc3MsIG9yIGFuIGVycm9yIGNvZGUgb24gZmFpbHVyZQoKCVNFRSBB
TFNPCgkgICAgICAgZWRhY19wY2lfYWxsb2NfY3RsX2luZm8oOSksICAgICAgICAgIGVkYWNfcGNp
X2ZyZWVfY3RsX2luZm8oOSksCgkgICAgICAgZWRhY19wY2lfYWxsb2NfaW5kZXgoOSksICBlZGFj
X3BjaV9kZWxfZGV2aWNlKDkpLCBlZGFjX3BjaV9jcmXigJAKCSAgICAgICBhdGVfZ2VuZXJpY19j
dGwoOSksICAgICAgICAgICAgZWRhY19wY2lfcmVsZWFzZV9nZW5lcmljX2N0bCg5KSwKCSAgICAg
ICBlZGFjX3BjaV9jcmVhdGVfc3lzZnMoOSksIGVkYWNfcGNpX3JlbW92ZV9zeXNmcyg5KQoKCUF1
Z3VzdCAyMDI1ICAgICAgICAgICAgICAgZWRhY19wY2lfYWRkX2RldmljZSAgIGVkYWNfcGNpX2Fk
ZF9kZXZpY2UoOSkKClNpZ25lZC1vZmYtYnk6IE1hdXJvIENhcnZhbGhvIENoZWhhYiA8bWNoZWhh
YitodWF3ZWlAa2VybmVsLm9yZz4KLS0tCiBzY3JpcHRzL2xpYi9rZG9jL2tkb2NfZmlsZXMucHkg
IHwgIDUgKy0KIHNjcmlwdHMvbGliL2tkb2Mva2RvY19vdXRwdXQucHkgfCA4NCArKysrKysrKysr
KysrKysrKysrKysrKysrKysrKysrLS0KIDIgZmlsZXMgY2hhbmdlZCwgODMgaW5zZXJ0aW9ucygr
KSwgNiBkZWxldGlvbnMoLSkKCmRpZmYgLS1naXQgYS9zY3JpcHRzL2xpYi9rZG9jL2tkb2NfZmls
ZXMucHkgYi9zY3JpcHRzL2xpYi9rZG9jL2tkb2NfZmlsZXMucHkKaW5kZXggOWUwOWI0NWIwMmZh
Li4wNjFjMDMzZjMyZGEgMTAwNjQ0Ci0tLSBhL3NjcmlwdHMvbGliL2tkb2Mva2RvY19maWxlcy5w
eQorKysgYi9zY3JpcHRzL2xpYi9rZG9jL2tkb2NfZmlsZXMucHkKQEAgLTI3NSw3ICsyNzUsMTAg
QEAgY2xhc3MgS2VybmVsRmlsZXMoKToKICAgICAgICAgICAgICAgICBzZWxmLmNvbmZpZy5sb2cu
d2FybmluZygiTm8ga2VybmVsLWRvYyBmb3IgZmlsZSAlcyIsIGZuYW1lKQogICAgICAgICAgICAg
ICAgIGNvbnRpbnVlCiAKLSAgICAgICAgICAgIGZvciBhcmcgaW4gc2VsZi5yZXN1bHRzW2ZuYW1l
XToKKyAgICAgICAgICAgIHN5bWJvbHMgPSBzZWxmLnJlc3VsdHNbZm5hbWVdCisgICAgICAgICAg
ICBzZWxmLm91dF9zdHlsZS5zZXRfc3ltYm9scyhzeW1ib2xzKQorCisgICAgICAgICAgICBmb3Ig
YXJnIGluIHN5bWJvbHM6CiAgICAgICAgICAgICAgICAgbSA9IHNlbGYub3V0X21zZyhmbmFtZSwg
YXJnLm5hbWUsIGFyZykKIAogICAgICAgICAgICAgICAgIGlmIG0gaXMgTm9uZToKZGlmZiAtLWdp
dCBhL3NjcmlwdHMvbGliL2tkb2Mva2RvY19vdXRwdXQucHkgYi9zY3JpcHRzL2xpYi9rZG9jL2tk
b2Nfb3V0cHV0LnB5CmluZGV4IGVhODkxNDUzN2JhMC4uMWVjYTlhOTE4NTU4IDEwMDY0NAotLS0g
YS9zY3JpcHRzL2xpYi9rZG9jL2tkb2Nfb3V0cHV0LnB5CisrKyBiL3NjcmlwdHMvbGliL2tkb2Mv
a2RvY19vdXRwdXQucHkKQEAgLTIxNSw2ICsyMTUsOSBAQCBjbGFzcyBPdXRwdXRGb3JtYXQ6CiAK
ICAgICAjIFZpcnR1YWwgbWV0aG9kcyB0byBiZSBvdmVycmlkZGVuIGJ5IGluaGVyaXRlZCBjbGFz
c2VzCiAgICAgIyBBdCB0aGUgYmFzZSBjbGFzcywgdGhvc2UgZG8gbm90aGluZy4KKyAgICBkZWYg
c2V0X3N5bWJvbHMoc2VsZiwgc3ltYm9scyk6CisgICAgICAgICIiIkdldCBhIGxpc3Qgb2YgYWxs
IHN5bWJvbHMgZnJvbSBrZXJuZWxfZG9jIiIiCisKICAgICBkZWYgb3V0X2RvYyhzZWxmLCBmbmFt
ZSwgbmFtZSwgYXJncyk6CiAgICAgICAgICIiIk91dHB1dHMgYSBET0MgYmxvY2siIiIKIApAQCAt
NTc3LDYgKzU4MCw3IEBAIGNsYXNzIE1hbkZvcm1hdChPdXRwdXRGb3JtYXQpOgogCiAgICAgICAg
IHN1cGVyKCkuX19pbml0X18oKQogICAgICAgICBzZWxmLm1vZHVsZW5hbWUgPSBtb2R1bGVuYW1l
CisgICAgICAgIHNlbGYuc3ltYm9scyA9IFtdCiAKICAgICAgICAgZHQgPSBOb25lCiAgICAgICAg
IHRzdGFtcCA9IG9zLmVudmlyb24uZ2V0KCJLQlVJTERfQlVJTERfVElNRVNUQU1QIikKQEAgLTU5
Myw2ICs1OTcsNjggQEAgY2xhc3MgTWFuRm9ybWF0KE91dHB1dEZvcm1hdCk6CiAKICAgICAgICAg
c2VsZi5tYW5fZGF0ZSA9IGR0LnN0cmZ0aW1lKCIlQiAlWSIpCiAKKyAgICBkZWYgYXJnX25hbWUo
c2VsZiwgYXJncywgbmFtZSk6CisgICAgICAgICIiIgorICAgICAgICBSZXR1cm4gdGhlIG5hbWUg
dGhhdCB3aWxsIGJlIHVzZWQgZm9yIHRoZSBtYW4gcGFnZS4KKworICAgICAgICBBcyB3ZSBtYXkg
aGF2ZSB0aGUgc2FtZSBuYW1lIG9uIGRpZmZlcmVudCBuYW1lc3BhY2VzLAorICAgICAgICBwcmVw
ZW5kIHRoZSBkYXRhIHR5cGUgZm9yIGFsbCB0eXBlcyBleGNlcHQgZnVuY3Rpb25zIGFuZCB0eXBl
ZGVmcy4KKworICAgICAgICBUaGUgZG9jIHNlY3Rpb24gaXMgc3BlY2lhbDogaXQgdXNlcyB0aGUg
bW9kdWxlbmFtZS4KKyAgICAgICAgIiIiCisKKyAgICAgICAgZHR5cGUgPSBhcmdzLnR5cGUKKwor
ICAgICAgICBpZiBkdHlwZSA9PSAiZG9jIjoKKyAgICAgICAgICAgIHJldHVybiBzZWxmLm1vZHVs
ZW5hbWUKKworICAgICAgICBpZiBkdHlwZSBpbiBbImZ1bmN0aW9uIiwgInR5cGVkZWYiXToKKyAg
ICAgICAgICAgIHJldHVybiBuYW1lCisKKyAgICAgICAgcmV0dXJuIGYie2R0eXBlfSB7bmFtZX0i
CisKKyAgICBkZWYgc2V0X3N5bWJvbHMoc2VsZiwgc3ltYm9scyk6CisgICAgICAgICIiIgorICAg
ICAgICBHZXQgYSBsaXN0IG9mIGFsbCBzeW1ib2xzIGZyb20ga2VybmVsX2RvYy4KKworICAgICAg
ICBNYW4gcGFnZXMgd2lsbCB1c2VzIGl0IHRvIGFkZCBhIFNFRSBBTFNPIHNlY3Rpb24gd2l0aCBv
dGhlcgorICAgICAgICBzeW1ib2xzIGF0IHRoZSBzYW1lIGZpbGUuCisgICAgICAgICIiIgorICAg
ICAgICBzZWxmLnN5bWJvbHMgPSBzeW1ib2xzCisKKyAgICBkZWYgb3V0X3RhaWwoc2VsZiwgZm5h
bWUsIG5hbWUsIGFyZ3MpOgorICAgICAgICAiIiJBZGRzIGEgdGFpbCBmb3IgYWxsIG1hbiBwYWdl
cyIiIgorCisgICAgICAgICMgU0VFIEFMU08gc2VjdGlvbgorICAgICAgICBpZiBsZW4oc2VsZi5z
eW1ib2xzKSA+PSAyOgorICAgICAgICAgICAgY3VyX25hbWUgPSBzZWxmLmFyZ19uYW1lKGFyZ3Ms
IG5hbWUpCisKKyAgICAgICAgICAgIHNlbGYuZGF0YSArPSBmJy5TSCAiU0VFIEFMU08iJyArICJc
bi5QUFxuIgorICAgICAgICAgICAgcmVsYXRlZCA9IFtdCisgICAgICAgICAgICBmb3IgYXJnIGlu
IHNlbGYuc3ltYm9sczoKKyAgICAgICAgICAgICAgICBvdXRfbmFtZSA9IHNlbGYuYXJnX25hbWUo
YXJnLCBhcmcubmFtZSkKKworICAgICAgICAgICAgICAgIGlmIGN1cl9uYW1lID09IG91dF9uYW1l
OgorICAgICAgICAgICAgICAgICAgICBjb250aW51ZQorCisgICAgICAgICAgICAgICAgcmVsYXRl
ZC5hcHBlbmQoZiJcXGZCe291dF9uYW1lfVxcZlIoOSkiKQorCisgICAgICAgICAgICBzZWxmLmRh
dGEgKz0gIixcbiIuam9pbihyZWxhdGVkKSArICJcbiIKKworICAgICAgICAjIFRPRE86IGRvZXMg
aXQgbWFrZSBzZW5zZSB0byBhZGQgb3RoZXIgc2VjdGlvbnM/IE1heWJlCisgICAgICAgICMgUkVQ
T1JUSU5HIElTU1VFUz8gTElDRU5TRT8KKworICAgIGRlZiBtc2coc2VsZiwgZm5hbWUsIG5hbWUs
IGFyZ3MpOgorICAgICAgICAiIiIKKyAgICAgICAgSGFuZGxlcyBhIHNpbmdsZSBlbnRyeSBmcm9t
IGtlcm5lbC1kb2MgcGFyc2VyLgorCisgICAgICAgIEFkZCBhIHRhaWwgYXQgdGhlIGVuZCBvZiBt
YW4gcGFnZXMgb3V0cHV0LgorICAgICAgICAiIiIKKyAgICAgICAgc3VwZXIoKS5tc2coZm5hbWUs
IG5hbWUsIGFyZ3MpCisgICAgICAgIHNlbGYub3V0X3RhaWwoZm5hbWUsIG5hbWUsIGFyZ3MpCisK
KyAgICAgICAgcmV0dXJuIHNlbGYuZGF0YQorCiAgICAgZGVmIG91dHB1dF9oaWdobGlnaHQoc2Vs
ZiwgYmxvY2spOgogICAgICAgICAiIiIKICAgICAgICAgT3V0cHV0cyBhIEMgc3ltYm9sIHRoYXQg
bWF5IHJlcXVpcmUgYmVpbmcgaGlnaGxpZ2h0ZWQgd2l0aApAQCAtNjE4LDcgKzY4NCw5IEBAIGNs
YXNzIE1hbkZvcm1hdChPdXRwdXRGb3JtYXQpOgogICAgICAgICBpZiBub3Qgc2VsZi5jaGVja19k
b2MobmFtZSwgYXJncyk6CiAgICAgICAgICAgICByZXR1cm4KIAotICAgICAgICBzZWxmLmRhdGEg
Kz0gZicuVEggIntzZWxmLm1vZHVsZW5hbWV9IiA5ICJ7c2VsZi5tb2R1bGVuYW1lfSIgIntzZWxm
Lm1hbl9kYXRlfSIgIkFQSSBNYW51YWwiIExJTlVYJyArICJcbiIKKyAgICAgICAgb3V0X25hbWUg
PSBzZWxmLmFyZ19uYW1lKGFyZ3MsIG5hbWUpCisKKyAgICAgICAgc2VsZi5kYXRhICs9IGYnLlRI
ICJ7c2VsZi5tb2R1bGVuYW1lfSIgOSAie291dF9uYW1lfSIgIntzZWxmLm1hbl9kYXRlfSIgIkFQ
SSBNYW51YWwiIExJTlVYJyArICJcbiIKIAogICAgICAgICBmb3Igc2VjdGlvbiwgdGV4dCBpbiBh
cmdzLnNlY3Rpb25zLml0ZW1zKCk6CiAgICAgICAgICAgICBzZWxmLmRhdGEgKz0gZicuU0ggIntz
ZWN0aW9ufSInICsgIlxuIgpAQCAtNjI3LDcgKzY5NSw5IEBAIGNsYXNzIE1hbkZvcm1hdChPdXRw
dXRGb3JtYXQpOgogICAgIGRlZiBvdXRfZnVuY3Rpb24oc2VsZiwgZm5hbWUsIG5hbWUsIGFyZ3Mp
OgogICAgICAgICAiIiJvdXRwdXQgZnVuY3Rpb24gaW4gbWFuIiIiCiAKLSAgICAgICAgc2VsZi5k
YXRhICs9IGYnLlRIICJ7bmFtZX0iIDkgIntuYW1lfSIgIntzZWxmLm1hbl9kYXRlfSIgIktlcm5l
bCBIYWNrZXJcJ3MgTWFudWFsIiBMSU5VWCcgKyAiXG4iCisgICAgICAgIG91dF9uYW1lID0gc2Vs
Zi5hcmdfbmFtZShhcmdzLCBuYW1lKQorCisgICAgICAgIHNlbGYuZGF0YSArPSBmJy5USCAie25h
bWV9IiA5ICJ7b3V0X25hbWV9IiAie3NlbGYubWFuX2RhdGV9IiAiS2VybmVsIEhhY2tlclwncyBN
YW51YWwiIExJTlVYJyArICJcbiIKIAogICAgICAgICBzZWxmLmRhdGEgKz0gIi5TSCBOQU1FXG4i
CiAgICAgICAgIHNlbGYuZGF0YSArPSBmIntuYW1lfSBcXC0ge2FyZ3NbJ3B1cnBvc2UnXX1cbiIK
QEAgLTY3MSw3ICs3NDEsOSBAQCBjbGFzcyBNYW5Gb3JtYXQoT3V0cHV0Rm9ybWF0KToKICAgICAg
ICAgICAgIHNlbGYub3V0cHV0X2hpZ2hsaWdodCh0ZXh0KQogCiAgICAgZGVmIG91dF9lbnVtKHNl
bGYsIGZuYW1lLCBuYW1lLCBhcmdzKToKLSAgICAgICAgc2VsZi5kYXRhICs9IGYnLlRIICJ7c2Vs
Zi5tb2R1bGVuYW1lfSIgOSAiZW51bSB7bmFtZX0iICJ7c2VsZi5tYW5fZGF0ZX0iICJBUEkgTWFu
dWFsIiBMSU5VWCcgKyAiXG4iCisgICAgICAgIG91dF9uYW1lID0gc2VsZi5hcmdfbmFtZShhcmdz
LCBuYW1lKQorCisgICAgICAgIHNlbGYuZGF0YSArPSBmJy5USCAie3NlbGYubW9kdWxlbmFtZX0i
IDkgIntvdXRfbmFtZX0iICJ7c2VsZi5tYW5fZGF0ZX0iICJBUEkgTWFudWFsIiBMSU5VWCcgKyAi
XG4iCiAKICAgICAgICAgc2VsZi5kYXRhICs9ICIuU0ggTkFNRVxuIgogICAgICAgICBzZWxmLmRh
dGEgKz0gZiJlbnVtIHtuYW1lfSBcXC0ge2FyZ3NbJ3B1cnBvc2UnXX1cbiIKQEAgLTcwMyw4ICs3
NzUsOSBAQCBjbGFzcyBNYW5Gb3JtYXQoT3V0cHV0Rm9ybWF0KToKICAgICBkZWYgb3V0X3R5cGVk
ZWYoc2VsZiwgZm5hbWUsIG5hbWUsIGFyZ3MpOgogICAgICAgICBtb2R1bGUgPSBzZWxmLm1vZHVs
ZW5hbWUKICAgICAgICAgcHVycG9zZSA9IGFyZ3MuZ2V0KCdwdXJwb3NlJykKKyAgICAgICAgb3V0
X25hbWUgPSBzZWxmLmFyZ19uYW1lKGFyZ3MsIG5hbWUpCiAKLSAgICAgICAgc2VsZi5kYXRhICs9
IGYnLlRIICJ7bW9kdWxlfSIgOSAie25hbWV9IiAie3NlbGYubWFuX2RhdGV9IiAiQVBJIE1hbnVh
bCIgTElOVVgnICsgIlxuIgorICAgICAgICBzZWxmLmRhdGEgKz0gZicuVEggInttb2R1bGV9IiA5
ICJ7b3V0X25hbWV9IiAie3NlbGYubWFuX2RhdGV9IiAiQVBJIE1hbnVhbCIgTElOVVgnICsgIlxu
IgogCiAgICAgICAgIHNlbGYuZGF0YSArPSAiLlNIIE5BTUVcbiIKICAgICAgICAgc2VsZi5kYXRh
ICs9IGYidHlwZWRlZiB7bmFtZX0gXFwtIHtwdXJwb3NlfVxuIgpAQCAtNzE3LDggKzc5MCw5IEBA
IGNsYXNzIE1hbkZvcm1hdChPdXRwdXRGb3JtYXQpOgogICAgICAgICBtb2R1bGUgPSBzZWxmLm1v
ZHVsZW5hbWUKICAgICAgICAgcHVycG9zZSA9IGFyZ3MuZ2V0KCdwdXJwb3NlJykKICAgICAgICAg
ZGVmaW5pdGlvbiA9IGFyZ3MuZ2V0KCdkZWZpbml0aW9uJykKKyAgICAgICAgb3V0X25hbWUgPSBz
ZWxmLmFyZ19uYW1lKGFyZ3MsIG5hbWUpCiAKLSAgICAgICAgc2VsZi5kYXRhICs9IGYnLlRIICJ7
bW9kdWxlfSIgOSAie2FyZ3MudHlwZX0ge25hbWV9IiAie3NlbGYubWFuX2RhdGV9IiAiQVBJIE1h
bnVhbCIgTElOVVgnICsgIlxuIgorICAgICAgICBzZWxmLmRhdGEgKz0gZicuVEggInttb2R1bGV9
IiA5ICJ7b3V0X25hbWV9IiAie3NlbGYubWFuX2RhdGV9IiAiQVBJIE1hbnVhbCIgTElOVVgnICsg
IlxuIgogCiAgICAgICAgIHNlbGYuZGF0YSArPSAiLlNIIE5BTUVcbiIKICAgICAgICAgc2VsZi5k
YXRhICs9IGYie2FyZ3MudHlwZX0ge25hbWV9IFxcLSB7cHVycG9zZX1cbiIKLS0gCjIuNTEuMAoK

^ permalink raw reply	[flat|nested] 16+ messages in thread

end of thread, other threads:[~2025-09-01 14:42 UTC | newest]

Thread overview: 16+ messages (download: mbox.gz follow: Atom feed
-- links below jump to the message on this page --
2025-09-01 14:42 [PATCH v3 00/15] Split sphinx call logic from docs Makefile Mauro Carvalho Chehab
2025-09-01 14:42 ` [PATCH v3 01/15] scripts/jobserver-exec: move the code to a class Mauro Carvalho Chehab
2025-09-01 14:42 ` [PATCH v3 02/15] scripts/jobserver-exec: move its class to the lib directory Mauro Carvalho Chehab
2025-09-01 14:42 ` [PATCH v3 03/15] scripts/jobserver-exec: add a help message Mauro Carvalho Chehab
2025-09-01 14:42 ` [PATCH v3 04/15] scripts: sphinx-pre-install: move it to tools/docs Mauro Carvalho Chehab
2025-09-01 14:42 ` [PATCH v3 05/15] tools/docs: sphinx-pre-install: move Python version handling to lib Mauro Carvalho Chehab
2025-09-01 14:42 ` [PATCH v3 06/15] tools/docs: sphinx-build-wrapper: add a wrapper for sphinx-build Mauro Carvalho Chehab
2025-09-01 14:42 ` [PATCH v3 07/15] tools/docs: sphinx-build-wrapper: add comments and blank lines Mauro Carvalho Chehab
2025-09-01 14:42 ` [PATCH v3 08/15] tools/docs: sphinx-build-wrapper: add support to run inside venv Mauro Carvalho Chehab
2025-09-01 14:42 ` [PATCH v3 09/15] docs: parallel-wrapper.sh: remove script Mauro Carvalho Chehab
2025-09-01 14:42 ` [PATCH v3 10/15] docs: Makefile: document latex/PDF PAPER= parameter Mauro Carvalho Chehab
2025-09-01 14:42 ` [PATCH v3 11/15] tools/docs: sphinx-build-wrapper: add an argument for LaTeX interactive mode Mauro Carvalho Chehab
2025-09-01 14:42 ` [PATCH v3 12/15] tools/docs,scripts: sphinx-*: prevent sphinx-build crashes Mauro Carvalho Chehab
2025-09-01 14:42 ` [PATCH v3 13/15] tools/docs: sphinx-build-wrapper: allow building PDF files in parallel Mauro Carvalho Chehab
2025-09-01 14:42 ` [PATCH v3 14/15] docs: add support to build manpages from kerneldoc output Mauro Carvalho Chehab
2025-09-01 14:42 ` [PATCH v3 15/15] tools: kernel-doc: add a see also section at man pages Mauro Carvalho Chehab

This is a public inbox, see mirroring instructions
for how to clone and mirror all data and code used for this inbox;
as well as URLs for NNTP newsgroup(s).