linux-kernel.vger.kernel.org archive mirror
 help / color / mirror / Atom feed
* [PATCH v7 00/17] Don't generate netlink .rst files inside $(srctree)
@ 2025-06-19  6:48 Mauro Carvalho Chehab
  2025-06-19  6:48 ` [PATCH v7 01/17] docs: conf.py: properly handle include and exclude patterns Mauro Carvalho Chehab
                   ` (17 more replies)
  0 siblings, 18 replies; 26+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-19  6:48 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Mauro Carvalho Chehab, linux-kernel,
	Akira Yokosawa, David S. Miller, Ignacio Encinas Rubio,
	Marco Elver, Shuah Khan, Donald Hunter, Eric Dumazet, Jan Stancek,
	Paolo Abeni, Ruben Wauters, joel, linux-kernel-mentees, lkmm,
	netdev, peterz, stern, Breno Leitao, Jakub Kicinski, Randy Dunlap,
	Simon Horman

Hi Jon,

As I sent two additional patches after v6 that depend on it,
I'm opting to resend the series together with that extra
material.

-

As discussed at:
   https://lore.kernel.org/all/20250610101331.62ba466f@foz.lan/

changeset f061c9f7d058 ("Documentation: Document each netlink family")
added logic which generates *.rst files inside $(srctree). This is bad
when O=<BUILDDIR> is used.

A recent change renamed the YAML files used by Netlink, revealing a bad
side effect: "make cleandocs" doesn't clean the produced files, so symbols
appear duplicated for people who don't build the kernel from scratch.

This series adds a YAML parser extension and uses an index file with a
glob for *. We opted to write the extension in a way that no actual YAML
conversion code is inside it. This makes it flexible enough to handle
other types of YAML files in the future. The actual YAML conversion logic
was placed in netlink_yml_parser.py.
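The glob-based index approach can be sketched as follows (hypothetical
content; the actual Documentation/netlink/specs/index.rst added by this
series may differ in wording):

```rst
.. SPDX-License-Identifier: GPL-2.0

.. _specs:

=============================
Netlink Family Specifications
=============================

.. toctree::
   :maxdepth: 1
   :glob:

   *
```

With :glob:, the toctree picks up whichever family documents exist in the
build tree, so nothing needs to be generated inside $(srctree).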

As requested by YNL maintainers, this version has netlink_yml_parser.py
inside the tools/net/ynl/pyynl/ directory. I don't like mixing libraries
with binaries, nor having Python libraries spread all over the kernel.
IMO, the best approach is to put all of them in a common place
(scripts/lib, python/lib, lib/python, ...) but, as this can be solved
later, let's keep it this way for now.

---

v7:
- Added a patch to clean up conf.py and address coding style issues;
- Added docutils version-check logic to detect known issues when
  building the docs with a too old or too new docutils version. The
  actual min/max version depends on the Sphinx version.
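A minimal standalone sketch of such a range check (hypothetical helper
name; the actual logic in conf.py ties the bounds to the installed
Sphinx version and may differ):

```python
# Hypothetical sketch: decide whether a docutils version string falls
# inside a known-good [min, max] range. The real min/max bounds depend
# on the Sphinx version in use.
def docutils_in_range(version_str, min_version, max_version):
    """Compare only the first two components, e.g. "0.17.1" -> (0, 17)."""
    ver = tuple(int(x) for x in version_str.split(".")[:2])
    return min_version <= ver <= max_version

# Example: warn when outside the tested range
if not docutils_in_range("0.21.2", (0, 14), (0, 20)):
    print("docutils version is outside the tested range")
```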

v6:
- YNL doc parser is now at tools/net/ynl/pyynl/lib/doc_generator.py;
- two patches got merged;
- added instructions to test docs with Sphinx 3.4.3 (minimum supported
  version);
- minor fixes.

v5:
- some patch reorg;
- netlink_yml_parser.py is now together with ynl tools;
- minor fixes.

v4:
- Renamed the YNL parser class;
- some minor patch cleanups and merges;
- added an extra patch to fix the include/exclude patterns logic when
  SPHINXDIRS is used.

v3:
- Two series got merged together:
  - https://lore.kernel.org/linux-doc/cover.1749723671.git.mchehab+huawei@kernel.org/T/#t
  - https://lore.kernel.org/linux-doc/cover.1749735022.git.mchehab+huawei@kernel.org

- Added an extra patch to update MAINTAINERS to point to the YNL library
- Added a (somewhat unrelated) patch that removes the warnings check when
  running "make cleandocs".

Mauro Carvalho Chehab (17):
  docs: conf.py: properly handle include and exclude patterns
  docs: Makefile: disable check rules on make cleandocs
  docs: netlink: netlink-raw.rst: use :ref: instead of :doc:
  tools: ynl_gen_rst.py: Split library from command line tool
  docs: netlink: index.rst: add a netlink index file
  tools: ynl_gen_rst.py: cleanup coding style
  docs: sphinx: add a parser for yaml files for Netlink specs
  docs: use parser_yaml extension to handle Netlink specs
  docs: uapi: netlink: update netlink specs link
  tools: ynl_gen_rst.py: drop support for generating index files
  docs: netlink: remove obsolete .gitignore from unused directory
  MAINTAINERS: add netlink_yml_parser.py to linux-doc
  tools: netlink_yml_parser.py: add line numbers to parsed data
  docs: parser_yaml.py: add support for line numbers from the parser
  docs: sphinx: add a file with the requirements for lowest version
  docs: conf.py: several coding style fixes
  docs: conf.py: Check Sphinx and docutils version

 Documentation/Makefile                        |  19 +-
 Documentation/conf.py                         | 445 +++++++++++-------
 Documentation/doc-guide/sphinx.rst            |  15 +
 Documentation/netlink/specs/index.rst         |  13 +
 Documentation/networking/index.rst            |   2 +-
 .../networking/netlink_spec/.gitignore        |   1 -
 .../networking/netlink_spec/readme.txt        |   4 -
 Documentation/sphinx/min_requirements.txt     |   8 +
 Documentation/sphinx/parser_yaml.py           |  84 ++++
 Documentation/userspace-api/netlink/index.rst |   2 +-
 .../userspace-api/netlink/netlink-raw.rst     |   6 +-
 Documentation/userspace-api/netlink/specs.rst |   2 +-
 MAINTAINERS                                   |   1 +
 tools/net/ynl/pyynl/lib/__init__.py           |   2 +
 tools/net/ynl/pyynl/lib/doc_generator.py      | 398 ++++++++++++++++
 tools/net/ynl/pyynl/ynl_gen_rst.py            | 384 +--------------
 16 files changed, 804 insertions(+), 582 deletions(-)
 create mode 100644 Documentation/netlink/specs/index.rst
 delete mode 100644 Documentation/networking/netlink_spec/.gitignore
 delete mode 100644 Documentation/networking/netlink_spec/readme.txt
 create mode 100644 Documentation/sphinx/min_requirements.txt
 create mode 100755 Documentation/sphinx/parser_yaml.py
 create mode 100644 tools/net/ynl/pyynl/lib/doc_generator.py

-- 
2.49.0



^ permalink raw reply	[flat|nested] 26+ messages in thread

* [PATCH v7 01/17] docs: conf.py: properly handle include and exclude patterns
  2025-06-19  6:48 [PATCH v7 00/17] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
@ 2025-06-19  6:48 ` Mauro Carvalho Chehab
  2025-06-19  6:48 ` [PATCH v7 02/17] docs: Makefile: disable check rules on make cleandocs Mauro Carvalho Chehab
                   ` (16 subsequent siblings)
  17 siblings, 0 replies; 26+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-19  6:48 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	joel, linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz,
	stern

When one does:
	make SPHINXDIRS="foo" htmldocs

All patterns would be relative to Documentation/foo, which
causes the include/exclude patterns like:

	include_patterns = [
		...
		f'foo/*.{ext}',
	]

to break. This is not what is expected. Address it by adding
logic to dynamically adjust the patterns when SPHINXDIRS is
used.

That allows adding parsers for other file types.
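The dynamic adjustment can be sketched as a small helper (hypothetical
name; the actual implementation lives in an update_patterns() hook in
conf.py and may differ):

```python
import os.path

# Hypothetical sketch: rebase a Documentation/-relative pattern onto the
# SOURCEDIR that sphinx-build was given, dropping patterns that fall
# outside of it (Sphinx ignores anything outside SOURCEDIR anyway).
def adjust_pattern(doctree, srcdir, pattern):
    full = os.path.join(doctree, pattern)
    rel_path = os.path.relpath(full, start=srcdir)
    if rel_path.startswith("../"):
        return None
    return rel_path

# With SPHINXDIRS="foo", srcdir is Documentation/foo, so a pattern like
# "foo/*.rst" rebases to "*.rst" and patterns for other dirs are dropped.
```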

It should be noted that include_patterns was added in
Sphinx 5.1:
	https://www.sphinx-doc.org/en/master/usage/configuration.html#confval-include_patterns

So, backward-compatible code is needed when we start
using it for real.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
Reviewed-by: Donald Hunter <donald.hunter@gmail.com>
---
 Documentation/conf.py | 67 ++++++++++++++++++++++++++++++++++++++++---
 1 file changed, 63 insertions(+), 4 deletions(-)

diff --git a/Documentation/conf.py b/Documentation/conf.py
index 12de52a2b17e..4ba4ee45e599 100644
--- a/Documentation/conf.py
+++ b/Documentation/conf.py
@@ -17,6 +17,66 @@ import os
 import sphinx
 import shutil
 
+# Get Sphinx version
+major, minor, patch = sphinx.version_info[:3]
+
+# include_patterns was added in Sphinx 5.1
+if (major < 5) or (major == 5 and minor < 1):
+    has_include_patterns = False
+else:
+    has_include_patterns = True
+    # Include patterns that don't contain directory names, in glob format
+    include_patterns = ['**.rst']
+
+# Location of Documentation/ directory
+doctree = os.path.abspath('.')
+
+# Exclude patterns that don't contain directory names, in glob format.
+exclude_patterns = []
+
+# List of patterns that contain directory names in glob format.
+dyn_include_patterns = []
+dyn_exclude_patterns = ['output']
+
+# Properly handle include/exclude patterns
+# ----------------------------------------
+
+def update_patterns(app):
+
+    """
+    In Sphinx, all directories are relative to what is passed as the
+    SOURCEDIR parameter to sphinx-build. Due to that, all patterns
+    that contain directory names need to be set dynamically, after
+    converting them to a relative path.
+
+    As Sphinx doesn't include any patterns outside SOURCEDIR, we should
+    exclude relative patterns that start with "../".
+    """
+
+    sourcedir = app.srcdir  # full path to the source directory
+    builddir = os.environ.get("BUILDDIR")
+
+    # setup include_patterns dynamically
+    if has_include_patterns:
+        for p in dyn_include_patterns:
+            full = os.path.join(doctree, p)
+
+            rel_path = os.path.relpath(full, start = app.srcdir)
+            if rel_path.startswith("../"):
+                continue
+
+            app.config.include_patterns.append(rel_path)
+
+    # setup exclude_patterns dynamically
+    for p in dyn_exclude_patterns:
+        full = os.path.join(doctree, p)
+
+        rel_path = os.path.relpath(full, start = app.srcdir)
+        if rel_path.startswith("../"):
+            continue
+
+        app.config.exclude_patterns.append(rel_path)
+
 # helper
 # ------
 
@@ -219,10 +279,6 @@ language = 'en'
 # Else, today_fmt is used as the format for a strftime call.
 #today_fmt = '%B %d, %Y'
 
-# List of patterns, relative to source directory, that match files and
-# directories to ignore when looking for source files.
-exclude_patterns = ['output']
-
 # The reST default role (used for this markup: `text`) to use for all
 # documents.
 #default_role = None
@@ -516,3 +572,6 @@ kerneldoc_srctree = '..'
 # the last statement in the conf.py file
 # ------------------------------------------------------------------------------
 loadConfig(globals())
+
+def setup(app):
+	app.connect('builder-inited', update_patterns)
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 26+ messages in thread

* [PATCH v7 02/17] docs: Makefile: disable check rules on make cleandocs
  2025-06-19  6:48 [PATCH v7 00/17] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
  2025-06-19  6:48 ` [PATCH v7 01/17] docs: conf.py: properly handle include and exclude patterns Mauro Carvalho Chehab
@ 2025-06-19  6:48 ` Mauro Carvalho Chehab
  2025-06-19  6:48 ` [PATCH v7 03/17] docs: netlink: netlink-raw.rst: use :ref: instead of :doc: Mauro Carvalho Chehab
                   ` (15 subsequent siblings)
  17 siblings, 0 replies; 26+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-19  6:48 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	joel, linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz,
	stern

It doesn't make sense to check for missing documents and ABI errors
when cleaning the tree.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
Reviewed-by: Breno Leitao <leitao@debian.org>
Reviewed-by: Donald Hunter <donald.hunter@gmail.com>
---
 Documentation/Makefile | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/Documentation/Makefile b/Documentation/Makefile
index d30d66ddf1ad..b98477df5ddf 100644
--- a/Documentation/Makefile
+++ b/Documentation/Makefile
@@ -5,6 +5,7 @@
 # for cleaning
 subdir- := devicetree/bindings
 
+ifneq ($(MAKECMDGOALS),cleandocs)
 # Check for broken documentation file references
 ifeq ($(CONFIG_WARN_MISSING_DOCUMENTS),y)
 $(shell $(srctree)/scripts/documentation-file-ref-check --warn)
@@ -14,6 +15,7 @@ endif
 ifeq ($(CONFIG_WARN_ABI_ERRORS),y)
 $(shell $(srctree)/scripts/get_abi.py --dir $(srctree)/Documentation/ABI validate)
 endif
+endif
 
 # You can set these variables from the command line.
 SPHINXBUILD   = sphinx-build
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 26+ messages in thread

* [PATCH v7 03/17] docs: netlink: netlink-raw.rst: use :ref: instead of :doc:
  2025-06-19  6:48 [PATCH v7 00/17] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
  2025-06-19  6:48 ` [PATCH v7 01/17] docs: conf.py: properly handle include and exclude patterns Mauro Carvalho Chehab
  2025-06-19  6:48 ` [PATCH v7 02/17] docs: Makefile: disable check rules on make cleandocs Mauro Carvalho Chehab
@ 2025-06-19  6:48 ` Mauro Carvalho Chehab
  2025-06-19  8:57   ` Donald Hunter
  2025-06-19  6:48 ` [PATCH v7 04/17] tools: ynl_gen_rst.py: Split library from command line tool Mauro Carvalho Chehab
                   ` (14 subsequent siblings)
  17 siblings, 1 reply; 26+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-19  6:48 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Currently, rt documents are referenced with:

Documentation/userspace-api/netlink/netlink-raw.rst: :doc:`rt-link<../../networking/netlink_spec/rt-link>`
Documentation/userspace-api/netlink/netlink-raw.rst: :doc:`tc<../../networking/netlink_spec/tc>`
Documentation/userspace-api/netlink/netlink-raw.rst: :doc:`tc<../../networking/netlink_spec/tc>`

Having :doc: references with relative paths doesn't always work,
as it may cause trouble when O= is used. It is also hard to
maintain, and may break if we change the way rst files are
generated from yaml. It is better to use a reference to the
netlink family instead.

So, replace them with the Sphinx cross-reference labels that are
created by ynl_gen_rst.py.
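For illustration, the "netlink-" + family label convention then looks
roughly like this (sketch of the generated output, not verbatim):

```rst
.. Label emitted at the top of the generated rt-link family document:

.. _netlink-rt-link:

.. Cross-reference usage from netlink-raw.rst:

   Several raw netlink families such as :ref:`rt-link<netlink-rt-link>` ...
```

Unlike :doc: paths, such labels are global to the Sphinx project, so they
keep working regardless of where the generated files end up.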

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 Documentation/userspace-api/netlink/netlink-raw.rst | 6 +++---
 tools/net/ynl/pyynl/ynl_gen_rst.py                  | 5 +++--
 2 files changed, 6 insertions(+), 5 deletions(-)

diff --git a/Documentation/userspace-api/netlink/netlink-raw.rst b/Documentation/userspace-api/netlink/netlink-raw.rst
index 31fc91020eb3..aae296c170c5 100644
--- a/Documentation/userspace-api/netlink/netlink-raw.rst
+++ b/Documentation/userspace-api/netlink/netlink-raw.rst
@@ -62,8 +62,8 @@ Sub-messages
 ------------
 
 Several raw netlink families such as
-:doc:`rt-link<../../networking/netlink_spec/rt-link>` and
-:doc:`tc<../../networking/netlink_spec/tc>` use attribute nesting as an
+:ref:`rt-link<netlink-rt-link>` and
+:ref:`tc<netlink-tc>` use attribute nesting as an
 abstraction to carry module specific information.
 
 Conceptually it looks as follows::
@@ -162,7 +162,7 @@ then this is an error.
 Nested struct definitions
 -------------------------
 
-Many raw netlink families such as :doc:`tc<../../networking/netlink_spec/tc>`
+Many raw netlink families such as :ref:`tc<netlink-tc>`
 make use of nested struct definitions. The ``netlink-raw`` schema makes it
 possible to embed a struct within a struct definition using the ``struct``
 property. For example, the following struct definition embeds the
diff --git a/tools/net/ynl/pyynl/ynl_gen_rst.py b/tools/net/ynl/pyynl/ynl_gen_rst.py
index 0cb6348e28d3..7bfb8ceeeefc 100755
--- a/tools/net/ynl/pyynl/ynl_gen_rst.py
+++ b/tools/net/ynl/pyynl/ynl_gen_rst.py
@@ -314,10 +314,11 @@ def parse_yaml(obj: Dict[str, Any]) -> str:
 
     # Main header
 
-    lines.append(rst_header())
-
     family = obj['name']
 
+    lines.append(rst_header())
+    lines.append(rst_label("netlink-" + family))
+
     title = f"Family ``{family}`` netlink specification"
     lines.append(rst_title(title))
     lines.append(rst_paragraph(".. contents:: :depth: 3\n"))
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 26+ messages in thread

* [PATCH v7 04/17] tools: ynl_gen_rst.py: Split library from command line tool
  2025-06-19  6:48 [PATCH v7 00/17] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (2 preceding siblings ...)
  2025-06-19  6:48 ` [PATCH v7 03/17] docs: netlink: netlink-raw.rst: use :ref: instead of :doc: Mauro Carvalho Chehab
@ 2025-06-19  6:48 ` Mauro Carvalho Chehab
  2025-06-19 12:01   ` Donald Hunter
  2025-06-19  6:48 ` [PATCH v7 05/17] docs: netlink: index.rst: add a netlink index file Mauro Carvalho Chehab
                   ` (13 subsequent siblings)
  17 siblings, 1 reply; 26+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-19  6:48 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

As we'll be using the Netlink specs parser inside a Sphinx
extension, split the library part out of the command line tool.

No functional changes.
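The split follows the usual library/CLI shape; a minimal standalone
sketch (hypothetical names, deliberately not the actual pyynl code):

```python
# Library side (doc_generator.py style): pure string-in/string-out API,
# no argument parsing or file I/O policy.
class DocGenerator:
    def parse_yaml(self, obj):
        """Render a parsed spec dict as an RST title (simplified)."""
        title = f"Family ``{obj['name']}`` netlink specification"
        return "=" * len(title) + f"\n{title}\n" + "=" * len(title)

# Command line side (ynl_gen_rst.py style): only I/O and arguments,
# delegating all conversion work to the library class.
def main(path):
    import yaml  # kept local to the CLI wrapper in this sketch
    with open(path, encoding="utf-8") as f:
        return DocGenerator().parse_yaml(yaml.safe_load(f))
```

This is what lets a Sphinx extension reuse the conversion logic without
going through the command line entry point.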

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 tools/net/ynl/pyynl/lib/__init__.py      |   2 +
 tools/net/ynl/pyynl/lib/doc_generator.py | 409 +++++++++++++++++++++++
 tools/net/ynl/pyynl/ynl_gen_rst.py       | 361 +-------------------
 3 files changed, 419 insertions(+), 353 deletions(-)
 create mode 100644 tools/net/ynl/pyynl/lib/doc_generator.py

diff --git a/tools/net/ynl/pyynl/lib/__init__.py b/tools/net/ynl/pyynl/lib/__init__.py
index 71518b9842ee..5f266ebe4526 100644
--- a/tools/net/ynl/pyynl/lib/__init__.py
+++ b/tools/net/ynl/pyynl/lib/__init__.py
@@ -4,6 +4,8 @@ from .nlspec import SpecAttr, SpecAttrSet, SpecEnumEntry, SpecEnumSet, \
     SpecFamily, SpecOperation, SpecSubMessage, SpecSubMessageFormat
 from .ynl import YnlFamily, Netlink, NlError
 
+from .doc_generator import YnlDocGenerator
+
 __all__ = ["SpecAttr", "SpecAttrSet", "SpecEnumEntry", "SpecEnumSet",
            "SpecFamily", "SpecOperation", "SpecSubMessage", "SpecSubMessageFormat",
            "YnlFamily", "Netlink", "NlError"]
diff --git a/tools/net/ynl/pyynl/lib/doc_generator.py b/tools/net/ynl/pyynl/lib/doc_generator.py
new file mode 100644
index 000000000000..839e78b39de3
--- /dev/null
+++ b/tools/net/ynl/pyynl/lib/doc_generator.py
@@ -0,0 +1,409 @@
+#!/usr/bin/env python3
+# SPDX-License-Identifier: GPL-2.0
+# -*- coding: utf-8; mode: python -*-
+
+"""
+    Class to auto generate the documentation for Netlink specifications.
+
+    :copyright:  Copyright (C) 2023  Breno Leitao <leitao@debian.org>
+    :license:    GPL Version 2, June 1991 see linux/COPYING for details.
+
+    This class performs extensive parsing of the Linux kernel's netlink YAML
+    spec files, in an effort to avoid needing to heavily mark up the original
+    YAML file.
+
+    This code is split into two classes:
+        1) RST formatters: used to convert strings to RST output
+        2) YAML Netlink (YNL) doc generator: generates docs from YAML data
+"""
+
+from typing import Any, Dict, List
+import os.path
+import sys
+import argparse
+import logging
+import yaml
+
+
+# ==============
+# RST Formatters
+# ==============
+class RstFormatters:
+    SPACE_PER_LEVEL = 4
+
+    @staticmethod
+    def headroom(level: int) -> str:
+        """Return space to format"""
+        return " " * (level * RstFormatters.SPACE_PER_LEVEL)
+
+
+    @staticmethod
+    def bold(text: str) -> str:
+        """Format bold text"""
+        return f"**{text}**"
+
+
+    @staticmethod
+    def inline(text: str) -> str:
+        """Format inline text"""
+        return f"``{text}``"
+
+
+    @staticmethod
+    def sanitize(text: str) -> str:
+        """Remove newlines and multiple spaces"""
+        # This is useful for some fields that are spread across multiple lines
+        return str(text).replace("\n", " ").strip()
+
+
+    def rst_fields(self, key: str, value: str, level: int = 0) -> str:
+        """Return a RST formatted field"""
+        return self.headroom(level) + f":{key}: {value}"
+
+
+    def rst_definition(self, key: str, value: Any, level: int = 0) -> str:
+        """Format a single rst definition"""
+        return self.headroom(level) + key + "\n" + self.headroom(level + 1) + str(value)
+
+
+    def rst_paragraph(self, paragraph: str, level: int = 0) -> str:
+        """Return a formatted paragraph"""
+        return self.headroom(level) + paragraph
+
+
+    def rst_bullet(self, item: str, level: int = 0) -> str:
+        """Return a formatted a bullet"""
+        return self.headroom(level) + f"- {item}"
+
+
+    @staticmethod
+    def rst_subsection(title: str) -> str:
+        """Add a sub-section to the document"""
+        return f"{title}\n" + "-" * len(title)
+
+
+    @staticmethod
+    def rst_subsubsection(title: str) -> str:
+        """Add a sub-sub-section to the document"""
+        return f"{title}\n" + "~" * len(title)
+
+
+    @staticmethod
+    def rst_section(namespace: str, prefix: str, title: str) -> str:
+        """Add a section to the document"""
+        return f".. _{namespace}-{prefix}-{title}:\n\n{title}\n" + "=" * len(title)
+
+
+    @staticmethod
+    def rst_subtitle(title: str) -> str:
+        """Add a subtitle to the document"""
+        return "\n" + "-" * len(title) + f"\n{title}\n" + "-" * len(title) + "\n\n"
+
+
+    @staticmethod
+    def rst_title(title: str) -> str:
+        """Add a title to the document"""
+        return "=" * len(title) + f"\n{title}\n" + "=" * len(title) + "\n\n"
+
+
+    def rst_list_inline(self, list_: List[str], level: int = 0) -> str:
+        """Format a list using inlines"""
+        return self.headroom(level) + "[" + ", ".join(self.inline(i) for i in list_) + "]"
+
+
+    @staticmethod
+    def rst_ref(namespace: str, prefix: str, name: str) -> str:
+        """Add a hyperlink to the document"""
+        mappings = {'enum': 'definition',
+                    'fixed-header': 'definition',
+                    'nested-attributes': 'attribute-set',
+                    'struct': 'definition'}
+        if prefix in mappings:
+            prefix = mappings[prefix]
+        return f":ref:`{namespace}-{prefix}-{name}`"
+
+
+    def rst_header(self) -> str:
+        """The headers for all the auto generated RST files"""
+        lines = []
+
+        lines.append(self.rst_paragraph(".. SPDX-License-Identifier: GPL-2.0"))
+        lines.append(self.rst_paragraph(".. NOTE: This document was auto-generated.\n\n"))
+
+        return "\n".join(lines)
+
+
+    @staticmethod
+    def rst_toctree(maxdepth: int = 2) -> str:
+        """Generate a toctree RST primitive"""
+        lines = []
+
+        lines.append(".. toctree::")
+        lines.append(f"   :maxdepth: {maxdepth}\n\n")
+
+        return "\n".join(lines)
+
+
+    @staticmethod
+    def rst_label(title: str) -> str:
+        """Return a formatted label"""
+        return f".. _{title}:\n\n"
+
+# =======
+# Parsers
+# =======
+class YnlDocGenerator:
+
+    fmt = RstFormatters()
+
+    def parse_mcast_group(self, mcast_group: List[Dict[str, Any]]) -> str:
+        """Parse 'multicast' group list and return a formatted string"""
+        lines = []
+        for group in mcast_group:
+            lines.append(self.fmt.rst_bullet(group["name"]))
+
+        return "\n".join(lines)
+
+
+    def parse_do(self, do_dict: Dict[str, Any], level: int = 0) -> str:
+        """Parse 'do' section and return a formatted string"""
+        lines = []
+        for key in do_dict.keys():
+            lines.append(self.fmt.rst_paragraph(self.fmt.bold(key), level + 1))
+            if key in ['request', 'reply']:
+                lines.append(self.parse_do_attributes(do_dict[key], level + 1) + "\n")
+            else:
+                lines.append(self.fmt.headroom(level + 2) + do_dict[key] + "\n")
+
+        return "\n".join(lines)
+
+
+    def parse_do_attributes(self, attrs: Dict[str, Any], level: int = 0) -> str:
+        """Parse 'attributes' section"""
+        if "attributes" not in attrs:
+            return ""
+        lines = [self.fmt.rst_fields("attributes", self.fmt.rst_list_inline(attrs["attributes"]), level + 1)]
+
+        return "\n".join(lines)
+
+
+    def parse_operations(self, operations: List[Dict[str, Any]], namespace: str) -> str:
+        """Parse operations block"""
+        preprocessed = ["name", "doc", "title", "do", "dump", "flags"]
+        linkable = ["fixed-header", "attribute-set"]
+        lines = []
+
+        for operation in operations:
+            lines.append(self.fmt.rst_section(namespace, 'operation', operation["name"]))
+            lines.append(self.fmt.rst_paragraph(operation["doc"]) + "\n")
+
+            for key in operation.keys():
+                if key in preprocessed:
+                    # Skip the special fields
+                    continue
+                value = operation[key]
+                if key in linkable:
+                    value = self.fmt.rst_ref(namespace, key, value)
+                lines.append(self.fmt.rst_fields(key, value, 0))
+            if 'flags' in operation:
+                lines.append(self.fmt.rst_fields('flags', self.fmt.rst_list_inline(operation['flags'])))
+
+            if "do" in operation:
+                lines.append(self.fmt.rst_paragraph(":do:", 0))
+                lines.append(self.parse_do(operation["do"], 0))
+            if "dump" in operation:
+                lines.append(self.fmt.rst_paragraph(":dump:", 0))
+                lines.append(self.parse_do(operation["dump"], 0))
+
+            # New line after fields
+            lines.append("\n")
+
+        return "\n".join(lines)
+
+
+    def parse_entries(self, entries: List[Dict[str, Any]], level: int) -> str:
+        """Parse a list of entries"""
+        ignored = ["pad"]
+        lines = []
+        for entry in entries:
+            if isinstance(entry, dict):
+                # entries could be a list or a dictionary
+                field_name = entry.get("name", "")
+                if field_name in ignored:
+                    continue
+                type_ = entry.get("type")
+                if type_:
+                    field_name += f" ({self.fmt.inline(type_)})"
+                lines.append(
+                    self.fmt.rst_fields(field_name, self.fmt.sanitize(entry.get("doc", "")), level)
+                )
+            elif isinstance(entry, list):
+                lines.append(self.fmt.rst_list_inline(entry, level))
+            else:
+                lines.append(self.fmt.rst_bullet(self.fmt.inline(self.fmt.sanitize(entry)), level))
+
+        lines.append("\n")
+        return "\n".join(lines)
+
+
+    def parse_definitions(self, defs: Dict[str, Any], namespace: str) -> str:
+        """Parse definitions section"""
+        preprocessed = ["name", "entries", "members"]
+        ignored = ["render-max"]  # This is not printed
+        lines = []
+
+        for definition in defs:
+            lines.append(self.fmt.rst_section(namespace, 'definition', definition["name"]))
+            for k in definition.keys():
+                if k in preprocessed + ignored:
+                    continue
+                lines.append(self.fmt.rst_fields(k, self.fmt.sanitize(definition[k]), 0))
+
+            # Field list needs to finish with a new line
+            lines.append("\n")
+            if "entries" in definition:
+                lines.append(self.fmt.rst_paragraph(":entries:", 0))
+                lines.append(self.parse_entries(definition["entries"], 1))
+            if "members" in definition:
+                lines.append(self.fmt.rst_paragraph(":members:", 0))
+                lines.append(self.parse_entries(definition["members"], 1))
+
+        return "\n".join(lines)
+
+
+    def parse_attr_sets(self, entries: List[Dict[str, Any]], namespace: str) -> str:
+        """Parse attribute from attribute-set"""
+        preprocessed = ["name", "type"]
+        linkable = ["enum", "nested-attributes", "struct", "sub-message"]
+        ignored = ["checks"]
+        lines = []
+
+        for entry in entries:
+            lines.append(self.fmt.rst_section(namespace, 'attribute-set', entry["name"]))
+            for attr in entry["attributes"]:
+                type_ = attr.get("type")
+                attr_line = attr["name"]
+                if type_:
+                    # Add the attribute type in the same line
+                    attr_line += f" ({self.fmt.inline(type_)})"
+
+                lines.append(self.fmt.rst_subsubsection(attr_line))
+
+                for k in attr.keys():
+                    if k in preprocessed + ignored:
+                        continue
+                    if k in linkable:
+                        value = self.fmt.rst_ref(namespace, k, attr[k])
+                    else:
+                        value = self.fmt.sanitize(attr[k])
+                    lines.append(self.fmt.rst_fields(k, value, 0))
+                lines.append("\n")
+
+        return "\n".join(lines)
+
+
+    def parse_sub_messages(self, entries: List[Dict[str, Any]], namespace: str) -> str:
+        """Parse sub-message definitions"""
+        lines = []
+
+        for entry in entries:
+            lines.append(self.fmt.rst_section(namespace, 'sub-message', entry["name"]))
+            for fmt in entry["formats"]:
+                value = fmt["value"]
+
+                lines.append(self.fmt.rst_bullet(self.fmt.bold(value)))
+                for attr in ['fixed-header', 'attribute-set']:
+                    if attr in fmt:
+                        lines.append(self.fmt.rst_fields(attr,
+                                                self.fmt.rst_ref(namespace, attr, fmt[attr]),
+                                                1))
+                lines.append("\n")
+
+        return "\n".join(lines)
+
+
+    def parse_yaml(self, obj: Dict[str, Any]) -> str:
+        """Format the whole YAML into a RST string"""
+        lines = []
+
+        # Main header
+
+        family = obj['name']
+
+        lines.append(self.fmt.rst_header())
+        lines.append(self.fmt.rst_label("netlink-" + family))
+
+        title = f"Family ``{family}`` netlink specification"
+        lines.append(self.fmt.rst_title(title))
+        lines.append(self.fmt.rst_paragraph(".. contents:: :depth: 3\n"))
+
+        if "doc" in obj:
+            lines.append(self.fmt.rst_subtitle("Summary"))
+            lines.append(self.fmt.rst_paragraph(obj["doc"], 0))
+
+        # Operations
+        if "operations" in obj:
+            lines.append(self.fmt.rst_subtitle("Operations"))
+            lines.append(self.parse_operations(obj["operations"]["list"], family))
+
+        # Multicast groups
+        if "mcast-groups" in obj:
+            lines.append(self.fmt.rst_subtitle("Multicast groups"))
+            lines.append(self.parse_mcast_group(obj["mcast-groups"]["list"]))
+
+        # Definitions
+        if "definitions" in obj:
+            lines.append(self.fmt.rst_subtitle("Definitions"))
+            lines.append(self.parse_definitions(obj["definitions"], family))
+
+        # Attributes set
+        if "attribute-sets" in obj:
+            lines.append(self.fmt.rst_subtitle("Attribute sets"))
+            lines.append(self.parse_attr_sets(obj["attribute-sets"], family))
+
+        # Sub-messages
+        if "sub-messages" in obj:
+            lines.append(self.fmt.rst_subtitle("Sub-messages"))
+            lines.append(self.parse_sub_messages(obj["sub-messages"], family))
+
+        return "\n".join(lines)
+
+
+    # Main functions
+    # ==============
+
+
+    def parse_yaml_file(self, filename: str) -> str:
+        """Transform the YAML specified by filename into an RST-formatted string"""
+        with open(filename, "r", encoding="utf-8") as spec_file:
+            yaml_data = yaml.safe_load(spec_file)
+            content = self.parse_yaml(yaml_data)
+
+        return content
+
+
+    def generate_main_index_rst(self, output: str, index_dir: str) -> None:
+        """Generate the `networking_spec/index` content and write to the file"""
+        lines = []
+
+        lines.append(self.fmt.rst_header())
+        lines.append(self.fmt.rst_label("specs"))
+        lines.append(self.fmt.rst_title("Netlink Family Specifications"))
+        lines.append(self.fmt.rst_toctree(1))
+
+        index_fname = os.path.basename(output)
+        base, ext = os.path.splitext(index_fname)
+
+        if not index_dir:
+            index_dir = os.path.dirname(output)
+
+        logging.debug(f"Looking for {ext} files in %s", index_dir)
+        for filename in sorted(os.listdir(index_dir)):
+            if not filename.endswith(ext) or filename == index_fname:
+                continue
+            base, ext = os.path.splitext(filename)
+            lines.append(f"   {base}\n")
+
+        logging.debug("Writing an index file at %s", output)
+
+        return "".join(lines)
diff --git a/tools/net/ynl/pyynl/ynl_gen_rst.py b/tools/net/ynl/pyynl/ynl_gen_rst.py
index 7bfb8ceeeefc..b5a665eeaa5a 100755
--- a/tools/net/ynl/pyynl/ynl_gen_rst.py
+++ b/tools/net/ynl/pyynl/ynl_gen_rst.py
@@ -10,354 +10,17 @@
 
     This script performs extensive parsing to the Linux kernel's netlink YAML
     spec files, in an effort to avoid needing to heavily mark up the original
-    YAML file.
-
-    This code is split in three big parts:
-        1) RST formatters: Use to convert a string to a RST output
-        2) Parser helpers: Functions to parse the YAML data structure
-        3) Main function and small helpers
+    YAML file. It uses the library code from scripts/lib.
 """
 
-from typing import Any, Dict, List
 import os.path
+import pathlib
 import sys
 import argparse
 import logging
-import yaml
-
-
-SPACE_PER_LEVEL = 4
-
-
-# RST Formatters
-# ==============
-def headroom(level: int) -> str:
-    """Return space to format"""
-    return " " * (level * SPACE_PER_LEVEL)
-
-
-def bold(text: str) -> str:
-    """Format bold text"""
-    return f"**{text}**"
-
-
-def inline(text: str) -> str:
-    """Format inline text"""
-    return f"``{text}``"
-
-
-def sanitize(text: str) -> str:
-    """Remove newlines and multiple spaces"""
-    # This is useful for some fields that are spread across multiple lines
-    return str(text).replace("\n", " ").strip()
-
-
-def rst_fields(key: str, value: str, level: int = 0) -> str:
-    """Return a RST formatted field"""
-    return headroom(level) + f":{key}: {value}"
-
-
-def rst_definition(key: str, value: Any, level: int = 0) -> str:
-    """Format a single rst definition"""
-    return headroom(level) + key + "\n" + headroom(level + 1) + str(value)
-
-
-def rst_paragraph(paragraph: str, level: int = 0) -> str:
-    """Return a formatted paragraph"""
-    return headroom(level) + paragraph
-
-
-def rst_bullet(item: str, level: int = 0) -> str:
-    """Return a formatted a bullet"""
-    return headroom(level) + f"- {item}"
-
-
-def rst_subsection(title: str) -> str:
-    """Add a sub-section to the document"""
-    return f"{title}\n" + "-" * len(title)
-
-
-def rst_subsubsection(title: str) -> str:
-    """Add a sub-sub-section to the document"""
-    return f"{title}\n" + "~" * len(title)
-
-
-def rst_section(namespace: str, prefix: str, title: str) -> str:
-    """Add a section to the document"""
-    return f".. _{namespace}-{prefix}-{title}:\n\n{title}\n" + "=" * len(title)
-
-
-def rst_subtitle(title: str) -> str:
-    """Add a subtitle to the document"""
-    return "\n" + "-" * len(title) + f"\n{title}\n" + "-" * len(title) + "\n\n"
-
-
-def rst_title(title: str) -> str:
-    """Add a title to the document"""
-    return "=" * len(title) + f"\n{title}\n" + "=" * len(title) + "\n\n"
-
-
-def rst_list_inline(list_: List[str], level: int = 0) -> str:
-    """Format a list using inlines"""
-    return headroom(level) + "[" + ", ".join(inline(i) for i in list_) + "]"
-
-
-def rst_ref(namespace: str, prefix: str, name: str) -> str:
-    """Add a hyperlink to the document"""
-    mappings = {'enum': 'definition',
-                'fixed-header': 'definition',
-                'nested-attributes': 'attribute-set',
-                'struct': 'definition'}
-    if prefix in mappings:
-        prefix = mappings[prefix]
-    return f":ref:`{namespace}-{prefix}-{name}`"
-
-
-def rst_header() -> str:
-    """The headers for all the auto generated RST files"""
-    lines = []
-
-    lines.append(rst_paragraph(".. SPDX-License-Identifier: GPL-2.0"))
-    lines.append(rst_paragraph(".. NOTE: This document was auto-generated.\n\n"))
-
-    return "\n".join(lines)
-
-
-def rst_toctree(maxdepth: int = 2) -> str:
-    """Generate a toctree RST primitive"""
-    lines = []
-
-    lines.append(".. toctree::")
-    lines.append(f"   :maxdepth: {maxdepth}\n\n")
-
-    return "\n".join(lines)
-
-
-def rst_label(title: str) -> str:
-    """Return a formatted label"""
-    return f".. _{title}:\n\n"
-
-
-# Parsers
-# =======
-
-
-def parse_mcast_group(mcast_group: List[Dict[str, Any]]) -> str:
-    """Parse 'multicast' group list and return a formatted string"""
-    lines = []
-    for group in mcast_group:
-        lines.append(rst_bullet(group["name"]))
-
-    return "\n".join(lines)
-
-
-def parse_do(do_dict: Dict[str, Any], level: int = 0) -> str:
-    """Parse 'do' section and return a formatted string"""
-    lines = []
-    for key in do_dict.keys():
-        lines.append(rst_paragraph(bold(key), level + 1))
-        if key in ['request', 'reply']:
-            lines.append(parse_do_attributes(do_dict[key], level + 1) + "\n")
-        else:
-            lines.append(headroom(level + 2) + do_dict[key] + "\n")
-
-    return "\n".join(lines)
-
-
-def parse_do_attributes(attrs: Dict[str, Any], level: int = 0) -> str:
-    """Parse 'attributes' section"""
-    if "attributes" not in attrs:
-        return ""
-    lines = [rst_fields("attributes", rst_list_inline(attrs["attributes"]), level + 1)]
-
-    return "\n".join(lines)
-
-
-def parse_operations(operations: List[Dict[str, Any]], namespace: str) -> str:
-    """Parse operations block"""
-    preprocessed = ["name", "doc", "title", "do", "dump", "flags"]
-    linkable = ["fixed-header", "attribute-set"]
-    lines = []
-
-    for operation in operations:
-        lines.append(rst_section(namespace, 'operation', operation["name"]))
-        lines.append(rst_paragraph(operation["doc"]) + "\n")
-
-        for key in operation.keys():
-            if key in preprocessed:
-                # Skip the special fields
-                continue
-            value = operation[key]
-            if key in linkable:
-                value = rst_ref(namespace, key, value)
-            lines.append(rst_fields(key, value, 0))
-        if 'flags' in operation:
-            lines.append(rst_fields('flags', rst_list_inline(operation['flags'])))
-
-        if "do" in operation:
-            lines.append(rst_paragraph(":do:", 0))
-            lines.append(parse_do(operation["do"], 0))
-        if "dump" in operation:
-            lines.append(rst_paragraph(":dump:", 0))
-            lines.append(parse_do(operation["dump"], 0))
-
-        # New line after fields
-        lines.append("\n")
-
-    return "\n".join(lines)
-
-
-def parse_entries(entries: List[Dict[str, Any]], level: int) -> str:
-    """Parse a list of entries"""
-    ignored = ["pad"]
-    lines = []
-    for entry in entries:
-        if isinstance(entry, dict):
-            # entries could be a list or a dictionary
-            field_name = entry.get("name", "")
-            if field_name in ignored:
-                continue
-            type_ = entry.get("type")
-            if type_:
-                field_name += f" ({inline(type_)})"
-            lines.append(
-                rst_fields(field_name, sanitize(entry.get("doc", "")), level)
-            )
-        elif isinstance(entry, list):
-            lines.append(rst_list_inline(entry, level))
-        else:
-            lines.append(rst_bullet(inline(sanitize(entry)), level))
-
-    lines.append("\n")
-    return "\n".join(lines)
-
-
-def parse_definitions(defs: Dict[str, Any], namespace: str) -> str:
-    """Parse definitions section"""
-    preprocessed = ["name", "entries", "members"]
-    ignored = ["render-max"]  # This is not printed
-    lines = []
-
-    for definition in defs:
-        lines.append(rst_section(namespace, 'definition', definition["name"]))
-        for k in definition.keys():
-            if k in preprocessed + ignored:
-                continue
-            lines.append(rst_fields(k, sanitize(definition[k]), 0))
-
-        # Field list needs to finish with a new line
-        lines.append("\n")
-        if "entries" in definition:
-            lines.append(rst_paragraph(":entries:", 0))
-            lines.append(parse_entries(definition["entries"], 1))
-        if "members" in definition:
-            lines.append(rst_paragraph(":members:", 0))
-            lines.append(parse_entries(definition["members"], 1))
-
-    return "\n".join(lines)
-
-
-def parse_attr_sets(entries: List[Dict[str, Any]], namespace: str) -> str:
-    """Parse attribute from attribute-set"""
-    preprocessed = ["name", "type"]
-    linkable = ["enum", "nested-attributes", "struct", "sub-message"]
-    ignored = ["checks"]
-    lines = []
-
-    for entry in entries:
-        lines.append(rst_section(namespace, 'attribute-set', entry["name"]))
-        for attr in entry["attributes"]:
-            type_ = attr.get("type")
-            attr_line = attr["name"]
-            if type_:
-                # Add the attribute type in the same line
-                attr_line += f" ({inline(type_)})"
-
-            lines.append(rst_subsubsection(attr_line))
-
-            for k in attr.keys():
-                if k in preprocessed + ignored:
-                    continue
-                if k in linkable:
-                    value = rst_ref(namespace, k, attr[k])
-                else:
-                    value = sanitize(attr[k])
-                lines.append(rst_fields(k, value, 0))
-            lines.append("\n")
-
-    return "\n".join(lines)
-
-
-def parse_sub_messages(entries: List[Dict[str, Any]], namespace: str) -> str:
-    """Parse sub-message definitions"""
-    lines = []
-
-    for entry in entries:
-        lines.append(rst_section(namespace, 'sub-message', entry["name"]))
-        for fmt in entry["formats"]:
-            value = fmt["value"]
-
-            lines.append(rst_bullet(bold(value)))
-            for attr in ['fixed-header', 'attribute-set']:
-                if attr in fmt:
-                    lines.append(rst_fields(attr,
-                                            rst_ref(namespace, attr, fmt[attr]),
-                                            1))
-            lines.append("\n")
-
-    return "\n".join(lines)
-
-
-def parse_yaml(obj: Dict[str, Any]) -> str:
-    """Format the whole YAML into a RST string"""
-    lines = []
-
-    # Main header
-
-    family = obj['name']
-
-    lines.append(rst_header())
-    lines.append(rst_label("netlink-" + family))
-
-    title = f"Family ``{family}`` netlink specification"
-    lines.append(rst_title(title))
-    lines.append(rst_paragraph(".. contents:: :depth: 3\n"))
-
-    if "doc" in obj:
-        lines.append(rst_subtitle("Summary"))
-        lines.append(rst_paragraph(obj["doc"], 0))
-
-    # Operations
-    if "operations" in obj:
-        lines.append(rst_subtitle("Operations"))
-        lines.append(parse_operations(obj["operations"]["list"], family))
-
-    # Multicast groups
-    if "mcast-groups" in obj:
-        lines.append(rst_subtitle("Multicast groups"))
-        lines.append(parse_mcast_group(obj["mcast-groups"]["list"]))
-
-    # Definitions
-    if "definitions" in obj:
-        lines.append(rst_subtitle("Definitions"))
-        lines.append(parse_definitions(obj["definitions"], family))
-
-    # Attributes set
-    if "attribute-sets" in obj:
-        lines.append(rst_subtitle("Attribute sets"))
-        lines.append(parse_attr_sets(obj["attribute-sets"], family))
-
-    # Sub-messages
-    if "sub-messages" in obj:
-        lines.append(rst_subtitle("Sub-messages"))
-        lines.append(parse_sub_messages(obj["sub-messages"], family))
-
-    return "\n".join(lines)
-
-
-# Main functions
-# ==============
 
+sys.path.append(pathlib.Path(__file__).resolve().parent.as_posix())
+from lib import YnlDocGenerator    # pylint: disable=C0413
 
 def parse_arguments() -> argparse.Namespace:
     """Parse arguments from user"""
@@ -392,15 +55,6 @@ def parse_arguments() -> argparse.Namespace:
     return args
 
 
-def parse_yaml_file(filename: str) -> str:
-    """Transform the YAML specified by filename into an RST-formatted string"""
-    with open(filename, "r", encoding="utf-8") as spec_file:
-        yaml_data = yaml.safe_load(spec_file)
-        content = parse_yaml(yaml_data)
-
-    return content
-
-
 def write_to_rstfile(content: str, filename: str) -> None:
     """Write the generated content into an RST file"""
     logging.debug("Saving RST file to %s", filename)
@@ -411,7 +65,6 @@ def write_to_rstfile(content: str, filename: str) -> None:
 
 def generate_main_index_rst(output: str) -> None:
     """Generate the `networking_spec/index` content and write to the file"""
-    lines = []
 
     lines.append(rst_header())
     lines.append(rst_label("specs"))
@@ -426,7 +79,7 @@ def generate_main_index_rst(output: str) -> None:
         lines.append(f"   {filename.replace('.rst', '')}\n")
 
     logging.debug("Writing an index file at %s", output)
-    write_to_rstfile("".join(lines), output)
+    write_to_rstfile(msg, output)
 
 
 def main() -> None:
@@ -434,10 +87,12 @@ def main() -> None:
 
     args = parse_arguments()
 
+    parser = YnlDocGenerator()
+
     if args.input:
         logging.debug("Parsing %s", args.input)
         try:
-            content = parse_yaml_file(os.path.join(args.input))
+            content = parser.parse_yaml_file(os.path.join(args.input))
         except Exception as exception:
             logging.warning("Failed to parse %s.", args.input)
             logging.warning(exception)
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 26+ messages in thread
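For readers following the split, the index-generation logic that moved into the library can be exercised standalone. The sketch below mirrors the behaviour of generate_main_index_rst() shown above (the helper names here are illustrative, not the library's actual API):

```python
import os

def rst_title(title: str) -> str:
    # Overline/underline a title with '=' bars, as the library's
    # rst_title() helper does.
    bar = "=" * len(title)
    return f"{bar}\n{title}\n{bar}\n\n"

def build_index(index_dir: str, index_fname: str = "index.rst") -> str:
    # Build a toctree index listing every .rst file in index_dir,
    # skipping the index file itself, in sorted order.
    lines = [rst_title("Netlink Family Specifications"),
             ".. toctree::\n   :maxdepth: 1\n\n"]
    ext = os.path.splitext(index_fname)[1]
    for filename in sorted(os.listdir(index_dir)):
        if not filename.endswith(ext) or filename == index_fname:
            continue
        base, _ = os.path.splitext(filename)
        lines.append(f"   {base}\n")
    return "".join(lines)
```

As in the patch, only the basename (without extension) is emitted as a toctree entry, so Sphinx resolves each document relative to the index.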

* [PATCH v7 05/17] docs: netlink: index.rst: add a netlink index file
  2025-06-19  6:48 [PATCH v7 00/17] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (3 preceding siblings ...)
  2025-06-19  6:48 ` [PATCH v7 04/17] tools: ynl_gen_rst.py: Split library from command line tool Mauro Carvalho Chehab
@ 2025-06-19  6:48 ` Mauro Carvalho Chehab
  2025-06-19  6:48 ` [PATCH v7 06/17] tools: ynl_gen_rst.py: cleanup coding style Mauro Carvalho Chehab
                   ` (12 subsequent siblings)
  17 siblings, 0 replies; 26+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-19  6:48 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Instead of generating the index file, use a :glob: toctree to
automatically include all documents generated from the yaml specs.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
Reviewed-by: Donald Hunter <donald.hunter@gmail.com>
---
 Documentation/netlink/specs/index.rst | 13 +++++++++++++
 1 file changed, 13 insertions(+)
 create mode 100644 Documentation/netlink/specs/index.rst

diff --git a/Documentation/netlink/specs/index.rst b/Documentation/netlink/specs/index.rst
new file mode 100644
index 000000000000..7f7cf4a096f2
--- /dev/null
+++ b/Documentation/netlink/specs/index.rst
@@ -0,0 +1,13 @@
+.. SPDX-License-Identifier: GPL-2.0
+
+.. _specs:
+
+=============================
+Netlink Family Specifications
+=============================
+
+.. toctree::
+   :maxdepth: 1
+   :glob:
+
+   *
-- 
2.49.0



* [PATCH v7 06/17] tools: ynl_gen_rst.py: cleanup coding style
  2025-06-19  6:48 [PATCH v7 00/17] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (4 preceding siblings ...)
  2025-06-19  6:48 ` [PATCH v7 05/17] docs: netlink: index.rst: add a netlink index file Mauro Carvalho Chehab
@ 2025-06-19  6:48 ` Mauro Carvalho Chehab
  2025-06-19 12:04   ` Donald Hunter
  2025-06-19  6:49 ` [PATCH v7 07/17] docs: sphinx: add a parser for yaml files for Netlink specs Mauro Carvalho Chehab
                   ` (11 subsequent siblings)
  17 siblings, 1 reply; 26+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-19  6:48 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Clean up some coding style issues pointed out by pylint and flake8.

No functional changes.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
Reviewed-by: Breno Leitao <leitao@debian.org>
---
 tools/net/ynl/pyynl/lib/doc_generator.py | 75 ++++++++----------------
 1 file changed, 26 insertions(+), 49 deletions(-)

diff --git a/tools/net/ynl/pyynl/lib/doc_generator.py b/tools/net/ynl/pyynl/lib/doc_generator.py
index 839e78b39de3..f71360f0ceb7 100644
--- a/tools/net/ynl/pyynl/lib/doc_generator.py
+++ b/tools/net/ynl/pyynl/lib/doc_generator.py
@@ -18,17 +18,12 @@
 """
 
 from typing import Any, Dict, List
-import os.path
-import sys
-import argparse
-import logging
 import yaml
 
 
-# ==============
-# RST Formatters
-# ==============
 class RstFormatters:
+    """RST Formatters"""
+
     SPACE_PER_LEVEL = 4
 
     @staticmethod
@@ -36,81 +31,67 @@ class RstFormatters:
         """Return space to format"""
         return " " * (level * RstFormatters.SPACE_PER_LEVEL)
 
-
     @staticmethod
     def bold(text: str) -> str:
         """Format bold text"""
         return f"**{text}**"
 
-
     @staticmethod
     def inline(text: str) -> str:
         """Format inline text"""
         return f"``{text}``"
 
-
     @staticmethod
     def sanitize(text: str) -> str:
         """Remove newlines and multiple spaces"""
         # This is useful for some fields that are spread across multiple lines
         return str(text).replace("\n", " ").strip()
 
-
     def rst_fields(self, key: str, value: str, level: int = 0) -> str:
         """Return a RST formatted field"""
         return self.headroom(level) + f":{key}: {value}"
 
-
     def rst_definition(self, key: str, value: Any, level: int = 0) -> str:
         """Format a single rst definition"""
         return self.headroom(level) + key + "\n" + self.headroom(level + 1) + str(value)
 
-
     def rst_paragraph(self, paragraph: str, level: int = 0) -> str:
         """Return a formatted paragraph"""
         return self.headroom(level) + paragraph
 
-
     def rst_bullet(self, item: str, level: int = 0) -> str:
         """Return a formatted a bullet"""
         return self.headroom(level) + f"- {item}"
 
-
     @staticmethod
     def rst_subsection(title: str) -> str:
         """Add a sub-section to the document"""
         return f"{title}\n" + "-" * len(title)
 
-
     @staticmethod
     def rst_subsubsection(title: str) -> str:
         """Add a sub-sub-section to the document"""
         return f"{title}\n" + "~" * len(title)
 
-
     @staticmethod
     def rst_section(namespace: str, prefix: str, title: str) -> str:
         """Add a section to the document"""
         return f".. _{namespace}-{prefix}-{title}:\n\n{title}\n" + "=" * len(title)
 
-
     @staticmethod
     def rst_subtitle(title: str) -> str:
         """Add a subtitle to the document"""
         return "\n" + "-" * len(title) + f"\n{title}\n" + "-" * len(title) + "\n\n"
 
-
     @staticmethod
     def rst_title(title: str) -> str:
         """Add a title to the document"""
         return "=" * len(title) + f"\n{title}\n" + "=" * len(title) + "\n\n"
 
-
     def rst_list_inline(self, list_: List[str], level: int = 0) -> str:
         """Format a list using inlines"""
         return self.headroom(level) + "[" + ", ".join(self.inline(i) for i in list_) + "]"
 
-
     @staticmethod
     def rst_ref(namespace: str, prefix: str, name: str) -> str:
         """Add a hyperlink to the document"""
@@ -119,10 +100,9 @@ class RstFormatters:
                     'nested-attributes': 'attribute-set',
                     'struct': 'definition'}
         if prefix in mappings:
-            prefix = mappings[prefix]
+            prefix = mappings.get(prefix, "")
         return f":ref:`{namespace}-{prefix}-{name}`"
 
-
     def rst_header(self) -> str:
         """The headers for all the auto generated RST files"""
         lines = []
@@ -132,7 +112,6 @@ class RstFormatters:
 
         return "\n".join(lines)
 
-
     @staticmethod
     def rst_toctree(maxdepth: int = 2) -> str:
         """Generate a toctree RST primitive"""
@@ -143,16 +122,13 @@ class RstFormatters:
 
         return "\n".join(lines)
 
-
     @staticmethod
     def rst_label(title: str) -> str:
         """Return a formatted label"""
         return f".. _{title}:\n\n"
 
-# =======
-# Parsers
-# =======
 class YnlDocGenerator:
+    """YAML Netlink specs Parser"""
 
     fmt = RstFormatters()
 
@@ -164,7 +140,6 @@ class YnlDocGenerator:
 
         return "\n".join(lines)
 
-
     def parse_do(self, do_dict: Dict[str, Any], level: int = 0) -> str:
         """Parse 'do' section and return a formatted string"""
         lines = []
@@ -177,16 +152,16 @@ class YnlDocGenerator:
 
         return "\n".join(lines)
 
-
     def parse_do_attributes(self, attrs: Dict[str, Any], level: int = 0) -> str:
         """Parse 'attributes' section"""
         if "attributes" not in attrs:
             return ""
-        lines = [self.fmt.rst_fields("attributes", self.fmt.rst_list_inline(attrs["attributes"]), level + 1)]
+        lines = [self.fmt.rst_fields("attributes",
+                                     self.fmt.rst_list_inline(attrs["attributes"]),
+                                     level + 1)]
 
         return "\n".join(lines)
 
-
     def parse_operations(self, operations: List[Dict[str, Any]], namespace: str) -> str:
         """Parse operations block"""
         preprocessed = ["name", "doc", "title", "do", "dump", "flags"]
@@ -194,7 +169,8 @@ class YnlDocGenerator:
         lines = []
 
         for operation in operations:
-            lines.append(self.fmt.rst_section(namespace, 'operation', operation["name"]))
+            lines.append(self.fmt.rst_section(namespace, 'operation',
+                                              operation["name"]))
             lines.append(self.fmt.rst_paragraph(operation["doc"]) + "\n")
 
             for key in operation.keys():
@@ -206,7 +182,8 @@ class YnlDocGenerator:
                     value = self.fmt.rst_ref(namespace, key, value)
                 lines.append(self.fmt.rst_fields(key, value, 0))
             if 'flags' in operation:
-                lines.append(self.fmt.rst_fields('flags', self.fmt.rst_list_inline(operation['flags'])))
+                lines.append(self.fmt.rst_fields('flags',
+                                                 self.fmt.rst_list_inline(operation['flags'])))
 
             if "do" in operation:
                 lines.append(self.fmt.rst_paragraph(":do:", 0))
@@ -220,7 +197,6 @@ class YnlDocGenerator:
 
         return "\n".join(lines)
 
-
     def parse_entries(self, entries: List[Dict[str, Any]], level: int) -> str:
         """Parse a list of entries"""
         ignored = ["pad"]
@@ -235,17 +211,19 @@ class YnlDocGenerator:
                 if type_:
                     field_name += f" ({self.fmt.inline(type_)})"
                 lines.append(
-                    self.fmt.rst_fields(field_name, self.fmt.sanitize(entry.get("doc", "")), level)
+                    self.fmt.rst_fields(field_name,
+                                        self.fmt.sanitize(entry.get("doc", "")),
+                                        level)
                 )
             elif isinstance(entry, list):
                 lines.append(self.fmt.rst_list_inline(entry, level))
             else:
-                lines.append(self.fmt.rst_bullet(self.fmt.inline(self.fmt.sanitize(entry)), level))
+                lines.append(self.fmt.rst_bullet(self.fmt.inline(self.fmt.sanitize(entry)),
+                                                 level))
 
         lines.append("\n")
         return "\n".join(lines)
 
-
     def parse_definitions(self, defs: Dict[str, Any], namespace: str) -> str:
         """Parse definitions section"""
         preprocessed = ["name", "entries", "members"]
@@ -270,7 +248,6 @@ class YnlDocGenerator:
 
         return "\n".join(lines)
 
-
     def parse_attr_sets(self, entries: List[Dict[str, Any]], namespace: str) -> str:
         """Parse attribute from attribute-set"""
         preprocessed = ["name", "type"]
@@ -279,7 +256,8 @@ class YnlDocGenerator:
         lines = []
 
         for entry in entries:
-            lines.append(self.fmt.rst_section(namespace, 'attribute-set', entry["name"]))
+            lines.append(self.fmt.rst_section(namespace, 'attribute-set',
+                                              entry["name"]))
             for attr in entry["attributes"]:
                 type_ = attr.get("type")
                 attr_line = attr["name"]
@@ -301,13 +279,13 @@ class YnlDocGenerator:
 
         return "\n".join(lines)
 
-
     def parse_sub_messages(self, entries: List[Dict[str, Any]], namespace: str) -> str:
         """Parse sub-message definitions"""
         lines = []
 
         for entry in entries:
-            lines.append(self.fmt.rst_section(namespace, 'sub-message', entry["name"]))
+            lines.append(self.fmt.rst_section(namespace, 'sub-message',
+                                              entry["name"]))
             for fmt in entry["formats"]:
                 value = fmt["value"]
 
@@ -315,13 +293,14 @@ class YnlDocGenerator:
                 for attr in ['fixed-header', 'attribute-set']:
                     if attr in fmt:
                         lines.append(self.fmt.rst_fields(attr,
-                                                self.fmt.rst_ref(namespace, attr, fmt[attr]),
-                                                1))
+                                                         self.fmt.rst_ref(namespace,
+                                                                          attr,
+                                                                          fmt[attr]),
+                                                         1))
                 lines.append("\n")
 
         return "\n".join(lines)
 
-
     def parse_yaml(self, obj: Dict[str, Any]) -> str:
         """Format the whole YAML into a RST string"""
         lines = []
@@ -344,7 +323,8 @@ class YnlDocGenerator:
         # Operations
         if "operations" in obj:
             lines.append(self.fmt.rst_subtitle("Operations"))
-            lines.append(self.parse_operations(obj["operations"]["list"], family))
+            lines.append(self.parse_operations(obj["operations"]["list"],
+                                               family))
 
         # Multicast groups
         if "mcast-groups" in obj:
@@ -368,11 +348,9 @@ class YnlDocGenerator:
 
         return "\n".join(lines)
 
-
     # Main functions
     # ==============
 
-
     def parse_yaml_file(self, filename: str) -> str:
         """Transform the YAML specified by filename into an RST-formatted string"""
         with open(filename, "r", encoding="utf-8") as spec_file:
@@ -381,7 +359,6 @@ class YnlDocGenerator:
 
         return content
 
-
     def generate_main_index_rst(self, output: str, index_dir: str) -> None:
         """Generate the `networking_spec/index` content and write to the file"""
         lines = []
-- 
2.49.0



* [PATCH v7 07/17] docs: sphinx: add a parser for yaml files for Netlink specs
  2025-06-19  6:48 [PATCH v7 00/17] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (5 preceding siblings ...)
  2025-06-19  6:48 ` [PATCH v7 06/17] tools: ynl_gen_rst.py: cleanup coding style Mauro Carvalho Chehab
@ 2025-06-19  6:49 ` Mauro Carvalho Chehab
  2025-06-19 12:08   ` Donald Hunter
  2025-06-19  6:49 ` [PATCH v7 08/17] docs: use parser_yaml extension to handle " Mauro Carvalho Chehab
                   ` (10 subsequent siblings)
  17 siblings, 1 reply; 26+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-19  6:49 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	joel, linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz,
	stern

Add a simple sphinx.Parser to handle yaml files, together with
the code to handle Netlink specs. All other yaml files are
ignored.

The code was written in a way that makes it easy to parse yaml
for different subsystems, and even for different parts of Netlink.

All it takes to have a different parser is to add an
import line similar to:

	from netlink_yml_parser import YnlDocGenerator

and instantiating the corresponding parser somewhere in the extension:

	netlink_parser = YnlDocGenerator()

Then add logic inside parse() to handle different doc
outputs, depending on the file location, similar to:

        if "/netlink/specs/" in fname:
            msg = self.netlink_parser.parse_yaml_file(fname)
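Stripped of the Sphinx plumbing, that dispatch reduces to a path check; a minimal sketch (the handler tag below is illustrative, not part of the extension):

```python
def pick_handler(fname: str):
    # Mirror the dispatch in YamlParser.parse(): only yaml files under
    # Documentation/netlink/specs/ get converted to RST; every other
    # yaml file is silently ignored (no handler returned).
    if "/netlink/specs/" in fname:
        return "netlink"  # would call YnlDocGenerator().parse_yaml_file()
    return None
```

Adding support for another yaml flavor is then just another substring check returning a different converter.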

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 Documentation/sphinx/parser_yaml.py | 76 +++++++++++++++++++++++++++++
 1 file changed, 76 insertions(+)
 create mode 100755 Documentation/sphinx/parser_yaml.py

diff --git a/Documentation/sphinx/parser_yaml.py b/Documentation/sphinx/parser_yaml.py
new file mode 100755
index 000000000000..635945e1c5ba
--- /dev/null
+++ b/Documentation/sphinx/parser_yaml.py
@@ -0,0 +1,76 @@
+"""
+Sphinx extension for processing YAML files
+"""
+
+import os
+import re
+import sys
+
+from pprint import pformat
+
+from docutils.parsers.rst import Parser as RSTParser
+from docutils.statemachine import ViewList
+
+from sphinx.util import logging
+from sphinx.parsers import Parser
+
+srctree = os.path.abspath(os.environ["srctree"])
+sys.path.insert(0, os.path.join(srctree, "tools/net/ynl/pyynl"))
+
+from netlink_yml_parser import YnlDocGenerator        # pylint: disable=C0413
+
+logger = logging.getLogger(__name__)
+
+class YamlParser(Parser):
+    """Custom parser for YAML files."""
+
+    # Need at least two elements on this set
+    supported = ('yaml', 'yml')
+
+    netlink_parser = YnlDocGenerator()
+
+    def do_parse(self, inputstring, document, msg):
+        """Parse YAML and generate a document tree."""
+
+        self.setup_parse(inputstring, document)
+
+        result = ViewList()
+
+        try:
+            # Parse message with RSTParser
+            for i, line in enumerate(msg.split('\n')):
+                result.append(line, document.current_source, i)
+
+            rst_parser = RSTParser()
+            rst_parser.parse('\n'.join(result), document)
+
+        except Exception as e:
+            document.reporter.error("YAML parsing error: %s" % pformat(e))
+
+        self.finish_parse()
+
+    # Overrides docutils.parsers.Parser. See sphinx.parsers.RSTParser
+    def parse(self, inputstring, document):
+        """Check if a YAML is meant to be parsed."""
+
+        fname = document.current_source
+
+        # Handle netlink yaml specs
+        if "/netlink/specs/" in fname:
+            msg = self.netlink_parser.parse_yaml_file(fname)
+            self.do_parse(inputstring, document, msg)
+
+        # All other yaml files are ignored
+
+def setup(app):
+    """Setup function for the Sphinx extension."""
+
+    # Add YAML parser
+    app.add_source_parser(YamlParser)
+    app.add_source_suffix('.yaml', 'yaml')
+
+    return {
+        'version': '1.0',
+        'parallel_read_safe': True,
+        'parallel_write_safe': True,
+    }
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 26+ messages in thread

* [PATCH v7 08/17] docs: use parser_yaml extension to handle Netlink specs
  2025-06-19  6:48 [PATCH v7 00/17] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (6 preceding siblings ...)
  2025-06-19  6:49 ` [PATCH v7 07/17] docs: sphinx: add a parser for yaml files for Netlink specs Mauro Carvalho Chehab
@ 2025-06-19  6:49 ` Mauro Carvalho Chehab
  2025-06-19  6:49 ` [PATCH v7 09/17] docs: uapi: netlink: update netlink specs link Mauro Carvalho Chehab
                   ` (9 subsequent siblings)
  17 siblings, 0 replies; 26+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-19  6:49 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Instead of manually calling ynl_gen_rst.py, use a Sphinx
extension. This way, no .rst files are written to the kernel
source directories.

We use a toctree with the :glob: option here. This way, there
is no need to touch the netlink/specs/index.rst file every time
a Netlink spec is added, renamed, or removed.
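For reference, such a glob-based index can be sketched like this
(an illustrative fragment; the actual index.rst added by this
series may differ in wording):

```rst
.. SPDX-License-Identifier: GPL-2.0

.. _specs:

=============================
Netlink Family Specifications
=============================

.. toctree::
   :maxdepth: 1
   :glob:

   *
```

With :glob:, Sphinx picks up every spec in the directory
automatically, so the toctree never needs manual updates.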

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
Reviewed-by: Donald Hunter <donald.hunter@gmail.com>
---
 Documentation/Makefile                        | 17 ----------------
 Documentation/conf.py                         | 20 ++++++++++++++-----
 Documentation/networking/index.rst            |  2 +-
 .../networking/netlink_spec/readme.txt        |  4 ----
 Documentation/sphinx/parser_yaml.py           |  4 ++--
 5 files changed, 18 insertions(+), 29 deletions(-)
 delete mode 100644 Documentation/networking/netlink_spec/readme.txt

diff --git a/Documentation/Makefile b/Documentation/Makefile
index b98477df5ddf..820f07e0afe6 100644
--- a/Documentation/Makefile
+++ b/Documentation/Makefile
@@ -104,22 +104,6 @@ quiet_cmd_sphinx = SPHINX  $@ --> file://$(abspath $(BUILDDIR)/$3/$4)
 		cp $(if $(patsubst /%,,$(DOCS_CSS)),$(abspath $(srctree)/$(DOCS_CSS)),$(DOCS_CSS)) $(BUILDDIR)/$3/_static/; \
 	fi
 
-YNL_INDEX:=$(srctree)/Documentation/networking/netlink_spec/index.rst
-YNL_RST_DIR:=$(srctree)/Documentation/networking/netlink_spec
-YNL_YAML_DIR:=$(srctree)/Documentation/netlink/specs
-YNL_TOOL:=$(srctree)/tools/net/ynl/pyynl/ynl_gen_rst.py
-
-YNL_RST_FILES_TMP := $(patsubst %.yaml,%.rst,$(wildcard $(YNL_YAML_DIR)/*.yaml))
-YNL_RST_FILES := $(patsubst $(YNL_YAML_DIR)%,$(YNL_RST_DIR)%, $(YNL_RST_FILES_TMP))
-
-$(YNL_INDEX): $(YNL_RST_FILES)
-	$(Q)$(YNL_TOOL) -o $@ -x
-
-$(YNL_RST_DIR)/%.rst: $(YNL_YAML_DIR)/%.yaml $(YNL_TOOL)
-	$(Q)$(YNL_TOOL) -i $< -o $@
-
-htmldocs texinfodocs latexdocs epubdocs xmldocs: $(YNL_INDEX)
-
 htmldocs:
 	@$(srctree)/scripts/sphinx-pre-install --version-check
 	@+$(foreach var,$(SPHINXDIRS),$(call loop_cmd,sphinx,html,$(var),,$(var)))
@@ -186,7 +170,6 @@ refcheckdocs:
 	$(Q)cd $(srctree);scripts/documentation-file-ref-check
 
 cleandocs:
-	$(Q)rm -f $(YNL_INDEX) $(YNL_RST_FILES)
 	$(Q)rm -rf $(BUILDDIR)
 	$(Q)$(MAKE) BUILDDIR=$(abspath $(BUILDDIR)) $(build)=Documentation/userspace-api/media clean
 
diff --git a/Documentation/conf.py b/Documentation/conf.py
index 4ba4ee45e599..6af61e26cec5 100644
--- a/Documentation/conf.py
+++ b/Documentation/conf.py
@@ -38,6 +38,15 @@ exclude_patterns = []
 dyn_include_patterns = []
 dyn_exclude_patterns = ['output']
 
+# Currently, only netlink/specs has a parser for yaml.
+# Prefer using include patterns if available, as it is faster
+if has_include_patterns:
+    dyn_include_patterns.append('netlink/specs/*.yaml')
+else:
+    dyn_exclude_patterns.append('netlink/*.yaml')
+    dyn_exclude_patterns.append('devicetree/bindings/**.yaml')
+    dyn_exclude_patterns.append('core-api/kho/bindings/**.yaml')
+
 # Properly handle include/exclude patterns
 # ----------------------------------------
 
@@ -105,7 +114,7 @@ needs_sphinx = '3.4.3'
 extensions = ['kerneldoc', 'rstFlatTable', 'kernel_include',
               'kfigure', 'sphinx.ext.ifconfig', 'automarkup',
               'maintainers_include', 'sphinx.ext.autosectionlabel',
-              'kernel_abi', 'kernel_feat', 'translations']
+              'kernel_abi', 'kernel_feat', 'translations', 'parser_yaml']
 
 # Since Sphinx version 3, the C function parser is more pedantic with regards
 # to type checking. Due to that, having macros at c:function cause problems.
@@ -203,10 +212,11 @@ else:
 # Add any paths that contain templates here, relative to this directory.
 templates_path = ['sphinx/templates']
 
-# The suffix(es) of source filenames.
-# You can specify multiple suffix as a list of string:
-# source_suffix = ['.rst', '.md']
-source_suffix = '.rst'
+# The suffixes of source filenames that will be automatically parsed
+source_suffix = {
+        '.rst': 'restructuredtext',
+        '.yaml': 'yaml',
+}
 
 # The encoding of source files.
 #source_encoding = 'utf-8-sig'
diff --git a/Documentation/networking/index.rst b/Documentation/networking/index.rst
index ac90b82f3ce9..b7a4969e9bc9 100644
--- a/Documentation/networking/index.rst
+++ b/Documentation/networking/index.rst
@@ -57,7 +57,7 @@ Contents:
    filter
    generic-hdlc
    generic_netlink
-   netlink_spec/index
+   ../netlink/specs/index
    gen_stats
    gtp
    ila
diff --git a/Documentation/networking/netlink_spec/readme.txt b/Documentation/networking/netlink_spec/readme.txt
deleted file mode 100644
index 030b44aca4e6..000000000000
--- a/Documentation/networking/netlink_spec/readme.txt
+++ /dev/null
@@ -1,4 +0,0 @@
-SPDX-License-Identifier: GPL-2.0
-
-This file is populated during the build of the documentation (htmldocs) by the
-tools/net/ynl/pyynl/ynl_gen_rst.py script.
diff --git a/Documentation/sphinx/parser_yaml.py b/Documentation/sphinx/parser_yaml.py
index 635945e1c5ba..2b2af239a1c2 100755
--- a/Documentation/sphinx/parser_yaml.py
+++ b/Documentation/sphinx/parser_yaml.py
@@ -15,9 +15,9 @@ from sphinx.util import logging
 from sphinx.parsers import Parser
 
 srctree = os.path.abspath(os.environ["srctree"])
-sys.path.insert(0, os.path.join(srctree, "tools/net/ynl/pyynl"))
+sys.path.insert(0, os.path.join(srctree, "tools/net/ynl/pyynl/lib"))
 
-from netlink_yml_parser import YnlDocGenerator        # pylint: disable=C0413
+from doc_generator import YnlDocGenerator        # pylint: disable=C0413
 
 logger = logging.getLogger(__name__)
 
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 26+ messages in thread

* [PATCH v7 09/17] docs: uapi: netlink: update netlink specs link
  2025-06-19  6:48 [PATCH v7 00/17] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (7 preceding siblings ...)
  2025-06-19  6:49 ` [PATCH v7 08/17] docs: use parser_yaml extension to handle " Mauro Carvalho Chehab
@ 2025-06-19  6:49 ` Mauro Carvalho Chehab
  2025-06-19  6:49 ` [PATCH v7 10/17] tools: ynl_gen_rst.py: drop support for generating index files Mauro Carvalho Chehab
                   ` (8 subsequent siblings)
  17 siblings, 0 replies; 26+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-19  6:49 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

With the recent parser_yaml extension and the removal of the
auto-generated ReST source files, the location of the netlink
specs changed.

Update the uAPI documentation links accordingly.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
Reviewed-by: Donald Hunter <donald.hunter@gmail.com>
---
 Documentation/userspace-api/netlink/index.rst | 2 +-
 Documentation/userspace-api/netlink/specs.rst | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/Documentation/userspace-api/netlink/index.rst b/Documentation/userspace-api/netlink/index.rst
index c1b6765cc963..83ae25066591 100644
--- a/Documentation/userspace-api/netlink/index.rst
+++ b/Documentation/userspace-api/netlink/index.rst
@@ -18,4 +18,4 @@ Netlink documentation for users.
 
 See also:
  - :ref:`Documentation/core-api/netlink.rst <kernel_netlink>`
- - :ref:`Documentation/networking/netlink_spec/index.rst <specs>`
+ - :ref:`Documentation/netlink/specs/index.rst <specs>`
diff --git a/Documentation/userspace-api/netlink/specs.rst b/Documentation/userspace-api/netlink/specs.rst
index 1b50d97d8d7c..debb4bfca5c4 100644
--- a/Documentation/userspace-api/netlink/specs.rst
+++ b/Documentation/userspace-api/netlink/specs.rst
@@ -15,7 +15,7 @@ kernel headers directly.
 Internally kernel uses the YAML specs to generate:
 
  - the C uAPI header
- - documentation of the protocol as a ReST file - see :ref:`Documentation/networking/netlink_spec/index.rst <specs>`
+ - documentation of the protocol as a ReST file - see :ref:`Documentation/netlink/specs/index.rst <specs>`
  - policy tables for input attribute validation
  - operation tables
 
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 26+ messages in thread

* [PATCH v7 10/17] tools: ynl_gen_rst.py: drop support for generating index files
  2025-06-19  6:48 [PATCH v7 00/17] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (8 preceding siblings ...)
  2025-06-19  6:49 ` [PATCH v7 09/17] docs: uapi: netlink: update netlink specs link Mauro Carvalho Chehab
@ 2025-06-19  6:49 ` Mauro Carvalho Chehab
  2025-06-19  6:49 ` [PATCH v7 11/17] docs: netlink: remove obsolete .gitignore from unused directory Mauro Carvalho Chehab
                   ` (7 subsequent siblings)
  17 siblings, 0 replies; 26+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-19  6:49 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

As we're now using an index file with a glob, there's no need
to generate index files anymore.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 tools/net/ynl/pyynl/lib/doc_generator.py | 26 ------------------------
 tools/net/ynl/pyynl/ynl_gen_rst.py       | 26 ------------------------
 2 files changed, 52 deletions(-)

diff --git a/tools/net/ynl/pyynl/lib/doc_generator.py b/tools/net/ynl/pyynl/lib/doc_generator.py
index f71360f0ceb7..866551726723 100644
--- a/tools/net/ynl/pyynl/lib/doc_generator.py
+++ b/tools/net/ynl/pyynl/lib/doc_generator.py
@@ -358,29 +358,3 @@ class YnlDocGenerator:
             content = self.parse_yaml(yaml_data)
 
         return content
-
-    def generate_main_index_rst(self, output: str, index_dir: str) -> None:
-        """Generate the `networking_spec/index` content and write to the file"""
-        lines = []
-
-        lines.append(self.fmt.rst_header())
-        lines.append(self.fmt.rst_label("specs"))
-        lines.append(self.fmt.rst_title("Netlink Family Specifications"))
-        lines.append(self.fmt.rst_toctree(1))
-
-        index_fname = os.path.basename(output)
-        base, ext = os.path.splitext(index_fname)
-
-        if not index_dir:
-            index_dir = os.path.dirname(output)
-
-        logging.debug(f"Looking for {ext} files in %s", index_dir)
-        for filename in sorted(os.listdir(index_dir)):
-            if not filename.endswith(ext) or filename == index_fname:
-                continue
-            base, ext = os.path.splitext(filename)
-            lines.append(f"   {base}\n")
-
-        logging.debug("Writing an index file at %s", output)
-
-        return "".join(lines)
diff --git a/tools/net/ynl/pyynl/ynl_gen_rst.py b/tools/net/ynl/pyynl/ynl_gen_rst.py
index b5a665eeaa5a..90ae19aac89d 100755
--- a/tools/net/ynl/pyynl/ynl_gen_rst.py
+++ b/tools/net/ynl/pyynl/ynl_gen_rst.py
@@ -31,9 +31,6 @@ def parse_arguments() -> argparse.Namespace:
 
     # Index and input are mutually exclusive
     group = parser.add_mutually_exclusive_group()
-    group.add_argument(
-        "-x", "--index", action="store_true", help="Generate the index page"
-    )
     group.add_argument("-i", "--input", help="YAML file name")
 
     args = parser.parse_args()
@@ -63,25 +60,6 @@ def write_to_rstfile(content: str, filename: str) -> None:
         rst_file.write(content)
 
 
-def generate_main_index_rst(output: str) -> None:
-    """Generate the `networking_spec/index` content and write to the file"""
-
-    lines.append(rst_header())
-    lines.append(rst_label("specs"))
-    lines.append(rst_title("Netlink Family Specifications"))
-    lines.append(rst_toctree(1))
-
-    index_dir = os.path.dirname(output)
-    logging.debug("Looking for .rst files in %s", index_dir)
-    for filename in sorted(os.listdir(index_dir)):
-        if not filename.endswith(".rst") or filename == "index.rst":
-            continue
-        lines.append(f"   {filename.replace('.rst', '')}\n")
-
-    logging.debug("Writing an index file at %s", output)
-    write_to_rstfile(msg, output)
-
-
 def main() -> None:
     """Main function that reads the YAML files and generates the RST files"""
 
@@ -100,10 +78,6 @@ def main() -> None:
 
         write_to_rstfile(content, args.output)
 
-    if args.index:
-        # Generate the index RST file
-        generate_main_index_rst(args.output)
-
 
 if __name__ == "__main__":
     main()
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 26+ messages in thread

* [PATCH v7 11/17] docs: netlink: remove obsolete .gitignore from unused directory
  2025-06-19  6:48 [PATCH v7 00/17] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (9 preceding siblings ...)
  2025-06-19  6:49 ` [PATCH v7 10/17] tools: ynl_gen_rst.py: drop support for generating index files Mauro Carvalho Chehab
@ 2025-06-19  6:49 ` Mauro Carvalho Chehab
  2025-06-19  6:49 ` [PATCH v7 12/17] MAINTAINERS: add netlink_yml_parser.py to linux-doc Mauro Carvalho Chehab
                   ` (6 subsequent siblings)
  17 siblings, 0 replies; 26+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-19  6:49 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

The previous code generated ReST source files under
Documentation/networking/netlink_spec/. With the Sphinx YAML
parser, this is now gone, so stop ignoring *.rst files inside
the netlink specs directory.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
Reviewed-by: Donald Hunter <donald.hunter@gmail.com>
---
 Documentation/networking/netlink_spec/.gitignore | 1 -
 1 file changed, 1 deletion(-)
 delete mode 100644 Documentation/networking/netlink_spec/.gitignore

diff --git a/Documentation/networking/netlink_spec/.gitignore b/Documentation/networking/netlink_spec/.gitignore
deleted file mode 100644
index 30d85567b592..000000000000
--- a/Documentation/networking/netlink_spec/.gitignore
+++ /dev/null
@@ -1 +0,0 @@
-*.rst
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 26+ messages in thread

* [PATCH v7 12/17] MAINTAINERS: add netlink_yml_parser.py to linux-doc
  2025-06-19  6:48 [PATCH v7 00/17] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (10 preceding siblings ...)
  2025-06-19  6:49 ` [PATCH v7 11/17] docs: netlink: remove obsolete .gitignore from unused directory Mauro Carvalho Chehab
@ 2025-06-19  6:49 ` Mauro Carvalho Chehab
  2025-06-19 12:10   ` Donald Hunter
  2025-06-19  6:49 ` [PATCH v7 13/17] tools: netlink_yml_parser.py: add line numbers to parsed data Mauro Carvalho Chehab
                   ` (5 subsequent siblings)
  17 siblings, 1 reply; 26+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-19  6:49 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	joel, linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz,
	stern

The documentation build depends on the parsing code at
ynl_gen_rst.py. Add a MAINTAINERS entry so that changes to it
are Cc'ed to the linux-doc ML and maintainers. This way, doc
developers will know in advance about changes that could affect
the build or the minimum required Python version.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
Reviewed-by: Breno Leitao <leitao@debian.org>
---
 MAINTAINERS | 1 +
 1 file changed, 1 insertion(+)

diff --git a/MAINTAINERS b/MAINTAINERS
index a92290fffa16..caa3425e5755 100644
--- a/MAINTAINERS
+++ b/MAINTAINERS
@@ -7202,6 +7202,7 @@ F:	scripts/get_abi.py
 F:	scripts/kernel-doc*
 F:	scripts/lib/abi/*
 F:	scripts/lib/kdoc/*
+F:	tools/net/ynl/pyynl/netlink_yml_parser.py
 F:	scripts/sphinx-pre-install
 X:	Documentation/ABI/
 X:	Documentation/admin-guide/media/
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 26+ messages in thread

* [PATCH v7 13/17] tools: netlink_yml_parser.py: add line numbers to parsed data
  2025-06-19  6:48 [PATCH v7 00/17] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (11 preceding siblings ...)
  2025-06-19  6:49 ` [PATCH v7 12/17] MAINTAINERS: add netlink_yml_parser.py to linux-doc Mauro Carvalho Chehab
@ 2025-06-19  6:49 ` Mauro Carvalho Chehab
  2025-06-19  6:49 ` [PATCH v7 14/17] docs: parser_yaml.py: add support for line numbers from the parser Mauro Carvalho Chehab
                   ` (4 subsequent siblings)
  17 siblings, 0 replies; 26+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-19  6:49 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

When something goes wrong, we want Sphinx errors to point to
the right line number in the original source, not in the
processed ReST data.
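The idea can be sketched outside the kernel tree as follows (a
minimal standalone example, assuming PyYAML is available; the
class and key names mirror the patch, but the sample spec is
made up):

```python
import yaml

LINE_STR = '__lineno__'

class NumberedSafeLoader(yaml.SafeLoader):
    """SafeLoader subclass that records the source line of each mapping."""

    def construct_mapping(self, node, deep=False):
        mapping = super().construct_mapping(node, deep=deep)
        # start_mark.line is 0-based; the marker key is stripped out
        # later, when the generator emits ".. LINENO" comments
        mapping[LINE_STR] = node.start_mark.line
        return mapping

spec = """\
name: example
operations:
  - name: get
    doc: fetch something
"""

data = yaml.load(spec, Loader=NumberedSafeLoader)
print(data[LINE_STR])                   # 0: top-level mapping
print(data['operations'][0][LINE_STR])  # 2: the "- name: get" entry
```

Every mapping in the parsed tree then carries the line where it
started, which the ReST generator can turn into line markers.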

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 tools/net/ynl/pyynl/lib/doc_generator.py | 34 ++++++++++++++++++++++--
 1 file changed, 32 insertions(+), 2 deletions(-)

diff --git a/tools/net/ynl/pyynl/lib/doc_generator.py b/tools/net/ynl/pyynl/lib/doc_generator.py
index 866551726723..a9d8ab6f2639 100644
--- a/tools/net/ynl/pyynl/lib/doc_generator.py
+++ b/tools/net/ynl/pyynl/lib/doc_generator.py
@@ -20,6 +20,16 @@
 from typing import Any, Dict, List
 import yaml
 
+LINE_STR = '__lineno__'
+
+class NumberedSafeLoader(yaml.SafeLoader):
+    """Override the SafeLoader class to add line number to parsed data"""
+
+    def construct_mapping(self, node):
+        mapping = super().construct_mapping(node)
+        mapping[LINE_STR] = node.start_mark.line
+
+        return mapping
 
 class RstFormatters:
     """RST Formatters"""
@@ -127,6 +137,11 @@ class RstFormatters:
         """Return a formatted label"""
         return f".. _{title}:\n\n"
 
+    @staticmethod
+    def rst_lineno(lineno: int) -> str:
+        """Return a lineno comment"""
+        return f".. LINENO {lineno}\n"
+
 class YnlDocGenerator:
     """YAML Netlink specs Parser"""
 
@@ -144,6 +159,9 @@ class YnlDocGenerator:
         """Parse 'do' section and return a formatted string"""
         lines = []
         for key in do_dict.keys():
+            if key == LINE_STR:
+                lines.append(self.fmt.rst_lineno(do_dict[key]))
+                continue
             lines.append(self.fmt.rst_paragraph(self.fmt.bold(key), level + 1))
             if key in ['request', 'reply']:
                 lines.append(self.parse_do_attributes(do_dict[key], level + 1) + "\n")
@@ -174,6 +192,10 @@ class YnlDocGenerator:
             lines.append(self.fmt.rst_paragraph(operation["doc"]) + "\n")
 
             for key in operation.keys():
+                if key == LINE_STR:
+                    lines.append(self.fmt.rst_lineno(operation[key]))
+                    continue
+
                 if key in preprocessed:
                     # Skip the special fields
                     continue
@@ -233,6 +255,9 @@ class YnlDocGenerator:
         for definition in defs:
             lines.append(self.fmt.rst_section(namespace, 'definition', definition["name"]))
             for k in definition.keys():
+                if k == LINE_STR:
+                    lines.append(self.fmt.rst_lineno(definition[k]))
+                    continue
                 if k in preprocessed + ignored:
                     continue
                 lines.append(self.fmt.rst_fields(k, self.fmt.sanitize(definition[k]), 0))
@@ -268,6 +293,9 @@ class YnlDocGenerator:
                 lines.append(self.fmt.rst_subsubsection(attr_line))
 
                 for k in attr.keys():
+                    if k == LINE_STR:
+                        lines.append(self.fmt.rst_lineno(attr[k]))
+                        continue
                     if k in preprocessed + ignored:
                         continue
                     if k in linkable:
@@ -306,6 +334,8 @@ class YnlDocGenerator:
         lines = []
 
         # Main header
+        lineno = obj.get('__lineno__', 0)
+        lines.append(self.fmt.rst_lineno(lineno))
 
         family = obj['name']
 
@@ -354,7 +384,7 @@ class YnlDocGenerator:
     def parse_yaml_file(self, filename: str) -> str:
         """Transform the YAML specified by filename into an RST-formatted string"""
         with open(filename, "r", encoding="utf-8") as spec_file:
-            yaml_data = yaml.safe_load(spec_file)
-            content = self.parse_yaml(yaml_data)
+            numbered_yaml = yaml.load(spec_file, Loader=NumberedSafeLoader)
+            content = self.parse_yaml(numbered_yaml)
 
         return content
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 26+ messages in thread

* [PATCH v7 14/17] docs: parser_yaml.py: add support for line numbers from the parser
  2025-06-19  6:48 [PATCH v7 00/17] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (12 preceding siblings ...)
  2025-06-19  6:49 ` [PATCH v7 13/17] tools: netlink_yml_parser.py: add line numbers to parsed data Mauro Carvalho Chehab
@ 2025-06-19  6:49 ` Mauro Carvalho Chehab
  2025-06-19  6:49 ` [PATCH v7 15/17] docs: sphinx: add a file with the requirements for lowest version Mauro Carvalho Chehab
                   ` (3 subsequent siblings)
  17 siblings, 0 replies; 26+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-19  6:49 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Instead of printing line numbers from the temporary converted
ReST file, get them from the original source.
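Stripped of the Sphinx plumbing, the consuming loop can be
sketched like this (a standalone approximation of the logic
added to do_parse(); the sample input is made up):

```python
import re

re_lineno = re.compile(r"\.\. LINENO ([0-9]+)$")

def map_lines(rst):
    """Pair each ReST line with the YAML line it came from,
    consuming the ".. LINENO n" markers emitted by the generator."""
    result = []
    offset = 0
    for line in rst.split('\n'):
        match = re_lineno.match(line)
        if match:
            # Remember the original line number, drop the marker itself
            offset = int(match.group(1))
            continue
        result.append((offset, line))
    return result

sample = ".. LINENO 41\nOperation: get\n.. LINENO 57\nOperation: set"
print(map_lines(sample))
# [(41, 'Operation: get'), (57, 'Operation: set')]
```

Sphinx then reports errors against the recorded offsets, i.e.
against the original YAML spec rather than the generated ReST.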

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 Documentation/sphinx/parser_yaml.py      | 12 ++++++++++--
 tools/net/ynl/pyynl/lib/doc_generator.py | 16 ++++++++++++----
 2 files changed, 22 insertions(+), 6 deletions(-)

diff --git a/Documentation/sphinx/parser_yaml.py b/Documentation/sphinx/parser_yaml.py
index 2b2af239a1c2..5360fcfd4fde 100755
--- a/Documentation/sphinx/parser_yaml.py
+++ b/Documentation/sphinx/parser_yaml.py
@@ -29,6 +29,8 @@ class YamlParser(Parser):
 
     netlink_parser = YnlDocGenerator()
 
+    re_lineno = re.compile(r"\.\. LINENO ([0-9]+)$")
+
     def do_parse(self, inputstring, document, msg):
         """Parse YAML and generate a document tree."""
 
@@ -38,8 +40,14 @@ class YamlParser(Parser):
 
         try:
             # Parse message with RSTParser
-            for i, line in enumerate(msg.split('\n')):
-                result.append(line, document.current_source, i)
+            lineoffset = 0
+            for line in msg.split('\n'):
+                match = self.re_lineno.match(line)
+                if match:
+                    lineoffset = int(match.group(1))
+                    continue
+
+                result.append(line, document.current_source, lineoffset)
 
             rst_parser = RSTParser()
             rst_parser.parse('\n'.join(result), document)
diff --git a/tools/net/ynl/pyynl/lib/doc_generator.py b/tools/net/ynl/pyynl/lib/doc_generator.py
index a9d8ab6f2639..7f4f98983cdf 100644
--- a/tools/net/ynl/pyynl/lib/doc_generator.py
+++ b/tools/net/ynl/pyynl/lib/doc_generator.py
@@ -158,9 +158,11 @@ class YnlDocGenerator:
     def parse_do(self, do_dict: Dict[str, Any], level: int = 0) -> str:
         """Parse 'do' section and return a formatted string"""
         lines = []
+        if LINE_STR in do_dict:
+            lines.append(self.fmt.rst_lineno(do_dict[LINE_STR]))
+
         for key in do_dict.keys():
             if key == LINE_STR:
-                lines.append(self.fmt.rst_lineno(do_dict[key]))
                 continue
             lines.append(self.fmt.rst_paragraph(self.fmt.bold(key), level + 1))
             if key in ['request', 'reply']:
@@ -187,13 +189,15 @@ class YnlDocGenerator:
         lines = []
 
         for operation in operations:
+            if LINE_STR in operation:
+                lines.append(self.fmt.rst_lineno(operation[LINE_STR]))
+
             lines.append(self.fmt.rst_section(namespace, 'operation',
                                               operation["name"]))
             lines.append(self.fmt.rst_paragraph(operation["doc"]) + "\n")
 
             for key in operation.keys():
                 if key == LINE_STR:
-                    lines.append(self.fmt.rst_lineno(operation[key]))
                     continue
 
                 if key in preprocessed:
@@ -253,10 +257,12 @@ class YnlDocGenerator:
         lines = []
 
         for definition in defs:
+            if LINE_STR in definition:
+                lines.append(self.fmt.rst_lineno(definition[LINE_STR]))
+
             lines.append(self.fmt.rst_section(namespace, 'definition', definition["name"]))
             for k in definition.keys():
                 if k == LINE_STR:
-                    lines.append(self.fmt.rst_lineno(definition[k]))
                     continue
                 if k in preprocessed + ignored:
                     continue
@@ -284,6 +290,9 @@ class YnlDocGenerator:
             lines.append(self.fmt.rst_section(namespace, 'attribute-set',
                                               entry["name"]))
             for attr in entry["attributes"]:
+                if LINE_STR in attr:
+                    lines.append(self.fmt.rst_lineno(attr[LINE_STR]))
+
                 type_ = attr.get("type")
                 attr_line = attr["name"]
                 if type_:
@@ -294,7 +303,6 @@ class YnlDocGenerator:
 
                 for k in attr.keys():
                     if k == LINE_STR:
-                        lines.append(self.fmt.rst_lineno(attr[k]))
                         continue
                     if k in preprocessed + ignored:
                         continue
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 26+ messages in thread

* [PATCH v7 15/17] docs: sphinx: add a file with the requirements for lowest version
  2025-06-19  6:48 [PATCH v7 00/17] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (13 preceding siblings ...)
  2025-06-19  6:49 ` [PATCH v7 14/17] docs: parser_yaml.py: add support for line numbers from the parser Mauro Carvalho Chehab
@ 2025-06-19  6:49 ` Mauro Carvalho Chehab
  2025-06-19  6:49 ` [PATCH v7 16/17] docs: conf.py: several coding style fixes Mauro Carvalho Chehab
                   ` (2 subsequent siblings)
  17 siblings, 0 replies; 26+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-19  6:49 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Randy Dunlap, joel, linux-kernel-mentees, linux-kernel, lkmm,
	netdev, peterz, stern

These days, it is hard to set up a virtual environment that
builds the docs with Sphinx 3.4.3, as it cannot be installed
with Python 3.13 and above. Add a requirements file pinning
the minimal versions, which can be used like this:

	/usr/bin/python3.9 -m venv sphinx_3.4.3
	. sphinx_3.4.3/bin/activate
	pip install -r Documentation/sphinx/min_requirements.txt

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 Documentation/doc-guide/sphinx.rst        | 15 +++++++++++++++
 Documentation/sphinx/min_requirements.txt |  8 ++++++++
 2 files changed, 23 insertions(+)
 create mode 100644 Documentation/sphinx/min_requirements.txt

diff --git a/Documentation/doc-guide/sphinx.rst b/Documentation/doc-guide/sphinx.rst
index 5a91df105141..13943eb532ac 100644
--- a/Documentation/doc-guide/sphinx.rst
+++ b/Documentation/doc-guide/sphinx.rst
@@ -131,6 +131,21 @@ It supports two optional parameters:
 ``--no-virtualenv``
 	Use OS packaging for Sphinx instead of Python virtual environment.
 
+Installing Sphinx Minimal Version
+---------------------------------
+
+When changing the Sphinx build system, it is important to ensure
+that the minimal supported version still works. Nowadays, that is
+becoming harder on modern distributions, as Sphinx 3.4.3 cannot be
+installed with Python 3.13 and above.
+
+The recommended way is to use the lowest supported Python version,
+as defined at Documentation/process/changes.rst, creating
+a venv with it and installing the minimal requirements with::
+
+	/usr/bin/python3.9 -m venv sphinx_min
+	. sphinx_min/bin/activate
+	pip install -r Documentation/sphinx/min_requirements.txt
 
 Sphinx Build
 ============
diff --git a/Documentation/sphinx/min_requirements.txt b/Documentation/sphinx/min_requirements.txt
new file mode 100644
index 000000000000..89ea36d5798f
--- /dev/null
+++ b/Documentation/sphinx/min_requirements.txt
@@ -0,0 +1,8 @@
+Sphinx==3.4.3
+jinja2<3.1
+docutils<0.18
+sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-serializinghtml==1.1.5
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 26+ messages in thread

* [PATCH v7 16/17] docs: conf.py: several coding style fixes
  2025-06-19  6:48 [PATCH v7 00/17] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (14 preceding siblings ...)
  2025-06-19  6:49 ` [PATCH v7 15/17] docs: sphinx: add a file with the requirements for lowest version Mauro Carvalho Chehab
@ 2025-06-19  6:49 ` Mauro Carvalho Chehab
  2025-06-19  6:49 ` [PATCH v7 17/17] docs: conf.py: Check Sphinx and docutils version Mauro Carvalho Chehab
  2025-06-19  8:29 ` [PATCH v7 00/17] Don't generate netlink .rst files inside $(srctree) Donald Hunter
  17 siblings, 0 replies; 26+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-19  6:49 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	joel, linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz,
	stern

conf.py is missing an SPDX header and doesn't really follow
a proper Python coding style. It also has an obsolete
commented-out LaTeX workaround that no longer works.

Clean it up a little bit with some help from automated lints
and manual adjustments.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 Documentation/conf.py | 367 +++++++++++++++++++++---------------------
 1 file changed, 181 insertions(+), 186 deletions(-)

diff --git a/Documentation/conf.py b/Documentation/conf.py
index 6af61e26cec5..5eddf5885f77 100644
--- a/Documentation/conf.py
+++ b/Documentation/conf.py
@@ -1,24 +1,28 @@
-# -*- coding: utf-8 -*-
-#
-# The Linux Kernel documentation build configuration file, created by
-# sphinx-quickstart on Fri Feb 12 13:51:46 2016.
-#
-# This file is execfile()d with the current directory set to its
-# containing dir.
-#
-# Note that not all possible configuration values are present in this
-# autogenerated file.
-#
-# All configuration values have a default; values that are commented out
-# serve to show the default.
+# SPDX-License-Identifier: GPL-2.0-only
+# pylint: disable=C0103,C0209
+
+"""
+The Linux Kernel documentation build configuration file.
+"""
 
-import sys
 import os
-import sphinx
 import shutil
+import sys
+
+import sphinx
+
+# If extensions (or modules to document with autodoc) are in another directory,
+# add these directories to sys.path here. If the directory is relative to the
+# documentation root, use os.path.abspath to make it absolute, like shown here.
+sys.path.insert(0, os.path.abspath("sphinx"))
+
+from load_config import loadConfig               # pylint: disable=C0413,E0401
+
+# Minimal supported version
+needs_sphinx = "3.4.3"
 
 # Get Sphinx version
-major, minor, patch = sphinx.version_info[:3]
+major, minor, patch = sphinx.version_info[:3]          # pylint: disable=I1101
 
 # Include_patterns were added on Sphinx 5.1
 if (major < 5) or (major == 5 and minor < 1):
@@ -26,32 +30,32 @@ if (major < 5) or (major == 5 and minor < 1):
 else:
     has_include_patterns = True
     # Include patterns that don't contain directory names, in glob format
-    include_patterns = ['**.rst']
+    include_patterns = ["**.rst"]
 
 # Location of Documentation/ directory
-doctree = os.path.abspath('.')
+doctree = os.path.abspath(".")
 
 # Exclude of patterns that don't contain directory names, in glob format.
 exclude_patterns = []
 
 # List of patterns that contain directory names in glob format.
 dyn_include_patterns = []
-dyn_exclude_patterns = ['output']
+dyn_exclude_patterns = ["output"]
 
 # Currently, only netlink/specs has a parser for yaml.
 # Prefer using include patterns if available, as it is faster
 if has_include_patterns:
-    dyn_include_patterns.append('netlink/specs/*.yaml')
+    dyn_include_patterns.append("netlink/specs/*.yaml")
 else:
-    dyn_exclude_patterns.append('netlink/*.yaml')
-    dyn_exclude_patterns.append('devicetree/bindings/**.yaml')
-    dyn_exclude_patterns.append('core-api/kho/bindings/**.yaml')
+    dyn_exclude_patterns.append("netlink/*.yaml")
+    dyn_exclude_patterns.append("devicetree/bindings/**.yaml")
+    dyn_exclude_patterns.append("core-api/kho/bindings/**.yaml")
 
 # Properly handle include/exclude patterns
 # ----------------------------------------
 
+
 def update_patterns(app):
-
     """
     On Sphinx, all directories are relative to what it is passed as
     SOURCEDIR parameter for sphinx-build. Due to that, all patterns
@@ -62,15 +66,12 @@ def update_patterns(app):
     exclude relative patterns that start with "../".
     """
 
-    sourcedir = app.srcdir  # full path to the source directory
-    builddir = os.environ.get("BUILDDIR")
-
     # setup include_patterns dynamically
     if has_include_patterns:
         for p in dyn_include_patterns:
             full = os.path.join(doctree, p)
 
-            rel_path = os.path.relpath(full, start = app.srcdir)
+            rel_path = os.path.relpath(full, start=app.srcdir)
             if rel_path.startswith("../"):
                 continue
 
@@ -80,15 +81,17 @@ def update_patterns(app):
     for p in dyn_exclude_patterns:
         full = os.path.join(doctree, p)
 
-        rel_path = os.path.relpath(full, start = app.srcdir)
+        rel_path = os.path.relpath(full, start=app.srcdir)
         if rel_path.startswith("../"):
             continue
 
         app.config.exclude_patterns.append(rel_path)
 
+
 # helper
 # ------
 
+
 def have_command(cmd):
     """Search ``cmd`` in the ``PATH`` environment.
 
@@ -97,24 +100,24 @@ def have_command(cmd):
     """
     return shutil.which(cmd) is not None
 
-# If extensions (or modules to document with autodoc) are in another directory,
-# add these directories to sys.path here. If the directory is relative to the
-# documentation root, use os.path.abspath to make it absolute, like shown here.
-sys.path.insert(0, os.path.abspath('sphinx'))
-from load_config import loadConfig
 
 # -- General configuration ------------------------------------------------
 
-# If your documentation needs a minimal Sphinx version, state it here.
-needs_sphinx = '3.4.3'
-
-# Add any Sphinx extension module names here, as strings. They can be
-# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
-# ones.
-extensions = ['kerneldoc', 'rstFlatTable', 'kernel_include',
-              'kfigure', 'sphinx.ext.ifconfig', 'automarkup',
-              'maintainers_include', 'sphinx.ext.autosectionlabel',
-              'kernel_abi', 'kernel_feat', 'translations', 'parser_yaml']
+# Add any Sphinx extensions in alphabetic order
+extensions = [
+    "automarkup",
+    "kernel_abi",
+    "kerneldoc",
+    "kernel_feat",
+    "kernel_include",
+    "kfigure",
+    "maintainers_include",
+    "parser_yaml",
+    "rstFlatTable",
+    "sphinx.ext.autosectionlabel",
+    "sphinx.ext.ifconfig",
+    "translations",
+]
 
 # Since Sphinx version 3, the C function parser is more pedantic with regards
 # to type checking. Due to that, having macros at c:function cause problems.
@@ -189,45 +192,45 @@ autosectionlabel_maxdepth = 2
 # Load math renderer:
 # For html builder, load imgmath only when its dependencies are met.
 # mathjax is the default math renderer since Sphinx 1.8.
-have_latex =  have_command('latex')
-have_dvipng = have_command('dvipng')
+have_latex = have_command("latex")
+have_dvipng = have_command("dvipng")
 load_imgmath = have_latex and have_dvipng
 
 # Respect SPHINX_IMGMATH (for html docs only)
-if 'SPHINX_IMGMATH' in os.environ:
-    env_sphinx_imgmath = os.environ['SPHINX_IMGMATH']
-    if 'yes' in env_sphinx_imgmath:
+if "SPHINX_IMGMATH" in os.environ:
+    env_sphinx_imgmath = os.environ["SPHINX_IMGMATH"]
+    if "yes" in env_sphinx_imgmath:
         load_imgmath = True
-    elif 'no' in env_sphinx_imgmath:
+    elif "no" in env_sphinx_imgmath:
         load_imgmath = False
     else:
         sys.stderr.write("Unknown env SPHINX_IMGMATH=%s ignored.\n" % env_sphinx_imgmath)
 
 if load_imgmath:
     extensions.append("sphinx.ext.imgmath")
-    math_renderer = 'imgmath'
+    math_renderer = "imgmath"
 else:
-    math_renderer = 'mathjax'
+    math_renderer = "mathjax"
 
 # Add any paths that contain templates here, relative to this directory.
-templates_path = ['sphinx/templates']
+templates_path = ["sphinx/templates"]
 
 # The suffixes of source filenames that will be automatically parsed
 source_suffix = {
-        '.rst': 'restructuredtext',
-        '.yaml': 'yaml',
+    ".rst": "restructuredtext",
+    ".yaml": "yaml",
 }
 
 # The encoding of source files.
-#source_encoding = 'utf-8-sig'
+# source_encoding = 'utf-8-sig'
 
 # The master toctree document.
-master_doc = 'index'
+master_doc = "index"
 
 # General information about the project.
-project = 'The Linux Kernel'
-copyright = 'The kernel development community'
-author = 'The kernel development community'
+project = "The Linux Kernel"
+copyright = "The kernel development community"         # pylint: disable=W0622
+author = "The kernel development community"
 
 # The version info for the project you're documenting, acts as replacement for
 # |version| and |release|, also used in various other places throughout the
@@ -242,82 +245,86 @@ author = 'The kernel development community'
 try:
     makefile_version = None
     makefile_patchlevel = None
-    for line in open('../Makefile'):
-        key, val = [x.strip() for x in line.split('=', 2)]
-        if key == 'VERSION':
-            makefile_version = val
-        elif key == 'PATCHLEVEL':
-            makefile_patchlevel = val
-        if makefile_version and makefile_patchlevel:
-            break
-except:
+    with open("../Makefile", encoding="utf-8") as fp:
+        for line in fp:
+            key, val = [x.strip() for x in line.split("=", 2)]
+            if key == "VERSION":
+                makefile_version = val
+            elif key == "PATCHLEVEL":
+                makefile_patchlevel = val
+            if makefile_version and makefile_patchlevel:
+                break
+except Exception:
     pass
 finally:
     if makefile_version and makefile_patchlevel:
-        version = release = makefile_version + '.' + makefile_patchlevel
+        version = release = makefile_version + "." + makefile_patchlevel
     else:
         version = release = "unknown version"
 
-#
-# HACK: there seems to be no easy way for us to get at the version and
-# release information passed in from the makefile...so go pawing through the
-# command-line options and find it for ourselves.
-#
+
 def get_cline_version():
-    c_version = c_release = ''
+    """
+    HACK: There seems to be no easy way for us to get at the version and
+    release information passed in from the makefile...so go pawing through the
+    command-line options and find it for ourselves.
+    """
+
+    c_version = c_release = ""
     for arg in sys.argv:
-        if arg.startswith('version='):
+        if arg.startswith("version="):
             c_version = arg[8:]
-        elif arg.startswith('release='):
+        elif arg.startswith("release="):
             c_release = arg[8:]
     if c_version:
         if c_release:
-            return c_version + '-' + c_release
+            return c_version + "-" + c_release
         return c_version
-    return version # Whatever we came up with before
+    return version  # Whatever we came up with before
+
 
 # The language for content autogenerated by Sphinx. Refer to documentation
 # for a list of supported languages.
 #
 # This is also used if you do content translation via gettext catalogs.
 # Usually you set "language" from the command line for these cases.
-language = 'en'
+language = "en"
 
 # There are two options for replacing |today|: either, you set today to some
 # non-false value, then it is used:
-#today = ''
+# today = ''
 # Else, today_fmt is used as the format for a strftime call.
-#today_fmt = '%B %d, %Y'
+# today_fmt = '%B %d, %Y'
 
 # The reST default role (used for this markup: `text`) to use for all
 # documents.
-#default_role = None
+# default_role = None
 
 # If true, '()' will be appended to :func: etc. cross-reference text.
-#add_function_parentheses = True
+# add_function_parentheses = True
 
 # If true, the current module name will be prepended to all description
 # unit titles (such as .. function::).
-#add_module_names = True
+# add_module_names = True
 
 # If true, sectionauthor and moduleauthor directives will be shown in the
 # output. They are ignored by default.
-#show_authors = False
+# show_authors = False
 
 # The name of the Pygments (syntax highlighting) style to use.
-pygments_style = 'sphinx'
+pygments_style = "sphinx"
 
 # A list of ignored prefixes for module index sorting.
-#modindex_common_prefix = []
+# modindex_common_prefix = []
 
 # If true, keep warnings as "system message" paragraphs in the built documents.
-#keep_warnings = False
+# keep_warnings = False
 
 # If true, `todo` and `todoList` produce output, else they produce nothing.
 todo_include_todos = False
 
-primary_domain = 'c'
-highlight_language = 'none'
+primary_domain = "c"
+highlight_language = "none"
 
 # -- Options for HTML output ----------------------------------------------
 
@@ -325,43 +332,45 @@ highlight_language = 'none'
 # a list of builtin themes.
 
 # Default theme
-html_theme = 'alabaster'
+html_theme = "alabaster"
 html_css_files = []
 
 if "DOCS_THEME" in os.environ:
     html_theme = os.environ["DOCS_THEME"]
 
-if html_theme == 'sphinx_rtd_theme' or html_theme == 'sphinx_rtd_dark_mode':
+if html_theme in ["sphinx_rtd_theme", "sphinx_rtd_dark_mode"]:
     # Read the Docs theme
     try:
         import sphinx_rtd_theme
+
         html_theme_path = [sphinx_rtd_theme.get_html_theme_path()]
 
         # Add any paths that contain custom static files (such as style sheets) here,
         # relative to this directory. They are copied after the builtin static files,
         # so a file named "default.css" will overwrite the builtin "default.css".
         html_css_files = [
-            'theme_overrides.css',
+            "theme_overrides.css",
         ]
 
         # Read the Docs dark mode override theme
-        if html_theme == 'sphinx_rtd_dark_mode':
+        if html_theme == "sphinx_rtd_dark_mode":
             try:
-                import sphinx_rtd_dark_mode
-                extensions.append('sphinx_rtd_dark_mode')
+                import sphinx_rtd_dark_mode            # pylint: disable=W0611
+
+                extensions.append("sphinx_rtd_dark_mode")
             except ImportError:
-                html_theme == 'sphinx_rtd_theme'
+                html_theme = "sphinx_rtd_theme"
 
-        if html_theme == 'sphinx_rtd_theme':
-                # Add color-specific RTD normal mode
-                html_css_files.append('theme_rtd_colors.css')
+        if html_theme == "sphinx_rtd_theme":
+            # Add color-specific RTD normal mode
+            html_css_files.append("theme_rtd_colors.css")
 
         html_theme_options = {
-            'navigation_depth': -1,
+            "navigation_depth": -1,
         }
 
     except ImportError:
-        html_theme = 'alabaster'
+        html_theme = "alabaster"
 
 if "DOCS_CSS" in os.environ:
     css = os.environ["DOCS_CSS"].split(" ")
@@ -369,14 +378,14 @@ if "DOCS_CSS" in os.environ:
     for l in css:
         html_css_files.append(l)
 
-if  html_theme == 'alabaster':
+if html_theme == "alabaster":
     html_theme_options = {
-        'description': get_cline_version(),
-        'page_width': '65em',
-        'sidebar_width': '15em',
-        'fixed_sidebar': 'true',
-        'font_size': 'inherit',
-        'font_family': 'serif',
+        "description": get_cline_version(),
+        "page_width": "65em",
+        "sidebar_width": "15em",
+        "fixed_sidebar": "true",
+        "font_size": "inherit",
+        "font_family": "serif",
     }
 
 sys.stderr.write("Using %s theme\n" % html_theme)
@@ -384,104 +393,79 @@ sys.stderr.write("Using %s theme\n" % html_theme)
 # Add any paths that contain custom static files (such as style sheets) here,
 # relative to this directory. They are copied after the builtin static files,
 # so a file named "default.css" will overwrite the builtin "default.css".
-html_static_path = ['sphinx-static']
+html_static_path = ["sphinx-static"]
 
 # If true, Docutils "smart quotes" will be used to convert quotes and dashes
 # to typographically correct entities.  However, conversion of "--" to "—"
 # is not always what we want, so enable only quotes.
-smartquotes_action = 'q'
+smartquotes_action = "q"
 
 # Custom sidebar templates, maps document names to template names.
 # Note that the RTD theme ignores this
-html_sidebars = { '**': ['searchbox.html', 'kernel-toc.html', 'sourcelink.html']}
+html_sidebars = {"**": ["searchbox.html",
+                        "kernel-toc.html",
+                        "sourcelink.html"]}
 
 # about.html is available for alabaster theme. Add it at the front.
-if html_theme == 'alabaster':
-    html_sidebars['**'].insert(0, 'about.html')
+if html_theme == "alabaster":
+    html_sidebars["**"].insert(0, "about.html")
 
 # The name of an image file (relative to this directory) to place at the top
 # of the sidebar.
-html_logo = 'images/logo.svg'
+html_logo = "images/logo.svg"
 
 # Output file base name for HTML help builder.
-htmlhelp_basename = 'TheLinuxKerneldoc'
+htmlhelp_basename = "TheLinuxKerneldoc"
 
 # -- Options for LaTeX output ---------------------------------------------
 
 latex_elements = {
     # The paper size ('letterpaper' or 'a4paper').
-    'papersize': 'a4paper',
-
+    "papersize": "a4paper",
     # The font size ('10pt', '11pt' or '12pt').
-    'pointsize': '11pt',
-
+    "pointsize": "11pt",
     # Latex figure (float) alignment
-    #'figure_align': 'htbp',
-
+    # 'figure_align': 'htbp',
     # Don't mangle with UTF-8 chars
-    'inputenc': '',
-    'utf8extra': '',
-
+    "inputenc": "",
+    "utf8extra": "",
     # Set document margins
-    'sphinxsetup': '''
+    "sphinxsetup": """
         hmargin=0.5in, vmargin=1in,
         parsedliteralwraps=true,
         verbatimhintsturnover=false,
-    ''',
-
+    """,
     #
     # Some of our authors are fond of deep nesting; tell latex to
     # cope.
     #
-    'maxlistdepth': '10',
-
+    "maxlistdepth": "10",
     # For CJK One-half spacing, need to be in front of hyperref
-    'extrapackages': r'\usepackage{setspace}',
-
+    "extrapackages": r"\usepackage{setspace}",
     # Additional stuff for the LaTeX preamble.
-    'preamble': '''
+    "preamble": """
         % Use some font with UTF-8 support with XeLaTeX
         \\usepackage{fontspec}
         \\setsansfont{DejaVu Sans}
         \\setromanfont{DejaVu Serif}
         \\setmonofont{DejaVu Sans Mono}
-    ''',
+    """,
 }
 
 # Load kerneldoc specific LaTeX settings
-latex_elements['preamble'] += '''
+latex_elements["preamble"] += """
         % Load kerneldoc specific LaTeX settings
-	\\input{kerneldoc-preamble.sty}
-'''
-
-# With Sphinx 1.6, it is possible to change the Bg color directly
-# by using:
-#	\definecolor{sphinxnoteBgColor}{RGB}{204,255,255}
-#	\definecolor{sphinxwarningBgColor}{RGB}{255,204,204}
-#	\definecolor{sphinxattentionBgColor}{RGB}{255,255,204}
-#	\definecolor{sphinximportantBgColor}{RGB}{192,255,204}
-#
-# However, it require to use sphinx heavy box with:
-#
-#	\renewenvironment{sphinxlightbox} {%
-#		\\begin{sphinxheavybox}
-#	}
-#		\\end{sphinxheavybox}
-#	}
-#
-# Unfortunately, the implementation is buggy: if a note is inside a
-# table, it isn't displayed well. So, for now, let's use boring
-# black and white notes.
+        \\input{kerneldoc-preamble.sty}
+"""
 
 # Grouping the document tree into LaTeX files. List of tuples
 # (source start file, target name, title,
 #  author, documentclass [howto, manual, or own class]).
 # Sorted in alphabetical order
-latex_documents = [
-]
+latex_documents = []
 
 # Add all other index files from Documentation/ subdirectories
-for fn in os.listdir('.'):
+for fn in os.listdir("."):
     doc = os.path.join(fn, "index")
     if os.path.exists(doc + ".rst"):
         has = False
@@ -490,34 +474,39 @@ for fn in os.listdir('.'):
                 has = True
                 break
         if not has:
-            latex_documents.append((doc, fn + '.tex',
-                                    'Linux %s Documentation' % fn.capitalize(),
-                                    'The kernel development community',
-                                    'manual'))
+            latex_documents.append(
+                (
+                    doc,
+                    fn + ".tex",
+                    "Linux %s Documentation" % fn.capitalize(),
+                    "The kernel development community",
+                    "manual",
+                )
+            )
 
 # The name of an image file (relative to this directory) to place at the top of
 # the title page.
-#latex_logo = None
+# latex_logo = None
 
 # For "manual" documents, if this is true, then toplevel headings are parts,
 # not chapters.
-#latex_use_parts = False
+# latex_use_parts = False
 
 # If true, show page references after internal links.
-#latex_show_pagerefs = False
+# latex_show_pagerefs = False
 
 # If true, show URL addresses after external links.
-#latex_show_urls = False
+# latex_show_urls = False
 
 # Documents to append as an appendix to all manuals.
-#latex_appendices = []
+# latex_appendices = []
 
 # If false, no module index is generated.
-#latex_domain_indices = True
+# latex_domain_indices = True
 
 # Additional LaTeX stuff to be copied to build directory
 latex_additional_files = [
-    'sphinx/kerneldoc-preamble.sty',
+    "sphinx/kerneldoc-preamble.sty",
 ]
 
 
@@ -526,12 +515,11 @@ latex_additional_files = [
 # One entry per manual page. List of tuples
 # (source start file, name, description, authors, manual section).
 man_pages = [
-    (master_doc, 'thelinuxkernel', 'The Linux Kernel Documentation',
-     [author], 1)
+    (master_doc, "thelinuxkernel", "The Linux Kernel Documentation", [author], 1)
 ]
 
 # If true, show URL addresses after external links.
-#man_show_urls = False
+# man_show_urls = False
 
 
 # -- Options for Texinfo output -------------------------------------------
@@ -539,11 +527,15 @@ man_pages = [
 # Grouping the document tree into Texinfo files. List of tuples
 # (source start file, target name, title, author,
 #  dir menu entry, description, category)
-texinfo_documents = [
-    (master_doc, 'TheLinuxKernel', 'The Linux Kernel Documentation',
-     author, 'TheLinuxKernel', 'One line description of project.',
-     'Miscellaneous'),
-]
+texinfo_documents = [(
+        master_doc,
+        "TheLinuxKernel",
+        "The Linux Kernel Documentation",
+        author,
+        "TheLinuxKernel",
+        "One line description of project.",
+        "Miscellaneous",
+    ),]
 
 # -- Options for Epub output ----------------------------------------------
 
@@ -554,9 +546,9 @@ epub_publisher = author
 epub_copyright = copyright
 
 # A list of files that should not be packed into the epub file.
-epub_exclude_files = ['search.html']
+epub_exclude_files = ["search.html"]
 
-#=======
+# =======
 # rst2pdf
 #
 # Grouping the document tree into PDF files. List of tuples
@@ -568,14 +560,14 @@ epub_exclude_files = ['search.html']
 # multiple PDF files here actually tries to get the cross-referencing right
 # *between* PDF files.
 pdf_documents = [
-    ('kernel-documentation', u'Kernel', u'Kernel', u'J. Random Bozo'),
+    ("kernel-documentation", "Kernel", "Kernel", "J. Random Bozo"),
 ]
 
 # kernel-doc extension configuration for running Sphinx directly (e.g. by Read
 # the Docs). In a normal build, these are supplied from the Makefile via command
 # line arguments.
-kerneldoc_bin = '../scripts/kernel-doc.py'
-kerneldoc_srctree = '..'
+kerneldoc_bin = "../scripts/kernel-doc.py"
+kerneldoc_srctree = ".."
 
 # ------------------------------------------------------------------------------
 # Since loadConfig overwrites settings from the global namespace, it has to be
@@ -583,5 +575,8 @@ kerneldoc_srctree = '..'
 # ------------------------------------------------------------------------------
 loadConfig(globals())
 
+
 def setup(app):
-	app.connect('builder-inited', update_patterns)
+    """Patterns need to be updated at init time on older Sphinx versions"""
+
+    app.connect("builder-inited", update_patterns)
-- 
2.49.0



* [PATCH v7 17/17] docs: conf.py: Check Sphinx and docutils version
  2025-06-19  6:48 [PATCH v7 00/17] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (15 preceding siblings ...)
  2025-06-19  6:49 ` [PATCH v7 16/17] docs: conf.py: several coding style fixes Mauro Carvalho Chehab
@ 2025-06-19  6:49 ` Mauro Carvalho Chehab
  2025-06-19  8:29 ` [PATCH v7 00/17] Don't generate netlink .rst files inside $(srctree) Donald Hunter
  17 siblings, 0 replies; 26+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-19  6:49 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	joel, linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz,
	stern

As reported by Akira, there are incompatibility issues between
Sphinx and docutils.

I manually checked that, with docutils older than 0.17.1, YAML
generation doesn't work properly. Akira checked that 0.19 is
problematic too.

After checking the Sphinx release notes, it seems that the
versions that are supposed to work well together are:
	========  ============  ============
	Sphinx    Min Docutils  Max Docutils
	Version   Version       Version
	--------  ------------  ------------
	< 4.0.0	  0.17.1        0.17.1
	< 6.0.0	  0.17.1        0.18.1
	< 7.0.0   0.18.0        0.18.1
	>= 7.0.0  0.20.0        0.21.2
	========  ============  ============

Add logic inside conf.py to check the above, emitting warnings
if the docutils version doesn't match what is known to be supported.
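
The mapping in the table above can be sketched as a small standalone
helper (a simplified illustration of the approach, not the exact code
added to conf.py):

```python
def docutils_range(sphinx_ver):
    """Return the (min, max) docutils version tuples known to work
    with a given Sphinx version tuple, per the table above."""
    if sphinx_ver < (4, 0, 0):
        return (0, 17, 1), (0, 17, 1)
    if sphinx_ver < (6, 0, 0):
        return (0, 17, 1), (0, 18, 1)
    if sphinx_ver < (7, 0, 0):
        return (0, 18, 0), (0, 18, 1)
    return (0, 20, 0), (0, 21, 2)

def check(sphinx_ver, docutils_ver):
    """Return a warning string if docutils_ver falls outside the
    known-good range for the given Sphinx version, else None."""
    lo, hi = docutils_range(sphinx_ver)
    if docutils_ver < lo:
        return "docutils too old"
    if docutils_ver > hi:
        return "docutils may be too new"
    return None
```

Tuple comparison makes the range checks one-liners, which is why the
patch below compares version_info slices directly instead of parsing
version strings.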

Reported-by: Akira Yokosawa <akiyks@gmail.com>
Closes: https://lore.kernel.org/linux-doc/6fcb75ee-61db-4fb3-9c5f-2029a7fea4ee@gmail.com/
Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 Documentation/conf.py | 33 ++++++++++++++++++++++++++++++---
 1 file changed, 30 insertions(+), 3 deletions(-)

diff --git a/Documentation/conf.py b/Documentation/conf.py
index 5eddf5885f77..6047ec85add1 100644
--- a/Documentation/conf.py
+++ b/Documentation/conf.py
@@ -9,7 +9,11 @@ import os
 import shutil
 import sys
 
+import docutils
 import sphinx
+from sphinx.util import logging
+
+logger = logging.getLogger(__name__)
 
 # If extensions (or modules to document with autodoc) are in another directory,
 # add these directories to sys.path here. If the directory is relative to the
@@ -21,11 +25,34 @@ from load_config import loadConfig               # pylint: disable=C0413,E0401
 # Minimal supported version
 needs_sphinx = "3.4.3"
 
-# Get Sphinx version
-major, minor, patch = sphinx.version_info[:3]          # pylint: disable=I1101
+# Get Sphinx and docutils versions
+sphinx_ver = sphinx.version_info[:3]          # pylint: disable=I1101
+docutils_ver = docutils.__version_info__[:3]
+
+#
+if sphinx_ver < (4, 0, 0):
+    min_docutils = (0, 16, 0)
+    max_docutils = (0, 17, 1)
+elif sphinx_ver < (6, 0, 0):
+    min_docutils = (0, 17, 0)
+    max_docutils = (0, 18, 1)
+elif sphinx_ver < (7, 0, 0):
+    min_docutils = (0, 18, 0)
+    max_docutils = (0, 18, 1)
+else:
+    min_docutils = (0, 20, 0)
+    max_docutils = (0, 21, 2)
+
+sphinx_ver_str = ".".join([str(x) for x in sphinx_ver])
+docutils_ver_str = ".".join([str(x) for x in docutils_ver])
+
+if docutils_ver < min_docutils:
+    logger.warning(f"Docutils {docutils_ver_str} is too old for Sphinx {sphinx_ver_str}. Doc generation may fail")
+elif docutils_ver > max_docutils:
+    logger.warning(f"Docutils {docutils_ver_str} could be too new for Sphinx {sphinx_ver_str}. Doc generation may fail")
 
 # Include_patterns were added on Sphinx 5.1
-if (major < 5) or (major == 5 and minor < 1):
+if sphinx_ver < (5, 1, 0):
     has_include_patterns = False
 else:
     has_include_patterns = True
-- 
2.49.0



* Re: [PATCH v7 00/17] Don't generate netlink .rst files inside $(srctree)
  2025-06-19  6:48 [PATCH v7 00/17] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (16 preceding siblings ...)
  2025-06-19  6:49 ` [PATCH v7 17/17] docs: conf.py: Check Sphinx and docutils version Mauro Carvalho Chehab
@ 2025-06-19  8:29 ` Donald Hunter
  2025-06-20 10:33   ` Mauro Carvalho Chehab
  17 siblings, 1 reply; 26+ messages in thread
From: Donald Hunter @ 2025-06-19  8:29 UTC (permalink / raw)
  To: Mauro Carvalho Chehab
  Cc: Linux Doc Mailing List, Jonathan Corbet, linux-kernel,
	Akira Yokosawa, David S. Miller, Ignacio Encinas Rubio,
	Marco Elver, Shuah Khan, Eric Dumazet, Jan Stancek, Paolo Abeni,
	Ruben Wauters, joel, linux-kernel-mentees, lkmm, netdev, peterz,
	stern, Breno Leitao, Randy Dunlap

Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:
>
> v7:

ETOOFAST

https://docs.kernel.org/process/maintainer-netdev.html#resending-after-review

I didn't complete reviewing v6 yet :(

> - Added a patch to cleanup conf.py and address coding style issues;
> - Added a docutils version check logic to detect known issues when
>   building the docs with too old or too new docutils version.  The
>   actual min/max version depends on the Sphinx version.

It seems to me that patches 15-17 belong in a different series.


* Re: [PATCH v7 03/17] docs: netlink: netlink-raw.rst: use :ref: instead of :doc:
  2025-06-19  6:48 ` [PATCH v7 03/17] docs: netlink: netlink-raw.rst: use :ref: instead of :doc: Mauro Carvalho Chehab
@ 2025-06-19  8:57   ` Donald Hunter
  0 siblings, 0 replies; 26+ messages in thread
From: Donald Hunter @ 2025-06-19  8:57 UTC (permalink / raw)
  To: Mauro Carvalho Chehab
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:

> Currently, rt documents are referred with:
>
> Documentation/userspace-api/netlink/netlink-raw.rst: :doc:`rt-link<../../networking/netlink_spec/rt-link>`
> Documentation/userspace-api/netlink/netlink-raw.rst: :doc:`tc<../../networking/netlink_spec/tc>`
> Documentation/userspace-api/netlink/netlink-raw.rst: :doc:`tc<../../networking/netlink_spec/tc>`
>
> Having :doc: references with relative paths doesn't always work,
> as it may have troubles when O= is used. Also that's hard to
> maintain, and may break if we change the way rst files are
> generated from yaml. Better to use instead a reference for
> the netlink family.
>
> So, replace them by Sphinx cross-reference tag that are
> created by ynl_gen_rst.py.
>
> Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>

Reviewed-by: Donald Hunter <donald.hunter@gmail.com>


* Re: [PATCH v7 04/17] tools: ynl_gen_rst.py: Split library from command line tool
  2025-06-19  6:48 ` [PATCH v7 04/17] tools: ynl_gen_rst.py: Split library from command line tool Mauro Carvalho Chehab
@ 2025-06-19 12:01   ` Donald Hunter
  2025-06-20 10:58     ` Mauro Carvalho Chehab
  0 siblings, 1 reply; 26+ messages in thread
From: Donald Hunter @ 2025-06-19 12:01 UTC (permalink / raw)
  To: Mauro Carvalho Chehab
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:
> +
> +    def generate_main_index_rst(self, output: str, index_dir: str) -> None:
> +        """Generate the `networking_spec/index` content and write to the file"""
> +        lines = []
> +
> +        lines.append(self.fmt.rst_header())
> +        lines.append(self.fmt.rst_label("specs"))
> +        lines.append(self.fmt.rst_title("Netlink Family Specifications"))
> +        lines.append(self.fmt.rst_toctree(1))
> +
> +        index_fname = os.path.basename(output)
> +        base, ext = os.path.splitext(index_fname)
> +
> +        if not index_dir:
> +            index_dir = os.path.dirname(output)
> +
> +        logging.debug(f"Looking for {ext} files in %s", index_dir)
> +        for filename in sorted(os.listdir(index_dir)):
> +            if not filename.endswith(ext) or filename == index_fname:
> +                continue
> +            base, ext = os.path.splitext(filename)
> +            lines.append(f"   {base}\n")
> +
> +        logging.debug("Writing an index file at %s", output)
> +
> +        return "".join(lines)

Did you miss my comment on v5 asking not to move this from
ynl_gen_rst.py? It is not needed and gets removed in a later patch.

> @@ -411,7 +65,6 @@ def write_to_rstfile(content: str, filename: str) -> None:
>  
>  def generate_main_index_rst(output: str) -> None:
>      """Generate the `networking_spec/index` content and write to the file"""
> -    lines = []
>  
>      lines.append(rst_header())
>      lines.append(rst_label("specs"))
> @@ -426,7 +79,7 @@ def generate_main_index_rst(output: str) -> None:
>          lines.append(f"   {filename.replace('.rst', '')}\n")
>  
>      logging.debug("Writing an index file at %s", output)
> -    write_to_rstfile("".join(lines), output)
> +    write_to_rstfile(msg, output)

The changes leave this function broken: both lines and msg are never
defined.

^ permalink raw reply	[flat|nested] 26+ messages in thread

* Re: [PATCH v7 06/17] tools: ynl_gen_rst.py: cleanup coding style
  2025-06-19  6:48 ` [PATCH v7 06/17] tools: ynl_gen_rst.py: cleanup coding style Mauro Carvalho Chehab
@ 2025-06-19 12:04   ` Donald Hunter
  0 siblings, 0 replies; 26+ messages in thread
From: Donald Hunter @ 2025-06-19 12:04 UTC (permalink / raw)
  To: Mauro Carvalho Chehab
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:

> Cleanup some coding style issues pointed by pylint and flake8.
>
> No functional changes.
>
> Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
> Reviewed-by: Breno Leitao <leitao@debian.org>

Reviewed-by: Donald Hunter <donald.hunter@gmail.com>

^ permalink raw reply	[flat|nested] 26+ messages in thread

* Re: [PATCH v7 07/17] docs: sphinx: add a parser for yaml files for Netlink specs
  2025-06-19  6:49 ` [PATCH v7 07/17] docs: sphinx: add a parser for yaml files for Netlink specs Mauro Carvalho Chehab
@ 2025-06-19 12:08   ` Donald Hunter
  0 siblings, 0 replies; 26+ messages in thread
From: Donald Hunter @ 2025-06-19 12:08 UTC (permalink / raw)
  To: Mauro Carvalho Chehab
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:

> Add a simple sphinx.Parser to handle yaml files and add the
> code to handle Netlink specs. All other yaml files are
> ignored.
>
> The code was written in a way that makes parsing yaml for
> different subsystems, and even for different parts of Netlink, easy.
>
> All it takes to have a different parser is to add an
> import line similar to:
>
> 	from netlink_yml_parser import YnlDocGenerator
>
> and adding the corresponding parser somewhere in the extension:
>
> 	netlink_parser = YnlDocGenerator()
>
> And then add logic inside parse() to handle different
> doc outputs, depending on the file location, similar to:
>
>         if "/netlink/specs/" in fname:
>             msg = self.netlink_parser.parse_yaml_file(fname)
>
> Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>

Looks like you didn't address my comments from v5:

    > > +class YamlParser(Parser):
    > > +    """Custom parser for YAML files.""" 
    >
    > Would be good to say that this is a common YAML parser that calls
    > different subsystems, e.g. how you described it in the commit message.

    Makes sense. Will fix at the next version.

    >
    > > +
    > > +    # Need at least two elements on this set 
    >
    > I think you can drop this comment. It's not that it must be two
    > elements, it's that supported needs to be a list and the python syntax
    > to force parsing as a list would be ('item', )

    Ah, ok.
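The single-element tuple point can be shown in three lines; parentheses alone are only grouping, so the trailing comma is what turns the literal into a tuple:

```python
just_a_string = ('yaml')       # no comma: this is just the str 'yaml'
one_item_tuple = ('yaml',)     # the trailing comma makes a 1-tuple
supported = ('yaml', 'yml')    # two items already imply a tuple
```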

    > > +    supported = ('yaml', 'yml')
    > > +
    > > +    netlink_parser = YnlDocGenerator()
    > > +
    > > +    def do_parse(self, inputstring, document, msg): 
    >
    > Maybe a better name for this is parse_rst?

    Ok.

    >
    > > +        """Parse YAML and generate a document tree.""" 
    >
    > Also update comment.

    Ok.
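The location-based dispatch described in the commit message above amounts to something like the following sketch (the mapping and generator names here are illustrative, not the actual extension code):

```python
def pick_generator(fname, generators):
    """Return the doc generator registered for this file's location,
    or None so the extension can ignore unrelated yaml files."""
    for path_fragment, generator in generators.items():
        if path_fragment in fname:
            return generator
    return None

# Illustrative registration: only netlink specs get the YNL generator.
generators = {"/netlink/specs/": "YnlDocGenerator"}
```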


^ permalink raw reply	[flat|nested] 26+ messages in thread

* Re: [PATCH v7 12/17] MAINTAINERS: add netlink_yml_parser.py to linux-doc
  2025-06-19  6:49 ` [PATCH v7 12/17] MAINTAINERS: add netlink_yml_parser.py to linux-doc Mauro Carvalho Chehab
@ 2025-06-19 12:10   ` Donald Hunter
  0 siblings, 0 replies; 26+ messages in thread
From: Donald Hunter @ 2025-06-19 12:10 UTC (permalink / raw)
  To: Mauro Carvalho Chehab
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:

> The documentation build depends on the parsing code
> in ynl_gen_rst.py. Ensure that changes to it will be Cc'd
> to the linux-doc ML and maintainers by adding an entry for
> it. This way, if a change there would affect the build,
> or the minimal Python version required, doc developers
> will know in advance.
>
> Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
> Reviewed-by: Breno Leitao <leitao@debian.org>
> ---
>  MAINTAINERS | 1 +
>  1 file changed, 1 insertion(+)
>
> diff --git a/MAINTAINERS b/MAINTAINERS
> index a92290fffa16..caa3425e5755 100644
> --- a/MAINTAINERS
> +++ b/MAINTAINERS
> @@ -7202,6 +7202,7 @@ F:	scripts/get_abi.py
>  F:	scripts/kernel-doc*
>  F:	scripts/lib/abi/*
>  F:	scripts/lib/kdoc/*
> +F:	tools/net/ynl/pyynl/netlink_yml_parser.py

Wrong path now, right?

>  F:	scripts/sphinx-pre-install
>  X:	Documentation/ABI/
>  X:	Documentation/admin-guide/media/

^ permalink raw reply	[flat|nested] 26+ messages in thread

* Re: [PATCH v7 00/17] Don't generate netlink .rst files inside $(srctree)
  2025-06-19  8:29 ` [PATCH v7 00/17] Don't generate netlink .rst files inside $(srctree) Donald Hunter
@ 2025-06-20 10:33   ` Mauro Carvalho Chehab
  0 siblings, 0 replies; 26+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-20 10:33 UTC (permalink / raw)
  To: Donald Hunter
  Cc: Linux Doc Mailing List, Jonathan Corbet, linux-kernel,
	Akira Yokosawa, David S. Miller, Ignacio Encinas Rubio,
	Marco Elver, Shuah Khan, Eric Dumazet, Jan Stancek, Paolo Abeni,
	Ruben Wauters, joel, linux-kernel-mentees, lkmm, netdev, peterz,
	stern, Breno Leitao, Randy Dunlap

On Thu, 19 Jun 2025 09:29:19 +0100
Donald Hunter <donald.hunter@gmail.com> wrote:

> Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:
> >
> > v7:  
> 
> ETOOFAST

Heh, I'm now working on a v8 ;-) I guess I'll send it next
week to give Jon some time to apply the non-YAML-related stuff.

> 
> https://docs.kernel.org/process/maintainer-netdev.html#resending-after-review
> 
> I didn't complete reviewing v6 yet :(
> 
> > - Added a patch to clean up conf.py and address coding style issues;
> > - Added a docutils version check logic to detect known issues when
> >   building the docs with too old or too new docutils version.  The
> >   actual min/max version depends on Sphinx version.
> 
> It seems to me that patches 15-17 belong in a different series.

Yes. I'm splitting it into two separate series: the first one
with changes that are independent of the YAML parser, and the
second one with the YAML-specific stuff. The second will depend
on the first, due to the changes to Documentation/conf.py
made by the first series.

Regards,
Mauro

^ permalink raw reply	[flat|nested] 26+ messages in thread

* Re: [PATCH v7 04/17] tools: ynl_gen_rst.py: Split library from command line tool
  2025-06-19 12:01   ` Donald Hunter
@ 2025-06-20 10:58     ` Mauro Carvalho Chehab
  0 siblings, 0 replies; 26+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-20 10:58 UTC (permalink / raw)
  To: Donald Hunter
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

On Thu, 19 Jun 2025 13:01:14 +0100
Donald Hunter <donald.hunter@gmail.com> wrote:

> Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:
> > +
> > +    def generate_main_index_rst(self, output: str, index_dir: str) -> None:
> > +        """Generate the `networking_spec/index` content and write to the file"""
> > +        lines = []
> > +
> > +        lines.append(self.fmt.rst_header())
> > +        lines.append(self.fmt.rst_label("specs"))
> > +        lines.append(self.fmt.rst_title("Netlink Family Specifications"))
> > +        lines.append(self.fmt.rst_toctree(1))
> > +
> > +        index_fname = os.path.basename(output)
> > +        base, ext = os.path.splitext(index_fname)
> > +
> > +        if not index_dir:
> > +            index_dir = os.path.dirname(output)
> > +
> > +        logging.debug(f"Looking for {ext} files in %s", index_dir)
> > +        for filename in sorted(os.listdir(index_dir)):
> > +            if not filename.endswith(ext) or filename == index_fname:
> > +                continue
> > +            base, ext = os.path.splitext(filename)
> > +            lines.append(f"   {base}\n")
> > +
> > +        logging.debug("Writing an index file at %s", output)
> > +
> > +        return "".join(lines)  
> 
> Did you miss my comment on v5 asking not to move this from ynl_gen_rst.py?
> It is not needed and gets removed in a later patch.
> 
> > @@ -411,7 +65,6 @@ def write_to_rstfile(content: str, filename: str) -> None:
> >  
> >  def generate_main_index_rst(output: str) -> None:
> >      """Generate the `networking_spec/index` content and write to the file"""
> > -    lines = []
> >  
> >      lines.append(rst_header())
> >      lines.append(rst_label("specs"))
> > @@ -426,7 +79,7 @@ def generate_main_index_rst(output: str) -> None:
> >          lines.append(f"   {filename.replace('.rst', '')}\n")
> >  
> >      logging.debug("Writing an index file at %s", output)
> > -    write_to_rstfile("".join(lines), output)
> > +    write_to_rstfile(msg, output)  
> 
> The changes leave this function broken: both lines and msg are never
> defined.

I fixed the change in generate_main_index_rst() and dropped it from
doc_parser.py.

I made a small change to the logic so it accepts both .rst and .yaml
there, as that makes it easier to test with:

	tools/net/ynl/pyynl/ynl_gen_rst.py -x -o Documentation/netlink/specs/foo.rst
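The relaxed scan amounts to something like this sketch (names here are illustrative; the actual change is in the patch quoted later in this message):

```python
import os

def index_entries(index_dir, index_fname="index.rst"):
    """Collect toctree entry names, accepting .rst and .yaml alike,
    while skipping the index file itself and unrelated files."""
    entries = []
    for filename in sorted(os.listdir(index_dir)):
        base, ext = os.path.splitext(filename)
        if filename == index_fname or ext not in (".rst", ".yaml"):
            continue
        entries.append(base)
    return entries
```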

While I'll be sending it with the next version, I'm placing
the modified version below:

---

[PATCH v7.1 04/17] tools: ynl_gen_rst.py: Split library from command line tool

As we'll be using the Netlink specs parser inside a Sphinx
extension, move the library part out of the command line tool.

While here, change the code which generates the index file
to accept inputs with both .rst and .yaml extensions. With
that, the tool can easily be tested with:

	tools/net/ynl/pyynl/ynl_gen_rst.py -x -o Documentation/netlink/specs/foo.rst

This avoids having to first generate a temp directory with the
rst files.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>

diff --git a/tools/net/ynl/pyynl/lib/__init__.py b/tools/net/ynl/pyynl/lib/__init__.py
index 71518b9842ee..5f266ebe4526 100644
--- a/tools/net/ynl/pyynl/lib/__init__.py
+++ b/tools/net/ynl/pyynl/lib/__init__.py
@@ -4,6 +4,8 @@ from .nlspec import SpecAttr, SpecAttrSet, SpecEnumEntry, SpecEnumSet, \
     SpecFamily, SpecOperation, SpecSubMessage, SpecSubMessageFormat
 from .ynl import YnlFamily, Netlink, NlError
 
+from .doc_generator import YnlDocGenerator
+
 __all__ = ["SpecAttr", "SpecAttrSet", "SpecEnumEntry", "SpecEnumSet",
            "SpecFamily", "SpecOperation", "SpecSubMessage", "SpecSubMessageFormat",
            "YnlFamily", "Netlink", "NlError"]
diff --git a/tools/net/ynl/pyynl/lib/doc_generator.py b/tools/net/ynl/pyynl/lib/doc_generator.py
new file mode 100644
index 000000000000..80e468086693
--- /dev/null
+++ b/tools/net/ynl/pyynl/lib/doc_generator.py
@@ -0,0 +1,382 @@
+#!/usr/bin/env python3
+# SPDX-License-Identifier: GPL-2.0
+# -*- coding: utf-8; mode: python -*-
+
+"""
+    Class to auto generate the documentation for Netlink specifications.
+
+    :copyright:  Copyright (C) 2023  Breno Leitao <leitao@debian.org>
+    :license:    GPL Version 2, June 1991 see linux/COPYING for details.
+
+    This class performs extensive parsing to the Linux kernel's netlink YAML
+    spec files, in an effort to avoid needing to heavily mark up the original
+    YAML file.
+
+    This code is split in two classes:
+        1) RST formatters: Use to convert a string to a RST output
+        2) YAML Netlink (YNL) doc generator: Generate docs from YAML data
+"""
+
+from typing import Any, Dict, List
+import os.path
+import sys
+import argparse
+import logging
+import yaml
+
+
+# ==============
+# RST Formatters
+# ==============
+class RstFormatters:
+    SPACE_PER_LEVEL = 4
+
+    @staticmethod
+    def headroom(level: int) -> str:
+        """Return space to format"""
+        return " " * (level * RstFormatters.SPACE_PER_LEVEL)
+
+
+    @staticmethod
+    def bold(text: str) -> str:
+        """Format bold text"""
+        return f"**{text}**"
+
+
+    @staticmethod
+    def inline(text: str) -> str:
+        """Format inline text"""
+        return f"``{text}``"
+
+
+    @staticmethod
+    def sanitize(text: str) -> str:
+        """Remove newlines and multiple spaces"""
+        # This is useful for some fields that are spread across multiple lines
+        return str(text).replace("\n", " ").strip()
+
+
+    def rst_fields(self, key: str, value: str, level: int = 0) -> str:
+        """Return a RST formatted field"""
+        return self.headroom(level) + f":{key}: {value}"
+
+
+    def rst_definition(self, key: str, value: Any, level: int = 0) -> str:
+        """Format a single rst definition"""
+        return self.headroom(level) + key + "\n" + self.headroom(level + 1) + str(value)
+
+
+    def rst_paragraph(self, paragraph: str, level: int = 0) -> str:
+        """Return a formatted paragraph"""
+        return self.headroom(level) + paragraph
+
+
+    def rst_bullet(self, item: str, level: int = 0) -> str:
+        """Return a formatted a bullet"""
+        return self.headroom(level) + f"- {item}"
+
+
+    @staticmethod
+    def rst_subsection(title: str) -> str:
+        """Add a sub-section to the document"""
+        return f"{title}\n" + "-" * len(title)
+
+
+    @staticmethod
+    def rst_subsubsection(title: str) -> str:
+        """Add a sub-sub-section to the document"""
+        return f"{title}\n" + "~" * len(title)
+
+
+    @staticmethod
+    def rst_section(namespace: str, prefix: str, title: str) -> str:
+        """Add a section to the document"""
+        return f".. _{namespace}-{prefix}-{title}:\n\n{title}\n" + "=" * len(title)
+
+
+    @staticmethod
+    def rst_subtitle(title: str) -> str:
+        """Add a subtitle to the document"""
+        return "\n" + "-" * len(title) + f"\n{title}\n" + "-" * len(title) + "\n\n"
+
+
+    @staticmethod
+    def rst_title(title: str) -> str:
+        """Add a title to the document"""
+        return "=" * len(title) + f"\n{title}\n" + "=" * len(title) + "\n\n"
+
+
+    def rst_list_inline(self, list_: List[str], level: int = 0) -> str:
+        """Format a list using inlines"""
+        return self.headroom(level) + "[" + ", ".join(self.inline(i) for i in list_) + "]"
+
+
+    @staticmethod
+    def rst_ref(namespace: str, prefix: str, name: str) -> str:
+        """Add a hyperlink to the document"""
+        mappings = {'enum': 'definition',
+                    'fixed-header': 'definition',
+                    'nested-attributes': 'attribute-set',
+                    'struct': 'definition'}
+        if prefix in mappings:
+            prefix = mappings[prefix]
+        return f":ref:`{namespace}-{prefix}-{name}`"
+
+
+    def rst_header(self) -> str:
+        """The headers for all the auto generated RST files"""
+        lines = []
+
+        lines.append(self.rst_paragraph(".. SPDX-License-Identifier: GPL-2.0"))
+        lines.append(self.rst_paragraph(".. NOTE: This document was auto-generated.\n\n"))
+
+        return "\n".join(lines)
+
+
+    @staticmethod
+    def rst_toctree(maxdepth: int = 2) -> str:
+        """Generate a toctree RST primitive"""
+        lines = []
+
+        lines.append(".. toctree::")
+        lines.append(f"   :maxdepth: {maxdepth}\n\n")
+
+        return "\n".join(lines)
+
+
+    @staticmethod
+    def rst_label(title: str) -> str:
+        """Return a formatted label"""
+        return f".. _{title}:\n\n"
+
+# =======
+# Parsers
+# =======
+class YnlDocGenerator:
+
+    fmt = RstFormatters()
+
+    def parse_mcast_group(self, mcast_group: List[Dict[str, Any]]) -> str:
+        """Parse 'multicast' group list and return a formatted string"""
+        lines = []
+        for group in mcast_group:
+            lines.append(self.fmt.rst_bullet(group["name"]))
+
+        return "\n".join(lines)
+
+
+    def parse_do(self, do_dict: Dict[str, Any], level: int = 0) -> str:
+        """Parse 'do' section and return a formatted string"""
+        lines = []
+        for key in do_dict.keys():
+            lines.append(self.fmt.rst_paragraph(self.fmt.bold(key), level + 1))
+            if key in ['request', 'reply']:
+                lines.append(self.parse_do_attributes(do_dict[key], level + 1) + "\n")
+            else:
+                lines.append(self.fmt.headroom(level + 2) + do_dict[key] + "\n")
+
+        return "\n".join(lines)
+
+
+    def parse_do_attributes(self, attrs: Dict[str, Any], level: int = 0) -> str:
+        """Parse 'attributes' section"""
+        if "attributes" not in attrs:
+            return ""
+        lines = [self.fmt.rst_fields("attributes", self.fmt.rst_list_inline(attrs["attributes"]), level + 1)]
+
+        return "\n".join(lines)
+
+
+    def parse_operations(self, operations: List[Dict[str, Any]], namespace: str) -> str:
+        """Parse operations block"""
+        preprocessed = ["name", "doc", "title", "do", "dump", "flags"]
+        linkable = ["fixed-header", "attribute-set"]
+        lines = []
+
+        for operation in operations:
+            lines.append(self.fmt.rst_section(namespace, 'operation', operation["name"]))
+            lines.append(self.fmt.rst_paragraph(operation["doc"]) + "\n")
+
+            for key in operation.keys():
+                if key in preprocessed:
+                    # Skip the special fields
+                    continue
+                value = operation[key]
+                if key in linkable:
+                    value = self.fmt.rst_ref(namespace, key, value)
+                lines.append(self.fmt.rst_fields(key, value, 0))
+            if 'flags' in operation:
+                lines.append(self.fmt.rst_fields('flags', self.fmt.rst_list_inline(operation['flags'])))
+
+            if "do" in operation:
+                lines.append(self.fmt.rst_paragraph(":do:", 0))
+                lines.append(self.parse_do(operation["do"], 0))
+            if "dump" in operation:
+                lines.append(self.fmt.rst_paragraph(":dump:", 0))
+                lines.append(self.parse_do(operation["dump"], 0))
+
+            # New line after fields
+            lines.append("\n")
+
+        return "\n".join(lines)
+
+
+    def parse_entries(self, entries: List[Dict[str, Any]], level: int) -> str:
+        """Parse a list of entries"""
+        ignored = ["pad"]
+        lines = []
+        for entry in entries:
+            if isinstance(entry, dict):
+                # entries could be a list or a dictionary
+                field_name = entry.get("name", "")
+                if field_name in ignored:
+                    continue
+                type_ = entry.get("type")
+                if type_:
+                    field_name += f" ({self.fmt.inline(type_)})"
+                lines.append(
+                    self.fmt.rst_fields(field_name, self.fmt.sanitize(entry.get("doc", "")), level)
+                )
+            elif isinstance(entry, list):
+                lines.append(self.fmt.rst_list_inline(entry, level))
+            else:
+                lines.append(self.fmt.rst_bullet(self.fmt.inline(self.fmt.sanitize(entry)), level))
+
+        lines.append("\n")
+        return "\n".join(lines)
+
+
+    def parse_definitions(self, defs: Dict[str, Any], namespace: str) -> str:
+        """Parse definitions section"""
+        preprocessed = ["name", "entries", "members"]
+        ignored = ["render-max"]  # This is not printed
+        lines = []
+
+        for definition in defs:
+            lines.append(self.fmt.rst_section(namespace, 'definition', definition["name"]))
+            for k in definition.keys():
+                if k in preprocessed + ignored:
+                    continue
+                lines.append(self.fmt.rst_fields(k, self.fmt.sanitize(definition[k]), 0))
+
+            # Field list needs to finish with a new line
+            lines.append("\n")
+            if "entries" in definition:
+                lines.append(self.fmt.rst_paragraph(":entries:", 0))
+                lines.append(self.parse_entries(definition["entries"], 1))
+            if "members" in definition:
+                lines.append(self.fmt.rst_paragraph(":members:", 0))
+                lines.append(self.parse_entries(definition["members"], 1))
+
+        return "\n".join(lines)
+
+
+    def parse_attr_sets(self, entries: List[Dict[str, Any]], namespace: str) -> str:
+        """Parse attribute from attribute-set"""
+        preprocessed = ["name", "type"]
+        linkable = ["enum", "nested-attributes", "struct", "sub-message"]
+        ignored = ["checks"]
+        lines = []
+
+        for entry in entries:
+            lines.append(self.fmt.rst_section(namespace, 'attribute-set', entry["name"]))
+            for attr in entry["attributes"]:
+                type_ = attr.get("type")
+                attr_line = attr["name"]
+                if type_:
+                    # Add the attribute type in the same line
+                    attr_line += f" ({self.fmt.inline(type_)})"
+
+                lines.append(self.fmt.rst_subsubsection(attr_line))
+
+                for k in attr.keys():
+                    if k in preprocessed + ignored:
+                        continue
+                    if k in linkable:
+                        value = self.fmt.rst_ref(namespace, k, attr[k])
+                    else:
+                        value = self.fmt.sanitize(attr[k])
+                    lines.append(self.fmt.rst_fields(k, value, 0))
+                lines.append("\n")
+
+        return "\n".join(lines)
+
+
+    def parse_sub_messages(self, entries: List[Dict[str, Any]], namespace: str) -> str:
+        """Parse sub-message definitions"""
+        lines = []
+
+        for entry in entries:
+            lines.append(self.fmt.rst_section(namespace, 'sub-message', entry["name"]))
+            for fmt in entry["formats"]:
+                value = fmt["value"]
+
+                lines.append(self.fmt.rst_bullet(self.fmt.bold(value)))
+                for attr in ['fixed-header', 'attribute-set']:
+                    if attr in fmt:
+                        lines.append(self.fmt.rst_fields(attr,
+                                                self.fmt.rst_ref(namespace, attr, fmt[attr]),
+                                                1))
+                lines.append("\n")
+
+        return "\n".join(lines)
+
+
+    def parse_yaml(self, obj: Dict[str, Any]) -> str:
+        """Format the whole YAML into a RST string"""
+        lines = []
+
+        # Main header
+
+        family = obj['name']
+
+        lines.append(self.fmt.rst_header())
+        lines.append(self.fmt.rst_label("netlink-" + family))
+
+        title = f"Family ``{family}`` netlink specification"
+        lines.append(self.fmt.rst_title(title))
+        lines.append(self.fmt.rst_paragraph(".. contents:: :depth: 3\n"))
+
+        if "doc" in obj:
+            lines.append(self.fmt.rst_subtitle("Summary"))
+            lines.append(self.fmt.rst_paragraph(obj["doc"], 0))
+
+        # Operations
+        if "operations" in obj:
+            lines.append(self.fmt.rst_subtitle("Operations"))
+            lines.append(self.parse_operations(obj["operations"]["list"], family))
+
+        # Multicast groups
+        if "mcast-groups" in obj:
+            lines.append(self.fmt.rst_subtitle("Multicast groups"))
+            lines.append(self.parse_mcast_group(obj["mcast-groups"]["list"]))
+
+        # Definitions
+        if "definitions" in obj:
+            lines.append(self.fmt.rst_subtitle("Definitions"))
+            lines.append(self.parse_definitions(obj["definitions"], family))
+
+        # Attributes set
+        if "attribute-sets" in obj:
+            lines.append(self.fmt.rst_subtitle("Attribute sets"))
+            lines.append(self.parse_attr_sets(obj["attribute-sets"], family))
+
+        # Sub-messages
+        if "sub-messages" in obj:
+            lines.append(self.fmt.rst_subtitle("Sub-messages"))
+            lines.append(self.parse_sub_messages(obj["sub-messages"], family))
+
+        return "\n".join(lines)
+
+
+    # Main functions
+    # ==============
+
+
+    def parse_yaml_file(self, filename: str) -> str:
+        """Transform the YAML specified by filename into an RST-formatted string"""
+        with open(filename, "r", encoding="utf-8") as spec_file:
+            yaml_data = yaml.safe_load(spec_file)
+            content = self.parse_yaml(yaml_data)
+
+        return content
diff --git a/tools/net/ynl/pyynl/ynl_gen_rst.py b/tools/net/ynl/pyynl/ynl_gen_rst.py
index 7bfb8ceeeefc..010315fad498 100755
--- a/tools/net/ynl/pyynl/ynl_gen_rst.py
+++ b/tools/net/ynl/pyynl/ynl_gen_rst.py
@@ -10,354 +10,17 @@
 
     This script performs extensive parsing to the Linux kernel's netlink YAML
     spec files, in an effort to avoid needing to heavily mark up the original
-    YAML file.
-
-    This code is split in three big parts:
-        1) RST formatters: Use to convert a string to a RST output
-        2) Parser helpers: Functions to parse the YAML data structure
-        3) Main function and small helpers
+    YAML file. It uses the library code from scripts/lib.
 """
 
-from typing import Any, Dict, List
 import os.path
+import pathlib
 import sys
 import argparse
 import logging
-import yaml
-
-
-SPACE_PER_LEVEL = 4
-
-
-# RST Formatters
-# ==============
-def headroom(level: int) -> str:
-    """Return space to format"""
-    return " " * (level * SPACE_PER_LEVEL)
-
-
-def bold(text: str) -> str:
-    """Format bold text"""
-    return f"**{text}**"
-
-
-def inline(text: str) -> str:
-    """Format inline text"""
-    return f"``{text}``"
-
-
-def sanitize(text: str) -> str:
-    """Remove newlines and multiple spaces"""
-    # This is useful for some fields that are spread across multiple lines
-    return str(text).replace("\n", " ").strip()
-
-
-def rst_fields(key: str, value: str, level: int = 0) -> str:
-    """Return a RST formatted field"""
-    return headroom(level) + f":{key}: {value}"
-
-
-def rst_definition(key: str, value: Any, level: int = 0) -> str:
-    """Format a single rst definition"""
-    return headroom(level) + key + "\n" + headroom(level + 1) + str(value)
-
-
-def rst_paragraph(paragraph: str, level: int = 0) -> str:
-    """Return a formatted paragraph"""
-    return headroom(level) + paragraph
-
-
-def rst_bullet(item: str, level: int = 0) -> str:
-    """Return a formatted a bullet"""
-    return headroom(level) + f"- {item}"
-
-
-def rst_subsection(title: str) -> str:
-    """Add a sub-section to the document"""
-    return f"{title}\n" + "-" * len(title)
-
-
-def rst_subsubsection(title: str) -> str:
-    """Add a sub-sub-section to the document"""
-    return f"{title}\n" + "~" * len(title)
-
-
-def rst_section(namespace: str, prefix: str, title: str) -> str:
-    """Add a section to the document"""
-    return f".. _{namespace}-{prefix}-{title}:\n\n{title}\n" + "=" * len(title)
-
-
-def rst_subtitle(title: str) -> str:
-    """Add a subtitle to the document"""
-    return "\n" + "-" * len(title) + f"\n{title}\n" + "-" * len(title) + "\n\n"
-
-
-def rst_title(title: str) -> str:
-    """Add a title to the document"""
-    return "=" * len(title) + f"\n{title}\n" + "=" * len(title) + "\n\n"
-
-
-def rst_list_inline(list_: List[str], level: int = 0) -> str:
-    """Format a list using inlines"""
-    return headroom(level) + "[" + ", ".join(inline(i) for i in list_) + "]"
-
-
-def rst_ref(namespace: str, prefix: str, name: str) -> str:
-    """Add a hyperlink to the document"""
-    mappings = {'enum': 'definition',
-                'fixed-header': 'definition',
-                'nested-attributes': 'attribute-set',
-                'struct': 'definition'}
-    if prefix in mappings:
-        prefix = mappings[prefix]
-    return f":ref:`{namespace}-{prefix}-{name}`"
-
-
-def rst_header() -> str:
-    """The headers for all the auto generated RST files"""
-    lines = []
-
-    lines.append(rst_paragraph(".. SPDX-License-Identifier: GPL-2.0"))
-    lines.append(rst_paragraph(".. NOTE: This document was auto-generated.\n\n"))
-
-    return "\n".join(lines)
-
-
-def rst_toctree(maxdepth: int = 2) -> str:
-    """Generate a toctree RST primitive"""
-    lines = []
-
-    lines.append(".. toctree::")
-    lines.append(f"   :maxdepth: {maxdepth}\n\n")
-
-    return "\n".join(lines)
-
-
-def rst_label(title: str) -> str:
-    """Return a formatted label"""
-    return f".. _{title}:\n\n"
-
-
-# Parsers
-# =======
-
-
-def parse_mcast_group(mcast_group: List[Dict[str, Any]]) -> str:
-    """Parse 'multicast' group list and return a formatted string"""
-    lines = []
-    for group in mcast_group:
-        lines.append(rst_bullet(group["name"]))
-
-    return "\n".join(lines)
-
-
-def parse_do(do_dict: Dict[str, Any], level: int = 0) -> str:
-    """Parse 'do' section and return a formatted string"""
-    lines = []
-    for key in do_dict.keys():
-        lines.append(rst_paragraph(bold(key), level + 1))
-        if key in ['request', 'reply']:
-            lines.append(parse_do_attributes(do_dict[key], level + 1) + "\n")
-        else:
-            lines.append(headroom(level + 2) + do_dict[key] + "\n")
-
-    return "\n".join(lines)
-
-
-def parse_do_attributes(attrs: Dict[str, Any], level: int = 0) -> str:
-    """Parse 'attributes' section"""
-    if "attributes" not in attrs:
-        return ""
-    lines = [rst_fields("attributes", rst_list_inline(attrs["attributes"]), level + 1)]
-
-    return "\n".join(lines)
-
-
-def parse_operations(operations: List[Dict[str, Any]], namespace: str) -> str:
-    """Parse operations block"""
-    preprocessed = ["name", "doc", "title", "do", "dump", "flags"]
-    linkable = ["fixed-header", "attribute-set"]
-    lines = []
-
-    for operation in operations:
-        lines.append(rst_section(namespace, 'operation', operation["name"]))
-        lines.append(rst_paragraph(operation["doc"]) + "\n")
-
-        for key in operation.keys():
-            if key in preprocessed:
-                # Skip the special fields
-                continue
-            value = operation[key]
-            if key in linkable:
-                value = rst_ref(namespace, key, value)
-            lines.append(rst_fields(key, value, 0))
-        if 'flags' in operation:
-            lines.append(rst_fields('flags', rst_list_inline(operation['flags'])))
-
-        if "do" in operation:
-            lines.append(rst_paragraph(":do:", 0))
-            lines.append(parse_do(operation["do"], 0))
-        if "dump" in operation:
-            lines.append(rst_paragraph(":dump:", 0))
-            lines.append(parse_do(operation["dump"], 0))
-
-        # New line after fields
-        lines.append("\n")
-
-    return "\n".join(lines)
-
-
-def parse_entries(entries: List[Dict[str, Any]], level: int) -> str:
-    """Parse a list of entries"""
-    ignored = ["pad"]
-    lines = []
-    for entry in entries:
-        if isinstance(entry, dict):
-            # entries could be a list or a dictionary
-            field_name = entry.get("name", "")
-            if field_name in ignored:
-                continue
-            type_ = entry.get("type")
-            if type_:
-                field_name += f" ({inline(type_)})"
-            lines.append(
-                rst_fields(field_name, sanitize(entry.get("doc", "")), level)
-            )
-        elif isinstance(entry, list):
-            lines.append(rst_list_inline(entry, level))
-        else:
-            lines.append(rst_bullet(inline(sanitize(entry)), level))
-
-    lines.append("\n")
-    return "\n".join(lines)
-
-
-def parse_definitions(defs: Dict[str, Any], namespace: str) -> str:
-    """Parse definitions section"""
-    preprocessed = ["name", "entries", "members"]
-    ignored = ["render-max"]  # This is not printed
-    lines = []
-
-    for definition in defs:
-        lines.append(rst_section(namespace, 'definition', definition["name"]))
-        for k in definition.keys():
-            if k in preprocessed + ignored:
-                continue
-            lines.append(rst_fields(k, sanitize(definition[k]), 0))
-
-        # Field list needs to finish with a new line
-        lines.append("\n")
-        if "entries" in definition:
-            lines.append(rst_paragraph(":entries:", 0))
-            lines.append(parse_entries(definition["entries"], 1))
-        if "members" in definition:
-            lines.append(rst_paragraph(":members:", 0))
-            lines.append(parse_entries(definition["members"], 1))
-
-    return "\n".join(lines)
-
-
-def parse_attr_sets(entries: List[Dict[str, Any]], namespace: str) -> str:
-    """Parse attribute from attribute-set"""
-    preprocessed = ["name", "type"]
-    linkable = ["enum", "nested-attributes", "struct", "sub-message"]
-    ignored = ["checks"]
-    lines = []
-
-    for entry in entries:
-        lines.append(rst_section(namespace, 'attribute-set', entry["name"]))
-        for attr in entry["attributes"]:
-            type_ = attr.get("type")
-            attr_line = attr["name"]
-            if type_:
-                # Add the attribute type in the same line
-                attr_line += f" ({inline(type_)})"
-
-            lines.append(rst_subsubsection(attr_line))
-
-            for k in attr.keys():
-                if k in preprocessed + ignored:
-                    continue
-                if k in linkable:
-                    value = rst_ref(namespace, k, attr[k])
-                else:
-                    value = sanitize(attr[k])
-                lines.append(rst_fields(k, value, 0))
-            lines.append("\n")
-
-    return "\n".join(lines)
-
-
-def parse_sub_messages(entries: List[Dict[str, Any]], namespace: str) -> str:
-    """Parse sub-message definitions"""
-    lines = []
-
-    for entry in entries:
-        lines.append(rst_section(namespace, 'sub-message', entry["name"]))
-        for fmt in entry["formats"]:
-            value = fmt["value"]
-
-            lines.append(rst_bullet(bold(value)))
-            for attr in ['fixed-header', 'attribute-set']:
-                if attr in fmt:
-                    lines.append(rst_fields(attr,
-                                            rst_ref(namespace, attr, fmt[attr]),
-                                            1))
-            lines.append("\n")
-
-    return "\n".join(lines)
-
-
-def parse_yaml(obj: Dict[str, Any]) -> str:
-    """Format the whole YAML into a RST string"""
-    lines = []
-
-    # Main header
-
-    family = obj['name']
-
-    lines.append(rst_header())
-    lines.append(rst_label("netlink-" + family))
-
-    title = f"Family ``{family}`` netlink specification"
-    lines.append(rst_title(title))
-    lines.append(rst_paragraph(".. contents:: :depth: 3\n"))
-
-    if "doc" in obj:
-        lines.append(rst_subtitle("Summary"))
-        lines.append(rst_paragraph(obj["doc"], 0))
-
-    # Operations
-    if "operations" in obj:
-        lines.append(rst_subtitle("Operations"))
-        lines.append(parse_operations(obj["operations"]["list"], family))
-
-    # Multicast groups
-    if "mcast-groups" in obj:
-        lines.append(rst_subtitle("Multicast groups"))
-        lines.append(parse_mcast_group(obj["mcast-groups"]["list"]))
-
-    # Definitions
-    if "definitions" in obj:
-        lines.append(rst_subtitle("Definitions"))
-        lines.append(parse_definitions(obj["definitions"], family))
-
-    # Attributes set
-    if "attribute-sets" in obj:
-        lines.append(rst_subtitle("Attribute sets"))
-        lines.append(parse_attr_sets(obj["attribute-sets"], family))
-
-    # Sub-messages
-    if "sub-messages" in obj:
-        lines.append(rst_subtitle("Sub-messages"))
-        lines.append(parse_sub_messages(obj["sub-messages"], family))
-
-    return "\n".join(lines)
-
-
-# Main functions
-# ==============
 
+sys.path.append(pathlib.Path(__file__).resolve().parent.as_posix())
+from lib import YnlDocGenerator    # pylint: disable=C0413
 
 def parse_arguments() -> argparse.Namespace:
     """Parse arguments from user"""
@@ -392,15 +55,6 @@ def parse_arguments() -> argparse.Namespace:
     return args
 
 
-def parse_yaml_file(filename: str) -> str:
-    """Transform the YAML specified by filename into an RST-formatted string"""
-    with open(filename, "r", encoding="utf-8") as spec_file:
-        yaml_data = yaml.safe_load(spec_file)
-        content = parse_yaml(yaml_data)
-
-    return content
-
-
 def write_to_rstfile(content: str, filename: str) -> None:
     """Write the generated content into an RST file"""
     logging.debug("Saving RST file to %s", filename)
@@ -409,21 +63,22 @@ def write_to_rstfile(content: str, filename: str) -> None:
         rst_file.write(content)
 
 
-def generate_main_index_rst(output: str) -> None:
+def generate_main_index_rst(parser: YnlDocGenerator, output: str) -> None:
     """Generate the `networking_spec/index` content and write to the file"""
     lines = []
 
-    lines.append(rst_header())
-    lines.append(rst_label("specs"))
-    lines.append(rst_title("Netlink Family Specifications"))
-    lines.append(rst_toctree(1))
+    lines.append(parser.fmt.rst_header())
+    lines.append(parser.fmt.rst_label("specs"))
+    lines.append(parser.fmt.rst_title("Netlink Family Specifications"))
+    lines.append(parser.fmt.rst_toctree(1))
 
     index_dir = os.path.dirname(output)
     logging.debug("Looking for .rst files in %s", index_dir)
     for filename in sorted(os.listdir(index_dir)):
-        if not filename.endswith(".rst") or filename == "index.rst":
+        base, ext = os.path.splitext(filename)
+        if filename == "index.rst" or ext not in [".rst", ".yaml"]:
             continue
-        lines.append(f"   {filename.replace('.rst', '')}\n")
+        lines.append(f"   {base}\n")
 
     logging.debug("Writing an index file at %s", output)
     write_to_rstfile("".join(lines), output)
@@ -434,10 +89,12 @@ def main() -> None:
 
     args = parse_arguments()
 
+    parser = YnlDocGenerator()
+
     if args.input:
         logging.debug("Parsing %s", args.input)
         try:
-            content = parse_yaml_file(os.path.join(args.input))
+            content = parser.parse_yaml_file(os.path.join(args.input))
         except Exception as exception:
             logging.warning("Failed to parse %s.", args.input)
             logging.warning(exception)
@@ -447,7 +104,7 @@ def main() -> None:
 
     if args.index:
         # Generate the index RST file
-        generate_main_index_rst(args.output)
+        generate_main_index_rst(parser, args.output)
 
 
 if __name__ == "__main__":
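The most subtle behavioral change in this hunk is the index filter in `generate_main_index_rst()`: it now accepts both `.rst` and `.yaml` files via `os.path.splitext()` instead of matching only `.rst` names. A self-contained sketch of that filter (the directory layout below is invented for illustration):

```python
import os
import tempfile

def index_entries(index_dir: str) -> list:
    """Mirror the patched filter: accept .rst and .yaml files,
    skip index.rst itself, and emit the extension-less basename
    used as a toctree entry."""
    entries = []
    for filename in sorted(os.listdir(index_dir)):
        base, ext = os.path.splitext(filename)
        if filename == "index.rst" or ext not in [".rst", ".yaml"]:
            continue
        entries.append(base)
    return entries

with tempfile.TemporaryDirectory() as d:
    # Hypothetical spec directory: YAML specs now sit next to .rst files.
    for name in ["index.rst", "rt_link.yaml", "devlink.rst", "notes.txt"]:
        open(os.path.join(d, name), "w").close()
    print(index_entries(d))  # ['devlink', 'rt_link']
```

With the `parser_yaml` extension from this series, those `.yaml` basenames become valid toctree entries, so the index can reference specs that are converted at build time rather than pre-generated into $(srctree).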

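For readers skimming the removed hunk, the cross-reference helper `rst_ref()` that the series moves into the library is worth a closer look: some key prefixes are remapped to the section kind they actually link to. A quick standalone check (reproducing the removed helper; the namespace and name below are made up):

```python
def rst_ref(namespace: str, prefix: str, name: str) -> str:
    """Build a Sphinx :ref: target; alias certain prefixes to the
    section kind they resolve to (enums and structs both live under
    'definition' sections, nested attributes under 'attribute-set')."""
    mappings = {'enum': 'definition',
                'fixed-header': 'definition',
                'nested-attributes': 'attribute-set',
                'struct': 'definition'}
    prefix = mappings.get(prefix, prefix)
    return f":ref:`{namespace}-{prefix}-{name}`"

print(rst_ref("rt-link", "enum", "ifinfo-flags"))
# :ref:`rt-link-definition-ifinfo-flags`
```

This is why `rst_section()` labels are emitted as `namespace-definition-name` even for enums and structs: the reference side normalizes to the same prefix.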


Thread overview: 26+ messages (download: mbox.gz / follow: Atom feed)
2025-06-19  6:48 [PATCH v7 00/17] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
2025-06-19  6:48 ` [PATCH v7 01/17] docs: conf.py: properly handle include and exclude patterns Mauro Carvalho Chehab
2025-06-19  6:48 ` [PATCH v7 02/17] docs: Makefile: disable check rules on make cleandocs Mauro Carvalho Chehab
2025-06-19  6:48 ` [PATCH v7 03/17] docs: netlink: netlink-raw.rst: use :ref: instead of :doc: Mauro Carvalho Chehab
2025-06-19  8:57   ` Donald Hunter
2025-06-19  6:48 ` [PATCH v7 04/17] tools: ynl_gen_rst.py: Split library from command line tool Mauro Carvalho Chehab
2025-06-19 12:01   ` Donald Hunter
2025-06-20 10:58     ` Mauro Carvalho Chehab
2025-06-19  6:48 ` [PATCH v7 05/17] docs: netlink: index.rst: add a netlink index file Mauro Carvalho Chehab
2025-06-19  6:48 ` [PATCH v7 06/17] tools: ynl_gen_rst.py: cleanup coding style Mauro Carvalho Chehab
2025-06-19 12:04   ` Donald Hunter
2025-06-19  6:49 ` [PATCH v7 07/17] docs: sphinx: add a parser for yaml files for Netlink specs Mauro Carvalho Chehab
2025-06-19 12:08   ` Donald Hunter
2025-06-19  6:49 ` [PATCH v7 08/17] docs: use parser_yaml extension to handle " Mauro Carvalho Chehab
2025-06-19  6:49 ` [PATCH v7 09/17] docs: uapi: netlink: update netlink specs link Mauro Carvalho Chehab
2025-06-19  6:49 ` [PATCH v7 10/17] tools: ynl_gen_rst.py: drop support for generating index files Mauro Carvalho Chehab
2025-06-19  6:49 ` [PATCH v7 11/17] docs: netlink: remove obsolete .gitignore from unused directory Mauro Carvalho Chehab
2025-06-19  6:49 ` [PATCH v7 12/17] MAINTAINERS: add netlink_yml_parser.py to linux-doc Mauro Carvalho Chehab
2025-06-19 12:10   ` Donald Hunter
2025-06-19  6:49 ` [PATCH v7 13/17] tools: netlink_yml_parser.py: add line numbers to parsed data Mauro Carvalho Chehab
2025-06-19  6:49 ` [PATCH v7 14/17] docs: parser_yaml.py: add support for line numbers from the parser Mauro Carvalho Chehab
2025-06-19  6:49 ` [PATCH v7 15/17] docs: sphinx: add a file with the requirements for lowest version Mauro Carvalho Chehab
2025-06-19  6:49 ` [PATCH v7 16/17] docs: conf.py: several coding style fixes Mauro Carvalho Chehab
2025-06-19  6:49 ` [PATCH v7 17/17] docs: conf.py: Check Sphinx and docutils version Mauro Carvalho Chehab
2025-06-19  8:29 ` [PATCH v7 00/17] Don't generate netlink .rst files inside $(srctree) Donald Hunter
2025-06-20 10:33   ` Mauro Carvalho Chehab
