linux-doc.vger.kernel.org archive mirror
* [PATCH v5 00/15] Don't generate netlink .rst files inside $(srctree)
@ 2025-06-17  8:01 Mauro Carvalho Chehab
  2025-06-17  8:01 ` [PATCH v5 01/15] docs: conf.py: properly handle include and exclude patterns Mauro Carvalho Chehab
                   ` (15 more replies)
  0 siblings, 16 replies; 42+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-17  8:01 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Mauro Carvalho Chehab, linux-kernel,
	Akira Yokosawa, David S. Miller, Ignacio Encinas Rubio,
	Marco Elver, Shuah Khan, Donald Hunter, Eric Dumazet, Jan Stancek,
	Paolo Abeni, Ruben Wauters, joel, linux-kernel-mentees, lkmm,
	netdev, peterz, stern, Breno Leitao, Jakub Kicinski, Simon Horman

As discussed at:
   https://lore.kernel.org/all/20250610101331.62ba466f@foz.lan/

changeset f061c9f7d058 ("Documentation: Document each netlink family")
added logic that generates *.rst files inside $(srctree). This is bad
when O=<BUILDDIR> is used.

A recent change renamed the YAML files used by Netlink, revealing a bad
side effect: as "make cleandocs" doesn't clean the produced files, symbols
appear duplicated for people who don't build the kernel from scratch.

This series adds a YAML parser extension and uses an index file with a glob
pattern. We opted to write the extension in a way that no actual YAML
conversion code is inside it. This makes it flexible enough to handle other
types of YAML files in the future. The actual YAML conversion logic was
placed in netlink_yml_parser.py.
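The split described above can be sketched as follows. This is a hypothetical,
simplified model, not the actual extension code; all class and function names
here are illustrative:

```python
# Minimal sketch of the design above: the Sphinx-facing extension holds
# no YAML conversion logic; it only dispatches to a registered converter
# (the real one lives in netlink_yml_parser.py). Names are illustrative,
# not the actual kernel APIs.

def netlink_to_rst(spec: dict) -> str:
    """Stand-in for the Netlink-specific conversion library."""
    title = f"Family ``{spec['name']}`` netlink specification"
    bar = "=" * len(title)
    return f"{bar}\n{title}\n{bar}"

class YamlParserExtension:
    """Stand-in for the generic parser_yaml.py extension."""
    def __init__(self):
        self.converters = {}

    def register(self, kind: str, func):
        self.converters[kind] = func

    def parse(self, kind: str, spec: dict) -> str:
        # Other YAML flavors could be registered here later.
        return self.converters[kind](spec)

ext = YamlParserExtension()
ext.register("netlink", netlink_to_rst)
rst = ext.parse("netlink", {"name": "rt-link"})
print(rst.splitlines()[1])  # Family ``rt-link`` netlink specification
```

Because the extension only dispatches, adding a second YAML flavor later means
registering one more converter, with no change to the Sphinx-facing code.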

As requested by the YNL maintainers, this version has netlink_yml_parser.py
inside the tools/net/ynl/pyynl/ directory. I don't like mixing libraries with
binaries, nor having Python libraries spread all over the kernel. IMO, the
best approach is to put all of them in a common place (scripts/lib,
python/lib, lib/python, ...) but, as this can be solved later, let's keep it
this way for now.

---

v5:
- some patch reorg;
- netlink_yml_parser.py is now together with ynl tools;
- minor fixes.

v4:
- Renamed the YNL parser class;
- some minor patch cleanups and merges;
- added an extra patch to fix the insert_pattern/exclude_pattern logic when
   SPHINXDIRS is used.

v3:
- Two series got merged together:
  - https://lore.kernel.org/linux-doc/cover.1749723671.git.mchehab+huawei@kernel.org/T/#t
  - https://lore.kernel.org/linux-doc/cover.1749735022.git.mchehab+huawei@kernel.org

- Added an extra patch to update MAINTAINERS to point to YNL library
- Added a (somewhat unrelated) patch that removes the warnings check when
  running "make cleandocs".


Mauro Carvalho Chehab (15):
  docs: conf.py: properly handle include and exclude patterns
  docs: Makefile: disable check rules on make cleandocs
  tools: ynl_gen_rst.py: create a top-level reference
  docs: netlink: netlink-raw.rst: use :ref: instead of :doc:
  tools: ynl_gen_rst.py: make the index parser more generic
  tools: ynl_gen_rst.py: Split library from command line tool
  scripts: lib: netlink_yml_parser.py: use classes
  docs: netlink: index.rst: add a netlink index file
  tools: ynl_gen_rst.py: clanup coding style
  docs: sphinx: add a parser for yaml files for Netlink specs
  docs: use parser_yaml extension to handle Netlink specs
  docs: uapi: netlink: update netlink specs link
  tools: ynl_gen_rst.py: drop support for generating index files
  docs: netlink: remove obsolete .gitignore from unused directory
  MAINTAINERS: add netlink_yml_parser.py to linux-doc

 Documentation/Makefile                        |  19 +-
 Documentation/conf.py                         |  63 ++-
 Documentation/netlink/specs/index.rst         |  13 +
 Documentation/networking/index.rst            |   2 +-
 .../networking/netlink_spec/.gitignore        |   1 -
 .../networking/netlink_spec/readme.txt        |   4 -
 Documentation/sphinx/parser_yaml.py           |  76 ++++
 Documentation/userspace-api/netlink/index.rst |   2 +-
 .../userspace-api/netlink/netlink-raw.rst     |   6 +-
 Documentation/userspace-api/netlink/specs.rst |   2 +-
 MAINTAINERS                                   |   1 +
 tools/net/ynl/pyynl/netlink_yml_parser.py     | 360 ++++++++++++++++
 tools/net/ynl/pyynl/ynl_gen_rst.py            | 385 +-----------------
 13 files changed, 523 insertions(+), 411 deletions(-)
 create mode 100644 Documentation/netlink/specs/index.rst
 delete mode 100644 Documentation/networking/netlink_spec/.gitignore
 delete mode 100644 Documentation/networking/netlink_spec/readme.txt
 create mode 100755 Documentation/sphinx/parser_yaml.py
 create mode 100755 tools/net/ynl/pyynl/netlink_yml_parser.py

-- 
2.49.0



^ permalink raw reply	[flat|nested] 42+ messages in thread

* [PATCH v5 01/15] docs: conf.py: properly handle include and exclude patterns
  2025-06-17  8:01 [PATCH v5 00/15] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
@ 2025-06-17  8:01 ` Mauro Carvalho Chehab
  2025-06-17 10:38   ` Donald Hunter
  2025-06-18  2:42   ` Akira Yokosawa
  2025-06-17  8:01 ` [PATCH v5 02/15] docs: Makefile: disable check rules on make cleandocs Mauro Carvalho Chehab
                   ` (14 subsequent siblings)
  15 siblings, 2 replies; 42+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-17  8:01 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	joel, linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz,
	stern

When one does:
	make SPHINXDIRS="foo" htmldocs

all patterns become relative to Documentation/foo, which
causes include/exclude patterns like:

	include_patterns = [
		...
		f'foo/*.{ext}',
	]

to break. This is not what is expected. Address it by
adding logic to dynamically adjust the patterns when
SPHINXDIRS is used.

That allows adding parsers for other file types.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 Documentation/conf.py | 52 +++++++++++++++++++++++++++++++++++++++----
 1 file changed, 48 insertions(+), 4 deletions(-)

diff --git a/Documentation/conf.py b/Documentation/conf.py
index 12de52a2b17e..e887c1b786a4 100644
--- a/Documentation/conf.py
+++ b/Documentation/conf.py
@@ -17,6 +17,54 @@ import os
 import sphinx
 import shutil
 
+# Location of Documentation/ directory
+doctree = os.path.abspath('.')
+
+# List of patterns that don't contain directory names, in glob format.
+include_patterns = ['**.rst']
+exclude_patterns = []
+
+# List of patterns that contain directory names in glob format.
+dyn_include_patterns = []
+dyn_exclude_patterns = ['output']
+
+# Properly handle include/exclude patterns
+# ----------------------------------------
+
+def setup(app):
+    """
+    On Sphinx, all directories are relative to what it is passed as
+    SOURCEDIR parameter for sphinx-build. Due to that, all patterns
+    that have directory names on it need to be dynamically set, after
+    converting them to a relative path.
+
+    As Sphinx doesn't include any patterns outside SOURCEDIR, we should
+    exclude relative patterns that start with "../".
+    """
+
+    sourcedir = app.srcdir  # full path to the source directory
+    builddir = os.environ.get("BUILDDIR")
+
+    # setup include_patterns dynamically
+    for p in dyn_include_patterns:
+        full = os.path.join(doctree, p)
+
+        rel_path = os.path.relpath(full, start = app.srcdir)
+        if rel_path.startswith("../"):
+            continue
+
+        app.config.include_patterns.append(rel_path)
+
+    # setup exclude_patterns dynamically
+    for p in dyn_exclude_patterns:
+        full = os.path.join(doctree, p)
+
+        rel_path = os.path.relpath(full, start = app.srcdir)
+        if rel_path.startswith("../"):
+            continue
+
+        app.config.exclude_patterns.append(rel_path)
+
 # helper
 # ------
 
@@ -219,10 +267,6 @@ language = 'en'
 # Else, today_fmt is used as the format for a strftime call.
 #today_fmt = '%B %d, %Y'
 
-# List of patterns, relative to source directory, that match files and
-# directories to ignore when looking for source files.
-exclude_patterns = ['output']
-
 # The reST default role (used for this markup: `text`) to use for all
 # documents.
 #default_role = None
-- 
2.49.0
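The relpath-based filtering that the setup() hook in this patch performs can
be illustrated standalone. The paths and patterns below are made up for the
example, not the kernel's actual ones:

```python
import os.path

# Illustration of the filtering above: patterns anchored at
# Documentation/ are rebased onto the active SOURCEDIR, and anything
# that falls outside it (relative path starting with "../") is dropped.
doctree = "/src/Documentation"
dyn_exclude_patterns = ["output", "netlink/specs"]

def rebase(patterns, srcdir):
    out = []
    for p in patterns:
        rel = os.path.relpath(os.path.join(doctree, p), start=srcdir)
        if rel.startswith("../"):  # outside SOURCEDIR: skip
            continue
        out.append(rel)
    return out

# Full build: SOURCEDIR == Documentation/, all patterns survive
print(rebase(dyn_exclude_patterns, "/src/Documentation"))
# -> ['output', 'netlink/specs']

# SPHINXDIRS-style build rooted at a subdirectory: only patterns
# under that subtree survive, rebased onto it
print(rebase(dyn_exclude_patterns, "/src/Documentation/netlink"))
# -> ['specs']
```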



* [PATCH v5 02/15] docs: Makefile: disable check rules on make cleandocs
  2025-06-17  8:01 [PATCH v5 00/15] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
  2025-06-17  8:01 ` [PATCH v5 01/15] docs: conf.py: properly handle include and exclude patterns Mauro Carvalho Chehab
@ 2025-06-17  8:01 ` Mauro Carvalho Chehab
  2025-06-17  9:53   ` Breno Leitao
  2025-06-17 10:38   ` Donald Hunter
  2025-06-17  8:02 ` [PATCH v5 03/15] tools: ynl_gen_rst.py: create a top-level reference Mauro Carvalho Chehab
                   ` (13 subsequent siblings)
  15 siblings, 2 replies; 42+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-17  8:01 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	joel, linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz,
	stern

It doesn't make sense to check for missing ABI and documents
when cleaning the tree.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 Documentation/Makefile | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/Documentation/Makefile b/Documentation/Makefile
index d30d66ddf1ad..b98477df5ddf 100644
--- a/Documentation/Makefile
+++ b/Documentation/Makefile
@@ -5,6 +5,7 @@
 # for cleaning
 subdir- := devicetree/bindings
 
+ifneq ($(MAKECMDGOALS),cleandocs)
 # Check for broken documentation file references
 ifeq ($(CONFIG_WARN_MISSING_DOCUMENTS),y)
 $(shell $(srctree)/scripts/documentation-file-ref-check --warn)
@@ -14,6 +15,7 @@ endif
 ifeq ($(CONFIG_WARN_ABI_ERRORS),y)
 $(shell $(srctree)/scripts/get_abi.py --dir $(srctree)/Documentation/ABI validate)
 endif
+endif
 
 # You can set these variables from the command line.
 SPHINXBUILD   = sphinx-build
-- 
2.49.0
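The $(MAKECMDGOALS) guard used above can be demonstrated with a tiny
standalone makefile (the targets and messages are hypothetical, not the
kernel's):

```shell
# Demo of the guard: a parse-time check runs for normal goals but is
# skipped when the goal is "cleandocs".
cat > /tmp/guard.mk <<'EOF'
ifneq ($(MAKECMDGOALS),cleandocs)
$(info running doc checks)
endif
all: ; @echo build
cleandocs: ; @echo clean
EOF

make -f /tmp/guard.mk all        # prints "running doc checks", then "build"
make -f /tmp/guard.mk cleandocs  # prints only "clean"
```

Note that the checks in the kernel Makefile run via $(shell ...) at parse
time, which is why they must be guarded this way rather than per-target.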



* [PATCH v5 03/15] tools: ynl_gen_rst.py: create a top-level reference
  2025-06-17  8:01 [PATCH v5 00/15] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
  2025-06-17  8:01 ` [PATCH v5 01/15] docs: conf.py: properly handle include and exclude patterns Mauro Carvalho Chehab
  2025-06-17  8:01 ` [PATCH v5 02/15] docs: Makefile: disable check rules on make cleandocs Mauro Carvalho Chehab
@ 2025-06-17  8:02 ` Mauro Carvalho Chehab
  2025-06-17 10:40   ` Donald Hunter
  2025-06-17  8:02 ` [PATCH v5 04/15] docs: netlink: netlink-raw.rst: use :ref: instead of :doc: Mauro Carvalho Chehab
                   ` (12 subsequent siblings)
  15 siblings, 1 reply; 42+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-17  8:02 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Currently, rt documents are referred to with:

Documentation/userspace-api/netlink/netlink-raw.rst: :doc:`rt-link<../../networking/netlink_spec/rt-link>`
Documentation/userspace-api/netlink/netlink-raw.rst: :doc:`tc<../../networking/netlink_spec/tc>`
Documentation/userspace-api/netlink/netlink-raw.rst: :doc:`tc<../../networking/netlink_spec/tc>`

This is hard to maintain, and may break if we change the way
RST files are generated from YAML. It is better to use a
cross-reference to the netlink family instead.

So, add a netlink-<foo> reference to all generated docs.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 tools/net/ynl/pyynl/ynl_gen_rst.py | 5 +++--
 1 file changed, 3 insertions(+), 2 deletions(-)

diff --git a/tools/net/ynl/pyynl/ynl_gen_rst.py b/tools/net/ynl/pyynl/ynl_gen_rst.py
index 0cb6348e28d3..7bfb8ceeeefc 100755
--- a/tools/net/ynl/pyynl/ynl_gen_rst.py
+++ b/tools/net/ynl/pyynl/ynl_gen_rst.py
@@ -314,10 +314,11 @@ def parse_yaml(obj: Dict[str, Any]) -> str:
 
     # Main header
 
-    lines.append(rst_header())
-
     family = obj['name']
 
+    lines.append(rst_header())
+    lines.append(rst_label("netlink-" + family))
+
     title = f"Family ``{family}`` netlink specification"
     lines.append(rst_title(title))
     lines.append(rst_paragraph(".. contents:: :depth: 3\n"))
-- 
2.49.0
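The label that this patch emits at the top of each generated file can be
sketched as follows (mirrors rst_label() plus the "netlink-" prefix added in
parse_yaml(); the family name is just an example):

```python
# Sketch of the cross-reference target emitted by the patch above.

def rst_label(title: str) -> str:
    """Return a formatted reST label, as in ynl_gen_rst.py."""
    return f".. _{title}:\n\n"

family = "rt-link"
label = rst_label("netlink-" + family)
print(label, end="")  # .. _netlink-rt-link:
# Other documents can now link with :ref:`rt-link<netlink-rt-link>`
# no matter where the generated file ends up in the build tree.
```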



* [PATCH v5 04/15] docs: netlink: netlink-raw.rst: use :ref: instead of :doc:
  2025-06-17  8:01 [PATCH v5 00/15] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (2 preceding siblings ...)
  2025-06-17  8:02 ` [PATCH v5 03/15] tools: ynl_gen_rst.py: create a top-level reference Mauro Carvalho Chehab
@ 2025-06-17  8:02 ` Mauro Carvalho Chehab
  2025-06-17  8:02 ` [PATCH v5 05/15] tools: ynl_gen_rst.py: make the index parser more generic Mauro Carvalho Chehab
                   ` (11 subsequent siblings)
  15 siblings, 0 replies; 42+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-17  8:02 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Having :doc: references with relative paths doesn't always work,
as they may cause trouble when O= is used. So, replace them with the
Sphinx cross-reference tags that are now created by ynl_gen_rst.py.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 Documentation/userspace-api/netlink/netlink-raw.rst | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/Documentation/userspace-api/netlink/netlink-raw.rst b/Documentation/userspace-api/netlink/netlink-raw.rst
index 31fc91020eb3..aae296c170c5 100644
--- a/Documentation/userspace-api/netlink/netlink-raw.rst
+++ b/Documentation/userspace-api/netlink/netlink-raw.rst
@@ -62,8 +62,8 @@ Sub-messages
 ------------
 
 Several raw netlink families such as
-:doc:`rt-link<../../networking/netlink_spec/rt-link>` and
-:doc:`tc<../../networking/netlink_spec/tc>` use attribute nesting as an
+:ref:`rt-link<netlink-rt-link>` and
+:ref:`tc<netlink-tc>` use attribute nesting as an
 abstraction to carry module specific information.
 
 Conceptually it looks as follows::
@@ -162,7 +162,7 @@ then this is an error.
 Nested struct definitions
 -------------------------
 
-Many raw netlink families such as :doc:`tc<../../networking/netlink_spec/tc>`
+Many raw netlink families such as :ref:`tc<netlink-tc>`
 make use of nested struct definitions. The ``netlink-raw`` schema makes it
 possible to embed a struct within a struct definition using the ``struct``
 property. For example, the following struct definition embeds the
-- 
2.49.0



* [PATCH v5 05/15] tools: ynl_gen_rst.py: make the index parser more generic
  2025-06-17  8:01 [PATCH v5 00/15] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (3 preceding siblings ...)
  2025-06-17  8:02 ` [PATCH v5 04/15] docs: netlink: netlink-raw.rst: use :ref: instead of :doc: Mauro Carvalho Chehab
@ 2025-06-17  8:02 ` Mauro Carvalho Chehab
  2025-06-17 10:55   ` Donald Hunter
  2025-06-17 11:59   ` Simon Horman
  2025-06-17  8:02 ` [PATCH v5 06/15] tools: ynl_gen_rst.py: Split library from command line tool Mauro Carvalho Chehab
                   ` (10 subsequent siblings)
  15 siblings, 2 replies; 42+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-17  8:02 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

It is not good practice to store build-generated files
inside $(srctree), as one may be using O=<BUILDDIR> and even
have the kernel in a read-only directory.

Change the YAML generation for netlink files to allow it
to parse data based on either the source or the object tree.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 tools/net/ynl/pyynl/ynl_gen_rst.py | 22 ++++++++++++++++------
 1 file changed, 16 insertions(+), 6 deletions(-)

diff --git a/tools/net/ynl/pyynl/ynl_gen_rst.py b/tools/net/ynl/pyynl/ynl_gen_rst.py
index 7bfb8ceeeefc..b1e5acafb998 100755
--- a/tools/net/ynl/pyynl/ynl_gen_rst.py
+++ b/tools/net/ynl/pyynl/ynl_gen_rst.py
@@ -365,6 +365,7 @@ def parse_arguments() -> argparse.Namespace:
 
     parser.add_argument("-v", "--verbose", action="store_true")
     parser.add_argument("-o", "--output", help="Output file name")
+    parser.add_argument("-d", "--input_dir", help="YAML input directory")
 
     # Index and input are mutually exclusive
     group = parser.add_mutually_exclusive_group()
@@ -405,11 +406,14 @@ def write_to_rstfile(content: str, filename: str) -> None:
     """Write the generated content into an RST file"""
     logging.debug("Saving RST file to %s", filename)
 
+    dir = os.path.dirname(filename)
+    os.makedirs(dir, exist_ok=True)
+
     with open(filename, "w", encoding="utf-8") as rst_file:
         rst_file.write(content)
 
 
-def generate_main_index_rst(output: str) -> None:
+def generate_main_index_rst(output: str, index_dir: str) -> None:
     """Generate the `networking_spec/index` content and write to the file"""
     lines = []
 
@@ -418,12 +422,18 @@ def generate_main_index_rst(output: str) -> None:
     lines.append(rst_title("Netlink Family Specifications"))
     lines.append(rst_toctree(1))
 
-    index_dir = os.path.dirname(output)
-    logging.debug("Looking for .rst files in %s", index_dir)
+    index_fname = os.path.basename(output)
+    base, ext = os.path.splitext(index_fname)
+
+    if not index_dir:
+        index_dir = os.path.dirname(output)
+
+    logging.debug(f"Looking for {ext} files in %s", index_dir)
     for filename in sorted(os.listdir(index_dir)):
-        if not filename.endswith(".rst") or filename == "index.rst":
+        if not filename.endswith(ext) or filename == index_fname:
             continue
-        lines.append(f"   {filename.replace('.rst', '')}\n")
+        base, ext = os.path.splitext(filename)
+        lines.append(f"   {base}\n")
 
     logging.debug("Writing an index file at %s", output)
     write_to_rstfile("".join(lines), output)
@@ -447,7 +457,7 @@ def main() -> None:
 
     if args.index:
         # Generate the index RST file
-        generate_main_index_rst(args.output)
+        generate_main_index_rst(args.output, args.input_dir)
 
 
 if __name__ == "__main__":
-- 
2.49.0
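The generic index scan introduced above can be illustrated standalone. The
file names below are made up for the example:

```python
import os
import tempfile

# Standalone illustration of the index scan above: any file sharing the
# index's own extension, except the index itself, becomes a toctree entry.
with tempfile.TemporaryDirectory() as d:
    for name in ("rt-link.rst", "tc.rst", "index.rst", "notes.txt"):
        open(os.path.join(d, name), "w").close()

    output = os.path.join(d, "index.rst")
    index_fname = os.path.basename(output)
    _, ext = os.path.splitext(index_fname)

    entries = []
    for filename in sorted(os.listdir(d)):
        if not filename.endswith(ext) or filename == index_fname:
            continue
        base, _ = os.path.splitext(filename)
        entries.append(f"   {base}")

    print(entries)  # ['   rt-link', '   tc']
```

Deriving the extension from the index file's own name is what makes the
parser generic: an index named `index.txt` would collect `.txt` files instead.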



* [PATCH v5 06/15] tools: ynl_gen_rst.py: Split library from command line tool
  2025-06-17  8:01 [PATCH v5 00/15] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (4 preceding siblings ...)
  2025-06-17  8:02 ` [PATCH v5 05/15] tools: ynl_gen_rst.py: make the index parser more generic Mauro Carvalho Chehab
@ 2025-06-17  8:02 ` Mauro Carvalho Chehab
  2025-06-17 11:08   ` Donald Hunter
  2025-06-17  8:02 ` [PATCH v5 07/15] scripts: lib: netlink_yml_parser.py: use classes Mauro Carvalho Chehab
                   ` (9 subsequent siblings)
  15 siblings, 1 reply; 42+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-17  8:02 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

As we'll be using the Netlink specs parser inside a Sphinx
extension, move the library part out of the command line tool.

No functional changes.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 tools/net/ynl/pyynl/netlink_yml_parser.py | 391 ++++++++++++++++++++++
 tools/net/ynl/pyynl/ynl_gen_rst.py        | 374 +--------------------
 2 files changed, 401 insertions(+), 364 deletions(-)
 create mode 100755 tools/net/ynl/pyynl/netlink_yml_parser.py

diff --git a/tools/net/ynl/pyynl/netlink_yml_parser.py b/tools/net/ynl/pyynl/netlink_yml_parser.py
new file mode 100755
index 000000000000..3c15b578f947
--- /dev/null
+++ b/tools/net/ynl/pyynl/netlink_yml_parser.py
@@ -0,0 +1,391 @@
+#!/usr/bin/env python3
+# SPDX-License-Identifier: GPL-2.0
+# -*- coding: utf-8; mode: python -*-
+
+"""
+    Script to auto generate the documentation for Netlink specifications.
+
+    :copyright:  Copyright (C) 2023  Breno Leitao <leitao@debian.org>
+    :license:    GPL Version 2, June 1991 see linux/COPYING for details.
+
+    This script performs extensive parsing to the Linux kernel's netlink YAML
+    spec files, in an effort to avoid needing to heavily mark up the original
+    YAML file.
+
+    This code is split in three big parts:
+        1) RST formatters: Use to convert a string to a RST output
+        2) Parser helpers: Functions to parse the YAML data structure
+        3) Main function and small helpers
+"""
+
+from typing import Any, Dict, List
+import os.path
+import logging
+import yaml
+
+
+SPACE_PER_LEVEL = 4
+
+
+# RST Formatters
+# ==============
+def headroom(level: int) -> str:
+    """Return space to format"""
+    return " " * (level * SPACE_PER_LEVEL)
+
+
+def bold(text: str) -> str:
+    """Format bold text"""
+    return f"**{text}**"
+
+
+def inline(text: str) -> str:
+    """Format inline text"""
+    return f"``{text}``"
+
+
+def sanitize(text: str) -> str:
+    """Remove newlines and multiple spaces"""
+    # This is useful for some fields that are spread across multiple lines
+    return str(text).replace("\n", " ").strip()
+
+
+def rst_fields(key: str, value: str, level: int = 0) -> str:
+    """Return a RST formatted field"""
+    return headroom(level) + f":{key}: {value}"
+
+
+def rst_definition(key: str, value: Any, level: int = 0) -> str:
+    """Format a single rst definition"""
+    return headroom(level) + key + "\n" + headroom(level + 1) + str(value)
+
+
+def rst_paragraph(paragraph: str, level: int = 0) -> str:
+    """Return a formatted paragraph"""
+    return headroom(level) + paragraph
+
+
+def rst_bullet(item: str, level: int = 0) -> str:
+    """Return a formatted a bullet"""
+    return headroom(level) + f"- {item}"
+
+
+def rst_subsection(title: str) -> str:
+    """Add a sub-section to the document"""
+    return f"{title}\n" + "-" * len(title)
+
+
+def rst_subsubsection(title: str) -> str:
+    """Add a sub-sub-section to the document"""
+    return f"{title}\n" + "~" * len(title)
+
+
+def rst_section(namespace: str, prefix: str, title: str) -> str:
+    """Add a section to the document"""
+    return f".. _{namespace}-{prefix}-{title}:\n\n{title}\n" + "=" * len(title)
+
+
+def rst_subtitle(title: str) -> str:
+    """Add a subtitle to the document"""
+    return "\n" + "-" * len(title) + f"\n{title}\n" + "-" * len(title) + "\n\n"
+
+
+def rst_title(title: str) -> str:
+    """Add a title to the document"""
+    return "=" * len(title) + f"\n{title}\n" + "=" * len(title) + "\n\n"
+
+
+def rst_list_inline(list_: List[str], level: int = 0) -> str:
+    """Format a list using inlines"""
+    return headroom(level) + "[" + ", ".join(inline(i) for i in list_) + "]"
+
+
+def rst_ref(namespace: str, prefix: str, name: str) -> str:
+    """Add a hyperlink to the document"""
+    mappings = {'enum': 'definition',
+                'fixed-header': 'definition',
+                'nested-attributes': 'attribute-set',
+                'struct': 'definition'}
+    if prefix in mappings:
+        prefix = mappings[prefix]
+    return f":ref:`{namespace}-{prefix}-{name}`"
+
+
+def rst_header() -> str:
+    """The headers for all the auto generated RST files"""
+    lines = []
+
+    lines.append(rst_paragraph(".. SPDX-License-Identifier: GPL-2.0"))
+    lines.append(rst_paragraph(".. NOTE: This document was auto-generated.\n\n"))
+
+    return "\n".join(lines)
+
+
+def rst_toctree(maxdepth: int = 2) -> str:
+    """Generate a toctree RST primitive"""
+    lines = []
+
+    lines.append(".. toctree::")
+    lines.append(f"   :maxdepth: {maxdepth}\n\n")
+
+    return "\n".join(lines)
+
+
+def rst_label(title: str) -> str:
+    """Return a formatted label"""
+    return f".. _{title}:\n\n"
+
+
+# Parsers
+# =======
+
+
+def parse_mcast_group(mcast_group: List[Dict[str, Any]]) -> str:
+    """Parse 'multicast' group list and return a formatted string"""
+    lines = []
+    for group in mcast_group:
+        lines.append(rst_bullet(group["name"]))
+
+    return "\n".join(lines)
+
+
+def parse_do(do_dict: Dict[str, Any], level: int = 0) -> str:
+    """Parse 'do' section and return a formatted string"""
+    lines = []
+    for key in do_dict.keys():
+        lines.append(rst_paragraph(bold(key), level + 1))
+        if key in ['request', 'reply']:
+            lines.append(parse_do_attributes(do_dict[key], level + 1) + "\n")
+        else:
+            lines.append(headroom(level + 2) + do_dict[key] + "\n")
+
+    return "\n".join(lines)
+
+
+def parse_do_attributes(attrs: Dict[str, Any], level: int = 0) -> str:
+    """Parse 'attributes' section"""
+    if "attributes" not in attrs:
+        return ""
+    lines = [rst_fields("attributes", rst_list_inline(attrs["attributes"]), level + 1)]
+
+    return "\n".join(lines)
+
+
+def parse_operations(operations: List[Dict[str, Any]], namespace: str) -> str:
+    """Parse operations block"""
+    preprocessed = ["name", "doc", "title", "do", "dump", "flags"]
+    linkable = ["fixed-header", "attribute-set"]
+    lines = []
+
+    for operation in operations:
+        lines.append(rst_section(namespace, 'operation', operation["name"]))
+        lines.append(rst_paragraph(operation["doc"]) + "\n")
+
+        for key in operation.keys():
+            if key in preprocessed:
+                # Skip the special fields
+                continue
+            value = operation[key]
+            if key in linkable:
+                value = rst_ref(namespace, key, value)
+            lines.append(rst_fields(key, value, 0))
+        if 'flags' in operation:
+            lines.append(rst_fields('flags', rst_list_inline(operation['flags'])))
+
+        if "do" in operation:
+            lines.append(rst_paragraph(":do:", 0))
+            lines.append(parse_do(operation["do"], 0))
+        if "dump" in operation:
+            lines.append(rst_paragraph(":dump:", 0))
+            lines.append(parse_do(operation["dump"], 0))
+
+        # New line after fields
+        lines.append("\n")
+
+    return "\n".join(lines)
+
+
+def parse_entries(entries: List[Dict[str, Any]], level: int) -> str:
+    """Parse a list of entries"""
+    ignored = ["pad"]
+    lines = []
+    for entry in entries:
+        if isinstance(entry, dict):
+            # entries could be a list or a dictionary
+            field_name = entry.get("name", "")
+            if field_name in ignored:
+                continue
+            type_ = entry.get("type")
+            if type_:
+                field_name += f" ({inline(type_)})"
+            lines.append(
+                rst_fields(field_name, sanitize(entry.get("doc", "")), level)
+            )
+        elif isinstance(entry, list):
+            lines.append(rst_list_inline(entry, level))
+        else:
+            lines.append(rst_bullet(inline(sanitize(entry)), level))
+
+    lines.append("\n")
+    return "\n".join(lines)
+
+
+def parse_definitions(defs: Dict[str, Any], namespace: str) -> str:
+    """Parse definitions section"""
+    preprocessed = ["name", "entries", "members"]
+    ignored = ["render-max"]  # This is not printed
+    lines = []
+
+    for definition in defs:
+        lines.append(rst_section(namespace, 'definition', definition["name"]))
+        for k in definition.keys():
+            if k in preprocessed + ignored:
+                continue
+            lines.append(rst_fields(k, sanitize(definition[k]), 0))
+
+        # Field list needs to finish with a new line
+        lines.append("\n")
+        if "entries" in definition:
+            lines.append(rst_paragraph(":entries:", 0))
+            lines.append(parse_entries(definition["entries"], 1))
+        if "members" in definition:
+            lines.append(rst_paragraph(":members:", 0))
+            lines.append(parse_entries(definition["members"], 1))
+
+    return "\n".join(lines)
+
+
+def parse_attr_sets(entries: List[Dict[str, Any]], namespace: str) -> str:
+    """Parse attribute from attribute-set"""
+    preprocessed = ["name", "type"]
+    linkable = ["enum", "nested-attributes", "struct", "sub-message"]
+    ignored = ["checks"]
+    lines = []
+
+    for entry in entries:
+        lines.append(rst_section(namespace, 'attribute-set', entry["name"]))
+        for attr in entry["attributes"]:
+            type_ = attr.get("type")
+            attr_line = attr["name"]
+            if type_:
+                # Add the attribute type in the same line
+                attr_line += f" ({inline(type_)})"
+
+            lines.append(rst_subsubsection(attr_line))
+
+            for k in attr.keys():
+                if k in preprocessed + ignored:
+                    continue
+                if k in linkable:
+                    value = rst_ref(namespace, k, attr[k])
+                else:
+                    value = sanitize(attr[k])
+                lines.append(rst_fields(k, value, 0))
+            lines.append("\n")
+
+    return "\n".join(lines)
+
+
+def parse_sub_messages(entries: List[Dict[str, Any]], namespace: str) -> str:
+    """Parse sub-message definitions"""
+    lines = []
+
+    for entry in entries:
+        lines.append(rst_section(namespace, 'sub-message', entry["name"]))
+        for fmt in entry["formats"]:
+            value = fmt["value"]
+
+            lines.append(rst_bullet(bold(value)))
+            for attr in ['fixed-header', 'attribute-set']:
+                if attr in fmt:
+                    lines.append(rst_fields(attr,
+                                            rst_ref(namespace, attr, fmt[attr]),
+                                            1))
+            lines.append("\n")
+
+    return "\n".join(lines)
+
+
+def parse_yaml(obj: Dict[str, Any]) -> str:
+    """Format the whole YAML into a RST string"""
+    lines = []
+
+    # Main header
+
+    family = obj['name']
+
+    lines.append(rst_header())
+    lines.append(rst_label("netlink-" + family))
+
+    title = f"Family ``{family}`` netlink specification"
+    lines.append(rst_title(title))
+    lines.append(rst_paragraph(".. contents:: :depth: 3\n"))
+
+    if "doc" in obj:
+        lines.append(rst_subtitle("Summary"))
+        lines.append(rst_paragraph(obj["doc"], 0))
+
+    # Operations
+    if "operations" in obj:
+        lines.append(rst_subtitle("Operations"))
+        lines.append(parse_operations(obj["operations"]["list"], family))
+
+    # Multicast groups
+    if "mcast-groups" in obj:
+        lines.append(rst_subtitle("Multicast groups"))
+        lines.append(parse_mcast_group(obj["mcast-groups"]["list"]))
+
+    # Definitions
+    if "definitions" in obj:
+        lines.append(rst_subtitle("Definitions"))
+        lines.append(parse_definitions(obj["definitions"], family))
+
+    # Attributes set
+    if "attribute-sets" in obj:
+        lines.append(rst_subtitle("Attribute sets"))
+        lines.append(parse_attr_sets(obj["attribute-sets"], family))
+
+    # Sub-messages
+    if "sub-messages" in obj:
+        lines.append(rst_subtitle("Sub-messages"))
+        lines.append(parse_sub_messages(obj["sub-messages"], family))
+
+    return "\n".join(lines)
+
+
+# Main functions
+# ==============
+
+
+def parse_yaml_file(filename: str) -> str:
+    """Transform the YAML specified by filename into an RST-formatted string"""
+    with open(filename, "r", encoding="utf-8") as spec_file:
+        yaml_data = yaml.safe_load(spec_file)
+        content = parse_yaml(yaml_data)
+
+    return content
+
+
+def generate_main_index_rst(output: str, index_dir: str) -> str:
+    """Generate the `networking_spec/index` content and write to the file"""
+    lines = []
+
+    lines.append(rst_header())
+    lines.append(rst_label("specs"))
+    lines.append(rst_title("Netlink Family Specifications"))
+    lines.append(rst_toctree(1))
+
+    index_fname = os.path.basename(output)
+    base, ext = os.path.splitext(index_fname)
+
+    if not index_dir:
+        index_dir = os.path.dirname(output)
+
+    logging.debug(f"Looking for {ext} files in %s", index_dir)
+    for filename in sorted(os.listdir(index_dir)):
+        if not filename.endswith(ext) or filename == index_fname:
+            continue
+        base, ext = os.path.splitext(filename)
+        lines.append(f"   {base}\n")
+
+    return "".join(lines), output
diff --git a/tools/net/ynl/pyynl/ynl_gen_rst.py b/tools/net/ynl/pyynl/ynl_gen_rst.py
index b1e5acafb998..38dafe3d9179 100755
--- a/tools/net/ynl/pyynl/ynl_gen_rst.py
+++ b/tools/net/ynl/pyynl/ynl_gen_rst.py
@@ -18,345 +18,17 @@
         3) Main function and small helpers
 """
 
-from typing import Any, Dict, List
 import os.path
 import sys
 import argparse
 import logging
-import yaml
 
+LIB_DIR = "../../../../scripts/lib"
+SRC_DIR = os.path.dirname(os.path.realpath(__file__))
 
-SPACE_PER_LEVEL = 4
+sys.path.insert(0, os.path.join(SRC_DIR, LIB_DIR))
 
-
-# RST Formatters
-# ==============
-def headroom(level: int) -> str:
-    """Return space to format"""
-    return " " * (level * SPACE_PER_LEVEL)
-
-
-def bold(text: str) -> str:
-    """Format bold text"""
-    return f"**{text}**"
-
-
-def inline(text: str) -> str:
-    """Format inline text"""
-    return f"``{text}``"
-
-
-def sanitize(text: str) -> str:
-    """Remove newlines and multiple spaces"""
-    # This is useful for some fields that are spread across multiple lines
-    return str(text).replace("\n", " ").strip()
-
-
-def rst_fields(key: str, value: str, level: int = 0) -> str:
-    """Return a RST formatted field"""
-    return headroom(level) + f":{key}: {value}"
-
-
-def rst_definition(key: str, value: Any, level: int = 0) -> str:
-    """Format a single rst definition"""
-    return headroom(level) + key + "\n" + headroom(level + 1) + str(value)
-
-
-def rst_paragraph(paragraph: str, level: int = 0) -> str:
-    """Return a formatted paragraph"""
-    return headroom(level) + paragraph
-
-
-def rst_bullet(item: str, level: int = 0) -> str:
-    """Return a formatted a bullet"""
-    return headroom(level) + f"- {item}"
-
-
-def rst_subsection(title: str) -> str:
-    """Add a sub-section to the document"""
-    return f"{title}\n" + "-" * len(title)
-
-
-def rst_subsubsection(title: str) -> str:
-    """Add a sub-sub-section to the document"""
-    return f"{title}\n" + "~" * len(title)
-
-
-def rst_section(namespace: str, prefix: str, title: str) -> str:
-    """Add a section to the document"""
-    return f".. _{namespace}-{prefix}-{title}:\n\n{title}\n" + "=" * len(title)
-
-
-def rst_subtitle(title: str) -> str:
-    """Add a subtitle to the document"""
-    return "\n" + "-" * len(title) + f"\n{title}\n" + "-" * len(title) + "\n\n"
-
-
-def rst_title(title: str) -> str:
-    """Add a title to the document"""
-    return "=" * len(title) + f"\n{title}\n" + "=" * len(title) + "\n\n"
-
-
-def rst_list_inline(list_: List[str], level: int = 0) -> str:
-    """Format a list using inlines"""
-    return headroom(level) + "[" + ", ".join(inline(i) for i in list_) + "]"
-
-
-def rst_ref(namespace: str, prefix: str, name: str) -> str:
-    """Add a hyperlink to the document"""
-    mappings = {'enum': 'definition',
-                'fixed-header': 'definition',
-                'nested-attributes': 'attribute-set',
-                'struct': 'definition'}
-    if prefix in mappings:
-        prefix = mappings[prefix]
-    return f":ref:`{namespace}-{prefix}-{name}`"
-
-
-def rst_header() -> str:
-    """The headers for all the auto generated RST files"""
-    lines = []
-
-    lines.append(rst_paragraph(".. SPDX-License-Identifier: GPL-2.0"))
-    lines.append(rst_paragraph(".. NOTE: This document was auto-generated.\n\n"))
-
-    return "\n".join(lines)
-
-
-def rst_toctree(maxdepth: int = 2) -> str:
-    """Generate a toctree RST primitive"""
-    lines = []
-
-    lines.append(".. toctree::")
-    lines.append(f"   :maxdepth: {maxdepth}\n\n")
-
-    return "\n".join(lines)
-
-
-def rst_label(title: str) -> str:
-    """Return a formatted label"""
-    return f".. _{title}:\n\n"
-
-
-# Parsers
-# =======
-
-
-def parse_mcast_group(mcast_group: List[Dict[str, Any]]) -> str:
-    """Parse 'multicast' group list and return a formatted string"""
-    lines = []
-    for group in mcast_group:
-        lines.append(rst_bullet(group["name"]))
-
-    return "\n".join(lines)
-
-
-def parse_do(do_dict: Dict[str, Any], level: int = 0) -> str:
-    """Parse 'do' section and return a formatted string"""
-    lines = []
-    for key in do_dict.keys():
-        lines.append(rst_paragraph(bold(key), level + 1))
-        if key in ['request', 'reply']:
-            lines.append(parse_do_attributes(do_dict[key], level + 1) + "\n")
-        else:
-            lines.append(headroom(level + 2) + do_dict[key] + "\n")
-
-    return "\n".join(lines)
-
-
-def parse_do_attributes(attrs: Dict[str, Any], level: int = 0) -> str:
-    """Parse 'attributes' section"""
-    if "attributes" not in attrs:
-        return ""
-    lines = [rst_fields("attributes", rst_list_inline(attrs["attributes"]), level + 1)]
-
-    return "\n".join(lines)
-
-
-def parse_operations(operations: List[Dict[str, Any]], namespace: str) -> str:
-    """Parse operations block"""
-    preprocessed = ["name", "doc", "title", "do", "dump", "flags"]
-    linkable = ["fixed-header", "attribute-set"]
-    lines = []
-
-    for operation in operations:
-        lines.append(rst_section(namespace, 'operation', operation["name"]))
-        lines.append(rst_paragraph(operation["doc"]) + "\n")
-
-        for key in operation.keys():
-            if key in preprocessed:
-                # Skip the special fields
-                continue
-            value = operation[key]
-            if key in linkable:
-                value = rst_ref(namespace, key, value)
-            lines.append(rst_fields(key, value, 0))
-        if 'flags' in operation:
-            lines.append(rst_fields('flags', rst_list_inline(operation['flags'])))
-
-        if "do" in operation:
-            lines.append(rst_paragraph(":do:", 0))
-            lines.append(parse_do(operation["do"], 0))
-        if "dump" in operation:
-            lines.append(rst_paragraph(":dump:", 0))
-            lines.append(parse_do(operation["dump"], 0))
-
-        # New line after fields
-        lines.append("\n")
-
-    return "\n".join(lines)
-
-
-def parse_entries(entries: List[Dict[str, Any]], level: int) -> str:
-    """Parse a list of entries"""
-    ignored = ["pad"]
-    lines = []
-    for entry in entries:
-        if isinstance(entry, dict):
-            # entries could be a list or a dictionary
-            field_name = entry.get("name", "")
-            if field_name in ignored:
-                continue
-            type_ = entry.get("type")
-            if type_:
-                field_name += f" ({inline(type_)})"
-            lines.append(
-                rst_fields(field_name, sanitize(entry.get("doc", "")), level)
-            )
-        elif isinstance(entry, list):
-            lines.append(rst_list_inline(entry, level))
-        else:
-            lines.append(rst_bullet(inline(sanitize(entry)), level))
-
-    lines.append("\n")
-    return "\n".join(lines)
-
-
-def parse_definitions(defs: Dict[str, Any], namespace: str) -> str:
-    """Parse definitions section"""
-    preprocessed = ["name", "entries", "members"]
-    ignored = ["render-max"]  # This is not printed
-    lines = []
-
-    for definition in defs:
-        lines.append(rst_section(namespace, 'definition', definition["name"]))
-        for k in definition.keys():
-            if k in preprocessed + ignored:
-                continue
-            lines.append(rst_fields(k, sanitize(definition[k]), 0))
-
-        # Field list needs to finish with a new line
-        lines.append("\n")
-        if "entries" in definition:
-            lines.append(rst_paragraph(":entries:", 0))
-            lines.append(parse_entries(definition["entries"], 1))
-        if "members" in definition:
-            lines.append(rst_paragraph(":members:", 0))
-            lines.append(parse_entries(definition["members"], 1))
-
-    return "\n".join(lines)
-
-
-def parse_attr_sets(entries: List[Dict[str, Any]], namespace: str) -> str:
-    """Parse attribute from attribute-set"""
-    preprocessed = ["name", "type"]
-    linkable = ["enum", "nested-attributes", "struct", "sub-message"]
-    ignored = ["checks"]
-    lines = []
-
-    for entry in entries:
-        lines.append(rst_section(namespace, 'attribute-set', entry["name"]))
-        for attr in entry["attributes"]:
-            type_ = attr.get("type")
-            attr_line = attr["name"]
-            if type_:
-                # Add the attribute type in the same line
-                attr_line += f" ({inline(type_)})"
-
-            lines.append(rst_subsubsection(attr_line))
-
-            for k in attr.keys():
-                if k in preprocessed + ignored:
-                    continue
-                if k in linkable:
-                    value = rst_ref(namespace, k, attr[k])
-                else:
-                    value = sanitize(attr[k])
-                lines.append(rst_fields(k, value, 0))
-            lines.append("\n")
-
-    return "\n".join(lines)
-
-
-def parse_sub_messages(entries: List[Dict[str, Any]], namespace: str) -> str:
-    """Parse sub-message definitions"""
-    lines = []
-
-    for entry in entries:
-        lines.append(rst_section(namespace, 'sub-message', entry["name"]))
-        for fmt in entry["formats"]:
-            value = fmt["value"]
-
-            lines.append(rst_bullet(bold(value)))
-            for attr in ['fixed-header', 'attribute-set']:
-                if attr in fmt:
-                    lines.append(rst_fields(attr,
-                                            rst_ref(namespace, attr, fmt[attr]),
-                                            1))
-            lines.append("\n")
-
-    return "\n".join(lines)
-
-
-def parse_yaml(obj: Dict[str, Any]) -> str:
-    """Format the whole YAML into a RST string"""
-    lines = []
-
-    # Main header
-
-    family = obj['name']
-
-    lines.append(rst_header())
-    lines.append(rst_label("netlink-" + family))
-
-    title = f"Family ``{family}`` netlink specification"
-    lines.append(rst_title(title))
-    lines.append(rst_paragraph(".. contents:: :depth: 3\n"))
-
-    if "doc" in obj:
-        lines.append(rst_subtitle("Summary"))
-        lines.append(rst_paragraph(obj["doc"], 0))
-
-    # Operations
-    if "operations" in obj:
-        lines.append(rst_subtitle("Operations"))
-        lines.append(parse_operations(obj["operations"]["list"], family))
-
-    # Multicast groups
-    if "mcast-groups" in obj:
-        lines.append(rst_subtitle("Multicast groups"))
-        lines.append(parse_mcast_group(obj["mcast-groups"]["list"]))
-
-    # Definitions
-    if "definitions" in obj:
-        lines.append(rst_subtitle("Definitions"))
-        lines.append(parse_definitions(obj["definitions"], family))
-
-    # Attributes set
-    if "attribute-sets" in obj:
-        lines.append(rst_subtitle("Attribute sets"))
-        lines.append(parse_attr_sets(obj["attribute-sets"], family))
-
-    # Sub-messages
-    if "sub-messages" in obj:
-        lines.append(rst_subtitle("Sub-messages"))
-        lines.append(parse_sub_messages(obj["sub-messages"], family))
-
-    return "\n".join(lines)
-
-
-# Main functions
-# ==============
+from netlink_yml_parser import parse_yaml_file, generate_main_index_rst
 
 
 def parse_arguments() -> argparse.Namespace:
@@ -393,50 +65,24 @@ def parse_arguments() -> argparse.Namespace:
     return args
 
 
-def parse_yaml_file(filename: str) -> str:
-    """Transform the YAML specified by filename into an RST-formatted string"""
-    with open(filename, "r", encoding="utf-8") as spec_file:
-        yaml_data = yaml.safe_load(spec_file)
-        content = parse_yaml(yaml_data)
-
-    return content
-
-
 def write_to_rstfile(content: str, filename: str) -> None:
     """Write the generated content into an RST file"""
     logging.debug("Saving RST file to %s", filename)
 
-    dir = os.path.dirname(filename)
-    os.makedirs(dir, exist_ok=True)
+    directory = os.path.dirname(filename)
+    os.makedirs(directory, exist_ok=True)
 
     with open(filename, "w", encoding="utf-8") as rst_file:
         rst_file.write(content)
 
 
-def generate_main_index_rst(output: str, index_dir: str) -> None:
+def write_index_rst(output: str, index_dir: str) -> None:
     """Generate the `networking_spec/index` content and write to the file"""
-    lines = []
 
-    lines.append(rst_header())
-    lines.append(rst_label("specs"))
-    lines.append(rst_title("Netlink Family Specifications"))
-    lines.append(rst_toctree(1))
-
-    index_fname = os.path.basename(output)
-    base, ext = os.path.splitext(index_fname)
-
-    if not index_dir:
-        index_dir = os.path.dirname(output)
-
-    logging.debug(f"Looking for {ext} files in %s", index_dir)
-    for filename in sorted(os.listdir(index_dir)):
-        if not filename.endswith(ext) or filename == index_fname:
-            continue
-        base, ext = os.path.splitext(filename)
-        lines.append(f"   {base}\n")
+    msg = generate_main_index_rst(output, index_dir)
 
     logging.debug("Writing an index file at %s", output)
-    write_to_rstfile("".join(lines), output)
+    write_to_rstfile(msg, output)
 
 
 def main() -> None:
@@ -457,7 +103,7 @@ def main() -> None:
 
     if args.index:
         # Generate the index RST file
-        generate_main_index_rst(args.output, args.input_dir)
+        write_index_rst(args.output, args.input_dir)
 
 
 if __name__ == "__main__":
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 42+ messages in thread

* [PATCH v5 07/15] scripts: lib: netlink_yml_parser.py: use classes
  2025-06-17  8:01 [PATCH v5 00/15] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (5 preceding siblings ...)
  2025-06-17  8:02 ` [PATCH v5 06/15] tools: ynl_gen_rst.py: Split library from command line tool Mauro Carvalho Chehab
@ 2025-06-17  8:02 ` Mauro Carvalho Chehab
  2025-06-17  8:02 ` [PATCH v5 08/15] docs: netlink: index.rst: add a netlink index file Mauro Carvalho Chehab
                   ` (8 subsequent siblings)
  15 siblings, 0 replies; 42+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-17  8:02 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

As we'll be importing the netlink parser into a Sphinx extension,
move all functions and global variables into two classes:

- RstFormatters: contains the ReST formatting logic, which is
  YAML-independent;
- YnlDocGenerator: contains the actual parser logic. This is
  the only class that needs to be imported by the script or by
  a Sphinx extension.

With that, we won't pollute the Sphinx namespace, avoiding any
potential name clashes.
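
To illustrate the intent, here is a minimal, simplified stand-in (NOT
the real tools/net/ynl code; names follow the patch, bodies are
abridged) showing how the split keeps the formatter helpers
encapsulated, so a Sphinx extension only ever imports one name:

```python
# Simplified sketch of the structure this patch introduces:
# YAML-independent formatters grouped as (mostly static) methods,
# consumed through a single importable generator class.

class RstFormatters:
    """ReST formatting helpers; no YAML knowledge."""
    SPACE_PER_LEVEL = 4

    @staticmethod
    def headroom(level: int) -> str:
        """Indentation string for the given nesting level."""
        return " " * (level * RstFormatters.SPACE_PER_LEVEL)

    @staticmethod
    def bold(text: str) -> str:
        """Format bold text."""
        return f"**{text}**"

    def rst_bullet(self, item: str, level: int = 0) -> str:
        """Return a formatted bullet."""
        return self.headroom(level) + f"- {item}"


class YnlDocGenerator:
    """The only class a Sphinx extension would need to import."""
    fmt = RstFormatters()

    def parse_mcast_group(self, groups):
        """Parse a 'multicast' group list into a bullet list."""
        return "\n".join(self.fmt.rst_bullet(g["name"]) for g in groups)


gen = YnlDocGenerator()
print(gen.parse_mcast_group([{"name": "config"}, {"name": "monitor"}]))
```

The parser methods keep their former bodies and simply gain a
`self.fmt.` prefix for formatter calls, as the diff below shows.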

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 tools/net/ynl/pyynl/netlink_yml_parser.py | 594 +++++++++++-----------
 tools/net/ynl/pyynl/ynl_gen_rst.py        |  19 +-
 2 files changed, 314 insertions(+), 299 deletions(-)

diff --git a/tools/net/ynl/pyynl/netlink_yml_parser.py b/tools/net/ynl/pyynl/netlink_yml_parser.py
index 3c15b578f947..839e78b39de3 100755
--- a/tools/net/ynl/pyynl/netlink_yml_parser.py
+++ b/tools/net/ynl/pyynl/netlink_yml_parser.py
@@ -3,389 +3,407 @@
 # -*- coding: utf-8; mode: python -*-
 
 """
-    Script to auto generate the documentation for Netlink specifications.
+    Class to auto generate the documentation for Netlink specifications.
 
     :copyright:  Copyright (C) 2023  Breno Leitao <leitao@debian.org>
     :license:    GPL Version 2, June 1991 see linux/COPYING for details.
 
-    This script performs extensive parsing to the Linux kernel's netlink YAML
+    This class performs extensive parsing of the Linux kernel's netlink YAML
     spec files, in an effort to avoid needing to heavily mark up the original
     YAML file.
 
-    This code is split in three big parts:
+    This code is split into two classes:
         1) RST formatters: Use to convert a string to a RST output
-        2) Parser helpers: Functions to parse the YAML data structure
-        3) Main function and small helpers
+        2) YAML Netlink (YNL) doc generator: Generate docs from YAML data
 """
 
 from typing import Any, Dict, List
 import os.path
+import sys
+import argparse
 import logging
 import yaml
 
 
-SPACE_PER_LEVEL = 4
-
-
+# ==============
 # RST Formatters
 # ==============
-def headroom(level: int) -> str:
-    """Return space to format"""
-    return " " * (level * SPACE_PER_LEVEL)
+class RstFormatters:
+    SPACE_PER_LEVEL = 4
 
+    @staticmethod
+    def headroom(level: int) -> str:
+        """Return space to format"""
+        return " " * (level * RstFormatters.SPACE_PER_LEVEL)
 
-def bold(text: str) -> str:
-    """Format bold text"""
-    return f"**{text}**"
 
+    @staticmethod
+    def bold(text: str) -> str:
+        """Format bold text"""
+        return f"**{text}**"
 
-def inline(text: str) -> str:
-    """Format inline text"""
-    return f"``{text}``"
 
+    @staticmethod
+    def inline(text: str) -> str:
+        """Format inline text"""
+        return f"``{text}``"
 
-def sanitize(text: str) -> str:
-    """Remove newlines and multiple spaces"""
-    # This is useful for some fields that are spread across multiple lines
-    return str(text).replace("\n", " ").strip()
 
+    @staticmethod
+    def sanitize(text: str) -> str:
+        """Remove newlines and multiple spaces"""
+        # This is useful for some fields that are spread across multiple lines
+        return str(text).replace("\n", " ").strip()
 
-def rst_fields(key: str, value: str, level: int = 0) -> str:
-    """Return a RST formatted field"""
-    return headroom(level) + f":{key}: {value}"
 
+    def rst_fields(self, key: str, value: str, level: int = 0) -> str:
+        """Return a RST formatted field"""
+        return self.headroom(level) + f":{key}: {value}"
 
-def rst_definition(key: str, value: Any, level: int = 0) -> str:
-    """Format a single rst definition"""
-    return headroom(level) + key + "\n" + headroom(level + 1) + str(value)
 
+    def rst_definition(self, key: str, value: Any, level: int = 0) -> str:
+        """Format a single rst definition"""
+        return self.headroom(level) + key + "\n" + self.headroom(level + 1) + str(value)
 
-def rst_paragraph(paragraph: str, level: int = 0) -> str:
-    """Return a formatted paragraph"""
-    return headroom(level) + paragraph
 
+    def rst_paragraph(self, paragraph: str, level: int = 0) -> str:
+        """Return a formatted paragraph"""
+        return self.headroom(level) + paragraph
 
-def rst_bullet(item: str, level: int = 0) -> str:
-    """Return a formatted a bullet"""
-    return headroom(level) + f"- {item}"
 
+    def rst_bullet(self, item: str, level: int = 0) -> str:
+        """Return a formatted a bullet"""
+        return self.headroom(level) + f"- {item}"
 
-def rst_subsection(title: str) -> str:
-    """Add a sub-section to the document"""
-    return f"{title}\n" + "-" * len(title)
 
+    @staticmethod
+    def rst_subsection(title: str) -> str:
+        """Add a sub-section to the document"""
+        return f"{title}\n" + "-" * len(title)
 
-def rst_subsubsection(title: str) -> str:
-    """Add a sub-sub-section to the document"""
-    return f"{title}\n" + "~" * len(title)
 
+    @staticmethod
+    def rst_subsubsection(title: str) -> str:
+        """Add a sub-sub-section to the document"""
+        return f"{title}\n" + "~" * len(title)
 
-def rst_section(namespace: str, prefix: str, title: str) -> str:
-    """Add a section to the document"""
-    return f".. _{namespace}-{prefix}-{title}:\n\n{title}\n" + "=" * len(title)
 
+    @staticmethod
+    def rst_section(namespace: str, prefix: str, title: str) -> str:
+        """Add a section to the document"""
+        return f".. _{namespace}-{prefix}-{title}:\n\n{title}\n" + "=" * len(title)
 
-def rst_subtitle(title: str) -> str:
-    """Add a subtitle to the document"""
-    return "\n" + "-" * len(title) + f"\n{title}\n" + "-" * len(title) + "\n\n"
 
+    @staticmethod
+    def rst_subtitle(title: str) -> str:
+        """Add a subtitle to the document"""
+        return "\n" + "-" * len(title) + f"\n{title}\n" + "-" * len(title) + "\n\n"
 
-def rst_title(title: str) -> str:
-    """Add a title to the document"""
-    return "=" * len(title) + f"\n{title}\n" + "=" * len(title) + "\n\n"
 
+    @staticmethod
+    def rst_title(title: str) -> str:
+        """Add a title to the document"""
+        return "=" * len(title) + f"\n{title}\n" + "=" * len(title) + "\n\n"
 
-def rst_list_inline(list_: List[str], level: int = 0) -> str:
-    """Format a list using inlines"""
-    return headroom(level) + "[" + ", ".join(inline(i) for i in list_) + "]"
 
+    def rst_list_inline(self, list_: List[str], level: int = 0) -> str:
+        """Format a list using inlines"""
+        return self.headroom(level) + "[" + ", ".join(self.inline(i) for i in list_) + "]"
 
-def rst_ref(namespace: str, prefix: str, name: str) -> str:
-    """Add a hyperlink to the document"""
-    mappings = {'enum': 'definition',
-                'fixed-header': 'definition',
-                'nested-attributes': 'attribute-set',
-                'struct': 'definition'}
-    if prefix in mappings:
-        prefix = mappings[prefix]
-    return f":ref:`{namespace}-{prefix}-{name}`"
 
+    @staticmethod
+    def rst_ref(namespace: str, prefix: str, name: str) -> str:
+        """Add a hyperlink to the document"""
+        mappings = {'enum': 'definition',
+                    'fixed-header': 'definition',
+                    'nested-attributes': 'attribute-set',
+                    'struct': 'definition'}
+        if prefix in mappings:
+            prefix = mappings[prefix]
+        return f":ref:`{namespace}-{prefix}-{name}`"
 
-def rst_header() -> str:
-    """The headers for all the auto generated RST files"""
-    lines = []
 
-    lines.append(rst_paragraph(".. SPDX-License-Identifier: GPL-2.0"))
-    lines.append(rst_paragraph(".. NOTE: This document was auto-generated.\n\n"))
+    def rst_header(self) -> str:
+        """The headers for all the auto generated RST files"""
+        lines = []
 
-    return "\n".join(lines)
+        lines.append(self.rst_paragraph(".. SPDX-License-Identifier: GPL-2.0"))
+        lines.append(self.rst_paragraph(".. NOTE: This document was auto-generated.\n\n"))
 
+        return "\n".join(lines)
 
-def rst_toctree(maxdepth: int = 2) -> str:
-    """Generate a toctree RST primitive"""
-    lines = []
 
-    lines.append(".. toctree::")
-    lines.append(f"   :maxdepth: {maxdepth}\n\n")
+    @staticmethod
+    def rst_toctree(maxdepth: int = 2) -> str:
+        """Generate a toctree RST primitive"""
+        lines = []
 
-    return "\n".join(lines)
+        lines.append(".. toctree::")
+        lines.append(f"   :maxdepth: {maxdepth}\n\n")
 
+        return "\n".join(lines)
 
-def rst_label(title: str) -> str:
-    """Return a formatted label"""
-    return f".. _{title}:\n\n"
 
+    @staticmethod
+    def rst_label(title: str) -> str:
+        """Return a formatted label"""
+        return f".. _{title}:\n\n"
 
+# =======
 # Parsers
 # =======
+class YnlDocGenerator:
+
+    fmt = RstFormatters()
+
+    def parse_mcast_group(self, mcast_group: List[Dict[str, Any]]) -> str:
+        """Parse 'multicast' group list and return a formatted string"""
+        lines = []
+        for group in mcast_group:
+            lines.append(self.fmt.rst_bullet(group["name"]))
+
+        return "\n".join(lines)
+
+
+    def parse_do(self, do_dict: Dict[str, Any], level: int = 0) -> str:
+        """Parse 'do' section and return a formatted string"""
+        lines = []
+        for key in do_dict.keys():
+            lines.append(self.fmt.rst_paragraph(self.fmt.bold(key), level + 1))
+            if key in ['request', 'reply']:
+                lines.append(self.parse_do_attributes(do_dict[key], level + 1) + "\n")
+            else:
+                lines.append(self.fmt.headroom(level + 2) + do_dict[key] + "\n")
+
+        return "\n".join(lines)
+
+
+    def parse_do_attributes(self, attrs: Dict[str, Any], level: int = 0) -> str:
+        """Parse 'attributes' section"""
+        if "attributes" not in attrs:
+            return ""
+        lines = [self.fmt.rst_fields("attributes", self.fmt.rst_list_inline(attrs["attributes"]), level + 1)]
+
+        return "\n".join(lines)
+
+
+    def parse_operations(self, operations: List[Dict[str, Any]], namespace: str) -> str:
+        """Parse operations block"""
+        preprocessed = ["name", "doc", "title", "do", "dump", "flags"]
+        linkable = ["fixed-header", "attribute-set"]
+        lines = []
+
+        for operation in operations:
+            lines.append(self.fmt.rst_section(namespace, 'operation', operation["name"]))
+            lines.append(self.fmt.rst_paragraph(operation["doc"]) + "\n")
+
+            for key in operation.keys():
+                if key in preprocessed:
+                    # Skip the special fields
+                    continue
+                value = operation[key]
+                if key in linkable:
+                    value = self.fmt.rst_ref(namespace, key, value)
+                lines.append(self.fmt.rst_fields(key, value, 0))
+            if 'flags' in operation:
+                lines.append(self.fmt.rst_fields('flags', self.fmt.rst_list_inline(operation['flags'])))
+
+            if "do" in operation:
+                lines.append(self.fmt.rst_paragraph(":do:", 0))
+                lines.append(self.parse_do(operation["do"], 0))
+            if "dump" in operation:
+                lines.append(self.fmt.rst_paragraph(":dump:", 0))
+                lines.append(self.parse_do(operation["dump"], 0))
+
+            # New line after fields
+            lines.append("\n")
+
+        return "\n".join(lines)
+
+
+    def parse_entries(self, entries: List[Dict[str, Any]], level: int) -> str:
+        """Parse a list of entries"""
+        ignored = ["pad"]
+        lines = []
+        for entry in entries:
+            if isinstance(entry, dict):
+                # entries could be a list or a dictionary
+                field_name = entry.get("name", "")
+                if field_name in ignored:
+                    continue
+                type_ = entry.get("type")
+                if type_:
+                    field_name += f" ({self.fmt.inline(type_)})"
+                lines.append(
+                    self.fmt.rst_fields(field_name, self.fmt.sanitize(entry.get("doc", "")), level)
+                )
+            elif isinstance(entry, list):
+                lines.append(self.fmt.rst_list_inline(entry, level))
+            else:
+                lines.append(self.fmt.rst_bullet(self.fmt.inline(self.fmt.sanitize(entry)), level))
 
+        lines.append("\n")
+        return "\n".join(lines)
 
-def parse_mcast_group(mcast_group: List[Dict[str, Any]]) -> str:
-    """Parse 'multicast' group list and return a formatted string"""
-    lines = []
-    for group in mcast_group:
-        lines.append(rst_bullet(group["name"]))
-
-    return "\n".join(lines)
-
-
-def parse_do(do_dict: Dict[str, Any], level: int = 0) -> str:
-    """Parse 'do' section and return a formatted string"""
-    lines = []
-    for key in do_dict.keys():
-        lines.append(rst_paragraph(bold(key), level + 1))
-        if key in ['request', 'reply']:
-            lines.append(parse_do_attributes(do_dict[key], level + 1) + "\n")
-        else:
-            lines.append(headroom(level + 2) + do_dict[key] + "\n")
-
-    return "\n".join(lines)
-
-
-def parse_do_attributes(attrs: Dict[str, Any], level: int = 0) -> str:
-    """Parse 'attributes' section"""
-    if "attributes" not in attrs:
-        return ""
-    lines = [rst_fields("attributes", rst_list_inline(attrs["attributes"]), level + 1)]
-
-    return "\n".join(lines)
-
-
-def parse_operations(operations: List[Dict[str, Any]], namespace: str) -> str:
-    """Parse operations block"""
-    preprocessed = ["name", "doc", "title", "do", "dump", "flags"]
-    linkable = ["fixed-header", "attribute-set"]
-    lines = []
-
-    for operation in operations:
-        lines.append(rst_section(namespace, 'operation', operation["name"]))
-        lines.append(rst_paragraph(operation["doc"]) + "\n")
-
-        for key in operation.keys():
-            if key in preprocessed:
-                # Skip the special fields
-                continue
-            value = operation[key]
-            if key in linkable:
-                value = rst_ref(namespace, key, value)
-            lines.append(rst_fields(key, value, 0))
-        if 'flags' in operation:
-            lines.append(rst_fields('flags', rst_list_inline(operation['flags'])))
-
-        if "do" in operation:
-            lines.append(rst_paragraph(":do:", 0))
-            lines.append(parse_do(operation["do"], 0))
-        if "dump" in operation:
-            lines.append(rst_paragraph(":dump:", 0))
-            lines.append(parse_do(operation["dump"], 0))
 
-        # New line after fields
-        lines.append("\n")
+    def parse_definitions(self, defs: Dict[str, Any], namespace: str) -> str:
+        """Parse definitions section"""
+        preprocessed = ["name", "entries", "members"]
+        ignored = ["render-max"]  # This is not printed
+        lines = []
 
-    return "\n".join(lines)
-
-
-def parse_entries(entries: List[Dict[str, Any]], level: int) -> str:
-    """Parse a list of entries"""
-    ignored = ["pad"]
-    lines = []
-    for entry in entries:
-        if isinstance(entry, dict):
-            # entries could be a list or a dictionary
-            field_name = entry.get("name", "")
-            if field_name in ignored:
-                continue
-            type_ = entry.get("type")
-            if type_:
-                field_name += f" ({inline(type_)})"
-            lines.append(
-                rst_fields(field_name, sanitize(entry.get("doc", "")), level)
-            )
-        elif isinstance(entry, list):
-            lines.append(rst_list_inline(entry, level))
-        else:
-            lines.append(rst_bullet(inline(sanitize(entry)), level))
-
-    lines.append("\n")
-    return "\n".join(lines)
-
-
-def parse_definitions(defs: Dict[str, Any], namespace: str) -> str:
-    """Parse definitions section"""
-    preprocessed = ["name", "entries", "members"]
-    ignored = ["render-max"]  # This is not printed
-    lines = []
-
-    for definition in defs:
-        lines.append(rst_section(namespace, 'definition', definition["name"]))
-        for k in definition.keys():
-            if k in preprocessed + ignored:
-                continue
-            lines.append(rst_fields(k, sanitize(definition[k]), 0))
-
-        # Field list needs to finish with a new line
-        lines.append("\n")
-        if "entries" in definition:
-            lines.append(rst_paragraph(":entries:", 0))
-            lines.append(parse_entries(definition["entries"], 1))
-        if "members" in definition:
-            lines.append(rst_paragraph(":members:", 0))
-            lines.append(parse_entries(definition["members"], 1))
-
-    return "\n".join(lines)
-
-
-def parse_attr_sets(entries: List[Dict[str, Any]], namespace: str) -> str:
-    """Parse attribute from attribute-set"""
-    preprocessed = ["name", "type"]
-    linkable = ["enum", "nested-attributes", "struct", "sub-message"]
-    ignored = ["checks"]
-    lines = []
-
-    for entry in entries:
-        lines.append(rst_section(namespace, 'attribute-set', entry["name"]))
-        for attr in entry["attributes"]:
-            type_ = attr.get("type")
-            attr_line = attr["name"]
-            if type_:
-                # Add the attribute type in the same line
-                attr_line += f" ({inline(type_)})"
-
-            lines.append(rst_subsubsection(attr_line))
-
-            for k in attr.keys():
+        for definition in defs:
+            lines.append(self.fmt.rst_section(namespace, 'definition', definition["name"]))
+            for k in definition.keys():
                 if k in preprocessed + ignored:
                     continue
-                if k in linkable:
-                    value = rst_ref(namespace, k, attr[k])
-                else:
-                    value = sanitize(attr[k])
-                lines.append(rst_fields(k, value, 0))
+                lines.append(self.fmt.rst_fields(k, self.fmt.sanitize(definition[k]), 0))
+
+            # Field list needs to finish with a new line
             lines.append("\n")
+            if "entries" in definition:
+                lines.append(self.fmt.rst_paragraph(":entries:", 0))
+                lines.append(self.parse_entries(definition["entries"], 1))
+            if "members" in definition:
+                lines.append(self.fmt.rst_paragraph(":members:", 0))
+                lines.append(self.parse_entries(definition["members"], 1))
 
-    return "\n".join(lines)
+        return "\n".join(lines)
 
 
-def parse_sub_messages(entries: List[Dict[str, Any]], namespace: str) -> str:
-    """Parse sub-message definitions"""
-    lines = []
+    def parse_attr_sets(self, entries: List[Dict[str, Any]], namespace: str) -> str:
+        """Parse attribute from attribute-set"""
+        preprocessed = ["name", "type"]
+        linkable = ["enum", "nested-attributes", "struct", "sub-message"]
+        ignored = ["checks"]
+        lines = []
 
-    for entry in entries:
-        lines.append(rst_section(namespace, 'sub-message', entry["name"]))
-        for fmt in entry["formats"]:
-            value = fmt["value"]
+        for entry in entries:
+            lines.append(self.fmt.rst_section(namespace, 'attribute-set', entry["name"]))
+            for attr in entry["attributes"]:
+                type_ = attr.get("type")
+                attr_line = attr["name"]
+                if type_:
+                    # Add the attribute type in the same line
+                    attr_line += f" ({self.fmt.inline(type_)})"
 
-            lines.append(rst_bullet(bold(value)))
-            for attr in ['fixed-header', 'attribute-set']:
-                if attr in fmt:
-                    lines.append(rst_fields(attr,
-                                            rst_ref(namespace, attr, fmt[attr]),
-                                            1))
-            lines.append("\n")
+                lines.append(self.fmt.rst_subsubsection(attr_line))
+
+                for k in attr.keys():
+                    if k in preprocessed + ignored:
+                        continue
+                    if k in linkable:
+                        value = self.fmt.rst_ref(namespace, k, attr[k])
+                    else:
+                        value = self.fmt.sanitize(attr[k])
+                    lines.append(self.fmt.rst_fields(k, value, 0))
+                lines.append("\n")
+
+        return "\n".join(lines)
+
+
+    def parse_sub_messages(self, entries: List[Dict[str, Any]], namespace: str) -> str:
+        """Parse sub-message definitions"""
+        lines = []
+
+        for entry in entries:
+            lines.append(self.fmt.rst_section(namespace, 'sub-message', entry["name"]))
+            for fmt in entry["formats"]:
+                value = fmt["value"]
+
+                lines.append(self.fmt.rst_bullet(self.fmt.bold(value)))
+                for attr in ['fixed-header', 'attribute-set']:
+                    if attr in fmt:
+                        lines.append(self.fmt.rst_fields(attr,
+                                                self.fmt.rst_ref(namespace, attr, fmt[attr]),
+                                                1))
+                lines.append("\n")
+
+        return "\n".join(lines)
 
-    return "\n".join(lines)
 
+    def parse_yaml(self, obj: Dict[str, Any]) -> str:
+        """Format the whole YAML into a RST string"""
+        lines = []
 
-def parse_yaml(obj: Dict[str, Any]) -> str:
-    """Format the whole YAML into a RST string"""
-    lines = []
+        # Main header
 
-    # Main header
+        family = obj['name']
 
-    family = obj['name']
+        lines.append(self.fmt.rst_header())
+        lines.append(self.fmt.rst_label("netlink-" + family))
 
-    lines.append(rst_header())
-    lines.append(rst_label("netlink-" + family))
+        title = f"Family ``{family}`` netlink specification"
+        lines.append(self.fmt.rst_title(title))
+        lines.append(self.fmt.rst_paragraph(".. contents:: :depth: 3\n"))
 
-    title = f"Family ``{family}`` netlink specification"
-    lines.append(rst_title(title))
-    lines.append(rst_paragraph(".. contents:: :depth: 3\n"))
+        if "doc" in obj:
+            lines.append(self.fmt.rst_subtitle("Summary"))
+            lines.append(self.fmt.rst_paragraph(obj["doc"], 0))
 
-    if "doc" in obj:
-        lines.append(rst_subtitle("Summary"))
-        lines.append(rst_paragraph(obj["doc"], 0))
+        # Operations
+        if "operations" in obj:
+            lines.append(self.fmt.rst_subtitle("Operations"))
+            lines.append(self.parse_operations(obj["operations"]["list"], family))
 
-    # Operations
-    if "operations" in obj:
-        lines.append(rst_subtitle("Operations"))
-        lines.append(parse_operations(obj["operations"]["list"], family))
+        # Multicast groups
+        if "mcast-groups" in obj:
+            lines.append(self.fmt.rst_subtitle("Multicast groups"))
+            lines.append(self.parse_mcast_group(obj["mcast-groups"]["list"]))
 
-    # Multicast groups
-    if "mcast-groups" in obj:
-        lines.append(rst_subtitle("Multicast groups"))
-        lines.append(parse_mcast_group(obj["mcast-groups"]["list"]))
+        # Definitions
+        if "definitions" in obj:
+            lines.append(self.fmt.rst_subtitle("Definitions"))
+            lines.append(self.parse_definitions(obj["definitions"], family))
 
-    # Definitions
-    if "definitions" in obj:
-        lines.append(rst_subtitle("Definitions"))
-        lines.append(parse_definitions(obj["definitions"], family))
+        # Attributes set
+        if "attribute-sets" in obj:
+            lines.append(self.fmt.rst_subtitle("Attribute sets"))
+            lines.append(self.parse_attr_sets(obj["attribute-sets"], family))
 
-    # Attributes set
-    if "attribute-sets" in obj:
-        lines.append(rst_subtitle("Attribute sets"))
-        lines.append(parse_attr_sets(obj["attribute-sets"], family))
+        # Sub-messages
+        if "sub-messages" in obj:
+            lines.append(self.fmt.rst_subtitle("Sub-messages"))
+            lines.append(self.parse_sub_messages(obj["sub-messages"], family))
 
-    # Sub-messages
-    if "sub-messages" in obj:
-        lines.append(rst_subtitle("Sub-messages"))
-        lines.append(parse_sub_messages(obj["sub-messages"], family))
+        return "\n".join(lines)
 
-    return "\n".join(lines)
 
+    # Main functions
+    # ==============
 
-# Main functions
-# ==============
 
+    def parse_yaml_file(self, filename: str) -> str:
+        """Transform the YAML specified by filename into an RST-formatted string"""
+        with open(filename, "r", encoding="utf-8") as spec_file:
+            yaml_data = yaml.safe_load(spec_file)
+            content = self.parse_yaml(yaml_data)
 
-def parse_yaml_file(filename: str) -> str:
-    """Transform the YAML specified by filename into an RST-formatted string"""
-    with open(filename, "r", encoding="utf-8") as spec_file:
-        yaml_data = yaml.safe_load(spec_file)
-        content = parse_yaml(yaml_data)
+        return content
 
-    return content
 
+    def generate_main_index_rst(self, output: str, index_dir: str) -> None:
+        """Generate the `networking_spec/index` content and write to the file"""
+        lines = []
 
-def generate_main_index_rst(output: str, index_dir: str) -> str:
-    """Generate the `networking_spec/index` content and write to the file"""
-    lines = []
+        lines.append(self.fmt.rst_header())
+        lines.append(self.fmt.rst_label("specs"))
+        lines.append(self.fmt.rst_title("Netlink Family Specifications"))
+        lines.append(self.fmt.rst_toctree(1))
 
-    lines.append(rst_header())
-    lines.append(rst_label("specs"))
-    lines.append(rst_title("Netlink Family Specifications"))
-    lines.append(rst_toctree(1))
+        index_fname = os.path.basename(output)
+        base, ext = os.path.splitext(index_fname)
 
-    index_fname = os.path.basename(output)
-    base, ext = os.path.splitext(index_fname)
+        if not index_dir:
+            index_dir = os.path.dirname(output)
 
-    if not index_dir:
-        index_dir = os.path.dirname(output)
+        logging.debug(f"Looking for {ext} files in %s", index_dir)
+        for filename in sorted(os.listdir(index_dir)):
+            if not filename.endswith(ext) or filename == index_fname:
+                continue
+            base, ext = os.path.splitext(filename)
+            lines.append(f"   {base}\n")
 
-    logging.debug(f"Looking for {ext} files in %s", index_dir)
-    for filename in sorted(os.listdir(index_dir)):
-        if not filename.endswith(ext) or filename == index_fname:
-            continue
-        base, ext = os.path.splitext(filename)
-        lines.append(f"   {base}\n")
+        logging.debug("Writing an index file at %s", output)
 
-    return "".join(lines), output
+        return "".join(lines)
diff --git a/tools/net/ynl/pyynl/ynl_gen_rst.py b/tools/net/ynl/pyynl/ynl_gen_rst.py
index 38dafe3d9179..624b0960476e 100755
--- a/tools/net/ynl/pyynl/ynl_gen_rst.py
+++ b/tools/net/ynl/pyynl/ynl_gen_rst.py
@@ -10,12 +10,7 @@
 
     This script performs extensive parsing to the Linux kernel's netlink YAML
     spec files, in an effort to avoid needing to heavily mark up the original
-    YAML file.
-
-    This code is split in three big parts:
-        1) RST formatters: Use to convert a string to a RST output
-        2) Parser helpers: Functions to parse the YAML data structure
-        3) Main function and small helpers
+    YAML file. It uses the library code from scripts/lib.
 """
 
 import os.path
@@ -28,7 +23,7 @@ SRC_DIR = os.path.dirname(os.path.realpath(__file__))
 
 sys.path.insert(0, os.path.join(SRC_DIR, LIB_DIR))
 
-from netlink_yml_parser import parse_yaml_file, generate_main_index_rst
+from netlink_yml_parser import YnlDocGenerator
 
 
 def parse_arguments() -> argparse.Namespace:
@@ -76,10 +71,10 @@ def write_to_rstfile(content: str, filename: str) -> None:
         rst_file.write(content)
 
 
-def write_index_rst(output: str, index_dir: str) -> None:
+def write_index_rst(parser: YnlDocGenerator, output: str, index_dir: str) -> None:
     """Generate the `networking_spec/index` content and write to the file"""
 
-    msg = generate_main_index_rst(output, index_dir)
+    msg = parser.generate_main_index_rst(output, index_dir)
 
     logging.debug("Writing an index file at %s", output)
     write_to_rstfile(msg, output)
@@ -90,10 +85,12 @@ def main() -> None:
 
     args = parse_arguments()
 
+    parser = YnlDocGenerator()
+
     if args.input:
         logging.debug("Parsing %s", args.input)
         try:
-            content = parse_yaml_file(os.path.join(args.input))
+            content = parser.parse_yaml_file(os.path.join(args.input))
         except Exception as exception:
             logging.warning("Failed to parse %s.", args.input)
             logging.warning(exception)
@@ -103,7 +100,7 @@ def main() -> None:
 
     if args.index:
         # Generate the index RST file
-        write_index_rst(args.output, args.input_dir)
+        write_index_rst(parser, args.output, args.input_dir)
 
 
 if __name__ == "__main__":
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 42+ messages in thread

* [PATCH v5 08/15] docs: netlink: index.rst: add a netlink index file
  2025-06-17  8:01 [PATCH v5 00/15] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (6 preceding siblings ...)
  2025-06-17  8:02 ` [PATCH v5 07/15] scripts: lib: netlink_yml_parser.py: use classes Mauro Carvalho Chehab
@ 2025-06-17  8:02 ` Mauro Carvalho Chehab
  2025-06-17 10:43   ` Donald Hunter
  2025-06-17  8:02 ` [PATCH v5 09/15] tools: ynl_gen_rst.py: cleanup coding style Mauro Carvalho Chehab
                   ` (7 subsequent siblings)
  15 siblings, 1 reply; 42+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-17  8:02 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Instead of generating the index file, use a toctree glob to
automatically include all documents generated from the yaml specs.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 Documentation/netlink/specs/index.rst | 13 +++++++++++++
 1 file changed, 13 insertions(+)
 create mode 100644 Documentation/netlink/specs/index.rst

diff --git a/Documentation/netlink/specs/index.rst b/Documentation/netlink/specs/index.rst
new file mode 100644
index 000000000000..7f7cf4a096f2
--- /dev/null
+++ b/Documentation/netlink/specs/index.rst
@@ -0,0 +1,13 @@
+.. SPDX-License-Identifier: GPL-2.0
+
+.. _specs:
+
+=============================
+Netlink Family Specifications
+=============================
+
+.. toctree::
+   :maxdepth: 1
+   :glob:
+
+   *
-- 
2.49.0



* [PATCH v5 09/15] tools: ynl_gen_rst.py: cleanup coding style
  2025-06-17  8:01 [PATCH v5 00/15] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (7 preceding siblings ...)
  2025-06-17  8:02 ` [PATCH v5 08/15] docs: netlink: index.rst: add a netlink index file Mauro Carvalho Chehab
@ 2025-06-17  8:02 ` Mauro Carvalho Chehab
  2025-06-17  9:49   ` Breno Leitao
                     ` (2 more replies)
  2025-06-17  8:02 ` [PATCH v5 10/15] docs: sphinx: add a parser for yaml files for Netlink specs Mauro Carvalho Chehab
                   ` (6 subsequent siblings)
  15 siblings, 3 replies; 42+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-17  8:02 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Clean up some coding style issues pointed out by pylint and flake8.

No functional changes.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 tools/net/ynl/pyynl/netlink_yml_parser.py | 75 ++++++++---------------
 tools/net/ynl/pyynl/ynl_gen_rst.py        |  2 +-
 2 files changed, 27 insertions(+), 50 deletions(-)

diff --git a/tools/net/ynl/pyynl/netlink_yml_parser.py b/tools/net/ynl/pyynl/netlink_yml_parser.py
index 839e78b39de3..f71360f0ceb7 100755
--- a/tools/net/ynl/pyynl/netlink_yml_parser.py
+++ b/tools/net/ynl/pyynl/netlink_yml_parser.py
@@ -18,17 +18,12 @@
 """
 
 from typing import Any, Dict, List
-import os.path
-import sys
-import argparse
-import logging
 import yaml
 
 
-# ==============
-# RST Formatters
-# ==============
 class RstFormatters:
+    """RST Formatters"""
+
     SPACE_PER_LEVEL = 4
 
     @staticmethod
@@ -36,81 +31,67 @@ class RstFormatters:
         """Return space to format"""
         return " " * (level * RstFormatters.SPACE_PER_LEVEL)
 
-
     @staticmethod
     def bold(text: str) -> str:
         """Format bold text"""
         return f"**{text}**"
 
-
     @staticmethod
     def inline(text: str) -> str:
         """Format inline text"""
         return f"``{text}``"
 
-
     @staticmethod
     def sanitize(text: str) -> str:
         """Remove newlines and multiple spaces"""
         # This is useful for some fields that are spread across multiple lines
         return str(text).replace("\n", " ").strip()
 
-
     def rst_fields(self, key: str, value: str, level: int = 0) -> str:
         """Return a RST formatted field"""
         return self.headroom(level) + f":{key}: {value}"
 
-
     def rst_definition(self, key: str, value: Any, level: int = 0) -> str:
         """Format a single rst definition"""
         return self.headroom(level) + key + "\n" + self.headroom(level + 1) + str(value)
 
-
     def rst_paragraph(self, paragraph: str, level: int = 0) -> str:
         """Return a formatted paragraph"""
         return self.headroom(level) + paragraph
 
-
     def rst_bullet(self, item: str, level: int = 0) -> str:
         """Return a formatted a bullet"""
         return self.headroom(level) + f"- {item}"
 
-
     @staticmethod
     def rst_subsection(title: str) -> str:
         """Add a sub-section to the document"""
         return f"{title}\n" + "-" * len(title)
 
-
     @staticmethod
     def rst_subsubsection(title: str) -> str:
         """Add a sub-sub-section to the document"""
         return f"{title}\n" + "~" * len(title)
 
-
     @staticmethod
     def rst_section(namespace: str, prefix: str, title: str) -> str:
         """Add a section to the document"""
         return f".. _{namespace}-{prefix}-{title}:\n\n{title}\n" + "=" * len(title)
 
-
     @staticmethod
     def rst_subtitle(title: str) -> str:
         """Add a subtitle to the document"""
         return "\n" + "-" * len(title) + f"\n{title}\n" + "-" * len(title) + "\n\n"
 
-
     @staticmethod
     def rst_title(title: str) -> str:
         """Add a title to the document"""
         return "=" * len(title) + f"\n{title}\n" + "=" * len(title) + "\n\n"
 
-
     def rst_list_inline(self, list_: List[str], level: int = 0) -> str:
         """Format a list using inlines"""
         return self.headroom(level) + "[" + ", ".join(self.inline(i) for i in list_) + "]"
 
-
     @staticmethod
     def rst_ref(namespace: str, prefix: str, name: str) -> str:
         """Add a hyperlink to the document"""
@@ -119,10 +100,9 @@ class RstFormatters:
                     'nested-attributes': 'attribute-set',
                     'struct': 'definition'}
         if prefix in mappings:
-            prefix = mappings[prefix]
+            prefix = mappings.get(prefix, "")
         return f":ref:`{namespace}-{prefix}-{name}`"
 
-
     def rst_header(self) -> str:
         """The headers for all the auto generated RST files"""
         lines = []
@@ -132,7 +112,6 @@ class RstFormatters:
 
         return "\n".join(lines)
 
-
     @staticmethod
     def rst_toctree(maxdepth: int = 2) -> str:
         """Generate a toctree RST primitive"""
@@ -143,16 +122,13 @@ class RstFormatters:
 
         return "\n".join(lines)
 
-
     @staticmethod
     def rst_label(title: str) -> str:
         """Return a formatted label"""
         return f".. _{title}:\n\n"
 
-# =======
-# Parsers
-# =======
 class YnlDocGenerator:
+    """YAML Netlink specs Parser"""
 
     fmt = RstFormatters()
 
@@ -164,7 +140,6 @@ class YnlDocGenerator:
 
         return "\n".join(lines)
 
-
     def parse_do(self, do_dict: Dict[str, Any], level: int = 0) -> str:
         """Parse 'do' section and return a formatted string"""
         lines = []
@@ -177,16 +152,16 @@ class YnlDocGenerator:
 
         return "\n".join(lines)
 
-
     def parse_do_attributes(self, attrs: Dict[str, Any], level: int = 0) -> str:
         """Parse 'attributes' section"""
         if "attributes" not in attrs:
             return ""
-        lines = [self.fmt.rst_fields("attributes", self.fmt.rst_list_inline(attrs["attributes"]), level + 1)]
+        lines = [self.fmt.rst_fields("attributes",
+                                     self.fmt.rst_list_inline(attrs["attributes"]),
+                                     level + 1)]
 
         return "\n".join(lines)
 
-
     def parse_operations(self, operations: List[Dict[str, Any]], namespace: str) -> str:
         """Parse operations block"""
         preprocessed = ["name", "doc", "title", "do", "dump", "flags"]
@@ -194,7 +169,8 @@ class YnlDocGenerator:
         lines = []
 
         for operation in operations:
-            lines.append(self.fmt.rst_section(namespace, 'operation', operation["name"]))
+            lines.append(self.fmt.rst_section(namespace, 'operation',
+                                              operation["name"]))
             lines.append(self.fmt.rst_paragraph(operation["doc"]) + "\n")
 
             for key in operation.keys():
@@ -206,7 +182,8 @@ class YnlDocGenerator:
                     value = self.fmt.rst_ref(namespace, key, value)
                 lines.append(self.fmt.rst_fields(key, value, 0))
             if 'flags' in operation:
-                lines.append(self.fmt.rst_fields('flags', self.fmt.rst_list_inline(operation['flags'])))
+                lines.append(self.fmt.rst_fields('flags',
+                                                 self.fmt.rst_list_inline(operation['flags'])))
 
             if "do" in operation:
                 lines.append(self.fmt.rst_paragraph(":do:", 0))
@@ -220,7 +197,6 @@ class YnlDocGenerator:
 
         return "\n".join(lines)
 
-
     def parse_entries(self, entries: List[Dict[str, Any]], level: int) -> str:
         """Parse a list of entries"""
         ignored = ["pad"]
@@ -235,17 +211,19 @@ class YnlDocGenerator:
                 if type_:
                     field_name += f" ({self.fmt.inline(type_)})"
                 lines.append(
-                    self.fmt.rst_fields(field_name, self.fmt.sanitize(entry.get("doc", "")), level)
+                    self.fmt.rst_fields(field_name,
+                                        self.fmt.sanitize(entry.get("doc", "")),
+                                        level)
                 )
             elif isinstance(entry, list):
                 lines.append(self.fmt.rst_list_inline(entry, level))
             else:
-                lines.append(self.fmt.rst_bullet(self.fmt.inline(self.fmt.sanitize(entry)), level))
+                lines.append(self.fmt.rst_bullet(self.fmt.inline(self.fmt.sanitize(entry)),
+                                                 level))
 
         lines.append("\n")
         return "\n".join(lines)
 
-
     def parse_definitions(self, defs: Dict[str, Any], namespace: str) -> str:
         """Parse definitions section"""
         preprocessed = ["name", "entries", "members"]
@@ -270,7 +248,6 @@ class YnlDocGenerator:
 
         return "\n".join(lines)
 
-
     def parse_attr_sets(self, entries: List[Dict[str, Any]], namespace: str) -> str:
         """Parse attribute from attribute-set"""
         preprocessed = ["name", "type"]
@@ -279,7 +256,8 @@ class YnlDocGenerator:
         lines = []
 
         for entry in entries:
-            lines.append(self.fmt.rst_section(namespace, 'attribute-set', entry["name"]))
+            lines.append(self.fmt.rst_section(namespace, 'attribute-set',
+                                              entry["name"]))
             for attr in entry["attributes"]:
                 type_ = attr.get("type")
                 attr_line = attr["name"]
@@ -301,13 +279,13 @@ class YnlDocGenerator:
 
         return "\n".join(lines)
 
-
     def parse_sub_messages(self, entries: List[Dict[str, Any]], namespace: str) -> str:
         """Parse sub-message definitions"""
         lines = []
 
         for entry in entries:
-            lines.append(self.fmt.rst_section(namespace, 'sub-message', entry["name"]))
+            lines.append(self.fmt.rst_section(namespace, 'sub-message',
+                                              entry["name"]))
             for fmt in entry["formats"]:
                 value = fmt["value"]
 
@@ -315,13 +293,14 @@ class YnlDocGenerator:
                 for attr in ['fixed-header', 'attribute-set']:
                     if attr in fmt:
                         lines.append(self.fmt.rst_fields(attr,
-                                                self.fmt.rst_ref(namespace, attr, fmt[attr]),
-                                                1))
+                                                         self.fmt.rst_ref(namespace,
+                                                                          attr,
+                                                                          fmt[attr]),
+                                                         1))
                 lines.append("\n")
 
         return "\n".join(lines)
 
-
     def parse_yaml(self, obj: Dict[str, Any]) -> str:
         """Format the whole YAML into a RST string"""
         lines = []
@@ -344,7 +323,8 @@ class YnlDocGenerator:
         # Operations
         if "operations" in obj:
             lines.append(self.fmt.rst_subtitle("Operations"))
-            lines.append(self.parse_operations(obj["operations"]["list"], family))
+            lines.append(self.parse_operations(obj["operations"]["list"],
+                                               family))
 
         # Multicast groups
         if "mcast-groups" in obj:
@@ -368,11 +348,9 @@ class YnlDocGenerator:
 
         return "\n".join(lines)
 
-
     # Main functions
     # ==============
 
-
     def parse_yaml_file(self, filename: str) -> str:
         """Transform the YAML specified by filename into an RST-formatted string"""
         with open(filename, "r", encoding="utf-8") as spec_file:
@@ -381,7 +359,6 @@ class YnlDocGenerator:
 
         return content
 
-
     def generate_main_index_rst(self, output: str, index_dir: str) -> None:
         """Generate the `networking_spec/index` content and write to the file"""
         lines = []
diff --git a/tools/net/ynl/pyynl/ynl_gen_rst.py b/tools/net/ynl/pyynl/ynl_gen_rst.py
index 624b0960476e..5d29ce01c60c 100755
--- a/tools/net/ynl/pyynl/ynl_gen_rst.py
+++ b/tools/net/ynl/pyynl/ynl_gen_rst.py
@@ -23,7 +23,7 @@ SRC_DIR = os.path.dirname(os.path.realpath(__file__))
 
 sys.path.insert(0, os.path.join(SRC_DIR, LIB_DIR))
 
-from netlink_yml_parser import YnlDocGenerator
+from netlink_yml_parser import YnlDocGenerator        # pylint: disable=C0413
 
 
 def parse_arguments() -> argparse.Namespace:
-- 
2.49.0



* [PATCH v5 10/15] docs: sphinx: add a parser for yaml files for Netlink specs
  2025-06-17  8:01 [PATCH v5 00/15] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (8 preceding siblings ...)
  2025-06-17  8:02 ` [PATCH v5 09/15] tools: ynl_gen_rst.py: cleanup coding style Mauro Carvalho Chehab
@ 2025-06-17  8:02 ` Mauro Carvalho Chehab
  2025-06-17 12:35   ` Donald Hunter
  2025-06-17  8:02 ` [PATCH v5 11/15] docs: use parser_yaml extension to handle " Mauro Carvalho Chehab
                   ` (5 subsequent siblings)
  15 siblings, 1 reply; 42+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-17  8:02 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	joel, linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz,
	stern

Add a simple sphinx.Parser to handle yaml files and add the
code to handle Netlink specs. All other yaml files are
ignored.

The code is written in a way that makes it easy to parse yaml
for different subsystems, and even for different parts of Netlink.

All it takes to have a different parser is to add an
import line similar to:

	from netlink_yml_parser import YnlDocGenerator

and add the corresponding parser somewhere in the extension:

	netlink_parser = YnlDocGenerator()

Then add logic inside parse() to handle different
doc outputs, depending on the file location, similar to:

        if "/netlink/specs/" in fname:
            msg = self.netlink_parser.parse_yaml_file(fname)
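
The dispatch described above can be sketched as a plain lookup from a
path fragment to a per-subsystem doc generator. This is a minimal
standalone illustration, not the actual extension code: the
`/devicetree/bindings/` entry and the `dt_parser` name are hypothetical
examples of a future parser; only the `/netlink/specs/` case exists in
this series.

```python
def pick_parser(fname, parsers):
    """Return the doc generator registered for fname, or None to ignore it."""
    for prefix, parser in parsers.items():
        if prefix in fname:
            return parser
    return None

# Map a path fragment to the object providing parse_yaml_file().
# Plain strings stand in here for the real generator instances
# (e.g. YnlDocGenerator() for the netlink entry).
parsers = {
    "/netlink/specs/": "netlink_parser",
    "/devicetree/bindings/": "dt_parser",   # hypothetical future parser
}

print(pick_parser("Documentation/netlink/specs/rt-link.yaml", parsers))
print(pick_parser("Documentation/foo/bar.yaml", parsers))
```

With such a table, parse() only needs one generic lookup instead of a
growing chain of "if ... in fname" checks; files that match no entry are
silently ignored, matching the behavior described above.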

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 Documentation/sphinx/parser_yaml.py | 76 +++++++++++++++++++++++++++++
 1 file changed, 76 insertions(+)
 create mode 100755 Documentation/sphinx/parser_yaml.py

diff --git a/Documentation/sphinx/parser_yaml.py b/Documentation/sphinx/parser_yaml.py
new file mode 100755
index 000000000000..635945e1c5ba
--- /dev/null
+++ b/Documentation/sphinx/parser_yaml.py
@@ -0,0 +1,76 @@
+"""
+Sphinx extension for processing YAML files
+"""
+
+import os
+import re
+import sys
+
+from pprint import pformat
+
+from docutils.parsers.rst import Parser as RSTParser
+from docutils.statemachine import ViewList
+
+from sphinx.util import logging
+from sphinx.parsers import Parser
+
+srctree = os.path.abspath(os.environ["srctree"])
+sys.path.insert(0, os.path.join(srctree, "tools/net/ynl/pyynl"))
+
+from netlink_yml_parser import YnlDocGenerator        # pylint: disable=C0413
+
+logger = logging.getLogger(__name__)
+
+class YamlParser(Parser):
+    """Custom parser for YAML files."""
+
+    # Need at least two elements on this set
+    supported = ('yaml', 'yml')
+
+    netlink_parser = YnlDocGenerator()
+
+    def do_parse(self, inputstring, document, msg):
+        """Parse YAML and generate a document tree."""
+
+        self.setup_parse(inputstring, document)
+
+        result = ViewList()
+
+        try:
+            # Parse message with RSTParser
+            for i, line in enumerate(msg.split('\n')):
+                result.append(line, document.current_source, i)
+
+            rst_parser = RSTParser()
+            rst_parser.parse('\n'.join(result), document)
+
+        except Exception as e:
+            document.reporter.error("YAML parsing error: %s" % pformat(e))
+
+        self.finish_parse()
+
+    # Overrides docutils.parsers.Parser. See sphinx.parsers.RSTParser
+    def parse(self, inputstring, document):
+        """Check if a YAML is meant to be parsed."""
+
+        fname = document.current_source
+
+        # Handle netlink yaml specs
+        if "/netlink/specs/" in fname:
+            msg = self.netlink_parser.parse_yaml_file(fname)
+            self.do_parse(inputstring, document, msg)
+
+        # All other yaml files are ignored
+
+def setup(app):
+    """Setup function for the Sphinx extension."""
+
+    # Add YAML parser
+    app.add_source_parser(YamlParser)
+    app.add_source_suffix('.yaml', 'yaml')
+
+    return {
+        'version': '1.0',
+        'parallel_read_safe': True,
+        'parallel_write_safe': True,
+    }
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 42+ messages in thread

* [PATCH v5 11/15] docs: use parser_yaml extension to handle Netlink specs
  2025-06-17  8:01 [PATCH v5 00/15] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (9 preceding siblings ...)
  2025-06-17  8:02 ` [PATCH v5 10/15] docs: sphinx: add a parser for yaml files for Netlink specs Mauro Carvalho Chehab
@ 2025-06-17  8:02 ` Mauro Carvalho Chehab
  2025-06-17 12:45   ` Donald Hunter
  2025-06-17  8:02 ` [PATCH v5 12/15] docs: uapi: netlink: update netlink specs link Mauro Carvalho Chehab
                   ` (4 subsequent siblings)
  15 siblings, 1 reply; 42+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-17  8:02 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Instead of manually calling ynl_gen_rst.py, use a Sphinx extension.
This way, no .rst files are written to the kernel source
directories.

We use a toctree with the :glob: option here. This way, there
is no need to touch the netlink/specs/index.rst file every time
a Netlink spec is added, renamed or removed.
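
For reference, the glob-based index used here looks roughly like
this (a sketch only; the actual file is
Documentation/netlink/specs/index.rst, added earlier in this series):

	.. toctree::
	   :maxdepth: 1
	   :glob:

	   *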

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 Documentation/Makefile                          | 17 -----------------
 Documentation/conf.py                           | 13 +++++++------
 Documentation/networking/index.rst              |  2 +-
 .../networking/netlink_spec/readme.txt          |  4 ----
 4 files changed, 8 insertions(+), 28 deletions(-)
 delete mode 100644 Documentation/networking/netlink_spec/readme.txt

diff --git a/Documentation/Makefile b/Documentation/Makefile
index b98477df5ddf..820f07e0afe6 100644
--- a/Documentation/Makefile
+++ b/Documentation/Makefile
@@ -104,22 +104,6 @@ quiet_cmd_sphinx = SPHINX  $@ --> file://$(abspath $(BUILDDIR)/$3/$4)
 		cp $(if $(patsubst /%,,$(DOCS_CSS)),$(abspath $(srctree)/$(DOCS_CSS)),$(DOCS_CSS)) $(BUILDDIR)/$3/_static/; \
 	fi
 
-YNL_INDEX:=$(srctree)/Documentation/networking/netlink_spec/index.rst
-YNL_RST_DIR:=$(srctree)/Documentation/networking/netlink_spec
-YNL_YAML_DIR:=$(srctree)/Documentation/netlink/specs
-YNL_TOOL:=$(srctree)/tools/net/ynl/pyynl/ynl_gen_rst.py
-
-YNL_RST_FILES_TMP := $(patsubst %.yaml,%.rst,$(wildcard $(YNL_YAML_DIR)/*.yaml))
-YNL_RST_FILES := $(patsubst $(YNL_YAML_DIR)%,$(YNL_RST_DIR)%, $(YNL_RST_FILES_TMP))
-
-$(YNL_INDEX): $(YNL_RST_FILES)
-	$(Q)$(YNL_TOOL) -o $@ -x
-
-$(YNL_RST_DIR)/%.rst: $(YNL_YAML_DIR)/%.yaml $(YNL_TOOL)
-	$(Q)$(YNL_TOOL) -i $< -o $@
-
-htmldocs texinfodocs latexdocs epubdocs xmldocs: $(YNL_INDEX)
-
 htmldocs:
 	@$(srctree)/scripts/sphinx-pre-install --version-check
 	@+$(foreach var,$(SPHINXDIRS),$(call loop_cmd,sphinx,html,$(var),,$(var)))
@@ -186,7 +170,6 @@ refcheckdocs:
 	$(Q)cd $(srctree);scripts/documentation-file-ref-check
 
 cleandocs:
-	$(Q)rm -f $(YNL_INDEX) $(YNL_RST_FILES)
 	$(Q)rm -rf $(BUILDDIR)
 	$(Q)$(MAKE) BUILDDIR=$(abspath $(BUILDDIR)) $(build)=Documentation/userspace-api/media clean
 
diff --git a/Documentation/conf.py b/Documentation/conf.py
index e887c1b786a4..a17940ac730f 100644
--- a/Documentation/conf.py
+++ b/Documentation/conf.py
@@ -25,7 +25,7 @@ include_patterns = ['**.rst']
 exclude_patterns = []
 
 # List of patterns that contain directory names in glob format.
-dyn_include_patterns = []
+dyn_include_patterns = ['netlink/specs/*.yaml']
 dyn_exclude_patterns = ['output']
 
 # Properly handle include/exclude patterns
@@ -93,7 +93,7 @@ needs_sphinx = '3.4.3'
 extensions = ['kerneldoc', 'rstFlatTable', 'kernel_include',
               'kfigure', 'sphinx.ext.ifconfig', 'automarkup',
               'maintainers_include', 'sphinx.ext.autosectionlabel',
-              'kernel_abi', 'kernel_feat', 'translations']
+              'kernel_abi', 'kernel_feat', 'translations', 'parser_yaml']
 
 # Since Sphinx version 3, the C function parser is more pedantic with regards
 # to type checking. Due to that, having macros at c:function cause problems.
@@ -191,10 +191,11 @@ else:
 # Add any paths that contain templates here, relative to this directory.
 templates_path = ['sphinx/templates']
 
-# The suffix(es) of source filenames.
-# You can specify multiple suffix as a list of string:
-# source_suffix = ['.rst', '.md']
-source_suffix = '.rst'
+# The suffixes of source filenames that will be automatically parsed
+source_suffix = {
+        '.rst': 'restructuredtext',
+        '.yaml': 'yaml',
+}
 
 # The encoding of source files.
 #source_encoding = 'utf-8-sig'
diff --git a/Documentation/networking/index.rst b/Documentation/networking/index.rst
index ac90b82f3ce9..b7a4969e9bc9 100644
--- a/Documentation/networking/index.rst
+++ b/Documentation/networking/index.rst
@@ -57,7 +57,7 @@ Contents:
    filter
    generic-hdlc
    generic_netlink
-   netlink_spec/index
+   ../netlink/specs/index
    gen_stats
    gtp
    ila
diff --git a/Documentation/networking/netlink_spec/readme.txt b/Documentation/networking/netlink_spec/readme.txt
deleted file mode 100644
index 030b44aca4e6..000000000000
--- a/Documentation/networking/netlink_spec/readme.txt
+++ /dev/null
@@ -1,4 +0,0 @@
-SPDX-License-Identifier: GPL-2.0
-
-This file is populated during the build of the documentation (htmldocs) by the
-tools/net/ynl/pyynl/ynl_gen_rst.py script.
-- 
2.49.0



* [PATCH v5 12/15] docs: uapi: netlink: update netlink specs link
  2025-06-17  8:01 [PATCH v5 00/15] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (10 preceding siblings ...)
  2025-06-17  8:02 ` [PATCH v5 11/15] docs: use parser_yaml extension to handle " Mauro Carvalho Chehab
@ 2025-06-17  8:02 ` Mauro Carvalho Chehab
  2025-06-17 12:46   ` Donald Hunter
  2025-06-17  8:02 ` [PATCH v5 13/15] tools: ynl_gen_rst.py: drop support for generating index files Mauro Carvalho Chehab
                   ` (3 subsequent siblings)
  15 siblings, 1 reply; 42+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-17  8:02 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

With the recent parser_yaml extension and the removal of the
auto-generated ReST source files, the location of the netlink
specs changed.

Update the uAPI documentation accordingly.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 Documentation/userspace-api/netlink/index.rst | 2 +-
 Documentation/userspace-api/netlink/specs.rst | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/Documentation/userspace-api/netlink/index.rst b/Documentation/userspace-api/netlink/index.rst
index c1b6765cc963..83ae25066591 100644
--- a/Documentation/userspace-api/netlink/index.rst
+++ b/Documentation/userspace-api/netlink/index.rst
@@ -18,4 +18,4 @@ Netlink documentation for users.
 
 See also:
  - :ref:`Documentation/core-api/netlink.rst <kernel_netlink>`
- - :ref:`Documentation/networking/netlink_spec/index.rst <specs>`
+ - :ref:`Documentation/netlink/specs/index.rst <specs>`
diff --git a/Documentation/userspace-api/netlink/specs.rst b/Documentation/userspace-api/netlink/specs.rst
index 1b50d97d8d7c..debb4bfca5c4 100644
--- a/Documentation/userspace-api/netlink/specs.rst
+++ b/Documentation/userspace-api/netlink/specs.rst
@@ -15,7 +15,7 @@ kernel headers directly.
 Internally kernel uses the YAML specs to generate:
 
  - the C uAPI header
- - documentation of the protocol as a ReST file - see :ref:`Documentation/networking/netlink_spec/index.rst <specs>`
+ - documentation of the protocol as a ReST file - see :ref:`Documentation/netlink/specs/index.rst <specs>`
  - policy tables for input attribute validation
  - operation tables
 
-- 
2.49.0



* [PATCH v5 13/15] tools: ynl_gen_rst.py: drop support for generating index files
  2025-06-17  8:01 [PATCH v5 00/15] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (11 preceding siblings ...)
  2025-06-17  8:02 ` [PATCH v5 12/15] docs: uapi: netlink: update netlink specs link Mauro Carvalho Chehab
@ 2025-06-17  8:02 ` Mauro Carvalho Chehab
  2025-06-17  8:02 ` [PATCH v5 14/15] docs: netlink: remove obsolete .gitignore from unused directory Mauro Carvalho Chehab
                   ` (2 subsequent siblings)
  15 siblings, 0 replies; 42+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-17  8:02 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

As we're now using an index file with a glob, there's no need
to generate index files anymore.
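
Purely for illustration (not part of this patch), a :glob: toctree
entry expands to every non-index document, sorted alphabetically,
loosely like this hypothetical sketch:

```python
import fnmatch

def expand_glob_toctree(docs, pattern="*", index="index"):
    """Loosely mimic how a Sphinx :glob: toctree entry expands:
    match every document name against the pattern, drop the index
    document itself, and sort the result alphabetically."""
    return sorted(d for d in fnmatch.filter(docs, pattern) if d != index)

# With specs like rt-link.yaml, tc.yaml, ... present, the glob
# picks them all up without touching index.rst:
print(expand_glob_toctree(["index", "tc", "rt-link", "ethtool"]))
# → ['ethtool', 'rt-link', 'tc']
```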

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 tools/net/ynl/pyynl/netlink_yml_parser.py | 26 -----------------------
 tools/net/ynl/pyynl/ynl_gen_rst.py        | 17 ---------------
 2 files changed, 43 deletions(-)

diff --git a/tools/net/ynl/pyynl/netlink_yml_parser.py b/tools/net/ynl/pyynl/netlink_yml_parser.py
index f71360f0ceb7..866551726723 100755
--- a/tools/net/ynl/pyynl/netlink_yml_parser.py
+++ b/tools/net/ynl/pyynl/netlink_yml_parser.py
@@ -358,29 +358,3 @@ class YnlDocGenerator:
             content = self.parse_yaml(yaml_data)
 
         return content
-
-    def generate_main_index_rst(self, output: str, index_dir: str) -> None:
-        """Generate the `networking_spec/index` content and write to the file"""
-        lines = []
-
-        lines.append(self.fmt.rst_header())
-        lines.append(self.fmt.rst_label("specs"))
-        lines.append(self.fmt.rst_title("Netlink Family Specifications"))
-        lines.append(self.fmt.rst_toctree(1))
-
-        index_fname = os.path.basename(output)
-        base, ext = os.path.splitext(index_fname)
-
-        if not index_dir:
-            index_dir = os.path.dirname(output)
-
-        logging.debug(f"Looking for {ext} files in %s", index_dir)
-        for filename in sorted(os.listdir(index_dir)):
-            if not filename.endswith(ext) or filename == index_fname:
-                continue
-            base, ext = os.path.splitext(filename)
-            lines.append(f"   {base}\n")
-
-        logging.debug("Writing an index file at %s", output)
-
-        return "".join(lines)
diff --git a/tools/net/ynl/pyynl/ynl_gen_rst.py b/tools/net/ynl/pyynl/ynl_gen_rst.py
index 5d29ce01c60c..9e756d3c403d 100755
--- a/tools/net/ynl/pyynl/ynl_gen_rst.py
+++ b/tools/net/ynl/pyynl/ynl_gen_rst.py
@@ -32,13 +32,9 @@ def parse_arguments() -> argparse.Namespace:
 
     parser.add_argument("-v", "--verbose", action="store_true")
     parser.add_argument("-o", "--output", help="Output file name")
-    parser.add_argument("-d", "--input_dir", help="YAML input directory")
 
     # Index and input are mutually exclusive
     group = parser.add_mutually_exclusive_group()
-    group.add_argument(
-        "-x", "--index", action="store_true", help="Generate the index page"
-    )
     group.add_argument("-i", "--input", help="YAML file name")
 
     args = parser.parse_args()
@@ -71,15 +67,6 @@ def write_to_rstfile(content: str, filename: str) -> None:
         rst_file.write(content)
 
 
-def write_index_rst(parser: YnlDocGenerator, output: str, index_dir: str) -> None:
-    """Generate the `networking_spec/index` content and write to the file"""
-
-    msg = parser.generate_main_index_rst(output, index_dir)
-
-    logging.debug("Writing an index file at %s", output)
-    write_to_rstfile(msg, output)
-
-
 def main() -> None:
     """Main function that reads the YAML files and generates the RST files"""
 
@@ -98,10 +85,6 @@ def main() -> None:
 
         write_to_rstfile(content, args.output)
 
-    if args.index:
-        # Generate the index RST file
-        write_index_rst(parser, args.output, args.input_dir)
-
 
 if __name__ == "__main__":
     main()
-- 
2.49.0



* [PATCH v5 14/15] docs: netlink: remove obsolete .gitignore from unused directory
  2025-06-17  8:01 [PATCH v5 00/15] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (12 preceding siblings ...)
  2025-06-17  8:02 ` [PATCH v5 13/15] tools: ynl_gen_rst.py: drop support for generating index files Mauro Carvalho Chehab
@ 2025-06-17  8:02 ` Mauro Carvalho Chehab
  2025-06-17 12:38   ` Donald Hunter
  2025-06-17  8:02 ` [PATCH v5 15/15] MAINTAINERS: add netlink_yml_parser.py to linux-doc Mauro Carvalho Chehab
  2025-06-17 12:58 ` [PATCH v5 00/15] Don't generate netlink .rst files inside $(srctree) Donald Hunter
  15 siblings, 1 reply; 42+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-17  8:02 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

The previous code generated rst source files
under Documentation/networking/netlink_spec/. With the
Sphinx YAML parser, this is now gone. So, stop ignoring
*.rst files inside the netlink specs directory.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 Documentation/networking/netlink_spec/.gitignore | 1 -
 1 file changed, 1 deletion(-)
 delete mode 100644 Documentation/networking/netlink_spec/.gitignore

diff --git a/Documentation/networking/netlink_spec/.gitignore b/Documentation/networking/netlink_spec/.gitignore
deleted file mode 100644
index 30d85567b592..000000000000
--- a/Documentation/networking/netlink_spec/.gitignore
+++ /dev/null
@@ -1 +0,0 @@
-*.rst
-- 
2.49.0



* [PATCH v5 15/15] MAINTAINERS: add netlink_yml_parser.py to linux-doc
  2025-06-17  8:01 [PATCH v5 00/15] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (13 preceding siblings ...)
  2025-06-17  8:02 ` [PATCH v5 14/15] docs: netlink: remove obsolete .gitignore from unused directory Mauro Carvalho Chehab
@ 2025-06-17  8:02 ` Mauro Carvalho Chehab
  2025-06-17  9:50   ` Breno Leitao
  2025-06-17 12:58 ` [PATCH v5 00/15] Don't generate netlink .rst files inside $(srctree) Donald Hunter
  15 siblings, 1 reply; 42+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-17  8:02 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	joel, linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz,
	stern

The documentation build depends on the parsing code
at ynl_gen_rst.py. Ensure that changes to it will be Cc'ed
to the linux-doc ML and maintainers by adding an entry for
it. This way, if a change there would affect the build,
or the minimal Python version required, doc developers
will know in advance.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 MAINTAINERS | 1 +
 1 file changed, 1 insertion(+)

diff --git a/MAINTAINERS b/MAINTAINERS
index a92290fffa16..caa3425e5755 100644
--- a/MAINTAINERS
+++ b/MAINTAINERS
@@ -7202,6 +7202,7 @@ F:	scripts/get_abi.py
 F:	scripts/kernel-doc*
 F:	scripts/lib/abi/*
 F:	scripts/lib/kdoc/*
+F:	tools/net/ynl/pyynl/netlink_yml_parser.py
 F:	scripts/sphinx-pre-install
 X:	Documentation/ABI/
 X:	Documentation/admin-guide/media/
-- 
2.49.0



* Re: [PATCH v5 09/15] tools: ynl_gen_rst.py: clanup coding style
  2025-06-17  8:02 ` [PATCH v5 09/15] tools: ynl_gen_rst.py: clanup coding style Mauro Carvalho Chehab
@ 2025-06-17  9:49   ` Breno Leitao
  2025-06-17  9:50   ` Breno Leitao
  2025-06-17 11:12   ` Donald Hunter
  2 siblings, 0 replies; 42+ messages in thread
From: Breno Leitao @ 2025-06-17  9:49 UTC (permalink / raw)
  To: Mauro Carvalho Chehab
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

On Tue, Jun 17, 2025 at 10:02:06AM +0200, Mauro Carvalho Chehab wrote:
> Cleanup some coding style issues pointed by pylint and flake8.
> 
> No functional changes.
> 
> Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>

Reviewed-by: Breno Leitao <leitao@debian.org>


* Re: [PATCH v5 09/15] tools: ynl_gen_rst.py: clanup coding style
  2025-06-17  8:02 ` [PATCH v5 09/15] tools: ynl_gen_rst.py: clanup coding style Mauro Carvalho Chehab
  2025-06-17  9:49   ` Breno Leitao
@ 2025-06-17  9:50   ` Breno Leitao
  2025-06-17 11:12   ` Donald Hunter
  2 siblings, 0 replies; 42+ messages in thread
From: Breno Leitao @ 2025-06-17  9:50 UTC (permalink / raw)
  To: Mauro Carvalho Chehab
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

On Tue, Jun 17, 2025 at 10:02:06AM +0200, Mauro Carvalho Chehab wrote:
> Cleanup some coding style issues pointed by pylint and flake8.

Sorry, forgot to mention that there is a typo in the commit title.


* Re: [PATCH v5 15/15] MAINTAINERS: add netlink_yml_parser.py to linux-doc
  2025-06-17  8:02 ` [PATCH v5 15/15] MAINTAINERS: add netlink_yml_parser.py to linux-doc Mauro Carvalho Chehab
@ 2025-06-17  9:50   ` Breno Leitao
  0 siblings, 0 replies; 42+ messages in thread
From: Breno Leitao @ 2025-06-17  9:50 UTC (permalink / raw)
  To: Mauro Carvalho Chehab
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

On Tue, Jun 17, 2025 at 10:02:12AM +0200, Mauro Carvalho Chehab wrote:
> The documentation build depends on the parsing code
> at ynl_gen_rst.py. Ensure that changes to it will be c/c
> to linux-doc ML and maintainers by adding an entry for
> it. This way, if a change there would affect the build,
> or the minimal version required for Python, doc developers
> may know in advance.
> 
> Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
>
Reviewed-by: Breno Leitao <leitao@debian.org>


* Re: [PATCH v5 02/15] docs: Makefile: disable check rules on make cleandocs
  2025-06-17  8:01 ` [PATCH v5 02/15] docs: Makefile: disable check rules on make cleandocs Mauro Carvalho Chehab
@ 2025-06-17  9:53   ` Breno Leitao
  2025-06-17 10:38   ` Donald Hunter
  1 sibling, 0 replies; 42+ messages in thread
From: Breno Leitao @ 2025-06-17  9:53 UTC (permalink / raw)
  To: Mauro Carvalho Chehab
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

On Tue, Jun 17, 2025 at 10:01:59AM +0200, Mauro Carvalho Chehab wrote:
> It doesn't make sense to check for missing ABI and documents
> when cleaning the tree.
> 
> Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>

Reviewed-by: Breno Leitao <leitao@debian.org>


* Re: [PATCH v5 01/15] docs: conf.py: properly handle include and exclude patterns
  2025-06-17  8:01 ` [PATCH v5 01/15] docs: conf.py: properly handle include and exclude patterns Mauro Carvalho Chehab
@ 2025-06-17 10:38   ` Donald Hunter
  2025-06-18  6:59     ` Mauro Carvalho Chehab
  2025-06-18  2:42   ` Akira Yokosawa
  1 sibling, 1 reply; 42+ messages in thread
From: Donald Hunter @ 2025-06-17 10:38 UTC (permalink / raw)
  To: Mauro Carvalho Chehab
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:

> When one does:
> 	make SPHINXDIRS="foo" htmldocs
>
> All patterns would be relative to Documentation/foo, which
> causes the include/exclude patterns like:
>
> 	include_patterns = [
> 		...
> 		f'foo/*.{ext}',
> 	]
>
> to break. This is not what it is expected. Address it by
> adding a logic to dynamically adjust the pattern when
> SPHINXDIRS is used.
>
> That allows adding parsers for other file types.
>
> Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>

Reviewed-by: Donald Hunter <donald.hunter@gmail.com>


* Re: [PATCH v5 02/15] docs: Makefile: disable check rules on make cleandocs
  2025-06-17  8:01 ` [PATCH v5 02/15] docs: Makefile: disable check rules on make cleandocs Mauro Carvalho Chehab
  2025-06-17  9:53   ` Breno Leitao
@ 2025-06-17 10:38   ` Donald Hunter
  1 sibling, 0 replies; 42+ messages in thread
From: Donald Hunter @ 2025-06-17 10:38 UTC (permalink / raw)
  To: Mauro Carvalho Chehab
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:

> It doesn't make sense to check for missing ABI and documents
> when cleaning the tree.
>
> Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>

Reviewed-by: Donald Hunter <donald.hunter@gmail.com>


* Re: [PATCH v5 03/15] tools: ynl_gen_rst.py: create a top-level reference
  2025-06-17  8:02 ` [PATCH v5 03/15] tools: ynl_gen_rst.py: create a top-level reference Mauro Carvalho Chehab
@ 2025-06-17 10:40   ` Donald Hunter
  0 siblings, 0 replies; 42+ messages in thread
From: Donald Hunter @ 2025-06-17 10:40 UTC (permalink / raw)
  To: Mauro Carvalho Chehab
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:

> Currently, rt documents are referred with:
>
> Documentation/userspace-api/netlink/netlink-raw.rst: :doc:`rt-link<../../networking/netlink_spec/rt-link>`
> Documentation/userspace-api/netlink/netlink-raw.rst: :doc:`tc<../../networking/netlink_spec/tc>`
> Documentation/userspace-api/netlink/netlink-raw.rst: :doc:`tc<../../networking/netlink_spec/tc>`
>
> that's hard to maintain, and may break if we change the way
> rst files are generated from yaml. Better to use instead a
> reference for the netlink family.
>
> So, add a netlink-<foo> reference to all generated docs.
>
> Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>

I would still prefer to see patches 3 and 4 merged since they are part
of the same fix.

> ---
>  tools/net/ynl/pyynl/ynl_gen_rst.py | 5 +++--
>  1 file changed, 3 insertions(+), 2 deletions(-)
>
> diff --git a/tools/net/ynl/pyynl/ynl_gen_rst.py b/tools/net/ynl/pyynl/ynl_gen_rst.py
> index 0cb6348e28d3..7bfb8ceeeefc 100755
> --- a/tools/net/ynl/pyynl/ynl_gen_rst.py
> +++ b/tools/net/ynl/pyynl/ynl_gen_rst.py
> @@ -314,10 +314,11 @@ def parse_yaml(obj: Dict[str, Any]) -> str:
>  
>      # Main header
>  
> -    lines.append(rst_header())
> -
>      family = obj['name']
>  
> +    lines.append(rst_header())
> +    lines.append(rst_label("netlink-" + family))
> +
>      title = f"Family ``{family}`` netlink specification"
>      lines.append(rst_title(title))
>      lines.append(rst_paragraph(".. contents:: :depth: 3\n"))


* Re: [PATCH v5 08/15] docs: netlink: index.rst: add a netlink index file
  2025-06-17  8:02 ` [PATCH v5 08/15] docs: netlink: index.rst: add a netlink index file Mauro Carvalho Chehab
@ 2025-06-17 10:43   ` Donald Hunter
  0 siblings, 0 replies; 42+ messages in thread
From: Donald Hunter @ 2025-06-17 10:43 UTC (permalink / raw)
  To: Mauro Carvalho Chehab
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:

> Instead of generating the index file, use glob to automatically
> include all data from yaml.
>
> Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>

Reviewed-by: Donald Hunter <donald.hunter@gmail.com>


* Re: [PATCH v5 05/15] tools: ynl_gen_rst.py: make the index parser more generic
  2025-06-17  8:02 ` [PATCH v5 05/15] tools: ynl_gen_rst.py: make the index parser more generic Mauro Carvalho Chehab
@ 2025-06-17 10:55   ` Donald Hunter
  2025-06-17 11:59   ` Simon Horman
  1 sibling, 0 replies; 42+ messages in thread
From: Donald Hunter @ 2025-06-17 10:55 UTC (permalink / raw)
  To: Mauro Carvalho Chehab
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:

> It is not a good practice to store build-generated files
> inside $(srctree), as one may be using O=<BUILDDIR> and even
> have the Kernel on a read-only directory.
>
> Change the YAML generation for netlink files to allow it
> to parse data based on the source or on the object tree.
>
> Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>

Please drop this patch. It introduces changes that affect interim
patches and the functionality all gets dropped later in the series.

> ---
>  tools/net/ynl/pyynl/ynl_gen_rst.py | 22 ++++++++++++++++------
>  1 file changed, 16 insertions(+), 6 deletions(-)
>
> diff --git a/tools/net/ynl/pyynl/ynl_gen_rst.py b/tools/net/ynl/pyynl/ynl_gen_rst.py
> index 7bfb8ceeeefc..b1e5acafb998 100755
> --- a/tools/net/ynl/pyynl/ynl_gen_rst.py
> +++ b/tools/net/ynl/pyynl/ynl_gen_rst.py
> @@ -365,6 +365,7 @@ def parse_arguments() -> argparse.Namespace:
>  
>      parser.add_argument("-v", "--verbose", action="store_true")
>      parser.add_argument("-o", "--output", help="Output file name")
> +    parser.add_argument("-d", "--input_dir", help="YAML input directory")
>  
>      # Index and input are mutually exclusive
>      group = parser.add_mutually_exclusive_group()
> @@ -405,11 +406,14 @@ def write_to_rstfile(content: str, filename: str) -> None:
>      """Write the generated content into an RST file"""
>      logging.debug("Saving RST file to %s", filename)
>  
> +    dir = os.path.dirname(filename)
> +    os.makedirs(dir, exist_ok=True)
> +
>      with open(filename, "w", encoding="utf-8") as rst_file:
>          rst_file.write(content)
>  
>  
> -def generate_main_index_rst(output: str) -> None:
> +def generate_main_index_rst(output: str, index_dir: str) -> None:
>      """Generate the `networking_spec/index` content and write to the file"""
>      lines = []
>  
> @@ -418,12 +422,18 @@ def generate_main_index_rst(output: str) -> None:
>      lines.append(rst_title("Netlink Family Specifications"))
>      lines.append(rst_toctree(1))
>  
> -    index_dir = os.path.dirname(output)
> -    logging.debug("Looking for .rst files in %s", index_dir)
> +    index_fname = os.path.basename(output)
> +    base, ext = os.path.splitext(index_fname)
> +
> +    if not index_dir:
> +        index_dir = os.path.dirname(output)
> +
> +    logging.debug(f"Looking for {ext} files in %s", index_dir)
>      for filename in sorted(os.listdir(index_dir)):
> -        if not filename.endswith(".rst") or filename == "index.rst":
> +        if not filename.endswith(ext) or filename == index_fname:
>              continue
> -        lines.append(f"   {filename.replace('.rst', '')}\n")
> +        base, ext = os.path.splitext(filename)
> +        lines.append(f"   {base}\n")
>  
>      logging.debug("Writing an index file at %s", output)
>      write_to_rstfile("".join(lines), output)
> @@ -447,7 +457,7 @@ def main() -> None:
>  
>      if args.index:
>          # Generate the index RST file
> -        generate_main_index_rst(args.output)
> +        generate_main_index_rst(args.output, args.input_dir)
>  
>  
>  if __name__ == "__main__":


* Re: [PATCH v5 06/15] tools: ynl_gen_rst.py: Split library from command line tool
  2025-06-17  8:02 ` [PATCH v5 06/15] tools: ynl_gen_rst.py: Split library from command line tool Mauro Carvalho Chehab
@ 2025-06-17 11:08   ` Donald Hunter
  0 siblings, 0 replies; 42+ messages in thread
From: Donald Hunter @ 2025-06-17 11:08 UTC (permalink / raw)
  To: Mauro Carvalho Chehab
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:

> As we'll be using the Netlink specs parser inside a Sphinx
> extension, move the library part from the command line parser.
>
> No functional changes.
>
> Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
> ---
>  tools/net/ynl/pyynl/netlink_yml_parser.py | 391 ++++++++++++++++++++++
>  tools/net/ynl/pyynl/ynl_gen_rst.py        | 374 +--------------------
>  2 files changed, 401 insertions(+), 364 deletions(-)
>  create mode 100755 tools/net/ynl/pyynl/netlink_yml_parser.py

Sorry, the directory should be tools/net/ynl/pyynl/lib

Can you also name this something like doc_generator.py? The parsing
is incidental; the primary function is ReST doc generation.

[...]

> +# Main functions
> +# ==============
> +
> +
> +def parse_yaml_file(filename: str) -> str:
> +    """Transform the YAML specified by filename into an RST-formatted string"""
> +    with open(filename, "r", encoding="utf-8") as spec_file:
> +        yaml_data = yaml.safe_load(spec_file)
> +        content = parse_yaml(yaml_data)
> +
> +    return content
> +
> +
> +def generate_main_index_rst(output: str, index_dir: str) -> str:

Can you leave this function in ynl_gen_rst.py because it is not required
in the doc generator library and anyway gets removed later in the
series.

> +    """Generate the `networking_spec/index` content and write to the file"""
> +    lines = []
> +
> +    lines.append(rst_header())
> +    lines.append(rst_label("specs"))
> +    lines.append(rst_title("Netlink Family Specifications"))
> +    lines.append(rst_toctree(1))
> +
> +    index_fname = os.path.basename(output)
> +    base, ext = os.path.splitext(index_fname)
> +
> +    if not index_dir:
> +        index_dir = os.path.dirname(output)
> +
> +    logging.debug(f"Looking for {ext} files in %s", index_dir)
> +    for filename in sorted(os.listdir(index_dir)):
> +        if not filename.endswith(ext) or filename == index_fname:
> +            continue
> +        base, ext = os.path.splitext(filename)
> +        lines.append(f"   {base}\n")
> +
> +    return "".join(lines), output
> diff --git a/tools/net/ynl/pyynl/ynl_gen_rst.py b/tools/net/ynl/pyynl/ynl_gen_rst.py
> index b1e5acafb998..38dafe3d9179 100755
> --- a/tools/net/ynl/pyynl/ynl_gen_rst.py
> +++ b/tools/net/ynl/pyynl/ynl_gen_rst.py
> @@ -18,345 +18,17 @@
>          3) Main function and small helpers
>  """
>  
> -from typing import Any, Dict, List
>  import os.path
>  import sys
>  import argparse
>  import logging
> -import yaml
>  
> +LIB_DIR = "../../../../scripts/lib"

Leftover from previous version?

> +SRC_DIR = os.path.dirname(os.path.realpath(__file__))
>  
> -SPACE_PER_LEVEL = 4
> +sys.path.insert(0, os.path.join(SRC_DIR, LIB_DIR))

This is what we have in the other scripts in tools/net/ynl/pyynl:

sys.path.append(pathlib.Path(__file__).resolve().parent.as_posix())
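
The convention quoted above boils down to the following (a minimal
sketch; the follow-on import is illustrative of how pyynl modules then
resolve, not a prescription):

```python
import sys
import pathlib

# Make the script's own directory importable, so sibling modules in
# pyynl/ resolve no matter what the current working directory is.
SCRIPT_DIR = pathlib.Path(__file__).resolve().parent
sys.path.append(SCRIPT_DIR.as_posix())
```

With this in place a sibling module can be imported directly, e.g.
"from netlink_yml_parser import parse_yaml_file", with no hard-coded
relative path such as "../../../../scripts/lib".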

>  
> -
> -# RST Formatters
> -# ==============
> -def headroom(level: int) -> str:
> -    """Return space to format"""
> -    return " " * (level * SPACE_PER_LEVEL)
> -
> -
> -def bold(text: str) -> str:
> -    """Format bold text"""
> -    return f"**{text}**"
> -
> -
> -def inline(text: str) -> str:
> -    """Format inline text"""
> -    return f"``{text}``"
> -
> -
> -def sanitize(text: str) -> str:
> -    """Remove newlines and multiple spaces"""
> -    # This is useful for some fields that are spread across multiple lines
> -    return str(text).replace("\n", " ").strip()
> -
> -
> -def rst_fields(key: str, value: str, level: int = 0) -> str:
> -    """Return a RST formatted field"""
> -    return headroom(level) + f":{key}: {value}"
> -
> -
> -def rst_definition(key: str, value: Any, level: int = 0) -> str:
> -    """Format a single rst definition"""
> -    return headroom(level) + key + "\n" + headroom(level + 1) + str(value)
> -
> -
> -def rst_paragraph(paragraph: str, level: int = 0) -> str:
> -    """Return a formatted paragraph"""
> -    return headroom(level) + paragraph
> -
> -
> -def rst_bullet(item: str, level: int = 0) -> str:
> -    """Return a formatted a bullet"""
> -    return headroom(level) + f"- {item}"
> -
> -
> -def rst_subsection(title: str) -> str:
> -    """Add a sub-section to the document"""
> -    return f"{title}\n" + "-" * len(title)
> -
> -
> -def rst_subsubsection(title: str) -> str:
> -    """Add a sub-sub-section to the document"""
> -    return f"{title}\n" + "~" * len(title)
> -
> -
> -def rst_section(namespace: str, prefix: str, title: str) -> str:
> -    """Add a section to the document"""
> -    return f".. _{namespace}-{prefix}-{title}:\n\n{title}\n" + "=" * len(title)
> -
> -
> -def rst_subtitle(title: str) -> str:
> -    """Add a subtitle to the document"""
> -    return "\n" + "-" * len(title) + f"\n{title}\n" + "-" * len(title) + "\n\n"
> -
> -
> -def rst_title(title: str) -> str:
> -    """Add a title to the document"""
> -    return "=" * len(title) + f"\n{title}\n" + "=" * len(title) + "\n\n"
> -
> -
> -def rst_list_inline(list_: List[str], level: int = 0) -> str:
> -    """Format a list using inlines"""
> -    return headroom(level) + "[" + ", ".join(inline(i) for i in list_) + "]"
> -
> -
> -def rst_ref(namespace: str, prefix: str, name: str) -> str:
> -    """Add a hyperlink to the document"""
> -    mappings = {'enum': 'definition',
> -                'fixed-header': 'definition',
> -                'nested-attributes': 'attribute-set',
> -                'struct': 'definition'}
> -    if prefix in mappings:
> -        prefix = mappings[prefix]
> -    return f":ref:`{namespace}-{prefix}-{name}`"
> -
> -
> -def rst_header() -> str:
> -    """The headers for all the auto generated RST files"""
> -    lines = []
> -
> -    lines.append(rst_paragraph(".. SPDX-License-Identifier: GPL-2.0"))
> -    lines.append(rst_paragraph(".. NOTE: This document was auto-generated.\n\n"))
> -
> -    return "\n".join(lines)
> -
> -
> -def rst_toctree(maxdepth: int = 2) -> str:
> -    """Generate a toctree RST primitive"""
> -    lines = []
> -
> -    lines.append(".. toctree::")
> -    lines.append(f"   :maxdepth: {maxdepth}\n\n")
> -
> -    return "\n".join(lines)
> -
> -
> -def rst_label(title: str) -> str:
> -    """Return a formatted label"""
> -    return f".. _{title}:\n\n"
> -
> -
> -# Parsers
> -# =======
> -
> -
> -def parse_mcast_group(mcast_group: List[Dict[str, Any]]) -> str:
> -    """Parse 'multicast' group list and return a formatted string"""
> -    lines = []
> -    for group in mcast_group:
> -        lines.append(rst_bullet(group["name"]))
> -
> -    return "\n".join(lines)
> -
> -
> -def parse_do(do_dict: Dict[str, Any], level: int = 0) -> str:
> -    """Parse 'do' section and return a formatted string"""
> -    lines = []
> -    for key in do_dict.keys():
> -        lines.append(rst_paragraph(bold(key), level + 1))
> -        if key in ['request', 'reply']:
> -            lines.append(parse_do_attributes(do_dict[key], level + 1) + "\n")
> -        else:
> -            lines.append(headroom(level + 2) + do_dict[key] + "\n")
> -
> -    return "\n".join(lines)
> -
> -
> -def parse_do_attributes(attrs: Dict[str, Any], level: int = 0) -> str:
> -    """Parse 'attributes' section"""
> -    if "attributes" not in attrs:
> -        return ""
> -    lines = [rst_fields("attributes", rst_list_inline(attrs["attributes"]), level + 1)]
> -
> -    return "\n".join(lines)
> -
> -
> -def parse_operations(operations: List[Dict[str, Any]], namespace: str) -> str:
> -    """Parse operations block"""
> -    preprocessed = ["name", "doc", "title", "do", "dump", "flags"]
> -    linkable = ["fixed-header", "attribute-set"]
> -    lines = []
> -
> -    for operation in operations:
> -        lines.append(rst_section(namespace, 'operation', operation["name"]))
> -        lines.append(rst_paragraph(operation["doc"]) + "\n")
> -
> -        for key in operation.keys():
> -            if key in preprocessed:
> -                # Skip the special fields
> -                continue
> -            value = operation[key]
> -            if key in linkable:
> -                value = rst_ref(namespace, key, value)
> -            lines.append(rst_fields(key, value, 0))
> -        if 'flags' in operation:
> -            lines.append(rst_fields('flags', rst_list_inline(operation['flags'])))
> -
> -        if "do" in operation:
> -            lines.append(rst_paragraph(":do:", 0))
> -            lines.append(parse_do(operation["do"], 0))
> -        if "dump" in operation:
> -            lines.append(rst_paragraph(":dump:", 0))
> -            lines.append(parse_do(operation["dump"], 0))
> -
> -        # New line after fields
> -        lines.append("\n")
> -
> -    return "\n".join(lines)
> -
> -
> -def parse_entries(entries: List[Dict[str, Any]], level: int) -> str:
> -    """Parse a list of entries"""
> -    ignored = ["pad"]
> -    lines = []
> -    for entry in entries:
> -        if isinstance(entry, dict):
> -            # entries could be a list or a dictionary
> -            field_name = entry.get("name", "")
> -            if field_name in ignored:
> -                continue
> -            type_ = entry.get("type")
> -            if type_:
> -                field_name += f" ({inline(type_)})"
> -            lines.append(
> -                rst_fields(field_name, sanitize(entry.get("doc", "")), level)
> -            )
> -        elif isinstance(entry, list):
> -            lines.append(rst_list_inline(entry, level))
> -        else:
> -            lines.append(rst_bullet(inline(sanitize(entry)), level))
> -
> -    lines.append("\n")
> -    return "\n".join(lines)
> -
> -
> -def parse_definitions(defs: Dict[str, Any], namespace: str) -> str:
> -    """Parse definitions section"""
> -    preprocessed = ["name", "entries", "members"]
> -    ignored = ["render-max"]  # This is not printed
> -    lines = []
> -
> -    for definition in defs:
> -        lines.append(rst_section(namespace, 'definition', definition["name"]))
> -        for k in definition.keys():
> -            if k in preprocessed + ignored:
> -                continue
> -            lines.append(rst_fields(k, sanitize(definition[k]), 0))
> -
> -        # Field list needs to finish with a new line
> -        lines.append("\n")
> -        if "entries" in definition:
> -            lines.append(rst_paragraph(":entries:", 0))
> -            lines.append(parse_entries(definition["entries"], 1))
> -        if "members" in definition:
> -            lines.append(rst_paragraph(":members:", 0))
> -            lines.append(parse_entries(definition["members"], 1))
> -
> -    return "\n".join(lines)
> -
> -
> -def parse_attr_sets(entries: List[Dict[str, Any]], namespace: str) -> str:
> -    """Parse attribute from attribute-set"""
> -    preprocessed = ["name", "type"]
> -    linkable = ["enum", "nested-attributes", "struct", "sub-message"]
> -    ignored = ["checks"]
> -    lines = []
> -
> -    for entry in entries:
> -        lines.append(rst_section(namespace, 'attribute-set', entry["name"]))
> -        for attr in entry["attributes"]:
> -            type_ = attr.get("type")
> -            attr_line = attr["name"]
> -            if type_:
> -                # Add the attribute type in the same line
> -                attr_line += f" ({inline(type_)})"
> -
> -            lines.append(rst_subsubsection(attr_line))
> -
> -            for k in attr.keys():
> -                if k in preprocessed + ignored:
> -                    continue
> -                if k in linkable:
> -                    value = rst_ref(namespace, k, attr[k])
> -                else:
> -                    value = sanitize(attr[k])
> -                lines.append(rst_fields(k, value, 0))
> -            lines.append("\n")
> -
> -    return "\n".join(lines)
> -
> -
> -def parse_sub_messages(entries: List[Dict[str, Any]], namespace: str) -> str:
> -    """Parse sub-message definitions"""
> -    lines = []
> -
> -    for entry in entries:
> -        lines.append(rst_section(namespace, 'sub-message', entry["name"]))
> -        for fmt in entry["formats"]:
> -            value = fmt["value"]
> -
> -            lines.append(rst_bullet(bold(value)))
> -            for attr in ['fixed-header', 'attribute-set']:
> -                if attr in fmt:
> -                    lines.append(rst_fields(attr,
> -                                            rst_ref(namespace, attr, fmt[attr]),
> -                                            1))
> -            lines.append("\n")
> -
> -    return "\n".join(lines)
> -
> -
> -def parse_yaml(obj: Dict[str, Any]) -> str:
> -    """Format the whole YAML into a RST string"""
> -    lines = []
> -
> -    # Main header
> -
> -    family = obj['name']
> -
> -    lines.append(rst_header())
> -    lines.append(rst_label("netlink-" + family))
> -
> -    title = f"Family ``{family}`` netlink specification"
> -    lines.append(rst_title(title))
> -    lines.append(rst_paragraph(".. contents:: :depth: 3\n"))
> -
> -    if "doc" in obj:
> -        lines.append(rst_subtitle("Summary"))
> -        lines.append(rst_paragraph(obj["doc"], 0))
> -
> -    # Operations
> -    if "operations" in obj:
> -        lines.append(rst_subtitle("Operations"))
> -        lines.append(parse_operations(obj["operations"]["list"], family))
> -
> -    # Multicast groups
> -    if "mcast-groups" in obj:
> -        lines.append(rst_subtitle("Multicast groups"))
> -        lines.append(parse_mcast_group(obj["mcast-groups"]["list"]))
> -
> -    # Definitions
> -    if "definitions" in obj:
> -        lines.append(rst_subtitle("Definitions"))
> -        lines.append(parse_definitions(obj["definitions"], family))
> -
> -    # Attributes set
> -    if "attribute-sets" in obj:
> -        lines.append(rst_subtitle("Attribute sets"))
> -        lines.append(parse_attr_sets(obj["attribute-sets"], family))
> -
> -    # Sub-messages
> -    if "sub-messages" in obj:
> -        lines.append(rst_subtitle("Sub-messages"))
> -        lines.append(parse_sub_messages(obj["sub-messages"], family))
> -
> -    return "\n".join(lines)
> -
> -
> -# Main functions
> -# ==============
> +from netlink_yml_parser import parse_yaml_file, generate_main_index_rst
>  
>  
>  def parse_arguments() -> argparse.Namespace:
> @@ -393,50 +65,24 @@ def parse_arguments() -> argparse.Namespace:
>      return args
>  
>  
> -def parse_yaml_file(filename: str) -> str:
> -    """Transform the YAML specified by filename into an RST-formatted string"""
> -    with open(filename, "r", encoding="utf-8") as spec_file:
> -        yaml_data = yaml.safe_load(spec_file)
> -        content = parse_yaml(yaml_data)
> -
> -    return content
> -
> -
>  def write_to_rstfile(content: str, filename: str) -> None:
>      """Write the generated content into an RST file"""
>      logging.debug("Saving RST file to %s", filename)
>  
> -    dir = os.path.dirname(filename)
> -    os.makedirs(dir, exist_ok=True)
> +    directory = os.path.dirname(filename)
> +    os.makedirs(directory, exist_ok=True)
>  
>      with open(filename, "w", encoding="utf-8") as rst_file:
>          rst_file.write(content)
>  
>  
> -def generate_main_index_rst(output: str, index_dir: str) -> None:
> +def write_index_rst(output: str, index_dir: str) -> None:

This change can also be dropped, to simplify the series.

>      """Generate the `networking_spec/index` content and write to the file"""
> -    lines = []
>  
> -    lines.append(rst_header())
> -    lines.append(rst_label("specs"))
> -    lines.append(rst_title("Netlink Family Specifications"))
> -    lines.append(rst_toctree(1))
> -
> -    index_fname = os.path.basename(output)
> -    base, ext = os.path.splitext(index_fname)
> -
> -    if not index_dir:
> -        index_dir = os.path.dirname(output)
> -
> -    logging.debug(f"Looking for {ext} files in %s", index_dir)
> -    for filename in sorted(os.listdir(index_dir)):
> -        if not filename.endswith(ext) or filename == index_fname:
> -            continue
> -        base, ext = os.path.splitext(filename)
> -        lines.append(f"   {base}\n")
> +    msg = generate_main_index_rst(output, index_dir)
>  
>      logging.debug("Writing an index file at %s", output)
> -    write_to_rstfile("".join(lines), output)
> +    write_to_rstfile(msg, output)
>  
>  
>  def main() -> None:
> @@ -457,7 +103,7 @@ def main() -> None:
>  
>      if args.index:
>          # Generate the index RST file
> -        generate_main_index_rst(args.output, args.input_dir)
> +        write_index_rst(args.output, args.input_dir)

This change can also be dropped, to simplify the series.

>  
>  
>  if __name__ == "__main__":


* Re: [PATCH v5 09/15] tools: ynl_gen_rst.py: clanup coding style
  2025-06-17  8:02 ` [PATCH v5 09/15] tools: ynl_gen_rst.py: clanup coding style Mauro Carvalho Chehab
  2025-06-17  9:49   ` Breno Leitao
  2025-06-17  9:50   ` Breno Leitao
@ 2025-06-17 11:12   ` Donald Hunter
  2 siblings, 0 replies; 42+ messages in thread
From: Donald Hunter @ 2025-06-17 11:12 UTC (permalink / raw)
  To: Mauro Carvalho Chehab
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:

> Cleanup some coding style issues pointed by pylint and flake8.
>
> No functional changes.
>
> Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>

nit: typo in the subject line: clanup -> cleanup


* Re: [PATCH v5 05/15] tools: ynl_gen_rst.py: make the index parser more generic
  2025-06-17  8:02 ` [PATCH v5 05/15] tools: ynl_gen_rst.py: make the index parser more generic Mauro Carvalho Chehab
  2025-06-17 10:55   ` Donald Hunter
@ 2025-06-17 11:59   ` Simon Horman
  2025-06-18  6:57     ` Mauro Carvalho Chehab
  1 sibling, 1 reply; 42+ messages in thread
From: Simon Horman @ 2025-06-17 11:59 UTC (permalink / raw)
  To: Mauro Carvalho Chehab
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

On Tue, Jun 17, 2025 at 10:02:02AM +0200, Mauro Carvalho Chehab wrote:
> It is not a good practice to store build-generated files
> inside $(srctree), as one may be using O=<BUILDDIR> and even
> have the Kernel on a read-only directory.
> 
> Change the YAML generation for netlink files to allow it
> to parse data based on the source or on the object tree.
> 
> Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
> ---
>  tools/net/ynl/pyynl/ynl_gen_rst.py | 22 ++++++++++++++++------
>  1 file changed, 16 insertions(+), 6 deletions(-)
> 
> diff --git a/tools/net/ynl/pyynl/ynl_gen_rst.py b/tools/net/ynl/pyynl/ynl_gen_rst.py
> index 7bfb8ceeeefc..b1e5acafb998 100755
> --- a/tools/net/ynl/pyynl/ynl_gen_rst.py
> +++ b/tools/net/ynl/pyynl/ynl_gen_rst.py
> @@ -365,6 +365,7 @@ def parse_arguments() -> argparse.Namespace:
>  
>      parser.add_argument("-v", "--verbose", action="store_true")
>      parser.add_argument("-o", "--output", help="Output file name")
> +    parser.add_argument("-d", "--input_dir", help="YAML input directory")
>  
>      # Index and input are mutually exclusive
>      group = parser.add_mutually_exclusive_group()
> @@ -405,11 +406,14 @@ def write_to_rstfile(content: str, filename: str) -> None:
>      """Write the generated content into an RST file"""
>      logging.debug("Saving RST file to %s", filename)
>  
> +    dir = os.path.dirname(filename)
> +    os.makedirs(dir, exist_ok=True)
> +
>      with open(filename, "w", encoding="utf-8") as rst_file:
>          rst_file.write(content)

Hi Mauro,

With this patch applied I see the following, which did not happen before.

$ make -C tools/net/ynl
...
Traceback (most recent call last):
  File ".../tools/net/ynl/generated/../pyynl/ynl_gen_rst.py", line 464, in <module>
    main()
    ~~~~^^
  File ".../tools/net/ynl/generated/../pyynl/ynl_gen_rst.py", line 456, in main
    write_to_rstfile(content, args.output)
    ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^
  File ".../tools/net/ynl/generated/../pyynl/ynl_gen_rst.py", line 410, in write_to_rstfile
    os.makedirs(dir, exist_ok=True)
    ~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^
  File "<frozen os>", line 227, in makedirs
FileNotFoundError: [Errno 2] No such file or directory: ''
make[1]: *** [Makefile:55: conntrack.rst] Error 1
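
The failure comes from os.path.dirname() returning an empty string for
a bare output file name such as "conntrack.rst", and os.makedirs("")
raising FileNotFoundError. A minimal guard (a sketch of the problem,
not necessarily the fix eventually applied in the series) would be:

```python
import os

def write_to_rstfile(content: str, filename: str) -> None:
    """Write content into an RST file, creating parent dirs if needed."""
    directory = os.path.dirname(filename)
    # dirname() is "" when filename has no directory component;
    # makedirs("") raises FileNotFoundError, so skip it in that case.
    if directory:
        os.makedirs(directory, exist_ok=True)

    with open(filename, "w", encoding="utf-8") as rst_file:
        rst_file.write(content)
```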

-- 
pw-bot: cr


* Re: [PATCH v5 10/15] docs: sphinx: add a parser for yaml files for Netlink specs
  2025-06-17  8:02 ` [PATCH v5 10/15] docs: sphinx: add a parser for yaml files for Netlink specs Mauro Carvalho Chehab
@ 2025-06-17 12:35   ` Donald Hunter
  2025-06-17 13:40     ` Mauro Carvalho Chehab
  0 siblings, 1 reply; 42+ messages in thread
From: Donald Hunter @ 2025-06-17 12:35 UTC (permalink / raw)
  To: Mauro Carvalho Chehab
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:

> Add a simple sphinx.Parser to handle yaml files and add
> the code to handle Netlink specs. All other yaml files are
> ignored.
>
> The code was written in a way that parsing yaml for different
> subsystems and even for different parts of Netlink is easy.
>
> All it takes to have a different parser is to add an
> import line similar to:
>
> 	from netlink_yml_parser import YnlDocGenerator
>
> and adding the corresponding parser somewhere in the extension:
>
> 	netlink_parser = YnlDocGenerator()
>
> And then add logic inside parse() to handle different
> doc outputs, depending on the file location, similar to:
>
>         if "/netlink/specs/" in fname:
>             msg = self.netlink_parser.parse_yaml_file(fname)
>
> Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
> ---
>  Documentation/sphinx/parser_yaml.py | 76 +++++++++++++++++++++++++++++
>  1 file changed, 76 insertions(+)
>  create mode 100755 Documentation/sphinx/parser_yaml.py
>
> diff --git a/Documentation/sphinx/parser_yaml.py b/Documentation/sphinx/parser_yaml.py
> new file mode 100755
> index 000000000000..635945e1c5ba
> --- /dev/null
> +++ b/Documentation/sphinx/parser_yaml.py
> @@ -0,0 +1,76 @@
> +"""
> +Sphinx extension for processing YAML files
> +"""
> +
> +import os
> +import re
> +import sys
> +
> +from pprint import pformat
> +
> +from docutils.parsers.rst import Parser as RSTParser
> +from docutils.statemachine import ViewList
> +
> +from sphinx.util import logging
> +from sphinx.parsers import Parser
> +
> +srctree = os.path.abspath(os.environ["srctree"])
> +sys.path.insert(0, os.path.join(srctree, "tools/net/ynl/pyynl"))
> +
> +from netlink_yml_parser import YnlDocGenerator        # pylint: disable=C0413
> +
> +logger = logging.getLogger(__name__)
> +
> +class YamlParser(Parser):
> +    """Custom parser for YAML files."""

Would be good to say that this is a common YAML parser that calls
different subsystems, e.g. how you described it in the commit message.

> +
> +    # Need at least two elements on this set

I think you can drop this comment. It's not that it must be two
elements; it's that supported needs to be a sequence, and the Python
syntax to force a one-element tuple is ('item',)
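
Concretely (a generic Python illustration, not code from the series):

```python
# A parenthesized single string is still just a string; since Sphinx
# iterates over `supported`, a bare string would be treated as the
# suffixes 'y', 'a', 'm' and 'l'.
not_a_tuple = ('yaml')
# Only the trailing comma makes a one-element tuple.
one_tuple = ('yaml',)
supported = ('yaml', 'yml')
```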

> +    supported = ('yaml', 'yml')
> +
> +    netlink_parser = YnlDocGenerator()
> +
> +    def do_parse(self, inputstring, document, msg):

Maybe a better name for this is parse_rst?

> +        """Parse YAML and generate a document tree."""

Also update comment.

> +
> +        self.setup_parse(inputstring, document)
> +
> +        result = ViewList()
> +
> +        try:
> +            # Parse message with RSTParser
> +            for i, line in enumerate(msg.split('\n')):
> +                result.append(line, document.current_source, i)

This has the effect of associating line numbers from the generated ReST
with the source .yaml file, right? So errors will be reported against
the wrong place in the file. Is there any way to show the cause of the
error in the intermediate ReST?

As an example if I modify tc.yaml like this:

diff --git a/Documentation/netlink/specs/tc.yaml b/Documentation/netlink/specs/tc.yaml
index 4cc1f6a45001..c36d86d2dc72 100644
--- a/Documentation/netlink/specs/tc.yaml
+++ b/Documentation/netlink/specs/tc.yaml
@@ -4044,7 +4044,9 @@ operations:
             - chain
     -
       name: getchain
-      doc: Get / dump tc chain information.
+      doc: |
+        Get / dump tc chain information.
+        .. bogus-directive:: 
       attribute-set: attrs
       fixed-header: tcmsg
       do:

This is the resulting error, which will be really hard to track down:

/home/donaldh/net-next/Documentation/netlink/specs/tc.yaml:216: ERROR: Unexpected indentation. [docutils]
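
One possible direction for the line-number problem (a sketch of the
idea only; the helper name and pseudo-source format are assumptions,
not code from the series) is to tag the generated lines with a
synthetic source name, so diagnostics at least point unambiguously at
the intermediate ReST rather than at the wrong line of the .yaml file:

```python
def tag_generated_rst(msg: str, yaml_fname: str):
    """Pair each generated ReST line with a pseudo-source name and a
    0-based offset, mirroring the (source, offset) bookkeeping done by
    docutils' ViewList.append(line, source, offset)."""
    pseudo_source = f"<rst generated from {yaml_fname}>"
    return [(line, pseudo_source, i)
            for i, line in enumerate(msg.split("\n"))]
```

An error at generated line N would then be reported against the
pseudo-source, which the extension could dump on failure so the
offending intermediate ReST can be inspected.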

> +
> +            rst_parser = RSTParser()
> +            rst_parser.parse('\n'.join(result), document)
> +
> +        except Exception as e:
> +            document.reporter.error("YAML parsing error: %s" % pformat(e))
> +
> +        self.finish_parse()
> +
> +    # Overrides docutils.parsers.Parser. See sphinx.parsers.RSTParser
> +    def parse(self, inputstring, document):
> +        """Check if a YAML is meant to be parsed."""
> +
> +        fname = document.current_source
> +
> +        # Handle netlink yaml specs
> +        if "/netlink/specs/" in fname:
> +            msg = self.netlink_parser.parse_yaml_file(fname)
> +            self.do_parse(inputstring, document, msg)
> +
> +        # All other yaml files are ignored
> +
> +def setup(app):
> +    """Setup function for the Sphinx extension."""
> +
> +    # Add YAML parser
> +    app.add_source_parser(YamlParser)
> +    app.add_source_suffix('.yaml', 'yaml')
> +
> +    return {
> +        'version': '1.0',
> +        'parallel_read_safe': True,
> +        'parallel_write_safe': True,
> +    }


* Re: [PATCH v5 14/15] docs: netlink: remove obsolete .gitignore from unused directory
  2025-06-17  8:02 ` [PATCH v5 14/15] docs: netlink: remove obsolete .gitignore from unused directory Mauro Carvalho Chehab
@ 2025-06-17 12:38   ` Donald Hunter
  0 siblings, 0 replies; 42+ messages in thread
From: Donald Hunter @ 2025-06-17 12:38 UTC (permalink / raw)
  To: Mauro Carvalho Chehab
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:

> The previous code was generating source rst files
> under Documentation/networking/netlink_spec/. With the
> Sphinx YAML parser, this is now gone. So, stop ignoring
> *.rst files inside netlink specs directory.
>
> Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>

Reviewed-by: Donald Hunter <donald.hunter@gmail.com>


* Re: [PATCH v5 11/15] docs: use parser_yaml extension to handle Netlink specs
  2025-06-17  8:02 ` [PATCH v5 11/15] docs: use parser_yaml extension to handle " Mauro Carvalho Chehab
@ 2025-06-17 12:45   ` Donald Hunter
  0 siblings, 0 replies; 42+ messages in thread
From: Donald Hunter @ 2025-06-17 12:45 UTC (permalink / raw)
  To: Mauro Carvalho Chehab
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:

> Instead of manually calling ynl_gen_rst.py, use a Sphinx extension.
> This way, no .rst files would be written to the Kernel source
> directories.
>
> We are using here a toctree with :glob: property. This way, there
> is no need to touch the netlink/specs/index.rst file every time
> a new Netlink spec is added/renamed/removed.
>
> Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>

Reviewed-by: Donald Hunter <donald.hunter@gmail.com>
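
For reference, the :glob: toctree described in the commit message
boils down to a directive of this shape (a generic sketch; the exact
index content in the series may differ):

```rst
.. toctree::
   :maxdepth: 1
   :glob:

   *
```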


* Re: [PATCH v5 12/15] docs: uapi: netlink: update netlink specs link
  2025-06-17  8:02 ` [PATCH v5 12/15] docs: uapi: netlink: update netlink specs link Mauro Carvalho Chehab
@ 2025-06-17 12:46   ` Donald Hunter
  0 siblings, 0 replies; 42+ messages in thread
From: Donald Hunter @ 2025-06-17 12:46 UTC (permalink / raw)
  To: Mauro Carvalho Chehab
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:

> With the recent parser_yaml extension, and the removal of the
> auto-generated ReST source files, the location of netlink
> specs changed.
>
> Update uAPI accordingly.
>
> Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>

nit: seems like this could be part of patch 11

Either way,

Reviewed-by: Donald Hunter <donald.hunter@gmail.com>


* Re: [PATCH v5 00/15] Don't generate netlink .rst files inside $(srctree)
  2025-06-17  8:01 [PATCH v5 00/15] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (14 preceding siblings ...)
  2025-06-17  8:02 ` [PATCH v5 15/15] MAINTAINERS: add netlink_yml_parser.py to linux-doc Mauro Carvalho Chehab
@ 2025-06-17 12:58 ` Donald Hunter
  15 siblings, 0 replies; 42+ messages in thread
From: Donald Hunter @ 2025-06-17 12:58 UTC (permalink / raw)
  To: Mauro Carvalho Chehab
  Cc: Linux Doc Mailing List, Jonathan Corbet, linux-kernel,
	Akira Yokosawa, David S. Miller, Ignacio Encinas Rubio,
	Marco Elver, Shuah Khan, Eric Dumazet, Jan Stancek, Paolo Abeni,
	Ruben Wauters, joel, linux-kernel-mentees, lkmm, netdev, peterz,
	stern, Breno Leitao

Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:

> As discussed at:
>    https://lore.kernel.org/all/20250610101331.62ba466f@foz.lan/
>
> changeset f061c9f7d058 ("Documentation: Document each netlink family")
> added logic that generates *.rst files inside $(srctree). This is bad
> when O=<BUILDDIR> is used.
>
> A recent change renamed the yaml files used by Netlink, revealing a bad
> side effect: as "make cleandocs" doesn't clean the produced files, symbols
> appear duplicated for people who don't build the kernel from scratch.
>
> This series adds a yaml parser extension and uses an index file with a glob for
> *. We opted to write the extension in a way that no actual yaml conversion
> code is inside it. This makes it flexible enough to handle other types of yaml
> files in the future. The actual yaml conversion logic was placed in
> netlink_yml_parser.py.
>
> As requested by YNL maintainers, this version has netlink_yml_parser.py
> inside the tools/net/ynl/pyynl/ directory. I don't like mixing libraries with
> binaries, nor having Python libraries spread all over the kernel. IMO, the
> best is to put all of them in a common place (scripts/lib, python/lib,
> lib/python, ...) but, as this can be solved later, for now let's keep it this
> way.
>
> ---

Note that the series leaves the YNL build broken.

make -C tools/net/ynl/
make: Entering directory '/home/donaldh/net-next/tools/net/ynl'
make[1]: Entering directory '/home/donaldh/net-next/tools/net/ynl/lib'
make[1]: Nothing to be done for 'all'.
make[1]: Leaving directory '/home/donaldh/net-next/tools/net/ynl/lib'
make[1]: Entering directory '/home/donaldh/net-next/tools/net/ynl/generated'
	GEN_RST conntrack.rst
Traceback (most recent call last):
  File "/home/donaldh/net-next/tools/net/ynl/generated/../pyynl/ynl_gen_rst.py", line 90, in <module>
    main()
    ~~~~^^
  File "/home/donaldh/net-next/tools/net/ynl/generated/../pyynl/ynl_gen_rst.py", line 86, in main
    write_to_rstfile(content, args.output)
    ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^
  File "/home/donaldh/net-next/tools/net/ynl/generated/../pyynl/ynl_gen_rst.py", line 64, in write_to_rstfile
    os.makedirs(directory, exist_ok=True)
    ~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen os>", line 227, in makedirs
FileNotFoundError: [Errno 2] No such file or directory: ''
make[1]: *** [Makefile:56: conntrack.rst] Error 1
make[1]: Leaving directory '/home/donaldh/net-next/tools/net/ynl/generated'
make: *** [Makefile:25: generated] Error 2
make: Leaving directory '/home/donaldh/net-next/tools/net/ynl'
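The traceback shows `os.makedirs` being called with an empty string: `os.path.dirname("conntrack.rst")` returns `''` when the output path has no directory component. A minimal guard illustrating the failure mode (a sketch with a hypothetical helper name, not the actual fix) would be:

```python
import os

def ensure_parent_dir(filename):
    """Create the parent directory of filename, but only if it has one."""
    directory = os.path.dirname(filename)
    # os.path.dirname("conntrack.rst") is "", and os.makedirs("") raises
    # FileNotFoundError -- exactly the traceback above.
    if directory:
        os.makedirs(directory, exist_ok=True)
```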

^ permalink raw reply	[flat|nested] 42+ messages in thread

* Re: [PATCH v5 10/15] docs: sphinx: add a parser for yaml files for Netlink specs
  2025-06-17 12:35   ` Donald Hunter
@ 2025-06-17 13:40     ` Mauro Carvalho Chehab
  2025-06-17 16:00       ` Mauro Carvalho Chehab
  0 siblings, 1 reply; 42+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-17 13:40 UTC (permalink / raw)
  To: Donald Hunter
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Em Tue, 17 Jun 2025 13:35:50 +0100
Donald Hunter <donald.hunter@gmail.com> escreveu:

> Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:
> 
> > Add a simple sphinx.Parser to handle yaml files and add the
> > code to handle Netlink specs. All other yaml files are
> > ignored.
> >
> > The code was written in a way that parsing yaml for different
> > subsystems and even for different parts of Netlink is easy.
> >
> > All it takes to have a different parser is to add an
> > import line similar to:
> >
> > 	from netlink_yml_parser import YnlDocGenerator
> >
> > adding the corresponding parser somewhere at the extension:
> >
> > 	netlink_parser = YnlDocGenerator()
> >
> > And then add logic inside parse() to handle different
> > doc outputs, depending on the file location, similar to:
> >
> >         if "/netlink/specs/" in fname:
> >             msg = self.netlink_parser.parse_yaml_file(fname)
> >
> > Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
> > ---
> >  Documentation/sphinx/parser_yaml.py | 76 +++++++++++++++++++++++++++++
> >  1 file changed, 76 insertions(+)
> >  create mode 100755 Documentation/sphinx/parser_yaml.py
> >
> > diff --git a/Documentation/sphinx/parser_yaml.py b/Documentation/sphinx/parser_yaml.py
> > new file mode 100755
> > index 000000000000..635945e1c5ba
> > --- /dev/null
> > +++ b/Documentation/sphinx/parser_yaml.py
> > @@ -0,0 +1,76 @@
> > +"""
> > +Sphinx extension for processing YAML files
> > +"""
> > +
> > +import os
> > +import re
> > +import sys
> > +
> > +from pprint import pformat
> > +
> > +from docutils.parsers.rst import Parser as RSTParser
> > +from docutils.statemachine import ViewList
> > +
> > +from sphinx.util import logging
> > +from sphinx.parsers import Parser
> > +
> > +srctree = os.path.abspath(os.environ["srctree"])
> > +sys.path.insert(0, os.path.join(srctree, "tools/net/ynl/pyynl"))
> > +
> > +from netlink_yml_parser import YnlDocGenerator        # pylint: disable=C0413
> > +
> > +logger = logging.getLogger(__name__)
> > +
> > +class YamlParser(Parser):
> > +    """Custom parser for YAML files."""  
> 
> It would be good to say that this is a common YAML parser that calls
> into different subsystems, e.g. as you described it in the commit message.

Makes sense. Will fix at the next version.

> 
> > +
> > +    # Need at least two elements on this set  
> 
> I think you can drop this comment. It's not that it must be two
> elements, it's that supported needs to be a sequence, and the python
> syntax to force a single item to parse as a tuple would be ('item', )

Ah, ok.

> > +    supported = ('yaml', 'yml')
> > +
> > +    netlink_parser = YnlDocGenerator()
> > +
> > +    def do_parse(self, inputstring, document, msg):  
> 
> Maybe a better name for this is parse_rst?

Ok.

> 
> > +        """Parse YAML and generate a document tree."""  
> 
> Also update comment.

Ok.

> > +
> > +        self.setup_parse(inputstring, document)
> > +
> > +        result = ViewList()
> > +
> > +        try:
> > +            # Parse message with RSTParser
> > +            for i, line in enumerate(msg.split('\n')):
> > +                result.append(line, document.current_source, i)  
> 
> This has the effect of associating line numbers from the generated ReST
> with the source .yaml file, right? So errors will be reported against
> the wrong place in the file. Is there any way to show the cause of the
> error in the intermediate ReST?

Yes, but this will require modifying the parser. I prefer merging this
series without such a change, and then having a separate changeset
address it.

There are two ways we can do that:

1. The parser can add a ReST comment with the line number. This
   is what is done by the kerneldoc.py Sphinx extension:

	lineoffset = 0
	line_regex = re.compile(r"^\.\. LINENO ([0-9]+)$")
        for line in lines:
            match = line_regex.search(line)
            if match:
                lineoffset = int(match.group(1)) - 1 # sphinx counts lines from 0
            else:
                doc = str(env.srcdir) + "/" + env.docname + ":" + str(self.lineno)
                result.append(line, doc + ": " + filename, lineoffset)
                lineoffset += 1

   I kept it the same way after its conversion to Python since, right now,
   it supports both a Python class and a command line interface. I may
   eventually clean it up.

2. making the parser return a tuple. In kernel_abi.py, as the parser
   returns content from multiple files, such a tuple is:

		 (rst_output, filename, line_number)

   and the code for it is (cleaned up):

	for msg, f, ln in kernel_abi.doc(show_file=show_file,
                                         show_symbols=show_symbols,
                                         filter_path=abi_type):

            lines = statemachine.string2lines(msg, tab_width,
                                              convert_whitespace=True)

            for line in lines:
                content.append(line, f, ln - 1) # sphinx counts lines from 0

(2) is cleaner and faster, but (1) is easier to implement on
already-existing code.
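As a self-contained illustration of approach (1) — a sketch with a hypothetical helper name, outside any Sphinx machinery — the marker handling boils down to:

```python
import re

LINENO_RE = re.compile(r"^\.\. LINENO ([0-9]+)$")

def map_lines(rst_text, source):
    """Attach source line offsets to ReST lines, consuming LINENO markers."""
    result = []
    offset = 0
    for line in rst_text.split("\n"):
        match = LINENO_RE.match(line)
        if match:
            # Remember the original line number and drop the marker itself
            offset = int(match.group(1))
            continue
        result.append((line, source, offset))
    return result
```

Every emitted line then carries the line number of the nearest preceding marker, so docutils errors point into the original yaml file rather than into the generated ReST.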

> As an example if I modify tc.yaml like this:
> 
> diff --git a/Documentation/netlink/specs/tc.yaml b/Documentation/netlink/specs/tc.yaml
> index 4cc1f6a45001..c36d86d2dc72 100644
> --- a/Documentation/netlink/specs/tc.yaml
> +++ b/Documentation/netlink/specs/tc.yaml
> @@ -4044,7 +4044,9 @@ operations:
>              - chain
>      -
>        name: getchain
> -      doc: Get / dump tc chain information.
> +      doc: |
> +        Get / dump tc chain information.
> +        .. bogus-directive:: 
>        attribute-set: attrs
>        fixed-header: tcmsg
>        do:
> 
> This is the resulting error, which will be really hard to track down:
> 
> /home/donaldh/net-next/Documentation/netlink/specs/tc.yaml:216: ERROR: Unexpected indentation. [docutils]
> 
> > +
> > +            rst_parser = RSTParser()
> > +            rst_parser.parse('\n'.join(result), document)
> > +
> > +        except Exception as e:
> > +            document.reporter.error("YAML parsing error: %s" % pformat(e))
> > +
> > +        self.finish_parse()
> > +
> > +    # Overrides docutils.parsers.Parser. See sphinx.parsers.RSTParser
> > +    def parse(self, inputstring, document):
> > +        """Check if a YAML is meant to be parsed."""
> > +
> > +        fname = document.current_source
> > +
> > +        # Handle netlink yaml specs
> > +        if "/netlink/specs/" in fname:
> > +            msg = self.netlink_parser.parse_yaml_file(fname)
> > +            self.do_parse(inputstring, document, msg)
> > +
> > +        # All other yaml files are ignored
> > +
> > +def setup(app):
> > +    """Setup function for the Sphinx extension."""
> > +
> > +    # Add YAML parser
> > +    app.add_source_parser(YamlParser)
> > +    app.add_source_suffix('.yaml', 'yaml')
> > +
> > +    return {
> > +        'version': '1.0',
> > +        'parallel_read_safe': True,
> > +        'parallel_write_safe': True,
> > +    }  

^ permalink raw reply	[flat|nested] 42+ messages in thread

* Re: [PATCH v5 10/15] docs: sphinx: add a parser for yaml files for Netlink specs
  2025-06-17 13:40     ` Mauro Carvalho Chehab
@ 2025-06-17 16:00       ` Mauro Carvalho Chehab
  2025-06-17 17:23         ` Donald Hunter
  0 siblings, 1 reply; 42+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-17 16:00 UTC (permalink / raw)
  To: Donald Hunter
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Em Tue, 17 Jun 2025 15:40:49 +0200
Mauro Carvalho Chehab <mchehab+huawei@kernel.org> escreveu:

> > > +            # Parse message with RSTParser
> > > +            for i, line in enumerate(msg.split('\n')):
> > > +                result.append(line, document.current_source, i)    
> > 
> > This has the effect of associating line numbers from the generated ReST
> > with the source .yaml file, right? So errors will be reported against
> > the wrong place in the file. Is there any way to show the cause of the
> > error in the intermediate ReST?  
> 
> Yes, but this will require modifying the parser. I prefer merging this
> series without such change, and then having a separate changeset
> addressing it.
> 
> There are two ways we can do that:
> 
> 1. The parser can add a ReST comment with the line number. This
>    is what it is done by kerneldoc.py Sphinx extension:
> 
> 	lineoffset = 0
> 	line_regex = re.compile(r"^\.\. LINENO ([0-9]+)$")
>         for line in lines:
>             match = line_regex.search(line)
>             if match:
>                 lineoffset = int(match.group(1)) - 1 # sphinx counts lines from 0
>             else:
>                 doc = str(env.srcdir) + "/" + env.docname + ":" + str(self.lineno)
>                 result.append(line, doc + ": " + filename, lineoffset)
>                 lineoffset += 1
> 
>    I kept it the same way after its conversion to Python since, right now,
>    it supports both a Python class and a command line interface. I may
>    eventually clean it up.
> 
> 2. making the parser return a tuple. At kernel_abi.py, as the parser
>    returns content from multiple files, such tuple is:
> 
> 		 (rst_output, filename, line_number)
> 
>    and the code for it is (cleaned up):
> 
> 	for msg, f, ln in kernel_abi.doc(show_file=show_file,
>                                          show_symbols=show_symbols,
>                                          filter_path=abi_type):
> 
>             lines = statemachine.string2lines(msg, tab_width,
>                                               convert_whitespace=True)
> 
>             for line in lines:
>                 content.append(line, f, ln - 1) # sphinx counts lines from 0
> 
> (2) is cleaner and faster, but (1) is easier to implement on
> already-existing code.

The logic below implements (1). This seems to be the easiest way with
pyyaml. I will submit it as two separate patches at the end of the next
version.

Please note that I haven't yet checked the "quality" of the
line numbers. Some tweaks could be needed later on.

Regards,
Mauro

---

From 750daebebadcd156b5fe9b516f4fae4bd42b9d2c Mon Sep 17 00:00:00 2001
From: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
Date: Tue, 17 Jun 2025 17:54:03 +0200
Subject: [PATCH] docs: parser_yaml.py: add support for line numbers from the
 parser

Instead of printing line numbers from the temp converted ReST
file, get them from the original source.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>

diff --git a/Documentation/sphinx/parser_yaml.py b/Documentation/sphinx/parser_yaml.py
index 635945e1c5ba..15c642fc0bd5 100755
--- a/Documentation/sphinx/parser_yaml.py
+++ b/Documentation/sphinx/parser_yaml.py
@@ -29,6 +29,8 @@ class YamlParser(Parser):
 
     netlink_parser = YnlDocGenerator()
 
+    re_lineno = re.compile(r"\.\. LINENO ([0-9]+)$")
+
     def do_parse(self, inputstring, document, msg):
         """Parse YAML and generate a document tree."""
 
@@ -38,8 +40,14 @@ class YamlParser(Parser):
 
         try:
             # Parse message with RSTParser
-            for i, line in enumerate(msg.split('\n')):
-                result.append(line, document.current_source, i)
+            lineoffset = 0;
+            for line in msg.split('\n'):
+                match = self.re_lineno.match(line)
+                if match:
+                    lineoffset = int(match.group(1))
+                    continue
+
+                result.append(line, document.current_source, lineoffset)
 
             rst_parser = RSTParser()
             rst_parser.parse('\n'.join(result), document)

From 15c1f9db30f3abdce110e19788d87f9fe1417781 Mon Sep 17 00:00:00 2001
From: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
Date: Tue, 17 Jun 2025 17:28:04 +0200
Subject: [PATCH] tools: netlink_yml_parser.py: add line numbers to parsed data

When something goes wrong, we want Sphinx error to point to the
right line number from the original source, not from the
processed ReST data.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>

diff --git a/tools/net/ynl/pyynl/netlink_yml_parser.py b/tools/net/ynl/pyynl/netlink_yml_parser.py
index 866551726723..a9d8ab6f2639 100755
--- a/tools/net/ynl/pyynl/netlink_yml_parser.py
+++ b/tools/net/ynl/pyynl/netlink_yml_parser.py
@@ -20,6 +20,16 @@
 from typing import Any, Dict, List
 import yaml
 
+LINE_STR = '__lineno__'
+
+class NumberedSafeLoader(yaml.SafeLoader):
+    """Override the SafeLoader class to add line number to parsed data"""
+
+    def construct_mapping(self, node):
+        mapping = super().construct_mapping(node)
+        mapping[LINE_STR] = node.start_mark.line
+
+        return mapping
 
 class RstFormatters:
     """RST Formatters"""
@@ -127,6 +137,11 @@ class RstFormatters:
         """Return a formatted label"""
         return f".. _{title}:\n\n"
 
+    @staticmethod
+    def rst_lineno(lineno: int) -> str:
+        """Return a lineno comment"""
+        return f".. LINENO {lineno}\n"
+
 class YnlDocGenerator:
     """YAML Netlink specs Parser"""
 
@@ -144,6 +159,9 @@ class YnlDocGenerator:
         """Parse 'do' section and return a formatted string"""
         lines = []
         for key in do_dict.keys():
+            if key == LINE_STR:
+                lines.append(self.fmt.rst_lineno(do_dict[key]))
+                continue
             lines.append(self.fmt.rst_paragraph(self.fmt.bold(key), level + 1))
             if key in ['request', 'reply']:
                 lines.append(self.parse_do_attributes(do_dict[key], level + 1) + "\n")
@@ -174,6 +192,10 @@ class YnlDocGenerator:
             lines.append(self.fmt.rst_paragraph(operation["doc"]) + "\n")
 
             for key in operation.keys():
+                if key == LINE_STR:
+                    lines.append(self.fmt.rst_lineno(operation[key]))
+                    continue
+
                 if key in preprocessed:
                     # Skip the special fields
                     continue
@@ -233,6 +255,9 @@ class YnlDocGenerator:
         for definition in defs:
             lines.append(self.fmt.rst_section(namespace, 'definition', definition["name"]))
             for k in definition.keys():
+                if k == LINE_STR:
+                    lines.append(self.fmt.rst_lineno(definition[k]))
+                    continue
                 if k in preprocessed + ignored:
                     continue
                 lines.append(self.fmt.rst_fields(k, self.fmt.sanitize(definition[k]), 0))
@@ -268,6 +293,9 @@ class YnlDocGenerator:
                 lines.append(self.fmt.rst_subsubsection(attr_line))
 
                 for k in attr.keys():
+                    if k == LINE_STR:
+                        lines.append(self.fmt.rst_lineno(attr[k]))
+                        continue
                     if k in preprocessed + ignored:
                         continue
                     if k in linkable:
@@ -306,6 +334,8 @@ class YnlDocGenerator:
         lines = []
 
         # Main header
+        lineno = obj.get('__lineno__', 0)
+        lines.append(self.fmt.rst_lineno(lineno))
 
         family = obj['name']
 
@@ -354,7 +384,7 @@ class YnlDocGenerator:
     def parse_yaml_file(self, filename: str) -> str:
         """Transform the YAML specified by filename into an RST-formatted string"""
         with open(filename, "r", encoding="utf-8") as spec_file:
-            yaml_data = yaml.safe_load(spec_file)
-            content = self.parse_yaml(yaml_data)
+            numbered_yaml = yaml.load(spec_file, Loader=NumberedSafeLoader)
+            content = self.parse_yaml(numbered_yaml)
 
         return content


^ permalink raw reply related	[flat|nested] 42+ messages in thread

* Re: [PATCH v5 10/15] docs: sphinx: add a parser for yaml files for Netlink specs
  2025-06-17 16:00       ` Mauro Carvalho Chehab
@ 2025-06-17 17:23         ` Donald Hunter
  2025-06-18  8:21           ` Mauro Carvalho Chehab
  0 siblings, 1 reply; 42+ messages in thread
From: Donald Hunter @ 2025-06-17 17:23 UTC (permalink / raw)
  To: Mauro Carvalho Chehab
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

On Tue, 17 Jun 2025 at 17:00, Mauro Carvalho Chehab
<mchehab+huawei@kernel.org> wrote:
> >
> > (2) is cleaner and faster, but (1) is easier to implement on
> > already-existing code.
>
> The logic below implements (1). This seems to be the easiest way with
> pyyaml. I will submit it as two separate patches at the end of the next
> version.
>
> Please note that I haven't yet checked the "quality" of the
> line numbers. Some tweaks could be needed later on.

Thanks for working on this. I suppose we might be able to work on an
evolution from (1) to (2) in a followup piece of work?

> Regards,
> Mauro
>
> ---
>
> From 750daebebadcd156b5fe9b516f4fae4bd42b9d2c Mon Sep 17 00:00:00 2001
> From: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
> Date: Tue, 17 Jun 2025 17:54:03 +0200
> Subject: [PATCH] docs: parser_yaml.py: add support for line numbers from the
>  parser
>
> Instead of printing line numbers from the temp converted ReST
> file, get them from the original source.
>
> Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
>
> diff --git a/Documentation/sphinx/parser_yaml.py b/Documentation/sphinx/parser_yaml.py
> index 635945e1c5ba..15c642fc0bd5 100755
> --- a/Documentation/sphinx/parser_yaml.py
> +++ b/Documentation/sphinx/parser_yaml.py
> @@ -29,6 +29,8 @@ class YamlParser(Parser):
>
>      netlink_parser = YnlDocGenerator()
>
> +    re_lineno = re.compile(r"\.\. LINENO ([0-9]+)$")
> +
>      def do_parse(self, inputstring, document, msg):
>          """Parse YAML and generate a document tree."""
>
> @@ -38,8 +40,14 @@ class YamlParser(Parser):
>
>          try:
>              # Parse message with RSTParser
> -            for i, line in enumerate(msg.split('\n')):
> -                result.append(line, document.current_source, i)
> +            lineoffset = 0;
> +            for line in msg.split('\n'):
> +                match = self.re_lineno.match(line)
> +                if match:
> +                    lineoffset = int(match.group(1))
> +                    continue
> +
> +                result.append(line, document.current_source, lineoffset)
>
>              rst_parser = RSTParser()
>              rst_parser.parse('\n'.join(result), document)
>
> From 15c1f9db30f3abdce110e19788d87f9fe1417781 Mon Sep 17 00:00:00 2001
> From: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
> Date: Tue, 17 Jun 2025 17:28:04 +0200
> Subject: [PATCH] tools: netlink_yml_parser.py: add line numbers to parsed data
>
> When something goes wrong, we want Sphinx error to point to the
> right line number from the original source, not from the
> processed ReST data.
>
> Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
>
> diff --git a/tools/net/ynl/pyynl/netlink_yml_parser.py b/tools/net/ynl/pyynl/netlink_yml_parser.py
> index 866551726723..a9d8ab6f2639 100755
> --- a/tools/net/ynl/pyynl/netlink_yml_parser.py
> +++ b/tools/net/ynl/pyynl/netlink_yml_parser.py
> @@ -20,6 +20,16 @@
>  from typing import Any, Dict, List
>  import yaml
>
> +LINE_STR = '__lineno__'
> +
> +class NumberedSafeLoader(yaml.SafeLoader):
> +    """Override the SafeLoader class to add line number to parsed data"""
> +
> +    def construct_mapping(self, node):
> +        mapping = super().construct_mapping(node)
> +        mapping[LINE_STR] = node.start_mark.line
> +
> +        return mapping
>
>  class RstFormatters:
>      """RST Formatters"""
> @@ -127,6 +137,11 @@ class RstFormatters:
>          """Return a formatted label"""
>          return f".. _{title}:\n\n"
>
> +    @staticmethod
> +    def rst_lineno(lineno: int) -> str:
> +        """Return a lineno comment"""
> +        return f".. LINENO {lineno}\n"
> +
>  class YnlDocGenerator:
>      """YAML Netlink specs Parser"""
>
> @@ -144,6 +159,9 @@ class YnlDocGenerator:
>          """Parse 'do' section and return a formatted string"""
>          lines = []
>          for key in do_dict.keys():
> +            if key == LINE_STR:
> +                lines.append(self.fmt.rst_lineno(do_dict[key]))
> +                continue
>              lines.append(self.fmt.rst_paragraph(self.fmt.bold(key), level + 1))
>              if key in ['request', 'reply']:
>                  lines.append(self.parse_do_attributes(do_dict[key], level + 1) + "\n")
> @@ -174,6 +192,10 @@ class YnlDocGenerator:
>              lines.append(self.fmt.rst_paragraph(operation["doc"]) + "\n")
>
>              for key in operation.keys():
> +                if key == LINE_STR:
> +                    lines.append(self.fmt.rst_lineno(operation[key]))
> +                    continue
> +
>                  if key in preprocessed:
>                      # Skip the special fields
>                      continue
> @@ -233,6 +255,9 @@ class YnlDocGenerator:
>          for definition in defs:
>              lines.append(self.fmt.rst_section(namespace, 'definition', definition["name"]))
>              for k in definition.keys():
> +                if k == LINE_STR:
> +                    lines.append(self.fmt.rst_lineno(definition[k]))
> +                    continue
>                  if k in preprocessed + ignored:
>                      continue
>                  lines.append(self.fmt.rst_fields(k, self.fmt.sanitize(definition[k]), 0))
> @@ -268,6 +293,9 @@ class YnlDocGenerator:
>                  lines.append(self.fmt.rst_subsubsection(attr_line))
>
>                  for k in attr.keys():
> +                    if k == LINE_STR:
> +                        lines.append(self.fmt.rst_lineno(attr[k]))
> +                        continue
>                      if k in preprocessed + ignored:
>                          continue
>                      if k in linkable:
> @@ -306,6 +334,8 @@ class YnlDocGenerator:
>          lines = []
>
>          # Main header
> +        lineno = obj.get('__lineno__', 0)
> +        lines.append(self.fmt.rst_lineno(lineno))
>
>          family = obj['name']
>
> @@ -354,7 +384,7 @@ class YnlDocGenerator:
>      def parse_yaml_file(self, filename: str) -> str:
>          """Transform the YAML specified by filename into an RST-formatted string"""
>          with open(filename, "r", encoding="utf-8") as spec_file:
> -            yaml_data = yaml.safe_load(spec_file)
> -            content = self.parse_yaml(yaml_data)
> +            numbered_yaml = yaml.load(spec_file, Loader=NumberedSafeLoader)
> +            content = self.parse_yaml(numbered_yaml)
>
>          return content
>

^ permalink raw reply	[flat|nested] 42+ messages in thread

* Re: [PATCH v5 01/15] docs: conf.py: properly handle include and exclude patterns
  2025-06-17  8:01 ` [PATCH v5 01/15] docs: conf.py: properly handle include and exclude patterns Mauro Carvalho Chehab
  2025-06-17 10:38   ` Donald Hunter
@ 2025-06-18  2:42   ` Akira Yokosawa
  2025-06-18 11:44     ` Mauro Carvalho Chehab
  1 sibling, 1 reply; 42+ messages in thread
From: Akira Yokosawa @ 2025-06-18  2:42 UTC (permalink / raw)
  To: Mauro Carvalho Chehab, Linux Doc Mailing List, Jonathan Corbet
  Cc: Breno Leitao, David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern, Akira Yokosawa

Hi Mauro,

A comment on compatibility with earlier Sphinx.

On Tue, 17 Jun 2025 10:01:58 +0200, Mauro Carvalho Chehab wrote:
> When one does:
> 	make SPHINXDIRS="foo" htmldocs
> 
> All patterns would be relative to Documentation/foo, which
> causes the include/exclude patterns like:
> 
> 	include_patterns = [
> 		...
> 		f'foo/*.{ext}',
> 	]
> 
> to break. This is not what is expected. Address it by
> adding logic to dynamically adjust the pattern when
> SPHINXDIRS is used.
> 
> That allows adding parsers for other file types.
> 
> Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
> ---
>  Documentation/conf.py | 52 +++++++++++++++++++++++++++++++++++++++----
>  1 file changed, 48 insertions(+), 4 deletions(-)
> 
> diff --git a/Documentation/conf.py b/Documentation/conf.py
> index 12de52a2b17e..e887c1b786a4 100644
> --- a/Documentation/conf.py
> +++ b/Documentation/conf.py
> @@ -17,6 +17,54 @@ import os
>  import sphinx
>  import shutil
>  
> +# Location of Documentation/ directory
> +doctree = os.path.abspath('.')
> +
> +# List of patterns that don't contain directory names, in glob format.
> +include_patterns = ['**.rst']
> +exclude_patterns = []
> +

Where "exclude_patterns" has been with us ever since Sphinx 1.0,
"include_patterns" was added fairly recently in Sphinx 5.1 [1].

[1]: https://www.sphinx-doc.org/en/master/usage/configuration.html#confval-include_patterns

So, this breaks earlier Sphinx versions.

Also, after applying all of v5 on top of docs-next, I see these new
warnings with Sphinx 7.2.6 (of Ubuntu 24.04):

/<srcdir>/Documentation/output/ca.h.rst: WARNING: document isn't included in any toctree
/<srcdir>/Documentation/output/cec.h.rst: WARNING: document isn't included in any toctree
/<srcdir>/Documentation/output/dmx.h.rst: WARNING: document isn't included in any toctree
/<srcdir>/Documentation/output/frontend.h.rst: WARNING: document isn't included in any toctree
/<srcdir>/Documentation/output/lirc.h.rst: WARNING: document isn't included in any toctree
/<srcdir>/Documentation/output/media.h.rst: WARNING: document isn't included in any toctree
/<srcdir>/Documentation/output/net.h.rst: WARNING: document isn't included in any toctree
/<srcdir>/Documentation/output/videodev2.h.rst: WARNING: document isn't included in any toctree

Sphinx 7.3.7 and later are free of them.  I have no idea which change in
Sphinx 7.3 got rid of them.

Now that the parallel build performance regression has been resolved in
Sphinx 7.4, I don't think there is much demand for keeping compatibility
with older Sphinx versions.
These build errors and extra warnings would encourage people to upgrade
their Sphinx.  So I'm not going to nack this.

Of course, getting rid of the above warnings with Sphinx < 7.3 would be ideal.

        Thanks, Akira

> +# List of patterns that contain directory names in glob format.
> +dyn_include_patterns = []
> +dyn_exclude_patterns = ['output']
> +
[...]
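As a side note, the dynamic adjustment described in the commit message amounts to re-rooting directory-qualified patterns when SPHINXDIRS selects a subdirectory; a stdlib-only sketch of the idea (hypothetical helper, not the patch's actual code):

```python
def adjust_patterns(dyn_patterns, sphinxdir):
    """Re-root glob patterns like 'netlink/specs/*.yaml' so they stay
    valid when Sphinx builds only Documentation/<sphinxdir>/."""
    prefix = sphinxdir.rstrip("/") + "/"
    adjusted = []
    for pattern in dyn_patterns:
        if pattern.startswith(prefix):
            adjusted.append(pattern[len(prefix):])
        # Patterns rooted elsewhere cannot match inside sphinxdir; drop them.
    return adjusted
```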

^ permalink raw reply	[flat|nested] 42+ messages in thread

* Re: [PATCH v5 05/15] tools: ynl_gen_rst.py: make the index parser more generic
  2025-06-17 11:59   ` Simon Horman
@ 2025-06-18  6:57     ` Mauro Carvalho Chehab
  2025-06-18 12:40       ` Simon Horman
  0 siblings, 1 reply; 42+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-18  6:57 UTC (permalink / raw)
  To: Simon Horman
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Em Tue, 17 Jun 2025 12:59:27 +0100
Simon Horman <horms@kernel.org> escreveu:

> On Tue, Jun 17, 2025 at 10:02:02AM +0200, Mauro Carvalho Chehab wrote:
> > It is not a good practice to store build-generated files
> > inside $(srctree), as one may be using O=<BUILDDIR> and even
> > have the Kernel on a read-only directory.
> > 
> > Change the YAML generation for netlink files to allow it
> > to parse data based on the source or on the object tree.
> > 
> > Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
> > ---
> >  tools/net/ynl/pyynl/ynl_gen_rst.py | 22 ++++++++++++++++------
> >  1 file changed, 16 insertions(+), 6 deletions(-)
> > 
> > diff --git a/tools/net/ynl/pyynl/ynl_gen_rst.py b/tools/net/ynl/pyynl/ynl_gen_rst.py
> > index 7bfb8ceeeefc..b1e5acafb998 100755
> > --- a/tools/net/ynl/pyynl/ynl_gen_rst.py
> > +++ b/tools/net/ynl/pyynl/ynl_gen_rst.py
> > @@ -365,6 +365,7 @@ def parse_arguments() -> argparse.Namespace:
> >  
> >      parser.add_argument("-v", "--verbose", action="store_true")
> >      parser.add_argument("-o", "--output", help="Output file name")
> > +    parser.add_argument("-d", "--input_dir", help="YAML input directory")
> >  
> >      # Index and input are mutually exclusive
> >      group = parser.add_mutually_exclusive_group()
> > @@ -405,11 +406,14 @@ def write_to_rstfile(content: str, filename: str) -> None:
> >      """Write the generated content into an RST file"""
> >      logging.debug("Saving RST file to %s", filename)
> >  
> > +    dir = os.path.dirname(filename)
> > +    os.makedirs(dir, exist_ok=True)
> > +
> >      with open(filename, "w", encoding="utf-8") as rst_file:
> >          rst_file.write(content)  
> 
> Hi Mauro,
> 
> With this patch applied I see the following, which did not happen before.

Thanks! This was an intermediate step. I'll just drop this patch and
fix conflicts in the next version.

> 
> $ make -C tools/net/ynl
> ...
> Traceback (most recent call last):
>   File ".../tools/net/ynl/generated/../pyynl/ynl_gen_rst.py", line 464, in <module>
>     main()
>     ~~~~^^
>   File ".../tools/net/ynl/generated/../pyynl/ynl_gen_rst.py", line 456, in main
>     write_to_rstfile(content, args.output)
>     ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^
>   File ".../tools/net/ynl/generated/../pyynl/ynl_gen_rst.py", line 410, in write_to_rstfile
>     os.makedirs(dir, exist_ok=True)
>     ~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^
>   File "<frozen os>", line 227, in makedirs
> FileNotFoundError: [Errno 2] No such file or directory: ''
> make[1]: *** [Makefile:55: conntrack.rst] Error 1
> 

Thanks,
Mauro

^ permalink raw reply	[flat|nested] 42+ messages in thread

* Re: [PATCH v5 01/15] docs: conf.py: properly handle include and exclude patterns
  2025-06-17 10:38   ` Donald Hunter
@ 2025-06-18  6:59     ` Mauro Carvalho Chehab
  0 siblings, 0 replies; 42+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-18  6:59 UTC (permalink / raw)
  To: Donald Hunter
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Em Tue, 17 Jun 2025 11:38:06 +0100
Donald Hunter <donald.hunter@gmail.com> escreveu:

> Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:
> 
> > When one does:
> > 	make SPHINXDIRS="foo" htmldocs
> >
> > All patterns would be relative to Documentation/foo, which
> > causes the include/exclude patterns like:
> >
> > 	include_patterns = [
> > 		...
> > 		f'foo/*.{ext}',
> > 	]
> >
> > to break. This is not what is expected. Address it by
> > adding logic to dynamically adjust the patterns when
> > SPHINXDIRS is used.
> >
> > That allows adding parsers for other file types.
> >
> > Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>  
> 
> Reviewed-by: Donald Hunter <donald.hunter@gmail.com>

Thanks for reviewing. In the next version, I'm adding some
backward-compatible code for Sphinx 5.1, based on Akira's feedback.

As the basic logic is the same, I'm keeping your Reviewed-by tag there.


Thanks,
Mauro

^ permalink raw reply	[flat|nested] 42+ messages in thread

* Re: [PATCH v5 10/15] docs: sphinx: add a parser for yaml files for Netlink specs
  2025-06-17 17:23         ` Donald Hunter
@ 2025-06-18  8:21           ` Mauro Carvalho Chehab
  0 siblings, 0 replies; 42+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-18  8:21 UTC (permalink / raw)
  To: Donald Hunter
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Em Tue, 17 Jun 2025 18:23:22 +0100
Donald Hunter <donald.hunter@gmail.com> escreveu:

> On Tue, 17 Jun 2025 at 17:00, Mauro Carvalho Chehab
> <mchehab+huawei@kernel.org> wrote:
> > >
> > > (2) is cleaner and faster, but (1) is easier to implement on an
> > > already-existing code.  
> >
> > The logic below implements (1). This seems to be the easiest way for
> > pyyaml. I will submit as 2 separate patches at the end of the next
> > version.
> >
> > Please notice that I didn't check yet for the "quality" of the
> > line numbers. Some tweaks could be needed later on.  
> 
> Thanks for working on this. I suppose we might be able to work on an
> evolution from (1) to (2) in a followup piece of work?

Yes, it shouldn't be hard for you to migrate to (2) in the future.

Currently the parser is stateless but, as there's now a class,
IMO the best approach would be to store the lines as an array of
tuples inside the YnlDocGenerator class, like this:

	self.lines = [
		(line_number1, message_string1),
		(line_number2, message_string2),
		(line_number3, message_string3),
		(line_number4, message_string4),
		...
	]

This way, the parse_yaml_file() method would just return self.lines.
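A minimal sketch of that idea (class, method, and attribute names
follow the discussion above; the actual YAML handling is omitted):

```python
class YnlDocGenerator:
    """Sketch: keep parsed lines paired with their source line numbers."""

    def __init__(self):
        # list of (line_number, text) tuples, as suggested above
        self.lines = []

    def parse_yaml_file(self, path):
        """Read a file and store each line with its 1-based number."""
        with open(path, encoding="utf-8") as f:
            self.lines = [(n, line.rstrip("\n"))
                          for n, line in enumerate(f, start=1)]
        return self.lines
```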

Thanks,
Mauro

^ permalink raw reply	[flat|nested] 42+ messages in thread

* Re: [PATCH v5 01/15] docs: conf.py: properly handle include and exclude patterns
  2025-06-18  2:42   ` Akira Yokosawa
@ 2025-06-18 11:44     ` Mauro Carvalho Chehab
  0 siblings, 0 replies; 42+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-18 11:44 UTC (permalink / raw)
  To: Akira Yokosawa
  Cc: Linux Doc Mailing List, Jonathan Corbet, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Hi Akira,

Em Wed, 18 Jun 2025 11:42:14 +0900
Akira Yokosawa <akiyks@gmail.com> escreveu:

> Hi Mauro,
> 
> A comment on compatibility with earlier Sphinx.
> 
> On Tue, 17 Jun 2025 10:01:58 +0200, Mauro Carvalho Chehab wrote:
> > When one does:
> > 	make SPHINXDIRS="foo" htmldocs
> > 
> > All patterns would be relative to Documentation/foo, which
> > causes the include/exclude patterns like:
> > 
> > 	include_patterns = [
> > 		...
> > 		f'foo/*.{ext}',
> > 	]
> > 
> > to break. This is not what is expected. Address it by
> > adding logic to dynamically adjust the patterns when
> > SPHINXDIRS is used.
> > 
> > That allows adding parsers for other file types.
> > 
> > Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
> > ---
> >  Documentation/conf.py | 52 +++++++++++++++++++++++++++++++++++++++----
> >  1 file changed, 48 insertions(+), 4 deletions(-)
> > 
> > diff --git a/Documentation/conf.py b/Documentation/conf.py
> > index 12de52a2b17e..e887c1b786a4 100644
> > --- a/Documentation/conf.py
> > +++ b/Documentation/conf.py
> > @@ -17,6 +17,54 @@ import os
> >  import sphinx
> >  import shutil
> >  
> > +# Location of Documentation/ directory
> > +doctree = os.path.abspath('.')
> > +
> > +# List of patterns that don't contain directory names, in glob format.
> > +include_patterns = ['**.rst']
> > +exclude_patterns = []
> > +  
> 
> Whereas "exclude_patterns" has been with us ever since Sphinx 1.0,
> "include_patterns" was added fairly recently in Sphinx 5.1 [1].
> 
> [1]: https://www.sphinx-doc.org/en/master/usage/configuration.html#confval-include_patterns
> 
> So, this breaks earlier Sphinx versions.

Heh, testing against old versions is harder with Python 3.13 (Fedora
42 default), as one library used by older Sphinx versions was dropped.

I found a way to keep compatibility back to Sphinx 3.4.3, with some
backward-compatibility logic in conf.py. I'll send the new version in a few.
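One way to sketch that kind of version fallback (a hypothetical helper,
not the actual conf.py code; only the shape of Sphinx's version_info
tuple is assumed from its public API):

```python
def patterns_for(version_info):
    """Pick pattern config depending on the Sphinx version.

    include_patterns is only honored since Sphinx 5.1, so older
    versions must fall back to exclude_patterns alone.
    """
    if version_info >= (5, 1):
        return {'include_patterns': ['**.rst'], 'exclude_patterns': []}
    # Older Sphinx: everything is included by default, prune via excludes.
    return {'exclude_patterns': ['output/**']}
```

In conf.py this would be driven by sphinx.version_info; the pattern
lists here are illustrative only.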

> Also, after applying all of v5 on top of docs-next, I see these new
> warnings with Sphinx 7.2.6 (of Ubuntu 24.04):
> 
> /<srcdir>/Documentation/output/ca.h.rst: WARNING: document isn't included in any toctree
> /<srcdir>/Documentation/output/cec.h.rst: WARNING: document isn't included in any toctree
> /<srcdir>/Documentation/output/dmx.h.rst: WARNING: document isn't included in any toctree
> /<srcdir>/Documentation/output/frontend.h.rst: WARNING: document isn't included in any toctree
> /<srcdir>/Documentation/output/lirc.h.rst: WARNING: document isn't included in any toctree
> /<srcdir>/Documentation/output/media.h.rst: WARNING: document isn't included in any toctree
> /<srcdir>/Documentation/output/net.h.rst: WARNING: document isn't included in any toctree
> /<srcdir>/Documentation/output/videodev2.h.rst: WARNING: document isn't included in any toctree

We should likely use a Sphinx extension for those as well. Those are
also built via some Makefile tricks that predate the time we started
adding our own extensions to the tree.

> Sphinx 7.3.7 and later are free of them.  I have no idea which change in
> Sphinx 7.3 got rid of them.
> 
> Now that the parallel build performance regression has been resolved in
> Sphinx 7.4, I don't think there is much demand for keeping older Sphinx
> versions supported.
> These build errors and extra warnings would encourage people to upgrade
> their Sphinx.  So I'm not going to nack this.
> 
> Of course, getting rid of above warnings with < Sphinx 7.3 would be ideal.

I'm all for using newer versions, but we need to check what LTS distros
are shipping these days.

On my machine, with -jauto, Sphinx 3.4.3 takes 11 minutes to build the
docs, twice the time of 8.2.3. IMO, this is a very good reason for
people to stop using legacy versions when possible :-)

Regards,
Mauro

^ permalink raw reply	[flat|nested] 42+ messages in thread

* Re: [PATCH v5 05/15] tools: ynl_gen_rst.py: make the index parser more generic
  2025-06-18  6:57     ` Mauro Carvalho Chehab
@ 2025-06-18 12:40       ` Simon Horman
  0 siblings, 0 replies; 42+ messages in thread
From: Simon Horman @ 2025-06-18 12:40 UTC (permalink / raw)
  To: Mauro Carvalho Chehab
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

On Wed, Jun 18, 2025 at 08:57:35AM +0200, Mauro Carvalho Chehab wrote:
> Em Tue, 17 Jun 2025 12:59:27 +0100
> Simon Horman <horms@kernel.org> escreveu:
> 
> > On Tue, Jun 17, 2025 at 10:02:02AM +0200, Mauro Carvalho Chehab wrote:
> > > It is not a good practice to store build-generated files
> > > inside $(srctree), as one may be using O=<BUILDDIR> and even
> > > have the Kernel on a read-only directory.
> > > 
> > > Change the YAML generation for netlink files to allow it
> > > to parse data based on the source or on the object tree.
> > > 
> > > Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
> > > ---
> > >  tools/net/ynl/pyynl/ynl_gen_rst.py | 22 ++++++++++++++++------
> > >  1 file changed, 16 insertions(+), 6 deletions(-)
> > > 
> > > diff --git a/tools/net/ynl/pyynl/ynl_gen_rst.py b/tools/net/ynl/pyynl/ynl_gen_rst.py
> > > index 7bfb8ceeeefc..b1e5acafb998 100755
> > > --- a/tools/net/ynl/pyynl/ynl_gen_rst.py
> > > +++ b/tools/net/ynl/pyynl/ynl_gen_rst.py
> > > @@ -365,6 +365,7 @@ def parse_arguments() -> argparse.Namespace:
> > >  
> > >      parser.add_argument("-v", "--verbose", action="store_true")
> > >      parser.add_argument("-o", "--output", help="Output file name")
> > > +    parser.add_argument("-d", "--input_dir", help="YAML input directory")
> > >  
> > >      # Index and input are mutually exclusive
> > >      group = parser.add_mutually_exclusive_group()
> > > @@ -405,11 +406,14 @@ def write_to_rstfile(content: str, filename: str) -> None:
> > >      """Write the generated content into an RST file"""
> > >      logging.debug("Saving RST file to %s", filename)
> > >  
> > > +    dir = os.path.dirname(filename)
> > > +    os.makedirs(dir, exist_ok=True)
> > > +
> > >      with open(filename, "w", encoding="utf-8") as rst_file:
> > >          rst_file.write(content)  
> > 
> > Hi Mauro,
> > 
> > With this patch applied I see the following, which did not happen before.
> 
> Thanks! This was an intermediate step. I'll just drop this patch and
> fix the conflicts in the next version.

Likewise, thanks.

^ permalink raw reply	[flat|nested] 42+ messages in thread

end of thread, other threads:[~2025-06-18 12:40 UTC | newest]

Thread overview: 42+ messages (download: mbox.gz / follow: Atom feed)
-- links below jump to the message on this page --
2025-06-17  8:01 [PATCH v5 00/15] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
2025-06-17  8:01 ` [PATCH v5 01/15] docs: conf.py: properly handle include and exclude patterns Mauro Carvalho Chehab
2025-06-17 10:38   ` Donald Hunter
2025-06-18  6:59     ` Mauro Carvalho Chehab
2025-06-18  2:42   ` Akira Yokosawa
2025-06-18 11:44     ` Mauro Carvalho Chehab
2025-06-17  8:01 ` [PATCH v5 02/15] docs: Makefile: disable check rules on make cleandocs Mauro Carvalho Chehab
2025-06-17  9:53   ` Breno Leitao
2025-06-17 10:38   ` Donald Hunter
2025-06-17  8:02 ` [PATCH v5 03/15] tools: ynl_gen_rst.py: create a top-level reference Mauro Carvalho Chehab
2025-06-17 10:40   ` Donald Hunter
2025-06-17  8:02 ` [PATCH v5 04/15] docs: netlink: netlink-raw.rst: use :ref: instead of :doc: Mauro Carvalho Chehab
2025-06-17  8:02 ` [PATCH v5 05/15] tools: ynl_gen_rst.py: make the index parser more generic Mauro Carvalho Chehab
2025-06-17 10:55   ` Donald Hunter
2025-06-17 11:59   ` Simon Horman
2025-06-18  6:57     ` Mauro Carvalho Chehab
2025-06-18 12:40       ` Simon Horman
2025-06-17  8:02 ` [PATCH v5 06/15] tools: ynl_gen_rst.py: Split library from command line tool Mauro Carvalho Chehab
2025-06-17 11:08   ` Donald Hunter
2025-06-17  8:02 ` [PATCH v5 07/15] scripts: lib: netlink_yml_parser.py: use classes Mauro Carvalho Chehab
2025-06-17  8:02 ` [PATCH v5 08/15] docs: netlink: index.rst: add a netlink index file Mauro Carvalho Chehab
2025-06-17 10:43   ` Donald Hunter
2025-06-17  8:02 ` [PATCH v5 09/15] tools: ynl_gen_rst.py: clanup coding style Mauro Carvalho Chehab
2025-06-17  9:49   ` Breno Leitao
2025-06-17  9:50   ` Breno Leitao
2025-06-17 11:12   ` Donald Hunter
2025-06-17  8:02 ` [PATCH v5 10/15] docs: sphinx: add a parser for yaml files for Netlink specs Mauro Carvalho Chehab
2025-06-17 12:35   ` Donald Hunter
2025-06-17 13:40     ` Mauro Carvalho Chehab
2025-06-17 16:00       ` Mauro Carvalho Chehab
2025-06-17 17:23         ` Donald Hunter
2025-06-18  8:21           ` Mauro Carvalho Chehab
2025-06-17  8:02 ` [PATCH v5 11/15] docs: use parser_yaml extension to handle " Mauro Carvalho Chehab
2025-06-17 12:45   ` Donald Hunter
2025-06-17  8:02 ` [PATCH v5 12/15] docs: uapi: netlink: update netlink specs link Mauro Carvalho Chehab
2025-06-17 12:46   ` Donald Hunter
2025-06-17  8:02 ` [PATCH v5 13/15] tools: ynl_gen_rst.py: drop support for generating index files Mauro Carvalho Chehab
2025-06-17  8:02 ` [PATCH v5 14/15] docs: netlink: remove obsolete .gitignore from unused directory Mauro Carvalho Chehab
2025-06-17 12:38   ` Donald Hunter
2025-06-17  8:02 ` [PATCH v5 15/15] MAINTAINERS: add netlink_yml_parser.py to linux-doc Mauro Carvalho Chehab
2025-06-17  9:50   ` Breno Leitao
2025-06-17 12:58 ` [PATCH v5 00/15] Don't generate netlink .rst files inside $(srctree) Donald Hunter

This is a public inbox, see mirroring instructions
for how to clone and mirror all data and code used for this inbox;
as well as URLs for NNTP newsgroup(s).