linux-doc.vger.kernel.org archive mirror
* [PATCH v2 00/12] Don't generate netlink .rst files inside $(srctree)
@ 2025-06-12 10:31 Mauro Carvalho Chehab
  2025-06-12 10:31 ` [PATCH v2 01/12] tools: ynl_gen_rst.py: create a top-level reference Mauro Carvalho Chehab
                   ` (12 more replies)
  0 siblings, 13 replies; 32+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-12 10:31 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Mauro Carvalho Chehab, linux-kernel,
	Akira Yokosawa, David S. Miller, Ignacio Encinas Rubio,
	Marco Elver, Shuah Khan, Donald Hunter, Eric Dumazet, Jan Stancek,
	Paolo Abeni, Ruben Wauters, joel, linux-kernel-mentees, lkmm,
	netdev, peterz, stern, Breno Leitao, Jakub Kicinski, Simon Horman

As discussed at:
   https://lore.kernel.org/all/20250610101331.62ba466f@foz.lan/

changeset f061c9f7d058 ("Documentation: Document each netlink family")
added logic that generates *.rst files inside $(srctree). This is bad when
O=<BUILDDIR> is used.

A recent change renamed the YAML files used by Netlink, revealing a bad
side effect: as "make cleandocs" doesn't clean the produced files, symbols
appear duplicated for people who don't build the kernel from scratch.

There are some possible solutions for that. The simplest one, which is what
this series addresses, places the built files inside Documentation/output.
The changes to do that are simple enough, but they have one drawback:
they require a (simple) template file for every netlink family file from
netlink/specs. The template is simple enough:

        .. kernel-include:: $BUILDDIR/networking/netlink_spec/<family>.rst

Part of the issue is that sphinx-build only produces HTML files for sources
inside the source tree (Documentation/).

To address that, add a YAML parser extension to Sphinx.

Note that this version has one drawback: it increases the
documentation build time. I suspect the culprit is Sphinx's
glob logic and the way it handles exclude_patterns: what happens is that
sphinx/project.py uses glob, which, in my experience, is slow
(because of that, I ended up implementing my own glob logic for kernel-doc).
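As a rough illustration of the alternative alluded to above, a plain
os.scandir() walk along these lines (a hypothetical sketch, not the actual
kernel-doc code) sidesteps glob's per-entry pattern matching:

```python
import os

def find_yaml_sources(root: str):
    """Yield every .yaml file under root via an iterative os.scandir()
    walk, avoiding the pattern-matching overhead of glob."""
    stack = [root]
    while stack:
        with os.scandir(stack.pop()) as it:
            for entry in it:
                if entry.is_dir(follow_symlinks=False):
                    stack.append(entry.path)
                elif entry.name.endswith(".yaml"):
                    yield entry.path
```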

On the plus side, the extension is flexible enough to handle other types
of YAML files, as the actual YAML conversion logic is outside the extension.
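In other words, the extension only needs to dispatch to a converter based
on the document path. A minimal sketch of that idea (all names here are
hypothetical, for illustration only — the real converter lives in
scripts/lib/netlink_yml_parser.py):

```python
from typing import Callable, Dict, Optional

def netlink_to_rst(src: str) -> str:
    """Placeholder for the real YAML-to-ReST conversion logic."""
    return ".. converted netlink spec\n"

# Map document path prefixes to converter callables; supporting a new
# YAML flavor would just mean adding an entry here, with no change to
# the Sphinx extension itself.
CONVERTERS: Dict[str, Callable[[str], str]] = {
    "netlink/specs/": netlink_to_rst,
}

def pick_converter(docname: str) -> Optional[Callable[[str], str]]:
    """Return the converter for a YAML document, or None to skip it."""
    for prefix, conv in CONVERTERS.items():
        if docname.startswith(prefix):
            return conv
    return None
```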

With this version, there's no need to add any template file per netlink/specs
file. Yet, Documentation/netlink/specs/index.rst requires updates as
spec files are added/renamed/removed. The already-existing script can
handle that automatically by running:

            tools/net/ynl/pyynl/ynl_gen_rst.py -x  -v -o Documentation/netlink/specs/index.rst

---

v2:
- Use a Sphinx extension to handle netlink files.

v1:
- Statically add template files as networking/netlink_spec/<family>.rst

Mauro Carvalho Chehab (12):
  tools: ynl_gen_rst.py: create a top-level reference
  docs: netlink: netlink-raw.rst: use :ref: instead of :doc:
  docs: netlink: don't ignore generated rst files
  tools: ynl_gen_rst.py: make the index parser more generic
  tools: ynl_gen_rst.py: Split library from command line tool
  scripts: lib: netlink_yml_parser.py: use classes
  tools: ynl_gen_rst.py: do some coding style cleanups
  scripts: netlink_yml_parser.py: improve index.rst generation
  docs: sphinx: add a parser template for yaml files
  docs: sphinx: parser_yaml.py: add Netlink specs parser
  docs: use parser_yaml extension to handle Netlink specs
  docs: conf.py: don't handle yaml files outside Netlink specs

 .pylintrc                                     |   2 +-
 Documentation/Makefile                        |  17 -
 Documentation/conf.py                         |  17 +-
 Documentation/netlink/specs/index.rst         |  38 ++
 Documentation/networking/index.rst            |   2 +-
 .../networking/netlink_spec/.gitignore        |   1 -
 .../networking/netlink_spec/readme.txt        |   4 -
 Documentation/sphinx/parser_yaml.py           |  80 ++++
 .../userspace-api/netlink/netlink-raw.rst     |   6 +-
 scripts/lib/netlink_yml_parser.py             | 394 ++++++++++++++++++
 tools/net/ynl/pyynl/ynl_gen_rst.py            | 378 +----------------
 11 files changed, 544 insertions(+), 395 deletions(-)
 create mode 100644 Documentation/netlink/specs/index.rst
 delete mode 100644 Documentation/networking/netlink_spec/.gitignore
 delete mode 100644 Documentation/networking/netlink_spec/readme.txt
 create mode 100755 Documentation/sphinx/parser_yaml.py
 create mode 100755 scripts/lib/netlink_yml_parser.py

-- 
2.49.0




* [PATCH v2 01/12] tools: ynl_gen_rst.py: create a top-level reference
  2025-06-12 10:31 [PATCH v2 00/12] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
@ 2025-06-12 10:31 ` Mauro Carvalho Chehab
  2025-06-12 10:31 ` [PATCH v2 02/12] docs: netlink: netlink-raw.rst: use :ref: instead of :doc: Mauro Carvalho Chehab
                   ` (11 subsequent siblings)
  12 siblings, 0 replies; 32+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-12 10:31 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Currently, rt documents are referenced with:

Documentation/userspace-api/netlink/netlink-raw.rst: :doc:`rt-link<../../networking/netlink_spec/rt-link>`
Documentation/userspace-api/netlink/netlink-raw.rst: :doc:`tc<../../networking/netlink_spec/tc>`
Documentation/userspace-api/netlink/netlink-raw.rst: :doc:`tc<../../networking/netlink_spec/tc>`

That is hard to maintain, and may break if we change the way
rst files are generated from YAML. It is better to use a
reference to the netlink family instead.

So, add a netlink-<foo> reference to all generated docs.
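The helper added by this patch has the same shape as the snippet below, so
the rt-link family document, for example, now opens with a stable anchor
that :ref: can target from anywhere in the docs:

```python
def rst_label(title: str) -> str:
    """Return a ReST anchor line, mirroring the rst_label() helper
    that this patch starts emitting at the top of each document."""
    return f".. _{title}:\n\n"

# The rt-link document now begins with ".. _netlink-rt-link:", which
# :ref:`rt-link<netlink-rt-link>` resolves to regardless of where the
# generated file ends up on disk.
anchor = rst_label("netlink-rt-link")
```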

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 tools/net/ynl/pyynl/ynl_gen_rst.py | 5 +++--
 1 file changed, 3 insertions(+), 2 deletions(-)

diff --git a/tools/net/ynl/pyynl/ynl_gen_rst.py b/tools/net/ynl/pyynl/ynl_gen_rst.py
index 0cb6348e28d3..7bfb8ceeeefc 100755
--- a/tools/net/ynl/pyynl/ynl_gen_rst.py
+++ b/tools/net/ynl/pyynl/ynl_gen_rst.py
@@ -314,10 +314,11 @@ def parse_yaml(obj: Dict[str, Any]) -> str:
 
     # Main header
 
-    lines.append(rst_header())
-
     family = obj['name']
 
+    lines.append(rst_header())
+    lines.append(rst_label("netlink-" + family))
+
     title = f"Family ``{family}`` netlink specification"
     lines.append(rst_title(title))
     lines.append(rst_paragraph(".. contents:: :depth: 3\n"))
-- 
2.49.0



* [PATCH v2 02/12] docs: netlink: netlink-raw.rst: use :ref: instead of :doc:
  2025-06-12 10:31 [PATCH v2 00/12] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
  2025-06-12 10:31 ` [PATCH v2 01/12] tools: ynl_gen_rst.py: create a top-level reference Mauro Carvalho Chehab
@ 2025-06-12 10:31 ` Mauro Carvalho Chehab
  2025-06-12 10:31 ` [PATCH v2 03/12] docs: netlink: don't ignore generated rst files Mauro Carvalho Chehab
                   ` (10 subsequent siblings)
  12 siblings, 0 replies; 32+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-12 10:31 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Having :doc: references with relative paths doesn't always work,
as they may cause trouble when O= is used. So, replace them with the
Sphinx cross-reference tags that are now created by ynl_gen_rst.py.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 Documentation/userspace-api/netlink/netlink-raw.rst | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/Documentation/userspace-api/netlink/netlink-raw.rst b/Documentation/userspace-api/netlink/netlink-raw.rst
index 31fc91020eb3..aae296c170c5 100644
--- a/Documentation/userspace-api/netlink/netlink-raw.rst
+++ b/Documentation/userspace-api/netlink/netlink-raw.rst
@@ -62,8 +62,8 @@ Sub-messages
 ------------
 
 Several raw netlink families such as
-:doc:`rt-link<../../networking/netlink_spec/rt-link>` and
-:doc:`tc<../../networking/netlink_spec/tc>` use attribute nesting as an
+:ref:`rt-link<netlink-rt-link>` and
+:ref:`tc<netlink-tc>` use attribute nesting as an
 abstraction to carry module specific information.
 
 Conceptually it looks as follows::
@@ -162,7 +162,7 @@ then this is an error.
 Nested struct definitions
 -------------------------
 
-Many raw netlink families such as :doc:`tc<../../networking/netlink_spec/tc>`
+Many raw netlink families such as :ref:`tc<netlink-tc>`
 make use of nested struct definitions. The ``netlink-raw`` schema makes it
 possible to embed a struct within a struct definition using the ``struct``
 property. For example, the following struct definition embeds the
-- 
2.49.0



* [PATCH v2 03/12] docs: netlink: don't ignore generated rst files
  2025-06-12 10:31 [PATCH v2 00/12] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
  2025-06-12 10:31 ` [PATCH v2 01/12] tools: ynl_gen_rst.py: create a top-level reference Mauro Carvalho Chehab
  2025-06-12 10:31 ` [PATCH v2 02/12] docs: netlink: netlink-raw.rst: use :ref: instead of :doc: Mauro Carvalho Chehab
@ 2025-06-12 10:31 ` Mauro Carvalho Chehab
  2025-06-12 10:31 ` [PATCH v2 04/12] tools: ynl_gen_rst.py: make the index parser more generic Mauro Carvalho Chehab
                   ` (9 subsequent siblings)
  12 siblings, 0 replies; 32+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-12 10:31 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Currently, the build system generates ReST files inside the
source directory. This is not a good idea, especially when
files are renamed, as make clean won't get rid of them.

As the first step to address the issue, stop ignoring those
files. This way, we can see exactly what has been produced
at build time inside $(srctree):

        Documentation/networking/netlink_spec/conntrack.rst
        Documentation/networking/netlink_spec/devlink.rst
        Documentation/networking/netlink_spec/dpll.rst
        Documentation/networking/netlink_spec/ethtool.rst
        Documentation/networking/netlink_spec/fou.rst
        Documentation/networking/netlink_spec/handshake.rst
        Documentation/networking/netlink_spec/index.rst
        Documentation/networking/netlink_spec/lockd.rst
        Documentation/networking/netlink_spec/mptcp_pm.rst
        Documentation/networking/netlink_spec/net_shaper.rst
        Documentation/networking/netlink_spec/netdev.rst
        Documentation/networking/netlink_spec/nfsd.rst
        Documentation/networking/netlink_spec/nftables.rst
        Documentation/networking/netlink_spec/nl80211.rst
        Documentation/networking/netlink_spec/nlctrl.rst
        Documentation/networking/netlink_spec/ovs_datapath.rst
        Documentation/networking/netlink_spec/ovs_flow.rst
        Documentation/networking/netlink_spec/ovs_vport.rst
        Documentation/networking/netlink_spec/rt_addr.rst
        Documentation/networking/netlink_spec/rt_link.rst
        Documentation/networking/netlink_spec/rt_neigh.rst
        Documentation/networking/netlink_spec/rt_route.rst
        Documentation/networking/netlink_spec/rt_rule.rst
        Documentation/networking/netlink_spec/tc.rst
        Documentation/networking/netlink_spec/tcp_metrics.rst
        Documentation/networking/netlink_spec/team.rst

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 Documentation/networking/netlink_spec/.gitignore | 1 -
 1 file changed, 1 deletion(-)
 delete mode 100644 Documentation/networking/netlink_spec/.gitignore

diff --git a/Documentation/networking/netlink_spec/.gitignore b/Documentation/networking/netlink_spec/.gitignore
deleted file mode 100644
index 30d85567b592..000000000000
--- a/Documentation/networking/netlink_spec/.gitignore
+++ /dev/null
@@ -1 +0,0 @@
-*.rst
-- 
2.49.0



* [PATCH v2 04/12] tools: ynl_gen_rst.py: make the index parser more generic
  2025-06-12 10:31 [PATCH v2 00/12] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (2 preceding siblings ...)
  2025-06-12 10:31 ` [PATCH v2 03/12] docs: netlink: don't ignore generated rst files Mauro Carvalho Chehab
@ 2025-06-12 10:31 ` Mauro Carvalho Chehab
  2025-06-12 10:31 ` [PATCH v2 05/12] tools: ynl_gen_rst.py: Split library from command line tool Mauro Carvalho Chehab
                   ` (8 subsequent siblings)
  12 siblings, 0 replies; 32+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-12 10:31 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

It is not a good practice to store build-generated files
inside $(srctree), as one may be using O=<BUILDDIR> and may even
have the kernel in a read-only directory.

Change the YAML generation for netlink files to allow it
to parse data from either the source tree or the object tree.
Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 tools/net/ynl/pyynl/ynl_gen_rst.py | 22 ++++++++++++++++------
 1 file changed, 16 insertions(+), 6 deletions(-)

diff --git a/tools/net/ynl/pyynl/ynl_gen_rst.py b/tools/net/ynl/pyynl/ynl_gen_rst.py
index 7bfb8ceeeefc..b1e5acafb998 100755
--- a/tools/net/ynl/pyynl/ynl_gen_rst.py
+++ b/tools/net/ynl/pyynl/ynl_gen_rst.py
@@ -365,6 +365,7 @@ def parse_arguments() -> argparse.Namespace:
 
     parser.add_argument("-v", "--verbose", action="store_true")
     parser.add_argument("-o", "--output", help="Output file name")
+    parser.add_argument("-d", "--input_dir", help="YAML input directory")
 
     # Index and input are mutually exclusive
     group = parser.add_mutually_exclusive_group()
@@ -405,11 +406,14 @@ def write_to_rstfile(content: str, filename: str) -> None:
     """Write the generated content into an RST file"""
     logging.debug("Saving RST file to %s", filename)
 
+    dir = os.path.dirname(filename)
+    os.makedirs(dir, exist_ok=True)
+
     with open(filename, "w", encoding="utf-8") as rst_file:
         rst_file.write(content)
 
 
-def generate_main_index_rst(output: str) -> None:
+def generate_main_index_rst(output: str, index_dir: str) -> None:
     """Generate the `networking_spec/index` content and write to the file"""
     lines = []
 
@@ -418,12 +422,18 @@ def generate_main_index_rst(output: str) -> None:
     lines.append(rst_title("Netlink Family Specifications"))
     lines.append(rst_toctree(1))
 
-    index_dir = os.path.dirname(output)
-    logging.debug("Looking for .rst files in %s", index_dir)
+    index_fname = os.path.basename(output)
+    base, ext = os.path.splitext(index_fname)
+
+    if not index_dir:
+        index_dir = os.path.dirname(output)
+
+    logging.debug(f"Looking for {ext} files in %s", index_dir)
     for filename in sorted(os.listdir(index_dir)):
-        if not filename.endswith(".rst") or filename == "index.rst":
+        if not filename.endswith(ext) or filename == index_fname:
             continue
-        lines.append(f"   {filename.replace('.rst', '')}\n")
+        base, ext = os.path.splitext(filename)
+        lines.append(f"   {base}\n")
 
     logging.debug("Writing an index file at %s", output)
     write_to_rstfile("".join(lines), output)
@@ -447,7 +457,7 @@ def main() -> None:
 
     if args.index:
         # Generate the index RST file
-        generate_main_index_rst(args.output)
+        generate_main_index_rst(args.output, args.input_dir)
 
 
 if __name__ == "__main__":
-- 
2.49.0



* [PATCH v2 05/12] tools: ynl_gen_rst.py: Split library from command line tool
  2025-06-12 10:31 [PATCH v2 00/12] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (3 preceding siblings ...)
  2025-06-12 10:31 ` [PATCH v2 04/12] tools: ynl_gen_rst.py: make the index parser more generic Mauro Carvalho Chehab
@ 2025-06-12 10:31 ` Mauro Carvalho Chehab
  2025-06-13 11:13   ` Donald Hunter
  2025-06-12 10:31 ` [PATCH v2 06/12] scripts: lib: netlink_yml_parser.py: use classes Mauro Carvalho Chehab
                   ` (7 subsequent siblings)
  12 siblings, 1 reply; 32+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-12 10:31 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

As we'll be using the Netlink specs parser inside a Sphinx
extension, move the library part out of the command line tool.

No functional changes.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 scripts/lib/netlink_yml_parser.py  | 391 +++++++++++++++++++++++++++++
 tools/net/ynl/pyynl/ynl_gen_rst.py | 374 +--------------------------
 2 files changed, 401 insertions(+), 364 deletions(-)
 create mode 100755 scripts/lib/netlink_yml_parser.py

diff --git a/scripts/lib/netlink_yml_parser.py b/scripts/lib/netlink_yml_parser.py
new file mode 100755
index 000000000000..3c15b578f947
--- /dev/null
+++ b/scripts/lib/netlink_yml_parser.py
@@ -0,0 +1,391 @@
+#!/usr/bin/env python3
+# SPDX-License-Identifier: GPL-2.0
+# -*- coding: utf-8; mode: python -*-
+
+"""
+    Script to auto generate the documentation for Netlink specifications.
+
+    :copyright:  Copyright (C) 2023  Breno Leitao <leitao@debian.org>
+    :license:    GPL Version 2, June 1991 see linux/COPYING for details.
+
+    This script performs extensive parsing to the Linux kernel's netlink YAML
+    spec files, in an effort to avoid needing to heavily mark up the original
+    YAML file.
+
+    This code is split in three big parts:
+        1) RST formatters: Use to convert a string to a RST output
+        2) Parser helpers: Functions to parse the YAML data structure
+        3) Main function and small helpers
+"""
+
+from typing import Any, Dict, List
+import os.path
+import logging
+import yaml
+
+
+SPACE_PER_LEVEL = 4
+
+
+# RST Formatters
+# ==============
+def headroom(level: int) -> str:
+    """Return space to format"""
+    return " " * (level * SPACE_PER_LEVEL)
+
+
+def bold(text: str) -> str:
+    """Format bold text"""
+    return f"**{text}**"
+
+
+def inline(text: str) -> str:
+    """Format inline text"""
+    return f"``{text}``"
+
+
+def sanitize(text: str) -> str:
+    """Remove newlines and multiple spaces"""
+    # This is useful for some fields that are spread across multiple lines
+    return str(text).replace("\n", " ").strip()
+
+
+def rst_fields(key: str, value: str, level: int = 0) -> str:
+    """Return a RST formatted field"""
+    return headroom(level) + f":{key}: {value}"
+
+
+def rst_definition(key: str, value: Any, level: int = 0) -> str:
+    """Format a single rst definition"""
+    return headroom(level) + key + "\n" + headroom(level + 1) + str(value)
+
+
+def rst_paragraph(paragraph: str, level: int = 0) -> str:
+    """Return a formatted paragraph"""
+    return headroom(level) + paragraph
+
+
+def rst_bullet(item: str, level: int = 0) -> str:
+    """Return a formatted a bullet"""
+    return headroom(level) + f"- {item}"
+
+
+def rst_subsection(title: str) -> str:
+    """Add a sub-section to the document"""
+    return f"{title}\n" + "-" * len(title)
+
+
+def rst_subsubsection(title: str) -> str:
+    """Add a sub-sub-section to the document"""
+    return f"{title}\n" + "~" * len(title)
+
+
+def rst_section(namespace: str, prefix: str, title: str) -> str:
+    """Add a section to the document"""
+    return f".. _{namespace}-{prefix}-{title}:\n\n{title}\n" + "=" * len(title)
+
+
+def rst_subtitle(title: str) -> str:
+    """Add a subtitle to the document"""
+    return "\n" + "-" * len(title) + f"\n{title}\n" + "-" * len(title) + "\n\n"
+
+
+def rst_title(title: str) -> str:
+    """Add a title to the document"""
+    return "=" * len(title) + f"\n{title}\n" + "=" * len(title) + "\n\n"
+
+
+def rst_list_inline(list_: List[str], level: int = 0) -> str:
+    """Format a list using inlines"""
+    return headroom(level) + "[" + ", ".join(inline(i) for i in list_) + "]"
+
+
+def rst_ref(namespace: str, prefix: str, name: str) -> str:
+    """Add a hyperlink to the document"""
+    mappings = {'enum': 'definition',
+                'fixed-header': 'definition',
+                'nested-attributes': 'attribute-set',
+                'struct': 'definition'}
+    if prefix in mappings:
+        prefix = mappings[prefix]
+    return f":ref:`{namespace}-{prefix}-{name}`"
+
+
+def rst_header() -> str:
+    """The headers for all the auto generated RST files"""
+    lines = []
+
+    lines.append(rst_paragraph(".. SPDX-License-Identifier: GPL-2.0"))
+    lines.append(rst_paragraph(".. NOTE: This document was auto-generated.\n\n"))
+
+    return "\n".join(lines)
+
+
+def rst_toctree(maxdepth: int = 2) -> str:
+    """Generate a toctree RST primitive"""
+    lines = []
+
+    lines.append(".. toctree::")
+    lines.append(f"   :maxdepth: {maxdepth}\n\n")
+
+    return "\n".join(lines)
+
+
+def rst_label(title: str) -> str:
+    """Return a formatted label"""
+    return f".. _{title}:\n\n"
+
+
+# Parsers
+# =======
+
+
+def parse_mcast_group(mcast_group: List[Dict[str, Any]]) -> str:
+    """Parse 'multicast' group list and return a formatted string"""
+    lines = []
+    for group in mcast_group:
+        lines.append(rst_bullet(group["name"]))
+
+    return "\n".join(lines)
+
+
+def parse_do(do_dict: Dict[str, Any], level: int = 0) -> str:
+    """Parse 'do' section and return a formatted string"""
+    lines = []
+    for key in do_dict.keys():
+        lines.append(rst_paragraph(bold(key), level + 1))
+        if key in ['request', 'reply']:
+            lines.append(parse_do_attributes(do_dict[key], level + 1) + "\n")
+        else:
+            lines.append(headroom(level + 2) + do_dict[key] + "\n")
+
+    return "\n".join(lines)
+
+
+def parse_do_attributes(attrs: Dict[str, Any], level: int = 0) -> str:
+    """Parse 'attributes' section"""
+    if "attributes" not in attrs:
+        return ""
+    lines = [rst_fields("attributes", rst_list_inline(attrs["attributes"]), level + 1)]
+
+    return "\n".join(lines)
+
+
+def parse_operations(operations: List[Dict[str, Any]], namespace: str) -> str:
+    """Parse operations block"""
+    preprocessed = ["name", "doc", "title", "do", "dump", "flags"]
+    linkable = ["fixed-header", "attribute-set"]
+    lines = []
+
+    for operation in operations:
+        lines.append(rst_section(namespace, 'operation', operation["name"]))
+        lines.append(rst_paragraph(operation["doc"]) + "\n")
+
+        for key in operation.keys():
+            if key in preprocessed:
+                # Skip the special fields
+                continue
+            value = operation[key]
+            if key in linkable:
+                value = rst_ref(namespace, key, value)
+            lines.append(rst_fields(key, value, 0))
+        if 'flags' in operation:
+            lines.append(rst_fields('flags', rst_list_inline(operation['flags'])))
+
+        if "do" in operation:
+            lines.append(rst_paragraph(":do:", 0))
+            lines.append(parse_do(operation["do"], 0))
+        if "dump" in operation:
+            lines.append(rst_paragraph(":dump:", 0))
+            lines.append(parse_do(operation["dump"], 0))
+
+        # New line after fields
+        lines.append("\n")
+
+    return "\n".join(lines)
+
+
+def parse_entries(entries: List[Dict[str, Any]], level: int) -> str:
+    """Parse a list of entries"""
+    ignored = ["pad"]
+    lines = []
+    for entry in entries:
+        if isinstance(entry, dict):
+            # entries could be a list or a dictionary
+            field_name = entry.get("name", "")
+            if field_name in ignored:
+                continue
+            type_ = entry.get("type")
+            if type_:
+                field_name += f" ({inline(type_)})"
+            lines.append(
+                rst_fields(field_name, sanitize(entry.get("doc", "")), level)
+            )
+        elif isinstance(entry, list):
+            lines.append(rst_list_inline(entry, level))
+        else:
+            lines.append(rst_bullet(inline(sanitize(entry)), level))
+
+    lines.append("\n")
+    return "\n".join(lines)
+
+
+def parse_definitions(defs: Dict[str, Any], namespace: str) -> str:
+    """Parse definitions section"""
+    preprocessed = ["name", "entries", "members"]
+    ignored = ["render-max"]  # This is not printed
+    lines = []
+
+    for definition in defs:
+        lines.append(rst_section(namespace, 'definition', definition["name"]))
+        for k in definition.keys():
+            if k in preprocessed + ignored:
+                continue
+            lines.append(rst_fields(k, sanitize(definition[k]), 0))
+
+        # Field list needs to finish with a new line
+        lines.append("\n")
+        if "entries" in definition:
+            lines.append(rst_paragraph(":entries:", 0))
+            lines.append(parse_entries(definition["entries"], 1))
+        if "members" in definition:
+            lines.append(rst_paragraph(":members:", 0))
+            lines.append(parse_entries(definition["members"], 1))
+
+    return "\n".join(lines)
+
+
+def parse_attr_sets(entries: List[Dict[str, Any]], namespace: str) -> str:
+    """Parse attribute from attribute-set"""
+    preprocessed = ["name", "type"]
+    linkable = ["enum", "nested-attributes", "struct", "sub-message"]
+    ignored = ["checks"]
+    lines = []
+
+    for entry in entries:
+        lines.append(rst_section(namespace, 'attribute-set', entry["name"]))
+        for attr in entry["attributes"]:
+            type_ = attr.get("type")
+            attr_line = attr["name"]
+            if type_:
+                # Add the attribute type in the same line
+                attr_line += f" ({inline(type_)})"
+
+            lines.append(rst_subsubsection(attr_line))
+
+            for k in attr.keys():
+                if k in preprocessed + ignored:
+                    continue
+                if k in linkable:
+                    value = rst_ref(namespace, k, attr[k])
+                else:
+                    value = sanitize(attr[k])
+                lines.append(rst_fields(k, value, 0))
+            lines.append("\n")
+
+    return "\n".join(lines)
+
+
+def parse_sub_messages(entries: List[Dict[str, Any]], namespace: str) -> str:
+    """Parse sub-message definitions"""
+    lines = []
+
+    for entry in entries:
+        lines.append(rst_section(namespace, 'sub-message', entry["name"]))
+        for fmt in entry["formats"]:
+            value = fmt["value"]
+
+            lines.append(rst_bullet(bold(value)))
+            for attr in ['fixed-header', 'attribute-set']:
+                if attr in fmt:
+                    lines.append(rst_fields(attr,
+                                            rst_ref(namespace, attr, fmt[attr]),
+                                            1))
+            lines.append("\n")
+
+    return "\n".join(lines)
+
+
+def parse_yaml(obj: Dict[str, Any]) -> str:
+    """Format the whole YAML into a RST string"""
+    lines = []
+
+    # Main header
+
+    family = obj['name']
+
+    lines.append(rst_header())
+    lines.append(rst_label("netlink-" + family))
+
+    title = f"Family ``{family}`` netlink specification"
+    lines.append(rst_title(title))
+    lines.append(rst_paragraph(".. contents:: :depth: 3\n"))
+
+    if "doc" in obj:
+        lines.append(rst_subtitle("Summary"))
+        lines.append(rst_paragraph(obj["doc"], 0))
+
+    # Operations
+    if "operations" in obj:
+        lines.append(rst_subtitle("Operations"))
+        lines.append(parse_operations(obj["operations"]["list"], family))
+
+    # Multicast groups
+    if "mcast-groups" in obj:
+        lines.append(rst_subtitle("Multicast groups"))
+        lines.append(parse_mcast_group(obj["mcast-groups"]["list"]))
+
+    # Definitions
+    if "definitions" in obj:
+        lines.append(rst_subtitle("Definitions"))
+        lines.append(parse_definitions(obj["definitions"], family))
+
+    # Attributes set
+    if "attribute-sets" in obj:
+        lines.append(rst_subtitle("Attribute sets"))
+        lines.append(parse_attr_sets(obj["attribute-sets"], family))
+
+    # Sub-messages
+    if "sub-messages" in obj:
+        lines.append(rst_subtitle("Sub-messages"))
+        lines.append(parse_sub_messages(obj["sub-messages"], family))
+
+    return "\n".join(lines)
+
+
+# Main functions
+# ==============
+
+
+def parse_yaml_file(filename: str) -> str:
+    """Transform the YAML specified by filename into an RST-formatted string"""
+    with open(filename, "r", encoding="utf-8") as spec_file:
+        yaml_data = yaml.safe_load(spec_file)
+        content = parse_yaml(yaml_data)
+
+    return content
+
+
+def generate_main_index_rst(output: str, index_dir: str) -> str:
+    """Generate the `networking_spec/index` content and write to the file"""
+    lines = []
+
+    lines.append(rst_header())
+    lines.append(rst_label("specs"))
+    lines.append(rst_title("Netlink Family Specifications"))
+    lines.append(rst_toctree(1))
+
+    index_fname = os.path.basename(output)
+    base, ext = os.path.splitext(index_fname)
+
+    if not index_dir:
+        index_dir = os.path.dirname(output)
+
+    logging.debug(f"Looking for {ext} files in %s", index_dir)
+    for filename in sorted(os.listdir(index_dir)):
+        if not filename.endswith(ext) or filename == index_fname:
+            continue
+        base, ext = os.path.splitext(filename)
+        lines.append(f"   {base}\n")
+
+    return "".join(lines), output
diff --git a/tools/net/ynl/pyynl/ynl_gen_rst.py b/tools/net/ynl/pyynl/ynl_gen_rst.py
index b1e5acafb998..38dafe3d9179 100755
--- a/tools/net/ynl/pyynl/ynl_gen_rst.py
+++ b/tools/net/ynl/pyynl/ynl_gen_rst.py
@@ -18,345 +18,17 @@
         3) Main function and small helpers
 """
 
-from typing import Any, Dict, List
 import os.path
 import sys
 import argparse
 import logging
-import yaml
 
+LIB_DIR = "../../../../scripts/lib"
+SRC_DIR = os.path.dirname(os.path.realpath(__file__))
 
-SPACE_PER_LEVEL = 4
+sys.path.insert(0, os.path.join(SRC_DIR, LIB_DIR))
 
-
-# RST Formatters
-# ==============
-def headroom(level: int) -> str:
-    """Return space to format"""
-    return " " * (level * SPACE_PER_LEVEL)
-
-
-def bold(text: str) -> str:
-    """Format bold text"""
-    return f"**{text}**"
-
-
-def inline(text: str) -> str:
-    """Format inline text"""
-    return f"``{text}``"
-
-
-def sanitize(text: str) -> str:
-    """Remove newlines and multiple spaces"""
-    # This is useful for some fields that are spread across multiple lines
-    return str(text).replace("\n", " ").strip()
-
-
-def rst_fields(key: str, value: str, level: int = 0) -> str:
-    """Return a RST formatted field"""
-    return headroom(level) + f":{key}: {value}"
-
-
-def rst_definition(key: str, value: Any, level: int = 0) -> str:
-    """Format a single rst definition"""
-    return headroom(level) + key + "\n" + headroom(level + 1) + str(value)
-
-
-def rst_paragraph(paragraph: str, level: int = 0) -> str:
-    """Return a formatted paragraph"""
-    return headroom(level) + paragraph
-
-
-def rst_bullet(item: str, level: int = 0) -> str:
-    """Return a formatted a bullet"""
-    return headroom(level) + f"- {item}"
-
-
-def rst_subsection(title: str) -> str:
-    """Add a sub-section to the document"""
-    return f"{title}\n" + "-" * len(title)
-
-
-def rst_subsubsection(title: str) -> str:
-    """Add a sub-sub-section to the document"""
-    return f"{title}\n" + "~" * len(title)
-
-
-def rst_section(namespace: str, prefix: str, title: str) -> str:
-    """Add a section to the document"""
-    return f".. _{namespace}-{prefix}-{title}:\n\n{title}\n" + "=" * len(title)
-
-
-def rst_subtitle(title: str) -> str:
-    """Add a subtitle to the document"""
-    return "\n" + "-" * len(title) + f"\n{title}\n" + "-" * len(title) + "\n\n"
-
-
-def rst_title(title: str) -> str:
-    """Add a title to the document"""
-    return "=" * len(title) + f"\n{title}\n" + "=" * len(title) + "\n\n"
-
-
-def rst_list_inline(list_: List[str], level: int = 0) -> str:
-    """Format a list using inlines"""
-    return headroom(level) + "[" + ", ".join(inline(i) for i in list_) + "]"
-
-
-def rst_ref(namespace: str, prefix: str, name: str) -> str:
-    """Add a hyperlink to the document"""
-    mappings = {'enum': 'definition',
-                'fixed-header': 'definition',
-                'nested-attributes': 'attribute-set',
-                'struct': 'definition'}
-    if prefix in mappings:
-        prefix = mappings[prefix]
-    return f":ref:`{namespace}-{prefix}-{name}`"
-
-
-def rst_header() -> str:
-    """The headers for all the auto generated RST files"""
-    lines = []
-
-    lines.append(rst_paragraph(".. SPDX-License-Identifier: GPL-2.0"))
-    lines.append(rst_paragraph(".. NOTE: This document was auto-generated.\n\n"))
-
-    return "\n".join(lines)
-
-
-def rst_toctree(maxdepth: int = 2) -> str:
-    """Generate a toctree RST primitive"""
-    lines = []
-
-    lines.append(".. toctree::")
-    lines.append(f"   :maxdepth: {maxdepth}\n\n")
-
-    return "\n".join(lines)
-
-
-def rst_label(title: str) -> str:
-    """Return a formatted label"""
-    return f".. _{title}:\n\n"
-
-
-# Parsers
-# =======
-
-
-def parse_mcast_group(mcast_group: List[Dict[str, Any]]) -> str:
-    """Parse 'multicast' group list and return a formatted string"""
-    lines = []
-    for group in mcast_group:
-        lines.append(rst_bullet(group["name"]))
-
-    return "\n".join(lines)
-
-
-def parse_do(do_dict: Dict[str, Any], level: int = 0) -> str:
-    """Parse 'do' section and return a formatted string"""
-    lines = []
-    for key in do_dict.keys():
-        lines.append(rst_paragraph(bold(key), level + 1))
-        if key in ['request', 'reply']:
-            lines.append(parse_do_attributes(do_dict[key], level + 1) + "\n")
-        else:
-            lines.append(headroom(level + 2) + do_dict[key] + "\n")
-
-    return "\n".join(lines)
-
-
-def parse_do_attributes(attrs: Dict[str, Any], level: int = 0) -> str:
-    """Parse 'attributes' section"""
-    if "attributes" not in attrs:
-        return ""
-    lines = [rst_fields("attributes", rst_list_inline(attrs["attributes"]), level + 1)]
-
-    return "\n".join(lines)
-
-
-def parse_operations(operations: List[Dict[str, Any]], namespace: str) -> str:
-    """Parse operations block"""
-    preprocessed = ["name", "doc", "title", "do", "dump", "flags"]
-    linkable = ["fixed-header", "attribute-set"]
-    lines = []
-
-    for operation in operations:
-        lines.append(rst_section(namespace, 'operation', operation["name"]))
-        lines.append(rst_paragraph(operation["doc"]) + "\n")
-
-        for key in operation.keys():
-            if key in preprocessed:
-                # Skip the special fields
-                continue
-            value = operation[key]
-            if key in linkable:
-                value = rst_ref(namespace, key, value)
-            lines.append(rst_fields(key, value, 0))
-        if 'flags' in operation:
-            lines.append(rst_fields('flags', rst_list_inline(operation['flags'])))
-
-        if "do" in operation:
-            lines.append(rst_paragraph(":do:", 0))
-            lines.append(parse_do(operation["do"], 0))
-        if "dump" in operation:
-            lines.append(rst_paragraph(":dump:", 0))
-            lines.append(parse_do(operation["dump"], 0))
-
-        # New line after fields
-        lines.append("\n")
-
-    return "\n".join(lines)
-
-
-def parse_entries(entries: List[Dict[str, Any]], level: int) -> str:
-    """Parse a list of entries"""
-    ignored = ["pad"]
-    lines = []
-    for entry in entries:
-        if isinstance(entry, dict):
-            # entries could be a list or a dictionary
-            field_name = entry.get("name", "")
-            if field_name in ignored:
-                continue
-            type_ = entry.get("type")
-            if type_:
-                field_name += f" ({inline(type_)})"
-            lines.append(
-                rst_fields(field_name, sanitize(entry.get("doc", "")), level)
-            )
-        elif isinstance(entry, list):
-            lines.append(rst_list_inline(entry, level))
-        else:
-            lines.append(rst_bullet(inline(sanitize(entry)), level))
-
-    lines.append("\n")
-    return "\n".join(lines)
-
-
-def parse_definitions(defs: Dict[str, Any], namespace: str) -> str:
-    """Parse definitions section"""
-    preprocessed = ["name", "entries", "members"]
-    ignored = ["render-max"]  # This is not printed
-    lines = []
-
-    for definition in defs:
-        lines.append(rst_section(namespace, 'definition', definition["name"]))
-        for k in definition.keys():
-            if k in preprocessed + ignored:
-                continue
-            lines.append(rst_fields(k, sanitize(definition[k]), 0))
-
-        # Field list needs to finish with a new line
-        lines.append("\n")
-        if "entries" in definition:
-            lines.append(rst_paragraph(":entries:", 0))
-            lines.append(parse_entries(definition["entries"], 1))
-        if "members" in definition:
-            lines.append(rst_paragraph(":members:", 0))
-            lines.append(parse_entries(definition["members"], 1))
-
-    return "\n".join(lines)
-
-
-def parse_attr_sets(entries: List[Dict[str, Any]], namespace: str) -> str:
-    """Parse attribute from attribute-set"""
-    preprocessed = ["name", "type"]
-    linkable = ["enum", "nested-attributes", "struct", "sub-message"]
-    ignored = ["checks"]
-    lines = []
-
-    for entry in entries:
-        lines.append(rst_section(namespace, 'attribute-set', entry["name"]))
-        for attr in entry["attributes"]:
-            type_ = attr.get("type")
-            attr_line = attr["name"]
-            if type_:
-                # Add the attribute type in the same line
-                attr_line += f" ({inline(type_)})"
-
-            lines.append(rst_subsubsection(attr_line))
-
-            for k in attr.keys():
-                if k in preprocessed + ignored:
-                    continue
-                if k in linkable:
-                    value = rst_ref(namespace, k, attr[k])
-                else:
-                    value = sanitize(attr[k])
-                lines.append(rst_fields(k, value, 0))
-            lines.append("\n")
-
-    return "\n".join(lines)
-
-
-def parse_sub_messages(entries: List[Dict[str, Any]], namespace: str) -> str:
-    """Parse sub-message definitions"""
-    lines = []
-
-    for entry in entries:
-        lines.append(rst_section(namespace, 'sub-message', entry["name"]))
-        for fmt in entry["formats"]:
-            value = fmt["value"]
-
-            lines.append(rst_bullet(bold(value)))
-            for attr in ['fixed-header', 'attribute-set']:
-                if attr in fmt:
-                    lines.append(rst_fields(attr,
-                                            rst_ref(namespace, attr, fmt[attr]),
-                                            1))
-            lines.append("\n")
-
-    return "\n".join(lines)
-
-
-def parse_yaml(obj: Dict[str, Any]) -> str:
-    """Format the whole YAML into a RST string"""
-    lines = []
-
-    # Main header
-
-    family = obj['name']
-
-    lines.append(rst_header())
-    lines.append(rst_label("netlink-" + family))
-
-    title = f"Family ``{family}`` netlink specification"
-    lines.append(rst_title(title))
-    lines.append(rst_paragraph(".. contents:: :depth: 3\n"))
-
-    if "doc" in obj:
-        lines.append(rst_subtitle("Summary"))
-        lines.append(rst_paragraph(obj["doc"], 0))
-
-    # Operations
-    if "operations" in obj:
-        lines.append(rst_subtitle("Operations"))
-        lines.append(parse_operations(obj["operations"]["list"], family))
-
-    # Multicast groups
-    if "mcast-groups" in obj:
-        lines.append(rst_subtitle("Multicast groups"))
-        lines.append(parse_mcast_group(obj["mcast-groups"]["list"]))
-
-    # Definitions
-    if "definitions" in obj:
-        lines.append(rst_subtitle("Definitions"))
-        lines.append(parse_definitions(obj["definitions"], family))
-
-    # Attributes set
-    if "attribute-sets" in obj:
-        lines.append(rst_subtitle("Attribute sets"))
-        lines.append(parse_attr_sets(obj["attribute-sets"], family))
-
-    # Sub-messages
-    if "sub-messages" in obj:
-        lines.append(rst_subtitle("Sub-messages"))
-        lines.append(parse_sub_messages(obj["sub-messages"], family))
-
-    return "\n".join(lines)
-
-
-# Main functions
-# ==============
+from netlink_yml_parser import parse_yaml_file, generate_main_index_rst
 
 
 def parse_arguments() -> argparse.Namespace:
@@ -393,50 +65,24 @@ def parse_arguments() -> argparse.Namespace:
     return args
 
 
-def parse_yaml_file(filename: str) -> str:
-    """Transform the YAML specified by filename into an RST-formatted string"""
-    with open(filename, "r", encoding="utf-8") as spec_file:
-        yaml_data = yaml.safe_load(spec_file)
-        content = parse_yaml(yaml_data)
-
-    return content
-
-
 def write_to_rstfile(content: str, filename: str) -> None:
     """Write the generated content into an RST file"""
     logging.debug("Saving RST file to %s", filename)
 
-    dir = os.path.dirname(filename)
-    os.makedirs(dir, exist_ok=True)
+    directory = os.path.dirname(filename)
+    os.makedirs(directory, exist_ok=True)
 
     with open(filename, "w", encoding="utf-8") as rst_file:
         rst_file.write(content)
 
 
-def generate_main_index_rst(output: str, index_dir: str) -> None:
+def write_index_rst(output: str, index_dir: str) -> None:
     """Generate the `networking_spec/index` content and write to the file"""
-    lines = []
 
-    lines.append(rst_header())
-    lines.append(rst_label("specs"))
-    lines.append(rst_title("Netlink Family Specifications"))
-    lines.append(rst_toctree(1))
-
-    index_fname = os.path.basename(output)
-    base, ext = os.path.splitext(index_fname)
-
-    if not index_dir:
-        index_dir = os.path.dirname(output)
-
-    logging.debug(f"Looking for {ext} files in %s", index_dir)
-    for filename in sorted(os.listdir(index_dir)):
-        if not filename.endswith(ext) or filename == index_fname:
-            continue
-        base, ext = os.path.splitext(filename)
-        lines.append(f"   {base}\n")
+    msg = generate_main_index_rst(output, index_dir)
 
     logging.debug("Writing an index file at %s", output)
-    write_to_rstfile("".join(lines), output)
+    write_to_rstfile(msg, output)
 
 
 def main() -> None:
@@ -457,7 +103,7 @@ def main() -> None:
 
     if args.index:
         # Generate the index RST file
-        generate_main_index_rst(args.output, args.input_dir)
+        write_index_rst(args.output, args.input_dir)
 
 
 if __name__ == "__main__":
-- 
2.49.0


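For readers skimming the diff above: the index-generation half of the
moved library can be sketched standalone as below. The file names are
made up for illustration; the real `generate_main_index_rst()` scans
`index_dir` for files matching the index file's extension.

```python
import os


def build_index(filenames, title="Netlink Family Specifications"):
    # Mirror the generated index page: an overlined title, a toctree
    # directive, and one entry per family file, extension stripped,
    # sorted alphabetically.
    lines = ["=" * len(title), title, "=" * len(title), "",
             ".. toctree::", "   :maxdepth: 1", ""]
    for name in sorted(filenames):
        base, _ = os.path.splitext(name)
        lines.append(f"   {base}")
    return "\n".join(lines) + "\n"


print(build_index(["rt-link.rst", "devlink.rst"]))
```

Writing the result under Documentation/output instead of $(srctree) is
what lets "make cleandocs" remove it again.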

* [PATCH v2 06/12] scripts: lib: netlink_yml_parser.py: use classes
From: Mauro Carvalho Chehab @ 2025-06-12 10:31 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

As we'll be importing the netlink parser into a Sphinx extension,
move all functions and global variables inside two classes:

- RstFormatters, containing the ReST formatting logic, which is
  YAML-independent;
- NetlinkYamlParser, containing the actual parser methods. This is
  the only class that needs to be imported by the script or by a
  Sphinx extension.

With that, we won't pollute the Sphinx namespace, avoiding any
potential name clashes.
Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 scripts/lib/netlink_yml_parser.py  | 592 +++++++++++++++--------------
 tools/net/ynl/pyynl/ynl_gen_rst.py |  19 +-
 2 files changed, 313 insertions(+), 298 deletions(-)

diff --git a/scripts/lib/netlink_yml_parser.py b/scripts/lib/netlink_yml_parser.py
index 3c15b578f947..8d7961a1a256 100755
--- a/scripts/lib/netlink_yml_parser.py
+++ b/scripts/lib/netlink_yml_parser.py
@@ -3,389 +3,407 @@
 # -*- coding: utf-8; mode: python -*-
 
 """
-    Script to auto generate the documentation for Netlink specifications.
+    Class to auto generate the documentation for Netlink specifications.
 
     :copyright:  Copyright (C) 2023  Breno Leitao <leitao@debian.org>
     :license:    GPL Version 2, June 1991 see linux/COPYING for details.
 
-    This script performs extensive parsing to the Linux kernel's netlink YAML
+    This class performs extensive parsing of the Linux kernel's netlink YAML
     spec files, in an effort to avoid needing to heavily mark up the original
     YAML file.
 
-    This code is split in three big parts:
+    This code is split into two classes:
         1) RST formatters: Use to convert a string to a RST output
         2) Parser helpers: Functions to parse the YAML data structure
-        3) Main function and small helpers
 """
 
 from typing import Any, Dict, List
 import os.path
+import sys
+import argparse
 import logging
 import yaml
 
 
-SPACE_PER_LEVEL = 4
-
-
+# ==============
 # RST Formatters
 # ==============
-def headroom(level: int) -> str:
-    """Return space to format"""
-    return " " * (level * SPACE_PER_LEVEL)
+class RstFormatters:
+    SPACE_PER_LEVEL = 4
 
+    @staticmethod
+    def headroom(level: int) -> str:
+        """Return space to format"""
+        return " " * (level * RstFormatters.SPACE_PER_LEVEL)
 
-def bold(text: str) -> str:
-    """Format bold text"""
-    return f"**{text}**"
 
+    @staticmethod
+    def bold(text: str) -> str:
+        """Format bold text"""
+        return f"**{text}**"
 
-def inline(text: str) -> str:
-    """Format inline text"""
-    return f"``{text}``"
 
+    @staticmethod
+    def inline(text: str) -> str:
+        """Format inline text"""
+        return f"``{text}``"
 
-def sanitize(text: str) -> str:
-    """Remove newlines and multiple spaces"""
-    # This is useful for some fields that are spread across multiple lines
-    return str(text).replace("\n", " ").strip()
 
+    @staticmethod
+    def sanitize(text: str) -> str:
+        """Remove newlines and multiple spaces"""
+        # This is useful for some fields that are spread across multiple lines
+        return str(text).replace("\n", " ").strip()
 
-def rst_fields(key: str, value: str, level: int = 0) -> str:
-    """Return a RST formatted field"""
-    return headroom(level) + f":{key}: {value}"
 
+    def rst_fields(self, key: str, value: str, level: int = 0) -> str:
+        """Return a RST formatted field"""
+        return self.headroom(level) + f":{key}: {value}"
 
-def rst_definition(key: str, value: Any, level: int = 0) -> str:
-    """Format a single rst definition"""
-    return headroom(level) + key + "\n" + headroom(level + 1) + str(value)
 
+    def rst_definition(self, key: str, value: Any, level: int = 0) -> str:
+        """Format a single rst definition"""
+        return self.headroom(level) + key + "\n" + self.headroom(level + 1) + str(value)
 
-def rst_paragraph(paragraph: str, level: int = 0) -> str:
-    """Return a formatted paragraph"""
-    return headroom(level) + paragraph
 
+    def rst_paragraph(self, paragraph: str, level: int = 0) -> str:
+        """Return a formatted paragraph"""
+        return self.headroom(level) + paragraph
 
-def rst_bullet(item: str, level: int = 0) -> str:
-    """Return a formatted a bullet"""
-    return headroom(level) + f"- {item}"
 
+    def rst_bullet(self, item: str, level: int = 0) -> str:
+        """Return a formatted bullet"""
+        return self.headroom(level) + f"- {item}"
 
-def rst_subsection(title: str) -> str:
-    """Add a sub-section to the document"""
-    return f"{title}\n" + "-" * len(title)
 
+    @staticmethod
+    def rst_subsection(title: str) -> str:
+        """Add a sub-section to the document"""
+        return f"{title}\n" + "-" * len(title)
 
-def rst_subsubsection(title: str) -> str:
-    """Add a sub-sub-section to the document"""
-    return f"{title}\n" + "~" * len(title)
 
+    @staticmethod
+    def rst_subsubsection(title: str) -> str:
+        """Add a sub-sub-section to the document"""
+        return f"{title}\n" + "~" * len(title)
 
-def rst_section(namespace: str, prefix: str, title: str) -> str:
-    """Add a section to the document"""
-    return f".. _{namespace}-{prefix}-{title}:\n\n{title}\n" + "=" * len(title)
 
+    @staticmethod
+    def rst_section(namespace: str, prefix: str, title: str) -> str:
+        """Add a section to the document"""
+        return f".. _{namespace}-{prefix}-{title}:\n\n{title}\n" + "=" * len(title)
 
-def rst_subtitle(title: str) -> str:
-    """Add a subtitle to the document"""
-    return "\n" + "-" * len(title) + f"\n{title}\n" + "-" * len(title) + "\n\n"
 
+    @staticmethod
+    def rst_subtitle(title: str) -> str:
+        """Add a subtitle to the document"""
+        return "\n" + "-" * len(title) + f"\n{title}\n" + "-" * len(title) + "\n\n"
 
-def rst_title(title: str) -> str:
-    """Add a title to the document"""
-    return "=" * len(title) + f"\n{title}\n" + "=" * len(title) + "\n\n"
 
+    @staticmethod
+    def rst_title(title: str) -> str:
+        """Add a title to the document"""
+        return "=" * len(title) + f"\n{title}\n" + "=" * len(title) + "\n\n"
 
-def rst_list_inline(list_: List[str], level: int = 0) -> str:
-    """Format a list using inlines"""
-    return headroom(level) + "[" + ", ".join(inline(i) for i in list_) + "]"
 
+    def rst_list_inline(self, list_: List[str], level: int = 0) -> str:
+        """Format a list using inlines"""
+        return self.headroom(level) + "[" + ", ".join(self.inline(i) for i in list_) + "]"
 
-def rst_ref(namespace: str, prefix: str, name: str) -> str:
-    """Add a hyperlink to the document"""
-    mappings = {'enum': 'definition',
-                'fixed-header': 'definition',
-                'nested-attributes': 'attribute-set',
-                'struct': 'definition'}
-    if prefix in mappings:
-        prefix = mappings[prefix]
-    return f":ref:`{namespace}-{prefix}-{name}`"
 
+    @staticmethod
+    def rst_ref(namespace: str, prefix: str, name: str) -> str:
+        """Add a hyperlink to the document"""
+        mappings = {'enum': 'definition',
+                    'fixed-header': 'definition',
+                    'nested-attributes': 'attribute-set',
+                    'struct': 'definition'}
+        if prefix in mappings:
+            prefix = mappings[prefix]
+        return f":ref:`{namespace}-{prefix}-{name}`"
 
-def rst_header() -> str:
-    """The headers for all the auto generated RST files"""
-    lines = []
 
-    lines.append(rst_paragraph(".. SPDX-License-Identifier: GPL-2.0"))
-    lines.append(rst_paragraph(".. NOTE: This document was auto-generated.\n\n"))
+    def rst_header(self) -> str:
+        """The headers for all the auto generated RST files"""
+        lines = []
 
-    return "\n".join(lines)
+        lines.append(self.rst_paragraph(".. SPDX-License-Identifier: GPL-2.0"))
+        lines.append(self.rst_paragraph(".. NOTE: This document was auto-generated.\n\n"))
 
+        return "\n".join(lines)
 
-def rst_toctree(maxdepth: int = 2) -> str:
-    """Generate a toctree RST primitive"""
-    lines = []
 
-    lines.append(".. toctree::")
-    lines.append(f"   :maxdepth: {maxdepth}\n\n")
+    @staticmethod
+    def rst_toctree(maxdepth: int = 2) -> str:
+        """Generate a toctree RST primitive"""
+        lines = []
 
-    return "\n".join(lines)
+        lines.append(".. toctree::")
+        lines.append(f"   :maxdepth: {maxdepth}\n\n")
 
+        return "\n".join(lines)
 
-def rst_label(title: str) -> str:
-    """Return a formatted label"""
-    return f".. _{title}:\n\n"
 
+    @staticmethod
+    def rst_label(title: str) -> str:
+        """Return a formatted label"""
+        return f".. _{title}:\n\n"
 
+# =======
 # Parsers
 # =======
+class NetlinkYamlParser:
+
+    fmt = RstFormatters()
+
+    def parse_mcast_group(self, mcast_group: List[Dict[str, Any]]) -> str:
+        """Parse 'multicast' group list and return a formatted string"""
+        lines = []
+        for group in mcast_group:
+            lines.append(self.fmt.rst_bullet(group["name"]))
+
+        return "\n".join(lines)
+
+
+    def parse_do(self, do_dict: Dict[str, Any], level: int = 0) -> str:
+        """Parse 'do' section and return a formatted string"""
+        lines = []
+        for key in do_dict.keys():
+            lines.append(self.fmt.rst_paragraph(self.fmt.bold(key), level + 1))
+            if key in ['request', 'reply']:
+                lines.append(self.parse_do_attributes(do_dict[key], level + 1) + "\n")
+            else:
+                lines.append(self.fmt.headroom(level + 2) + do_dict[key] + "\n")
+
+        return "\n".join(lines)
+
+
+    def parse_do_attributes(self, attrs: Dict[str, Any], level: int = 0) -> str:
+        """Parse 'attributes' section"""
+        if "attributes" not in attrs:
+            return ""
+        lines = [self.fmt.rst_fields("attributes", self.fmt.rst_list_inline(attrs["attributes"]), level + 1)]
+
+        return "\n".join(lines)
+
+
+    def parse_operations(self, operations: List[Dict[str, Any]], namespace: str) -> str:
+        """Parse operations block"""
+        preprocessed = ["name", "doc", "title", "do", "dump", "flags"]
+        linkable = ["fixed-header", "attribute-set"]
+        lines = []
+
+        for operation in operations:
+            lines.append(self.fmt.rst_section(namespace, 'operation', operation["name"]))
+            lines.append(self.fmt.rst_paragraph(operation["doc"]) + "\n")
+
+            for key in operation.keys():
+                if key in preprocessed:
+                    # Skip the special fields
+                    continue
+                value = operation[key]
+                if key in linkable:
+                    value = self.fmt.rst_ref(namespace, key, value)
+                lines.append(self.fmt.rst_fields(key, value, 0))
+            if 'flags' in operation:
+                lines.append(self.fmt.rst_fields('flags', self.fmt.rst_list_inline(operation['flags'])))
+
+            if "do" in operation:
+                lines.append(self.fmt.rst_paragraph(":do:", 0))
+                lines.append(self.parse_do(operation["do"], 0))
+            if "dump" in operation:
+                lines.append(self.fmt.rst_paragraph(":dump:", 0))
+                lines.append(self.parse_do(operation["dump"], 0))
+
+            # New line after fields
+            lines.append("\n")
+
+        return "\n".join(lines)
+
+
+    def parse_entries(self, entries: List[Dict[str, Any]], level: int) -> str:
+        """Parse a list of entries"""
+        ignored = ["pad"]
+        lines = []
+        for entry in entries:
+            if isinstance(entry, dict):
+                # entries could be a list or a dictionary
+                field_name = entry.get("name", "")
+                if field_name in ignored:
+                    continue
+                type_ = entry.get("type")
+                if type_:
+                    field_name += f" ({self.fmt.inline(type_)})"
+                lines.append(
+                    self.fmt.rst_fields(field_name, self.fmt.sanitize(entry.get("doc", "")), level)
+                )
+            elif isinstance(entry, list):
+                lines.append(self.fmt.rst_list_inline(entry, level))
+            else:
+                lines.append(self.fmt.rst_bullet(self.fmt.inline(self.fmt.sanitize(entry)), level))
 
+        lines.append("\n")
+        return "\n".join(lines)
 
-def parse_mcast_group(mcast_group: List[Dict[str, Any]]) -> str:
-    """Parse 'multicast' group list and return a formatted string"""
-    lines = []
-    for group in mcast_group:
-        lines.append(rst_bullet(group["name"]))
-
-    return "\n".join(lines)
-
-
-def parse_do(do_dict: Dict[str, Any], level: int = 0) -> str:
-    """Parse 'do' section and return a formatted string"""
-    lines = []
-    for key in do_dict.keys():
-        lines.append(rst_paragraph(bold(key), level + 1))
-        if key in ['request', 'reply']:
-            lines.append(parse_do_attributes(do_dict[key], level + 1) + "\n")
-        else:
-            lines.append(headroom(level + 2) + do_dict[key] + "\n")
-
-    return "\n".join(lines)
-
-
-def parse_do_attributes(attrs: Dict[str, Any], level: int = 0) -> str:
-    """Parse 'attributes' section"""
-    if "attributes" not in attrs:
-        return ""
-    lines = [rst_fields("attributes", rst_list_inline(attrs["attributes"]), level + 1)]
-
-    return "\n".join(lines)
-
-
-def parse_operations(operations: List[Dict[str, Any]], namespace: str) -> str:
-    """Parse operations block"""
-    preprocessed = ["name", "doc", "title", "do", "dump", "flags"]
-    linkable = ["fixed-header", "attribute-set"]
-    lines = []
-
-    for operation in operations:
-        lines.append(rst_section(namespace, 'operation', operation["name"]))
-        lines.append(rst_paragraph(operation["doc"]) + "\n")
-
-        for key in operation.keys():
-            if key in preprocessed:
-                # Skip the special fields
-                continue
-            value = operation[key]
-            if key in linkable:
-                value = rst_ref(namespace, key, value)
-            lines.append(rst_fields(key, value, 0))
-        if 'flags' in operation:
-            lines.append(rst_fields('flags', rst_list_inline(operation['flags'])))
-
-        if "do" in operation:
-            lines.append(rst_paragraph(":do:", 0))
-            lines.append(parse_do(operation["do"], 0))
-        if "dump" in operation:
-            lines.append(rst_paragraph(":dump:", 0))
-            lines.append(parse_do(operation["dump"], 0))
 
-        # New line after fields
-        lines.append("\n")
+    def parse_definitions(self, defs: Dict[str, Any], namespace: str) -> str:
+        """Parse definitions section"""
+        preprocessed = ["name", "entries", "members"]
+        ignored = ["render-max"]  # This is not printed
+        lines = []
 
-    return "\n".join(lines)
-
-
-def parse_entries(entries: List[Dict[str, Any]], level: int) -> str:
-    """Parse a list of entries"""
-    ignored = ["pad"]
-    lines = []
-    for entry in entries:
-        if isinstance(entry, dict):
-            # entries could be a list or a dictionary
-            field_name = entry.get("name", "")
-            if field_name in ignored:
-                continue
-            type_ = entry.get("type")
-            if type_:
-                field_name += f" ({inline(type_)})"
-            lines.append(
-                rst_fields(field_name, sanitize(entry.get("doc", "")), level)
-            )
-        elif isinstance(entry, list):
-            lines.append(rst_list_inline(entry, level))
-        else:
-            lines.append(rst_bullet(inline(sanitize(entry)), level))
-
-    lines.append("\n")
-    return "\n".join(lines)
-
-
-def parse_definitions(defs: Dict[str, Any], namespace: str) -> str:
-    """Parse definitions section"""
-    preprocessed = ["name", "entries", "members"]
-    ignored = ["render-max"]  # This is not printed
-    lines = []
-
-    for definition in defs:
-        lines.append(rst_section(namespace, 'definition', definition["name"]))
-        for k in definition.keys():
-            if k in preprocessed + ignored:
-                continue
-            lines.append(rst_fields(k, sanitize(definition[k]), 0))
-
-        # Field list needs to finish with a new line
-        lines.append("\n")
-        if "entries" in definition:
-            lines.append(rst_paragraph(":entries:", 0))
-            lines.append(parse_entries(definition["entries"], 1))
-        if "members" in definition:
-            lines.append(rst_paragraph(":members:", 0))
-            lines.append(parse_entries(definition["members"], 1))
-
-    return "\n".join(lines)
-
-
-def parse_attr_sets(entries: List[Dict[str, Any]], namespace: str) -> str:
-    """Parse attribute from attribute-set"""
-    preprocessed = ["name", "type"]
-    linkable = ["enum", "nested-attributes", "struct", "sub-message"]
-    ignored = ["checks"]
-    lines = []
-
-    for entry in entries:
-        lines.append(rst_section(namespace, 'attribute-set', entry["name"]))
-        for attr in entry["attributes"]:
-            type_ = attr.get("type")
-            attr_line = attr["name"]
-            if type_:
-                # Add the attribute type in the same line
-                attr_line += f" ({inline(type_)})"
-
-            lines.append(rst_subsubsection(attr_line))
-
-            for k in attr.keys():
+        for definition in defs:
+            lines.append(self.fmt.rst_section(namespace, 'definition', definition["name"]))
+            for k in definition.keys():
                 if k in preprocessed + ignored:
                     continue
-                if k in linkable:
-                    value = rst_ref(namespace, k, attr[k])
-                else:
-                    value = sanitize(attr[k])
-                lines.append(rst_fields(k, value, 0))
+                lines.append(self.fmt.rst_fields(k, self.fmt.sanitize(definition[k]), 0))
+
+            # Field list needs to finish with a new line
             lines.append("\n")
+            if "entries" in definition:
+                lines.append(self.fmt.rst_paragraph(":entries:", 0))
+                lines.append(self.parse_entries(definition["entries"], 1))
+            if "members" in definition:
+                lines.append(self.fmt.rst_paragraph(":members:", 0))
+                lines.append(self.parse_entries(definition["members"], 1))
 
-    return "\n".join(lines)
+        return "\n".join(lines)
 
 
-def parse_sub_messages(entries: List[Dict[str, Any]], namespace: str) -> str:
-    """Parse sub-message definitions"""
-    lines = []
+    def parse_attr_sets(self, entries: List[Dict[str, Any]], namespace: str) -> str:
+        """Parse attribute from attribute-set"""
+        preprocessed = ["name", "type"]
+        linkable = ["enum", "nested-attributes", "struct", "sub-message"]
+        ignored = ["checks"]
+        lines = []
 
-    for entry in entries:
-        lines.append(rst_section(namespace, 'sub-message', entry["name"]))
-        for fmt in entry["formats"]:
-            value = fmt["value"]
+        for entry in entries:
+            lines.append(self.fmt.rst_section(namespace, 'attribute-set', entry["name"]))
+            for attr in entry["attributes"]:
+                type_ = attr.get("type")
+                attr_line = attr["name"]
+                if type_:
+                    # Add the attribute type in the same line
+                    attr_line += f" ({self.fmt.inline(type_)})"
 
-            lines.append(rst_bullet(bold(value)))
-            for attr in ['fixed-header', 'attribute-set']:
-                if attr in fmt:
-                    lines.append(rst_fields(attr,
-                                            rst_ref(namespace, attr, fmt[attr]),
-                                            1))
-            lines.append("\n")
+                lines.append(self.fmt.rst_subsubsection(attr_line))
+
+                for k in attr.keys():
+                    if k in preprocessed + ignored:
+                        continue
+                    if k in linkable:
+                        value = self.fmt.rst_ref(namespace, k, attr[k])
+                    else:
+                        value = self.fmt.sanitize(attr[k])
+                    lines.append(self.fmt.rst_fields(k, value, 0))
+                lines.append("\n")
+
+        return "\n".join(lines)
+
+
+    def parse_sub_messages(self, entries: List[Dict[str, Any]], namespace: str) -> str:
+        """Parse sub-message definitions"""
+        lines = []
+
+        for entry in entries:
+            lines.append(self.fmt.rst_section(namespace, 'sub-message', entry["name"]))
+            for fmt in entry["formats"]:
+                value = fmt["value"]
+
+                lines.append(self.fmt.rst_bullet(self.fmt.bold(value)))
+                for attr in ['fixed-header', 'attribute-set']:
+                    if attr in fmt:
+                        lines.append(self.fmt.rst_fields(attr,
+                                                self.fmt.rst_ref(namespace, attr, fmt[attr]),
+                                                1))
+                lines.append("\n")
+
+        return "\n".join(lines)
 
-    return "\n".join(lines)
 
+    def parse_yaml(self, obj: Dict[str, Any]) -> str:
+        """Format the whole YAML into a RST string"""
+        lines = []
 
-def parse_yaml(obj: Dict[str, Any]) -> str:
-    """Format the whole YAML into a RST string"""
-    lines = []
+        # Main header
 
-    # Main header
+        family = obj['name']
 
-    family = obj['name']
+        lines.append(self.fmt.rst_header())
+        lines.append(self.fmt.rst_label("netlink-" + family))
 
-    lines.append(rst_header())
-    lines.append(rst_label("netlink-" + family))
+        title = f"Family ``{family}`` netlink specification"
+        lines.append(self.fmt.rst_title(title))
+        lines.append(self.fmt.rst_paragraph(".. contents:: :depth: 3\n"))
 
-    title = f"Family ``{family}`` netlink specification"
-    lines.append(rst_title(title))
-    lines.append(rst_paragraph(".. contents:: :depth: 3\n"))
+        if "doc" in obj:
+            lines.append(self.fmt.rst_subtitle("Summary"))
+            lines.append(self.fmt.rst_paragraph(obj["doc"], 0))
 
-    if "doc" in obj:
-        lines.append(rst_subtitle("Summary"))
-        lines.append(rst_paragraph(obj["doc"], 0))
+        # Operations
+        if "operations" in obj:
+            lines.append(self.fmt.rst_subtitle("Operations"))
+            lines.append(self.parse_operations(obj["operations"]["list"], family))
 
-    # Operations
-    if "operations" in obj:
-        lines.append(rst_subtitle("Operations"))
-        lines.append(parse_operations(obj["operations"]["list"], family))
+        # Multicast groups
+        if "mcast-groups" in obj:
+            lines.append(self.fmt.rst_subtitle("Multicast groups"))
+            lines.append(self.parse_mcast_group(obj["mcast-groups"]["list"]))
 
-    # Multicast groups
-    if "mcast-groups" in obj:
-        lines.append(rst_subtitle("Multicast groups"))
-        lines.append(parse_mcast_group(obj["mcast-groups"]["list"]))
+        # Definitions
+        if "definitions" in obj:
+            lines.append(self.fmt.rst_subtitle("Definitions"))
+            lines.append(self.parse_definitions(obj["definitions"], family))
 
-    # Definitions
-    if "definitions" in obj:
-        lines.append(rst_subtitle("Definitions"))
-        lines.append(parse_definitions(obj["definitions"], family))
+        # Attributes set
+        if "attribute-sets" in obj:
+            lines.append(self.fmt.rst_subtitle("Attribute sets"))
+            lines.append(self.parse_attr_sets(obj["attribute-sets"], family))
 
-    # Attributes set
-    if "attribute-sets" in obj:
-        lines.append(rst_subtitle("Attribute sets"))
-        lines.append(parse_attr_sets(obj["attribute-sets"], family))
+        # Sub-messages
+        if "sub-messages" in obj:
+            lines.append(self.fmt.rst_subtitle("Sub-messages"))
+            lines.append(self.parse_sub_messages(obj["sub-messages"], family))
 
-    # Sub-messages
-    if "sub-messages" in obj:
-        lines.append(rst_subtitle("Sub-messages"))
-        lines.append(parse_sub_messages(obj["sub-messages"], family))
+        return "\n".join(lines)
 
-    return "\n".join(lines)
 
+    # Main functions
+    # ==============
 
-# Main functions
-# ==============
 
+    def parse_yaml_file(self, filename: str) -> str:
+        """Transform the YAML specified by filename into an RST-formatted string"""
+        with open(filename, "r", encoding="utf-8") as spec_file:
+            yaml_data = yaml.safe_load(spec_file)
+            content = self.parse_yaml(yaml_data)
 
-def parse_yaml_file(filename: str) -> str:
-    """Transform the YAML specified by filename into an RST-formatted string"""
-    with open(filename, "r", encoding="utf-8") as spec_file:
-        yaml_data = yaml.safe_load(spec_file)
-        content = parse_yaml(yaml_data)
+        return content
 
-    return content
 
+    def generate_main_index_rst(self, output: str, index_dir: str) -> None:
+        """Generate the `networking_spec/index` content and write to the file"""
+        lines = []
 
-def generate_main_index_rst(output: str, index_dir: str) -> str:
-    """Generate the `networking_spec/index` content and write to the file"""
-    lines = []
+        lines.append(self.fmt.rst_header())
+        lines.append(self.fmt.rst_label("specs"))
+        lines.append(self.fmt.rst_title("Netlink Family Specifications"))
+        lines.append(self.fmt.rst_toctree(1))
 
-    lines.append(rst_header())
-    lines.append(rst_label("specs"))
-    lines.append(rst_title("Netlink Family Specifications"))
-    lines.append(rst_toctree(1))
+        index_fname = os.path.basename(output)
+        base, ext = os.path.splitext(index_fname)
 
-    index_fname = os.path.basename(output)
-    base, ext = os.path.splitext(index_fname)
+        if not index_dir:
+            index_dir = os.path.dirname(output)
 
-    if not index_dir:
-        index_dir = os.path.dirname(output)
+        logging.debug(f"Looking for {ext} files in %s", index_dir)
+        for filename in sorted(os.listdir(index_dir)):
+            if not filename.endswith(ext) or filename == index_fname:
+                continue
+            base, ext = os.path.splitext(filename)
+            lines.append(f"   {base}\n")
 
-    logging.debug(f"Looking for {ext} files in %s", index_dir)
-    for filename in sorted(os.listdir(index_dir)):
-        if not filename.endswith(ext) or filename == index_fname:
-            continue
-        base, ext = os.path.splitext(filename)
-        lines.append(f"   {base}\n")
+        logging.debug("Writing an index file at %s", output)
 
-    return "".join(lines), output
+        return "".join(lines)
diff --git a/tools/net/ynl/pyynl/ynl_gen_rst.py b/tools/net/ynl/pyynl/ynl_gen_rst.py
index 38dafe3d9179..257288f707af 100755
--- a/tools/net/ynl/pyynl/ynl_gen_rst.py
+++ b/tools/net/ynl/pyynl/ynl_gen_rst.py
@@ -10,12 +10,7 @@
 
     This script performs extensive parsing to the Linux kernel's netlink YAML
     spec files, in an effort to avoid needing to heavily mark up the original
-    YAML file.
-
-    This code is split in three big parts:
-        1) RST formatters: Use to convert a string to a RST output
-        2) Parser helpers: Functions to parse the YAML data structure
-        3) Main function and small helpers
+    YAML file. It uses the library code from scripts/lib.
 """
 
 import os.path
@@ -28,7 +23,7 @@ SRC_DIR = os.path.dirname(os.path.realpath(__file__))
 
 sys.path.insert(0, os.path.join(SRC_DIR, LIB_DIR))
 
-from netlink_yml_parser import parse_yaml_file, generate_main_index_rst
+from netlink_yml_parser import NetlinkYamlParser
 
 
 def parse_arguments() -> argparse.Namespace:
@@ -76,10 +71,10 @@ def write_to_rstfile(content: str, filename: str) -> None:
         rst_file.write(content)
 
 
-def write_index_rst(output: str, index_dir: str) -> None:
+def write_index_rst(parser: NetlinkYamlParser, output: str, index_dir: str) -> None:
     """Generate the `networking_spec/index` content and write to the file"""
 
-    msg = generate_main_index_rst(output, index_dir)
+    msg = parser.generate_main_index_rst(output, index_dir)
 
     logging.debug("Writing an index file at %s", output)
     write_to_rstfile(msg, output)
@@ -90,10 +85,12 @@ def main() -> None:
 
     args = parse_arguments()
 
+    parser = NetlinkYamlParser()
+
     if args.input:
         logging.debug("Parsing %s", args.input)
         try:
-            content = parse_yaml_file(os.path.join(args.input))
+            content = parser.parse_yaml_file(os.path.join(args.input))
         except Exception as exception:
             logging.warning("Failed to parse %s.", args.input)
             logging.warning(exception)
@@ -103,7 +100,7 @@ def main() -> None:
 
     if args.index:
         # Generate the index RST file
-        write_index_rst(args.output, args.input_dir)
+        write_index_rst(parser, args.output, args.input_dir)
 
 
 if __name__ == "__main__":
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 32+ messages in thread
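The refactor in patch 06 moves the free-standing ReST helpers into an `RstFormatters` class that the parser consumes through a class attribute. A minimal standalone sketch of that pattern — the names mirror the patch, but this is illustrative, not the kernel code:

```python
class RstFormatters:
    """Small helper that renders ReST primitives (sketch of the patch's class)."""

    SPACE_PER_LEVEL = 4

    @staticmethod
    def headroom(level):
        """Return the indentation string for a given nesting level."""
        return " " * (level * RstFormatters.SPACE_PER_LEVEL)

    def rst_fields(self, key, value, level=0):
        """Render a ReST field-list entry, e.g. ':doc: text'."""
        return self.headroom(level) + f":{key}: {value}"

    @staticmethod
    def rst_title(title):
        """Render a ReST document title with '=' over- and underlines."""
        return "=" * len(title) + f"\n{title}\n" + "=" * len(title) + "\n\n"


class NetlinkYamlParser:
    """Parser reaching the formatters via a shared class attribute, as in the patch."""

    fmt = RstFormatters()

    def render_doc_field(self, text):
        # Every former module-level call becomes self.fmt.<helper>(...)
        return self.fmt.rst_fields("doc", text, 1)
```

Keeping the formatters stateless (all static or trivially instance-bound) is what lets a single shared `fmt` instance serve every parser method.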

* [PATCH v2 07/12] tools: ynl_gen_rst.py: do some coding style cleanups
  2025-06-12 10:31 [PATCH v2 00/12] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (5 preceding siblings ...)
  2025-06-12 10:31 ` [PATCH v2 06/12] scripts: lib: netlink_yml_parser.py: use classes Mauro Carvalho Chehab
@ 2025-06-12 10:31 ` Mauro Carvalho Chehab
  2025-06-12 10:32 ` [PATCH v2 08/12] scripts: netlink_yml_parser.py: improve index.rst generation Mauro Carvalho Chehab
                   ` (5 subsequent siblings)
  12 siblings, 0 replies; 32+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-12 10:31 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Do some coding style cleanups to make pylint and flake8 happier.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 scripts/lib/netlink_yml_parser.py  | 74 +++++++++++-------------------
 tools/net/ynl/pyynl/ynl_gen_rst.py |  2 +-
 2 files changed, 28 insertions(+), 48 deletions(-)

diff --git a/scripts/lib/netlink_yml_parser.py b/scripts/lib/netlink_yml_parser.py
index 8d7961a1a256..65981b86875f 100755
--- a/scripts/lib/netlink_yml_parser.py
+++ b/scripts/lib/netlink_yml_parser.py
@@ -19,16 +19,13 @@
 
 from typing import Any, Dict, List
 import os.path
-import sys
-import argparse
 import logging
 import yaml
 
 
-# ==============
-# RST Formatters
-# ==============
 class RstFormatters:
+    """RST Formatters"""
+
     SPACE_PER_LEVEL = 4
 
     @staticmethod
@@ -36,81 +33,67 @@ class RstFormatters:
         """Return space to format"""
         return " " * (level * RstFormatters.SPACE_PER_LEVEL)
 
-
     @staticmethod
     def bold(text: str) -> str:
         """Format bold text"""
         return f"**{text}**"
 
-
     @staticmethod
     def inline(text: str) -> str:
         """Format inline text"""
         return f"``{text}``"
 
-
     @staticmethod
     def sanitize(text: str) -> str:
         """Remove newlines and multiple spaces"""
         # This is useful for some fields that are spread across multiple lines
         return str(text).replace("\n", " ").strip()
 
-
     def rst_fields(self, key: str, value: str, level: int = 0) -> str:
         """Return a RST formatted field"""
         return self.headroom(level) + f":{key}: {value}"
 
-
     def rst_definition(self, key: str, value: Any, level: int = 0) -> str:
         """Format a single rst definition"""
         return self.headroom(level) + key + "\n" + self.headroom(level + 1) + str(value)
 
-
     def rst_paragraph(self, paragraph: str, level: int = 0) -> str:
         """Return a formatted paragraph"""
         return self.headroom(level) + paragraph
 
-
     def rst_bullet(self, item: str, level: int = 0) -> str:
         """Return a formatted a bullet"""
         return self.headroom(level) + f"- {item}"
 
-
     @staticmethod
     def rst_subsection(title: str) -> str:
         """Add a sub-section to the document"""
         return f"{title}\n" + "-" * len(title)
 
-
     @staticmethod
     def rst_subsubsection(title: str) -> str:
         """Add a sub-sub-section to the document"""
         return f"{title}\n" + "~" * len(title)
 
-
     @staticmethod
     def rst_section(namespace: str, prefix: str, title: str) -> str:
         """Add a section to the document"""
         return f".. _{namespace}-{prefix}-{title}:\n\n{title}\n" + "=" * len(title)
 
-
     @staticmethod
     def rst_subtitle(title: str) -> str:
         """Add a subtitle to the document"""
         return "\n" + "-" * len(title) + f"\n{title}\n" + "-" * len(title) + "\n\n"
 
-
     @staticmethod
     def rst_title(title: str) -> str:
         """Add a title to the document"""
         return "=" * len(title) + f"\n{title}\n" + "=" * len(title) + "\n\n"
 
-
     def rst_list_inline(self, list_: List[str], level: int = 0) -> str:
         """Format a list using inlines"""
         return self.headroom(level) + "[" + ", ".join(self.inline(i) for i in list_) + "]"
 
-
     @staticmethod
     def rst_ref(namespace: str, prefix: str, name: str) -> str:
         """Add a hyperlink to the document"""
@@ -119,10 +102,9 @@ class RstFormatters:
                     'nested-attributes': 'attribute-set',
                     'struct': 'definition'}
         if prefix in mappings:
-            prefix = mappings[prefix]
+            prefix = mappings.get(prefix, "")
         return f":ref:`{namespace}-{prefix}-{name}`"
 
-
     def rst_header(self) -> str:
         """The headers for all the auto generated RST files"""
         lines = []
@@ -132,7 +114,6 @@ class RstFormatters:
 
         return "\n".join(lines)
 
-
     @staticmethod
     def rst_toctree(maxdepth: int = 2) -> str:
         """Generate a toctree RST primitive"""
@@ -143,16 +124,14 @@ class RstFormatters:
 
         return "\n".join(lines)
 
-
     @staticmethod
     def rst_label(title: str) -> str:
         """Return a formatted label"""
         return f".. _{title}:\n\n"
 
-# =======
-# Parsers
-# =======
+
 class NetlinkYamlParser:
+    """YAML Netlink specs Parser"""
 
     fmt = RstFormatters()
 
@@ -164,7 +143,6 @@ class NetlinkYamlParser:
 
         return "\n".join(lines)
 
-
     def parse_do(self, do_dict: Dict[str, Any], level: int = 0) -> str:
         """Parse 'do' section and return a formatted string"""
         lines = []
@@ -177,16 +155,16 @@ class NetlinkYamlParser:
 
         return "\n".join(lines)
 
-
     def parse_do_attributes(self, attrs: Dict[str, Any], level: int = 0) -> str:
         """Parse 'attributes' section"""
         if "attributes" not in attrs:
             return ""
-        lines = [self.fmt.rst_fields("attributes", self.fmt.rst_list_inline(attrs["attributes"]), level + 1)]
+        lines = [self.fmt.rst_fields("attributes",
+                                     self.fmt.rst_list_inline(attrs["attributes"]),
+                                     level + 1)]
 
         return "\n".join(lines)
 
-
     def parse_operations(self, operations: List[Dict[str, Any]], namespace: str) -> str:
         """Parse operations block"""
         preprocessed = ["name", "doc", "title", "do", "dump", "flags"]
@@ -194,7 +172,8 @@ class NetlinkYamlParser:
         lines = []
 
         for operation in operations:
-            lines.append(self.fmt.rst_section(namespace, 'operation', operation["name"]))
+            lines.append(self.fmt.rst_section(namespace, 'operation',
+                                              operation["name"]))
             lines.append(self.fmt.rst_paragraph(operation["doc"]) + "\n")
 
             for key in operation.keys():
@@ -206,7 +185,8 @@ class NetlinkYamlParser:
                     value = self.fmt.rst_ref(namespace, key, value)
                 lines.append(self.fmt.rst_fields(key, value, 0))
             if 'flags' in operation:
-                lines.append(self.fmt.rst_fields('flags', self.fmt.rst_list_inline(operation['flags'])))
+                lines.append(self.fmt.rst_fields('flags',
+                                                 self.fmt.rst_list_inline(operation['flags'])))
 
             if "do" in operation:
                 lines.append(self.fmt.rst_paragraph(":do:", 0))
@@ -220,7 +200,6 @@ class NetlinkYamlParser:
 
         return "\n".join(lines)
 
-
     def parse_entries(self, entries: List[Dict[str, Any]], level: int) -> str:
         """Parse a list of entries"""
         ignored = ["pad"]
@@ -235,17 +214,19 @@ class NetlinkYamlParser:
                 if type_:
                     field_name += f" ({self.fmt.inline(type_)})"
                 lines.append(
-                    self.fmt.rst_fields(field_name, self.fmt.sanitize(entry.get("doc", "")), level)
+                    self.fmt.rst_fields(field_name,
+                                        self.fmt.sanitize(entry.get("doc", "")),
+                                        level)
                 )
             elif isinstance(entry, list):
                 lines.append(self.fmt.rst_list_inline(entry, level))
             else:
-                lines.append(self.fmt.rst_bullet(self.fmt.inline(self.fmt.sanitize(entry)), level))
+                lines.append(self.fmt.rst_bullet(self.fmt.inline(self.fmt.sanitize(entry)),
+                                                 level))
 
         lines.append("\n")
         return "\n".join(lines)
 
-
     def parse_definitions(self, defs: Dict[str, Any], namespace: str) -> str:
         """Parse definitions section"""
         preprocessed = ["name", "entries", "members"]
@@ -270,7 +251,6 @@ class NetlinkYamlParser:
 
         return "\n".join(lines)
 
-
     def parse_attr_sets(self, entries: List[Dict[str, Any]], namespace: str) -> str:
         """Parse attribute from attribute-set"""
         preprocessed = ["name", "type"]
@@ -279,7 +259,8 @@ class NetlinkYamlParser:
         lines = []
 
         for entry in entries:
-            lines.append(self.fmt.rst_section(namespace, 'attribute-set', entry["name"]))
+            lines.append(self.fmt.rst_section(namespace, 'attribute-set',
+                                              entry["name"]))
             for attr in entry["attributes"]:
                 type_ = attr.get("type")
                 attr_line = attr["name"]
@@ -301,13 +282,13 @@ class NetlinkYamlParser:
 
         return "\n".join(lines)
 
-
     def parse_sub_messages(self, entries: List[Dict[str, Any]], namespace: str) -> str:
         """Parse sub-message definitions"""
         lines = []
 
         for entry in entries:
-            lines.append(self.fmt.rst_section(namespace, 'sub-message', entry["name"]))
+            lines.append(self.fmt.rst_section(namespace, 'sub-message',
+                                              entry["name"]))
             for fmt in entry["formats"]:
                 value = fmt["value"]
 
@@ -315,13 +296,14 @@ class NetlinkYamlParser:
                 for attr in ['fixed-header', 'attribute-set']:
                     if attr in fmt:
                         lines.append(self.fmt.rst_fields(attr,
-                                                self.fmt.rst_ref(namespace, attr, fmt[attr]),
-                                                1))
+                                                         self.fmt.rst_ref(namespace,
+                                                                          attr,
+                                                                          fmt[attr]),
+                                                         1))
                 lines.append("\n")
 
         return "\n".join(lines)
 
-
     def parse_yaml(self, obj: Dict[str, Any]) -> str:
         """Format the whole YAML into a RST string"""
         lines = []
@@ -344,7 +326,8 @@ class NetlinkYamlParser:
         # Operations
         if "operations" in obj:
             lines.append(self.fmt.rst_subtitle("Operations"))
-            lines.append(self.parse_operations(obj["operations"]["list"], family))
+            lines.append(self.parse_operations(obj["operations"]["list"],
+                                               family))
 
         # Multicast groups
         if "mcast-groups" in obj:
@@ -368,11 +351,9 @@ class NetlinkYamlParser:
 
         return "\n".join(lines)
 
-
     # Main functions
     # ==============
 
-
     def parse_yaml_file(self, filename: str) -> str:
         """Transform the YAML specified by filename into an RST-formatted string"""
         with open(filename, "r", encoding="utf-8") as spec_file:
@@ -381,7 +362,6 @@ class NetlinkYamlParser:
 
         return content
 
-
     def generate_main_index_rst(self, output: str, index_dir: str) -> None:
         """Generate the `networking_spec/index` content and write to the file"""
         lines = []
diff --git a/tools/net/ynl/pyynl/ynl_gen_rst.py b/tools/net/ynl/pyynl/ynl_gen_rst.py
index 257288f707af..317093623dab 100755
--- a/tools/net/ynl/pyynl/ynl_gen_rst.py
+++ b/tools/net/ynl/pyynl/ynl_gen_rst.py
@@ -23,7 +23,7 @@ SRC_DIR = os.path.dirname(os.path.realpath(__file__))
 
 sys.path.insert(0, os.path.join(SRC_DIR, LIB_DIR))
 
-from netlink_yml_parser import NetlinkYamlParser
+from netlink_yml_parser import NetlinkYamlParser      # pylint: disable=C0413
 
 
 def parse_arguments() -> argparse.Namespace:
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 32+ messages in thread
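One hunk above touches `rst_ref()`, where spec keys are mapped to the section prefixes used by the generated labels. A standalone sketch of that lookup (illustrative only): passing the key itself as the `dict.get()` fallback expresses the same intent without the separate `if prefix in mappings` guard.

```python
def rst_ref(namespace, prefix, name):
    """Build a cross-reference, mapping spec keys to generated section prefixes."""
    mappings = {'enum': 'definition',
                'nested-attributes': 'attribute-set',
                'struct': 'definition'}
    # Fall back to the original prefix when no mapping applies
    prefix = mappings.get(prefix, prefix)
    return f":ref:`{namespace}-{prefix}-{name}`"
```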

* [PATCH v2 08/12] scripts: netlink_yml_parser.py: improve index.rst generation
  2025-06-12 10:31 [PATCH v2 00/12] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (6 preceding siblings ...)
  2025-06-12 10:31 ` [PATCH v2 07/12] tools: ynl_gen_rst.py: do some coding style cleanups Mauro Carvalho Chehab
@ 2025-06-12 10:32 ` Mauro Carvalho Chehab
  2025-06-12 10:32 ` [PATCH v2 09/12] docs: sphinx: add a parser template for yaml files Mauro Carvalho Chehab
                   ` (4 subsequent siblings)
  12 siblings, 0 replies; 32+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-12 10:32 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, joel, linux-kernel-mentees, linux-kernel, lkmm,
	netdev, peterz, stern

Handle both .yaml and .rst extensions. This way, one can generate
an index file inside the sources with:

	tools/net/ynl/pyynl/ynl_gen_rst.py -x  -v -o Documentation/netlink/specs/index.rst

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 scripts/lib/netlink_yml_parser.py | 11 ++++++++---
 1 file changed, 8 insertions(+), 3 deletions(-)

diff --git a/scripts/lib/netlink_yml_parser.py b/scripts/lib/netlink_yml_parser.py
index 65981b86875f..3ba28a9a4338 100755
--- a/scripts/lib/netlink_yml_parser.py
+++ b/scripts/lib/netlink_yml_parser.py
@@ -372,15 +372,20 @@ class NetlinkYamlParser:
         lines.append(self.fmt.rst_toctree(1))
 
         index_fname = os.path.basename(output)
-        base, ext = os.path.splitext(index_fname)
 
         if not index_dir:
             index_dir = os.path.dirname(output)
 
-        logging.debug(f"Looking for {ext} files in %s", index_dir)
+        exts = [ ".yaml", ".rst" ]
+
+        logging.debug(f"Looking for files in %s", index_dir)
         for filename in sorted(os.listdir(index_dir)):
-            if not filename.endswith(ext) or filename == index_fname:
+            if filename == index_fname:
                 continue
+
+            for ext in exts:
+                if not filename.endswith(ext):
+                    continue
             base, ext = os.path.splitext(filename)
             lines.append(f"   {base}\n")
 
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 32+ messages in thread
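The nested extension loop added above can also be expressed with a single `any()` test, which sidesteps the easy-to-miss fact that a `continue` inside the inner `for ext in exts` loop only skips to the next extension, not to the next file. A hedged sketch of the filtering step (not the patch code; it assumes only the file name list matters):

```python
import os

def index_entries(filenames, index_fname, exts=(".yaml", ".rst")):
    """Return sorted toctree entries for files matching any accepted extension."""
    entries = []
    for filename in sorted(filenames):
        if filename == index_fname:
            continue
        # Skip files whose extension is not in the accepted list
        if not any(filename.endswith(ext) for ext in exts):
            continue
        base, _ = os.path.splitext(filename)
        entries.append(f"   {base}\n")
    return entries
```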

* [PATCH v2 09/12] docs: sphinx: add a parser template for yaml files
  2025-06-12 10:31 [PATCH v2 00/12] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (7 preceding siblings ...)
  2025-06-12 10:32 ` [PATCH v2 08/12] scripts: netlink_yml_parser.py: improve index.rst generation Mauro Carvalho Chehab
@ 2025-06-12 10:32 ` Mauro Carvalho Chehab
  2025-06-13 11:29   ` Donald Hunter
  2025-06-12 10:32 ` [PATCH v2 10/12] docs: sphinx: parser_yaml.py: add Netlink specs parser Mauro Carvalho Chehab
                   ` (3 subsequent siblings)
  12 siblings, 1 reply; 32+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-12 10:32 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	joel, linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz,
	stern

Add a simple sphinx.Parser class meant to handle YAML files.

For now, it just replaces a YAML file with some simple ReST
content. I opted to do it this way so that this patch can
serve as a basis for new file format parsers; we may use it
as an example when parsing other types of files.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 Documentation/sphinx/parser_yaml.py | 63 +++++++++++++++++++++++++++++
 1 file changed, 63 insertions(+)
 create mode 100755 Documentation/sphinx/parser_yaml.py

diff --git a/Documentation/sphinx/parser_yaml.py b/Documentation/sphinx/parser_yaml.py
new file mode 100755
index 000000000000..b3cde9cf7aac
--- /dev/null
+++ b/Documentation/sphinx/parser_yaml.py
@@ -0,0 +1,63 @@
+"""
+Sphinx extension for processing YAML files
+"""
+
+import os
+
+from docutils.parsers.rst import Parser as RSTParser
+from docutils.statemachine import ViewList
+
+from sphinx.util import logging
+from sphinx.parsers import Parser
+
+from pprint import pformat
+
+logger = logging.getLogger(__name__)
+
+class YamlParser(Parser):
+    """Custom parser for YAML files."""
+
+    supported = ('yaml', 'yml')
+
+    # Overrides docutils.parsers.Parser. See sphinx.parsers.RSTParser
+    def parse(self, inputstring, document):
+        """Parse YAML and generate a document tree."""
+
+        self.setup_parse(inputstring, document)
+
+        result = ViewList()
+
+        try:
+            # FIXME: Test logic to generate some ReST content
+            basename = os.path.basename(document.current_source)
+            title = os.path.splitext(basename)[0].replace('_', ' ').title()
+
+            msg = f"{title}\n"
+            msg += "=" * len(title) + "\n\n"
+            msg += "Something\n"
+
+            # Parse message with RSTParser
+            for i, line in enumerate(msg.split('\n')):
+                result.append(line, document.current_source, i)
+
+            rst_parser = RSTParser()
+            rst_parser.parse('\n'.join(result), document)
+
+        except Exception as e:
+            document.reporter.error("YAML parsing error: %s" % pformat(e))
+
+        self.finish_parse()
+
+def setup(app):
+    """Setup function for the Sphinx extension."""
+
+    # Add YAML parser
+    app.add_source_parser(YamlParser)
+    app.add_source_suffix('.yaml', 'yaml')
+    app.add_source_suffix('.yml', 'yaml')
+
+    return {
+        'version': '1.0',
+        'parallel_read_safe': True,
+        'parallel_write_safe': True,
+    }
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 32+ messages in thread
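The placeholder ReST emitted by the template parser above just derives a title from the source file name. Isolated from Sphinx and docutils, that logic boils down to the following sketch (mirroring the FIXME block in the patch):

```python
import os

def yaml_stub_rst(path):
    """Turn 'some_file.yaml' into a minimal ReST stub, as the template parser does."""
    basename = os.path.basename(path)
    # 'rt_link' -> 'Rt Link'
    title = os.path.splitext(basename)[0].replace('_', ' ').title()
    return f"{title}\n" + "=" * len(title) + "\n\nSomething\n"
```

In the real extension this string is fed line by line into a `ViewList` and handed to `RSTParser`, so the stub is rendered exactly as if it had come from a `.rst` file.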

* [PATCH v2 10/12] docs: sphinx: parser_yaml.py: add Netlink specs parser
  2025-06-12 10:31 [PATCH v2 00/12] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (8 preceding siblings ...)
  2025-06-12 10:32 ` [PATCH v2 09/12] docs: sphinx: add a parser template for yaml files Mauro Carvalho Chehab
@ 2025-06-12 10:32 ` Mauro Carvalho Chehab
  2025-06-13 11:45   ` Donald Hunter
  2025-06-12 10:32 ` [PATCH v2 11/12] docs: use parser_yaml extension to handle Netlink specs Mauro Carvalho Chehab
                   ` (2 subsequent siblings)
  12 siblings, 1 reply; 32+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-12 10:32 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	joel, linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz,
	stern

Add code to parser_yaml.py to handle Netlink specs; all
other YAML files are ignored.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 .pylintrc                           |  2 +-
 Documentation/sphinx/parser_yaml.py | 39 +++++++++++++++++++++--------
 2 files changed, 29 insertions(+), 12 deletions(-)

diff --git a/.pylintrc b/.pylintrc
index 30b8ae1659f8..f1d21379254b 100644
--- a/.pylintrc
+++ b/.pylintrc
@@ -1,2 +1,2 @@
 [MASTER]
-init-hook='import sys; sys.path += ["scripts/lib/kdoc", "scripts/lib/abi"]'
+init-hook='import sys; sys.path += ["scripts/lib", "scripts/lib/kdoc", "scripts/lib/abi"]'
diff --git a/Documentation/sphinx/parser_yaml.py b/Documentation/sphinx/parser_yaml.py
index b3cde9cf7aac..eb32e3249274 100755
--- a/Documentation/sphinx/parser_yaml.py
+++ b/Documentation/sphinx/parser_yaml.py
@@ -3,6 +3,10 @@ Sphinx extension for processing YAML files
 """
 
 import os
+import re
+import sys
+
+from pprint import pformat
 
 from docutils.parsers.rst import Parser as RSTParser
 from docutils.statemachine import ViewList
@@ -10,7 +14,10 @@ from docutils.statemachine import ViewList
 from sphinx.util import logging
 from sphinx.parsers import Parser
 
-from pprint import pformat
+srctree = os.path.abspath(os.environ["srctree"])
+sys.path.insert(0, os.path.join(srctree, "scripts/lib"))
+
+from netlink_yml_parser import NetlinkYamlParser      # pylint: disable=C0413
 
 logger = logging.getLogger(__name__)
 
@@ -19,8 +26,9 @@ class YamlParser(Parser):
 
     supported = ('yaml', 'yml')
 
-    # Overrides docutils.parsers.Parser. See sphinx.parsers.RSTParser
-    def parse(self, inputstring, document):
+    netlink_parser = NetlinkYamlParser()
+
+    def do_parse(self, inputstring, document, msg):
         """Parse YAML and generate a document tree."""
 
         self.setup_parse(inputstring, document)
@@ -28,14 +36,6 @@ class YamlParser(Parser):
         result = ViewList()
 
         try:
-            # FIXME: Test logic to generate some ReST content
-            basename = os.path.basename(document.current_source)
-            title = os.path.splitext(basename)[0].replace('_', ' ').title()
-
-            msg = f"{title}\n"
-            msg += "=" * len(title) + "\n\n"
-            msg += "Something\n"
-
             # Parse message with RSTParser
             for i, line in enumerate(msg.split('\n')):
                 result.append(line, document.current_source, i)
@@ -48,6 +48,23 @@ class YamlParser(Parser):
 
         self.finish_parse()
 
+    # Overrides docutils.parsers.Parser. See sphinx.parsers.RSTParser
+    def parse(self, inputstring, document):
+        """Check if a YAML is meant to be parsed."""
+
+        fname = document.current_source
+
+        # Handle netlink yaml specs
+        if re.search("/netlink/specs/", fname):
+            if fname.endswith("index.yaml"):
+                msg = self.netlink_parser.generate_main_index_rst(fname, None)
+            else:
+                msg = self.netlink_parser.parse_yaml_file(fname)
+
+            self.do_parse(inputstring, document, msg)
+
+        # All other yaml files are ignored
+
 def setup(app):
     """Setup function for the Sphinx extension."""
 
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 32+ messages in thread

* [PATCH v2 11/12] docs: use parser_yaml extension to handle Netlink specs
  2025-06-12 10:31 [PATCH v2 00/12] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (9 preceding siblings ...)
  2025-06-12 10:32 ` [PATCH v2 10/12] docs: sphinx: parser_yaml.py: add Netlink specs parser Mauro Carvalho Chehab
@ 2025-06-12 10:32 ` Mauro Carvalho Chehab
  2025-06-13 11:50   ` Donald Hunter
  2025-06-12 10:32 ` [PATCH v2 12/12] docs: conf.py: don't handle yaml files outside " Mauro Carvalho Chehab
  2025-06-13 11:05 ` [PATCH v2 00/12] Don't generate netlink .rst files inside $(srctree) Donald Hunter
  12 siblings, 1 reply; 32+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-12 10:32 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Instead of manually calling ynl_gen_rst.py, use a Sphinx extension.
This way, no .rst files are written to the kernel source
directories.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 Documentation/Makefile                        | 17 ---------
 Documentation/conf.py                         | 11 +++---
 Documentation/netlink/specs/index.rst         | 38 +++++++++++++++++++
 Documentation/networking/index.rst            |  2 +-
 .../networking/netlink_spec/readme.txt        |  4 --
 Documentation/sphinx/parser_yaml.py           |  2 +-
 6 files changed, 46 insertions(+), 28 deletions(-)
 create mode 100644 Documentation/netlink/specs/index.rst
 delete mode 100644 Documentation/networking/netlink_spec/readme.txt

diff --git a/Documentation/Makefile b/Documentation/Makefile
index d30d66ddf1ad..9185680b1e86 100644
--- a/Documentation/Makefile
+++ b/Documentation/Makefile
@@ -102,22 +102,6 @@ quiet_cmd_sphinx = SPHINX  $@ --> file://$(abspath $(BUILDDIR)/$3/$4)
 		cp $(if $(patsubst /%,,$(DOCS_CSS)),$(abspath $(srctree)/$(DOCS_CSS)),$(DOCS_CSS)) $(BUILDDIR)/$3/_static/; \
 	fi
 
-YNL_INDEX:=$(srctree)/Documentation/networking/netlink_spec/index.rst
-YNL_RST_DIR:=$(srctree)/Documentation/networking/netlink_spec
-YNL_YAML_DIR:=$(srctree)/Documentation/netlink/specs
-YNL_TOOL:=$(srctree)/tools/net/ynl/pyynl/ynl_gen_rst.py
-
-YNL_RST_FILES_TMP := $(patsubst %.yaml,%.rst,$(wildcard $(YNL_YAML_DIR)/*.yaml))
-YNL_RST_FILES := $(patsubst $(YNL_YAML_DIR)%,$(YNL_RST_DIR)%, $(YNL_RST_FILES_TMP))
-
-$(YNL_INDEX): $(YNL_RST_FILES)
-	$(Q)$(YNL_TOOL) -o $@ -x
-
-$(YNL_RST_DIR)/%.rst: $(YNL_YAML_DIR)/%.yaml $(YNL_TOOL)
-	$(Q)$(YNL_TOOL) -i $< -o $@
-
-htmldocs texinfodocs latexdocs epubdocs xmldocs: $(YNL_INDEX)
-
 htmldocs:
 	@$(srctree)/scripts/sphinx-pre-install --version-check
 	@+$(foreach var,$(SPHINXDIRS),$(call loop_cmd,sphinx,html,$(var),,$(var)))
@@ -184,7 +168,6 @@ refcheckdocs:
 	$(Q)cd $(srctree);scripts/documentation-file-ref-check
 
 cleandocs:
-	$(Q)rm -f $(YNL_INDEX) $(YNL_RST_FILES)
 	$(Q)rm -rf $(BUILDDIR)
 	$(Q)$(MAKE) BUILDDIR=$(abspath $(BUILDDIR)) $(build)=Documentation/userspace-api/media clean
 
diff --git a/Documentation/conf.py b/Documentation/conf.py
index 12de52a2b17e..add6ce78dd80 100644
--- a/Documentation/conf.py
+++ b/Documentation/conf.py
@@ -45,7 +45,7 @@ needs_sphinx = '3.4.3'
 extensions = ['kerneldoc', 'rstFlatTable', 'kernel_include',
               'kfigure', 'sphinx.ext.ifconfig', 'automarkup',
               'maintainers_include', 'sphinx.ext.autosectionlabel',
-              'kernel_abi', 'kernel_feat', 'translations']
+              'kernel_abi', 'kernel_feat', 'translations', 'parser_yaml']
 
 # Since Sphinx version 3, the C function parser is more pedantic with regards
 # to type checking. Due to that, having macros at c:function cause problems.
@@ -143,10 +143,11 @@ else:
 # Add any paths that contain templates here, relative to this directory.
 templates_path = ['sphinx/templates']
 
-# The suffix(es) of source filenames.
-# You can specify multiple suffix as a list of string:
-# source_suffix = ['.rst', '.md']
-source_suffix = '.rst'
+# The suffixes of source filenames that will be automatically parsed
+source_suffix = {
+        '.rst': 'restructuredtext',
+        '.yaml': 'yaml',
+}
 
 # The encoding of source files.
 #source_encoding = 'utf-8-sig'
diff --git a/Documentation/netlink/specs/index.rst b/Documentation/netlink/specs/index.rst
new file mode 100644
index 000000000000..ca0bf816dc3f
--- /dev/null
+++ b/Documentation/netlink/specs/index.rst
@@ -0,0 +1,38 @@
+.. SPDX-License-Identifier: GPL-2.0
+.. NOTE: This document was auto-generated.
+
+.. _specs:
+
+=============================
+Netlink Family Specifications
+=============================
+
+.. toctree::
+   :maxdepth: 1
+
+   conntrack
+   devlink
+   dpll
+   ethtool
+   fou
+   handshake
+   lockd
+   mptcp_pm
+   net_shaper
+   netdev
+   nfsd
+   nftables
+   nl80211
+   nlctrl
+   ovpn
+   ovs_datapath
+   ovs_flow
+   ovs_vport
+   rt-addr
+   rt-link
+   rt-neigh
+   rt-route
+   rt-rule
+   tc
+   tcp_metrics
+   team
diff --git a/Documentation/networking/index.rst b/Documentation/networking/index.rst
index ac90b82f3ce9..b7a4969e9bc9 100644
--- a/Documentation/networking/index.rst
+++ b/Documentation/networking/index.rst
@@ -57,7 +57,7 @@ Contents:
    filter
    generic-hdlc
    generic_netlink
-   netlink_spec/index
+   ../netlink/specs/index
    gen_stats
    gtp
    ila
diff --git a/Documentation/networking/netlink_spec/readme.txt b/Documentation/networking/netlink_spec/readme.txt
deleted file mode 100644
index 030b44aca4e6..000000000000
--- a/Documentation/networking/netlink_spec/readme.txt
+++ /dev/null
@@ -1,4 +0,0 @@
-SPDX-License-Identifier: GPL-2.0
-
-This file is populated during the build of the documentation (htmldocs) by the
-tools/net/ynl/pyynl/ynl_gen_rst.py script.
diff --git a/Documentation/sphinx/parser_yaml.py b/Documentation/sphinx/parser_yaml.py
index eb32e3249274..cdcafe5b3937 100755
--- a/Documentation/sphinx/parser_yaml.py
+++ b/Documentation/sphinx/parser_yaml.py
@@ -55,7 +55,7 @@ class YamlParser(Parser):
         fname = document.current_source
 
         # Handle netlink yaml specs
-        if re.search("/netlink/specs/", fname):
+        if re.search("netlink/specs/", fname):
             if fname.endswith("index.yaml"):
                 msg = self.netlink_parser.generate_main_index_rst(fname, None)
             else:
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 32+ messages in thread

* [PATCH v2 12/12] docs: conf.py: don't handle yaml files outside Netlink specs
  2025-06-12 10:31 [PATCH v2 00/12] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (10 preceding siblings ...)
  2025-06-12 10:32 ` [PATCH v2 11/12] docs: use parser_yaml extension to handle Netlink specs Mauro Carvalho Chehab
@ 2025-06-12 10:32 ` Mauro Carvalho Chehab
  2025-06-13 11:52   ` Donald Hunter
  2025-06-13 11:05 ` [PATCH v2 00/12] Don't generate netlink .rst files inside $(srctree) Donald Hunter
  12 siblings, 1 reply; 32+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-12 10:32 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	joel, linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz,
	stern

The parser_yaml extension already has logic to avoid
handling all YAML documents. However, if we don't also exclude
the patterns in conf.py, the build time increases a lot
and warnings like these are generated:

    Documentation/netlink/genetlink.yaml: WARNING: document isn't included in any toctree
    Documentation/netlink/genetlink-c.yaml: WARNING: document isn't included in any toctree
    Documentation/netlink/genetlink-legacy.yaml: WARNING: document isn't included in any toctree
    Documentation/netlink/index.rst: WARNING: document isn't included in any toctree
    Documentation/netlink/netlink-raw.yaml: WARNING: document isn't included in any toctree

Add some exclusion rules to prevent that.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 Documentation/conf.py | 6 +++++-
 1 file changed, 5 insertions(+), 1 deletion(-)

diff --git a/Documentation/conf.py b/Documentation/conf.py
index add6ce78dd80..b8668bcaf090 100644
--- a/Documentation/conf.py
+++ b/Documentation/conf.py
@@ -222,7 +222,11 @@ language = 'en'
 
 # List of patterns, relative to source directory, that match files and
 # directories to ignore when looking for source files.
-exclude_patterns = ['output']
+exclude_patterns = [
+	'output',
+	'devicetree/bindings/**.yaml',
+	'netlink/*.yaml',
+]
 
 # The reST default role (used for this markup: `text`) to use for all
 # documents.
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 32+ messages in thread

* Re: [PATCH v2 00/12] Don't generate netlink .rst files inside $(srctree)
  2025-06-12 10:31 [PATCH v2 00/12] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (11 preceding siblings ...)
  2025-06-12 10:32 ` [PATCH v2 12/12] docs: conf.py: don't handle yaml files outside " Mauro Carvalho Chehab
@ 2025-06-13 11:05 ` Donald Hunter
  2025-06-13 12:13   ` Mauro Carvalho Chehab
  12 siblings, 1 reply; 32+ messages in thread
From: Donald Hunter @ 2025-06-13 11:05 UTC (permalink / raw)
  To: Mauro Carvalho Chehab
  Cc: Linux Doc Mailing List, Jonathan Corbet, linux-kernel,
	Akira Yokosawa, David S. Miller, Ignacio Encinas Rubio,
	Marco Elver, Shuah Khan, Eric Dumazet, Jan Stancek, Paolo Abeni,
	Ruben Wauters, joel, linux-kernel-mentees, lkmm, netdev, peterz,
	stern, Breno Leitao

Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:

> As discussed at:
>    https://lore.kernel.org/all/20250610101331.62ba466f@foz.lan/
>
> changeset f061c9f7d058 ("Documentation: Document each netlink family")
> added a logic which generates *.rst files inside $(srctree). This is bad when
> O=<BUILDDIR> is used.
>
> A recent change renamed the yaml files used by Netlink, revealing a bad
> side effect: as "make cleandocs" don't clean the produced files, symbols 
> appear duplicated for people that don't build the kernel from scratch.
>
> There are some possible solutions for that. The simplest one, which is what
> this series address, places the build files inside Documentation/output. 
> The changes to do that are simple enough, but has one drawback,
> as it requires a (simple) template file for every netlink family file from
> netlink/specs. The template is simple enough:
>
>         .. kernel-include:: $BUILDDIR/networking/netlink_spec/<family>.rst

I think we could skip describing this since it was an approach that has
now been dropped.

> Part of the issue is that sphinx-build only produces html files for sources
> inside the source tree (Documentation/). 
>
> To address that, add an yaml parser extension to Sphinx.
>
> It should be noticed that this version has one drawback: it increases the
> documentation build time. I suspect that the culprit is inside Sphinx
> glob logic and the way it handles exclude_patterns. What happens is that
> sphinx/project.py uses glob, which, on my own experiences, it is slow
> (due to that, I ended implementing my own glob logic for kernel-doc).
>
> On the plus side, the extension is flexible enough to handle other types
> of yaml files, as the actual yaml conversion logic is outside the extension.

I don't think the extension would handle anything other than the Netlink
yaml specs, and I don't think that should be a goal of this patchset.

> With this version, there's no need to add any template file per netlink/spec
> file. Yet, the Documentation/netlink/spec.index.rst require updates as
> spec files are added/renamed/removed. The already-existing script can
> handle it automatically by running:
>
>             tools/net/ynl/pyynl/ynl_gen_rst.py -x  -v -o Documentation/netlink/specs/index.rst

I think this can be avoided by using the toctree glob directive in the
index, like this:

=============================
Netlink Family Specifications
=============================

.. toctree::
   :maxdepth: 1
   :glob:

   *

This would let you have a static index file.

> ---
>
> v2:
> - Use a Sphinx extension to handle netlink files.
>
> v1:
> - Statically add template files to as networking/netlink_spec/<family>.rst
>
> Mauro Carvalho Chehab (12):
>   tools: ynl_gen_rst.py: create a top-level reference
>   docs: netlink: netlink-raw.rst: use :ref: instead of :doc:

I suggest combining the first 2 patches.

>   docs: netlink: don't ignore generated rst files

Maybe leave this patch to the end and change the description to be a
cleanup of the remnants of the old approach.

Further comments on specific commits follow.

>   tools: ynl_gen_rst.py: make the index parser more generic
>   tools: ynl_gen_rst.py: Split library from command line tool
>   scripts: lib: netlink_yml_parser.py: use classes
>   tools: ynl_gen_rst.py: do some coding style cleanups
>   scripts: netlink_yml_parser.py: improve index.rst generation
>   docs: sphinx: add a parser template for yaml files
>   docs: sphinx: parser_yaml.py: add Netlink specs parser

Please combine these 2 patches. The template patch just introduces noise
into the series and makes it harder to review.

>   docs: use parser_yaml extension to handle Netlink specs
>   docs: conf.py: don't handle yaml files outside Netlink specs
>
>  .pylintrc                                     |   2 +-
>  Documentation/Makefile                        |  17 -
>  Documentation/conf.py                         |  17 +-
>  Documentation/netlink/specs/index.rst         |  38 ++
>  Documentation/networking/index.rst            |   2 +-
>  .../networking/netlink_spec/.gitignore        |   1 -
>  .../networking/netlink_spec/readme.txt        |   4 -
>  Documentation/sphinx/parser_yaml.py           |  80 ++++
>  .../userspace-api/netlink/netlink-raw.rst     |   6 +-
>  scripts/lib/netlink_yml_parser.py             | 394 ++++++++++++++++++
>  tools/net/ynl/pyynl/ynl_gen_rst.py            | 378 +----------------
>  11 files changed, 544 insertions(+), 395 deletions(-)
>  create mode 100644 Documentation/netlink/specs/index.rst
>  delete mode 100644 Documentation/networking/netlink_spec/.gitignore
>  delete mode 100644 Documentation/networking/netlink_spec/readme.txt
>  create mode 100755 Documentation/sphinx/parser_yaml.py
>  create mode 100755 scripts/lib/netlink_yml_parser.py

^ permalink raw reply	[flat|nested] 32+ messages in thread

* Re: [PATCH v2 05/12] tools: ynl_gen_rst.py: Split library from command line tool
  2025-06-12 10:31 ` [PATCH v2 05/12] tools: ynl_gen_rst.py: Split library from command line tool Mauro Carvalho Chehab
@ 2025-06-13 11:13   ` Donald Hunter
  2025-06-13 12:17     ` Mauro Carvalho Chehab
  0 siblings, 1 reply; 32+ messages in thread
From: Donald Hunter @ 2025-06-13 11:13 UTC (permalink / raw)
  To: Mauro Carvalho Chehab
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:

> As we'll be using the Netlink specs parser inside a Sphinx
> extension, move the library part from the command line parser.
>
> No functional changes.
>
> Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
> ---
>  scripts/lib/netlink_yml_parser.py  | 391 +++++++++++++++++++++++++++++
>  tools/net/ynl/pyynl/ynl_gen_rst.py | 374 +--------------------------

I think the library code should be put in tools/net/ynl/pyynl/lib
because it is YNL-specific code. Maybe call it rst_generator.py

^ permalink raw reply	[flat|nested] 32+ messages in thread

* Re: [PATCH v2 06/12] scripts: lib: netlink_yml_parser.py: use classes
  2025-06-12 10:31 ` [PATCH v2 06/12] scripts: lib: netlink_yml_parser.py: use classes Mauro Carvalho Chehab
@ 2025-06-13 11:20   ` Donald Hunter
  2025-06-13 12:40     ` Mauro Carvalho Chehab
  0 siblings, 1 reply; 32+ messages in thread
From: Donald Hunter @ 2025-06-13 11:20 UTC (permalink / raw)
  To: Mauro Carvalho Chehab
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:

> As we'll be importing netlink parser into a Sphinx extension,
> move all functions and global variables inside two classes:
>
> - RstFormatters, containing ReST formatter logic, which are
>   YAML independent;
> - NetlinkYamlParser: contains the actual parser classes. That's
>   the only class that needs to be imported by the script or by
>   a Sphinx extension.

I suggest a third class for the doc generator that is separate from the
yaml parsing. The yaml parsing should really be refactored to reuse
tools/net/ynl/pyynl/lib/nlspec.py at some point.
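
A rough sketch of the split being suggested here; all class and method
names below are illustrative only, not taken from the series:

```python
class RstFormatters:
    """ReST formatting helpers, independent of any YAML handling."""

    @staticmethod
    def headroom(title):
        # Title followed by a matching '=' underline
        return f"{title}\n{'=' * len(title)}\n"


class NetlinkYamlParser:
    """Parses a spec into plain data (could later reuse nlspec.py)."""

    def parse(self, text):
        # Stand-in for real YAML parsing
        return {"name": text.strip()}


class NetlinkDocGenerator:
    """Turns parsed spec data into ReST, keeping parsing separate."""

    def __init__(self):
        self.parser = NetlinkYamlParser()
        self.fmt = RstFormatters()

    def generate(self, text):
        spec = self.parser.parse(text)
        return self.fmt.headroom(spec["name"].title())


print(NetlinkDocGenerator().generate("ethtool"))
```

The point of the split is that the doc generator depends on the parser's
output, never on the YAML itself, so the parser can be swapped out later.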

> With that, we won't pollute Sphinx namespace, avoiding any
> potential clashes.
>
> Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>

^ permalink raw reply	[flat|nested] 32+ messages in thread

* Re: [PATCH v2 09/12] docs: sphinx: add a parser template for yaml files
  2025-06-12 10:32 ` [PATCH v2 09/12] docs: sphinx: add a parser template for yaml files Mauro Carvalho Chehab
@ 2025-06-13 11:29   ` Donald Hunter
  2025-06-13 12:26     ` Mauro Carvalho Chehab
  0 siblings, 1 reply; 32+ messages in thread
From: Donald Hunter @ 2025-06-13 11:29 UTC (permalink / raw)
  To: Mauro Carvalho Chehab
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:

> Add a simple sphinx.Parser class meant to handle yaml files.
>
> For now, it just replaces a yaml file by a simple ReST
> code. I opted to do this way, as this patch can be used as
> basis for new file format parsers. We may use this as an
> example to parse other types of files.
>
> Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
> ---
>  Documentation/sphinx/parser_yaml.py | 63 +++++++++++++++++++++++++++++
>  1 file changed, 63 insertions(+)
>  create mode 100755 Documentation/sphinx/parser_yaml.py

It's not a generic yaml parser so the file should be
netlink_doc_generator.py or something.

>
> diff --git a/Documentation/sphinx/parser_yaml.py b/Documentation/sphinx/parser_yaml.py
> new file mode 100755
> index 000000000000..b3cde9cf7aac
> --- /dev/null
> +++ b/Documentation/sphinx/parser_yaml.py
> @@ -0,0 +1,63 @@
> +"""
> +Sphinx extension for processing YAML files
> +"""
> +
> +import os
> +
> +from docutils.parsers.rst import Parser as RSTParser
> +from docutils.statemachine import ViewList
> +
> +from sphinx.util import logging
> +from sphinx.parsers import Parser
> +
> +from pprint import pformat
> +
> +logger = logging.getLogger(__name__)
> +
> +class YamlParser(Parser):
> +    """Custom parser for YAML files."""

The class is only intended to be a netlink doc generator, so I suggest
calling it NetlinkDocGenerator

> +
> +    supported = ('yaml', 'yml')

I don't think we need to support the .yml extension.

> +
> +    # Overrides docutils.parsers.Parser. See sphinx.parsers.RSTParser
> +    def parse(self, inputstring, document):
> +        """Parse YAML and generate a document tree."""
> +
> +        self.setup_parse(inputstring, document)
> +
> +        result = ViewList()
> +
> +        try:
> +            # FIXME: Test logic to generate some ReST content
> +            basename = os.path.basename(document.current_source)
> +            title = os.path.splitext(basename)[0].replace('_', ' ').title()
> +
> +            msg = f"{title}\n"
> +            msg += "=" * len(title) + "\n\n"
> +            msg += "Something\n"
> +
> +            # Parse message with RSTParser
> +            for i, line in enumerate(msg.split('\n')):
> +                result.append(line, document.current_source, i)
> +
> +            rst_parser = RSTParser()
> +            rst_parser.parse('\n'.join(result), document)
> +
> +        except Exception as e:
> +            document.reporter.error("YAML parsing error: %s" % pformat(e))
> +
> +        self.finish_parse()
> +
> +def setup(app):
> +    """Setup function for the Sphinx extension."""
> +
> +    # Add YAML parser
> +    app.add_source_parser(YamlParser)
> +    app.add_source_suffix('.yaml', 'yaml')
> +    app.add_source_suffix('.yml', 'yaml')

No need to support the .yml extension.

> +
> +    return {
> +        'version': '1.0',
> +        'parallel_read_safe': True,
> +        'parallel_write_safe': True,
> +    }

^ permalink raw reply	[flat|nested] 32+ messages in thread

* Re: [PATCH v2 10/12] docs: sphinx: parser_yaml.py: add Netlink specs parser
  2025-06-12 10:32 ` [PATCH v2 10/12] docs: sphinx: parser_yaml.py: add Netlink specs parser Mauro Carvalho Chehab
@ 2025-06-13 11:45   ` Donald Hunter
  2025-06-13 12:27     ` Mauro Carvalho Chehab
  0 siblings, 1 reply; 32+ messages in thread
From: Donald Hunter @ 2025-06-13 11:45 UTC (permalink / raw)
  To: Mauro Carvalho Chehab
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:

> Place the code at parser_yaml.py to handle Netlink specs. All
> other yaml files are ignored.
>
> Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
> ---
>  .pylintrc                           |  2 +-
>  Documentation/sphinx/parser_yaml.py | 39 +++++++++++++++++++++--------
>  2 files changed, 29 insertions(+), 12 deletions(-)
>
> diff --git a/.pylintrc b/.pylintrc
> index 30b8ae1659f8..f1d21379254b 100644
> --- a/.pylintrc
> +++ b/.pylintrc
> @@ -1,2 +1,2 @@
>  [MASTER]
> -init-hook='import sys; sys.path += ["scripts/lib/kdoc", "scripts/lib/abi"]'
> +init-hook='import sys; sys.path += ["scripts/lib", "scripts/lib/kdoc", "scripts/lib/abi"]'
> diff --git a/Documentation/sphinx/parser_yaml.py b/Documentation/sphinx/parser_yaml.py
> index b3cde9cf7aac..eb32e3249274 100755
> --- a/Documentation/sphinx/parser_yaml.py
> +++ b/Documentation/sphinx/parser_yaml.py
> @@ -3,6 +3,10 @@ Sphinx extension for processing YAML files
>  """
>  
>  import os
> +import re
> +import sys
> +
> +from pprint import pformat
>  
>  from docutils.parsers.rst import Parser as RSTParser
>  from docutils.statemachine import ViewList
> @@ -10,7 +14,10 @@ from docutils.statemachine import ViewList
>  from sphinx.util import logging
>  from sphinx.parsers import Parser
>  
> -from pprint import pformat
> +srctree = os.path.abspath(os.environ["srctree"])
> +sys.path.insert(0, os.path.join(srctree, "scripts/lib"))
> +
> +from netlink_yml_parser import NetlinkYamlParser      # pylint: disable=C0413
>  
>  logger = logging.getLogger(__name__)
>  
> @@ -19,8 +26,9 @@ class YamlParser(Parser):
>  
>      supported = ('yaml', 'yml')
>  
> -    # Overrides docutils.parsers.Parser. See sphinx.parsers.RSTParser
> -    def parse(self, inputstring, document):
> +    netlink_parser = NetlinkYamlParser()
> +
> +    def do_parse(self, inputstring, document, msg):
>          """Parse YAML and generate a document tree."""
>  
>          self.setup_parse(inputstring, document)
> @@ -28,14 +36,6 @@ class YamlParser(Parser):
>          result = ViewList()
>  
>          try:
> -            # FIXME: Test logic to generate some ReST content
> -            basename = os.path.basename(document.current_source)
> -            title = os.path.splitext(basename)[0].replace('_', ' ').title()
> -
> -            msg = f"{title}\n"
> -            msg += "=" * len(title) + "\n\n"
> -            msg += "Something\n"
> -
>              # Parse message with RSTParser
>              for i, line in enumerate(msg.split('\n')):
>                  result.append(line, document.current_source, i)
> @@ -48,6 +48,23 @@ class YamlParser(Parser):
>  
>          self.finish_parse()
>  
> +    # Overrides docutils.parsers.Parser. See sphinx.parsers.RSTParser
> +    def parse(self, inputstring, document):
> +        """Check if a YAML is meant to be parsed."""
> +
> +        fname = document.current_source
> +
> +        # Handle netlink yaml specs
> +        if re.search("/netlink/specs/", fname):

The re.search is overkill since the pattern is a plain string, not a regexp. You can instead say:

    if '/netlink/specs/' in fname:
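
For a fixed pattern with no regex metacharacters, the substring test is
equivalent to re.search(); a quick standalone check (example paths are
made up):

```python
import re

paths = [
    "/src/linux/Documentation/netlink/specs/ethtool.yaml",
    "/src/linux/Documentation/netlink/genetlink.yaml",
]

for fname in paths:
    # Both forms agree because "/netlink/specs/" matches literally
    assert ("/netlink/specs/" in fname) == \
        bool(re.search("/netlink/specs/", fname))

print([p for p in paths if "/netlink/specs/" in p])
```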

> +            if fname.endswith("index.yaml"):
> +                msg = self.netlink_parser.generate_main_index_rst(fname, None)
> +            else:

I'm guessing we can drop these lines if the static index.rst approach works.

> +                msg = self.netlink_parser.parse_yaml_file(fname)
> +
> +            self.do_parse(inputstring, document, msg)
> +
> +        # All other yaml files are ignored
> +
>  def setup(app):
>      """Setup function for the Sphinx extension."""

^ permalink raw reply	[flat|nested] 32+ messages in thread

* Re: [PATCH v2 11/12] docs: use parser_yaml extension to handle Netlink specs
  2025-06-12 10:32 ` [PATCH v2 11/12] docs: use parser_yaml extension to handle Netlink specs Mauro Carvalho Chehab
@ 2025-06-13 11:50   ` Donald Hunter
  2025-06-13 12:29     ` Mauro Carvalho Chehab
  0 siblings, 1 reply; 32+ messages in thread
From: Donald Hunter @ 2025-06-13 11:50 UTC (permalink / raw)
  To: Mauro Carvalho Chehab
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:

> Instead of manually calling ynl_gen_rst.py, use a Sphinx extension.
> This way, no .rst files would be written to the Kernel source
> directories.
>
> Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
> ---
>  Documentation/Makefile                        | 17 ---------
>  Documentation/conf.py                         | 11 +++---
>  Documentation/netlink/specs/index.rst         | 38 +++++++++++++++++++
>  Documentation/networking/index.rst            |  2 +-
>  .../networking/netlink_spec/readme.txt        |  4 --
>  Documentation/sphinx/parser_yaml.py           |  2 +-
>  6 files changed, 46 insertions(+), 28 deletions(-)
>  create mode 100644 Documentation/netlink/specs/index.rst
>  delete mode 100644 Documentation/networking/netlink_spec/readme.txt
>
> diff --git a/Documentation/Makefile b/Documentation/Makefile
> index d30d66ddf1ad..9185680b1e86 100644
> --- a/Documentation/Makefile
> +++ b/Documentation/Makefile
> @@ -102,22 +102,6 @@ quiet_cmd_sphinx = SPHINX  $@ --> file://$(abspath $(BUILDDIR)/$3/$4)
>  		cp $(if $(patsubst /%,,$(DOCS_CSS)),$(abspath $(srctree)/$(DOCS_CSS)),$(DOCS_CSS)) $(BUILDDIR)/$3/_static/; \
>  	fi
>  
> -YNL_INDEX:=$(srctree)/Documentation/networking/netlink_spec/index.rst
> -YNL_RST_DIR:=$(srctree)/Documentation/networking/netlink_spec
> -YNL_YAML_DIR:=$(srctree)/Documentation/netlink/specs
> -YNL_TOOL:=$(srctree)/tools/net/ynl/pyynl/ynl_gen_rst.py
> -
> -YNL_RST_FILES_TMP := $(patsubst %.yaml,%.rst,$(wildcard $(YNL_YAML_DIR)/*.yaml))
> -YNL_RST_FILES := $(patsubst $(YNL_YAML_DIR)%,$(YNL_RST_DIR)%, $(YNL_RST_FILES_TMP))
> -
> -$(YNL_INDEX): $(YNL_RST_FILES)
> -	$(Q)$(YNL_TOOL) -o $@ -x
> -
> -$(YNL_RST_DIR)/%.rst: $(YNL_YAML_DIR)/%.yaml $(YNL_TOOL)
> -	$(Q)$(YNL_TOOL) -i $< -o $@
> -
> -htmldocs texinfodocs latexdocs epubdocs xmldocs: $(YNL_INDEX)
> -
>  htmldocs:
>  	@$(srctree)/scripts/sphinx-pre-install --version-check
>  	@+$(foreach var,$(SPHINXDIRS),$(call loop_cmd,sphinx,html,$(var),,$(var)))
> @@ -184,7 +168,6 @@ refcheckdocs:
>  	$(Q)cd $(srctree);scripts/documentation-file-ref-check
>  
>  cleandocs:
> -	$(Q)rm -f $(YNL_INDEX) $(YNL_RST_FILES)
>  	$(Q)rm -rf $(BUILDDIR)
>  	$(Q)$(MAKE) BUILDDIR=$(abspath $(BUILDDIR)) $(build)=Documentation/userspace-api/media clean
>  
> diff --git a/Documentation/conf.py b/Documentation/conf.py
> index 12de52a2b17e..add6ce78dd80 100644
> --- a/Documentation/conf.py
> +++ b/Documentation/conf.py
> @@ -45,7 +45,7 @@ needs_sphinx = '3.4.3'
>  extensions = ['kerneldoc', 'rstFlatTable', 'kernel_include',
>                'kfigure', 'sphinx.ext.ifconfig', 'automarkup',
>                'maintainers_include', 'sphinx.ext.autosectionlabel',
> -              'kernel_abi', 'kernel_feat', 'translations']
> +              'kernel_abi', 'kernel_feat', 'translations', 'parser_yaml']
>  
>  # Since Sphinx version 3, the C function parser is more pedantic with regards
>  # to type checking. Due to that, having macros at c:function cause problems.
> @@ -143,10 +143,11 @@ else:
>  # Add any paths that contain templates here, relative to this directory.
>  templates_path = ['sphinx/templates']
>  
> -# The suffix(es) of source filenames.
> -# You can specify multiple suffix as a list of string:
> -# source_suffix = ['.rst', '.md']
> -source_suffix = '.rst'
> +# The suffixes of source filenames that will be automatically parsed
> +source_suffix = {
> +        '.rst': 'restructuredtext',
> +        '.yaml': 'yaml',

The handler name should probably be netlink_yaml 

> +}
>  
>  # The encoding of source files.
>  #source_encoding = 'utf-8-sig'
> diff --git a/Documentation/netlink/specs/index.rst b/Documentation/netlink/specs/index.rst
> new file mode 100644
> index 000000000000..ca0bf816dc3f
> --- /dev/null
> +++ b/Documentation/netlink/specs/index.rst
> @@ -0,0 +1,38 @@
> +.. SPDX-License-Identifier: GPL-2.0
> +.. NOTE: This document was auto-generated.
> +
> +.. _specs:
> +
> +=============================
> +Netlink Family Specifications
> +=============================
> +
> +.. toctree::
> +   :maxdepth: 1
> +
> +   conntrack
> +   devlink
> +   dpll
> +   ethtool
> +   fou
> +   handshake
> +   lockd
> +   mptcp_pm
> +   net_shaper
> +   netdev
> +   nfsd
> +   nftables
> +   nl80211
> +   nlctrl
> +   ovpn
> +   ovs_datapath
> +   ovs_flow
> +   ovs_vport
> +   rt-addr
> +   rt-link
> +   rt-neigh
> +   rt-route
> +   rt-rule
> +   tc
> +   tcp_metrics
> +   team
> diff --git a/Documentation/networking/index.rst b/Documentation/networking/index.rst
> index ac90b82f3ce9..b7a4969e9bc9 100644
> --- a/Documentation/networking/index.rst
> +++ b/Documentation/networking/index.rst
> @@ -57,7 +57,7 @@ Contents:
>     filter
>     generic-hdlc
>     generic_netlink
> -   netlink_spec/index
> +   ../netlink/specs/index
>     gen_stats
>     gtp
>     ila
> diff --git a/Documentation/networking/netlink_spec/readme.txt b/Documentation/networking/netlink_spec/readme.txt
> deleted file mode 100644
> index 030b44aca4e6..000000000000
> --- a/Documentation/networking/netlink_spec/readme.txt
> +++ /dev/null
> @@ -1,4 +0,0 @@
> -SPDX-License-Identifier: GPL-2.0
> -
> -This file is populated during the build of the documentation (htmldocs) by the
> -tools/net/ynl/pyynl/ynl_gen_rst.py script.
> diff --git a/Documentation/sphinx/parser_yaml.py b/Documentation/sphinx/parser_yaml.py
> index eb32e3249274..cdcafe5b3937 100755
> --- a/Documentation/sphinx/parser_yaml.py
> +++ b/Documentation/sphinx/parser_yaml.py
> @@ -55,7 +55,7 @@ class YamlParser(Parser):
>          fname = document.current_source
>  
>          # Handle netlink yaml specs
> -        if re.search("/netlink/specs/", fname):
> +        if re.search("netlink/specs/", fname):

Please combine this change into the earlier patch so that the series
doesn't have unnecessary changes.

>              if fname.endswith("index.yaml"):
>                  msg = self.netlink_parser.generate_main_index_rst(fname, None)
>              else:

^ permalink raw reply	[flat|nested] 32+ messages in thread

* Re: [PATCH v2 12/12] docs: conf.py: don't handle yaml files outside Netlink specs
  2025-06-12 10:32 ` [PATCH v2 12/12] docs: conf.py: don't handle yaml files outside " Mauro Carvalho Chehab
@ 2025-06-13 11:52   ` Donald Hunter
  2025-06-13 12:30     ` Mauro Carvalho Chehab
  0 siblings, 1 reply; 32+ messages in thread
From: Donald Hunter @ 2025-06-13 11:52 UTC (permalink / raw)
  To: Mauro Carvalho Chehab
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:

> The parser_yaml extension already has logic to avoid
> handling all yaml documents. However, if we don't also exclude
> the patterns in conf.py, the build time increases a lot,
> and warnings like these are generated:
>
>     Documentation/netlink/genetlink.yaml: WARNING: document isn't included in any toctree
>     Documentation/netlink/genetlink-c.yaml: WARNING: document isn't included in any toctree
>     Documentation/netlink/genetlink-legacy.yaml: WARNING: document isn't included in any toctree
>     Documentation/netlink/index.rst: WARNING: document isn't included in any toctree
>     Documentation/netlink/netlink-raw.yaml: WARNING: document isn't included in any toctree
>
> Add some exclusion rules to prevent that.
>
> Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
> ---
>  Documentation/conf.py | 6 +++++-
>  1 file changed, 5 insertions(+), 1 deletion(-)
>
> diff --git a/Documentation/conf.py b/Documentation/conf.py
> index add6ce78dd80..b8668bcaf090 100644
> --- a/Documentation/conf.py
> +++ b/Documentation/conf.py
> @@ -222,7 +222,11 @@ language = 'en'
>  
>  # List of patterns, relative to source directory, that match files and
>  # directories to ignore when looking for source files.
> -exclude_patterns = ['output']
> +exclude_patterns = [
> +	'output',
> +	'devicetree/bindings/**.yaml',
> +	'netlink/*.yaml',
> +]

Please merge this with the earlier patch that changes these lines so
that the series doesn't contain unnecessary intermediate steps.

>  # The reST default role (used for this markup: `text`) to use for all
>  # documents.

^ permalink raw reply	[flat|nested] 32+ messages in thread

* Re: [PATCH v2 00/12] Don't generate netlink .rst files inside $(srctree)
  2025-06-13 11:05 ` [PATCH v2 00/12] Don't generate netlink .rst files inside $(srctree) Donald Hunter
@ 2025-06-13 12:13   ` Mauro Carvalho Chehab
  2025-06-14 13:29     ` Donald Hunter
  0 siblings, 1 reply; 32+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-13 12:13 UTC (permalink / raw)
  To: Donald Hunter
  Cc: Linux Doc Mailing List, Jonathan Corbet, linux-kernel,
	Akira Yokosawa, David S. Miller, Ignacio Encinas Rubio,
	Marco Elver, Shuah Khan, Eric Dumazet, Jan Stancek, Paolo Abeni,
	Ruben Wauters, joel, linux-kernel-mentees, lkmm, netdev, peterz,
	stern, Breno Leitao

On Fri, 13 Jun 2025 12:05:56 +0100
Donald Hunter <donald.hunter@gmail.com> wrote:

> Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:
> 
> > As discussed at:
> >    https://lore.kernel.org/all/20250610101331.62ba466f@foz.lan/
> >
> > changeset f061c9f7d058 ("Documentation: Document each netlink family")
> > added logic which generates *.rst files inside $(srctree). This is bad when
> > O=<BUILDDIR> is used.
> >
> > A recent change renamed the yaml files used by Netlink, revealing a bad
> > side effect: as "make cleandocs" doesn't clean the produced files, symbols
> > appear duplicated for people who don't build the kernel from scratch.
> >
> > There are some possible solutions for that. The simplest one, which is what
> > this series addresses, places the build files inside Documentation/output.
> > The changes to do that are simple enough, but they have one drawback,
> > as they require a (simple) template file for every netlink family file from
> > netlink/specs. The template is simple enough:
> >
> >         .. kernel-include:: $BUILDDIR/networking/netlink_spec/<family>.rst  
> 
> I think we could skip describing this since it was an approach that has
> now been dropped.

Ok. Will drop on next versions.

> 
> > Part of the issue is that sphinx-build only produces html files for sources
> > inside the source tree (Documentation/). 
> >
> > To address that, add a yaml parser extension to Sphinx.
> >
> > It should be noticed that this version has one drawback: it increases the
> > documentation build time. I suspect that the culprit is inside Sphinx
> > glob logic and the way it handles exclude_patterns. What happens is that
> > sphinx/project.py uses glob, which, in my own experience, is slow
> > (due to that, I ended up implementing my own glob logic for kernel-doc).
> >
> > On the plus side, the extension is flexible enough to handle other types
> > of yaml files, as the actual yaml conversion logic is outside the extension.  
> 
> I don't think the extension would handle anything other than the Netlink
> yaml specs, and I don't think that should be a goal of this patchset.

Not necessarily. We already have DT yaml files (although there's
a separate process to handle those outside the tree). Nothing prevents
us from ending up with more. See, the way the Sphinx parser works, it covers
all files with a *.yaml extension no matter where they are located within the
tree. We may end up needing to use it for something else as well (*).

(*) at the last Media Summit, we did have some discussions about using
    either yaml or rst for sensor documentation.
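
The kind of multi-domain routing described here — a single registered yaml
parser that dispatches to per-subsystem converters based on the file path —
could be sketched roughly as below. This is only an illustration of the
dispatch idea; the class and converter names are hypothetical, not code from
this series:

```python
# Hypothetical sketch: one yaml entry point, per-subsystem converters.
# Real code would hand the result to docutils; here converters just
# return strings so the routing logic can be shown on its own.

class YamlDispatcher:
    """Route a yaml file to a converter based on its path."""

    def __init__(self):
        self.routes = []  # list of (path fragment, converter callable)

    def register(self, fragment, converter):
        self.routes.append((fragment, converter))

    def convert(self, fname):
        for fragment, converter in self.routes:
            if fragment in fname:
                return converter(fname)
        return None  # yaml files outside known domains are ignored


dispatcher = YamlDispatcher()
dispatcher.register("netlink/specs/", lambda f: f"netlink doc for {f}")
dispatcher.register("media/sensors/", lambda f: f"sensor doc for {f}")
```

With such a layout, adding another yaml-based subsystem would only mean
registering one more converter; the Sphinx-facing parser itself stays generic.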

> > With this version, there's no need to add any template file per netlink/spec
> > file. Yet, Documentation/netlink/specs/index.rst requires updates as
> > spec files are added/renamed/removed. The already-existing script can
> > handle it automatically by running:
> >
> >             tools/net/ynl/pyynl/ynl_gen_rst.py -x  -v -o Documentation/netlink/specs/index.rst  
> 
> I think this can be avoided by using the toctree glob directive in the
> index, like this:
> 
> =============================
> Netlink Family Specifications
> =============================
> 
> .. toctree::
>    :maxdepth: 1
>    :glob:
> 
>    *
> 
> This would let you have a static index file.

I didn't know about that option. If it works with the parser, it sounds good
enough.

> 
> > ---
> >
> > v2:
> > - Use a Sphinx extension to handle netlink files.
> >
> > v1:
> > - Statically add template files to as networking/netlink_spec/<family>.rst
> >
> > Mauro Carvalho Chehab (12):
> >   tools: ynl_gen_rst.py: create a top-level reference
> >   docs: netlink: netlink-raw.rst: use :ref: instead of :doc:  
> 
> I suggest combining the first 2 patches.
> 
> >   docs: netlink: don't ignore generated rst files  
> 
> Maybe leave this patch to the end and change the description to be a
> cleanup of the remants of the old approach.

Ok for me, but I usually prefer keeping one patch per logical change.
In this case, one patch adding support to the tool; the other one
improving the docs to benefit from the new feature.

> Further comments on specific commits
> 
> >   tools: ynl_gen_rst.py: make the index parser more generic
> >   tools: ynl_gen_rst.py: Split library from command line tool
> >   scripts: lib: netlink_yml_parser.py: use classes
> >   tools: ynl_gen_rst.py: do some coding style cleanups
> >   scripts: netlink_yml_parser.py: improve index.rst generation
> >   docs: sphinx: add a parser template for yaml files
> >   docs: sphinx: parser_yaml.py: add Netlink specs parser  
> 
> Please combine these 2 patches. The template patch just introduces noise
> into the series and makes it harder to review.

Ok.

> >   docs: use parser_yaml extension to handle Netlink specs
> >   docs: conf.py: don't handle yaml files outside Netlink specs
> >
> >  .pylintrc                                     |   2 +-
> >  Documentation/Makefile                        |  17 -
> >  Documentation/conf.py                         |  17 +-
> >  Documentation/netlink/specs/index.rst         |  38 ++
> >  Documentation/networking/index.rst            |   2 +-
> >  .../networking/netlink_spec/.gitignore        |   1 -
> >  .../networking/netlink_spec/readme.txt        |   4 -
> >  Documentation/sphinx/parser_yaml.py           |  80 ++++
> >  .../userspace-api/netlink/netlink-raw.rst     |   6 +-
> >  scripts/lib/netlink_yml_parser.py             | 394 ++++++++++++++++++
> >  tools/net/ynl/pyynl/ynl_gen_rst.py            | 378 +----------------
> >  11 files changed, 544 insertions(+), 395 deletions(-)
> >  create mode 100644 Documentation/netlink/specs/index.rst
> >  delete mode 100644 Documentation/networking/netlink_spec/.gitignore
> >  delete mode 100644 Documentation/networking/netlink_spec/readme.txt
> >  create mode 100755 Documentation/sphinx/parser_yaml.py
> >  create mode 100755 scripts/lib/netlink_yml_parser.py  

Thanks,
Mauro

^ permalink raw reply	[flat|nested] 32+ messages in thread

* Re: [PATCH v2 05/12] tools: ynl_gen_rst.py: Split library from command line tool
  2025-06-13 11:13   ` Donald Hunter
@ 2025-06-13 12:17     ` Mauro Carvalho Chehab
  2025-06-14 13:34       ` Donald Hunter
  0 siblings, 1 reply; 32+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-13 12:17 UTC (permalink / raw)
  To: Donald Hunter, Jonathan Corbet
  Cc: Linux Doc Mailing List, Akira Yokosawa, Breno Leitao,
	David S. Miller, Eric Dumazet, Ignacio Encinas Rubio, Jan Stancek,
	Marco Elver, Paolo Abeni, Ruben Wauters, Shuah Khan, joel,
	linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz, stern

On Fri, 13 Jun 2025 12:13:28 +0100
Donald Hunter <donald.hunter@gmail.com> wrote:

> Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:
> 
> > As we'll be using the Netlink specs parser inside a Sphinx
> > extension, split the library part out of the command line tool.
> >
> > No functional changes.
> >
> > Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
> > ---
> >  scripts/lib/netlink_yml_parser.py  | 391 +++++++++++++++++++++++++++++
> >  tools/net/ynl/pyynl/ynl_gen_rst.py | 374 +--------------------------  
> 
> I think the library code should be put in tools/net/ynl/pyynl/lib
> because it is YNL specific code. Maybe call it rst_generator.py

We had a similar discussion before when we switched get_abi and
kernel-doc to Python. At that time, we opted to place all shared
Python libraries under scripts/lib.

From my side, I don't mind having them in a different place,
but I prefer to see all Sphinx extensions getting libraries from
the same base directory.

Jon,

What do you think?

Thanks,
Mauro

^ permalink raw reply	[flat|nested] 32+ messages in thread

* Re: [PATCH v2 09/12] docs: sphinx: add a parser template for yaml files
  2025-06-13 11:29   ` Donald Hunter
@ 2025-06-13 12:26     ` Mauro Carvalho Chehab
  2025-06-13 15:42       ` Mauro Carvalho Chehab
  0 siblings, 1 reply; 32+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-13 12:26 UTC (permalink / raw)
  To: Donald Hunter
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

On Fri, 13 Jun 2025 12:29:34 +0100
Donald Hunter <donald.hunter@gmail.com> wrote:

> Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:
> 
> > Add a simple sphinx.Parser class meant to handle yaml files.
> >
> > For now, it just replaces a yaml file with some simple ReST
> > code. I opted to do it this way, as this patch can be used as a
> > basis for new file format parsers. We may use this as an
> > example to parse other types of files.
> >
> > Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
> > ---
> >  Documentation/sphinx/parser_yaml.py | 63 +++++++++++++++++++++++++++++
> >  1 file changed, 63 insertions(+)
> >  create mode 100755 Documentation/sphinx/parser_yaml.py  
> 
> It's not a generic yaml parser so the file should be
> netlink_doc_generator.py or something.

There's no way I'm aware of to have two yaml parsers. So, assuming that
some other subsystem also ends up using yaml (*), the same parser will need to
handle yaml files from different parts of the tree.

That's why I opted for a generic name here. Besides that, there's
nothing there which is specific to Netlink, as the actual parser is
implemented in a class in a separate file.

(*) as I said, media is considering using yaml for sensors. Nothing has yet
    materialized, as we just had our summit last month. Yet, as DT also
    uses yaml, I wouldn't doubt that other subsystems may end up using it
    as well.

> > diff --git a/Documentation/sphinx/parser_yaml.py b/Documentation/sphinx/parser_yaml.py
> > new file mode 100755
> > index 000000000000..b3cde9cf7aac
> > --- /dev/null
> > +++ b/Documentation/sphinx/parser_yaml.py
> > @@ -0,0 +1,63 @@
> > +"""
> > +Sphinx extension for processing YAML files
> > +"""
> > +
> > +import os
> > +
> > +from docutils.parsers.rst import Parser as RSTParser
> > +from docutils.statemachine import ViewList
> > +
> > +from sphinx.util import logging
> > +from sphinx.parsers import Parser
> > +
> > +from pprint import pformat
> > +
> > +logger = logging.getLogger(__name__)
> > +
> > +class YamlParser(Parser):
> > +    """Custom parser for YAML files."""  
> 
> The class is only intended to be a netlink doc generator so I suggest
> calling it NetlinkDocGenerator

See above.

> > +
> > +    supported = ('yaml', 'yml')  
> 
> I don't think we need to support the .yml extension.

Ok, will drop "yml".

> > +
> > +    # Overrides docutils.parsers.Parser. See sphinx.parsers.RSTParser
> > +    def parse(self, inputstring, document):
> > +        """Parse YAML and generate a document tree."""
> > +
> > +        self.setup_parse(inputstring, document)
> > +
> > +        result = ViewList()
> > +
> > +        try:
> > +            # FIXME: Test logic to generate some ReST content
> > +            basename = os.path.basename(document.current_source)
> > +            title = os.path.splitext(basename)[0].replace('_', ' ').title()
> > +
> > +            msg = f"{title}\n"
> > +            msg += "=" * len(title) + "\n\n"
> > +            msg += "Something\n"
> > +
> > +            # Parse message with RSTParser
> > +            for i, line in enumerate(msg.split('\n')):
> > +                result.append(line, document.current_source, i)
> > +
> > +            rst_parser = RSTParser()
> > +            rst_parser.parse('\n'.join(result), document)
> > +
> > +        except Exception as e:
> > +            document.reporter.error("YAML parsing error: %s" % pformat(e))
> > +
> > +        self.finish_parse()
> > +
> > +def setup(app):
> > +    """Setup function for the Sphinx extension."""
> > +
> > +    # Add YAML parser
> > +    app.add_source_parser(YamlParser)
> > +    app.add_source_suffix('.yaml', 'yaml')
> > +    app.add_source_suffix('.yml', 'yaml')  
> 
> No need to support the .yml extension.

Ok.

> > +
> > +    return {
> > +        'version': '1.0',
> > +        'parallel_read_safe': True,
> > +        'parallel_write_safe': True,
> > +    }  

Thanks,
Mauro

^ permalink raw reply	[flat|nested] 32+ messages in thread

* Re: [PATCH v2 10/12] docs: sphinx: parser_yaml.py: add Netlink specs parser
  2025-06-13 11:45   ` Donald Hunter
@ 2025-06-13 12:27     ` Mauro Carvalho Chehab
  0 siblings, 0 replies; 32+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-13 12:27 UTC (permalink / raw)
  To: Donald Hunter
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

On Fri, 13 Jun 2025 12:45:10 +0100
Donald Hunter <donald.hunter@gmail.com> wrote:

> Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:
> 
> > Place the code at parser_yaml.py to handle Netlink specs. All
> > other yaml files are ignored.
> >
> > Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
> > ---
> >  .pylintrc                           |  2 +-
> >  Documentation/sphinx/parser_yaml.py | 39 +++++++++++++++++++++--------
> >  2 files changed, 29 insertions(+), 12 deletions(-)
> >
> > diff --git a/.pylintrc b/.pylintrc
> > index 30b8ae1659f8..f1d21379254b 100644
> > --- a/.pylintrc
> > +++ b/.pylintrc
> > @@ -1,2 +1,2 @@
> >  [MASTER]
> > -init-hook='import sys; sys.path += ["scripts/lib/kdoc", "scripts/lib/abi"]'
> > +init-hook='import sys; sys.path += ["scripts/lib", "scripts/lib/kdoc", "scripts/lib/abi"]'
> > diff --git a/Documentation/sphinx/parser_yaml.py b/Documentation/sphinx/parser_yaml.py
> > index b3cde9cf7aac..eb32e3249274 100755
> > --- a/Documentation/sphinx/parser_yaml.py
> > +++ b/Documentation/sphinx/parser_yaml.py
> > @@ -3,6 +3,10 @@ Sphinx extension for processing YAML files
> >  """
> >  
> >  import os
> > +import re
> > +import sys
> > +
> > +from pprint import pformat
> >  
> >  from docutils.parsers.rst import Parser as RSTParser
> >  from docutils.statemachine import ViewList
> > @@ -10,7 +14,10 @@ from docutils.statemachine import ViewList
> >  from sphinx.util import logging
> >  from sphinx.parsers import Parser
> >  
> > -from pprint import pformat
> > +srctree = os.path.abspath(os.environ["srctree"])
> > +sys.path.insert(0, os.path.join(srctree, "scripts/lib"))
> > +
> > +from netlink_yml_parser import NetlinkYamlParser      # pylint: disable=C0413
> >  
> >  logger = logging.getLogger(__name__)
> >  
> > @@ -19,8 +26,9 @@ class YamlParser(Parser):
> >  
> >      supported = ('yaml', 'yml')
> >  
> > -    # Overrides docutils.parsers.Parser. See sphinx.parsers.RSTParser
> > -    def parse(self, inputstring, document):
> > +    netlink_parser = NetlinkYamlParser()
> > +
> > +    def do_parse(self, inputstring, document, msg):
> >          """Parse YAML and generate a document tree."""
> >  
> >          self.setup_parse(inputstring, document)
> > @@ -28,14 +36,6 @@ class YamlParser(Parser):
> >          result = ViewList()
> >  
> >          try:
> > -            # FIXME: Test logic to generate some ReST content
> > -            basename = os.path.basename(document.current_source)
> > -            title = os.path.splitext(basename)[0].replace('_', ' ').title()
> > -
> > -            msg = f"{title}\n"
> > -            msg += "=" * len(title) + "\n\n"
> > -            msg += "Something\n"
> > -
> >              # Parse message with RSTParser
> >              for i, line in enumerate(msg.split('\n')):
> >                  result.append(line, document.current_source, i)
> > @@ -48,6 +48,23 @@ class YamlParser(Parser):
> >  
> >          self.finish_parse()
> >  
> > +    # Overrides docutils.parsers.Parser. See sphinx.parsers.RSTParser
> > +    def parse(self, inputstring, document):
> > +        """Check if a YAML is meant to be parsed."""
> > +
> > +        fname = document.current_source
> > +
> > +        # Handle netlink yaml specs
> > +        if re.search("/netlink/specs/", fname):  
> 
> The re.search is overkill since it is not a regexp. You can instead say:
> 
>     if '/netlink/specs/' in fname:

OK.

> > +            if fname.endswith("index.yaml"):
> > +                msg = self.netlink_parser.generate_main_index_rst(fname, None)
> > +            else:  
> 
> I'm guessing we can drop these lines if the static index.rst approach works.

Agreed. Will test and drop it if it works.

> 
> > +                msg = self.netlink_parser.parse_yaml_file(fname)
> > +
> > +            self.do_parse(inputstring, document, msg)
> > +
> > +        # All other yaml files are ignored
> > +
> >  def setup(app):
> >      """Setup function for the Sphinx extension."""  

Thanks,
Mauro

^ permalink raw reply	[flat|nested] 32+ messages in thread

* Re: [PATCH v2 11/12] docs: use parser_yaml extension to handle Netlink specs
  2025-06-13 11:50   ` Donald Hunter
@ 2025-06-13 12:29     ` Mauro Carvalho Chehab
  0 siblings, 0 replies; 32+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-13 12:29 UTC (permalink / raw)
  To: Donald Hunter
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

On Fri, 13 Jun 2025 12:50:48 +0100
Donald Hunter <donald.hunter@gmail.com> wrote:

> Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:
> 
> > Instead of manually calling ynl_gen_rst.py, use a Sphinx extension.
> > This way, no .rst files would be written to the Kernel source
> > directories.
> >
> > Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
> > ---
> >  Documentation/Makefile                        | 17 ---------
> >  Documentation/conf.py                         | 11 +++---
> >  Documentation/netlink/specs/index.rst         | 38 +++++++++++++++++++
> >  Documentation/networking/index.rst            |  2 +-
> >  .../networking/netlink_spec/readme.txt        |  4 --
> >  Documentation/sphinx/parser_yaml.py           |  2 +-
> >  6 files changed, 46 insertions(+), 28 deletions(-)
> >  create mode 100644 Documentation/netlink/specs/index.rst
> >  delete mode 100644 Documentation/networking/netlink_spec/readme.txt
> >
> > diff --git a/Documentation/Makefile b/Documentation/Makefile
> > index d30d66ddf1ad..9185680b1e86 100644
> > --- a/Documentation/Makefile
> > +++ b/Documentation/Makefile
> > @@ -102,22 +102,6 @@ quiet_cmd_sphinx = SPHINX  $@ --> file://$(abspath $(BUILDDIR)/$3/$4)
> >  		cp $(if $(patsubst /%,,$(DOCS_CSS)),$(abspath $(srctree)/$(DOCS_CSS)),$(DOCS_CSS)) $(BUILDDIR)/$3/_static/; \
> >  	fi
> >  
> > -YNL_INDEX:=$(srctree)/Documentation/networking/netlink_spec/index.rst
> > -YNL_RST_DIR:=$(srctree)/Documentation/networking/netlink_spec
> > -YNL_YAML_DIR:=$(srctree)/Documentation/netlink/specs
> > -YNL_TOOL:=$(srctree)/tools/net/ynl/pyynl/ynl_gen_rst.py
> > -
> > -YNL_RST_FILES_TMP := $(patsubst %.yaml,%.rst,$(wildcard $(YNL_YAML_DIR)/*.yaml))
> > -YNL_RST_FILES := $(patsubst $(YNL_YAML_DIR)%,$(YNL_RST_DIR)%, $(YNL_RST_FILES_TMP))
> > -
> > -$(YNL_INDEX): $(YNL_RST_FILES)
> > -	$(Q)$(YNL_TOOL) -o $@ -x
> > -
> > -$(YNL_RST_DIR)/%.rst: $(YNL_YAML_DIR)/%.yaml $(YNL_TOOL)
> > -	$(Q)$(YNL_TOOL) -i $< -o $@
> > -
> > -htmldocs texinfodocs latexdocs epubdocs xmldocs: $(YNL_INDEX)
> > -
> >  htmldocs:
> >  	@$(srctree)/scripts/sphinx-pre-install --version-check
> >  	@+$(foreach var,$(SPHINXDIRS),$(call loop_cmd,sphinx,html,$(var),,$(var)))
> > @@ -184,7 +168,6 @@ refcheckdocs:
> >  	$(Q)cd $(srctree);scripts/documentation-file-ref-check
> >  
> >  cleandocs:
> > -	$(Q)rm -f $(YNL_INDEX) $(YNL_RST_FILES)
> >  	$(Q)rm -rf $(BUILDDIR)
> >  	$(Q)$(MAKE) BUILDDIR=$(abspath $(BUILDDIR)) $(build)=Documentation/userspace-api/media clean
> >  
> > diff --git a/Documentation/conf.py b/Documentation/conf.py
> > index 12de52a2b17e..add6ce78dd80 100644
> > --- a/Documentation/conf.py
> > +++ b/Documentation/conf.py
> > @@ -45,7 +45,7 @@ needs_sphinx = '3.4.3'
> >  extensions = ['kerneldoc', 'rstFlatTable', 'kernel_include',
> >                'kfigure', 'sphinx.ext.ifconfig', 'automarkup',
> >                'maintainers_include', 'sphinx.ext.autosectionlabel',
> > -              'kernel_abi', 'kernel_feat', 'translations']
> > +              'kernel_abi', 'kernel_feat', 'translations', 'parser_yaml']
> >  
> >  # Since Sphinx version 3, the C function parser is more pedantic with regards
> >  # to type checking. Due to that, having macros at c:function cause problems.
> > @@ -143,10 +143,11 @@ else:
> >  # Add any paths that contain templates here, relative to this directory.
> >  templates_path = ['sphinx/templates']
> >  
> > -# The suffix(es) of source filenames.
> > -# You can specify multiple suffix as a list of string:
> > -# source_suffix = ['.rst', '.md']
> > -source_suffix = '.rst'
> > +# The suffixes of source filenames that will be automatically parsed
> > +source_suffix = {
> > +        '.rst': 'restructuredtext',
> > +        '.yaml': 'yaml',  
> 
> The handler name should probably be netlink_yaml 

See my comments on earlier patches.

> 
> > +}
> >  
> >  # The encoding of source files.
> >  #source_encoding = 'utf-8-sig'
> > diff --git a/Documentation/netlink/specs/index.rst b/Documentation/netlink/specs/index.rst
> > new file mode 100644
> > index 000000000000..ca0bf816dc3f
> > --- /dev/null
> > +++ b/Documentation/netlink/specs/index.rst
> > @@ -0,0 +1,38 @@
> > +.. SPDX-License-Identifier: GPL-2.0
> > +.. NOTE: This document was auto-generated.
> > +
> > +.. _specs:
> > +
> > +=============================
> > +Netlink Family Specifications
> > +=============================
> > +
> > +.. toctree::
> > +   :maxdepth: 1
> > +
> > +   conntrack
> > +   devlink
> > +   dpll
> > +   ethtool
> > +   fou
> > +   handshake
> > +   lockd
> > +   mptcp_pm
> > +   net_shaper
> > +   netdev
> > +   nfsd
> > +   nftables
> > +   nl80211
> > +   nlctrl
> > +   ovpn
> > +   ovs_datapath
> > +   ovs_flow
> > +   ovs_vport
> > +   rt-addr
> > +   rt-link
> > +   rt-neigh
> > +   rt-route
> > +   rt-rule
> > +   tc
> > +   tcp_metrics
> > +   team
> > diff --git a/Documentation/networking/index.rst b/Documentation/networking/index.rst
> > index ac90b82f3ce9..b7a4969e9bc9 100644
> > --- a/Documentation/networking/index.rst
> > +++ b/Documentation/networking/index.rst
> > @@ -57,7 +57,7 @@ Contents:
> >     filter
> >     generic-hdlc
> >     generic_netlink
> > -   netlink_spec/index
> > +   ../netlink/specs/index
> >     gen_stats
> >     gtp
> >     ila
> > diff --git a/Documentation/networking/netlink_spec/readme.txt b/Documentation/networking/netlink_spec/readme.txt
> > deleted file mode 100644
> > index 030b44aca4e6..000000000000
> > --- a/Documentation/networking/netlink_spec/readme.txt
> > +++ /dev/null
> > @@ -1,4 +0,0 @@
> > -SPDX-License-Identifier: GPL-2.0
> > -
> > -This file is populated during the build of the documentation (htmldocs) by the
> > -tools/net/ynl/pyynl/ynl_gen_rst.py script.
> > diff --git a/Documentation/sphinx/parser_yaml.py b/Documentation/sphinx/parser_yaml.py
> > index eb32e3249274..cdcafe5b3937 100755
> > --- a/Documentation/sphinx/parser_yaml.py
> > +++ b/Documentation/sphinx/parser_yaml.py
> > @@ -55,7 +55,7 @@ class YamlParser(Parser):
> >          fname = document.current_source
> >  
> >          # Handle netlink yaml specs
> > -        if re.search("/netlink/specs/", fname):
> > +        if re.search("netlink/specs/", fname):  
> 
> Please combine this change into the earlier patch so that the series
> doesn't have unnecessary changes.

OK.

> 
> >              if fname.endswith("index.yaml"):
> >                  msg = self.netlink_parser.generate_main_index_rst(fname, None)
> >              else:  

Thanks,
Mauro

^ permalink raw reply	[flat|nested] 32+ messages in thread

* Re: [PATCH v2 12/12] docs: conf.py: don't handle yaml files outside Netlink specs
  2025-06-13 11:52   ` Donald Hunter
@ 2025-06-13 12:30     ` Mauro Carvalho Chehab
  0 siblings, 0 replies; 32+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-13 12:30 UTC (permalink / raw)
  To: Donald Hunter
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

On Fri, 13 Jun 2025 12:52:35 +0100
Donald Hunter <donald.hunter@gmail.com> wrote:

> Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:
> 
> > The parser_yaml extension already has logic to avoid
> > handling all yaml documents. However, if we don't also exclude
> > the patterns in conf.py, the build time increases a lot,
> > and warnings like these are generated:
> >
> >     Documentation/netlink/genetlink.yaml: WARNING: document isn't included in any toctree
> >     Documentation/netlink/genetlink-c.yaml: WARNING: document isn't included in any toctree
> >     Documentation/netlink/genetlink-legacy.yaml: WARNING: document isn't included in any toctree
> >     Documentation/netlink/index.rst: WARNING: document isn't included in any toctree
> >     Documentation/netlink/netlink-raw.yaml: WARNING: document isn't included in any toctree
> >
> > Add some exclusion rules to prevent that.
> >
> > Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
> > ---
> >  Documentation/conf.py | 6 +++++-
> >  1 file changed, 5 insertions(+), 1 deletion(-)
> >
> > diff --git a/Documentation/conf.py b/Documentation/conf.py
> > index add6ce78dd80..b8668bcaf090 100644
> > --- a/Documentation/conf.py
> > +++ b/Documentation/conf.py
> > @@ -222,7 +222,11 @@ language = 'en'
> >  
> >  # List of patterns, relative to source directory, that match files and
> >  # directories to ignore when looking for source files.
> > -exclude_patterns = ['output']
> > +exclude_patterns = [
> > +	'output',
> > +	'devicetree/bindings/**.yaml',
> > +	'netlink/*.yaml',
> > +]  
> 
> Please merge this with the earlier patch that changes these lines so
> that the series doesn't contain unnecessary intermediate steps.

OK.

> 
> >  # The reST default role (used for this markup: `text`) to use for all
> >  # documents.  



Thanks,
Mauro

^ permalink raw reply	[flat|nested] 32+ messages in thread

* Re: [PATCH v2 06/12] scripts: lib: netlink_yml_parser.py: use classes
  2025-06-13 11:20   ` Donald Hunter
@ 2025-06-13 12:40     ` Mauro Carvalho Chehab
  2025-06-13 12:53       ` Donald Hunter
  0 siblings, 1 reply; 32+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-13 12:40 UTC (permalink / raw)
  To: Donald Hunter
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Em Fri, 13 Jun 2025 12:20:33 +0100
Donald Hunter <donald.hunter@gmail.com> escreveu:

> Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:
> 
> > As we'll be importing the netlink parser into a Sphinx extension,
> > move all functions and global variables into two classes:
> >
> > - RstFormatters, containing the ReST formatter logic, which is
> >   YAML-independent;
> > - NetlinkYamlParser: contains the actual parser classes. That's
> >   the only class that needs to be imported by the script or by
> >   a Sphinx extension.
> 
> I suggest a third class for the doc generator that is separate from the
> yaml parsing.

Do you mean moving those two (or three? [*]) methods to a new class?

    def parse_yaml(self, obj: Dict[str, Any]) -> str:
    def parse_yaml_file(self, filename: str) -> str:
    def generate_main_index_rst(self, output: str, index_dir: str) -> None:

Also, how should I name it to avoid confusion with NetlinkYamlParser? 
Maybe YnlParser?

[*] generate_main_index_rst is probably deprecated. Eventually
    we may drop it or keep it just in the command line script.

> The yaml parsing should really be refactored to reuse
> tools/net/ynl/pyynl/lib/nlspec.py at some point.

Makes sense, but such change is out of the scope of this series.

> > With that, we won't pollute Sphinx namespace, avoiding any
> > potential clashes.
> >
> > Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>  

Thanks,
Mauro


* Re: [PATCH v2 06/12] scripts: lib: netlink_yml_parser.py: use classes
  2025-06-13 12:40     ` Mauro Carvalho Chehab
@ 2025-06-13 12:53       ` Donald Hunter
  0 siblings, 0 replies; 32+ messages in thread
From: Donald Hunter @ 2025-06-13 12:53 UTC (permalink / raw)
  To: Mauro Carvalho Chehab
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

On Fri, 13 Jun 2025 at 13:40, Mauro Carvalho Chehab
<mchehab+huawei@kernel.org> wrote:
>
> Em Fri, 13 Jun 2025 12:20:33 +0100
> Donald Hunter <donald.hunter@gmail.com> escreveu:
>
> > Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:
> >
> > > As we'll be importing the netlink parser into a Sphinx extension,
> > > move all functions and global variables into two classes:
> > >
> > > - RstFormatters, containing the ReST formatter logic, which is
> > >   YAML-independent;
> > > - NetlinkYamlParser: contains the actual parser classes. That's
> > >   the only class that needs to be imported by the script or by
> > >   a Sphinx extension.
> >
> > I suggest a third class for the doc generator that is separate from the
> > yaml parsing.
>
> Do you mean moving those two (or three? [*]) methods to a new class?
>
>     def parse_yaml(self, obj: Dict[str, Any]) -> str:
>     def parse_yaml_file(self, filename: str) -> str:
>     def generate_main_index_rst(self, output: str, index_dir: str) -> None:
>
> Also, how should I name it to avoid confusion with NetlinkYamlParser?
> Maybe YnlParser?

On second thoughts, I see that the rst generation is actually spread
through all the parse_* methods so they are all related to doc generation.

I suggest putting all the parse_* methods into a class called
YnlDocGenerator, so just the 2 classes.

And I'm hoping that generate_main_index_rst can be removed.
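The two-class split discussed above can be sketched roughly as follows. The class names come from the thread itself; the method bodies are illustrative placeholders, not the actual kernel code:

```python
# Rough sketch of the layout discussed in this thread: names come
# from the discussion; method bodies are placeholders.

from typing import Any, Dict


class RstFormatters:
    """YAML-independent ReST formatting helpers."""

    @staticmethod
    def fmt_title(title: str, underline: str = "=") -> str:
        # A ReST title is the text followed by a punctuation line at
        # least as long as the title itself.
        return f"{title}\n{underline * len(title)}\n"


class YnlDocGenerator:
    """Walks a parsed netlink YAML spec and emits ReST."""

    fmt = RstFormatters()

    def parse_yaml(self, obj: Dict[str, Any]) -> str:
        # Placeholder: the real method renders the whole spec,
        # with rst generation spread across all parse_* methods.
        return self.fmt.fmt_title(obj.get("name", "unknown"))


gen = YnlDocGenerator()
print(gen.parse_yaml({"name": "ethtool"}))
```

The point of the split is that only YnlDocGenerator needs importing by the command-line script or by a Sphinx extension, while the ReST helpers stay reusable and YAML-agnostic.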

> [*] generate_main_index_rst is probably deprecated. Eventually
>     we may drop it or keep it just in the command line script.
>
> > The yaml parsing should really be refactored to reuse
> > tools/net/ynl/pyynl/lib/nlspec.py at some point.
>
> Makes sense, but such change is out of the scope of this series.

Agreed

Thanks,
Donald.


* Re: [PATCH v2 09/12] docs: sphinx: add a parser template for yaml files
  2025-06-13 12:26     ` Mauro Carvalho Chehab
@ 2025-06-13 15:42       ` Mauro Carvalho Chehab
  0 siblings, 0 replies; 32+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-13 15:42 UTC (permalink / raw)
  To: Donald Hunter
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	Breno Leitao, David S. Miller, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Em Fri, 13 Jun 2025 14:26:44 +0200
Mauro Carvalho Chehab <mchehab+huawei@kernel.org> escreveu:

> > > +
> > > +    supported = ('yaml', 'yml')    
> > 
> > I don't think we need to support the .yml extension.  
> 
> Ok, will drop "yml".

"supported" is not just a list of extensions. It is a list of aliases for the
supported standard (*), like:

    supported = ('rst', 'restructuredtext', 'rest', 'restx', 'rtxt', 'rstx')
    """Aliases this parser supports."""

    (*) see: https://www.sphinx-doc.org/en/master/_modules/docutils/parsers/rst.html

Anyway, I tried with:

	supported = ('yaml')    

but it crashed with:

	sphinx.errors.SphinxError: Source parser for yaml not registered

In my tests, if "supported" has just one element, it crashes.

On this specific case, this, for instance, works:

    supported = ('yaml', 'foobar')

Anyway, it's not worth spending too much time on it, as it could be
a bug or a feature in either docutils or Sphinx. As we want it to
work with existing versions, I'll keep it as:

	supported = ('yaml', 'yml')    

at the next version.
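For what it's worth, the one-element crash is most likely plain Python behaviour rather than a docutils or Sphinx quirk: `('yaml')` is just a parenthesized string, not a tuple, and code that iterates over `supported` then sees single characters, so the `yaml` alias itself is never registered. A trailing comma makes a real one-element tuple (a hypothetical illustration, not part of the patch):

```python
# ('yaml') is a parenthesized string; ('yaml',) is a one-element tuple.
not_a_tuple = ('yaml')
one_tuple = ('yaml',)

print(type(not_a_tuple).__name__)   # str
print(type(one_tuple).__name__)     # tuple

# A registry that iterates over `supported` sees single characters
# for the string, so the 'yaml' alias is never registered:
print(list(not_a_tuple))            # ['y', 'a', 'm', 'l']
print(list(one_tuple))              # ['yaml']
```

If that is indeed the cause, `supported = ('yaml',)` would also work; keeping the `('yaml', 'yml')` pair as planned sidesteps the question entirely.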

> > > +def setup(app):
> > > +    """Setup function for the Sphinx extension."""
> > > +
> > > +    # Add YAML parser
> > > +    app.add_source_parser(YamlParser)
> > > +    app.add_source_suffix('.yaml', 'yaml')
> > > +    app.add_source_suffix('.yml', 'yaml')    
> > 
> > No need to support the .yml extension.  

Dropping .yml works here, so I'll keep just:

    # Add YAML parser
    app.add_source_parser(YamlParser)
    app.add_source_suffix('.yaml', 'yaml')

Regards,
Mauro



* Re: [PATCH v2 00/12] Don't generate netlink .rst files inside $(srctree)
  2025-06-13 12:13   ` Mauro Carvalho Chehab
@ 2025-06-14 13:29     ` Donald Hunter
  0 siblings, 0 replies; 32+ messages in thread
From: Donald Hunter @ 2025-06-14 13:29 UTC (permalink / raw)
  To: Mauro Carvalho Chehab
  Cc: Linux Doc Mailing List, Jonathan Corbet, linux-kernel,
	Akira Yokosawa, David S. Miller, Ignacio Encinas Rubio,
	Marco Elver, Shuah Khan, Eric Dumazet, Jan Stancek, Paolo Abeni,
	Ruben Wauters, joel, linux-kernel-mentees, lkmm, netdev, peterz,
	stern, Breno Leitao

On Fri, 13 Jun 2025 at 13:14, Mauro Carvalho Chehab
<mchehab+huawei@kernel.org> wrote:
> >
> > >   docs: netlink: don't ignore generated rst files
> >
> > Maybe leave this patch to the end and change the description to be a
> > cleanup of the remnants of the old approach.
>
> Ok for me, but I usually prefer keeping one patch per logical change.
> In this case, one patch adds support to the tool; the other one
> improves the docs to benefit from the new feature.

My point is that "don't ignore generated rst files" is not a step on
the way from the old method towards the new method. The new method
does not involve generating any .rst files so the patch is something
more like "remove obsolete .gitignore from unused directory".


* Re: [PATCH v2 05/12] tools: ynl_gen_rst.py: Split library from command line tool
  2025-06-13 12:17     ` Mauro Carvalho Chehab
@ 2025-06-14 13:34       ` Donald Hunter
  2025-06-14 15:01         ` Mauro Carvalho Chehab
  0 siblings, 1 reply; 32+ messages in thread
From: Donald Hunter @ 2025-06-14 13:34 UTC (permalink / raw)
  To: Mauro Carvalho Chehab
  Cc: Jonathan Corbet, Linux Doc Mailing List, Akira Yokosawa,
	Breno Leitao, David S. Miller, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

On Fri, 13 Jun 2025 at 13:18, Mauro Carvalho Chehab
<mchehab+huawei@kernel.org> wrote:
>
> Em Fri, 13 Jun 2025 12:13:28 +0100
> Donald Hunter <donald.hunter@gmail.com> escreveu:
>
> > Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:
> >
> > > As we'll be using the Netlink specs parser inside a Sphinx
> > > extension, move the library part from the command line parser.
> > >
> > > No functional changes.
> > >
> > > Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
> > > ---
> > >  scripts/lib/netlink_yml_parser.py  | 391 +++++++++++++++++++++++++++++
> > >  tools/net/ynl/pyynl/ynl_gen_rst.py | 374 +--------------------------
> >
> > I think the library code should be put in tools/net/ynl/pyynl/lib
> > because it is YNL specific code. Maybe call it rst_generator.py
>
> We had a similar discussion before when we switched get_abi and
> kernel-doc to Python. At that time, we opted to place all shared
> Python libraries under scripts/lib.
>
> From my side, I don't mind having them on a different place,
> but I prefer to see all Sphinx extensions getting libraries from
> the same base directory.

It's YNL specific code and I want to refactor it to make use of
tools/net/ynl/pyynl/lib/nlspec.py so it definitely belongs in
tools/net/ynl/pyynl/lib.

> Jon,
>
> What do you think?
>
> Thanks,
> Mauro


* Re: [PATCH v2 05/12] tools: ynl_gen_rst.py: Split library from command line tool
  2025-06-14 13:34       ` Donald Hunter
@ 2025-06-14 15:01         ` Mauro Carvalho Chehab
  0 siblings, 0 replies; 32+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-14 15:01 UTC (permalink / raw)
  To: Donald Hunter
  Cc: Jonathan Corbet, Linux Doc Mailing List, Akira Yokosawa,
	Breno Leitao, David S. Miller, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Em Sat, 14 Jun 2025 14:34:01 +0100
Donald Hunter <donald.hunter@gmail.com> escreveu:

> On Fri, 13 Jun 2025 at 13:18, Mauro Carvalho Chehab
> <mchehab+huawei@kernel.org> wrote:
> >
> > Em Fri, 13 Jun 2025 12:13:28 +0100
> > Donald Hunter <donald.hunter@gmail.com> escreveu:
> >  
> > > Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:
> > >  
> > > > As we'll be using the Netlink specs parser inside a Sphinx
> > > > extension, move the library part from the command line parser.
> > > >
> > > > No functional changes.
> > > >
> > > > Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
> > > > ---
> > > >  scripts/lib/netlink_yml_parser.py  | 391 +++++++++++++++++++++++++++++
> > > >  tools/net/ynl/pyynl/ynl_gen_rst.py | 374 +--------------------------  
> > >
> > > I think the library code should be put in tools/net/ynl/pyynl/lib
> > > because it is YNL specific code. Maybe call it rst_generator.py  
> >
> > We had a similar discussion before when we switched get_abi and
> > kernel-doc to Python. At that time, we opted to place all shared
> > Python libraries under scripts/lib.
> >
> > From my side, I don't mind having them on a different place,
> > but I prefer to see all Sphinx extensions getting libraries from
> > the same base directory.  
> 
> It's YNL specific code and I want to refactor it to make use of
> tools/net/ynl/pyynl/lib/nlspec.py so it definitely belongs in
> tools/net/ynl/pyynl/lib.

To avoid duplicating comments, let's discuss this at patch 12/14's
thread.

Thanks,
Mauro


end of thread, other threads:[~2025-06-14 15:01 UTC | newest]

Thread overview: 32+ messages (download: mbox.gz / follow: Atom feed)
-- links below jump to the message on this page --
2025-06-12 10:31 [PATCH v2 00/12] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
2025-06-12 10:31 ` [PATCH v2 01/12] tools: ynl_gen_rst.py: create a top-level reference Mauro Carvalho Chehab
2025-06-12 10:31 ` [PATCH v2 02/12] docs: netlink: netlink-raw.rst: use :ref: instead of :doc: Mauro Carvalho Chehab
2025-06-12 10:31 ` [PATCH v2 03/12] docs: netlink: don't ignore generated rst files Mauro Carvalho Chehab
2025-06-12 10:31 ` [PATCH v2 04/12] tools: ynl_gen_rst.py: make the index parser more generic Mauro Carvalho Chehab
2025-06-12 10:31 ` [PATCH v2 05/12] tools: ynl_gen_rst.py: Split library from command line tool Mauro Carvalho Chehab
2025-06-13 11:13   ` Donald Hunter
2025-06-13 12:17     ` Mauro Carvalho Chehab
2025-06-14 13:34       ` Donald Hunter
2025-06-14 15:01         ` Mauro Carvalho Chehab
2025-06-12 10:31 ` [PATCH v2 06/12] scripts: lib: netlink_yml_parser.py: use classes Mauro Carvalho Chehab
2025-06-13 11:20   ` Donald Hunter
2025-06-13 12:40     ` Mauro Carvalho Chehab
2025-06-13 12:53       ` Donald Hunter
2025-06-12 10:31 ` [PATCH v2 07/12] tools: ynl_gen_rst.py: do some coding style cleanups Mauro Carvalho Chehab
2025-06-12 10:32 ` [PATCH v2 08/12] scripts: netlink_yml_parser.py: improve index.rst generation Mauro Carvalho Chehab
2025-06-12 10:32 ` [PATCH v2 09/12] docs: sphinx: add a parser template for yaml files Mauro Carvalho Chehab
2025-06-13 11:29   ` Donald Hunter
2025-06-13 12:26     ` Mauro Carvalho Chehab
2025-06-13 15:42       ` Mauro Carvalho Chehab
2025-06-12 10:32 ` [PATCH v2 10/12] docs: sphinx: parser_yaml.py: add Netlink specs parser Mauro Carvalho Chehab
2025-06-13 11:45   ` Donald Hunter
2025-06-13 12:27     ` Mauro Carvalho Chehab
2025-06-12 10:32 ` [PATCH v2 11/12] docs: use parser_yaml extension to handle Netlink specs Mauro Carvalho Chehab
2025-06-13 11:50   ` Donald Hunter
2025-06-13 12:29     ` Mauro Carvalho Chehab
2025-06-12 10:32 ` [PATCH v2 12/12] docs: conf.py: don't handle yaml files outside " Mauro Carvalho Chehab
2025-06-13 11:52   ` Donald Hunter
2025-06-13 12:30     ` Mauro Carvalho Chehab
2025-06-13 11:05 ` [PATCH v2 00/12] Don't generate netlink .rst files inside $(srctree) Donald Hunter
2025-06-13 12:13   ` Mauro Carvalho Chehab
2025-06-14 13:29     ` Donald Hunter

This is a public inbox, see mirroring instructions
for how to clone and mirror all data and code used for this inbox;
as well as URLs for NNTP newsgroup(s).