linux-kernel.vger.kernel.org archive mirror
* [PATCH v6 00/15] Don't generate netlink .rst files inside $(srctree)
@ 2025-06-18 11:46 Mauro Carvalho Chehab
  2025-06-18 11:46 ` [PATCH v6 01/15] docs: conf.py: properly handle include and exclude patterns Mauro Carvalho Chehab
                   ` (15 more replies)
  0 siblings, 16 replies; 22+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-18 11:46 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Mauro Carvalho Chehab, linux-kernel,
	Akira Yokosawa, David S. Miller, Ignacio Encinas Rubio,
	Marco Elver, Shuah Khan, Donald Hunter, Eric Dumazet, Jan Stancek,
	Paolo Abeni, Ruben Wauters, joel, linux-kernel-mentees, lkmm,
	netdev, peterz, stern, Breno Leitao, Jakub Kicinski, Randy Dunlap,
	Simon Horman

As discussed at:
   https://lore.kernel.org/all/20250610101331.62ba466f@foz.lan/

changeset f061c9f7d058 ("Documentation: Document each netlink family")
added logic that generates *.rst files inside $(srctree). This is bad
when O=<BUILDDIR> is used.

A recent change renamed the yaml files used by Netlink, revealing a bad
side effect: as "make cleandocs" doesn't clean the produced files, symbols
appear duplicated for people who don't build the kernel from scratch.

This series adds a yaml parser extension and uses an index file whose toctree
globs for "*". We opted to write the extension in a way that no actual yaml
conversion code is inside it. This makes it flexible enough to handle other
types of yaml files in the future. The actual yaml conversion logic was placed
in netlink_yml_parser.py.
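
A minimal sketch of what that split looks like from the Sphinx side is below
(class names, the import path and the hookup are illustrative assumptions; the
real extension is the one added later in this series as
Documentation/sphinx/parser_yaml.py):

    # Hedged sketch only: the parser delegates all yaml -> RST conversion
    # to the YNL library and then feeds plain RST to the stock RST parser.
    from sphinx.parsers import RSTParser

    from doc_generator import YnlDocGenerator   # illustrative import path

    class YamlParser(RSTParser):
        supported = ('yaml',)

        def parse(self, inputstring, document):
            # No conversion logic here: ask the library for RST text...
            rst = YnlDocGenerator().parse_yaml_file(document.current_source)
            # ...and let the regular reStructuredText machinery do the rest.
            super().parse(rst, document)

    def setup(app):
        app.add_source_suffix('.yaml', 'yaml')
        app.add_source_parser(YamlParser)
        return {'parallel_read_safe': True}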

As requested by the YNL maintainers, this version has netlink_yml_parser.py
inside the tools/net/ynl/pyynl/ directory. I don't like mixing libraries with
binaries, nor having Python libraries spread all over the kernel. IMO, the
best approach is to put them all in a common place (scripts/lib, python/lib,
lib/python, ...), but, as this can be solved later, let's keep it this way
for now.

---

v6:
- YNL doc parser is now at tools/net/ynl/pyynl/lib/doc_generator.py;
- two patches got merged;
- added instructions to test docs with Sphinx 3.4.3 (the minimum supported
  version);
- minor fixes.

v5:
- some patch reorg;
- netlink_yml_parser.py is now together with ynl tools;
- minor fixes.

v4:
- Renamed the YNL parser class;
- some minor patch cleanups and merges;
- added an extra patch to fix the include_patterns/exclude_patterns logic when
   SPHINXDIRS is used.

v3:
- Two series were merged together:
  - https://lore.kernel.org/linux-doc/cover.1749723671.git.mchehab+huawei@kernel.org/T/#t
  - https://lore.kernel.org/linux-doc/cover.1749735022.git.mchehab+huawei@kernel.org

- Added an extra patch to update MAINTAINERS to point to the YNL library
- Added a (somewhat unrelated) patch that removes the warnings check when
  running "make cleandocs".



Mauro Carvalho Chehab (15):
  docs: conf.py: properly handle include and exclude patterns
  docs: Makefile: disable check rules on make cleandocs
  docs: netlink: netlink-raw.rst: use :ref: instead of :doc:
  tools: ynl_gen_rst.py: Split library from command line tool
  docs: netlink: index.rst: add a netlink index file
  tools: ynl_gen_rst.py: cleanup coding style
  docs: sphinx: add a parser for yaml files for Netlink specs
  docs: use parser_yaml extension to handle Netlink specs
  docs: uapi: netlink: update netlink specs link
  tools: ynl_gen_rst.py: drop support for generating index files
  docs: netlink: remove obsolete .gitignore from unused directory
  MAINTAINERS: add netlink_yml_parser.py to linux-doc
  tools: netlink_yml_parser.py: add line numbers to parsed data
  docs: parser_yaml.py: add support for line numbers from the parser
  docs: sphinx: add a file with the requirements for lowest version

 Documentation/Makefile                        |  19 +-
 Documentation/conf.py                         |  87 +++-
 Documentation/doc-guide/sphinx.rst            |  15 +
 Documentation/netlink/specs/index.rst         |  13 +
 Documentation/networking/index.rst            |   2 +-
 .../networking/netlink_spec/.gitignore        |   1 -
 .../networking/netlink_spec/readme.txt        |   4 -
 Documentation/sphinx/min_requirements.txt     |   8 +
 Documentation/sphinx/parser_yaml.py           |  84 ++++
 Documentation/userspace-api/netlink/index.rst |   2 +-
 .../userspace-api/netlink/netlink-raw.rst     |   6 +-
 Documentation/userspace-api/netlink/specs.rst |   2 +-
 MAINTAINERS                                   |   1 +
 tools/net/ynl/pyynl/lib/__init__.py           |   2 +
 tools/net/ynl/pyynl/lib/doc_generator.py      | 398 ++++++++++++++++++
 tools/net/ynl/pyynl/ynl_gen_rst.py            | 384 +----------------
 16 files changed, 614 insertions(+), 414 deletions(-)
 create mode 100644 Documentation/netlink/specs/index.rst
 delete mode 100644 Documentation/networking/netlink_spec/.gitignore
 delete mode 100644 Documentation/networking/netlink_spec/readme.txt
 create mode 100644 Documentation/sphinx/min_requirements.txt
 create mode 100755 Documentation/sphinx/parser_yaml.py
 create mode 100644 tools/net/ynl/pyynl/lib/doc_generator.py

-- 
2.49.0



^ permalink raw reply	[flat|nested] 22+ messages in thread

* [PATCH v6 01/15] docs: conf.py: properly handle include and exclude patterns
  2025-06-18 11:46 [PATCH v6 00/15] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
@ 2025-06-18 11:46 ` Mauro Carvalho Chehab
  2025-06-18 15:42   ` Breno Leitao
  2025-06-18 11:46 ` [PATCH v6 02/15] docs: Makefile: disable check rules on make cleandocs Mauro Carvalho Chehab
                   ` (14 subsequent siblings)
  15 siblings, 1 reply; 22+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-18 11:46 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	joel, linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz,
	stern

When one does:
	make SPHINXDIRS="foo" htmldocs

All patterns would be relative to Documentation/foo, which
causes the include/exclude patterns like:

	include_patterns = [
		...
		f'foo/*.{ext}',
	]

to break. This is not what is expected. Address it by
adding logic to dynamically adjust the patterns when
SPHINXDIRS is used.

That allows adding parsers for other file types.

It should be noted that include_patterns was added in
Sphinx 5.1:
	https://www.sphinx-doc.org/en/master/usage/configuration.html#confval-include_patterns

So, backward-compatible code is needed when we start
using it for real.
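
To illustrate the adjustment (a hypothetical layout; the netlink pattern is
just an example of a pattern that contains a directory name):

    import os

    doctree = "/src/linux/Documentation"            # Documentation/ dir
    srcdir = "/src/linux/Documentation/networking"  # app.srcdir for SPHINXDIRS="networking"

    for pattern in ("output", "netlink/specs/*.yaml"):
        rel = os.path.relpath(os.path.join(doctree, pattern), start=srcdir)
        # -> "../output" and "../netlink/specs/*.yaml": both point outside
        # SOURCEDIR, so they are skipped rather than being (wrongly)
        # matched against Documentation/networking/.
        print(pattern, "->", rel)

When SPHINXDIRS is not set, app.srcdir is Documentation/ itself, so the
computed relative path equals the original pattern and it is kept as-is.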

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
Reviewed-by: Donald Hunter <donald.hunter@gmail.com>
---
 Documentation/conf.py | 67 ++++++++++++++++++++++++++++++++++++++++---
 1 file changed, 63 insertions(+), 4 deletions(-)

diff --git a/Documentation/conf.py b/Documentation/conf.py
index 12de52a2b17e..4ba4ee45e599 100644
--- a/Documentation/conf.py
+++ b/Documentation/conf.py
@@ -17,6 +17,66 @@ import os
 import sphinx
 import shutil
 
+# Get Sphinx version
+major, minor, patch = sphinx.version_info[:3]
+
+# include_patterns was added in Sphinx 5.1
+if (major < 5) or (major == 5 and minor < 1):
+    has_include_patterns = False
+else:
+    has_include_patterns = True
+    # Include patterns that don't contain directory names, in glob format
+    include_patterns = ['**.rst']
+
+# Location of Documentation/ directory
+doctree = os.path.abspath('.')
+
+# Exclude patterns that don't contain directory names, in glob format.
+exclude_patterns = []
+
+# List of patterns that contain directory names in glob format.
+dyn_include_patterns = []
+dyn_exclude_patterns = ['output']
+
+# Properly handle include/exclude patterns
+# ----------------------------------------
+
+def update_patterns(app):
+
+    """
+    In Sphinx, all directories are relative to what is passed as the
+    SOURCEDIR parameter to sphinx-build. Due to that, all patterns
+    that contain directory names need to be set dynamically, after
+    converting them to a relative path.
+
+    As Sphinx doesn't include any patterns outside SOURCEDIR, we should
+    exclude relative patterns that start with "../".
+    """
+
+    sourcedir = app.srcdir  # full path to the source directory
+    builddir = os.environ.get("BUILDDIR")
+
+    # setup include_patterns dynamically
+    if has_include_patterns:
+        for p in dyn_include_patterns:
+            full = os.path.join(doctree, p)
+
+            rel_path = os.path.relpath(full, start = app.srcdir)
+            if rel_path.startswith("../"):
+                continue
+
+            app.config.include_patterns.append(rel_path)
+
+    # setup exclude_patterns dynamically
+    for p in dyn_exclude_patterns:
+        full = os.path.join(doctree, p)
+
+        rel_path = os.path.relpath(full, start = app.srcdir)
+        if rel_path.startswith("../"):
+            continue
+
+        app.config.exclude_patterns.append(rel_path)
+
 # helper
 # ------
 
@@ -219,10 +279,6 @@ language = 'en'
 # Else, today_fmt is used as the format for a strftime call.
 #today_fmt = '%B %d, %Y'
 
-# List of patterns, relative to source directory, that match files and
-# directories to ignore when looking for source files.
-exclude_patterns = ['output']
-
 # The reST default role (used for this markup: `text`) to use for all
 # documents.
 #default_role = None
@@ -516,3 +572,6 @@ kerneldoc_srctree = '..'
 # the last statement in the conf.py file
 # ------------------------------------------------------------------------------
 loadConfig(globals())
+
+def setup(app):
+    app.connect('builder-inited', update_patterns)
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 22+ messages in thread

* [PATCH v6 02/15] docs: Makefile: disable check rules on make cleandocs
  2025-06-18 11:46 [PATCH v6 00/15] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
  2025-06-18 11:46 ` [PATCH v6 01/15] docs: conf.py: properly handle include and exclude patterns Mauro Carvalho Chehab
@ 2025-06-18 11:46 ` Mauro Carvalho Chehab
  2025-06-18 11:46 ` [PATCH v6 03/15] docs: netlink: netlink-raw.rst: use :ref: instead of :doc: Mauro Carvalho Chehab
                   ` (13 subsequent siblings)
  15 siblings, 0 replies; 22+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-18 11:46 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	joel, linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz,
	stern

It doesn't make sense to check for broken documentation references
and ABI errors when cleaning the tree.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
Reviewed-by: Breno Leitao <leitao@debian.org>
Reviewed-by: Donald Hunter <donald.hunter@gmail.com>
---
 Documentation/Makefile | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/Documentation/Makefile b/Documentation/Makefile
index d30d66ddf1ad..b98477df5ddf 100644
--- a/Documentation/Makefile
+++ b/Documentation/Makefile
@@ -5,6 +5,7 @@
 # for cleaning
 subdir- := devicetree/bindings
 
+ifneq ($(MAKECMDGOALS),cleandocs)
 # Check for broken documentation file references
 ifeq ($(CONFIG_WARN_MISSING_DOCUMENTS),y)
 $(shell $(srctree)/scripts/documentation-file-ref-check --warn)
@@ -14,6 +15,7 @@ endif
 ifeq ($(CONFIG_WARN_ABI_ERRORS),y)
 $(shell $(srctree)/scripts/get_abi.py --dir $(srctree)/Documentation/ABI validate)
 endif
+endif
 
 # You can set these variables from the command line.
 SPHINXBUILD   = sphinx-build
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 22+ messages in thread

* [PATCH v6 03/15] docs: netlink: netlink-raw.rst: use :ref: instead of :doc:
  2025-06-18 11:46 [PATCH v6 00/15] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
  2025-06-18 11:46 ` [PATCH v6 01/15] docs: conf.py: properly handle include and exclude patterns Mauro Carvalho Chehab
  2025-06-18 11:46 ` [PATCH v6 02/15] docs: Makefile: disable check rules on make cleandocs Mauro Carvalho Chehab
@ 2025-06-18 11:46 ` Mauro Carvalho Chehab
  2025-06-18 11:46 ` [PATCH v6 04/15] tools: ynl_gen_rst.py: Split library from command line tool Mauro Carvalho Chehab
                   ` (12 subsequent siblings)
  15 siblings, 0 replies; 22+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-18 11:46 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Currently, the rt documents are referenced with:

Documentation/userspace-api/netlink/netlink-raw.rst: :doc:`rt-link<../../networking/netlink_spec/rt-link>`
Documentation/userspace-api/netlink/netlink-raw.rst: :doc:`tc<../../networking/netlink_spec/tc>`
Documentation/userspace-api/netlink/netlink-raw.rst: :doc:`tc<../../networking/netlink_spec/tc>`

Having :doc: references with relative paths doesn't always work,
as it may cause trouble when O= is used. It is also hard to
maintain, and may break if we change the way rst files are
generated from yaml. It is better to use a reference to the
netlink family instead.

So, replace them with the Sphinx cross-reference labels
created by ynl_gen_rst.py.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 Documentation/userspace-api/netlink/netlink-raw.rst | 6 +++---
 tools/net/ynl/pyynl/ynl_gen_rst.py                  | 5 +++--
 2 files changed, 6 insertions(+), 5 deletions(-)

diff --git a/Documentation/userspace-api/netlink/netlink-raw.rst b/Documentation/userspace-api/netlink/netlink-raw.rst
index 31fc91020eb3..aae296c170c5 100644
--- a/Documentation/userspace-api/netlink/netlink-raw.rst
+++ b/Documentation/userspace-api/netlink/netlink-raw.rst
@@ -62,8 +62,8 @@ Sub-messages
 ------------
 
 Several raw netlink families such as
-:doc:`rt-link<../../networking/netlink_spec/rt-link>` and
-:doc:`tc<../../networking/netlink_spec/tc>` use attribute nesting as an
+:ref:`rt-link<netlink-rt-link>` and
+:ref:`tc<netlink-tc>` use attribute nesting as an
 abstraction to carry module specific information.
 
 Conceptually it looks as follows::
@@ -162,7 +162,7 @@ then this is an error.
 Nested struct definitions
 -------------------------
 
-Many raw netlink families such as :doc:`tc<../../networking/netlink_spec/tc>`
+Many raw netlink families such as :ref:`tc<netlink-tc>`
 make use of nested struct definitions. The ``netlink-raw`` schema makes it
 possible to embed a struct within a struct definition using the ``struct``
 property. For example, the following struct definition embeds the
diff --git a/tools/net/ynl/pyynl/ynl_gen_rst.py b/tools/net/ynl/pyynl/ynl_gen_rst.py
index 0cb6348e28d3..7bfb8ceeeefc 100755
--- a/tools/net/ynl/pyynl/ynl_gen_rst.py
+++ b/tools/net/ynl/pyynl/ynl_gen_rst.py
@@ -314,10 +314,11 @@ def parse_yaml(obj: Dict[str, Any]) -> str:
 
     # Main header
 
-    lines.append(rst_header())
-
     family = obj['name']
 
+    lines.append(rst_header())
+    lines.append(rst_label("netlink-" + family))
+
     title = f"Family ``{family}`` netlink specification"
     lines.append(rst_title(title))
     lines.append(rst_paragraph(".. contents:: :depth: 3\n"))
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 22+ messages in thread

* [PATCH v6 04/15] tools: ynl_gen_rst.py: Split library from command line tool
  2025-06-18 11:46 [PATCH v6 00/15] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (2 preceding siblings ...)
  2025-06-18 11:46 ` [PATCH v6 03/15] docs: netlink: netlink-raw.rst: use :ref: instead of :doc: Mauro Carvalho Chehab
@ 2025-06-18 11:46 ` Mauro Carvalho Chehab
  2025-06-18 11:46 ` [PATCH v6 05/15] docs: netlink: index.rst: add a netlink index file Mauro Carvalho Chehab
                   ` (11 subsequent siblings)
  15 siblings, 0 replies; 22+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-18 11:46 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

As we'll be using the Netlink specs parser inside a Sphinx
extension, move the library part out of the command line tool.

No functional changes.
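
For reference, a hedged usage sketch of the new library (only the
YnlDocGenerator class and its parse_yaml_file() method come from this patch;
the sys.path handling and the spec path are illustrative):

    import sys

    # Make the pyynl package directory visible, as ynl_gen_rst.py does
    # below (the hard-coded path here is just for illustration).
    sys.path.append("tools/net/ynl/pyynl")
    from lib import YnlDocGenerator

    parser = YnlDocGenerator()
    # The library returns the RST document as a string; the caller decides
    # where (and whether) to write it, so nothing lands in $(srctree).
    rst = parser.parse_yaml_file("Documentation/netlink/specs/rt-link.yaml")
    print(rst.splitlines()[0])   # ".. SPDX-License-Identifier: GPL-2.0"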

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 tools/net/ynl/pyynl/lib/__init__.py      |   2 +
 tools/net/ynl/pyynl/lib/doc_generator.py | 409 +++++++++++++++++++++++
 tools/net/ynl/pyynl/ynl_gen_rst.py       | 361 +-------------------
 3 files changed, 419 insertions(+), 353 deletions(-)
 create mode 100644 tools/net/ynl/pyynl/lib/doc_generator.py

diff --git a/tools/net/ynl/pyynl/lib/__init__.py b/tools/net/ynl/pyynl/lib/__init__.py
index 71518b9842ee..5f266ebe4526 100644
--- a/tools/net/ynl/pyynl/lib/__init__.py
+++ b/tools/net/ynl/pyynl/lib/__init__.py
@@ -4,6 +4,8 @@ from .nlspec import SpecAttr, SpecAttrSet, SpecEnumEntry, SpecEnumSet, \
     SpecFamily, SpecOperation, SpecSubMessage, SpecSubMessageFormat
 from .ynl import YnlFamily, Netlink, NlError
 
+from .doc_generator import YnlDocGenerator
+
 __all__ = ["SpecAttr", "SpecAttrSet", "SpecEnumEntry", "SpecEnumSet",
            "SpecFamily", "SpecOperation", "SpecSubMessage", "SpecSubMessageFormat",
            "YnlFamily", "Netlink", "NlError"]
diff --git a/tools/net/ynl/pyynl/lib/doc_generator.py b/tools/net/ynl/pyynl/lib/doc_generator.py
new file mode 100644
index 000000000000..839e78b39de3
--- /dev/null
+++ b/tools/net/ynl/pyynl/lib/doc_generator.py
@@ -0,0 +1,409 @@
+#!/usr/bin/env python3
+# SPDX-License-Identifier: GPL-2.0
+# -*- coding: utf-8; mode: python -*-
+
+"""
+    Class to auto generate the documentation for Netlink specifications.
+
+    :copyright:  Copyright (C) 2023  Breno Leitao <leitao@debian.org>
+    :license:    GPL Version 2, June 1991 see linux/COPYING for details.
+
+    This class performs extensive parsing of the Linux kernel's netlink YAML
+    spec files, in an effort to avoid needing to heavily mark up the original
+    YAML file.
+
+    This code is split in two classes:
+        1) RST formatters: used to convert a string to RST output
+        2) YAML Netlink (YNL) doc generator: generates docs from YAML data
+"""
+
+from typing import Any, Dict, List
+import os.path
+import sys
+import argparse
+import logging
+import yaml
+
+
+# ==============
+# RST Formatters
+# ==============
+class RstFormatters:
+    SPACE_PER_LEVEL = 4
+
+    @staticmethod
+    def headroom(level: int) -> str:
+        """Return space to format"""
+        return " " * (level * RstFormatters.SPACE_PER_LEVEL)
+
+
+    @staticmethod
+    def bold(text: str) -> str:
+        """Format bold text"""
+        return f"**{text}**"
+
+
+    @staticmethod
+    def inline(text: str) -> str:
+        """Format inline text"""
+        return f"``{text}``"
+
+
+    @staticmethod
+    def sanitize(text: str) -> str:
+        """Remove newlines and multiple spaces"""
+        # This is useful for some fields that are spread across multiple lines
+        return str(text).replace("\n", " ").strip()
+
+
+    def rst_fields(self, key: str, value: str, level: int = 0) -> str:
+        """Return a RST formatted field"""
+        return self.headroom(level) + f":{key}: {value}"
+
+
+    def rst_definition(self, key: str, value: Any, level: int = 0) -> str:
+        """Format a single rst definition"""
+        return self.headroom(level) + key + "\n" + self.headroom(level + 1) + str(value)
+
+
+    def rst_paragraph(self, paragraph: str, level: int = 0) -> str:
+        """Return a formatted paragraph"""
+        return self.headroom(level) + paragraph
+
+
+    def rst_bullet(self, item: str, level: int = 0) -> str:
+        """Return a formatted a bullet"""
+        return self.headroom(level) + f"- {item}"
+
+
+    @staticmethod
+    def rst_subsection(title: str) -> str:
+        """Add a sub-section to the document"""
+        return f"{title}\n" + "-" * len(title)
+
+
+    @staticmethod
+    def rst_subsubsection(title: str) -> str:
+        """Add a sub-sub-section to the document"""
+        return f"{title}\n" + "~" * len(title)
+
+
+    @staticmethod
+    def rst_section(namespace: str, prefix: str, title: str) -> str:
+        """Add a section to the document"""
+        return f".. _{namespace}-{prefix}-{title}:\n\n{title}\n" + "=" * len(title)
+
+
+    @staticmethod
+    def rst_subtitle(title: str) -> str:
+        """Add a subtitle to the document"""
+        return "\n" + "-" * len(title) + f"\n{title}\n" + "-" * len(title) + "\n\n"
+
+
+    @staticmethod
+    def rst_title(title: str) -> str:
+        """Add a title to the document"""
+        return "=" * len(title) + f"\n{title}\n" + "=" * len(title) + "\n\n"
+
+
+    def rst_list_inline(self, list_: List[str], level: int = 0) -> str:
+        """Format a list using inlines"""
+        return self.headroom(level) + "[" + ", ".join(self.inline(i) for i in list_) + "]"
+
+
+    @staticmethod
+    def rst_ref(namespace: str, prefix: str, name: str) -> str:
+        """Add a hyperlink to the document"""
+        mappings = {'enum': 'definition',
+                    'fixed-header': 'definition',
+                    'nested-attributes': 'attribute-set',
+                    'struct': 'definition'}
+        if prefix in mappings:
+            prefix = mappings[prefix]
+        return f":ref:`{namespace}-{prefix}-{name}`"
+
+
+    def rst_header(self) -> str:
+        """The headers for all the auto generated RST files"""
+        lines = []
+
+        lines.append(self.rst_paragraph(".. SPDX-License-Identifier: GPL-2.0"))
+        lines.append(self.rst_paragraph(".. NOTE: This document was auto-generated.\n\n"))
+
+        return "\n".join(lines)
+
+
+    @staticmethod
+    def rst_toctree(maxdepth: int = 2) -> str:
+        """Generate a toctree RST primitive"""
+        lines = []
+
+        lines.append(".. toctree::")
+        lines.append(f"   :maxdepth: {maxdepth}\n\n")
+
+        return "\n".join(lines)
+
+
+    @staticmethod
+    def rst_label(title: str) -> str:
+        """Return a formatted label"""
+        return f".. _{title}:\n\n"
+
+# =======
+# Parsers
+# =======
+class YnlDocGenerator:
+
+    fmt = RstFormatters()
+
+    def parse_mcast_group(self, mcast_group: List[Dict[str, Any]]) -> str:
+        """Parse 'multicast' group list and return a formatted string"""
+        lines = []
+        for group in mcast_group:
+            lines.append(self.fmt.rst_bullet(group["name"]))
+
+        return "\n".join(lines)
+
+
+    def parse_do(self, do_dict: Dict[str, Any], level: int = 0) -> str:
+        """Parse 'do' section and return a formatted string"""
+        lines = []
+        for key in do_dict.keys():
+            lines.append(self.fmt.rst_paragraph(self.fmt.bold(key), level + 1))
+            if key in ['request', 'reply']:
+                lines.append(self.parse_do_attributes(do_dict[key], level + 1) + "\n")
+            else:
+                lines.append(self.fmt.headroom(level + 2) + do_dict[key] + "\n")
+
+        return "\n".join(lines)
+
+
+    def parse_do_attributes(self, attrs: Dict[str, Any], level: int = 0) -> str:
+        """Parse 'attributes' section"""
+        if "attributes" not in attrs:
+            return ""
+        lines = [self.fmt.rst_fields("attributes", self.fmt.rst_list_inline(attrs["attributes"]), level + 1)]
+
+        return "\n".join(lines)
+
+
+    def parse_operations(self, operations: List[Dict[str, Any]], namespace: str) -> str:
+        """Parse operations block"""
+        preprocessed = ["name", "doc", "title", "do", "dump", "flags"]
+        linkable = ["fixed-header", "attribute-set"]
+        lines = []
+
+        for operation in operations:
+            lines.append(self.fmt.rst_section(namespace, 'operation', operation["name"]))
+            lines.append(self.fmt.rst_paragraph(operation["doc"]) + "\n")
+
+            for key in operation.keys():
+                if key in preprocessed:
+                    # Skip the special fields
+                    continue
+                value = operation[key]
+                if key in linkable:
+                    value = self.fmt.rst_ref(namespace, key, value)
+                lines.append(self.fmt.rst_fields(key, value, 0))
+            if 'flags' in operation:
+                lines.append(self.fmt.rst_fields('flags', self.fmt.rst_list_inline(operation['flags'])))
+
+            if "do" in operation:
+                lines.append(self.fmt.rst_paragraph(":do:", 0))
+                lines.append(self.parse_do(operation["do"], 0))
+            if "dump" in operation:
+                lines.append(self.fmt.rst_paragraph(":dump:", 0))
+                lines.append(self.parse_do(operation["dump"], 0))
+
+            # New line after fields
+            lines.append("\n")
+
+        return "\n".join(lines)
+
+
+    def parse_entries(self, entries: List[Dict[str, Any]], level: int) -> str:
+        """Parse a list of entries"""
+        ignored = ["pad"]
+        lines = []
+        for entry in entries:
+            if isinstance(entry, dict):
+                # entries could be a list or a dictionary
+                field_name = entry.get("name", "")
+                if field_name in ignored:
+                    continue
+                type_ = entry.get("type")
+                if type_:
+                    field_name += f" ({self.fmt.inline(type_)})"
+                lines.append(
+                    self.fmt.rst_fields(field_name, self.fmt.sanitize(entry.get("doc", "")), level)
+                )
+            elif isinstance(entry, list):
+                lines.append(self.fmt.rst_list_inline(entry, level))
+            else:
+                lines.append(self.fmt.rst_bullet(self.fmt.inline(self.fmt.sanitize(entry)), level))
+
+        lines.append("\n")
+        return "\n".join(lines)
+
+
+    def parse_definitions(self, defs: Dict[str, Any], namespace: str) -> str:
+        """Parse definitions section"""
+        preprocessed = ["name", "entries", "members"]
+        ignored = ["render-max"]  # This is not printed
+        lines = []
+
+        for definition in defs:
+            lines.append(self.fmt.rst_section(namespace, 'definition', definition["name"]))
+            for k in definition.keys():
+                if k in preprocessed + ignored:
+                    continue
+                lines.append(self.fmt.rst_fields(k, self.fmt.sanitize(definition[k]), 0))
+
+            # Field list needs to finish with a new line
+            lines.append("\n")
+            if "entries" in definition:
+                lines.append(self.fmt.rst_paragraph(":entries:", 0))
+                lines.append(self.parse_entries(definition["entries"], 1))
+            if "members" in definition:
+                lines.append(self.fmt.rst_paragraph(":members:", 0))
+                lines.append(self.parse_entries(definition["members"], 1))
+
+        return "\n".join(lines)
+
+
+    def parse_attr_sets(self, entries: List[Dict[str, Any]], namespace: str) -> str:
+        """Parse attribute from attribute-set"""
+        preprocessed = ["name", "type"]
+        linkable = ["enum", "nested-attributes", "struct", "sub-message"]
+        ignored = ["checks"]
+        lines = []
+
+        for entry in entries:
+            lines.append(self.fmt.rst_section(namespace, 'attribute-set', entry["name"]))
+            for attr in entry["attributes"]:
+                type_ = attr.get("type")
+                attr_line = attr["name"]
+                if type_:
+                    # Add the attribute type in the same line
+                    attr_line += f" ({self.fmt.inline(type_)})"
+
+                lines.append(self.fmt.rst_subsubsection(attr_line))
+
+                for k in attr.keys():
+                    if k in preprocessed + ignored:
+                        continue
+                    if k in linkable:
+                        value = self.fmt.rst_ref(namespace, k, attr[k])
+                    else:
+                        value = self.fmt.sanitize(attr[k])
+                    lines.append(self.fmt.rst_fields(k, value, 0))
+                lines.append("\n")
+
+        return "\n".join(lines)
+
+
+    def parse_sub_messages(self, entries: List[Dict[str, Any]], namespace: str) -> str:
+        """Parse sub-message definitions"""
+        lines = []
+
+        for entry in entries:
+            lines.append(self.fmt.rst_section(namespace, 'sub-message', entry["name"]))
+            for fmt in entry["formats"]:
+                value = fmt["value"]
+
+                lines.append(self.fmt.rst_bullet(self.fmt.bold(value)))
+                for attr in ['fixed-header', 'attribute-set']:
+                    if attr in fmt:
+                        lines.append(self.fmt.rst_fields(attr,
+                                                self.fmt.rst_ref(namespace, attr, fmt[attr]),
+                                                1))
+                lines.append("\n")
+
+        return "\n".join(lines)
+
+
+    def parse_yaml(self, obj: Dict[str, Any]) -> str:
+        """Format the whole YAML into a RST string"""
+        lines = []
+
+        # Main header
+
+        family = obj['name']
+
+        lines.append(self.fmt.rst_header())
+        lines.append(self.fmt.rst_label("netlink-" + family))
+
+        title = f"Family ``{family}`` netlink specification"
+        lines.append(self.fmt.rst_title(title))
+        lines.append(self.fmt.rst_paragraph(".. contents:: :depth: 3\n"))
+
+        if "doc" in obj:
+            lines.append(self.fmt.rst_subtitle("Summary"))
+            lines.append(self.fmt.rst_paragraph(obj["doc"], 0))
+
+        # Operations
+        if "operations" in obj:
+            lines.append(self.fmt.rst_subtitle("Operations"))
+            lines.append(self.parse_operations(obj["operations"]["list"], family))
+
+        # Multicast groups
+        if "mcast-groups" in obj:
+            lines.append(self.fmt.rst_subtitle("Multicast groups"))
+            lines.append(self.parse_mcast_group(obj["mcast-groups"]["list"]))
+
+        # Definitions
+        if "definitions" in obj:
+            lines.append(self.fmt.rst_subtitle("Definitions"))
+            lines.append(self.parse_definitions(obj["definitions"], family))
+
+        # Attributes set
+        if "attribute-sets" in obj:
+            lines.append(self.fmt.rst_subtitle("Attribute sets"))
+            lines.append(self.parse_attr_sets(obj["attribute-sets"], family))
+
+        # Sub-messages
+        if "sub-messages" in obj:
+            lines.append(self.fmt.rst_subtitle("Sub-messages"))
+            lines.append(self.parse_sub_messages(obj["sub-messages"], family))
+
+        return "\n".join(lines)
+
+
+    # Main functions
+    # ==============
+
+
+    def parse_yaml_file(self, filename: str) -> str:
+        """Transform the YAML specified by filename into an RST-formatted string"""
+        with open(filename, "r", encoding="utf-8") as spec_file:
+            yaml_data = yaml.safe_load(spec_file)
+            content = self.parse_yaml(yaml_data)
+
+        return content
+
+
+    def generate_main_index_rst(self, output: str, index_dir: str) -> None:
+        """Generate the `networking_spec/index` content and write to the file"""
+        lines = []
+
+        lines.append(self.fmt.rst_header())
+        lines.append(self.fmt.rst_label("specs"))
+        lines.append(self.fmt.rst_title("Netlink Family Specifications"))
+        lines.append(self.fmt.rst_toctree(1))
+
+        index_fname = os.path.basename(output)
+        base, ext = os.path.splitext(index_fname)
+
+        if not index_dir:
+            index_dir = os.path.dirname(output)
+
+        logging.debug(f"Looking for {ext} files in %s", index_dir)
+        for filename in sorted(os.listdir(index_dir)):
+            if not filename.endswith(ext) or filename == index_fname:
+                continue
+            base, ext = os.path.splitext(filename)
+            lines.append(f"   {base}\n")
+
+        logging.debug("Writing an index file at %s", output)
+
+        return "".join(lines)
diff --git a/tools/net/ynl/pyynl/ynl_gen_rst.py b/tools/net/ynl/pyynl/ynl_gen_rst.py
index 7bfb8ceeeefc..b5a665eeaa5a 100755
--- a/tools/net/ynl/pyynl/ynl_gen_rst.py
+++ b/tools/net/ynl/pyynl/ynl_gen_rst.py
@@ -10,354 +10,17 @@
 
     This script performs extensive parsing to the Linux kernel's netlink YAML
     spec files, in an effort to avoid needing to heavily mark up the original
-    YAML file.
-
-    This code is split in three big parts:
-        1) RST formatters: Use to convert a string to a RST output
-        2) Parser helpers: Functions to parse the YAML data structure
-        3) Main function and small helpers
+    YAML file. It uses the library code from scripts/lib.
 """
 
-from typing import Any, Dict, List
 import os.path
+import pathlib
 import sys
 import argparse
 import logging
-import yaml
-
-
-SPACE_PER_LEVEL = 4
-
-
-# RST Formatters
-# ==============
-def headroom(level: int) -> str:
-    """Return space to format"""
-    return " " * (level * SPACE_PER_LEVEL)
-
-
-def bold(text: str) -> str:
-    """Format bold text"""
-    return f"**{text}**"
-
-
-def inline(text: str) -> str:
-    """Format inline text"""
-    return f"``{text}``"
-
-
-def sanitize(text: str) -> str:
-    """Remove newlines and multiple spaces"""
-    # This is useful for some fields that are spread across multiple lines
-    return str(text).replace("\n", " ").strip()
-
-
-def rst_fields(key: str, value: str, level: int = 0) -> str:
-    """Return a RST formatted field"""
-    return headroom(level) + f":{key}: {value}"
-
-
-def rst_definition(key: str, value: Any, level: int = 0) -> str:
-    """Format a single rst definition"""
-    return headroom(level) + key + "\n" + headroom(level + 1) + str(value)
-
-
-def rst_paragraph(paragraph: str, level: int = 0) -> str:
-    """Return a formatted paragraph"""
-    return headroom(level) + paragraph
-
-
-def rst_bullet(item: str, level: int = 0) -> str:
-    """Return a formatted a bullet"""
-    return headroom(level) + f"- {item}"
-
-
-def rst_subsection(title: str) -> str:
-    """Add a sub-section to the document"""
-    return f"{title}\n" + "-" * len(title)
-
-
-def rst_subsubsection(title: str) -> str:
-    """Add a sub-sub-section to the document"""
-    return f"{title}\n" + "~" * len(title)
-
-
-def rst_section(namespace: str, prefix: str, title: str) -> str:
-    """Add a section to the document"""
-    return f".. _{namespace}-{prefix}-{title}:\n\n{title}\n" + "=" * len(title)
-
-
-def rst_subtitle(title: str) -> str:
-    """Add a subtitle to the document"""
-    return "\n" + "-" * len(title) + f"\n{title}\n" + "-" * len(title) + "\n\n"
-
-
-def rst_title(title: str) -> str:
-    """Add a title to the document"""
-    return "=" * len(title) + f"\n{title}\n" + "=" * len(title) + "\n\n"
-
-
-def rst_list_inline(list_: List[str], level: int = 0) -> str:
-    """Format a list using inlines"""
-    return headroom(level) + "[" + ", ".join(inline(i) for i in list_) + "]"
-
-
-def rst_ref(namespace: str, prefix: str, name: str) -> str:
-    """Add a hyperlink to the document"""
-    mappings = {'enum': 'definition',
-                'fixed-header': 'definition',
-                'nested-attributes': 'attribute-set',
-                'struct': 'definition'}
-    if prefix in mappings:
-        prefix = mappings[prefix]
-    return f":ref:`{namespace}-{prefix}-{name}`"
-
-
-def rst_header() -> str:
-    """The headers for all the auto generated RST files"""
-    lines = []
-
-    lines.append(rst_paragraph(".. SPDX-License-Identifier: GPL-2.0"))
-    lines.append(rst_paragraph(".. NOTE: This document was auto-generated.\n\n"))
-
-    return "\n".join(lines)
-
-
-def rst_toctree(maxdepth: int = 2) -> str:
-    """Generate a toctree RST primitive"""
-    lines = []
-
-    lines.append(".. toctree::")
-    lines.append(f"   :maxdepth: {maxdepth}\n\n")
-
-    return "\n".join(lines)
-
-
-def rst_label(title: str) -> str:
-    """Return a formatted label"""
-    return f".. _{title}:\n\n"
-
-
-# Parsers
-# =======
-
-
-def parse_mcast_group(mcast_group: List[Dict[str, Any]]) -> str:
-    """Parse 'multicast' group list and return a formatted string"""
-    lines = []
-    for group in mcast_group:
-        lines.append(rst_bullet(group["name"]))
-
-    return "\n".join(lines)
-
-
-def parse_do(do_dict: Dict[str, Any], level: int = 0) -> str:
-    """Parse 'do' section and return a formatted string"""
-    lines = []
-    for key in do_dict.keys():
-        lines.append(rst_paragraph(bold(key), level + 1))
-        if key in ['request', 'reply']:
-            lines.append(parse_do_attributes(do_dict[key], level + 1) + "\n")
-        else:
-            lines.append(headroom(level + 2) + do_dict[key] + "\n")
-
-    return "\n".join(lines)
-
-
-def parse_do_attributes(attrs: Dict[str, Any], level: int = 0) -> str:
-    """Parse 'attributes' section"""
-    if "attributes" not in attrs:
-        return ""
-    lines = [rst_fields("attributes", rst_list_inline(attrs["attributes"]), level + 1)]
-
-    return "\n".join(lines)
-
-
-def parse_operations(operations: List[Dict[str, Any]], namespace: str) -> str:
-    """Parse operations block"""
-    preprocessed = ["name", "doc", "title", "do", "dump", "flags"]
-    linkable = ["fixed-header", "attribute-set"]
-    lines = []
-
-    for operation in operations:
-        lines.append(rst_section(namespace, 'operation', operation["name"]))
-        lines.append(rst_paragraph(operation["doc"]) + "\n")
-
-        for key in operation.keys():
-            if key in preprocessed:
-                # Skip the special fields
-                continue
-            value = operation[key]
-            if key in linkable:
-                value = rst_ref(namespace, key, value)
-            lines.append(rst_fields(key, value, 0))
-        if 'flags' in operation:
-            lines.append(rst_fields('flags', rst_list_inline(operation['flags'])))
-
-        if "do" in operation:
-            lines.append(rst_paragraph(":do:", 0))
-            lines.append(parse_do(operation["do"], 0))
-        if "dump" in operation:
-            lines.append(rst_paragraph(":dump:", 0))
-            lines.append(parse_do(operation["dump"], 0))
-
-        # New line after fields
-        lines.append("\n")
-
-    return "\n".join(lines)
-
-
-def parse_entries(entries: List[Dict[str, Any]], level: int) -> str:
-    """Parse a list of entries"""
-    ignored = ["pad"]
-    lines = []
-    for entry in entries:
-        if isinstance(entry, dict):
-            # entries could be a list or a dictionary
-            field_name = entry.get("name", "")
-            if field_name in ignored:
-                continue
-            type_ = entry.get("type")
-            if type_:
-                field_name += f" ({inline(type_)})"
-            lines.append(
-                rst_fields(field_name, sanitize(entry.get("doc", "")), level)
-            )
-        elif isinstance(entry, list):
-            lines.append(rst_list_inline(entry, level))
-        else:
-            lines.append(rst_bullet(inline(sanitize(entry)), level))
-
-    lines.append("\n")
-    return "\n".join(lines)
-
-
-def parse_definitions(defs: Dict[str, Any], namespace: str) -> str:
-    """Parse definitions section"""
-    preprocessed = ["name", "entries", "members"]
-    ignored = ["render-max"]  # This is not printed
-    lines = []
-
-    for definition in defs:
-        lines.append(rst_section(namespace, 'definition', definition["name"]))
-        for k in definition.keys():
-            if k in preprocessed + ignored:
-                continue
-            lines.append(rst_fields(k, sanitize(definition[k]), 0))
-
-        # Field list needs to finish with a new line
-        lines.append("\n")
-        if "entries" in definition:
-            lines.append(rst_paragraph(":entries:", 0))
-            lines.append(parse_entries(definition["entries"], 1))
-        if "members" in definition:
-            lines.append(rst_paragraph(":members:", 0))
-            lines.append(parse_entries(definition["members"], 1))
-
-    return "\n".join(lines)
-
-
-def parse_attr_sets(entries: List[Dict[str, Any]], namespace: str) -> str:
-    """Parse attribute from attribute-set"""
-    preprocessed = ["name", "type"]
-    linkable = ["enum", "nested-attributes", "struct", "sub-message"]
-    ignored = ["checks"]
-    lines = []
-
-    for entry in entries:
-        lines.append(rst_section(namespace, 'attribute-set', entry["name"]))
-        for attr in entry["attributes"]:
-            type_ = attr.get("type")
-            attr_line = attr["name"]
-            if type_:
-                # Add the attribute type in the same line
-                attr_line += f" ({inline(type_)})"
-
-            lines.append(rst_subsubsection(attr_line))
-
-            for k in attr.keys():
-                if k in preprocessed + ignored:
-                    continue
-                if k in linkable:
-                    value = rst_ref(namespace, k, attr[k])
-                else:
-                    value = sanitize(attr[k])
-                lines.append(rst_fields(k, value, 0))
-            lines.append("\n")
-
-    return "\n".join(lines)
-
-
-def parse_sub_messages(entries: List[Dict[str, Any]], namespace: str) -> str:
-    """Parse sub-message definitions"""
-    lines = []
-
-    for entry in entries:
-        lines.append(rst_section(namespace, 'sub-message', entry["name"]))
-        for fmt in entry["formats"]:
-            value = fmt["value"]
-
-            lines.append(rst_bullet(bold(value)))
-            for attr in ['fixed-header', 'attribute-set']:
-                if attr in fmt:
-                    lines.append(rst_fields(attr,
-                                            rst_ref(namespace, attr, fmt[attr]),
-                                            1))
-            lines.append("\n")
-
-    return "\n".join(lines)
-
-
-def parse_yaml(obj: Dict[str, Any]) -> str:
-    """Format the whole YAML into a RST string"""
-    lines = []
-
-    # Main header
-
-    family = obj['name']
-
-    lines.append(rst_header())
-    lines.append(rst_label("netlink-" + family))
-
-    title = f"Family ``{family}`` netlink specification"
-    lines.append(rst_title(title))
-    lines.append(rst_paragraph(".. contents:: :depth: 3\n"))
-
-    if "doc" in obj:
-        lines.append(rst_subtitle("Summary"))
-        lines.append(rst_paragraph(obj["doc"], 0))
-
-    # Operations
-    if "operations" in obj:
-        lines.append(rst_subtitle("Operations"))
-        lines.append(parse_operations(obj["operations"]["list"], family))
-
-    # Multicast groups
-    if "mcast-groups" in obj:
-        lines.append(rst_subtitle("Multicast groups"))
-        lines.append(parse_mcast_group(obj["mcast-groups"]["list"]))
-
-    # Definitions
-    if "definitions" in obj:
-        lines.append(rst_subtitle("Definitions"))
-        lines.append(parse_definitions(obj["definitions"], family))
-
-    # Attributes set
-    if "attribute-sets" in obj:
-        lines.append(rst_subtitle("Attribute sets"))
-        lines.append(parse_attr_sets(obj["attribute-sets"], family))
-
-    # Sub-messages
-    if "sub-messages" in obj:
-        lines.append(rst_subtitle("Sub-messages"))
-        lines.append(parse_sub_messages(obj["sub-messages"], family))
-
-    return "\n".join(lines)
-
-
-# Main functions
-# ==============
 
+sys.path.append(pathlib.Path(__file__).resolve().parent.as_posix())
+from lib import YnlDocGenerator    # pylint: disable=C0413
 
 def parse_arguments() -> argparse.Namespace:
     """Parse arguments from user"""
@@ -392,15 +55,6 @@ def parse_arguments() -> argparse.Namespace:
     return args
 
 
-def parse_yaml_file(filename: str) -> str:
-    """Transform the YAML specified by filename into an RST-formatted string"""
-    with open(filename, "r", encoding="utf-8") as spec_file:
-        yaml_data = yaml.safe_load(spec_file)
-        content = parse_yaml(yaml_data)
-
-    return content
-
-
 def write_to_rstfile(content: str, filename: str) -> None:
     """Write the generated content into an RST file"""
     logging.debug("Saving RST file to %s", filename)
@@ -411,7 +65,6 @@ def write_to_rstfile(content: str, filename: str) -> None:
 
 def generate_main_index_rst(output: str) -> None:
     """Generate the `networking_spec/index` content and write to the file"""
-    lines = []
 
     lines.append(rst_header())
     lines.append(rst_label("specs"))
@@ -426,7 +79,7 @@ def generate_main_index_rst(output: str) -> None:
         lines.append(f"   {filename.replace('.rst', '')}\n")
 
     logging.debug("Writing an index file at %s", output)
-    write_to_rstfile("".join(lines), output)
+    write_to_rstfile(msg, output)
 
 
 def main() -> None:
@@ -434,10 +87,12 @@ def main() -> None:
 
     args = parse_arguments()
 
+    parser = YnlDocGenerator()
+
     if args.input:
         logging.debug("Parsing %s", args.input)
         try:
-            content = parse_yaml_file(os.path.join(args.input))
+            content = parser.parse_yaml_file(os.path.join(args.input))
         except Exception as exception:
             logging.warning("Failed to parse %s.", args.input)
             logging.warning(exception)
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 22+ messages in thread

* [PATCH v6 05/15] docs: netlink: index.rst: add a netlink index file
  2025-06-18 11:46 [PATCH v6 00/15] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (3 preceding siblings ...)
  2025-06-18 11:46 ` [PATCH v6 04/15] tools: ynl_gen_rst.py: Split library from command line tool Mauro Carvalho Chehab
@ 2025-06-18 11:46 ` Mauro Carvalho Chehab
  2025-06-18 11:46 ` [PATCH v6 06/15] tools: ynl_gen_rst.py: cleanup coding style Mauro Carvalho Chehab
                   ` (10 subsequent siblings)
  15 siblings, 0 replies; 22+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-18 11:46 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Instead of generating the index file, use a :glob: toctree to automatically
include all documents generated from the yaml specs.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
Reviewed-by: Donald Hunter <donald.hunter@gmail.com>
---
 Documentation/netlink/specs/index.rst | 13 +++++++++++++
 1 file changed, 13 insertions(+)
 create mode 100644 Documentation/netlink/specs/index.rst

diff --git a/Documentation/netlink/specs/index.rst b/Documentation/netlink/specs/index.rst
new file mode 100644
index 000000000000..7f7cf4a096f2
--- /dev/null
+++ b/Documentation/netlink/specs/index.rst
@@ -0,0 +1,13 @@
+.. SPDX-License-Identifier: GPL-2.0
+
+.. _specs:
+
+=============================
+Netlink Family Specifications
+=============================
+
+.. toctree::
+   :maxdepth: 1
+   :glob:
+
+   *
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 22+ messages in thread

* [PATCH v6 06/15] tools: ynl_gen_rst.py: cleanup coding style
  2025-06-18 11:46 [PATCH v6 00/15] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (4 preceding siblings ...)
  2025-06-18 11:46 ` [PATCH v6 05/15] docs: netlink: index.rst: add a netlink index file Mauro Carvalho Chehab
@ 2025-06-18 11:46 ` Mauro Carvalho Chehab
  2025-06-18 11:46 ` [PATCH v6 07/15] docs: sphinx: add a parser for yaml files for Netlink specs Mauro Carvalho Chehab
                   ` (9 subsequent siblings)
  15 siblings, 0 replies; 22+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-18 11:46 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Clean up some coding style issues pointed out by pylint and flake8.

No functional changes.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
Reviewed-by: Breno Leitao <leitao@debian.org>
---
 tools/net/ynl/pyynl/lib/doc_generator.py | 75 ++++++++----------------
 1 file changed, 26 insertions(+), 49 deletions(-)

diff --git a/tools/net/ynl/pyynl/lib/doc_generator.py b/tools/net/ynl/pyynl/lib/doc_generator.py
index 839e78b39de3..f71360f0ceb7 100644
--- a/tools/net/ynl/pyynl/lib/doc_generator.py
+++ b/tools/net/ynl/pyynl/lib/doc_generator.py
@@ -18,17 +18,12 @@
 """
 
 from typing import Any, Dict, List
-import os.path
-import sys
-import argparse
-import logging
 import yaml
 
 
-# ==============
-# RST Formatters
-# ==============
 class RstFormatters:
+    """RST Formatters"""
+
     SPACE_PER_LEVEL = 4
 
     @staticmethod
@@ -36,81 +31,67 @@ class RstFormatters:
         """Return space to format"""
         return " " * (level * RstFormatters.SPACE_PER_LEVEL)
 
-
     @staticmethod
     def bold(text: str) -> str:
         """Format bold text"""
         return f"**{text}**"
 
-
     @staticmethod
     def inline(text: str) -> str:
         """Format inline text"""
         return f"``{text}``"
 
-
     @staticmethod
     def sanitize(text: str) -> str:
         """Remove newlines and multiple spaces"""
         # This is useful for some fields that are spread across multiple lines
         return str(text).replace("\n", " ").strip()
 
-
     def rst_fields(self, key: str, value: str, level: int = 0) -> str:
         """Return a RST formatted field"""
         return self.headroom(level) + f":{key}: {value}"
 
-
     def rst_definition(self, key: str, value: Any, level: int = 0) -> str:
         """Format a single rst definition"""
         return self.headroom(level) + key + "\n" + self.headroom(level + 1) + str(value)
 
-
     def rst_paragraph(self, paragraph: str, level: int = 0) -> str:
         """Return a formatted paragraph"""
         return self.headroom(level) + paragraph
 
-
     def rst_bullet(self, item: str, level: int = 0) -> str:
         """Return a formatted a bullet"""
         return self.headroom(level) + f"- {item}"
 
-
     @staticmethod
     def rst_subsection(title: str) -> str:
         """Add a sub-section to the document"""
         return f"{title}\n" + "-" * len(title)
 
-
     @staticmethod
     def rst_subsubsection(title: str) -> str:
         """Add a sub-sub-section to the document"""
         return f"{title}\n" + "~" * len(title)
 
-
     @staticmethod
     def rst_section(namespace: str, prefix: str, title: str) -> str:
         """Add a section to the document"""
         return f".. _{namespace}-{prefix}-{title}:\n\n{title}\n" + "=" * len(title)
 
-
     @staticmethod
     def rst_subtitle(title: str) -> str:
         """Add a subtitle to the document"""
         return "\n" + "-" * len(title) + f"\n{title}\n" + "-" * len(title) + "\n\n"
 
-
     @staticmethod
     def rst_title(title: str) -> str:
         """Add a title to the document"""
         return "=" * len(title) + f"\n{title}\n" + "=" * len(title) + "\n\n"
 
-
     def rst_list_inline(self, list_: List[str], level: int = 0) -> str:
         """Format a list using inlines"""
         return self.headroom(level) + "[" + ", ".join(self.inline(i) for i in list_) + "]"
 
-
     @staticmethod
     def rst_ref(namespace: str, prefix: str, name: str) -> str:
         """Add a hyperlink to the document"""
@@ -119,10 +100,9 @@ class RstFormatters:
                     'nested-attributes': 'attribute-set',
                     'struct': 'definition'}
         if prefix in mappings:
-            prefix = mappings[prefix]
+            prefix = mappings.get(prefix, "")
         return f":ref:`{namespace}-{prefix}-{name}`"
 
-
     def rst_header(self) -> str:
         """The headers for all the auto generated RST files"""
         lines = []
@@ -132,7 +112,6 @@ class RstFormatters:
 
         return "\n".join(lines)
 
-
     @staticmethod
     def rst_toctree(maxdepth: int = 2) -> str:
         """Generate a toctree RST primitive"""
@@ -143,16 +122,13 @@ class RstFormatters:
 
         return "\n".join(lines)
 
-
     @staticmethod
     def rst_label(title: str) -> str:
         """Return a formatted label"""
         return f".. _{title}:\n\n"
 
-# =======
-# Parsers
-# =======
 class YnlDocGenerator:
+    """YAML Netlink specs Parser"""
 
     fmt = RstFormatters()
 
@@ -164,7 +140,6 @@ class YnlDocGenerator:
 
         return "\n".join(lines)
 
-
     def parse_do(self, do_dict: Dict[str, Any], level: int = 0) -> str:
         """Parse 'do' section and return a formatted string"""
         lines = []
@@ -177,16 +152,16 @@ class YnlDocGenerator:
 
         return "\n".join(lines)
 
-
     def parse_do_attributes(self, attrs: Dict[str, Any], level: int = 0) -> str:
         """Parse 'attributes' section"""
         if "attributes" not in attrs:
             return ""
-        lines = [self.fmt.rst_fields("attributes", self.fmt.rst_list_inline(attrs["attributes"]), level + 1)]
+        lines = [self.fmt.rst_fields("attributes",
+                                     self.fmt.rst_list_inline(attrs["attributes"]),
+                                     level + 1)]
 
         return "\n".join(lines)
 
-
     def parse_operations(self, operations: List[Dict[str, Any]], namespace: str) -> str:
         """Parse operations block"""
         preprocessed = ["name", "doc", "title", "do", "dump", "flags"]
@@ -194,7 +169,8 @@ class YnlDocGenerator:
         lines = []
 
         for operation in operations:
-            lines.append(self.fmt.rst_section(namespace, 'operation', operation["name"]))
+            lines.append(self.fmt.rst_section(namespace, 'operation',
+                                              operation["name"]))
             lines.append(self.fmt.rst_paragraph(operation["doc"]) + "\n")
 
             for key in operation.keys():
@@ -206,7 +182,8 @@ class YnlDocGenerator:
                     value = self.fmt.rst_ref(namespace, key, value)
                 lines.append(self.fmt.rst_fields(key, value, 0))
             if 'flags' in operation:
-                lines.append(self.fmt.rst_fields('flags', self.fmt.rst_list_inline(operation['flags'])))
+                lines.append(self.fmt.rst_fields('flags',
+                                                 self.fmt.rst_list_inline(operation['flags'])))
 
             if "do" in operation:
                 lines.append(self.fmt.rst_paragraph(":do:", 0))
@@ -220,7 +197,6 @@ class YnlDocGenerator:
 
         return "\n".join(lines)
 
-
     def parse_entries(self, entries: List[Dict[str, Any]], level: int) -> str:
         """Parse a list of entries"""
         ignored = ["pad"]
@@ -235,17 +211,19 @@ class YnlDocGenerator:
                 if type_:
                     field_name += f" ({self.fmt.inline(type_)})"
                 lines.append(
-                    self.fmt.rst_fields(field_name, self.fmt.sanitize(entry.get("doc", "")), level)
+                    self.fmt.rst_fields(field_name,
+                                        self.fmt.sanitize(entry.get("doc", "")),
+                                        level)
                 )
             elif isinstance(entry, list):
                 lines.append(self.fmt.rst_list_inline(entry, level))
             else:
-                lines.append(self.fmt.rst_bullet(self.fmt.inline(self.fmt.sanitize(entry)), level))
+                lines.append(self.fmt.rst_bullet(self.fmt.inline(self.fmt.sanitize(entry)),
+                                                 level))
 
         lines.append("\n")
         return "\n".join(lines)
 
-
     def parse_definitions(self, defs: Dict[str, Any], namespace: str) -> str:
         """Parse definitions section"""
         preprocessed = ["name", "entries", "members"]
@@ -270,7 +248,6 @@ class YnlDocGenerator:
 
         return "\n".join(lines)
 
-
     def parse_attr_sets(self, entries: List[Dict[str, Any]], namespace: str) -> str:
         """Parse attribute from attribute-set"""
         preprocessed = ["name", "type"]
@@ -279,7 +256,8 @@ class YnlDocGenerator:
         lines = []
 
         for entry in entries:
-            lines.append(self.fmt.rst_section(namespace, 'attribute-set', entry["name"]))
+            lines.append(self.fmt.rst_section(namespace, 'attribute-set',
+                                              entry["name"]))
             for attr in entry["attributes"]:
                 type_ = attr.get("type")
                 attr_line = attr["name"]
@@ -301,13 +279,13 @@ class YnlDocGenerator:
 
         return "\n".join(lines)
 
-
     def parse_sub_messages(self, entries: List[Dict[str, Any]], namespace: str) -> str:
         """Parse sub-message definitions"""
         lines = []
 
         for entry in entries:
-            lines.append(self.fmt.rst_section(namespace, 'sub-message', entry["name"]))
+            lines.append(self.fmt.rst_section(namespace, 'sub-message',
+                                              entry["name"]))
             for fmt in entry["formats"]:
                 value = fmt["value"]
 
@@ -315,13 +293,14 @@ class YnlDocGenerator:
                 for attr in ['fixed-header', 'attribute-set']:
                     if attr in fmt:
                         lines.append(self.fmt.rst_fields(attr,
-                                                self.fmt.rst_ref(namespace, attr, fmt[attr]),
-                                                1))
+                                                         self.fmt.rst_ref(namespace,
+                                                                          attr,
+                                                                          fmt[attr]),
+                                                         1))
                 lines.append("\n")
 
         return "\n".join(lines)
 
-
     def parse_yaml(self, obj: Dict[str, Any]) -> str:
         """Format the whole YAML into a RST string"""
         lines = []
@@ -344,7 +323,8 @@ class YnlDocGenerator:
         # Operations
         if "operations" in obj:
             lines.append(self.fmt.rst_subtitle("Operations"))
-            lines.append(self.parse_operations(obj["operations"]["list"], family))
+            lines.append(self.parse_operations(obj["operations"]["list"],
+                                               family))
 
         # Multicast groups
         if "mcast-groups" in obj:
@@ -368,11 +348,9 @@ class YnlDocGenerator:
 
         return "\n".join(lines)
 
-
     # Main functions
     # ==============
 
-
     def parse_yaml_file(self, filename: str) -> str:
         """Transform the YAML specified by filename into an RST-formatted string"""
         with open(filename, "r", encoding="utf-8") as spec_file:
@@ -381,7 +359,6 @@ class YnlDocGenerator:
 
         return content
 
-
     def generate_main_index_rst(self, output: str, index_dir: str) -> None:
         """Generate the `networking_spec/index` content and write to the file"""
         lines = []
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 22+ messages in thread

* [PATCH v6 07/15] docs: sphinx: add a parser for yaml files for Netlink specs
  2025-06-18 11:46 [PATCH v6 00/15] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (5 preceding siblings ...)
  2025-06-18 11:46 ` [PATCH v6 06/15] tools: ynl_gen_rst.py: cleanup coding style Mauro Carvalho Chehab
@ 2025-06-18 11:46 ` Mauro Carvalho Chehab
  2025-06-18 11:46 ` [PATCH v6 08/15] docs: use parser_yaml extension to handle " Mauro Carvalho Chehab
                   ` (8 subsequent siblings)
  15 siblings, 0 replies; 22+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-18 11:46 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	joel, linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz,
	stern

Add a simple sphinx.Parser to handle yaml files and add the
code to handle Netlink specs. All other yaml files are
ignored.

The code was written in a way that makes it easy to parse yaml
for different subsystems, and even for different parts of Netlink.

All it takes to have a different parser is to add an
import line similar to:

	from netlink_yml_parser import YnlDocGenerator

add the corresponding parser somewhere in the extension:

	netlink_parser = YnlDocGenerator()

And then add logic inside parse() to handle different
doc outputs, depending on the file location, similar to:

        if "/netlink/specs/" in fname:
            msg = self.netlink_parser.parse_yaml_file(fname)
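
For illustration only (no such parser exists in this series), a
hypothetical handler for another yaml flavour could be dispatched
from the same parse() method:

        if "/netlink/specs/" in fname:
            msg = self.netlink_parser.parse_yaml_file(fname)
            self.do_parse(inputstring, document, msg)
        elif "/devicetree/bindings/" in fname:
            # hypothetical second converter, not part of this series
            msg = self.dt_parser.parse_yaml_file(fname)
            self.do_parse(inputstring, document, msg)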

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 Documentation/sphinx/parser_yaml.py | 76 +++++++++++++++++++++++++++++
 1 file changed, 76 insertions(+)
 create mode 100755 Documentation/sphinx/parser_yaml.py

diff --git a/Documentation/sphinx/parser_yaml.py b/Documentation/sphinx/parser_yaml.py
new file mode 100755
index 000000000000..635945e1c5ba
--- /dev/null
+++ b/Documentation/sphinx/parser_yaml.py
@@ -0,0 +1,76 @@
+"""
+Sphinx extension for processing YAML files
+"""
+
+import os
+import re
+import sys
+
+from pprint import pformat
+
+from docutils.parsers.rst import Parser as RSTParser
+from docutils.statemachine import ViewList
+
+from sphinx.util import logging
+from sphinx.parsers import Parser
+
+srctree = os.path.abspath(os.environ["srctree"])
+sys.path.insert(0, os.path.join(srctree, "tools/net/ynl/pyynl"))
+
+from netlink_yml_parser import YnlDocGenerator        # pylint: disable=C0413
+
+logger = logging.getLogger(__name__)
+
+class YamlParser(Parser):
+    """Custom parser for YAML files."""
+
+    # Need at least two elements on this set
+    supported = ('yaml', 'yml')
+
+    netlink_parser = YnlDocGenerator()
+
+    def do_parse(self, inputstring, document, msg):
+        """Parse YAML and generate a document tree."""
+
+        self.setup_parse(inputstring, document)
+
+        result = ViewList()
+
+        try:
+            # Parse message with RSTParser
+            for i, line in enumerate(msg.split('\n')):
+                result.append(line, document.current_source, i)
+
+            rst_parser = RSTParser()
+            rst_parser.parse('\n'.join(result), document)
+
+        except Exception as e:
+            document.reporter.error("YAML parsing error: %s" % pformat(e))
+
+        self.finish_parse()
+
+    # Overrides docutils.parsers.Parser. See sphinx.parsers.RSTParser
+    def parse(self, inputstring, document):
+        """Check if a YAML is meant to be parsed."""
+
+        fname = document.current_source
+
+        # Handle netlink yaml specs
+        if "/netlink/specs/" in fname:
+            msg = self.netlink_parser.parse_yaml_file(fname)
+            self.do_parse(inputstring, document, msg)
+
+        # All other yaml files are ignored
+
+def setup(app):
+    """Setup function for the Sphinx extension."""
+
+    # Add YAML parser
+    app.add_source_parser(YamlParser)
+    app.add_source_suffix('.yaml', 'yaml')
+
+    return {
+        'version': '1.0',
+        'parallel_read_safe': True,
+        'parallel_write_safe': True,
+    }
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 22+ messages in thread

* [PATCH v6 08/15] docs: use parser_yaml extension to handle Netlink specs
  2025-06-18 11:46 [PATCH v6 00/15] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (6 preceding siblings ...)
  2025-06-18 11:46 ` [PATCH v6 07/15] docs: sphinx: add a parser for yaml files for Netlink specs Mauro Carvalho Chehab
@ 2025-06-18 11:46 ` Mauro Carvalho Chehab
  2025-06-18 11:46 ` [PATCH v6 09/15] docs: uapi: netlink: update netlink specs link Mauro Carvalho Chehab
                   ` (7 subsequent siblings)
  15 siblings, 0 replies; 22+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-18 11:46 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Instead of manually calling ynl_gen_rst.py, use a Sphinx extension.
This way, no .rst files are written to the Kernel source
directories.

We use a toctree with the :glob: property here. This way, there
is no need to touch the netlink/specs/index.rst file every time
a new Netlink spec is added/renamed/removed.
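
For reference, the glob-based index added earlier in the series is
expected to look roughly like this (a sketch, not the literal file
contents):

	.. _specs:

	Netlink Family Specifications
	=============================

	.. toctree::
	   :maxdepth: 1
	   :glob:

	   *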

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
Reviewed-by: Donald Hunter <donald.hunter@gmail.com>
---
 Documentation/Makefile                        | 17 ----------------
 Documentation/conf.py                         | 20 ++++++++++++++-----
 Documentation/networking/index.rst            |  2 +-
 .../networking/netlink_spec/readme.txt        |  4 ----
 Documentation/sphinx/parser_yaml.py           |  4 ++--
 5 files changed, 18 insertions(+), 29 deletions(-)
 delete mode 100644 Documentation/networking/netlink_spec/readme.txt

diff --git a/Documentation/Makefile b/Documentation/Makefile
index b98477df5ddf..820f07e0afe6 100644
--- a/Documentation/Makefile
+++ b/Documentation/Makefile
@@ -104,22 +104,6 @@ quiet_cmd_sphinx = SPHINX  $@ --> file://$(abspath $(BUILDDIR)/$3/$4)
 		cp $(if $(patsubst /%,,$(DOCS_CSS)),$(abspath $(srctree)/$(DOCS_CSS)),$(DOCS_CSS)) $(BUILDDIR)/$3/_static/; \
 	fi
 
-YNL_INDEX:=$(srctree)/Documentation/networking/netlink_spec/index.rst
-YNL_RST_DIR:=$(srctree)/Documentation/networking/netlink_spec
-YNL_YAML_DIR:=$(srctree)/Documentation/netlink/specs
-YNL_TOOL:=$(srctree)/tools/net/ynl/pyynl/ynl_gen_rst.py
-
-YNL_RST_FILES_TMP := $(patsubst %.yaml,%.rst,$(wildcard $(YNL_YAML_DIR)/*.yaml))
-YNL_RST_FILES := $(patsubst $(YNL_YAML_DIR)%,$(YNL_RST_DIR)%, $(YNL_RST_FILES_TMP))
-
-$(YNL_INDEX): $(YNL_RST_FILES)
-	$(Q)$(YNL_TOOL) -o $@ -x
-
-$(YNL_RST_DIR)/%.rst: $(YNL_YAML_DIR)/%.yaml $(YNL_TOOL)
-	$(Q)$(YNL_TOOL) -i $< -o $@
-
-htmldocs texinfodocs latexdocs epubdocs xmldocs: $(YNL_INDEX)
-
 htmldocs:
 	@$(srctree)/scripts/sphinx-pre-install --version-check
 	@+$(foreach var,$(SPHINXDIRS),$(call loop_cmd,sphinx,html,$(var),,$(var)))
@@ -186,7 +170,6 @@ refcheckdocs:
 	$(Q)cd $(srctree);scripts/documentation-file-ref-check
 
 cleandocs:
-	$(Q)rm -f $(YNL_INDEX) $(YNL_RST_FILES)
 	$(Q)rm -rf $(BUILDDIR)
 	$(Q)$(MAKE) BUILDDIR=$(abspath $(BUILDDIR)) $(build)=Documentation/userspace-api/media clean
 
diff --git a/Documentation/conf.py b/Documentation/conf.py
index 4ba4ee45e599..6af61e26cec5 100644
--- a/Documentation/conf.py
+++ b/Documentation/conf.py
@@ -38,6 +38,15 @@ exclude_patterns = []
 dyn_include_patterns = []
 dyn_exclude_patterns = ['output']
 
+# Currently, only netlink/specs has a parser for yaml.
+# Prefer using include patterns if available, as it is faster
+if has_include_patterns:
+    dyn_include_patterns.append('netlink/specs/*.yaml')
+else:
+    dyn_exclude_patterns.append('netlink/*.yaml')
+    dyn_exclude_patterns.append('devicetree/bindings/**.yaml')
+    dyn_exclude_patterns.append('core-api/kho/bindings/**.yaml')
+
 # Properly handle include/exclude patterns
 # ----------------------------------------
 
@@ -105,7 +114,7 @@ needs_sphinx = '3.4.3'
 extensions = ['kerneldoc', 'rstFlatTable', 'kernel_include',
               'kfigure', 'sphinx.ext.ifconfig', 'automarkup',
               'maintainers_include', 'sphinx.ext.autosectionlabel',
-              'kernel_abi', 'kernel_feat', 'translations']
+              'kernel_abi', 'kernel_feat', 'translations', 'parser_yaml']
 
 # Since Sphinx version 3, the C function parser is more pedantic with regards
 # to type checking. Due to that, having macros at c:function cause problems.
@@ -203,10 +212,11 @@ else:
 # Add any paths that contain templates here, relative to this directory.
 templates_path = ['sphinx/templates']
 
-# The suffix(es) of source filenames.
-# You can specify multiple suffix as a list of string:
-# source_suffix = ['.rst', '.md']
-source_suffix = '.rst'
+# The suffixes of source filenames that will be automatically parsed
+source_suffix = {
+        '.rst': 'restructuredtext',
+        '.yaml': 'yaml',
+}
 
 # The encoding of source files.
 #source_encoding = 'utf-8-sig'
diff --git a/Documentation/networking/index.rst b/Documentation/networking/index.rst
index ac90b82f3ce9..b7a4969e9bc9 100644
--- a/Documentation/networking/index.rst
+++ b/Documentation/networking/index.rst
@@ -57,7 +57,7 @@ Contents:
    filter
    generic-hdlc
    generic_netlink
-   netlink_spec/index
+   ../netlink/specs/index
    gen_stats
    gtp
    ila
diff --git a/Documentation/networking/netlink_spec/readme.txt b/Documentation/networking/netlink_spec/readme.txt
deleted file mode 100644
index 030b44aca4e6..000000000000
--- a/Documentation/networking/netlink_spec/readme.txt
+++ /dev/null
@@ -1,4 +0,0 @@
-SPDX-License-Identifier: GPL-2.0
-
-This file is populated during the build of the documentation (htmldocs) by the
-tools/net/ynl/pyynl/ynl_gen_rst.py script.
diff --git a/Documentation/sphinx/parser_yaml.py b/Documentation/sphinx/parser_yaml.py
index 635945e1c5ba..2b2af239a1c2 100755
--- a/Documentation/sphinx/parser_yaml.py
+++ b/Documentation/sphinx/parser_yaml.py
@@ -15,9 +15,9 @@ from sphinx.util import logging
 from sphinx.parsers import Parser
 
 srctree = os.path.abspath(os.environ["srctree"])
-sys.path.insert(0, os.path.join(srctree, "tools/net/ynl/pyynl"))
+sys.path.insert(0, os.path.join(srctree, "tools/net/ynl/pyynl/lib"))
 
-from netlink_yml_parser import YnlDocGenerator        # pylint: disable=C0413
+from doc_generator import YnlDocGenerator        # pylint: disable=C0413
 
 logger = logging.getLogger(__name__)
 
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 22+ messages in thread

* [PATCH v6 09/15] docs: uapi: netlink: update netlink specs link
  2025-06-18 11:46 [PATCH v6 00/15] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (7 preceding siblings ...)
  2025-06-18 11:46 ` [PATCH v6 08/15] docs: use parser_yaml extension to handle " Mauro Carvalho Chehab
@ 2025-06-18 11:46 ` Mauro Carvalho Chehab
  2025-06-18 11:46 ` [PATCH v6 10/15] tools: ynl_gen_rst.py: drop support for generating index files Mauro Carvalho Chehab
                   ` (6 subsequent siblings)
  15 siblings, 0 replies; 22+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-18 11:46 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

With the recent parser_yaml extension, and the removal of the
auto-generated ReST source files, the location of the netlink
specs changed.

Update uAPI accordingly.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
Reviewed-by: Donald Hunter <donald.hunter@gmail.com>
---
 Documentation/userspace-api/netlink/index.rst | 2 +-
 Documentation/userspace-api/netlink/specs.rst | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/Documentation/userspace-api/netlink/index.rst b/Documentation/userspace-api/netlink/index.rst
index c1b6765cc963..83ae25066591 100644
--- a/Documentation/userspace-api/netlink/index.rst
+++ b/Documentation/userspace-api/netlink/index.rst
@@ -18,4 +18,4 @@ Netlink documentation for users.
 
 See also:
  - :ref:`Documentation/core-api/netlink.rst <kernel_netlink>`
- - :ref:`Documentation/networking/netlink_spec/index.rst <specs>`
+ - :ref:`Documentation/netlink/specs/index.rst <specs>`
diff --git a/Documentation/userspace-api/netlink/specs.rst b/Documentation/userspace-api/netlink/specs.rst
index 1b50d97d8d7c..debb4bfca5c4 100644
--- a/Documentation/userspace-api/netlink/specs.rst
+++ b/Documentation/userspace-api/netlink/specs.rst
@@ -15,7 +15,7 @@ kernel headers directly.
 Internally kernel uses the YAML specs to generate:
 
  - the C uAPI header
- - documentation of the protocol as a ReST file - see :ref:`Documentation/networking/netlink_spec/index.rst <specs>`
+ - documentation of the protocol as a ReST file - see :ref:`Documentation/netlink/specs/index.rst <specs>`
  - policy tables for input attribute validation
  - operation tables
 
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 22+ messages in thread

* [PATCH v6 10/15] tools: ynl_gen_rst.py: drop support for generating index files
  2025-06-18 11:46 [PATCH v6 00/15] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (8 preceding siblings ...)
  2025-06-18 11:46 ` [PATCH v6 09/15] docs: uapi: netlink: update netlink specs link Mauro Carvalho Chehab
@ 2025-06-18 11:46 ` Mauro Carvalho Chehab
  2025-06-18 11:46 ` [PATCH v6 11/15] docs: netlink: remove obsolete .gitignore from unused directory Mauro Carvalho Chehab
                   ` (5 subsequent siblings)
  15 siblings, 0 replies; 22+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-18 11:46 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

As we're now using an index file with a glob, there's no need
to generate index files anymore.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 tools/net/ynl/pyynl/lib/doc_generator.py | 26 ------------------------
 tools/net/ynl/pyynl/ynl_gen_rst.py       | 26 ------------------------
 2 files changed, 52 deletions(-)

diff --git a/tools/net/ynl/pyynl/lib/doc_generator.py b/tools/net/ynl/pyynl/lib/doc_generator.py
index f71360f0ceb7..866551726723 100644
--- a/tools/net/ynl/pyynl/lib/doc_generator.py
+++ b/tools/net/ynl/pyynl/lib/doc_generator.py
@@ -358,29 +358,3 @@ class YnlDocGenerator:
             content = self.parse_yaml(yaml_data)
 
         return content
-
-    def generate_main_index_rst(self, output: str, index_dir: str) -> None:
-        """Generate the `networking_spec/index` content and write to the file"""
-        lines = []
-
-        lines.append(self.fmt.rst_header())
-        lines.append(self.fmt.rst_label("specs"))
-        lines.append(self.fmt.rst_title("Netlink Family Specifications"))
-        lines.append(self.fmt.rst_toctree(1))
-
-        index_fname = os.path.basename(output)
-        base, ext = os.path.splitext(index_fname)
-
-        if not index_dir:
-            index_dir = os.path.dirname(output)
-
-        logging.debug(f"Looking for {ext} files in %s", index_dir)
-        for filename in sorted(os.listdir(index_dir)):
-            if not filename.endswith(ext) or filename == index_fname:
-                continue
-            base, ext = os.path.splitext(filename)
-            lines.append(f"   {base}\n")
-
-        logging.debug("Writing an index file at %s", output)
-
-        return "".join(lines)
diff --git a/tools/net/ynl/pyynl/ynl_gen_rst.py b/tools/net/ynl/pyynl/ynl_gen_rst.py
index b5a665eeaa5a..90ae19aac89d 100755
--- a/tools/net/ynl/pyynl/ynl_gen_rst.py
+++ b/tools/net/ynl/pyynl/ynl_gen_rst.py
@@ -31,9 +31,6 @@ def parse_arguments() -> argparse.Namespace:
 
     # Index and input are mutually exclusive
     group = parser.add_mutually_exclusive_group()
-    group.add_argument(
-        "-x", "--index", action="store_true", help="Generate the index page"
-    )
     group.add_argument("-i", "--input", help="YAML file name")
 
     args = parser.parse_args()
@@ -63,25 +60,6 @@ def write_to_rstfile(content: str, filename: str) -> None:
         rst_file.write(content)
 
 
-def generate_main_index_rst(output: str) -> None:
-    """Generate the `networking_spec/index` content and write to the file"""
-
-    lines.append(rst_header())
-    lines.append(rst_label("specs"))
-    lines.append(rst_title("Netlink Family Specifications"))
-    lines.append(rst_toctree(1))
-
-    index_dir = os.path.dirname(output)
-    logging.debug("Looking for .rst files in %s", index_dir)
-    for filename in sorted(os.listdir(index_dir)):
-        if not filename.endswith(".rst") or filename == "index.rst":
-            continue
-        lines.append(f"   {filename.replace('.rst', '')}\n")
-
-    logging.debug("Writing an index file at %s", output)
-    write_to_rstfile(msg, output)
-
-
 def main() -> None:
     """Main function that reads the YAML files and generates the RST files"""
 
@@ -100,10 +78,6 @@ def main() -> None:
 
         write_to_rstfile(content, args.output)
 
-    if args.index:
-        # Generate the index RST file
-        generate_main_index_rst(args.output)
-
 
 if __name__ == "__main__":
     main()
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 22+ messages in thread

* [PATCH v6 11/15] docs: netlink: remove obsolete .gitignore from unused directory
  2025-06-18 11:46 [PATCH v6 00/15] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (9 preceding siblings ...)
  2025-06-18 11:46 ` [PATCH v6 10/15] tools: ynl_gen_rst.py: drop support for generating index files Mauro Carvalho Chehab
@ 2025-06-18 11:46 ` Mauro Carvalho Chehab
  2025-06-18 11:46 ` [PATCH v6 12/15] MAINTAINERS: add netlink_yml_parser.py to linux-doc Mauro Carvalho Chehab
                   ` (4 subsequent siblings)
  15 siblings, 0 replies; 22+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-18 11:46 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

The previous code was generating ReST source files
under Documentation/networking/netlink_spec/. With the
Sphinx YAML parser, this is now gone. So, stop ignoring
*.rst files inside the netlink specs directory.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
Reviewed-by: Donald Hunter <donald.hunter@gmail.com>
---
 Documentation/networking/netlink_spec/.gitignore | 1 -
 1 file changed, 1 deletion(-)
 delete mode 100644 Documentation/networking/netlink_spec/.gitignore

diff --git a/Documentation/networking/netlink_spec/.gitignore b/Documentation/networking/netlink_spec/.gitignore
deleted file mode 100644
index 30d85567b592..000000000000
--- a/Documentation/networking/netlink_spec/.gitignore
+++ /dev/null
@@ -1 +0,0 @@
-*.rst
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 22+ messages in thread

* [PATCH v6 12/15] MAINTAINERS: add netlink_yml_parser.py to linux-doc
  2025-06-18 11:46 [PATCH v6 00/15] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (10 preceding siblings ...)
  2025-06-18 11:46 ` [PATCH v6 11/15] docs: netlink: remove obsolete .gitignore from unused directory Mauro Carvalho Chehab
@ 2025-06-18 11:46 ` Mauro Carvalho Chehab
  2025-06-18 11:46 ` [PATCH v6 13/15] tools: netlink_yml_parser.py: add line numbers to parsed data Mauro Carvalho Chehab
                   ` (3 subsequent siblings)
  15 siblings, 0 replies; 22+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-18 11:46 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	joel, linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz,
	stern

The documentation build depends on the parsing code
at ynl_gen_rst.py. Ensure that changes to it will be Cc:ed
to the linux-doc ML and maintainers by adding an entry for
it. This way, if a change there affects the build, or the
minimal Python version required, doc developers will know
in advance.

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
Reviewed-by: Breno Leitao <leitao@debian.org>
---
 MAINTAINERS | 1 +
 1 file changed, 1 insertion(+)

diff --git a/MAINTAINERS b/MAINTAINERS
index a92290fffa16..caa3425e5755 100644
--- a/MAINTAINERS
+++ b/MAINTAINERS
@@ -7202,6 +7202,7 @@ F:	scripts/get_abi.py
 F:	scripts/kernel-doc*
 F:	scripts/lib/abi/*
 F:	scripts/lib/kdoc/*
+F:	tools/net/ynl/pyynl/netlink_yml_parser.py
 F:	scripts/sphinx-pre-install
 X:	Documentation/ABI/
 X:	Documentation/admin-guide/media/
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 22+ messages in thread

* [PATCH v6 13/15] tools: netlink_yml_parser.py: add line numbers to parsed data
  2025-06-18 11:46 [PATCH v6 00/15] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (11 preceding siblings ...)
  2025-06-18 11:46 ` [PATCH v6 12/15] MAINTAINERS: add netlink_yml_parser.py to linux-doc Mauro Carvalho Chehab
@ 2025-06-18 11:46 ` Mauro Carvalho Chehab
  2025-06-18 11:46 ` [PATCH v6 14/15] docs: parser_yaml.py: add support for line numbers from the parser Mauro Carvalho Chehab
                   ` (2 subsequent siblings)
  15 siblings, 0 replies; 22+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-18 11:46 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

When something goes wrong, we want the Sphinx error to point to
the right line number in the original source, not in the
processed ReST data.
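
A minimal standalone sketch of what the loader below does (the
sample YAML string is made up for illustration):

	import yaml

	LINE_STR = '__lineno__'

	class NumberedSafeLoader(yaml.SafeLoader):
	    """Record the starting line of each parsed mapping"""
	    def construct_mapping(self, node):
	        mapping = super().construct_mapping(node)
	        mapping[LINE_STR] = node.start_mark.line
	        return mapping

	data = yaml.load("name: dpll\noperations:\n  list: []\n",
	                 Loader=NumberedSafeLoader)
	# data[LINE_STR] == 0; data['operations'][LINE_STR] == 2 (0-based)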

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 tools/net/ynl/pyynl/lib/doc_generator.py | 34 ++++++++++++++++++++++--
 1 file changed, 32 insertions(+), 2 deletions(-)

diff --git a/tools/net/ynl/pyynl/lib/doc_generator.py b/tools/net/ynl/pyynl/lib/doc_generator.py
index 866551726723..a9d8ab6f2639 100644
--- a/tools/net/ynl/pyynl/lib/doc_generator.py
+++ b/tools/net/ynl/pyynl/lib/doc_generator.py
@@ -20,6 +20,16 @@
 from typing import Any, Dict, List
 import yaml
 
+LINE_STR = '__lineno__'
+
+class NumberedSafeLoader(yaml.SafeLoader):
+    """Override the SafeLoader class to add line number to parsed data"""
+
+    def construct_mapping(self, node):
+        mapping = super().construct_mapping(node)
+        mapping[LINE_STR] = node.start_mark.line
+
+        return mapping
 
 class RstFormatters:
     """RST Formatters"""
@@ -127,6 +137,11 @@ class RstFormatters:
         """Return a formatted label"""
         return f".. _{title}:\n\n"
 
+    @staticmethod
+    def rst_lineno(lineno: int) -> str:
+        """Return a lineno comment"""
+        return f".. LINENO {lineno}\n"
+
 class YnlDocGenerator:
     """YAML Netlink specs Parser"""
 
@@ -144,6 +159,9 @@ class YnlDocGenerator:
         """Parse 'do' section and return a formatted string"""
         lines = []
         for key in do_dict.keys():
+            if key == LINE_STR:
+                lines.append(self.fmt.rst_lineno(do_dict[key]))
+                continue
             lines.append(self.fmt.rst_paragraph(self.fmt.bold(key), level + 1))
             if key in ['request', 'reply']:
                 lines.append(self.parse_do_attributes(do_dict[key], level + 1) + "\n")
@@ -174,6 +192,10 @@ class YnlDocGenerator:
             lines.append(self.fmt.rst_paragraph(operation["doc"]) + "\n")
 
             for key in operation.keys():
+                if key == LINE_STR:
+                    lines.append(self.fmt.rst_lineno(operation[key]))
+                    continue
+
                 if key in preprocessed:
                     # Skip the special fields
                     continue
@@ -233,6 +255,9 @@ class YnlDocGenerator:
         for definition in defs:
             lines.append(self.fmt.rst_section(namespace, 'definition', definition["name"]))
             for k in definition.keys():
+                if k == LINE_STR:
+                    lines.append(self.fmt.rst_lineno(definition[k]))
+                    continue
                 if k in preprocessed + ignored:
                     continue
                 lines.append(self.fmt.rst_fields(k, self.fmt.sanitize(definition[k]), 0))
@@ -268,6 +293,9 @@ class YnlDocGenerator:
                 lines.append(self.fmt.rst_subsubsection(attr_line))
 
                 for k in attr.keys():
+                    if k == LINE_STR:
+                        lines.append(self.fmt.rst_lineno(attr[k]))
+                        continue
                     if k in preprocessed + ignored:
                         continue
                     if k in linkable:
@@ -306,6 +334,8 @@ class YnlDocGenerator:
         lines = []
 
         # Main header
+        lineno = obj.get('__lineno__', 0)
+        lines.append(self.fmt.rst_lineno(lineno))
 
         family = obj['name']
 
@@ -354,7 +384,7 @@ class YnlDocGenerator:
     def parse_yaml_file(self, filename: str) -> str:
         """Transform the YAML specified by filename into an RST-formatted string"""
         with open(filename, "r", encoding="utf-8") as spec_file:
-            yaml_data = yaml.safe_load(spec_file)
-            content = self.parse_yaml(yaml_data)
+            numbered_yaml = yaml.load(spec_file, Loader=NumberedSafeLoader)
+            content = self.parse_yaml(numbered_yaml)
 
         return content
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 22+ messages in thread

* [PATCH v6 14/15] docs: parser_yaml.py: add support for line numbers from the parser
  2025-06-18 11:46 [PATCH v6 00/15] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (12 preceding siblings ...)
  2025-06-18 11:46 ` [PATCH v6 13/15] tools: netlink_yml_parser.py: add line numbers to parsed data Mauro Carvalho Chehab
@ 2025-06-18 11:46 ` Mauro Carvalho Chehab
  2025-06-18 11:46 ` [PATCH v6 15/15] docs: sphinx: add a file with the requirements for lowest version Mauro Carvalho Chehab
  2025-06-18 15:46 ` [PATCH v6 00/15] Don't generate netlink .rst files inside $(srctree) Akira Yokosawa
  15 siblings, 0 replies; 22+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-18 11:46 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Instead of reporting line numbers from the intermediate,
converted ReST content, get them from the original source.
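
A standalone sketch of the offset-tracking logic that do_parse()
gains below (the generated ReST sample is illustrative):

	import re

	re_lineno = re.compile(r"\.\. LINENO ([0-9]+)$")

	msg = ".. LINENO 12\nsome generated ReST\nmore generated ReST"
	lineoffset = 0
	mapped = []
	for line in msg.split('\n'):
	    match = re_lineno.match(line)
	    if match:
	        # remember the source line, drop the marker itself
	        lineoffset = int(match.group(1))
	        continue
	    mapped.append((line, lineoffset))
	# both remaining lines are now attributed to line 12 of the YAML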

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 Documentation/sphinx/parser_yaml.py      | 12 ++++++++++--
 tools/net/ynl/pyynl/lib/doc_generator.py | 16 ++++++++++++----
 2 files changed, 22 insertions(+), 6 deletions(-)

diff --git a/Documentation/sphinx/parser_yaml.py b/Documentation/sphinx/parser_yaml.py
index 2b2af239a1c2..5360fcfd4fde 100755
--- a/Documentation/sphinx/parser_yaml.py
+++ b/Documentation/sphinx/parser_yaml.py
@@ -29,6 +29,8 @@ class YamlParser(Parser):
 
     netlink_parser = YnlDocGenerator()
 
+    re_lineno = re.compile(r"\.\. LINENO ([0-9]+)$")
+
     def do_parse(self, inputstring, document, msg):
         """Parse YAML and generate a document tree."""
 
@@ -38,8 +40,14 @@ class YamlParser(Parser):
 
         try:
             # Parse message with RSTParser
-            for i, line in enumerate(msg.split('\n')):
-                result.append(line, document.current_source, i)
+            lineoffset = 0;
+            for line in msg.split('\n'):
+                match = self.re_lineno.match(line)
+                if match:
+                    lineoffset = int(match.group(1))
+                    continue
+
+                result.append(line, document.current_source, lineoffset)
 
             rst_parser = RSTParser()
             rst_parser.parse('\n'.join(result), document)
diff --git a/tools/net/ynl/pyynl/lib/doc_generator.py b/tools/net/ynl/pyynl/lib/doc_generator.py
index a9d8ab6f2639..7f4f98983cdf 100644
--- a/tools/net/ynl/pyynl/lib/doc_generator.py
+++ b/tools/net/ynl/pyynl/lib/doc_generator.py
@@ -158,9 +158,11 @@ class YnlDocGenerator:
     def parse_do(self, do_dict: Dict[str, Any], level: int = 0) -> str:
         """Parse 'do' section and return a formatted string"""
         lines = []
+        if LINE_STR in do_dict:
+            lines.append(self.fmt.rst_lineno(do_dict[LINE_STR]))
+
         for key in do_dict.keys():
             if key == LINE_STR:
-                lines.append(self.fmt.rst_lineno(do_dict[key]))
                 continue
             lines.append(self.fmt.rst_paragraph(self.fmt.bold(key), level + 1))
             if key in ['request', 'reply']:
@@ -187,13 +189,15 @@ class YnlDocGenerator:
         lines = []
 
         for operation in operations:
+            if LINE_STR in operation:
+                lines.append(self.fmt.rst_lineno(operation[LINE_STR]))
+
             lines.append(self.fmt.rst_section(namespace, 'operation',
                                               operation["name"]))
             lines.append(self.fmt.rst_paragraph(operation["doc"]) + "\n")
 
             for key in operation.keys():
                 if key == LINE_STR:
-                    lines.append(self.fmt.rst_lineno(operation[key]))
                     continue
 
                 if key in preprocessed:
@@ -253,10 +257,12 @@ class YnlDocGenerator:
         lines = []
 
         for definition in defs:
+            if LINE_STR in definition:
+                lines.append(self.fmt.rst_lineno(definition[LINE_STR]))
+
             lines.append(self.fmt.rst_section(namespace, 'definition', definition["name"]))
             for k in definition.keys():
                 if k == LINE_STR:
-                    lines.append(self.fmt.rst_lineno(definition[k]))
                     continue
                 if k in preprocessed + ignored:
                     continue
@@ -284,6 +290,9 @@ class YnlDocGenerator:
             lines.append(self.fmt.rst_section(namespace, 'attribute-set',
                                               entry["name"]))
             for attr in entry["attributes"]:
+                if LINE_STR in attr:
+                    lines.append(self.fmt.rst_lineno(attr[LINE_STR]))
+
                 type_ = attr.get("type")
                 attr_line = attr["name"]
                 if type_:
@@ -294,7 +303,6 @@ class YnlDocGenerator:
 
                 for k in attr.keys():
                     if k == LINE_STR:
-                        lines.append(self.fmt.rst_lineno(attr[k]))
                         continue
                     if k in preprocessed + ignored:
                         continue
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 22+ messages in thread

* [PATCH v6 15/15] docs: sphinx: add a file with the requirements for lowest version
  2025-06-18 11:46 [PATCH v6 00/15] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (13 preceding siblings ...)
  2025-06-18 11:46 ` [PATCH v6 14/15] docs: parser_yaml.py: add support for line numbers from the parser Mauro Carvalho Chehab
@ 2025-06-18 11:46 ` Mauro Carvalho Chehab
  2025-06-18 15:46 ` [PATCH v6 00/15] Don't generate netlink .rst files inside $(srctree) Akira Yokosawa
  15 siblings, 0 replies; 22+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-18 11:46 UTC (permalink / raw)
  To: Linux Doc Mailing List, Jonathan Corbet
  Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
	Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
	Randy Dunlap, joel, linux-kernel-mentees, linux-kernel, lkmm,
	netdev, peterz, stern

These days, it is hard to install a virtual env that builds
docs with Sphinx 3.4.3, as even Python 3.13 is no longer
compatible with it. With an older Python (here 3.9), a working
setup can still be created with:

	/usr/bin/python3.9 -m venv sphinx_3.4.3
	. sphinx_3.4.3/bin/activate
	pip install -r Documentation/sphinx/min_requirements.txt
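
Once the venv is active, the pinned versions can be double-checked
with, e.g. (commands for illustration only):

	pip freeze | grep -Ei 'sphinx|docutils|jinja'
	python -c 'import sphinx; print(sphinx.__version__)'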

Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
 Documentation/doc-guide/sphinx.rst        | 15 +++++++++++++++
 Documentation/sphinx/min_requirements.txt |  8 ++++++++
 2 files changed, 23 insertions(+)
 create mode 100644 Documentation/sphinx/min_requirements.txt

diff --git a/Documentation/doc-guide/sphinx.rst b/Documentation/doc-guide/sphinx.rst
index 5a91df105141..13943eb532ac 100644
--- a/Documentation/doc-guide/sphinx.rst
+++ b/Documentation/doc-guide/sphinx.rst
@@ -131,6 +131,21 @@ It supports two optional parameters:
 ``--no-virtualenv``
 	Use OS packaging for Sphinx instead of Python virtual environment.
 
+Installing Sphinx Minimal Version
+---------------------------------
+
+When changing the Sphinx build system, it is important to ensure
+that the minimal version is still supported. Nowadays, it is
+becoming harder to do that on modern distributions, as it is not
+possible to install it with Python 3.13 and above.
+
+The recommended way is to use the lowest supported Python version
+as defined at Documentation/process/changes.rst, creating
+a venv with it, and installing the minimal requirements with::
+
+	/usr/bin/python3.9 -m venv sphinx_min
+	. sphinx_min/bin/activate
+	pip install -r Documentation/sphinx/min_requirements.txt
 
 Sphinx Build
 ============
diff --git a/Documentation/sphinx/min_requirements.txt b/Documentation/sphinx/min_requirements.txt
new file mode 100644
index 000000000000..89ea36d5798f
--- /dev/null
+++ b/Documentation/sphinx/min_requirements.txt
@@ -0,0 +1,8 @@
+Sphinx==3.4.3
+jinja2<3.1
+docutils<0.18
+sphinxcontrib-applehelp==1.0.4
+sphinxcontrib-devhelp==1.0.2
+sphinxcontrib-htmlhelp==2.0.1
+sphinxcontrib-qthelp==1.0.3
+sphinxcontrib-serializinghtml==1.1.5
-- 
2.49.0


^ permalink raw reply related	[flat|nested] 22+ messages in thread

* Re: [PATCH v6 01/15] docs: conf.py: properly handle include and exclude patterns
  2025-06-18 11:46 ` [PATCH v6 01/15] docs: conf.py: properly handle include and exclude patterns Mauro Carvalho Chehab
@ 2025-06-18 15:42   ` Breno Leitao
  2025-06-18 19:54     ` Mauro Carvalho Chehab
  0 siblings, 1 reply; 22+ messages in thread
From: Breno Leitao @ 2025-06-18 15:42 UTC (permalink / raw)
  To: Mauro Carvalho Chehab
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

On Wed, Jun 18, 2025 at 01:46:28PM +0200, Mauro Carvalho Chehab wrote:
> When one does:
> 	make SPHINXDIRS="foo" htmldocs
> 
> All patterns would be relative to Documentation/foo, which
> causes the include/exclude patterns like:
> 
> 	include_patterns = [
> 		...
> 		f'foo/*.{ext}',
> 	]
> 
> to break. This is not what it is expected. Address it by
> adding a logic to dynamically adjust the pattern when
> SPHINXDIRS is used.
> 
> That allows adding parsers for other file types.
> 
> It should be noticed that include_patterns was added on
> Sphinx 5.1:
> 	https://www.sphinx-doc.org/en/master/usage/configuration.html#confval-include_patterns
> 
> So, a backward-compatible code is needed when we start
> using it for real.
> 
> Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
> Reviewed-by: Donald Hunter <donald.hunter@gmail.com>
> ---
>  Documentation/conf.py | 67 ++++++++++++++++++++++++++++++++++++++++---
>  1 file changed, 63 insertions(+), 4 deletions(-)
> 
> diff --git a/Documentation/conf.py b/Documentation/conf.py
> index 12de52a2b17e..4ba4ee45e599 100644
> --- a/Documentation/conf.py
> +++ b/Documentation/conf.py
> @@ -17,6 +17,66 @@ import os
>  import sphinx
>  import shutil
>  
> +# Get Sphinx version
> +major, minor, patch = sphinx.version_info[:3]
> +
> +# Include_patterns were added on Sphinx 5.1
> +if (major < 5) or (major == 5 and minor < 1):
> +    has_include_patterns = False
> +else:
> +    has_include_patterns = True
> +    # Include patterns that don't contain directory names, in glob format
> +    include_patterns = ['**.rst']
> +
> +# Location of Documentation/ directory
> +doctree = os.path.abspath('.')
> +
> +# Exclude of patterns that don't contain directory names, in glob format.
> +exclude_patterns = []
> +
> +# List of patterns that contain directory names in glob format.
> +dyn_include_patterns = []
> +dyn_exclude_patterns = ['output']
> +
> +# Properly handle include/exclude patterns
> +# ----------------------------------------
> +
> +def update_patterns(app):
> +

PEP-257 says we don't want a blank line before the docstring:

https://peps.python.org/pep-0257/#multi-line-docstrings
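
For reference, the layout PEP-257 asks for is (the docstring text
below is just illustrative):

	def update_patterns(app):
	    """Adjust the dynamic include/exclude patterns."""
	    ...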

^ permalink raw reply	[flat|nested] 22+ messages in thread

* Re: [PATCH v6 00/15] Don't generate netlink .rst files inside $(srctree)
  2025-06-18 11:46 [PATCH v6 00/15] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
                   ` (14 preceding siblings ...)
  2025-06-18 11:46 ` [PATCH v6 15/15] docs: sphinx: add a file with the requirements for lowest version Mauro Carvalho Chehab
@ 2025-06-18 15:46 ` Akira Yokosawa
  2025-06-18 16:20   ` Mauro Carvalho Chehab
  15 siblings, 1 reply; 22+ messages in thread
From: Akira Yokosawa @ 2025-06-18 15:46 UTC (permalink / raw)
  To: Mauro Carvalho Chehab, Linux Doc Mailing List, Jonathan Corbet
  Cc: linux-kernel, David S. Miller, Ignacio Encinas Rubio, Marco Elver,
	Shuah Khan, Donald Hunter, Eric Dumazet, Jan Stancek, Paolo Abeni,
	Ruben Wauters, joel, linux-kernel-mentees, lkmm, netdev, peterz,
	stern, Breno Leitao, Randy Dunlap, Akira Yokosawa

Hi Mauro,

On 2025/06/18 20:46, Mauro Carvalho Chehab wrote:
> As discussed at:
>    https://lore.kernel.org/all/20250610101331.62ba466f@foz.lan/
> 
> changeset f061c9f7d058 ("Documentation: Document each netlink family")
> added a logic which generates *.rst files inside $(srctree). This is bad
> when O=<BUILDDIR> is used.
> 
> A recent change renamed the yaml files used by Netlink, revealing a bad
> side effect: as "make cleandocs" don't clean the produced files and symbols
> appear duplicated for people that don't build the kernel from scratch.
> 
> This series adds an yaml parser extension and uses an index file with glob for
> *. We opted to write such extension in a way that no actual yaml conversion
> code is inside it. This makes it flexible enough to handle other types of yaml
> files in the future. The actual yaml conversion logic were placed at 
> netlink_yml_parser.py. 
> 
> As requested by YNL maintainers, this version has netlink_yml_parser.py
> inside tools/net/ynl/pyynl/ directory. I don't like mixing libraries with
> binaries, nor to have Python libraries spread all over the Kernel. IMO,
> the best is to put all of them on a common place (scripts/lib, python/lib,
> lib/python, ...) but, as this can be solved later, for now let's keep it this
> way.
> 
> ---
> 
> v6:
> - YNL doc parser is now at tools/net/ynl/pyynl/lib/doc_generator.py;
> - two patches got merged;
> - added instructions to test docs with Sphinx 3.4.3 (minimal supported
>   version);
> - minor fixes.

Quick tests against Sphinx 3.4.3 using container images based on
debian:bullseye and almalinux:9, both of which have 3.4.3 as their distro
packages, emit a *bunch* of warnings like the following:

/<srcdir>/Documentation/netlink/specs/conntrack.yaml:: WARNING: YAML parsing error: AttributeError("'Values' object has no attribute 'tab_width'")
/<srcdir>/Documentation/netlink/specs/devlink.yaml:: WARNING: YAML parsing error: AttributeError("'Values' object has no attribute 'tab_width'")
/<srcdir>/Documentation/netlink/specs/dpll.yaml:: WARNING: YAML parsing error: AttributeError("'Values' object has no attribute 'tab_width'")
/<srcdir>/Documentation/netlink/specs/ethtool.yaml:: WARNING: YAML parsing error: AttributeError("'Values' object has no attribute 'tab_width'")
/<srcdir>/Documentation/netlink/specs/fou.yaml:: WARNING: YAML parsing error: AttributeError("'Values' object has no attribute 'tab_width'")
[...]

I suspect there should be a minimal required version of PyYAML.

"pip freeze" based on almalinux:9 says:

    PyYAML==5.4.1

"pip freeze" based on debian:bullseye says:

    PyYAML==5.3.1

What is the minimal required version here?

And if users of those old distros need to manually upgrade PyYAML,
why don't you suggest them to upgrade Sphinx as well?

        Thanks, Akira

>
> v5:
> - some patch reorg;
> - netlink_yml_parser.py is now together with ynl tools;
> - minor fixes.
[...]


^ permalink raw reply	[flat|nested] 22+ messages in thread

* Re: [PATCH v6 00/15] Don't generate netlink .rst files inside $(srctree)
  2025-06-18 15:46 ` [PATCH v6 00/15] Don't generate netlink .rst files inside $(srctree) Akira Yokosawa
@ 2025-06-18 16:20   ` Mauro Carvalho Chehab
  2025-06-19  1:34     ` Akira Yokosawa
  0 siblings, 1 reply; 22+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-18 16:20 UTC (permalink / raw)
  To: Akira Yokosawa, Breno Leitao
  Cc: Linux Doc Mailing List, Jonathan Corbet, linux-kernel,
	David S. Miller, Ignacio Encinas Rubio, Marco Elver, Shuah Khan,
	Donald Hunter, Eric Dumazet, Jan Stancek, Paolo Abeni,
	Ruben Wauters, joel, linux-kernel-mentees, lkmm, netdev, peterz,
	stern, Randy Dunlap

Em Thu, 19 Jun 2025 00:46:15 +0900
Akira Yokosawa <akiyks@gmail.com> escreveu:

> Hi Mauro,
> 
> On 2025/06/18 20:46, Mauro Carvalho Chehab wrote:
> > As discussed at:
> >    https://lore.kernel.org/all/20250610101331.62ba466f@foz.lan/
> > 
> > changeset f061c9f7d058 ("Documentation: Document each netlink family")
> > added a logic which generates *.rst files inside $(srctree). This is bad
> > when O=<BUILDDIR> is used.
> > 
> > A recent change renamed the yaml files used by Netlink, revealing a bad
> > side effect: as "make cleandocs" don't clean the produced files and symbols
> > appear duplicated for people that don't build the kernel from scratch.
> > 
> > This series adds an yaml parser extension and uses an index file with glob for
> > *. We opted to write such extension in a way that no actual yaml conversion
> > code is inside it. This makes it flexible enough to handle other types of yaml
> > files in the future. The actual yaml conversion logic were placed at 
> > netlink_yml_parser.py. 
> > 
> > As requested by YNL maintainers, this version has netlink_yml_parser.py
> > inside tools/net/ynl/pyynl/ directory. I don't like mixing libraries with
> > binaries, nor to have Python libraries spread all over the Kernel. IMO,
> > the best is to put all of them on a common place (scripts/lib, python/lib,
> > lib/python, ...) but, as this can be solved later, for now let's keep it this
> > way.
> > 
> > ---
> > 
> > v6:
> > - YNL doc parser is now at tools/net/ynl/pyynl/lib/doc_generator.py;
> > - two patches got merged;
> > - added instructions to test docs with Sphinx 3.4.3 (minimal supported
> >   version);
> > - minor fixes.  
> 
> Quick tests against Sphinx 3.4.3 using container images based on
> debian:bullseye and almalinux:9, both of which have 3.4.3 as their distro
> packages, emits a *bunch* of warnings like the following:
> 
> /<srcdir>/Documentation/netlink/specs/conntrack.yaml:: WARNING: YAML parsing error: AttributeError("'Values' object has no attribute 'tab_width'")
> /<srcdir>/Documentation/netlink/specs/devlink.yaml:: WARNING: YAML parsing error: AttributeError("'Values' object has no attribute 'tab_width'")
> /<srcdir>/Documentation/netlink/specs/dpll.yaml:: WARNING: YAML parsing error: AttributeError("'Values' object has no attribute 'tab_width'")
> /<srcdir>/Documentation/netlink/specs/ethtool.yaml:: WARNING: YAML parsing error: AttributeError("'Values' object has no attribute 'tab_width'")
> /<srcdir>/Documentation/netlink/specs/fou.yaml:: WARNING: YAML parsing error: AttributeError("'Values' object has no attribute 'tab_width'")
> [...]
> 
> I suspect there should be a minimal required minimal version of PyYAML.

Likely yes. From my side, I didn't change anything related to PyYAML, 
except for adding a loader in the latest patch to add line numbers.

The above warnings don't seem related. So, probably this was already
an issue.

Funny enough, I did, on my venv:

	$ pip install PyYAML==5.1
	$ tools/net/ynl/pyynl/ynl_gen_rst.py -i Documentation/netlink/specs/dpll.yaml -o Documentation/output/netlink/specs/dpll.rst -v
	...
	$ make clean; make SPHINXDIRS="netlink/specs" htmldocs
	...

but didn't get any issue (I have a later version installed outside
the venv - not sure if it will do the right thing).

That's what I have at venv:

----------------------------- ---------
Package                       Version
----------------------------- ---------
alabaster                     0.7.13
babel                         2.17.0
certifi                       2025.6.15
charset-normalizer            3.4.2
docutils                      0.17.1
idna                          3.10
imagesize                     1.4.1
Jinja2                        2.8.1
MarkupSafe                    1.1.1
packaging                     25.0
pip                           25.1.1
Pygments                      2.19.1
PyYAML                        5.1
requests                      2.32.4
setuptools                    80.1.0
snowballstemmer               3.0.1
Sphinx                        3.4.3
sphinxcontrib-applehelp       1.0.4
sphinxcontrib-devhelp         1.0.2
sphinxcontrib-htmlhelp        2.0.1
sphinxcontrib-jsmath          1.0.1
sphinxcontrib-qthelp          1.0.3
sphinxcontrib-serializinghtml 1.1.5
urllib3                       2.4.0
----------------------------- ---------

> "pip freeze" based on almalinux:9 says:
> 
>     PyYAML==5.4.1
> 
> "pip freeze" based on debian:bullseye says:
> 
>     PyYAML==5.3.1
> 
> What is the minimal required version here?

Breno, what's the minimal version? Please update requirements.txt
to ensure that, and modify ./scripts/sphinx-pre-install to
check for it.

> 
> And if users of those old distros need to manually upgrade PyYAML,
> why don't you suggest them to upgrade Sphinx as well?

The criterion we used to define the minimal version for python/sphinx
was that they were released around the end of 2020/beginning of 2021,
i.e. up to ~4 years old. We also double-checked the latest LTS
versions from major distros.

With that, PyYAML 5.4.1 met the ~4 years criterion, and so did 5.3.1
and 5.1.

funny enough:

	$ git grep tab_width Documentation/netlink/

doesn't return anything. Yet, tab_width is used by sphinx
extensions. The in-kernel ones do it the right way using
get:

	tab_width = self.options.get('tab-width',
				     self.state.document.settings.tab_width)

But perhaps some other extension installed in your environment has
issues, or maybe Documentation/sphinx/parser_yaml.py needs to expand
tabs with certain versions of docutils.
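
If it turns out to be the latter, a possible workaround would be
something along these lines inside do_parse() (untested sketch, not
part of this series):

	# Older docutils may not provide tab_width in the settings
	# reaching a custom source parser; fall back to the docutils
	# default of 8 before handing the text to RSTParser.
	if not hasattr(document.settings, 'tab_width'):
	    document.settings.tab_width = 8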

Please compare the versions that you're using on your test
environment with the ones I used here.

Regards,
Mauro

^ permalink raw reply	[flat|nested] 22+ messages in thread

* Re: [PATCH v6 01/15] docs: conf.py: properly handle include and exclude patterns
  2025-06-18 15:42   ` Breno Leitao
@ 2025-06-18 19:54     ` Mauro Carvalho Chehab
  0 siblings, 0 replies; 22+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-18 19:54 UTC (permalink / raw)
  To: Breno Leitao
  Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
	David S. Miller, Donald Hunter, Eric Dumazet,
	Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
	Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
	linux-kernel, lkmm, netdev, peterz, stern

Em Wed, 18 Jun 2025 08:42:51 -0700
Breno Leitao <leitao@debian.org> escreveu:

> On Wed, Jun 18, 2025 at 01:46:28PM +0200, Mauro Carvalho Chehab wrote:
> > When one does:
> > 	make SPHINXDIRS="foo" htmldocs
> > 
> > All patterns would be relative to Documentation/foo, which
> > causes the include/exclude patterns like:
> > 
> > 	include_patterns = [
> > 		...
> > 		f'foo/*.{ext}',
> > 	]
> > 
> > to break. This is not what it is expected. Address it by
> > adding a logic to dynamically adjust the pattern when
> > SPHINXDIRS is used.
> > 
> > That allows adding parsers for other file types.
> > 
> > It should be noticed that include_patterns was added on
> > Sphinx 5.1:
> > 	https://www.sphinx-doc.org/en/master/usage/configuration.html#confval-include_patterns
> > 
> > So, a backward-compatible code is needed when we start
> > using it for real.
> > 
> > Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
> > Reviewed-by: Donald Hunter <donald.hunter@gmail.com>
> > ---
> >  Documentation/conf.py | 67 ++++++++++++++++++++++++++++++++++++++++---
> >  1 file changed, 63 insertions(+), 4 deletions(-)
> > 
> > diff --git a/Documentation/conf.py b/Documentation/conf.py
> > index 12de52a2b17e..4ba4ee45e599 100644
> > --- a/Documentation/conf.py
> > +++ b/Documentation/conf.py
> > @@ -17,6 +17,66 @@ import os
> >  import sphinx
> >  import shutil
> >  
> > +# Get Sphinx version
> > +major, minor, patch = sphinx.version_info[:3]
> > +
> > +# Include_patterns were added on Sphinx 5.1
> > +if (major < 5) or (major == 5 and minor < 1):
> > +    has_include_patterns = False
> > +else:
> > +    has_include_patterns = True
> > +    # Include patterns that don't contain directory names, in glob format
> > +    include_patterns = ['**.rst']
> > +
> > +# Location of Documentation/ directory
> > +doctree = os.path.abspath('.')
> > +
> > +# Exclude of patterns that don't contain directory names, in glob format.
> > +exclude_patterns = []
> > +
> > +# List of patterns that contain directory names in glob format.
> > +dyn_include_patterns = []
> > +dyn_exclude_patterns = ['output']
> > +
> > +# Properly handle include/exclude patterns
> > +# ----------------------------------------
> > +
> > +def update_patterns(app):
> > +  
> 
> PEP-257 says we don't want a line before docstring:
> 
> https://peps.python.org/pep-0257/#multi-line-docstrings

True, but:

$ pylint Documentation/conf.py 
************* Module conf
Documentation/conf.py:357:0: W0311: Bad indentation. Found 16 spaces, expected 12 (bad-indentation)
Documentation/conf.py:587:0: W0311: Bad indentation. Found 1 spaces, expected 4 (bad-indentation)
Documentation/conf.py:567:1: W0511: FIXME: Do not add the index file here; the result will be too big. Adding (fixme)
Documentation/conf.py:1:0: C0114: Missing module docstring (missing-module-docstring)
Documentation/conf.py:229:0: W0622: Redefining built-in 'copyright' (redefined-builtin)
Documentation/conf.py:21:22: I1101: Module 'sphinx' has no 'version_info' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)
Documentation/conf.py:25:4: C0103: Constant name "has_include_patterns" doesn't conform to UPPER_CASE naming style (invalid-name)
Documentation/conf.py:27:4: C0103: Constant name "has_include_patterns" doesn't conform to UPPER_CASE naming style (invalid-name)
Documentation/conf.py:65:4: W0612: Unused variable 'sourcedir' (unused-variable)
Documentation/conf.py:66:4: W0612: Unused variable 'builddir' (unused-variable)
Documentation/conf.py:104:0: E0401: Unable to import 'load_config' (import-error)
Documentation/conf.py:104:0: C0413: Import "from load_config import loadConfig" should be placed at the top of the module (wrong-import-position)
Documentation/conf.py:109:0: C0103: Constant name "needs_sphinx" doesn't conform to UPPER_CASE naming style (invalid-name)
Documentation/conf.py:186:0: C0103: Constant name "autosectionlabel_prefix_document" doesn't conform to UPPER_CASE naming style (invalid-name)
Documentation/conf.py:187:0: C0103: Constant name "autosectionlabel_maxdepth" doesn't conform to UPPER_CASE naming style (invalid-name)
Documentation/conf.py:200:8: C0103: Constant name "load_imgmath" doesn't conform to UPPER_CASE naming style (invalid-name)
Documentation/conf.py:202:8: C0103: Constant name "load_imgmath" doesn't conform to UPPER_CASE naming style (invalid-name)
Documentation/conf.py:204:25: C0209: Formatting a regular string which could be an f-string (consider-using-f-string)
Documentation/conf.py:208:4: C0103: Constant name "math_renderer" doesn't conform to UPPER_CASE naming style (invalid-name)
Documentation/conf.py:210:4: C0103: Constant name "math_renderer" doesn't conform to UPPER_CASE naming style (invalid-name)
Documentation/conf.py:225:0: C0103: Constant name "master_doc" doesn't conform to UPPER_CASE naming style (invalid-name)
Documentation/conf.py:228:0: C0103: Constant name "project" doesn't conform to UPPER_CASE naming style (invalid-name)
Documentation/conf.py:229:0: C0103: Constant name "copyright" doesn't conform to UPPER_CASE naming style (invalid-name)
Documentation/conf.py:230:0: C0103: Constant name "author" doesn't conform to UPPER_CASE naming style (invalid-name)
Documentation/conf.py:253:0: W0702: No exception type(s) specified (bare-except)
Documentation/conf.py:243:4: C0103: Constant name "makefile_version" doesn't conform to UPPER_CASE naming style (invalid-name)
Documentation/conf.py:244:4: C0103: Constant name "makefile_patchlevel" doesn't conform to UPPER_CASE naming style (invalid-name)
Documentation/conf.py:245:16: R1732: Consider using 'with' for resource-allocating operations (consider-using-with)
Documentation/conf.py:245:16: W1514: Using open without explicitly specifying an encoding (unspecified-encoding)
Documentation/conf.py:259:8: C0103: Constant name "version" doesn't conform to UPPER_CASE naming style (invalid-name)
Documentation/conf.py:259:18: C0103: Constant name "release" doesn't conform to UPPER_CASE naming style (invalid-name)
Documentation/conf.py:266:0: C0116: Missing function or method docstring (missing-function-docstring)
Documentation/conf.py:284:0: C0103: Constant name "language" doesn't conform to UPPER_CASE naming style (invalid-name)
Documentation/conf.py:308:0: C0103: Constant name "pygments_style" doesn't conform to UPPER_CASE naming style (invalid-name)
Documentation/conf.py:317:0: C0103: Constant name "todo_include_todos" doesn't conform to UPPER_CASE naming style (invalid-name)
Documentation/conf.py:319:0: C0103: Constant name "primary_domain" doesn't conform to UPPER_CASE naming style (invalid-name)
Documentation/conf.py:320:0: C0103: Constant name "highlight_language" doesn't conform to UPPER_CASE naming style (invalid-name)
Documentation/conf.py:328:0: C0103: Constant name "html_theme" doesn't conform to UPPER_CASE naming style (invalid-name)
Documentation/conf.py:334:3: R1714: Consider merging these comparisons with 'in' by using 'html_theme in ('sphinx_rtd_theme', 'sphinx_rtd_dark_mode')'. Use a set instead if elements are hashable. (consider-using-in)
Documentation/conf.py:353:16: W0104: Statement seems to have no effect (pointless-statement)
Documentation/conf.py:364:8: C0103: Constant name "html_theme" doesn't conform to UPPER_CASE naming style (invalid-name)
Documentation/conf.py:382:17: C0209: Formatting a regular string which could be an f-string (consider-using-f-string)
Documentation/conf.py:392:0: C0103: Constant name "smartquotes_action" doesn't conform to UPPER_CASE naming style (invalid-name)
Documentation/conf.py:404:0: C0103: Constant name "html_logo" doesn't conform to UPPER_CASE naming style (invalid-name)
Documentation/conf.py:407:0: C0103: Constant name "htmlhelp_basename" doesn't conform to UPPER_CASE naming style (invalid-name)
Documentation/conf.py:487:8: C0103: Constant name "has" doesn't conform to UPPER_CASE naming style (invalid-name)
Documentation/conf.py:490:16: C0103: Constant name "has" doesn't conform to UPPER_CASE naming style (invalid-name)
Documentation/conf.py:494:36: C0209: Formatting a regular string which could be an f-string (consider-using-f-string)
Documentation/conf.py:551:0: C0103: Constant name "epub_title" doesn't conform to UPPER_CASE naming style (invalid-name)
Documentation/conf.py:552:0: C0103: Constant name "epub_author" doesn't conform to UPPER_CASE naming style (invalid-name)
Documentation/conf.py:553:0: C0103: Constant name "epub_publisher" doesn't conform to UPPER_CASE naming style (invalid-name)
Documentation/conf.py:554:0: C0103: Constant name "epub_copyright" doesn't conform to UPPER_CASE naming style (invalid-name)
Documentation/conf.py:571:29: W1406: The u prefix for strings is no longer necessary in Python >=3.0 (redundant-u-string-prefix)
Documentation/conf.py:571:40: W1406: The u prefix for strings is no longer necessary in Python >=3.0 (redundant-u-string-prefix)
Documentation/conf.py:571:51: W1406: The u prefix for strings is no longer necessary in Python >=3.0 (redundant-u-string-prefix)
Documentation/conf.py:577:0: C0103: Constant name "kerneldoc_bin" doesn't conform to UPPER_CASE naming style (invalid-name)
Documentation/conf.py:578:0: C0103: Constant name "kerneldoc_srctree" doesn't conform to UPPER_CASE naming style (invalid-name)
Documentation/conf.py:586:0: C0116: Missing function or method docstring (missing-function-docstring)
Documentation/conf.py:18:0: C0411: standard import "shutil" should be placed before third party import "sphinx" (wrong-import-order)
Documentation/conf.py:350:16: W0611: Unused import sphinx_rtd_dark_mode (unused-import)

This file needs a major cleanup, as it currently doesn't follow any
consistent Python code style and still carries some Python 2.x legacy
constructs.

Perhaps it is time to take some care of it and of the other
sphinx/*.py files that aren't compliant ;-)
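
Just to illustrate the direction, a rough and untested sketch of how
the new version check could look once pylint is taken into account
(module docstring present, the module-level flag written as a
constant); this is not part of the series:

	"""Sphinx configuration for the Linux kernel documentation."""

	import sphinx

	# Module-level flag named as a constant, so pylint's C0103 is happy.
	HAS_INCLUDE_PATTERNS = sphinx.version_info[:2] >= (5, 1)

	if HAS_INCLUDE_PATTERNS:
	    # include_patterns itself must stay lower-case: it is a Sphinx
	    # configuration value, not a constant of ours.
	    include_patterns = ['**.rst']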

Thanks,
Mauro

^ permalink raw reply	[flat|nested] 22+ messages in thread

* Re: [PATCH v6 00/15] Don't generate netlink .rst files inside $(srctree)
  2025-06-18 16:20   ` Mauro Carvalho Chehab
@ 2025-06-19  1:34     ` Akira Yokosawa
  2025-06-19  6:23       ` Mauro Carvalho Chehab
  0 siblings, 1 reply; 22+ messages in thread
From: Akira Yokosawa @ 2025-06-19  1:34 UTC (permalink / raw)
  To: Mauro Carvalho Chehab, Breno Leitao
  Cc: Linux Doc Mailing List, Jonathan Corbet, linux-kernel,
	David S. Miller, Ignacio Encinas Rubio, Marco Elver, Shuah Khan,
	Donald Hunter, Eric Dumazet, Jan Stancek, Paolo Abeni,
	Ruben Wauters, joel, linux-kernel-mentees, lkmm, netdev, peterz,
	stern, Randy Dunlap, Akira Yokosawa

On Wed, 18 Jun 2025 18:20:32 +0200, Mauro Carvalho Chehab wrote:
> On Thu, 19 Jun 2025 00:46:15 +0900
> Akira Yokosawa <akiyks@gmail.com> wrote:
> 
>> Quick tests against Sphinx 3.4.3 using container images based on
>> debian:bullseye and almalinux:9, both of which have 3.4.3 as their distro
>> packages, emit a *bunch* of warnings like the following:
>>
>> /<srcdir>/Documentation/netlink/specs/conntrack.yaml:: WARNING: YAML parsing error: AttributeError("'Values' object has no attribute 'tab_width'")
>> /<srcdir>/Documentation/netlink/specs/devlink.yaml:: WARNING: YAML parsing error: AttributeError("'Values' object has no attribute 'tab_width'")
>> /<srcdir>/Documentation/netlink/specs/dpll.yaml:: WARNING: YAML parsing error: AttributeError("'Values' object has no attribute 'tab_width'")
>> /<srcdir>/Documentation/netlink/specs/ethtool.yaml:: WARNING: YAML parsing error: AttributeError("'Values' object has no attribute 'tab_width'")
>> /<srcdir>/Documentation/netlink/specs/fou.yaml:: WARNING: YAML parsing error: AttributeError("'Values' object has no attribute 'tab_width'")
>> [...]
>>
>> I suspect there should be a minimal required version of PyYAML.
> 
> Likely yes. From my side, I didn't change anything related to PyYAML, 
> except for adding a loader in the latest patch to add line numbers.
> 
> The above warnings don't seem related. So, probably this was already
> an issue.
> 
> Funny enough, I did, on my venv:
> 
> 	$ pip install PyYAML==5.1
> 	$ tools/net/ynl/pyynl/ynl_gen_rst.py -i Documentation/netlink/specs/dpll.yaml -o Documentation/output/netlink/specs/dpll.rst -v
> 	...
> 	$ make clean; make SPHINXDIRS="netlink/specs" htmldocs
> 	...
> 
> but didn't get any issue (I have a later version installed outside
> venv - not sure if it will do the right thing).
> 
> That's what I have at venv:
> 
> ----------------------------- ---------
> Package                       Version
> ----------------------------- ---------
> alabaster                     0.7.13
> babel                         2.17.0
> certifi                       2025.6.15
> charset-normalizer            3.4.2
> docutils                      0.17.1
> idna                          3.10
> imagesize                     1.4.1
> Jinja2                        2.8.1
> MarkupSafe                    1.1.1
> packaging                     25.0
> pip                           25.1.1
> Pygments                      2.19.1
> PyYAML                        5.1
> requests                      2.32.4
> setuptools                    80.1.0
> snowballstemmer               3.0.1
> Sphinx                        3.4.3
> sphinxcontrib-applehelp       1.0.4
> sphinxcontrib-devhelp         1.0.2
> sphinxcontrib-htmlhelp        2.0.1
> sphinxcontrib-jsmath          1.0.1
> sphinxcontrib-qthelp          1.0.3
> sphinxcontrib-serializinghtml 1.1.5
> urllib3                       2.4.0
> ----------------------------- ---------
> 
[...]

> Please compare the versions that you're using on your test
> environment with the ones I used here.

It looks to me like the minimal required version of docutils is 0.17.1
for PyYAML integration.  Both almalinux:9 and debian:11 have 0.16.

Sphinx 4.3.2 of Ubuntu 22.04 comes with docutils 0.17.1, and it is
free of the warnings from PyYAML.
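
For reference, what a given environment actually ships can be checked
with a plain one-liner (nothing specific to this series):

	$ python3 -c 'import sphinx, docutils; print(sphinx.__version__, docutils.__version__)'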

        Thanks, Akira


^ permalink raw reply	[flat|nested] 22+ messages in thread

* Re: [PATCH v6 00/15] Don't generate netlink .rst files inside $(srctree)
  2025-06-19  1:34     ` Akira Yokosawa
@ 2025-06-19  6:23       ` Mauro Carvalho Chehab
  0 siblings, 0 replies; 22+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-19  6:23 UTC (permalink / raw)
  To: Akira Yokosawa
  Cc: Breno Leitao, Linux Doc Mailing List, Jonathan Corbet,
	linux-kernel, David S. Miller, Ignacio Encinas Rubio, Marco Elver,
	Shuah Khan, Donald Hunter, Eric Dumazet, Jan Stancek, Paolo Abeni,
	Ruben Wauters, joel, linux-kernel-mentees, lkmm, netdev, peterz,
	stern, Randy Dunlap

On Thu, 19 Jun 2025 10:34:59 +0900
Akira Yokosawa <akiyks@gmail.com> wrote:

> On Wed, 18 Jun 2025 18:20:32 +0200, Mauro Carvalho Chehab wrote:
> > On Thu, 19 Jun 2025 00:46:15 +0900
> > Akira Yokosawa <akiyks@gmail.com> wrote:
> >   
> >> Quick tests against Sphinx 3.4.3 using container images based on
> >> debian:bullseye and almalinux:9, both of which have 3.4.3 as their distro
> >> packages, emit a *bunch* of warnings like the following:
> >>
> >> /<srcdir>/Documentation/netlink/specs/conntrack.yaml:: WARNING: YAML parsing error: AttributeError("'Values' object has no attribute 'tab_width'")
> >> /<srcdir>/Documentation/netlink/specs/devlink.yaml:: WARNING: YAML parsing error: AttributeError("'Values' object has no attribute 'tab_width'")
> >> /<srcdir>/Documentation/netlink/specs/dpll.yaml:: WARNING: YAML parsing error: AttributeError("'Values' object has no attribute 'tab_width'")
> >> /<srcdir>/Documentation/netlink/specs/ethtool.yaml:: WARNING: YAML parsing error: AttributeError("'Values' object has no attribute 'tab_width'")
> >> /<srcdir>/Documentation/netlink/specs/fou.yaml:: WARNING: YAML parsing error: AttributeError("'Values' object has no attribute 'tab_width'")
> >> [...]
> >>
> >> I suspect there should be a minimal required version of PyYAML.
> > 
> > Likely yes. From my side, I didn't change anything related to PyYAML, 
> > except for adding a loader in the latest patch to add line numbers.
> > 
> > The above warnings don't seem related. So, probably this was already
> > an issue.
> > 
> > Funny enough, I did, on my venv:
> > 
> > 	$ pip install PyYAML==5.1
> > 	$ tools/net/ynl/pyynl/ynl_gen_rst.py -i Documentation/netlink/specs/dpll.yaml -o Documentation/output/netlink/specs/dpll.rst -v
> > 	...
> > 	$ make clean; make SPHINXDIRS="netlink/specs" htmldocs
> > 	...
> > 
> > but didn't get any issue (I have a later version installed outside
> > venv - not sure if it will do the right thing).
> > 
> > That's what I have at venv:
> > 
> > ----------------------------- ---------
> > Package                       Version
> > ----------------------------- ---------
> > alabaster                     0.7.13
> > babel                         2.17.0
> > certifi                       2025.6.15
> > charset-normalizer            3.4.2
> > docutils                      0.17.1
> > idna                          3.10
> > imagesize                     1.4.1
> > Jinja2                        2.8.1
> > MarkupSafe                    1.1.1
> > packaging                     25.0
> > pip                           25.1.1
> > Pygments                      2.19.1
> > PyYAML                        5.1
> > requests                      2.32.4
> > setuptools                    80.1.0
> > snowballstemmer               3.0.1
> > Sphinx                        3.4.3
> > sphinxcontrib-applehelp       1.0.4
> > sphinxcontrib-devhelp         1.0.2
> > sphinxcontrib-htmlhelp        2.0.1
> > sphinxcontrib-jsmath          1.0.1
> > sphinxcontrib-qthelp          1.0.3
> > sphinxcontrib-serializinghtml 1.1.5
> > urllib3                       2.4.0
> > ----------------------------- ---------
> >   
> [...]
> 
> > Please compare the versions that you're using on your test
> > environment with the ones I used here.  
> 
> It looks to me like the minimal required version of docutils is 0.17.1
> for PyYAML integration.  Both almalinux:9 and debian:11 have 0.16.
> 
> Sphinx 4.3.2 of Ubuntu 22.04 comes with docutils 0.17.1, and it is
> free of the warnings from PyYAML.

Yes, it seems so. As I commented in my previous e-mail, I think we need
validation logic that warns when the versions are incompatible.
Using the experimental checks you and I did, and checking the minimal
versions in the Sphinx release notes (*), it seems that a good starting
point is this:

            ========  ============  ============
            Sphinx    Min Docutils  Max Docutils
            Version   Version       Version
            --------  ------------  ------------
            < 4.0.0   0.17.1        0.17.1
            < 6.0.0   0.17.1        0.18.1
            < 7.0.0   0.18.0        0.18.1
            >= 7.0.0  0.20.0        0.21.2
            ========  ============  ============

Eventually, we may need to blacklist or whitelist other
combinations, but this would require a lot of time.

(*) I asked an LLM to check the Sphinx release notes and the docutils
    versions available when each Sphinx version was released, to help
    create this table. I also added your feedback about docutils 0.19
    and your and my tests with docutils < 0.17.1.
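
Just to sketch the idea (untested, the names are only illustrative and
the ranges simply mirror the table above), such a validation could look
like this:

	import docutils
	import sphinx

	# (Sphinx upper bound, min docutils, max docutils), taken from the
	# table above; None means "no upper bound".
	DOCUTILS_RANGES = (
	    ((4, 0, 0), (0, 17, 1), (0, 17, 1)),
	    ((6, 0, 0), (0, 17, 1), (0, 18, 1)),
	    ((7, 0, 0), (0, 18, 0), (0, 18, 1)),
	    (None,      (0, 20, 0), (0, 21, 2)),
	)

	def check_docutils():
	    """Warn when docutils is outside the range tested with this Sphinx."""
	    sphinx_ver = sphinx.version_info[:3]
	    docutils_ver = tuple(int(v) for v in docutils.__version__.split('.')[:3])
	    for upper, dmin, dmax in DOCUTILS_RANGES:
	        if upper is None or sphinx_ver < upper:
	            if not dmin <= docutils_ver <= dmax:
	                print(f"WARNING: docutils {docutils.__version__} was not "
	                      f"tested with Sphinx {sphinx.__version__}")
	            break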

Thanks,
Mauro

^ permalink raw reply	[flat|nested] 22+ messages in thread

end of thread, other threads:[~2025-06-19  6:23 UTC | newest]

Thread overview: 22+ messages (download: mbox.gz / follow: Atom feed)
-- links below jump to the message on this page --
2025-06-18 11:46 [PATCH v6 00/15] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
2025-06-18 11:46 ` [PATCH v6 01/15] docs: conf.py: properly handle include and exclude patterns Mauro Carvalho Chehab
2025-06-18 15:42   ` Breno Leitao
2025-06-18 19:54     ` Mauro Carvalho Chehab
2025-06-18 11:46 ` [PATCH v6 02/15] docs: Makefile: disable check rules on make cleandocs Mauro Carvalho Chehab
2025-06-18 11:46 ` [PATCH v6 03/15] docs: netlink: netlink-raw.rst: use :ref: instead of :doc: Mauro Carvalho Chehab
2025-06-18 11:46 ` [PATCH v6 04/15] tools: ynl_gen_rst.py: Split library from command line tool Mauro Carvalho Chehab
2025-06-18 11:46 ` [PATCH v6 05/15] docs: netlink: index.rst: add a netlink index file Mauro Carvalho Chehab
2025-06-18 11:46 ` [PATCH v6 06/15] tools: ynl_gen_rst.py: cleanup coding style Mauro Carvalho Chehab
2025-06-18 11:46 ` [PATCH v6 07/15] docs: sphinx: add a parser for yaml files for Netlink specs Mauro Carvalho Chehab
2025-06-18 11:46 ` [PATCH v6 08/15] docs: use parser_yaml extension to handle " Mauro Carvalho Chehab
2025-06-18 11:46 ` [PATCH v6 09/15] docs: uapi: netlink: update netlink specs link Mauro Carvalho Chehab
2025-06-18 11:46 ` [PATCH v6 10/15] tools: ynl_gen_rst.py: drop support for generating index files Mauro Carvalho Chehab
2025-06-18 11:46 ` [PATCH v6 11/15] docs: netlink: remove obsolete .gitignore from unused directory Mauro Carvalho Chehab
2025-06-18 11:46 ` [PATCH v6 12/15] MAINTAINERS: add netlink_yml_parser.py to linux-doc Mauro Carvalho Chehab
2025-06-18 11:46 ` [PATCH v6 13/15] tools: netlink_yml_parser.py: add line numbers to parsed data Mauro Carvalho Chehab
2025-06-18 11:46 ` [PATCH v6 14/15] docs: parser_yaml.py: add support for line numbers from the parser Mauro Carvalho Chehab
2025-06-18 11:46 ` [PATCH v6 15/15] docs: sphinx: add a file with the requirements for lowest version Mauro Carvalho Chehab
2025-06-18 15:46 ` [PATCH v6 00/15] Don't generate netlink .rst files inside $(srctree) Akira Yokosawa
2025-06-18 16:20   ` Mauro Carvalho Chehab
2025-06-19  1:34     ` Akira Yokosawa
2025-06-19  6:23       ` Mauro Carvalho Chehab
