* [PATCH v4 00/14] Don't generate netlink .rst files inside $(srctree)
@ 2025-06-14 8:55 Mauro Carvalho Chehab
2025-06-14 8:55 ` [PATCH v4 01/14] tools: ynl_gen_rst.py: create a top-level reference Mauro Carvalho Chehab
` (13 more replies)
0 siblings, 14 replies; 29+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-14 8:55 UTC (permalink / raw)
To: Linux Doc Mailing List, Jonathan Corbet
Cc: Mauro Carvalho Chehab, Mauro Carvalho Chehab, linux-kernel,
Akira Yokosawa, David S. Miller, Ignacio Encinas Rubio,
Marco Elver, Shuah Khan, Donald Hunter, Eric Dumazet, Jan Stancek,
Paolo Abeni, Ruben Wauters, joel, linux-kernel-mentees, lkmm,
netdev, peterz, stern, Breno Leitao, Jakub Kicinski, Simon Horman
As discussed at:
https://lore.kernel.org/all/20250610101331.62ba466f@foz.lan/
changeset f061c9f7d058 ("Documentation: Document each netlink family")
added logic that generates *.rst files inside $(srctree). This is bad
when O=<BUILDDIR> is used.
A recent change renamed the YAML files used by Netlink, revealing a bad
side effect: since "make cleandocs" doesn't remove the produced files,
symbols appear duplicated for people who don't build the kernel from scratch.
This series adds a YAML parser extension. We opted to write the extension
in a way that no actual YAML conversion code is inside it. This makes it
flexible enough to handle other types of YAML files in the future. The
actual YAML conversion logic was placed at scripts/lib/netlink_yml_parser.py.
The existing command line tool was also modified to use the library.
With this version, there's no need to add a template file per netlink spec
file, and the index is automatically filled using the :glob: parameter.
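A :glob:-driven index can be sketched roughly as follows (illustrative
only; the actual Documentation/netlink/specs/index.rst added by this
series may differ):

```rst
.. SPDX-License-Identifier: GPL-2.0

=============================
Netlink Family Specifications
=============================

.. toctree::
   :maxdepth: 1
   :glob:

   *
```

With :glob:, Sphinx picks up every generated per-family document
automatically, so no index entry needs to be maintained by hand.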
---
v4:
- Renamed the YNL parser class;
- some minor patch cleanups and merges;
- added an extra patch to fix the insert_pattern/exclude_pattern logic
when SPHINXDIRS is used.
v3:
- Two series were merged together:
- https://lore.kernel.org/linux-doc/cover.1749723671.git.mchehab+huawei@kernel.org/T/#t
- https://lore.kernel.org/linux-doc/cover.1749735022.git.mchehab+huawei@kernel.org
- Added an extra patch to update MAINTAINERS to point to the YNL library
- Added a (somewhat unrelated) patch that removes the warnings check when
running "make cleandocs".
---
v2:
- Use a Sphinx extension to handle netlink files.
v1:
- Statically add template files to as networking/netlink_spec/<family>.rst
Mauro Carvalho Chehab (14):
tools: ynl_gen_rst.py: create a top-level reference
docs: netlink: netlink-raw.rst: use :ref: instead of :doc:
docs: netlink: don't ignore generated rst files
tools: ynl_gen_rst.py: make the index parser more generic
tools: ynl_gen_rst.py: Split library from command line tool
scripts: lib: netlink_yml_parser.py: use classes
tools: ynl_gen_rst.py: move index.rst generator to the script
docs: sphinx: add a parser for yaml files for Netlink specs
docs: use parser_yaml extension to handle Netlink specs
docs: conf.py: don't handle yaml files outside Netlink specs
docs: uapi: netlink: update netlink specs link
MAINTAINERS: add maintainers for netlink_yml_parser.py
docs: Makefile: disable check rules on make cleandocs
docs: conf.py: properly handle include and exclude patterns
.pylintrc | 2 +-
Documentation/Makefile | 19 +-
Documentation/conf.py | 61 ++-
Documentation/netlink/specs/index.rst | 13 +
Documentation/networking/index.rst | 2 +-
.../networking/netlink_spec/.gitignore | 1 -
.../networking/netlink_spec/readme.txt | 4 -
Documentation/sphinx/parser_yaml.py | 76 ++++
Documentation/userspace-api/netlink/index.rst | 2 +-
.../userspace-api/netlink/netlink-raw.rst | 6 +-
Documentation/userspace-api/netlink/specs.rst | 2 +-
MAINTAINERS | 2 +
scripts/lib/netlink_yml_parser.py | 360 ++++++++++++++++
tools/net/ynl/pyynl/ynl_gen_rst.py | 394 ++----------------
14 files changed, 548 insertions(+), 396 deletions(-)
create mode 100644 Documentation/netlink/specs/index.rst
delete mode 100644 Documentation/networking/netlink_spec/.gitignore
delete mode 100644 Documentation/networking/netlink_spec/readme.txt
create mode 100755 Documentation/sphinx/parser_yaml.py
create mode 100755 scripts/lib/netlink_yml_parser.py
--
2.49.0
^ permalink raw reply [flat|nested] 29+ messages in thread
* [PATCH v4 01/14] tools: ynl_gen_rst.py: create a top-level reference
2025-06-14 8:55 [PATCH v4 00/14] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
@ 2025-06-14 8:55 ` Mauro Carvalho Chehab
2025-06-14 8:55 ` [PATCH v4 02/14] docs: netlink: netlink-raw.rst: use :ref: instead of :doc: Mauro Carvalho Chehab
` (12 subsequent siblings)
13 siblings, 0 replies; 29+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-14 8:55 UTC (permalink / raw)
To: Linux Doc Mailing List, Jonathan Corbet
Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
David S. Miller, Donald Hunter, Eric Dumazet,
Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
linux-kernel, lkmm, netdev, peterz, stern
Currently, rt documents are referenced with:
Documentation/userspace-api/netlink/netlink-raw.rst: :doc:`rt-link<../../networking/netlink_spec/rt-link>`
Documentation/userspace-api/netlink/netlink-raw.rst: :doc:`tc<../../networking/netlink_spec/tc>`
Documentation/userspace-api/netlink/netlink-raw.rst: :doc:`tc<../../networking/netlink_spec/tc>`
That is hard to maintain, and may break if we change the way
RST files are generated from YAML. It is better to use a
reference to the netlink family instead.
So, add a netlink-<foo> reference to all generated docs.
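The generated anchor can be sketched with the same helpers the script
already defines (rst_label() and rst_title() as they appear in
ynl_gen_rst.py); "rt-link" here is just an example family:

```python
# Sketch of the top-level anchor emitted for each generated document,
# mirroring rst_label() and rst_title() from ynl_gen_rst.py.
def rst_label(title: str) -> str:
    """Return a formatted label"""
    return f".. _{title}:\n\n"

def rst_title(title: str) -> str:
    """Add a title to the document"""
    return "=" * len(title) + f"\n{title}\n" + "=" * len(title) + "\n\n"

family = "rt-link"
header = rst_label("netlink-" + family) + rst_title(
    f"Family ``{family}`` netlink specification")
print(header.splitlines()[0])  # → .. _netlink-rt-link:
```

Other documents can then link to the family with
:ref:`rt-link<netlink-rt-link>` regardless of where the generated
file lives.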
Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
tools/net/ynl/pyynl/ynl_gen_rst.py | 5 +++--
1 file changed, 3 insertions(+), 2 deletions(-)
diff --git a/tools/net/ynl/pyynl/ynl_gen_rst.py b/tools/net/ynl/pyynl/ynl_gen_rst.py
index 0cb6348e28d3..7bfb8ceeeefc 100755
--- a/tools/net/ynl/pyynl/ynl_gen_rst.py
+++ b/tools/net/ynl/pyynl/ynl_gen_rst.py
@@ -314,10 +314,11 @@ def parse_yaml(obj: Dict[str, Any]) -> str:
# Main header
- lines.append(rst_header())
-
family = obj['name']
+ lines.append(rst_header())
+ lines.append(rst_label("netlink-" + family))
+
title = f"Family ``{family}`` netlink specification"
lines.append(rst_title(title))
lines.append(rst_paragraph(".. contents:: :depth: 3\n"))
--
2.49.0
^ permalink raw reply related [flat|nested] 29+ messages in thread
* [PATCH v4 02/14] docs: netlink: netlink-raw.rst: use :ref: instead of :doc:
2025-06-14 8:55 [PATCH v4 00/14] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
2025-06-14 8:55 ` [PATCH v4 01/14] tools: ynl_gen_rst.py: create a top-level reference Mauro Carvalho Chehab
@ 2025-06-14 8:55 ` Mauro Carvalho Chehab
2025-06-14 8:55 ` [PATCH v4 03/14] docs: netlink: don't ignore generated rst files Mauro Carvalho Chehab
` (11 subsequent siblings)
13 siblings, 0 replies; 29+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-14 8:55 UTC (permalink / raw)
To: Linux Doc Mailing List, Jonathan Corbet
Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
David S. Miller, Donald Hunter, Eric Dumazet,
Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
linux-kernel, lkmm, netdev, peterz, stern
Having :doc: references with relative paths doesn't always work,
as they may cause trouble when O= is used. So, replace them with
the Sphinx cross-reference tags that are now created by ynl_gen_rst.py.
Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
Documentation/userspace-api/netlink/netlink-raw.rst | 6 +++---
1 file changed, 3 insertions(+), 3 deletions(-)
diff --git a/Documentation/userspace-api/netlink/netlink-raw.rst b/Documentation/userspace-api/netlink/netlink-raw.rst
index 31fc91020eb3..aae296c170c5 100644
--- a/Documentation/userspace-api/netlink/netlink-raw.rst
+++ b/Documentation/userspace-api/netlink/netlink-raw.rst
@@ -62,8 +62,8 @@ Sub-messages
------------
Several raw netlink families such as
-:doc:`rt-link<../../networking/netlink_spec/rt-link>` and
-:doc:`tc<../../networking/netlink_spec/tc>` use attribute nesting as an
+:ref:`rt-link<netlink-rt-link>` and
+:ref:`tc<netlink-tc>` use attribute nesting as an
abstraction to carry module specific information.
Conceptually it looks as follows::
@@ -162,7 +162,7 @@ then this is an error.
Nested struct definitions
-------------------------
-Many raw netlink families such as :doc:`tc<../../networking/netlink_spec/tc>`
+Many raw netlink families such as :ref:`tc<netlink-tc>`
make use of nested struct definitions. The ``netlink-raw`` schema makes it
possible to embed a struct within a struct definition using the ``struct``
property. For example, the following struct definition embeds the
--
2.49.0
^ permalink raw reply related [flat|nested] 29+ messages in thread
* [PATCH v4 03/14] docs: netlink: don't ignore generated rst files
2025-06-14 8:55 [PATCH v4 00/14] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
2025-06-14 8:55 ` [PATCH v4 01/14] tools: ynl_gen_rst.py: create a top-level reference Mauro Carvalho Chehab
2025-06-14 8:55 ` [PATCH v4 02/14] docs: netlink: netlink-raw.rst: use :ref: instead of :doc: Mauro Carvalho Chehab
@ 2025-06-14 8:55 ` Mauro Carvalho Chehab
2025-06-14 8:55 ` [PATCH v4 04/14] tools: ynl_gen_rst.py: make the index parser more generic Mauro Carvalho Chehab
` (10 subsequent siblings)
13 siblings, 0 replies; 29+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-14 8:55 UTC (permalink / raw)
To: Linux Doc Mailing List, Jonathan Corbet
Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
David S. Miller, Donald Hunter, Eric Dumazet,
Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
linux-kernel, lkmm, netdev, peterz, stern
Currently, the build system generates ReST files inside the
source directory. This is not a good idea, especially when
we have renames, as "make clean" won't get rid of them.
As the first step to address the issue, stop ignoring those
files. This way, we can see exactly what has been produced
at build time inside $(srctree):
Documentation/networking/netlink_spec/conntrack.rst
Documentation/networking/netlink_spec/devlink.rst
Documentation/networking/netlink_spec/dpll.rst
Documentation/networking/netlink_spec/ethtool.rst
Documentation/networking/netlink_spec/fou.rst
Documentation/networking/netlink_spec/handshake.rst
Documentation/networking/netlink_spec/index.rst
Documentation/networking/netlink_spec/lockd.rst
Documentation/networking/netlink_spec/mptcp_pm.rst
Documentation/networking/netlink_spec/net_shaper.rst
Documentation/networking/netlink_spec/netdev.rst
Documentation/networking/netlink_spec/nfsd.rst
Documentation/networking/netlink_spec/nftables.rst
Documentation/networking/netlink_spec/nl80211.rst
Documentation/networking/netlink_spec/nlctrl.rst
Documentation/networking/netlink_spec/ovs_datapath.rst
Documentation/networking/netlink_spec/ovs_flow.rst
Documentation/networking/netlink_spec/ovs_vport.rst
Documentation/networking/netlink_spec/rt_addr.rst
Documentation/networking/netlink_spec/rt_link.rst
Documentation/networking/netlink_spec/rt_neigh.rst
Documentation/networking/netlink_spec/rt_route.rst
Documentation/networking/netlink_spec/rt_rule.rst
Documentation/networking/netlink_spec/tc.rst
Documentation/networking/netlink_spec/tcp_metrics.rst
Documentation/networking/netlink_spec/team.rst
Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
Documentation/networking/netlink_spec/.gitignore | 1 -
1 file changed, 1 deletion(-)
delete mode 100644 Documentation/networking/netlink_spec/.gitignore
diff --git a/Documentation/networking/netlink_spec/.gitignore b/Documentation/networking/netlink_spec/.gitignore
deleted file mode 100644
index 30d85567b592..000000000000
--- a/Documentation/networking/netlink_spec/.gitignore
+++ /dev/null
@@ -1 +0,0 @@
-*.rst
--
2.49.0
^ permalink raw reply related [flat|nested] 29+ messages in thread
* [PATCH v4 04/14] tools: ynl_gen_rst.py: make the index parser more generic
2025-06-14 8:55 [PATCH v4 00/14] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
` (2 preceding siblings ...)
2025-06-14 8:55 ` [PATCH v4 03/14] docs: netlink: don't ignore generated rst files Mauro Carvalho Chehab
@ 2025-06-14 8:55 ` Mauro Carvalho Chehab
2025-06-14 13:41 ` Donald Hunter
2025-06-14 8:55 ` [PATCH v4 05/14] tools: ynl_gen_rst.py: Split library from command line tool Mauro Carvalho Chehab
` (9 subsequent siblings)
13 siblings, 1 reply; 29+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-14 8:55 UTC (permalink / raw)
To: Linux Doc Mailing List, Jonathan Corbet
Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
David S. Miller, Donald Hunter, Eric Dumazet,
Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
linux-kernel, lkmm, netdev, peterz, stern
It is not good practice to store build-generated files
inside $(srctree), as one may be using O=<BUILDDIR> and may even
have the kernel on a read-only directory.
Change the YAML documentation generation for netlink files to allow
it to parse data from either the source tree or the object tree.
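The extension-generic index logic from this patch can be sketched as a
small standalone function (illustrative; the real code lives in
generate_main_index_rst()): it derives the extension from the index
file's own name, falls back to the index's directory when no input
directory is given, and lists every sibling file with that extension.

```python
# Sketch of the index logic: list toctree entries for every file that
# matches the index's own extension, skipping the index itself.
import os

def index_entries(index_path: str, index_dir: str = "") -> list:
    index_fname = os.path.basename(index_path)   # e.g. "index.rst"
    _, ext = os.path.splitext(index_fname)       # ".rst" (or ".yaml")
    if not index_dir:
        # Default to the directory where the index will be written
        index_dir = os.path.dirname(index_path)
    entries = []
    for filename in sorted(os.listdir(index_dir)):
        if not filename.endswith(ext) or filename == index_fname:
            continue
        base, _ = os.path.splitext(filename)
        entries.append(base)
    return entries
```

Because the extension comes from the index file name, the same routine
works whether it scans generated .rst files or the .yaml specs.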
Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
tools/net/ynl/pyynl/ynl_gen_rst.py | 22 ++++++++++++++++------
1 file changed, 16 insertions(+), 6 deletions(-)
diff --git a/tools/net/ynl/pyynl/ynl_gen_rst.py b/tools/net/ynl/pyynl/ynl_gen_rst.py
index 7bfb8ceeeefc..b1e5acafb998 100755
--- a/tools/net/ynl/pyynl/ynl_gen_rst.py
+++ b/tools/net/ynl/pyynl/ynl_gen_rst.py
@@ -365,6 +365,7 @@ def parse_arguments() -> argparse.Namespace:
parser.add_argument("-v", "--verbose", action="store_true")
parser.add_argument("-o", "--output", help="Output file name")
+ parser.add_argument("-d", "--input_dir", help="YAML input directory")
# Index and input are mutually exclusive
group = parser.add_mutually_exclusive_group()
@@ -405,11 +406,14 @@ def write_to_rstfile(content: str, filename: str) -> None:
"""Write the generated content into an RST file"""
logging.debug("Saving RST file to %s", filename)
+ dir = os.path.dirname(filename)
+ os.makedirs(dir, exist_ok=True)
+
with open(filename, "w", encoding="utf-8") as rst_file:
rst_file.write(content)
-def generate_main_index_rst(output: str) -> None:
+def generate_main_index_rst(output: str, index_dir: str) -> None:
"""Generate the `networking_spec/index` content and write to the file"""
lines = []
@@ -418,12 +422,18 @@ def generate_main_index_rst(output: str) -> None:
lines.append(rst_title("Netlink Family Specifications"))
lines.append(rst_toctree(1))
- index_dir = os.path.dirname(output)
- logging.debug("Looking for .rst files in %s", index_dir)
+ index_fname = os.path.basename(output)
+ base, ext = os.path.splitext(index_fname)
+
+ if not index_dir:
+ index_dir = os.path.dirname(output)
+
+ logging.debug(f"Looking for {ext} files in %s", index_dir)
for filename in sorted(os.listdir(index_dir)):
- if not filename.endswith(".rst") or filename == "index.rst":
+ if not filename.endswith(ext) or filename == index_fname:
continue
- lines.append(f" {filename.replace('.rst', '')}\n")
+ base, ext = os.path.splitext(filename)
+ lines.append(f" {base}\n")
logging.debug("Writing an index file at %s", output)
write_to_rstfile("".join(lines), output)
@@ -447,7 +457,7 @@ def main() -> None:
if args.index:
# Generate the index RST file
- generate_main_index_rst(args.output)
+ generate_main_index_rst(args.output, args.input_dir)
if __name__ == "__main__":
--
2.49.0
^ permalink raw reply related [flat|nested] 29+ messages in thread
* [PATCH v4 05/14] tools: ynl_gen_rst.py: Split library from command line tool
2025-06-14 8:55 [PATCH v4 00/14] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
` (3 preceding siblings ...)
2025-06-14 8:55 ` [PATCH v4 04/14] tools: ynl_gen_rst.py: make the index parser more generic Mauro Carvalho Chehab
@ 2025-06-14 8:55 ` Mauro Carvalho Chehab
2025-06-14 14:09 ` Donald Hunter
2025-06-14 8:56 ` [PATCH v4 06/14] scripts: lib: netlink_yml_parser.py: use classes Mauro Carvalho Chehab
` (8 subsequent siblings)
13 siblings, 1 reply; 29+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-14 8:55 UTC (permalink / raw)
To: Linux Doc Mailing List, Jonathan Corbet
Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
David S. Miller, Donald Hunter, Eric Dumazet,
Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
linux-kernel, lkmm, netdev, peterz, stern
As we'll be using the Netlink specs parser inside a Sphinx
extension, move the library part out of the command line tool.
No functional changes.
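The import path adjustment used by the command line tool follows the
pattern below (the same relative-path trick the patch adds to
ynl_gen_rst.py):

```python
# Sketch: let a tool under tools/net/ynl/pyynl/ import the shared
# library from scripts/lib/, resolving the path relative to this file.
import os
import sys

LIB_DIR = "../../../../scripts/lib"
SRC_DIR = os.path.dirname(os.path.realpath(__file__))

sys.path.insert(0, os.path.join(SRC_DIR, LIB_DIR))

# After the path tweak, the library imports like any other module:
# from netlink_yml_parser import parse_yaml_file
```

Using os.path.realpath(__file__) keeps the import working no matter
which directory the tool is invoked from.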
Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
scripts/lib/netlink_yml_parser.py | 391 +++++++++++++++++++++++++++++
tools/net/ynl/pyynl/ynl_gen_rst.py | 374 +--------------------------
2 files changed, 401 insertions(+), 364 deletions(-)
create mode 100755 scripts/lib/netlink_yml_parser.py
diff --git a/scripts/lib/netlink_yml_parser.py b/scripts/lib/netlink_yml_parser.py
new file mode 100755
index 000000000000..3c15b578f947
--- /dev/null
+++ b/scripts/lib/netlink_yml_parser.py
@@ -0,0 +1,391 @@
+#!/usr/bin/env python3
+# SPDX-License-Identifier: GPL-2.0
+# -*- coding: utf-8; mode: python -*-
+
+"""
+ Script to auto generate the documentation for Netlink specifications.
+
+ :copyright: Copyright (C) 2023 Breno Leitao <leitao@debian.org>
+ :license: GPL Version 2, June 1991 see linux/COPYING for details.
+
+ This script performs extensive parsing to the Linux kernel's netlink YAML
+ spec files, in an effort to avoid needing to heavily mark up the original
+ YAML file.
+
+ This code is split in three big parts:
+ 1) RST formatters: Use to convert a string to a RST output
+ 2) Parser helpers: Functions to parse the YAML data structure
+ 3) Main function and small helpers
+"""
+
+from typing import Any, Dict, List
+import os.path
+import logging
+import yaml
+
+
+SPACE_PER_LEVEL = 4
+
+
+# RST Formatters
+# ==============
+def headroom(level: int) -> str:
+ """Return space to format"""
+ return " " * (level * SPACE_PER_LEVEL)
+
+
+def bold(text: str) -> str:
+ """Format bold text"""
+ return f"**{text}**"
+
+
+def inline(text: str) -> str:
+ """Format inline text"""
+ return f"``{text}``"
+
+
+def sanitize(text: str) -> str:
+ """Remove newlines and multiple spaces"""
+ # This is useful for some fields that are spread across multiple lines
+ return str(text).replace("\n", " ").strip()
+
+
+def rst_fields(key: str, value: str, level: int = 0) -> str:
+ """Return a RST formatted field"""
+ return headroom(level) + f":{key}: {value}"
+
+
+def rst_definition(key: str, value: Any, level: int = 0) -> str:
+ """Format a single rst definition"""
+ return headroom(level) + key + "\n" + headroom(level + 1) + str(value)
+
+
+def rst_paragraph(paragraph: str, level: int = 0) -> str:
+ """Return a formatted paragraph"""
+ return headroom(level) + paragraph
+
+
+def rst_bullet(item: str, level: int = 0) -> str:
+ """Return a formatted a bullet"""
+ return headroom(level) + f"- {item}"
+
+
+def rst_subsection(title: str) -> str:
+ """Add a sub-section to the document"""
+ return f"{title}\n" + "-" * len(title)
+
+
+def rst_subsubsection(title: str) -> str:
+ """Add a sub-sub-section to the document"""
+ return f"{title}\n" + "~" * len(title)
+
+
+def rst_section(namespace: str, prefix: str, title: str) -> str:
+ """Add a section to the document"""
+ return f".. _{namespace}-{prefix}-{title}:\n\n{title}\n" + "=" * len(title)
+
+
+def rst_subtitle(title: str) -> str:
+ """Add a subtitle to the document"""
+ return "\n" + "-" * len(title) + f"\n{title}\n" + "-" * len(title) + "\n\n"
+
+
+def rst_title(title: str) -> str:
+ """Add a title to the document"""
+ return "=" * len(title) + f"\n{title}\n" + "=" * len(title) + "\n\n"
+
+
+def rst_list_inline(list_: List[str], level: int = 0) -> str:
+ """Format a list using inlines"""
+ return headroom(level) + "[" + ", ".join(inline(i) for i in list_) + "]"
+
+
+def rst_ref(namespace: str, prefix: str, name: str) -> str:
+ """Add a hyperlink to the document"""
+ mappings = {'enum': 'definition',
+ 'fixed-header': 'definition',
+ 'nested-attributes': 'attribute-set',
+ 'struct': 'definition'}
+ if prefix in mappings:
+ prefix = mappings[prefix]
+ return f":ref:`{namespace}-{prefix}-{name}`"
+
+
+def rst_header() -> str:
+ """The headers for all the auto generated RST files"""
+ lines = []
+
+ lines.append(rst_paragraph(".. SPDX-License-Identifier: GPL-2.0"))
+ lines.append(rst_paragraph(".. NOTE: This document was auto-generated.\n\n"))
+
+ return "\n".join(lines)
+
+
+def rst_toctree(maxdepth: int = 2) -> str:
+ """Generate a toctree RST primitive"""
+ lines = []
+
+ lines.append(".. toctree::")
+ lines.append(f" :maxdepth: {maxdepth}\n\n")
+
+ return "\n".join(lines)
+
+
+def rst_label(title: str) -> str:
+ """Return a formatted label"""
+ return f".. _{title}:\n\n"
+
+
+# Parsers
+# =======
+
+
+def parse_mcast_group(mcast_group: List[Dict[str, Any]]) -> str:
+ """Parse 'multicast' group list and return a formatted string"""
+ lines = []
+ for group in mcast_group:
+ lines.append(rst_bullet(group["name"]))
+
+ return "\n".join(lines)
+
+
+def parse_do(do_dict: Dict[str, Any], level: int = 0) -> str:
+ """Parse 'do' section and return a formatted string"""
+ lines = []
+ for key in do_dict.keys():
+ lines.append(rst_paragraph(bold(key), level + 1))
+ if key in ['request', 'reply']:
+ lines.append(parse_do_attributes(do_dict[key], level + 1) + "\n")
+ else:
+ lines.append(headroom(level + 2) + do_dict[key] + "\n")
+
+ return "\n".join(lines)
+
+
+def parse_do_attributes(attrs: Dict[str, Any], level: int = 0) -> str:
+ """Parse 'attributes' section"""
+ if "attributes" not in attrs:
+ return ""
+ lines = [rst_fields("attributes", rst_list_inline(attrs["attributes"]), level + 1)]
+
+ return "\n".join(lines)
+
+
+def parse_operations(operations: List[Dict[str, Any]], namespace: str) -> str:
+ """Parse operations block"""
+ preprocessed = ["name", "doc", "title", "do", "dump", "flags"]
+ linkable = ["fixed-header", "attribute-set"]
+ lines = []
+
+ for operation in operations:
+ lines.append(rst_section(namespace, 'operation', operation["name"]))
+ lines.append(rst_paragraph(operation["doc"]) + "\n")
+
+ for key in operation.keys():
+ if key in preprocessed:
+ # Skip the special fields
+ continue
+ value = operation[key]
+ if key in linkable:
+ value = rst_ref(namespace, key, value)
+ lines.append(rst_fields(key, value, 0))
+ if 'flags' in operation:
+ lines.append(rst_fields('flags', rst_list_inline(operation['flags'])))
+
+ if "do" in operation:
+ lines.append(rst_paragraph(":do:", 0))
+ lines.append(parse_do(operation["do"], 0))
+ if "dump" in operation:
+ lines.append(rst_paragraph(":dump:", 0))
+ lines.append(parse_do(operation["dump"], 0))
+
+ # New line after fields
+ lines.append("\n")
+
+ return "\n".join(lines)
+
+
+def parse_entries(entries: List[Dict[str, Any]], level: int) -> str:
+ """Parse a list of entries"""
+ ignored = ["pad"]
+ lines = []
+ for entry in entries:
+ if isinstance(entry, dict):
+ # entries could be a list or a dictionary
+ field_name = entry.get("name", "")
+ if field_name in ignored:
+ continue
+ type_ = entry.get("type")
+ if type_:
+ field_name += f" ({inline(type_)})"
+ lines.append(
+ rst_fields(field_name, sanitize(entry.get("doc", "")), level)
+ )
+ elif isinstance(entry, list):
+ lines.append(rst_list_inline(entry, level))
+ else:
+ lines.append(rst_bullet(inline(sanitize(entry)), level))
+
+ lines.append("\n")
+ return "\n".join(lines)
+
+
+def parse_definitions(defs: Dict[str, Any], namespace: str) -> str:
+ """Parse definitions section"""
+ preprocessed = ["name", "entries", "members"]
+ ignored = ["render-max"] # This is not printed
+ lines = []
+
+ for definition in defs:
+ lines.append(rst_section(namespace, 'definition', definition["name"]))
+ for k in definition.keys():
+ if k in preprocessed + ignored:
+ continue
+ lines.append(rst_fields(k, sanitize(definition[k]), 0))
+
+ # Field list needs to finish with a new line
+ lines.append("\n")
+ if "entries" in definition:
+ lines.append(rst_paragraph(":entries:", 0))
+ lines.append(parse_entries(definition["entries"], 1))
+ if "members" in definition:
+ lines.append(rst_paragraph(":members:", 0))
+ lines.append(parse_entries(definition["members"], 1))
+
+ return "\n".join(lines)
+
+
+def parse_attr_sets(entries: List[Dict[str, Any]], namespace: str) -> str:
+ """Parse attribute from attribute-set"""
+ preprocessed = ["name", "type"]
+ linkable = ["enum", "nested-attributes", "struct", "sub-message"]
+ ignored = ["checks"]
+ lines = []
+
+ for entry in entries:
+ lines.append(rst_section(namespace, 'attribute-set', entry["name"]))
+ for attr in entry["attributes"]:
+ type_ = attr.get("type")
+ attr_line = attr["name"]
+ if type_:
+ # Add the attribute type in the same line
+ attr_line += f" ({inline(type_)})"
+
+ lines.append(rst_subsubsection(attr_line))
+
+ for k in attr.keys():
+ if k in preprocessed + ignored:
+ continue
+ if k in linkable:
+ value = rst_ref(namespace, k, attr[k])
+ else:
+ value = sanitize(attr[k])
+ lines.append(rst_fields(k, value, 0))
+ lines.append("\n")
+
+ return "\n".join(lines)
+
+
+def parse_sub_messages(entries: List[Dict[str, Any]], namespace: str) -> str:
+ """Parse sub-message definitions"""
+ lines = []
+
+ for entry in entries:
+ lines.append(rst_section(namespace, 'sub-message', entry["name"]))
+ for fmt in entry["formats"]:
+ value = fmt["value"]
+
+ lines.append(rst_bullet(bold(value)))
+ for attr in ['fixed-header', 'attribute-set']:
+ if attr in fmt:
+ lines.append(rst_fields(attr,
+ rst_ref(namespace, attr, fmt[attr]),
+ 1))
+ lines.append("\n")
+
+ return "\n".join(lines)
+
+
+def parse_yaml(obj: Dict[str, Any]) -> str:
+ """Format the whole YAML into a RST string"""
+ lines = []
+
+ # Main header
+
+ family = obj['name']
+
+ lines.append(rst_header())
+ lines.append(rst_label("netlink-" + family))
+
+ title = f"Family ``{family}`` netlink specification"
+ lines.append(rst_title(title))
+ lines.append(rst_paragraph(".. contents:: :depth: 3\n"))
+
+ if "doc" in obj:
+ lines.append(rst_subtitle("Summary"))
+ lines.append(rst_paragraph(obj["doc"], 0))
+
+ # Operations
+ if "operations" in obj:
+ lines.append(rst_subtitle("Operations"))
+ lines.append(parse_operations(obj["operations"]["list"], family))
+
+ # Multicast groups
+ if "mcast-groups" in obj:
+ lines.append(rst_subtitle("Multicast groups"))
+ lines.append(parse_mcast_group(obj["mcast-groups"]["list"]))
+
+ # Definitions
+ if "definitions" in obj:
+ lines.append(rst_subtitle("Definitions"))
+ lines.append(parse_definitions(obj["definitions"], family))
+
+ # Attributes set
+ if "attribute-sets" in obj:
+ lines.append(rst_subtitle("Attribute sets"))
+ lines.append(parse_attr_sets(obj["attribute-sets"], family))
+
+ # Sub-messages
+ if "sub-messages" in obj:
+ lines.append(rst_subtitle("Sub-messages"))
+ lines.append(parse_sub_messages(obj["sub-messages"], family))
+
+ return "\n".join(lines)
+
+
+# Main functions
+# ==============
+
+
+def parse_yaml_file(filename: str) -> str:
+ """Transform the YAML specified by filename into an RST-formatted string"""
+ with open(filename, "r", encoding="utf-8") as spec_file:
+ yaml_data = yaml.safe_load(spec_file)
+ content = parse_yaml(yaml_data)
+
+ return content
+
+
+def generate_main_index_rst(output: str, index_dir: str) -> str:
+ """Generate the `networking_spec/index` content and write to the file"""
+ lines = []
+
+ lines.append(rst_header())
+ lines.append(rst_label("specs"))
+ lines.append(rst_title("Netlink Family Specifications"))
+ lines.append(rst_toctree(1))
+
+ index_fname = os.path.basename(output)
+ base, ext = os.path.splitext(index_fname)
+
+ if not index_dir:
+ index_dir = os.path.dirname(output)
+
+ logging.debug(f"Looking for {ext} files in %s", index_dir)
+ for filename in sorted(os.listdir(index_dir)):
+ if not filename.endswith(ext) or filename == index_fname:
+ continue
+ base, ext = os.path.splitext(filename)
+ lines.append(f" {base}\n")
+
+ return "".join(lines), output
diff --git a/tools/net/ynl/pyynl/ynl_gen_rst.py b/tools/net/ynl/pyynl/ynl_gen_rst.py
index b1e5acafb998..38dafe3d9179 100755
--- a/tools/net/ynl/pyynl/ynl_gen_rst.py
+++ b/tools/net/ynl/pyynl/ynl_gen_rst.py
@@ -18,345 +18,17 @@
3) Main function and small helpers
"""
-from typing import Any, Dict, List
import os.path
import sys
import argparse
import logging
-import yaml
+LIB_DIR = "../../../../scripts/lib"
+SRC_DIR = os.path.dirname(os.path.realpath(__file__))
-SPACE_PER_LEVEL = 4
+sys.path.insert(0, os.path.join(SRC_DIR, LIB_DIR))
-
-# RST Formatters
-# ==============
-def headroom(level: int) -> str:
- """Return space to format"""
- return " " * (level * SPACE_PER_LEVEL)
-
-
-def bold(text: str) -> str:
- """Format bold text"""
- return f"**{text}**"
-
-
-def inline(text: str) -> str:
- """Format inline text"""
- return f"``{text}``"
-
-
-def sanitize(text: str) -> str:
- """Remove newlines and multiple spaces"""
- # This is useful for some fields that are spread across multiple lines
- return str(text).replace("\n", " ").strip()
-
-
-def rst_fields(key: str, value: str, level: int = 0) -> str:
- """Return a RST formatted field"""
- return headroom(level) + f":{key}: {value}"
-
-
-def rst_definition(key: str, value: Any, level: int = 0) -> str:
- """Format a single rst definition"""
- return headroom(level) + key + "\n" + headroom(level + 1) + str(value)
-
-
-def rst_paragraph(paragraph: str, level: int = 0) -> str:
- """Return a formatted paragraph"""
- return headroom(level) + paragraph
-
-
-def rst_bullet(item: str, level: int = 0) -> str:
- """Return a formatted a bullet"""
- return headroom(level) + f"- {item}"
-
-
-def rst_subsection(title: str) -> str:
- """Add a sub-section to the document"""
- return f"{title}\n" + "-" * len(title)
-
-
-def rst_subsubsection(title: str) -> str:
- """Add a sub-sub-section to the document"""
- return f"{title}\n" + "~" * len(title)
-
-
-def rst_section(namespace: str, prefix: str, title: str) -> str:
- """Add a section to the document"""
- return f".. _{namespace}-{prefix}-{title}:\n\n{title}\n" + "=" * len(title)
-
-
-def rst_subtitle(title: str) -> str:
- """Add a subtitle to the document"""
- return "\n" + "-" * len(title) + f"\n{title}\n" + "-" * len(title) + "\n\n"
-
-
-def rst_title(title: str) -> str:
- """Add a title to the document"""
- return "=" * len(title) + f"\n{title}\n" + "=" * len(title) + "\n\n"
-
-
-def rst_list_inline(list_: List[str], level: int = 0) -> str:
- """Format a list using inlines"""
- return headroom(level) + "[" + ", ".join(inline(i) for i in list_) + "]"
-
-
-def rst_ref(namespace: str, prefix: str, name: str) -> str:
- """Add a hyperlink to the document"""
- mappings = {'enum': 'definition',
- 'fixed-header': 'definition',
- 'nested-attributes': 'attribute-set',
- 'struct': 'definition'}
- if prefix in mappings:
- prefix = mappings[prefix]
- return f":ref:`{namespace}-{prefix}-{name}`"
-
-
-def rst_header() -> str:
- """The headers for all the auto generated RST files"""
- lines = []
-
- lines.append(rst_paragraph(".. SPDX-License-Identifier: GPL-2.0"))
- lines.append(rst_paragraph(".. NOTE: This document was auto-generated.\n\n"))
-
- return "\n".join(lines)
-
-
-def rst_toctree(maxdepth: int = 2) -> str:
- """Generate a toctree RST primitive"""
- lines = []
-
- lines.append(".. toctree::")
- lines.append(f" :maxdepth: {maxdepth}\n\n")
-
- return "\n".join(lines)
-
-
-def rst_label(title: str) -> str:
- """Return a formatted label"""
- return f".. _{title}:\n\n"
-
-
-# Parsers
-# =======
-
-
-def parse_mcast_group(mcast_group: List[Dict[str, Any]]) -> str:
- """Parse 'multicast' group list and return a formatted string"""
- lines = []
- for group in mcast_group:
- lines.append(rst_bullet(group["name"]))
-
- return "\n".join(lines)
-
-
-def parse_do(do_dict: Dict[str, Any], level: int = 0) -> str:
- """Parse 'do' section and return a formatted string"""
- lines = []
- for key in do_dict.keys():
- lines.append(rst_paragraph(bold(key), level + 1))
- if key in ['request', 'reply']:
- lines.append(parse_do_attributes(do_dict[key], level + 1) + "\n")
- else:
- lines.append(headroom(level + 2) + do_dict[key] + "\n")
-
- return "\n".join(lines)
-
-
-def parse_do_attributes(attrs: Dict[str, Any], level: int = 0) -> str:
- """Parse 'attributes' section"""
- if "attributes" not in attrs:
- return ""
- lines = [rst_fields("attributes", rst_list_inline(attrs["attributes"]), level + 1)]
-
- return "\n".join(lines)
-
-
-def parse_operations(operations: List[Dict[str, Any]], namespace: str) -> str:
- """Parse operations block"""
- preprocessed = ["name", "doc", "title", "do", "dump", "flags"]
- linkable = ["fixed-header", "attribute-set"]
- lines = []
-
- for operation in operations:
- lines.append(rst_section(namespace, 'operation', operation["name"]))
- lines.append(rst_paragraph(operation["doc"]) + "\n")
-
- for key in operation.keys():
- if key in preprocessed:
- # Skip the special fields
- continue
- value = operation[key]
- if key in linkable:
- value = rst_ref(namespace, key, value)
- lines.append(rst_fields(key, value, 0))
- if 'flags' in operation:
- lines.append(rst_fields('flags', rst_list_inline(operation['flags'])))
-
- if "do" in operation:
- lines.append(rst_paragraph(":do:", 0))
- lines.append(parse_do(operation["do"], 0))
- if "dump" in operation:
- lines.append(rst_paragraph(":dump:", 0))
- lines.append(parse_do(operation["dump"], 0))
-
- # New line after fields
- lines.append("\n")
-
- return "\n".join(lines)
-
-
-def parse_entries(entries: List[Dict[str, Any]], level: int) -> str:
- """Parse a list of entries"""
- ignored = ["pad"]
- lines = []
- for entry in entries:
- if isinstance(entry, dict):
- # entries could be a list or a dictionary
- field_name = entry.get("name", "")
- if field_name in ignored:
- continue
- type_ = entry.get("type")
- if type_:
- field_name += f" ({inline(type_)})"
- lines.append(
- rst_fields(field_name, sanitize(entry.get("doc", "")), level)
- )
- elif isinstance(entry, list):
- lines.append(rst_list_inline(entry, level))
- else:
- lines.append(rst_bullet(inline(sanitize(entry)), level))
-
- lines.append("\n")
- return "\n".join(lines)
-
-
-def parse_definitions(defs: Dict[str, Any], namespace: str) -> str:
- """Parse definitions section"""
- preprocessed = ["name", "entries", "members"]
- ignored = ["render-max"] # This is not printed
- lines = []
-
- for definition in defs:
- lines.append(rst_section(namespace, 'definition', definition["name"]))
- for k in definition.keys():
- if k in preprocessed + ignored:
- continue
- lines.append(rst_fields(k, sanitize(definition[k]), 0))
-
- # Field list needs to finish with a new line
- lines.append("\n")
- if "entries" in definition:
- lines.append(rst_paragraph(":entries:", 0))
- lines.append(parse_entries(definition["entries"], 1))
- if "members" in definition:
- lines.append(rst_paragraph(":members:", 0))
- lines.append(parse_entries(definition["members"], 1))
-
- return "\n".join(lines)
-
-
-def parse_attr_sets(entries: List[Dict[str, Any]], namespace: str) -> str:
- """Parse attribute from attribute-set"""
- preprocessed = ["name", "type"]
- linkable = ["enum", "nested-attributes", "struct", "sub-message"]
- ignored = ["checks"]
- lines = []
-
- for entry in entries:
- lines.append(rst_section(namespace, 'attribute-set', entry["name"]))
- for attr in entry["attributes"]:
- type_ = attr.get("type")
- attr_line = attr["name"]
- if type_:
- # Add the attribute type in the same line
- attr_line += f" ({inline(type_)})"
-
- lines.append(rst_subsubsection(attr_line))
-
- for k in attr.keys():
- if k in preprocessed + ignored:
- continue
- if k in linkable:
- value = rst_ref(namespace, k, attr[k])
- else:
- value = sanitize(attr[k])
- lines.append(rst_fields(k, value, 0))
- lines.append("\n")
-
- return "\n".join(lines)
-
-
-def parse_sub_messages(entries: List[Dict[str, Any]], namespace: str) -> str:
- """Parse sub-message definitions"""
- lines = []
-
- for entry in entries:
- lines.append(rst_section(namespace, 'sub-message', entry["name"]))
- for fmt in entry["formats"]:
- value = fmt["value"]
-
- lines.append(rst_bullet(bold(value)))
- for attr in ['fixed-header', 'attribute-set']:
- if attr in fmt:
- lines.append(rst_fields(attr,
- rst_ref(namespace, attr, fmt[attr]),
- 1))
- lines.append("\n")
-
- return "\n".join(lines)
-
-
-def parse_yaml(obj: Dict[str, Any]) -> str:
- """Format the whole YAML into a RST string"""
- lines = []
-
- # Main header
-
- family = obj['name']
-
- lines.append(rst_header())
- lines.append(rst_label("netlink-" + family))
-
- title = f"Family ``{family}`` netlink specification"
- lines.append(rst_title(title))
- lines.append(rst_paragraph(".. contents:: :depth: 3\n"))
-
- if "doc" in obj:
- lines.append(rst_subtitle("Summary"))
- lines.append(rst_paragraph(obj["doc"], 0))
-
- # Operations
- if "operations" in obj:
- lines.append(rst_subtitle("Operations"))
- lines.append(parse_operations(obj["operations"]["list"], family))
-
- # Multicast groups
- if "mcast-groups" in obj:
- lines.append(rst_subtitle("Multicast groups"))
- lines.append(parse_mcast_group(obj["mcast-groups"]["list"]))
-
- # Definitions
- if "definitions" in obj:
- lines.append(rst_subtitle("Definitions"))
- lines.append(parse_definitions(obj["definitions"], family))
-
- # Attributes set
- if "attribute-sets" in obj:
- lines.append(rst_subtitle("Attribute sets"))
- lines.append(parse_attr_sets(obj["attribute-sets"], family))
-
- # Sub-messages
- if "sub-messages" in obj:
- lines.append(rst_subtitle("Sub-messages"))
- lines.append(parse_sub_messages(obj["sub-messages"], family))
-
- return "\n".join(lines)
-
-
-# Main functions
-# ==============
+from netlink_yml_parser import parse_yaml_file, generate_main_index_rst
def parse_arguments() -> argparse.Namespace:
@@ -393,50 +65,24 @@ def parse_arguments() -> argparse.Namespace:
return args
-def parse_yaml_file(filename: str) -> str:
- """Transform the YAML specified by filename into an RST-formatted string"""
- with open(filename, "r", encoding="utf-8") as spec_file:
- yaml_data = yaml.safe_load(spec_file)
- content = parse_yaml(yaml_data)
-
- return content
-
-
def write_to_rstfile(content: str, filename: str) -> None:
"""Write the generated content into an RST file"""
logging.debug("Saving RST file to %s", filename)
- dir = os.path.dirname(filename)
- os.makedirs(dir, exist_ok=True)
+ directory = os.path.dirname(filename)
+ os.makedirs(directory, exist_ok=True)
with open(filename, "w", encoding="utf-8") as rst_file:
rst_file.write(content)
-def generate_main_index_rst(output: str, index_dir: str) -> None:
+def write_index_rst(output: str, index_dir: str) -> None:
"""Generate the `networking_spec/index` content and write to the file"""
- lines = []
- lines.append(rst_header())
- lines.append(rst_label("specs"))
- lines.append(rst_title("Netlink Family Specifications"))
- lines.append(rst_toctree(1))
-
- index_fname = os.path.basename(output)
- base, ext = os.path.splitext(index_fname)
-
- if not index_dir:
- index_dir = os.path.dirname(output)
-
- logging.debug(f"Looking for {ext} files in %s", index_dir)
- for filename in sorted(os.listdir(index_dir)):
- if not filename.endswith(ext) or filename == index_fname:
- continue
- base, ext = os.path.splitext(filename)
- lines.append(f" {base}\n")
+ msg = generate_main_index_rst(output, index_dir)
logging.debug("Writing an index file at %s", output)
- write_to_rstfile("".join(lines), output)
+ write_to_rstfile(msg, output)
def main() -> None:
@@ -457,7 +103,7 @@ def main() -> None:
if args.index:
# Generate the index RST file
- generate_main_index_rst(args.output, args.input_dir)
+ write_index_rst(args.output, args.input_dir)
if __name__ == "__main__":
--
2.49.0
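The index-generation scan that this series moves into the library can be sketched standalone; the helper name `index_entries` and the temporary files below are illustrative only, not part of the patch:

```python
import os
import tempfile

# Sketch of the directory scan behind generate_main_index_rst(): list
# sibling files that share the index's extension and emit one toctree
# entry per file, skipping the index itself.
def index_entries(output: str, index_dir: str) -> list:
    index_fname = os.path.basename(output)
    _, ext = os.path.splitext(index_fname)
    if not index_dir:
        index_dir = os.path.dirname(output)
    entries = []
    for filename in sorted(os.listdir(index_dir)):
        if not filename.endswith(ext) or filename == index_fname:
            continue
        base, _ = os.path.splitext(filename)
        entries.append(base)
    return entries

with tempfile.TemporaryDirectory() as d:
    for name in ("index.rst", "devlink.rst", "ethtool.rst", "notes.txt"):
        open(os.path.join(d, name), "w").close()
    print(index_entries(os.path.join(d, "index.rst"), d))
```

Note that with the `:glob:` approach mentioned in the cover letter, Sphinx can discover the per-family pages itself, making this explicit scan unnecessary at build time.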
* [PATCH v4 06/14] scripts: lib: netlink_yml_parser.py: use classes
2025-06-14 8:55 [PATCH v4 00/14] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
` (4 preceding siblings ...)
2025-06-14 8:55 ` [PATCH v4 05/14] tools: ynl_gen_rst.py: Split library from command line tool Mauro Carvalho Chehab
@ 2025-06-14 8:56 ` Mauro Carvalho Chehab
2025-06-14 14:11 ` Donald Hunter
2025-06-14 8:56 ` [PATCH v4 07/14] tools: ynl_gen_rst.py: move index.rst generator to the script Mauro Carvalho Chehab
` (7 subsequent siblings)
13 siblings, 1 reply; 29+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-14 8:56 UTC (permalink / raw)
To: Linux Doc Mailing List, Jonathan Corbet
Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
David S. Miller, Donald Hunter, Eric Dumazet,
Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
linux-kernel, lkmm, netdev, peterz, stern
As we'll be importing the netlink parser into a Sphinx extension,
move all functions and global variables into two classes:
- RstFormatters: contains the ReST formatter logic, which is
YAML independent;
- YnlDocGenerator: contains the actual parser methods. This is
the only class that needs to be imported by the script or by
a Sphinx extension.
With that, we won't pollute the Sphinx namespace, avoiding any
potential clashes.
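A trimmed-down, hypothetical sketch of the two-class layout this patch introduces (the names mirror the patch, but the method bodies are illustrative, not the full implementation):

```python
from typing import Any, Dict, List

class RstFormatters:
    """ReST string helpers; no YAML knowledge."""
    SPACE_PER_LEVEL = 4

    @staticmethod
    def headroom(level: int) -> str:
        """Return indentation for the given nesting level."""
        return " " * (level * RstFormatters.SPACE_PER_LEVEL)

    def rst_bullet(self, item: str, level: int = 0) -> str:
        """Return a formatted bullet."""
        return self.headroom(level) + f"- {item}"

class YnlDocGenerator:
    """YAML-to-ReST generator; the only class callers need to import."""
    fmt = RstFormatters()

    def parse_mcast_group(self, mcast_group: List[Dict[str, Any]]) -> str:
        """Render the multicast group list as bullets."""
        return "\n".join(self.fmt.rst_bullet(g["name"]) for g in mcast_group)

gen = YnlDocGenerator()
print(gen.parse_mcast_group([{"name": "config"}, {"name": "monitor"}]))
```

A Sphinx extension then only ever binds the single name `YnlDocGenerator`, so the many small formatter helpers never leak into the extension's module namespace.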
Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
scripts/lib/netlink_yml_parser.py | 594 +++++++++++++++--------------
tools/net/ynl/pyynl/ynl_gen_rst.py | 19 +-
2 files changed, 314 insertions(+), 299 deletions(-)
diff --git a/scripts/lib/netlink_yml_parser.py b/scripts/lib/netlink_yml_parser.py
index 3c15b578f947..839e78b39de3 100755
--- a/scripts/lib/netlink_yml_parser.py
+++ b/scripts/lib/netlink_yml_parser.py
@@ -3,389 +3,407 @@
# -*- coding: utf-8; mode: python -*-
"""
- Script to auto generate the documentation for Netlink specifications.
+ Class to auto generate the documentation for Netlink specifications.
:copyright: Copyright (C) 2023 Breno Leitao <leitao@debian.org>
:license: GPL Version 2, June 1991 see linux/COPYING for details.
- This script performs extensive parsing to the Linux kernel's netlink YAML
+    This class performs extensive parsing of the Linux kernel's netlink YAML
spec files, in an effort to avoid needing to heavily mark up the original
YAML file.
- This code is split in three big parts:
+ This code is split in two classes:
1) RST formatters: Use to convert a string to a RST output
- 2) Parser helpers: Functions to parse the YAML data structure
- 3) Main function and small helpers
+ 2) YAML Netlink (YNL) doc generator: Generate docs from YAML data
"""
from typing import Any, Dict, List
import os.path
+import sys
+import argparse
import logging
import yaml
-SPACE_PER_LEVEL = 4
-
-
+# ==============
# RST Formatters
# ==============
-def headroom(level: int) -> str:
- """Return space to format"""
- return " " * (level * SPACE_PER_LEVEL)
+class RstFormatters:
+ SPACE_PER_LEVEL = 4
+ @staticmethod
+ def headroom(level: int) -> str:
+ """Return space to format"""
+ return " " * (level * RstFormatters.SPACE_PER_LEVEL)
-def bold(text: str) -> str:
- """Format bold text"""
- return f"**{text}**"
+ @staticmethod
+ def bold(text: str) -> str:
+ """Format bold text"""
+ return f"**{text}**"
-def inline(text: str) -> str:
- """Format inline text"""
- return f"``{text}``"
+ @staticmethod
+ def inline(text: str) -> str:
+ """Format inline text"""
+ return f"``{text}``"
-def sanitize(text: str) -> str:
- """Remove newlines and multiple spaces"""
- # This is useful for some fields that are spread across multiple lines
- return str(text).replace("\n", " ").strip()
+ @staticmethod
+ def sanitize(text: str) -> str:
+ """Remove newlines and multiple spaces"""
+ # This is useful for some fields that are spread across multiple lines
+ return str(text).replace("\n", " ").strip()
-def rst_fields(key: str, value: str, level: int = 0) -> str:
- """Return a RST formatted field"""
- return headroom(level) + f":{key}: {value}"
+ def rst_fields(self, key: str, value: str, level: int = 0) -> str:
+ """Return a RST formatted field"""
+ return self.headroom(level) + f":{key}: {value}"
-def rst_definition(key: str, value: Any, level: int = 0) -> str:
- """Format a single rst definition"""
- return headroom(level) + key + "\n" + headroom(level + 1) + str(value)
+ def rst_definition(self, key: str, value: Any, level: int = 0) -> str:
+ """Format a single rst definition"""
+ return self.headroom(level) + key + "\n" + self.headroom(level + 1) + str(value)
-def rst_paragraph(paragraph: str, level: int = 0) -> str:
- """Return a formatted paragraph"""
- return headroom(level) + paragraph
+ def rst_paragraph(self, paragraph: str, level: int = 0) -> str:
+ """Return a formatted paragraph"""
+ return self.headroom(level) + paragraph
-def rst_bullet(item: str, level: int = 0) -> str:
- """Return a formatted a bullet"""
- return headroom(level) + f"- {item}"
+ def rst_bullet(self, item: str, level: int = 0) -> str:
+        """Return a formatted bullet"""
+ return self.headroom(level) + f"- {item}"
-def rst_subsection(title: str) -> str:
- """Add a sub-section to the document"""
- return f"{title}\n" + "-" * len(title)
+ @staticmethod
+ def rst_subsection(title: str) -> str:
+ """Add a sub-section to the document"""
+ return f"{title}\n" + "-" * len(title)
-def rst_subsubsection(title: str) -> str:
- """Add a sub-sub-section to the document"""
- return f"{title}\n" + "~" * len(title)
+ @staticmethod
+ def rst_subsubsection(title: str) -> str:
+ """Add a sub-sub-section to the document"""
+ return f"{title}\n" + "~" * len(title)
-def rst_section(namespace: str, prefix: str, title: str) -> str:
- """Add a section to the document"""
- return f".. _{namespace}-{prefix}-{title}:\n\n{title}\n" + "=" * len(title)
+ @staticmethod
+ def rst_section(namespace: str, prefix: str, title: str) -> str:
+ """Add a section to the document"""
+ return f".. _{namespace}-{prefix}-{title}:\n\n{title}\n" + "=" * len(title)
-def rst_subtitle(title: str) -> str:
- """Add a subtitle to the document"""
- return "\n" + "-" * len(title) + f"\n{title}\n" + "-" * len(title) + "\n\n"
+ @staticmethod
+ def rst_subtitle(title: str) -> str:
+ """Add a subtitle to the document"""
+ return "\n" + "-" * len(title) + f"\n{title}\n" + "-" * len(title) + "\n\n"
-def rst_title(title: str) -> str:
- """Add a title to the document"""
- return "=" * len(title) + f"\n{title}\n" + "=" * len(title) + "\n\n"
+ @staticmethod
+ def rst_title(title: str) -> str:
+ """Add a title to the document"""
+ return "=" * len(title) + f"\n{title}\n" + "=" * len(title) + "\n\n"
-def rst_list_inline(list_: List[str], level: int = 0) -> str:
- """Format a list using inlines"""
- return headroom(level) + "[" + ", ".join(inline(i) for i in list_) + "]"
+ def rst_list_inline(self, list_: List[str], level: int = 0) -> str:
+ """Format a list using inlines"""
+ return self.headroom(level) + "[" + ", ".join(self.inline(i) for i in list_) + "]"
-def rst_ref(namespace: str, prefix: str, name: str) -> str:
- """Add a hyperlink to the document"""
- mappings = {'enum': 'definition',
- 'fixed-header': 'definition',
- 'nested-attributes': 'attribute-set',
- 'struct': 'definition'}
- if prefix in mappings:
- prefix = mappings[prefix]
- return f":ref:`{namespace}-{prefix}-{name}`"
+ @staticmethod
+ def rst_ref(namespace: str, prefix: str, name: str) -> str:
+ """Add a hyperlink to the document"""
+ mappings = {'enum': 'definition',
+ 'fixed-header': 'definition',
+ 'nested-attributes': 'attribute-set',
+ 'struct': 'definition'}
+ if prefix in mappings:
+ prefix = mappings[prefix]
+ return f":ref:`{namespace}-{prefix}-{name}`"
-def rst_header() -> str:
- """The headers for all the auto generated RST files"""
- lines = []
- lines.append(rst_paragraph(".. SPDX-License-Identifier: GPL-2.0"))
- lines.append(rst_paragraph(".. NOTE: This document was auto-generated.\n\n"))
+ def rst_header(self) -> str:
+ """The headers for all the auto generated RST files"""
+ lines = []
- return "\n".join(lines)
+ lines.append(self.rst_paragraph(".. SPDX-License-Identifier: GPL-2.0"))
+ lines.append(self.rst_paragraph(".. NOTE: This document was auto-generated.\n\n"))
+ return "\n".join(lines)
-def rst_toctree(maxdepth: int = 2) -> str:
- """Generate a toctree RST primitive"""
- lines = []
- lines.append(".. toctree::")
- lines.append(f" :maxdepth: {maxdepth}\n\n")
+ @staticmethod
+ def rst_toctree(maxdepth: int = 2) -> str:
+ """Generate a toctree RST primitive"""
+ lines = []
- return "\n".join(lines)
+ lines.append(".. toctree::")
+ lines.append(f" :maxdepth: {maxdepth}\n\n")
+ return "\n".join(lines)
-def rst_label(title: str) -> str:
- """Return a formatted label"""
- return f".. _{title}:\n\n"
+ @staticmethod
+ def rst_label(title: str) -> str:
+ """Return a formatted label"""
+ return f".. _{title}:\n\n"
+# =======
# Parsers
# =======
+class YnlDocGenerator:
+
+ fmt = RstFormatters()
+
+ def parse_mcast_group(self, mcast_group: List[Dict[str, Any]]) -> str:
+ """Parse 'multicast' group list and return a formatted string"""
+ lines = []
+ for group in mcast_group:
+ lines.append(self.fmt.rst_bullet(group["name"]))
+
+ return "\n".join(lines)
+
+
+ def parse_do(self, do_dict: Dict[str, Any], level: int = 0) -> str:
+ """Parse 'do' section and return a formatted string"""
+ lines = []
+ for key in do_dict.keys():
+ lines.append(self.fmt.rst_paragraph(self.fmt.bold(key), level + 1))
+ if key in ['request', 'reply']:
+ lines.append(self.parse_do_attributes(do_dict[key], level + 1) + "\n")
+ else:
+ lines.append(self.fmt.headroom(level + 2) + do_dict[key] + "\n")
+
+ return "\n".join(lines)
+
+
+ def parse_do_attributes(self, attrs: Dict[str, Any], level: int = 0) -> str:
+ """Parse 'attributes' section"""
+ if "attributes" not in attrs:
+ return ""
+ lines = [self.fmt.rst_fields("attributes", self.fmt.rst_list_inline(attrs["attributes"]), level + 1)]
+
+ return "\n".join(lines)
+
+
+ def parse_operations(self, operations: List[Dict[str, Any]], namespace: str) -> str:
+ """Parse operations block"""
+ preprocessed = ["name", "doc", "title", "do", "dump", "flags"]
+ linkable = ["fixed-header", "attribute-set"]
+ lines = []
+
+ for operation in operations:
+ lines.append(self.fmt.rst_section(namespace, 'operation', operation["name"]))
+ lines.append(self.fmt.rst_paragraph(operation["doc"]) + "\n")
+
+ for key in operation.keys():
+ if key in preprocessed:
+ # Skip the special fields
+ continue
+ value = operation[key]
+ if key in linkable:
+ value = self.fmt.rst_ref(namespace, key, value)
+ lines.append(self.fmt.rst_fields(key, value, 0))
+ if 'flags' in operation:
+ lines.append(self.fmt.rst_fields('flags', self.fmt.rst_list_inline(operation['flags'])))
+
+ if "do" in operation:
+ lines.append(self.fmt.rst_paragraph(":do:", 0))
+ lines.append(self.parse_do(operation["do"], 0))
+ if "dump" in operation:
+ lines.append(self.fmt.rst_paragraph(":dump:", 0))
+ lines.append(self.parse_do(operation["dump"], 0))
+
+ # New line after fields
+ lines.append("\n")
+
+ return "\n".join(lines)
+
+
+ def parse_entries(self, entries: List[Dict[str, Any]], level: int) -> str:
+ """Parse a list of entries"""
+ ignored = ["pad"]
+ lines = []
+ for entry in entries:
+ if isinstance(entry, dict):
+ # entries could be a list or a dictionary
+ field_name = entry.get("name", "")
+ if field_name in ignored:
+ continue
+ type_ = entry.get("type")
+ if type_:
+ field_name += f" ({self.fmt.inline(type_)})"
+ lines.append(
+ self.fmt.rst_fields(field_name, self.fmt.sanitize(entry.get("doc", "")), level)
+ )
+ elif isinstance(entry, list):
+ lines.append(self.fmt.rst_list_inline(entry, level))
+ else:
+ lines.append(self.fmt.rst_bullet(self.fmt.inline(self.fmt.sanitize(entry)), level))
+        lines.append("\n")
+ return "\n".join(lines)
-def parse_mcast_group(mcast_group: List[Dict[str, Any]]) -> str:
- """Parse 'multicast' group list and return a formatted string"""
- lines = []
- for group in mcast_group:
- lines.append(rst_bullet(group["name"]))
-
- return "\n".join(lines)
-
-
-def parse_do(do_dict: Dict[str, Any], level: int = 0) -> str:
- """Parse 'do' section and return a formatted string"""
- lines = []
- for key in do_dict.keys():
- lines.append(rst_paragraph(bold(key), level + 1))
- if key in ['request', 'reply']:
- lines.append(parse_do_attributes(do_dict[key], level + 1) + "\n")
- else:
- lines.append(headroom(level + 2) + do_dict[key] + "\n")
-
- return "\n".join(lines)
-
-
-def parse_do_attributes(attrs: Dict[str, Any], level: int = 0) -> str:
- """Parse 'attributes' section"""
- if "attributes" not in attrs:
- return ""
- lines = [rst_fields("attributes", rst_list_inline(attrs["attributes"]), level + 1)]
-
- return "\n".join(lines)
-
-
-def parse_operations(operations: List[Dict[str, Any]], namespace: str) -> str:
- """Parse operations block"""
- preprocessed = ["name", "doc", "title", "do", "dump", "flags"]
- linkable = ["fixed-header", "attribute-set"]
- lines = []
-
- for operation in operations:
- lines.append(rst_section(namespace, 'operation', operation["name"]))
- lines.append(rst_paragraph(operation["doc"]) + "\n")
-
- for key in operation.keys():
- if key in preprocessed:
- # Skip the special fields
- continue
- value = operation[key]
- if key in linkable:
- value = rst_ref(namespace, key, value)
- lines.append(rst_fields(key, value, 0))
- if 'flags' in operation:
- lines.append(rst_fields('flags', rst_list_inline(operation['flags'])))
-
- if "do" in operation:
- lines.append(rst_paragraph(":do:", 0))
- lines.append(parse_do(operation["do"], 0))
- if "dump" in operation:
- lines.append(rst_paragraph(":dump:", 0))
- lines.append(parse_do(operation["dump"], 0))
- # New line after fields
- lines.append("\n")
+ def parse_definitions(self, defs: Dict[str, Any], namespace: str) -> str:
+ """Parse definitions section"""
+ preprocessed = ["name", "entries", "members"]
+ ignored = ["render-max"] # This is not printed
+ lines = []
- return "\n".join(lines)
-
-
-def parse_entries(entries: List[Dict[str, Any]], level: int) -> str:
- """Parse a list of entries"""
- ignored = ["pad"]
- lines = []
- for entry in entries:
- if isinstance(entry, dict):
- # entries could be a list or a dictionary
- field_name = entry.get("name", "")
- if field_name in ignored:
- continue
- type_ = entry.get("type")
- if type_:
- field_name += f" ({inline(type_)})"
- lines.append(
- rst_fields(field_name, sanitize(entry.get("doc", "")), level)
- )
- elif isinstance(entry, list):
- lines.append(rst_list_inline(entry, level))
- else:
- lines.append(rst_bullet(inline(sanitize(entry)), level))
-
- lines.append("\n")
- return "\n".join(lines)
-
-
-def parse_definitions(defs: Dict[str, Any], namespace: str) -> str:
- """Parse definitions section"""
- preprocessed = ["name", "entries", "members"]
- ignored = ["render-max"] # This is not printed
- lines = []
-
- for definition in defs:
- lines.append(rst_section(namespace, 'definition', definition["name"]))
- for k in definition.keys():
- if k in preprocessed + ignored:
- continue
- lines.append(rst_fields(k, sanitize(definition[k]), 0))
-
- # Field list needs to finish with a new line
- lines.append("\n")
- if "entries" in definition:
- lines.append(rst_paragraph(":entries:", 0))
- lines.append(parse_entries(definition["entries"], 1))
- if "members" in definition:
- lines.append(rst_paragraph(":members:", 0))
- lines.append(parse_entries(definition["members"], 1))
-
- return "\n".join(lines)
-
-
-def parse_attr_sets(entries: List[Dict[str, Any]], namespace: str) -> str:
- """Parse attribute from attribute-set"""
- preprocessed = ["name", "type"]
- linkable = ["enum", "nested-attributes", "struct", "sub-message"]
- ignored = ["checks"]
- lines = []
-
- for entry in entries:
- lines.append(rst_section(namespace, 'attribute-set', entry["name"]))
- for attr in entry["attributes"]:
- type_ = attr.get("type")
- attr_line = attr["name"]
- if type_:
- # Add the attribute type in the same line
- attr_line += f" ({inline(type_)})"
-
- lines.append(rst_subsubsection(attr_line))
-
- for k in attr.keys():
+ for definition in defs:
+ lines.append(self.fmt.rst_section(namespace, 'definition', definition["name"]))
+ for k in definition.keys():
if k in preprocessed + ignored:
continue
- if k in linkable:
- value = rst_ref(namespace, k, attr[k])
- else:
- value = sanitize(attr[k])
- lines.append(rst_fields(k, value, 0))
+ lines.append(self.fmt.rst_fields(k, self.fmt.sanitize(definition[k]), 0))
+
+ # Field list needs to finish with a new line
lines.append("\n")
+ if "entries" in definition:
+ lines.append(self.fmt.rst_paragraph(":entries:", 0))
+ lines.append(self.parse_entries(definition["entries"], 1))
+ if "members" in definition:
+ lines.append(self.fmt.rst_paragraph(":members:", 0))
+ lines.append(self.parse_entries(definition["members"], 1))
- return "\n".join(lines)
+ return "\n".join(lines)
-def parse_sub_messages(entries: List[Dict[str, Any]], namespace: str) -> str:
- """Parse sub-message definitions"""
- lines = []
+ def parse_attr_sets(self, entries: List[Dict[str, Any]], namespace: str) -> str:
+ """Parse attribute from attribute-set"""
+ preprocessed = ["name", "type"]
+ linkable = ["enum", "nested-attributes", "struct", "sub-message"]
+ ignored = ["checks"]
+ lines = []
- for entry in entries:
- lines.append(rst_section(namespace, 'sub-message', entry["name"]))
- for fmt in entry["formats"]:
- value = fmt["value"]
+ for entry in entries:
+ lines.append(self.fmt.rst_section(namespace, 'attribute-set', entry["name"]))
+ for attr in entry["attributes"]:
+ type_ = attr.get("type")
+ attr_line = attr["name"]
+ if type_:
+ # Add the attribute type in the same line
+ attr_line += f" ({self.fmt.inline(type_)})"
- lines.append(rst_bullet(bold(value)))
- for attr in ['fixed-header', 'attribute-set']:
- if attr in fmt:
- lines.append(rst_fields(attr,
- rst_ref(namespace, attr, fmt[attr]),
- 1))
- lines.append("\n")
+ lines.append(self.fmt.rst_subsubsection(attr_line))
+
+ for k in attr.keys():
+ if k in preprocessed + ignored:
+ continue
+ if k in linkable:
+ value = self.fmt.rst_ref(namespace, k, attr[k])
+ else:
+ value = self.fmt.sanitize(attr[k])
+ lines.append(self.fmt.rst_fields(k, value, 0))
+ lines.append("\n")
+
+ return "\n".join(lines)
+
+
+ def parse_sub_messages(self, entries: List[Dict[str, Any]], namespace: str) -> str:
+ """Parse sub-message definitions"""
+ lines = []
+
+ for entry in entries:
+ lines.append(self.fmt.rst_section(namespace, 'sub-message', entry["name"]))
+ for fmt in entry["formats"]:
+ value = fmt["value"]
+
+ lines.append(self.fmt.rst_bullet(self.fmt.bold(value)))
+ for attr in ['fixed-header', 'attribute-set']:
+ if attr in fmt:
+ lines.append(self.fmt.rst_fields(attr,
+ self.fmt.rst_ref(namespace, attr, fmt[attr]),
+ 1))
+ lines.append("\n")
+
+ return "\n".join(lines)
- return "\n".join(lines)
+ def parse_yaml(self, obj: Dict[str, Any]) -> str:
+ """Format the whole YAML into a RST string"""
+ lines = []
-def parse_yaml(obj: Dict[str, Any]) -> str:
- """Format the whole YAML into a RST string"""
- lines = []
+ # Main header
- # Main header
+ family = obj['name']
- family = obj['name']
+ lines.append(self.fmt.rst_header())
+ lines.append(self.fmt.rst_label("netlink-" + family))
- lines.append(rst_header())
- lines.append(rst_label("netlink-" + family))
+ title = f"Family ``{family}`` netlink specification"
+ lines.append(self.fmt.rst_title(title))
+ lines.append(self.fmt.rst_paragraph(".. contents:: :depth: 3\n"))
- title = f"Family ``{family}`` netlink specification"
- lines.append(rst_title(title))
- lines.append(rst_paragraph(".. contents:: :depth: 3\n"))
+ if "doc" in obj:
+ lines.append(self.fmt.rst_subtitle("Summary"))
+ lines.append(self.fmt.rst_paragraph(obj["doc"], 0))
- if "doc" in obj:
- lines.append(rst_subtitle("Summary"))
- lines.append(rst_paragraph(obj["doc"], 0))
+ # Operations
+ if "operations" in obj:
+ lines.append(self.fmt.rst_subtitle("Operations"))
+ lines.append(self.parse_operations(obj["operations"]["list"], family))
- # Operations
- if "operations" in obj:
- lines.append(rst_subtitle("Operations"))
- lines.append(parse_operations(obj["operations"]["list"], family))
+ # Multicast groups
+ if "mcast-groups" in obj:
+ lines.append(self.fmt.rst_subtitle("Multicast groups"))
+ lines.append(self.parse_mcast_group(obj["mcast-groups"]["list"]))
- # Multicast groups
- if "mcast-groups" in obj:
- lines.append(rst_subtitle("Multicast groups"))
- lines.append(parse_mcast_group(obj["mcast-groups"]["list"]))
+ # Definitions
+ if "definitions" in obj:
+ lines.append(self.fmt.rst_subtitle("Definitions"))
+ lines.append(self.parse_definitions(obj["definitions"], family))
- # Definitions
- if "definitions" in obj:
- lines.append(rst_subtitle("Definitions"))
- lines.append(parse_definitions(obj["definitions"], family))
+ # Attributes set
+ if "attribute-sets" in obj:
+ lines.append(self.fmt.rst_subtitle("Attribute sets"))
+ lines.append(self.parse_attr_sets(obj["attribute-sets"], family))
- # Attributes set
- if "attribute-sets" in obj:
- lines.append(rst_subtitle("Attribute sets"))
- lines.append(parse_attr_sets(obj["attribute-sets"], family))
+ # Sub-messages
+ if "sub-messages" in obj:
+ lines.append(self.fmt.rst_subtitle("Sub-messages"))
+ lines.append(self.parse_sub_messages(obj["sub-messages"], family))
- # Sub-messages
- if "sub-messages" in obj:
- lines.append(rst_subtitle("Sub-messages"))
- lines.append(parse_sub_messages(obj["sub-messages"], family))
+ return "\n".join(lines)
- return "\n".join(lines)
+ # Main functions
+ # ==============
-# Main functions
-# ==============
+ def parse_yaml_file(self, filename: str) -> str:
+ """Transform the YAML specified by filename into an RST-formatted string"""
+ with open(filename, "r", encoding="utf-8") as spec_file:
+ yaml_data = yaml.safe_load(spec_file)
+ content = self.parse_yaml(yaml_data)
-def parse_yaml_file(filename: str) -> str:
- """Transform the YAML specified by filename into an RST-formatted string"""
- with open(filename, "r", encoding="utf-8") as spec_file:
- yaml_data = yaml.safe_load(spec_file)
- content = parse_yaml(yaml_data)
+ return content
- return content
+    def generate_main_index_rst(self, output: str, index_dir: str) -> str:
+        """Generate the `networking_spec/index` content and return it"""
+ lines = []
-def generate_main_index_rst(output: str, index_dir: str) -> str:
- """Generate the `networking_spec/index` content and write to the file"""
- lines = []
+ lines.append(self.fmt.rst_header())
+ lines.append(self.fmt.rst_label("specs"))
+ lines.append(self.fmt.rst_title("Netlink Family Specifications"))
+ lines.append(self.fmt.rst_toctree(1))
- lines.append(rst_header())
- lines.append(rst_label("specs"))
- lines.append(rst_title("Netlink Family Specifications"))
- lines.append(rst_toctree(1))
+ index_fname = os.path.basename(output)
+ base, ext = os.path.splitext(index_fname)
- index_fname = os.path.basename(output)
- base, ext = os.path.splitext(index_fname)
+ if not index_dir:
+ index_dir = os.path.dirname(output)
- if not index_dir:
- index_dir = os.path.dirname(output)
+        logging.debug("Looking for %s files in %s", ext, index_dir)
+ for filename in sorted(os.listdir(index_dir)):
+ if not filename.endswith(ext) or filename == index_fname:
+ continue
+ base, ext = os.path.splitext(filename)
+ lines.append(f" {base}\n")
- logging.debug(f"Looking for {ext} files in %s", index_dir)
- for filename in sorted(os.listdir(index_dir)):
- if not filename.endswith(ext) or filename == index_fname:
- continue
- base, ext = os.path.splitext(filename)
- lines.append(f" {base}\n")
+ logging.debug("Writing an index file at %s", output)
- return "".join(lines), output
+ return "".join(lines)
diff --git a/tools/net/ynl/pyynl/ynl_gen_rst.py b/tools/net/ynl/pyynl/ynl_gen_rst.py
index 38dafe3d9179..624b0960476e 100755
--- a/tools/net/ynl/pyynl/ynl_gen_rst.py
+++ b/tools/net/ynl/pyynl/ynl_gen_rst.py
@@ -10,12 +10,7 @@
This script performs extensive parsing to the Linux kernel's netlink YAML
spec files, in an effort to avoid needing to heavily mark up the original
- YAML file.
-
- This code is split in three big parts:
- 1) RST formatters: Use to convert a string to a RST output
- 2) Parser helpers: Functions to parse the YAML data structure
- 3) Main function and small helpers
+ YAML file. It uses the library code from scripts/lib.
"""
import os.path
@@ -28,7 +23,7 @@ SRC_DIR = os.path.dirname(os.path.realpath(__file__))
sys.path.insert(0, os.path.join(SRC_DIR, LIB_DIR))
-from netlink_yml_parser import parse_yaml_file, generate_main_index_rst
+from netlink_yml_parser import YnlDocGenerator
def parse_arguments() -> argparse.Namespace:
@@ -76,10 +71,10 @@ def write_to_rstfile(content: str, filename: str) -> None:
rst_file.write(content)
-def write_index_rst(output: str, index_dir: str) -> None:
+def write_index_rst(parser: YnlDocGenerator, output: str, index_dir: str) -> None:
"""Generate the `networking_spec/index` content and write to the file"""
- msg = generate_main_index_rst(output, index_dir)
+ msg = parser.generate_main_index_rst(output, index_dir)
logging.debug("Writing an index file at %s", output)
write_to_rstfile(msg, output)
@@ -90,10 +85,12 @@ def main() -> None:
args = parse_arguments()
+ parser = YnlDocGenerator()
+
if args.input:
logging.debug("Parsing %s", args.input)
try:
- content = parse_yaml_file(os.path.join(args.input))
+ content = parser.parse_yaml_file(os.path.join(args.input))
except Exception as exception:
logging.warning("Failed to parse %s.", args.input)
logging.warning(exception)
@@ -103,7 +100,7 @@ def main() -> None:
if args.index:
# Generate the index RST file
- write_index_rst(args.output, args.input_dir)
+ write_index_rst(parser, args.output, args.input_dir)
if __name__ == "__main__":
--
2.49.0
^ permalink raw reply related [flat|nested] 29+ messages in thread
* [PATCH v4 07/14] tools: ynl_gen_rst.py: move index.rst generator to the script
2025-06-14 8:55 [PATCH v4 00/14] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
` (5 preceding siblings ...)
2025-06-14 8:56 ` [PATCH v4 06/14] scripts: lib: netlink_yml_parser.py: use classes Mauro Carvalho Chehab
@ 2025-06-14 8:56 ` Mauro Carvalho Chehab
2025-06-14 14:15 ` Donald Hunter
2025-06-14 8:56 ` [PATCH v4 08/14] docs: sphinx: add a parser for yaml files for Netlink specs Mauro Carvalho Chehab
` (6 subsequent siblings)
13 siblings, 1 reply; 29+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-14 8:56 UTC (permalink / raw)
To: Linux Doc Mailing List, Jonathan Corbet
Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
David S. Miller, Donald Hunter, Eric Dumazet,
Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
linux-kernel, lkmm, netdev, peterz, stern
The index.rst generator doesn't really belong to the parsing
function. Move it to the command line tool, as it won't be
used elsewhere.
While here, make it more generic, allowing it to handle either
.yaml or .rst as input files.
Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
scripts/lib/netlink_yml_parser.py | 101 ++++++++---------------------
tools/net/ynl/pyynl/ynl_gen_rst.py | 28 +++++++-
2 files changed, 52 insertions(+), 77 deletions(-)
diff --git a/scripts/lib/netlink_yml_parser.py b/scripts/lib/netlink_yml_parser.py
index 839e78b39de3..866551726723 100755
--- a/scripts/lib/netlink_yml_parser.py
+++ b/scripts/lib/netlink_yml_parser.py
@@ -18,17 +18,12 @@
"""
from typing import Any, Dict, List
-import os.path
-import sys
-import argparse
-import logging
import yaml
-# ==============
-# RST Formatters
-# ==============
class RstFormatters:
+ """RST Formatters"""
+
SPACE_PER_LEVEL = 4
@staticmethod
@@ -36,81 +31,67 @@ class RstFormatters:
"""Return space to format"""
return " " * (level * RstFormatters.SPACE_PER_LEVEL)
-
@staticmethod
def bold(text: str) -> str:
"""Format bold text"""
return f"**{text}**"
-
@staticmethod
def inline(text: str) -> str:
"""Format inline text"""
return f"``{text}``"
-
@staticmethod
def sanitize(text: str) -> str:
"""Remove newlines and multiple spaces"""
# This is useful for some fields that are spread across multiple lines
return str(text).replace("\n", " ").strip()
-
def rst_fields(self, key: str, value: str, level: int = 0) -> str:
"""Return a RST formatted field"""
return self.headroom(level) + f":{key}: {value}"
-
def rst_definition(self, key: str, value: Any, level: int = 0) -> str:
"""Format a single rst definition"""
return self.headroom(level) + key + "\n" + self.headroom(level + 1) + str(value)
-
def rst_paragraph(self, paragraph: str, level: int = 0) -> str:
"""Return a formatted paragraph"""
return self.headroom(level) + paragraph
-
def rst_bullet(self, item: str, level: int = 0) -> str:
"""Return a formatted a bullet"""
return self.headroom(level) + f"- {item}"
-
@staticmethod
def rst_subsection(title: str) -> str:
"""Add a sub-section to the document"""
return f"{title}\n" + "-" * len(title)
-
@staticmethod
def rst_subsubsection(title: str) -> str:
"""Add a sub-sub-section to the document"""
return f"{title}\n" + "~" * len(title)
-
@staticmethod
def rst_section(namespace: str, prefix: str, title: str) -> str:
"""Add a section to the document"""
return f".. _{namespace}-{prefix}-{title}:\n\n{title}\n" + "=" * len(title)
-
@staticmethod
def rst_subtitle(title: str) -> str:
"""Add a subtitle to the document"""
return "\n" + "-" * len(title) + f"\n{title}\n" + "-" * len(title) + "\n\n"
-
@staticmethod
def rst_title(title: str) -> str:
"""Add a title to the document"""
return "=" * len(title) + f"\n{title}\n" + "=" * len(title) + "\n\n"
-
def rst_list_inline(self, list_: List[str], level: int = 0) -> str:
"""Format a list using inlines"""
return self.headroom(level) + "[" + ", ".join(self.inline(i) for i in list_) + "]"
-
@staticmethod
def rst_ref(namespace: str, prefix: str, name: str) -> str:
"""Add a hyperlink to the document"""
@@ -119,10 +100,9 @@ class RstFormatters:
'nested-attributes': 'attribute-set',
'struct': 'definition'}
if prefix in mappings:
- prefix = mappings[prefix]
+ prefix = mappings.get(prefix, "")
return f":ref:`{namespace}-{prefix}-{name}`"
-
def rst_header(self) -> str:
"""The headers for all the auto generated RST files"""
lines = []
@@ -132,7 +112,6 @@ class RstFormatters:
return "\n".join(lines)
-
@staticmethod
def rst_toctree(maxdepth: int = 2) -> str:
"""Generate a toctree RST primitive"""
@@ -143,16 +122,13 @@ class RstFormatters:
return "\n".join(lines)
-
@staticmethod
def rst_label(title: str) -> str:
"""Return a formatted label"""
return f".. _{title}:\n\n"
-# =======
-# Parsers
-# =======
class YnlDocGenerator:
+ """YAML Netlink specs Parser"""
fmt = RstFormatters()
@@ -164,7 +140,6 @@ class YnlDocGenerator:
return "\n".join(lines)
-
def parse_do(self, do_dict: Dict[str, Any], level: int = 0) -> str:
"""Parse 'do' section and return a formatted string"""
lines = []
@@ -177,16 +152,16 @@ class YnlDocGenerator:
return "\n".join(lines)
-
def parse_do_attributes(self, attrs: Dict[str, Any], level: int = 0) -> str:
"""Parse 'attributes' section"""
if "attributes" not in attrs:
return ""
- lines = [self.fmt.rst_fields("attributes", self.fmt.rst_list_inline(attrs["attributes"]), level + 1)]
+ lines = [self.fmt.rst_fields("attributes",
+ self.fmt.rst_list_inline(attrs["attributes"]),
+ level + 1)]
return "\n".join(lines)
-
def parse_operations(self, operations: List[Dict[str, Any]], namespace: str) -> str:
"""Parse operations block"""
preprocessed = ["name", "doc", "title", "do", "dump", "flags"]
@@ -194,7 +169,8 @@ class YnlDocGenerator:
lines = []
for operation in operations:
- lines.append(self.fmt.rst_section(namespace, 'operation', operation["name"]))
+ lines.append(self.fmt.rst_section(namespace, 'operation',
+ operation["name"]))
lines.append(self.fmt.rst_paragraph(operation["doc"]) + "\n")
for key in operation.keys():
@@ -206,7 +182,8 @@ class YnlDocGenerator:
value = self.fmt.rst_ref(namespace, key, value)
lines.append(self.fmt.rst_fields(key, value, 0))
if 'flags' in operation:
- lines.append(self.fmt.rst_fields('flags', self.fmt.rst_list_inline(operation['flags'])))
+ lines.append(self.fmt.rst_fields('flags',
+ self.fmt.rst_list_inline(operation['flags'])))
if "do" in operation:
lines.append(self.fmt.rst_paragraph(":do:", 0))
@@ -220,7 +197,6 @@ class YnlDocGenerator:
return "\n".join(lines)
-
def parse_entries(self, entries: List[Dict[str, Any]], level: int) -> str:
"""Parse a list of entries"""
ignored = ["pad"]
@@ -235,17 +211,19 @@ class YnlDocGenerator:
if type_:
field_name += f" ({self.fmt.inline(type_)})"
lines.append(
- self.fmt.rst_fields(field_name, self.fmt.sanitize(entry.get("doc", "")), level)
+ self.fmt.rst_fields(field_name,
+ self.fmt.sanitize(entry.get("doc", "")),
+ level)
)
elif isinstance(entry, list):
lines.append(self.fmt.rst_list_inline(entry, level))
else:
- lines.append(self.fmt.rst_bullet(self.fmt.inline(self.fmt.sanitize(entry)), level))
+ lines.append(self.fmt.rst_bullet(self.fmt.inline(self.fmt.sanitize(entry)),
+ level))
lines.append("\n")
return "\n".join(lines)
-
def parse_definitions(self, defs: Dict[str, Any], namespace: str) -> str:
"""Parse definitions section"""
preprocessed = ["name", "entries", "members"]
@@ -270,7 +248,6 @@ class YnlDocGenerator:
return "\n".join(lines)
-
def parse_attr_sets(self, entries: List[Dict[str, Any]], namespace: str) -> str:
"""Parse attribute from attribute-set"""
preprocessed = ["name", "type"]
@@ -279,7 +256,8 @@ class YnlDocGenerator:
lines = []
for entry in entries:
- lines.append(self.fmt.rst_section(namespace, 'attribute-set', entry["name"]))
+ lines.append(self.fmt.rst_section(namespace, 'attribute-set',
+ entry["name"]))
for attr in entry["attributes"]:
type_ = attr.get("type")
attr_line = attr["name"]
@@ -301,13 +279,13 @@ class YnlDocGenerator:
return "\n".join(lines)
-
def parse_sub_messages(self, entries: List[Dict[str, Any]], namespace: str) -> str:
"""Parse sub-message definitions"""
lines = []
for entry in entries:
- lines.append(self.fmt.rst_section(namespace, 'sub-message', entry["name"]))
+ lines.append(self.fmt.rst_section(namespace, 'sub-message',
+ entry["name"]))
for fmt in entry["formats"]:
value = fmt["value"]
@@ -315,13 +293,14 @@ class YnlDocGenerator:
for attr in ['fixed-header', 'attribute-set']:
if attr in fmt:
lines.append(self.fmt.rst_fields(attr,
- self.fmt.rst_ref(namespace, attr, fmt[attr]),
- 1))
+ self.fmt.rst_ref(namespace,
+ attr,
+ fmt[attr]),
+ 1))
lines.append("\n")
return "\n".join(lines)
-
def parse_yaml(self, obj: Dict[str, Any]) -> str:
"""Format the whole YAML into a RST string"""
lines = []
@@ -344,7 +323,8 @@ class YnlDocGenerator:
# Operations
if "operations" in obj:
lines.append(self.fmt.rst_subtitle("Operations"))
- lines.append(self.parse_operations(obj["operations"]["list"], family))
+ lines.append(self.parse_operations(obj["operations"]["list"],
+ family))
# Multicast groups
if "mcast-groups" in obj:
@@ -368,11 +348,9 @@ class YnlDocGenerator:
return "\n".join(lines)
-
# Main functions
# ==============
-
def parse_yaml_file(self, filename: str) -> str:
"""Transform the YAML specified by filename into an RST-formatted string"""
with open(filename, "r", encoding="utf-8") as spec_file:
@@ -380,30 +358,3 @@ class YnlDocGenerator:
content = self.parse_yaml(yaml_data)
return content
-
-
- def generate_main_index_rst(self, output: str, index_dir: str) -> None:
- """Generate the `networking_spec/index` content and write to the file"""
- lines = []
-
- lines.append(self.fmt.rst_header())
- lines.append(self.fmt.rst_label("specs"))
- lines.append(self.fmt.rst_title("Netlink Family Specifications"))
- lines.append(self.fmt.rst_toctree(1))
-
- index_fname = os.path.basename(output)
- base, ext = os.path.splitext(index_fname)
-
- if not index_dir:
- index_dir = os.path.dirname(output)
-
- logging.debug(f"Looking for {ext} files in %s", index_dir)
- for filename in sorted(os.listdir(index_dir)):
- if not filename.endswith(ext) or filename == index_fname:
- continue
- base, ext = os.path.splitext(filename)
- lines.append(f" {base}\n")
-
- logging.debug("Writing an index file at %s", output)
-
- return "".join(lines)
diff --git a/tools/net/ynl/pyynl/ynl_gen_rst.py b/tools/net/ynl/pyynl/ynl_gen_rst.py
index 624b0960476e..85e9e2520393 100755
--- a/tools/net/ynl/pyynl/ynl_gen_rst.py
+++ b/tools/net/ynl/pyynl/ynl_gen_rst.py
@@ -23,7 +23,7 @@ SRC_DIR = os.path.dirname(os.path.realpath(__file__))
sys.path.insert(0, os.path.join(SRC_DIR, LIB_DIR))
-from netlink_yml_parser import YnlDocGenerator
+from netlink_yml_parser import YnlDocGenerator # pylint: disable=C0413
def parse_arguments() -> argparse.Namespace:
@@ -74,7 +74,31 @@ def write_to_rstfile(content: str, filename: str) -> None:
def write_index_rst(parser: YnlDocGenerator, output: str, index_dir: str) -> None:
"""Generate the `networking_spec/index` content and write to the file"""
- msg = parser.generate_main_index_rst(output, index_dir)
+ lines = []
+
+ lines.append(parser.fmt.rst_header())
+ lines.append(parser.fmt.rst_label("specs"))
+ lines.append(parser.fmt.rst_title("Netlink Family Specifications"))
+ lines.append(parser.fmt.rst_toctree(1))
+
+ index_fname = os.path.basename(output)
+ if not index_dir:
+ index_dir = os.path.dirname(output)
+
+ exts = [ ".yaml", ".rst" ]
+
+ logging.debug(f"Looking for files in %s", index_dir)
+ for filename in sorted(os.listdir(index_dir)):
+ if filename == index_fname:
+ continue
+
+ for ext in exts:
+ if not filename.endswith(ext):
+ continue
+ base, ext = os.path.splitext(filename)
+ lines.append(f" {base}\n")
+
+ msg = "".join(lines)
logging.debug("Writing an index file at %s", output)
write_to_rstfile(msg, output)
--
2.49.0
* [PATCH v4 08/14] docs: sphinx: add a parser for yaml files for Netlink specs
2025-06-14 8:55 [PATCH v4 00/14] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
` (6 preceding siblings ...)
2025-06-14 8:56 ` [PATCH v4 07/14] tools: ynl_gen_rst.py: move index.rst generator to the script Mauro Carvalho Chehab
@ 2025-06-14 8:56 ` Mauro Carvalho Chehab
2025-06-14 8:56 ` [PATCH v4 09/14] docs: use parser_yaml extension to handle " Mauro Carvalho Chehab
` (5 subsequent siblings)
13 siblings, 0 replies; 29+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-14 8:56 UTC (permalink / raw)
To: Linux Doc Mailing List, Jonathan Corbet
Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
David S. Miller, Donald Hunter, Eric Dumazet,
Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
joel, linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz,
stern
Add a simple sphinx.Parser to handle yaml files and add the
code to handle Netlink specs. All other yaml files are
ignored.
The code was written in a way that parsing yaml for different
subsystems, and even for different parts of Netlink, is easy.
All it takes to have a different parser is to add an
import line similar to:
from netlink_yml_parser import YnlDocGenerator
adding the corresponding parser somewhere in the extension:
netlink_parser = YnlDocGenerator()
And then add logic inside parse() to handle different
doc outputs, depending on the file location, similar to:
if "/netlink/specs/" in fname:
msg = self.netlink_parser.parse_yaml_file(fname)
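Putting the snippets above together, the dispatch-by-path idea can be
sketched as a small table lookup. The path fragments and converter names
below are illustrative placeholders, not the actual extension code:

```python
# Hypothetical sketch of routing a yaml source file to the right
# converter based on its location in the source tree.

def pick_converter(fname, converters):
    """Return the first converter whose path fragment appears in fname,
    or None when the yaml file should be ignored."""
    for fragment, convert in converters.items():
        if fragment in fname:
            return convert
    return None  # all other yaml files are ignored

# One entry per supported yaml flavor; future subsystems would add
# their own (fragment, parser) pair here.
converters = {
    "/netlink/specs/": lambda fname: f"netlink RST for {fname}",
}
```

A second subsystem only needs one more import and one more dictionary
entry, which is the extensibility the commit message describes.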
Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
.pylintrc | 2 +-
Documentation/sphinx/parser_yaml.py | 76 +++++++++++++++++++++++++++++
2 files changed, 77 insertions(+), 1 deletion(-)
create mode 100755 Documentation/sphinx/parser_yaml.py
diff --git a/.pylintrc b/.pylintrc
index 30b8ae1659f8..f1d21379254b 100644
--- a/.pylintrc
+++ b/.pylintrc
@@ -1,2 +1,2 @@
[MASTER]
-init-hook='import sys; sys.path += ["scripts/lib/kdoc", "scripts/lib/abi"]'
+init-hook='import sys; sys.path += ["scripts/lib", "scripts/lib/kdoc", "scripts/lib/abi"]'
diff --git a/Documentation/sphinx/parser_yaml.py b/Documentation/sphinx/parser_yaml.py
new file mode 100755
index 000000000000..9083e102c9f3
--- /dev/null
+++ b/Documentation/sphinx/parser_yaml.py
@@ -0,0 +1,76 @@
+"""
+Sphinx extension for processing YAML files
+"""
+
+import os
+import re
+import sys
+
+from pprint import pformat
+
+from docutils.parsers.rst import Parser as RSTParser
+from docutils.statemachine import ViewList
+
+from sphinx.util import logging
+from sphinx.parsers import Parser
+
+srctree = os.path.abspath(os.environ["srctree"])
+sys.path.insert(0, os.path.join(srctree, "scripts/lib"))
+
+from netlink_yml_parser import YnlDocGenerator # pylint: disable=C0413
+
+logger = logging.getLogger(__name__)
+
+class YamlParser(Parser):
+ """Custom parser for YAML files."""
+
+ # Need at least two elements on this set
+ supported = ('yaml', 'yml')
+
+ netlink_parser = YnlDocGenerator()
+
+ def do_parse(self, inputstring, document, msg):
+ """Parse YAML and generate a document tree."""
+
+ self.setup_parse(inputstring, document)
+
+ result = ViewList()
+
+ try:
+ # Parse message with RSTParser
+ for i, line in enumerate(msg.split('\n')):
+ result.append(line, document.current_source, i)
+
+ rst_parser = RSTParser()
+ rst_parser.parse('\n'.join(result), document)
+
+ except Exception as e:
+ document.reporter.error("YAML parsing error: %s" % pformat(e))
+
+ self.finish_parse()
+
+ # Overrides docutils.parsers.Parser. See sphinx.parsers.RSTParser
+ def parse(self, inputstring, document):
+ """Check if a YAML is meant to be parsed."""
+
+ fname = document.current_source
+
+ # Handle netlink yaml specs
+ if "/netlink/specs/" in fname:
+ msg = self.netlink_parser.parse_yaml_file(fname)
+ self.do_parse(inputstring, document, msg)
+
+ # All other yaml files are ignored
+
+def setup(app):
+ """Setup function for the Sphinx extension."""
+
+ # Add YAML parser
+ app.add_source_parser(YamlParser)
+ app.add_source_suffix('.yaml', 'yaml')
+
+ return {
+ 'version': '1.0',
+ 'parallel_read_safe': True,
+ 'parallel_write_safe': True,
+ }
--
2.49.0
* [PATCH v4 09/14] docs: use parser_yaml extension to handle Netlink specs
2025-06-14 8:55 [PATCH v4 00/14] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
` (7 preceding siblings ...)
2025-06-14 8:56 ` [PATCH v4 08/14] docs: sphinx: add a parser for yaml files for Netlink specs Mauro Carvalho Chehab
@ 2025-06-14 8:56 ` Mauro Carvalho Chehab
2025-06-14 8:56 ` [PATCH v4 10/14] docs: conf.py: don't handle yaml files outside " Mauro Carvalho Chehab
` (4 subsequent siblings)
13 siblings, 0 replies; 29+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-14 8:56 UTC (permalink / raw)
To: Linux Doc Mailing List, Jonathan Corbet
Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
David S. Miller, Donald Hunter, Eric Dumazet,
Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
linux-kernel, lkmm, netdev, peterz, stern
Instead of manually calling ynl_gen_rst.py, use a Sphinx extension.
This way, no .rst files are written to the kernel source
directories.
We are using a toctree with the :glob: property here. This way, there
is no need to touch the netlink/specs/index.rst file every time
a new Netlink spec is added, renamed or removed.
Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
Documentation/Makefile | 17 -----------------
Documentation/conf.py | 11 ++++++-----
Documentation/netlink/specs/index.rst | 13 +++++++++++++
Documentation/networking/index.rst | 2 +-
.../networking/netlink_spec/readme.txt | 4 ----
5 files changed, 20 insertions(+), 27 deletions(-)
create mode 100644 Documentation/netlink/specs/index.rst
delete mode 100644 Documentation/networking/netlink_spec/readme.txt
diff --git a/Documentation/Makefile b/Documentation/Makefile
index d30d66ddf1ad..9185680b1e86 100644
--- a/Documentation/Makefile
+++ b/Documentation/Makefile
@@ -102,22 +102,6 @@ quiet_cmd_sphinx = SPHINX $@ --> file://$(abspath $(BUILDDIR)/$3/$4)
cp $(if $(patsubst /%,,$(DOCS_CSS)),$(abspath $(srctree)/$(DOCS_CSS)),$(DOCS_CSS)) $(BUILDDIR)/$3/_static/; \
fi
-YNL_INDEX:=$(srctree)/Documentation/networking/netlink_spec/index.rst
-YNL_RST_DIR:=$(srctree)/Documentation/networking/netlink_spec
-YNL_YAML_DIR:=$(srctree)/Documentation/netlink/specs
-YNL_TOOL:=$(srctree)/tools/net/ynl/pyynl/ynl_gen_rst.py
-
-YNL_RST_FILES_TMP := $(patsubst %.yaml,%.rst,$(wildcard $(YNL_YAML_DIR)/*.yaml))
-YNL_RST_FILES := $(patsubst $(YNL_YAML_DIR)%,$(YNL_RST_DIR)%, $(YNL_RST_FILES_TMP))
-
-$(YNL_INDEX): $(YNL_RST_FILES)
- $(Q)$(YNL_TOOL) -o $@ -x
-
-$(YNL_RST_DIR)/%.rst: $(YNL_YAML_DIR)/%.yaml $(YNL_TOOL)
- $(Q)$(YNL_TOOL) -i $< -o $@
-
-htmldocs texinfodocs latexdocs epubdocs xmldocs: $(YNL_INDEX)
-
htmldocs:
@$(srctree)/scripts/sphinx-pre-install --version-check
@+$(foreach var,$(SPHINXDIRS),$(call loop_cmd,sphinx,html,$(var),,$(var)))
@@ -184,7 +168,6 @@ refcheckdocs:
$(Q)cd $(srctree);scripts/documentation-file-ref-check
cleandocs:
- $(Q)rm -f $(YNL_INDEX) $(YNL_RST_FILES)
$(Q)rm -rf $(BUILDDIR)
$(Q)$(MAKE) BUILDDIR=$(abspath $(BUILDDIR)) $(build)=Documentation/userspace-api/media clean
diff --git a/Documentation/conf.py b/Documentation/conf.py
index 12de52a2b17e..add6ce78dd80 100644
--- a/Documentation/conf.py
+++ b/Documentation/conf.py
@@ -45,7 +45,7 @@ needs_sphinx = '3.4.3'
extensions = ['kerneldoc', 'rstFlatTable', 'kernel_include',
'kfigure', 'sphinx.ext.ifconfig', 'automarkup',
'maintainers_include', 'sphinx.ext.autosectionlabel',
- 'kernel_abi', 'kernel_feat', 'translations']
+ 'kernel_abi', 'kernel_feat', 'translations', 'parser_yaml']
# Since Sphinx version 3, the C function parser is more pedantic with regards
# to type checking. Due to that, having macros at c:function cause problems.
@@ -143,10 +143,11 @@ else:
# Add any paths that contain templates here, relative to this directory.
templates_path = ['sphinx/templates']
-# The suffix(es) of source filenames.
-# You can specify multiple suffix as a list of string:
-# source_suffix = ['.rst', '.md']
-source_suffix = '.rst'
+# The suffixes of source filenames that will be automatically parsed
+source_suffix = {
+ '.rst': 'restructuredtext',
+ '.yaml': 'yaml',
+}
# The encoding of source files.
#source_encoding = 'utf-8-sig'
diff --git a/Documentation/netlink/specs/index.rst b/Documentation/netlink/specs/index.rst
new file mode 100644
index 000000000000..7f7cf4a096f2
--- /dev/null
+++ b/Documentation/netlink/specs/index.rst
@@ -0,0 +1,13 @@
+.. SPDX-License-Identifier: GPL-2.0
+
+.. _specs:
+
+=============================
+Netlink Family Specifications
+=============================
+
+.. toctree::
+ :maxdepth: 1
+ :glob:
+
+ *
diff --git a/Documentation/networking/index.rst b/Documentation/networking/index.rst
index ac90b82f3ce9..b7a4969e9bc9 100644
--- a/Documentation/networking/index.rst
+++ b/Documentation/networking/index.rst
@@ -57,7 +57,7 @@ Contents:
filter
generic-hdlc
generic_netlink
- netlink_spec/index
+ ../netlink/specs/index
gen_stats
gtp
ila
diff --git a/Documentation/networking/netlink_spec/readme.txt b/Documentation/networking/netlink_spec/readme.txt
deleted file mode 100644
index 030b44aca4e6..000000000000
--- a/Documentation/networking/netlink_spec/readme.txt
+++ /dev/null
@@ -1,4 +0,0 @@
-SPDX-License-Identifier: GPL-2.0
-
-This file is populated during the build of the documentation (htmldocs) by the
-tools/net/ynl/pyynl/ynl_gen_rst.py script.
--
2.49.0
* [PATCH v4 10/14] docs: conf.py: don't handle yaml files outside Netlink specs
2025-06-14 8:55 [PATCH v4 00/14] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
` (8 preceding siblings ...)
2025-06-14 8:56 ` [PATCH v4 09/14] docs: use parser_yaml extension to handle " Mauro Carvalho Chehab
@ 2025-06-14 8:56 ` Mauro Carvalho Chehab
2025-06-14 8:56 ` [PATCH v4 11/14] docs: uapi: netlink: update netlink specs link Mauro Carvalho Chehab
` (3 subsequent siblings)
13 siblings, 0 replies; 29+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-14 8:56 UTC (permalink / raw)
To: Linux Doc Mailing List, Jonathan Corbet
Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
David S. Miller, Donald Hunter, Eric Dumazet,
Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
joel, linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz,
stern
The parser_yaml extension already has logic to avoid
handling all yaml documents. However, if we don't also exclude
the patterns at conf.py, the build time would increase a lot,
and warnings like these would be generated:
Documentation/netlink/genetlink.yaml: WARNING: document isn't included in any toctree
Documentation/netlink/genetlink-c.yaml: WARNING: document isn't included in any toctree
Documentation/netlink/genetlink-legacy.yaml: WARNING: document isn't included in any toctree
Documentation/netlink/index.rst: WARNING: document isn't included in any toctree
Documentation/netlink/netlink-raw.yaml: WARNING: document isn't included in any toctree
Add some exclusion rules to prevent that.
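The intent of the include/exclude patterns can be approximated with
fnmatch, as below. This is only an illustration of the selection logic:
Sphinx uses its own pattern matcher, where a single `*` does not cross
directory separators while `**` does, so the edge cases differ slightly.

```python
from fnmatch import fnmatch

def is_included(docname, include_patterns, exclude_patterns):
    """Rough fnmatch-based approximation of how Sphinx decides whether
    a source file is picked up: exclusions win, then a file must match
    at least one include pattern."""
    if any(fnmatch(docname, p) for p in exclude_patterns):
        return False
    return any(fnmatch(docname, p) for p in include_patterns)
```

With the patterns from this patch, all .rst files and the yaml specs
under netlink/specs/ are picked up, while the schema files directly
under netlink/ are left out.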
Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
Documentation/conf.py | 12 ++++++++++--
1 file changed, 10 insertions(+), 2 deletions(-)
diff --git a/Documentation/conf.py b/Documentation/conf.py
index add6ce78dd80..62a51ac64b95 100644
--- a/Documentation/conf.py
+++ b/Documentation/conf.py
@@ -221,8 +221,16 @@ language = 'en'
#today_fmt = '%B %d, %Y'
# List of patterns, relative to source directory, that match files and
-# directories to ignore when looking for source files.
-exclude_patterns = ['output']
+# directories.
+include_patterns = [
+ '**.rst',
+ 'netlink/specs/*.yaml',
+]
+
+# patterns to ignore when looking for source files.
+exclude_patterns = [
+ 'output',
+]
# The reST default role (used for this markup: `text`) to use for all
# documents.
--
2.49.0
* [PATCH v4 11/14] docs: uapi: netlink: update netlink specs link
2025-06-14 8:55 [PATCH v4 00/14] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
` (9 preceding siblings ...)
2025-06-14 8:56 ` [PATCH v4 10/14] docs: conf.py: don't handle yaml files outside " Mauro Carvalho Chehab
@ 2025-06-14 8:56 ` Mauro Carvalho Chehab
2025-06-14 8:56 ` [PATCH v4 12/14] MAINTAINERS: add maintainers for netlink_yml_parser.py Mauro Carvalho Chehab
` (2 subsequent siblings)
13 siblings, 0 replies; 29+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-14 8:56 UTC (permalink / raw)
To: Linux Doc Mailing List, Jonathan Corbet
Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
David S. Miller, Donald Hunter, Eric Dumazet,
Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
Jakub Kicinski, Simon Horman, joel, linux-kernel-mentees,
linux-kernel, lkmm, netdev, peterz, stern
With the recent parser_yaml extension, and the removal of the
auto-generated ReST source files, the location of the netlink
specs changed.
Update the uAPI documentation accordingly.
Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
Documentation/userspace-api/netlink/index.rst | 2 +-
Documentation/userspace-api/netlink/specs.rst | 2 +-
2 files changed, 2 insertions(+), 2 deletions(-)
diff --git a/Documentation/userspace-api/netlink/index.rst b/Documentation/userspace-api/netlink/index.rst
index c1b6765cc963..83ae25066591 100644
--- a/Documentation/userspace-api/netlink/index.rst
+++ b/Documentation/userspace-api/netlink/index.rst
@@ -18,4 +18,4 @@ Netlink documentation for users.
See also:
- :ref:`Documentation/core-api/netlink.rst <kernel_netlink>`
- - :ref:`Documentation/networking/netlink_spec/index.rst <specs>`
+ - :ref:`Documentation/netlink/specs/index.rst <specs>`
diff --git a/Documentation/userspace-api/netlink/specs.rst b/Documentation/userspace-api/netlink/specs.rst
index 1b50d97d8d7c..debb4bfca5c4 100644
--- a/Documentation/userspace-api/netlink/specs.rst
+++ b/Documentation/userspace-api/netlink/specs.rst
@@ -15,7 +15,7 @@ kernel headers directly.
Internally kernel uses the YAML specs to generate:
- the C uAPI header
- - documentation of the protocol as a ReST file - see :ref:`Documentation/networking/netlink_spec/index.rst <specs>`
+ - documentation of the protocol as a ReST file - see :ref:`Documentation/netlink/specs/index.rst <specs>`
- policy tables for input attribute validation
- operation tables
--
2.49.0
* [PATCH v4 12/14] MAINTAINERS: add maintainers for netlink_yml_parser.py
2025-06-14 8:55 [PATCH v4 00/14] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
` (10 preceding siblings ...)
2025-06-14 8:56 ` [PATCH v4 11/14] docs: uapi: netlink: update netlink specs link Mauro Carvalho Chehab
@ 2025-06-14 8:56 ` Mauro Carvalho Chehab
2025-06-14 14:22 ` Donald Hunter
2025-06-14 8:56 ` [PATCH v4 13/14] docs: Makefile: disable check rules on make cleandocs Mauro Carvalho Chehab
2025-06-14 8:56 ` [PATCH v4 14/14] docs: conf.py: properly handle include and exclude patterns Mauro Carvalho Chehab
13 siblings, 1 reply; 29+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-14 8:56 UTC (permalink / raw)
To: Linux Doc Mailing List, Jonathan Corbet
Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
David S. Miller, Donald Hunter, Eric Dumazet,
Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
joel, linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz,
stern
The parsing code from tools/net/ynl/pyynl/ynl_gen_rst.py was moved
to scripts/lib/netlink_yml_parser.py. Its maintainership
remains with the Netlink maintainers. Yet, as it is used by the Sphinx
build system, add it also to the linux-doc maintainers' entry, as changes
there might affect documentation builds. So, the linux-doc ML
should ideally be Cc'd on changes to it.
Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
MAINTAINERS | 2 ++
1 file changed, 2 insertions(+)
diff --git a/MAINTAINERS b/MAINTAINERS
index a92290fffa16..2c0b13e5d8fc 100644
--- a/MAINTAINERS
+++ b/MAINTAINERS
@@ -7202,6 +7202,7 @@ F: scripts/get_abi.py
F: scripts/kernel-doc*
F: scripts/lib/abi/*
F: scripts/lib/kdoc/*
+F: scripts/lib/netlink_yml_parser.py
F: scripts/sphinx-pre-install
X: Documentation/ABI/
X: Documentation/admin-guide/media/
@@ -27314,6 +27315,7 @@ M: Jakub Kicinski <kuba@kernel.org>
F: Documentation/netlink/
F: Documentation/userspace-api/netlink/intro-specs.rst
F: Documentation/userspace-api/netlink/specs.rst
+F: scripts/lib/netlink_yml_parser.py
F: tools/net/ynl/
YEALINK PHONE DRIVER
--
2.49.0
* [PATCH v4 13/14] docs: Makefile: disable check rules on make cleandocs
2025-06-14 8:55 [PATCH v4 00/14] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
` (11 preceding siblings ...)
2025-06-14 8:56 ` [PATCH v4 12/14] MAINTAINERS: add maintainers for netlink_yml_parser.py Mauro Carvalho Chehab
@ 2025-06-14 8:56 ` Mauro Carvalho Chehab
2025-06-14 8:56 ` [PATCH v4 14/14] docs: conf.py: properly handle include and exclude patterns Mauro Carvalho Chehab
13 siblings, 0 replies; 29+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-14 8:56 UTC (permalink / raw)
To: Linux Doc Mailing List, Jonathan Corbet
Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
David S. Miller, Donald Hunter, Eric Dumazet,
Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
joel, linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz,
stern
It doesn't make sense to check for broken documentation references
and ABI errors when cleaning the tree.
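The gating can be tried outside the kernel tree; below is a minimal sketch of the same MAKECMDGOALS pattern on a scratch Makefile, where the $(info ...) line stands in for the real documentation checks:

```shell
# Scratch Makefile reproducing the MAKECMDGOALS gating (sketch only)
cat > /tmp/gating.mk <<'EOF'
ifneq ($(MAKECMDGOALS),cleandocs)
$(info doc checks enabled)
endif
all: ; @true
cleandocs: ; @true
EOF

make -s -f /tmp/gating.mk all        # checks run: prints "doc checks enabled"
make -s -f /tmp/gating.mk cleandocs  # checks skipped: prints nothing
```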
Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
Documentation/Makefile | 2 ++
1 file changed, 2 insertions(+)
diff --git a/Documentation/Makefile b/Documentation/Makefile
index 9185680b1e86..820f07e0afe6 100644
--- a/Documentation/Makefile
+++ b/Documentation/Makefile
@@ -5,6 +5,7 @@
# for cleaning
subdir- := devicetree/bindings
+ifneq ($(MAKECMDGOALS),cleandocs)
# Check for broken documentation file references
ifeq ($(CONFIG_WARN_MISSING_DOCUMENTS),y)
$(shell $(srctree)/scripts/documentation-file-ref-check --warn)
@@ -14,6 +15,7 @@ endif
ifeq ($(CONFIG_WARN_ABI_ERRORS),y)
$(shell $(srctree)/scripts/get_abi.py --dir $(srctree)/Documentation/ABI validate)
endif
+endif
# You can set these variables from the command line.
SPHINXBUILD = sphinx-build
--
2.49.0
* [PATCH v4 14/14] docs: conf.py: properly handle include and exclude patterns
2025-06-14 8:55 [PATCH v4 00/14] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
` (12 preceding siblings ...)
2025-06-14 8:56 ` [PATCH v4 13/14] docs: Makefile: disable check rules on make cleandocs Mauro Carvalho Chehab
@ 2025-06-14 8:56 ` Mauro Carvalho Chehab
13 siblings, 0 replies; 29+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-14 8:56 UTC (permalink / raw)
To: Linux Doc Mailing List, Jonathan Corbet
Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
David S. Miller, Donald Hunter, Eric Dumazet,
Ignacio Encinas Rubio, Jan Stancek, Marco Elver,
Mauro Carvalho Chehab, Paolo Abeni, Ruben Wauters, Shuah Khan,
joel, linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz,
stern
When one does:
make SPHINXDIRS="netlink/specs" htmldocs
the build breaks, because a statically-defined pattern
like:
include_patterns = [
...
'netlink/specs/*.yaml',
]
would end up pointing to Documentation/netlink/specs/netlink/specs,
as the pattern is relative to the source directory. Likewise, when
SPHINXDIRS is used, exclude_patterns = [ "output" ] is also wrong.
Fix conf.py to compute the include/exclude patterns dynamically,
making them relative to the SOURCEDIR parameter passed to sphinx-build.
Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
Documentation/conf.py | 58 ++++++++++++++++++++++++++++++++++---------
1 file changed, 46 insertions(+), 12 deletions(-)
diff --git a/Documentation/conf.py b/Documentation/conf.py
index 62a51ac64b95..be678bdf95d4 100644
--- a/Documentation/conf.py
+++ b/Documentation/conf.py
@@ -17,6 +17,52 @@ import os
import sphinx
import shutil
+# Location of Documentation/ directory
+doctree = os.path.abspath('.')
+
+# List of patterns that don't contain directory names, in glob format.
+include_patterns = ['**.rst']
+exclude_patterns = []
+
+# List of patterns that contain directory names in glob format.
+dyn_include_patterns = ['netlink/specs/**.yaml']
+dyn_exclude_patterns = ['output']
+
+def setup(app):
+ """
+ In Sphinx, all directories are relative to what is passed as the
+ SOURCEDIR parameter to sphinx-build. Due to that, all patterns
+ that contain directory names need to be set dynamically, after
+ converting them to a relative path.
+
+ As Sphinx doesn't include any patterns outside SOURCEDIR, we should
+ exclude relative patterns that start with "../".
+ """
+
+ sourcedir = app.srcdir # full path to the source directory
+ builddir = os.environ.get("BUILDDIR")
+
+ # setup include_patterns dynamically
+ for p in dyn_include_patterns:
+ full = os.path.join(doctree, p)
+
+ rel_path = os.path.relpath(full, start = app.srcdir)
+ if rel_path.startswith("../"):
+ continue
+
+ app.config.include_patterns.append(rel_path)
+
+ # setup exclude_patterns dynamically
+ for p in dyn_exclude_patterns:
+ full = os.path.join(doctree, p)
+
+ rel_path = os.path.relpath(full, start = app.srcdir)
+ if rel_path.startswith("../"):
+ continue
+
+ app.config.exclude_patterns.append(rel_path)
+
+
# helper
# ------
@@ -220,18 +266,6 @@ language = 'en'
# Else, today_fmt is used as the format for a strftime call.
#today_fmt = '%B %d, %Y'
-# List of patterns, relative to source directory, that match files and
-# directories.
-include_patterns = [
- '**.rst',
- 'netlink/specs/*.yaml',
-]
-
-# patterns to ignore when looking for source files.
-exclude_patterns = [
- 'output',
-]
-
# The reST default role (used for this markup: `text`) to use for all
# documents.
#default_role = None
--
2.49.0
* Re: [PATCH v4 04/14] tools: ynl_gen_rst.py: make the index parser more generic
2025-06-14 8:55 ` [PATCH v4 04/14] tools: ynl_gen_rst.py: make the index parser more generic Mauro Carvalho Chehab
@ 2025-06-14 13:41 ` Donald Hunter
2025-06-14 14:58 ` Mauro Carvalho Chehab
0 siblings, 1 reply; 29+ messages in thread
From: Donald Hunter @ 2025-06-14 13:41 UTC (permalink / raw)
To: Mauro Carvalho Chehab
Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
Breno Leitao, David S. Miller, Eric Dumazet,
Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
linux-kernel, lkmm, netdev, peterz, stern
On Sat, 14 Jun 2025 at 09:56, Mauro Carvalho Chehab
<mchehab+huawei@kernel.org> wrote:
>
> It is not a good practice to store build-generated files
> inside $(srctree), as one may be using O=<BUILDDIR> and even
> have the kernel in a read-only directory.
>
> Change the YAML generation for netlink files to allow it
> to parse data from either the source or the object tree.
>
> Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
> ---
> tools/net/ynl/pyynl/ynl_gen_rst.py | 22 ++++++++++++++++------
> 1 file changed, 16 insertions(+), 6 deletions(-)
It looks like this patch is no longer required since this script
doesn't get run by `make htmldocs` any more.
Instead, I think there is cleanup work left, such as removing
now-unused code like `generate_main_index_rst`.
This whole script may be unnecessary now, unless we want a simple way
to run YnlDocGenerator separately from the main doc build.
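For reference, the directory-scan logic that --input_dir parameterizes (and that may end up removed) can be sketched standalone; the temporary files below are stand-ins for generated per-family .rst files:

```python
import os
import tempfile

# Mirror of generate_main_index_rst's scan: list files carrying the
# index's extension, skipping the index file itself.
def scan_index(output, index_dir=None):
    index_fname = os.path.basename(output)
    _, ext = os.path.splitext(index_fname)
    if not index_dir:
        index_dir = os.path.dirname(output)
    entries = []
    for filename in sorted(os.listdir(index_dir)):
        if not filename.endswith(ext) or filename == index_fname:
            continue
        entries.append(os.path.splitext(filename)[0])
    return entries

d = tempfile.mkdtemp()
for name in ("index.rst", "ethtool.rst", "devlink.rst", "notes.txt"):
    open(os.path.join(d, name), "w").close()

print(scan_index(os.path.join(d, "index.rst")))  # ['devlink', 'ethtool']
```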
>
> diff --git a/tools/net/ynl/pyynl/ynl_gen_rst.py b/tools/net/ynl/pyynl/ynl_gen_rst.py
> index 7bfb8ceeeefc..b1e5acafb998 100755
> --- a/tools/net/ynl/pyynl/ynl_gen_rst.py
> +++ b/tools/net/ynl/pyynl/ynl_gen_rst.py
> @@ -365,6 +365,7 @@ def parse_arguments() -> argparse.Namespace:
>
> parser.add_argument("-v", "--verbose", action="store_true")
> parser.add_argument("-o", "--output", help="Output file name")
> + parser.add_argument("-d", "--input_dir", help="YAML input directory")
>
> # Index and input are mutually exclusive
> group = parser.add_mutually_exclusive_group()
> @@ -405,11 +406,14 @@ def write_to_rstfile(content: str, filename: str) -> None:
> """Write the generated content into an RST file"""
> logging.debug("Saving RST file to %s", filename)
>
> + dir = os.path.dirname(filename)
> + os.makedirs(dir, exist_ok=True)
> +
> with open(filename, "w", encoding="utf-8") as rst_file:
> rst_file.write(content)
>
>
> -def generate_main_index_rst(output: str) -> None:
> +def generate_main_index_rst(output: str, index_dir: str) -> None:
> """Generate the `networking_spec/index` content and write to the file"""
> lines = []
>
> @@ -418,12 +422,18 @@ def generate_main_index_rst(output: str) -> None:
> lines.append(rst_title("Netlink Family Specifications"))
> lines.append(rst_toctree(1))
>
> - index_dir = os.path.dirname(output)
> - logging.debug("Looking for .rst files in %s", index_dir)
> + index_fname = os.path.basename(output)
> + base, ext = os.path.splitext(index_fname)
> +
> + if not index_dir:
> + index_dir = os.path.dirname(output)
> +
> + logging.debug(f"Looking for {ext} files in %s", index_dir)
> for filename in sorted(os.listdir(index_dir)):
> - if not filename.endswith(".rst") or filename == "index.rst":
> + if not filename.endswith(ext) or filename == index_fname:
> continue
> - lines.append(f" {filename.replace('.rst', '')}\n")
> + base, ext = os.path.splitext(filename)
> + lines.append(f" {base}\n")
>
> logging.debug("Writing an index file at %s", output)
> write_to_rstfile("".join(lines), output)
> @@ -447,7 +457,7 @@ def main() -> None:
>
> if args.index:
> # Generate the index RST file
> - generate_main_index_rst(args.output)
> + generate_main_index_rst(args.output, args.input_dir)
>
>
> if __name__ == "__main__":
> --
> 2.49.0
>
* Re: [PATCH v4 05/14] tools: ynl_gen_rst.py: Split library from command line tool
2025-06-14 8:55 ` [PATCH v4 05/14] tools: ynl_gen_rst.py: Split library from command line tool Mauro Carvalho Chehab
@ 2025-06-14 14:09 ` Donald Hunter
0 siblings, 0 replies; 29+ messages in thread
From: Donald Hunter @ 2025-06-14 14:09 UTC (permalink / raw)
To: Mauro Carvalho Chehab
Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
Breno Leitao, David S. Miller, Eric Dumazet,
Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
linux-kernel, lkmm, netdev, peterz, stern
On Sat, 14 Jun 2025 at 09:56, Mauro Carvalho Chehab
<mchehab+huawei@kernel.org> wrote:
>
> As we'll be using the Netlink specs parser inside a Sphinx
> extension, move the library part from the command line parser.
>
> No functional changes.
>
> Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
> ---
> scripts/lib/netlink_yml_parser.py | 391 +++++++++++++++++++++++++++++
As I mentioned earlier, please move this to tools/net/ynl/pyynl/lib
> tools/net/ynl/pyynl/ynl_gen_rst.py | 374 +--------------------------
> 2 files changed, 401 insertions(+), 364 deletions(-)
> create mode 100755 scripts/lib/netlink_yml_parser.py
>
> diff --git a/scripts/lib/netlink_yml_parser.py b/scripts/lib/netlink_yml_parser.py
> new file mode 100755
> index 000000000000..3c15b578f947
> --- /dev/null
> +++ b/scripts/lib/netlink_yml_parser.py
> @@ -0,0 +1,391 @@
> +#!/usr/bin/env python3
> +# SPDX-License-Identifier: GPL-2.0
> +# -*- coding: utf-8; mode: python -*-
> +
> +"""
> + Script to auto generate the documentation for Netlink specifications.
> +
> + :copyright: Copyright (C) 2023 Breno Leitao <leitao@debian.org>
> + :license: GPL Version 2, June 1991 see linux/COPYING for details.
> +
> + This script performs extensive parsing of the Linux kernel's netlink YAML
> + spec files, in an effort to avoid needing to heavily mark up the original
> + YAML file.
> +
> + This code is split in three big parts:
> + 1) RST formatters: Used to convert a string to RST output
> + 2) Parser helpers: Functions to parse the YAML data structure
> + 3) Main function and small helpers
> +"""
> +
> +from typing import Any, Dict, List
> +import os.path
> +import logging
> +import yaml
> +
> +
> +SPACE_PER_LEVEL = 4
> +
> +
> +# RST Formatters
> +# ==============
> +def headroom(level: int) -> str:
> + """Return space to format"""
> + return " " * (level * SPACE_PER_LEVEL)
> +
> +
> +def bold(text: str) -> str:
> + """Format bold text"""
> + return f"**{text}**"
> +
> +
> +def inline(text: str) -> str:
> + """Format inline text"""
> + return f"``{text}``"
> +
> +
> +def sanitize(text: str) -> str:
> + """Remove newlines and multiple spaces"""
> + # This is useful for some fields that are spread across multiple lines
> + return str(text).replace("\n", " ").strip()
> +
> +
> +def rst_fields(key: str, value: str, level: int = 0) -> str:
> + """Return a RST formatted field"""
> + return headroom(level) + f":{key}: {value}"
> +
> +
> +def rst_definition(key: str, value: Any, level: int = 0) -> str:
> + """Format a single rst definition"""
> + return headroom(level) + key + "\n" + headroom(level + 1) + str(value)
> +
> +
> +def rst_paragraph(paragraph: str, level: int = 0) -> str:
> + """Return a formatted paragraph"""
> + return headroom(level) + paragraph
> +
> +
> +def rst_bullet(item: str, level: int = 0) -> str:
> + """Return a formatted bullet"""
> + return headroom(level) + f"- {item}"
> +
> +
> +def rst_subsection(title: str) -> str:
> + """Add a sub-section to the document"""
> + return f"{title}\n" + "-" * len(title)
> +
> +
> +def rst_subsubsection(title: str) -> str:
> + """Add a sub-sub-section to the document"""
> + return f"{title}\n" + "~" * len(title)
> +
> +
> +def rst_section(namespace: str, prefix: str, title: str) -> str:
> + """Add a section to the document"""
> + return f".. _{namespace}-{prefix}-{title}:\n\n{title}\n" + "=" * len(title)
> +
> +
> +def rst_subtitle(title: str) -> str:
> + """Add a subtitle to the document"""
> + return "\n" + "-" * len(title) + f"\n{title}\n" + "-" * len(title) + "\n\n"
> +
> +
> +def rst_title(title: str) -> str:
> + """Add a title to the document"""
> + return "=" * len(title) + f"\n{title}\n" + "=" * len(title) + "\n\n"
> +
> +
> +def rst_list_inline(list_: List[str], level: int = 0) -> str:
> + """Format a list using inlines"""
> + return headroom(level) + "[" + ", ".join(inline(i) for i in list_) + "]"
> +
> +
> +def rst_ref(namespace: str, prefix: str, name: str) -> str:
> + """Add a hyperlink to the document"""
> + mappings = {'enum': 'definition',
> + 'fixed-header': 'definition',
> + 'nested-attributes': 'attribute-set',
> + 'struct': 'definition'}
> + if prefix in mappings:
> + prefix = mappings[prefix]
> + return f":ref:`{namespace}-{prefix}-{name}`"
> +
> +
> +def rst_header() -> str:
> + """The headers for all the auto generated RST files"""
> + lines = []
> +
> + lines.append(rst_paragraph(".. SPDX-License-Identifier: GPL-2.0"))
> + lines.append(rst_paragraph(".. NOTE: This document was auto-generated.\n\n"))
> +
> + return "\n".join(lines)
> +
> +
> +def rst_toctree(maxdepth: int = 2) -> str:
> + """Generate a toctree RST primitive"""
> + lines = []
> +
> + lines.append(".. toctree::")
> + lines.append(f" :maxdepth: {maxdepth}\n\n")
> +
> + return "\n".join(lines)
> +
> +
> +def rst_label(title: str) -> str:
> + """Return a formatted label"""
> + return f".. _{title}:\n\n"
> +
> +
> +# Parsers
> +# =======
> +
> +
> +def parse_mcast_group(mcast_group: List[Dict[str, Any]]) -> str:
> + """Parse 'multicast' group list and return a formatted string"""
> + lines = []
> + for group in mcast_group:
> + lines.append(rst_bullet(group["name"]))
> +
> + return "\n".join(lines)
> +
> +
> +def parse_do(do_dict: Dict[str, Any], level: int = 0) -> str:
> + """Parse 'do' section and return a formatted string"""
> + lines = []
> + for key in do_dict.keys():
> + lines.append(rst_paragraph(bold(key), level + 1))
> + if key in ['request', 'reply']:
> + lines.append(parse_do_attributes(do_dict[key], level + 1) + "\n")
> + else:
> + lines.append(headroom(level + 2) + do_dict[key] + "\n")
> +
> + return "\n".join(lines)
> +
> +
> +def parse_do_attributes(attrs: Dict[str, Any], level: int = 0) -> str:
> + """Parse 'attributes' section"""
> + if "attributes" not in attrs:
> + return ""
> + lines = [rst_fields("attributes", rst_list_inline(attrs["attributes"]), level + 1)]
> +
> + return "\n".join(lines)
> +
> +
> +def parse_operations(operations: List[Dict[str, Any]], namespace: str) -> str:
> + """Parse operations block"""
> + preprocessed = ["name", "doc", "title", "do", "dump", "flags"]
> + linkable = ["fixed-header", "attribute-set"]
> + lines = []
> +
> + for operation in operations:
> + lines.append(rst_section(namespace, 'operation', operation["name"]))
> + lines.append(rst_paragraph(operation["doc"]) + "\n")
> +
> + for key in operation.keys():
> + if key in preprocessed:
> + # Skip the special fields
> + continue
> + value = operation[key]
> + if key in linkable:
> + value = rst_ref(namespace, key, value)
> + lines.append(rst_fields(key, value, 0))
> + if 'flags' in operation:
> + lines.append(rst_fields('flags', rst_list_inline(operation['flags'])))
> +
> + if "do" in operation:
> + lines.append(rst_paragraph(":do:", 0))
> + lines.append(parse_do(operation["do"], 0))
> + if "dump" in operation:
> + lines.append(rst_paragraph(":dump:", 0))
> + lines.append(parse_do(operation["dump"], 0))
> +
> + # New line after fields
> + lines.append("\n")
> +
> + return "\n".join(lines)
> +
> +
> +def parse_entries(entries: List[Dict[str, Any]], level: int) -> str:
> + """Parse a list of entries"""
> + ignored = ["pad"]
> + lines = []
> + for entry in entries:
> + if isinstance(entry, dict):
> + # entries could be a list or a dictionary
> + field_name = entry.get("name", "")
> + if field_name in ignored:
> + continue
> + type_ = entry.get("type")
> + if type_:
> + field_name += f" ({inline(type_)})"
> + lines.append(
> + rst_fields(field_name, sanitize(entry.get("doc", "")), level)
> + )
> + elif isinstance(entry, list):
> + lines.append(rst_list_inline(entry, level))
> + else:
> + lines.append(rst_bullet(inline(sanitize(entry)), level))
> +
> + lines.append("\n")
> + return "\n".join(lines)
> +
> +
> +def parse_definitions(defs: Dict[str, Any], namespace: str) -> str:
> + """Parse definitions section"""
> + preprocessed = ["name", "entries", "members"]
> + ignored = ["render-max"] # This is not printed
> + lines = []
> +
> + for definition in defs:
> + lines.append(rst_section(namespace, 'definition', definition["name"]))
> + for k in definition.keys():
> + if k in preprocessed + ignored:
> + continue
> + lines.append(rst_fields(k, sanitize(definition[k]), 0))
> +
> + # Field list needs to finish with a new line
> + lines.append("\n")
> + if "entries" in definition:
> + lines.append(rst_paragraph(":entries:", 0))
> + lines.append(parse_entries(definition["entries"], 1))
> + if "members" in definition:
> + lines.append(rst_paragraph(":members:", 0))
> + lines.append(parse_entries(definition["members"], 1))
> +
> + return "\n".join(lines)
> +
> +
> +def parse_attr_sets(entries: List[Dict[str, Any]], namespace: str) -> str:
> + """Parse attribute from attribute-set"""
> + preprocessed = ["name", "type"]
> + linkable = ["enum", "nested-attributes", "struct", "sub-message"]
> + ignored = ["checks"]
> + lines = []
> +
> + for entry in entries:
> + lines.append(rst_section(namespace, 'attribute-set', entry["name"]))
> + for attr in entry["attributes"]:
> + type_ = attr.get("type")
> + attr_line = attr["name"]
> + if type_:
> + # Add the attribute type in the same line
> + attr_line += f" ({inline(type_)})"
> +
> + lines.append(rst_subsubsection(attr_line))
> +
> + for k in attr.keys():
> + if k in preprocessed + ignored:
> + continue
> + if k in linkable:
> + value = rst_ref(namespace, k, attr[k])
> + else:
> + value = sanitize(attr[k])
> + lines.append(rst_fields(k, value, 0))
> + lines.append("\n")
> +
> + return "\n".join(lines)
> +
> +
> +def parse_sub_messages(entries: List[Dict[str, Any]], namespace: str) -> str:
> + """Parse sub-message definitions"""
> + lines = []
> +
> + for entry in entries:
> + lines.append(rst_section(namespace, 'sub-message', entry["name"]))
> + for fmt in entry["formats"]:
> + value = fmt["value"]
> +
> + lines.append(rst_bullet(bold(value)))
> + for attr in ['fixed-header', 'attribute-set']:
> + if attr in fmt:
> + lines.append(rst_fields(attr,
> + rst_ref(namespace, attr, fmt[attr]),
> + 1))
> + lines.append("\n")
> +
> + return "\n".join(lines)
> +
> +
> +def parse_yaml(obj: Dict[str, Any]) -> str:
> + """Format the whole YAML into a RST string"""
> + lines = []
> +
> + # Main header
> +
> + family = obj['name']
> +
> + lines.append(rst_header())
> + lines.append(rst_label("netlink-" + family))
> +
> + title = f"Family ``{family}`` netlink specification"
> + lines.append(rst_title(title))
> + lines.append(rst_paragraph(".. contents:: :depth: 3\n"))
> +
> + if "doc" in obj:
> + lines.append(rst_subtitle("Summary"))
> + lines.append(rst_paragraph(obj["doc"], 0))
> +
> + # Operations
> + if "operations" in obj:
> + lines.append(rst_subtitle("Operations"))
> + lines.append(parse_operations(obj["operations"]["list"], family))
> +
> + # Multicast groups
> + if "mcast-groups" in obj:
> + lines.append(rst_subtitle("Multicast groups"))
> + lines.append(parse_mcast_group(obj["mcast-groups"]["list"]))
> +
> + # Definitions
> + if "definitions" in obj:
> + lines.append(rst_subtitle("Definitions"))
> + lines.append(parse_definitions(obj["definitions"], family))
> +
> + # Attributes set
> + if "attribute-sets" in obj:
> + lines.append(rst_subtitle("Attribute sets"))
> + lines.append(parse_attr_sets(obj["attribute-sets"], family))
> +
> + # Sub-messages
> + if "sub-messages" in obj:
> + lines.append(rst_subtitle("Sub-messages"))
> + lines.append(parse_sub_messages(obj["sub-messages"], family))
> +
> + return "\n".join(lines)
> +
> +
> +# Main functions
> +# ==============
> +
> +
> +def parse_yaml_file(filename: str) -> str:
> + """Transform the YAML specified by filename into an RST-formatted string"""
> + with open(filename, "r", encoding="utf-8") as spec_file:
> + yaml_data = yaml.safe_load(spec_file)
> + content = parse_yaml(yaml_data)
> +
> + return content
> +
> +
> +def generate_main_index_rst(output: str, index_dir: str) -> str:
> + """Generate the `networking_spec/index` content and write to the file"""
> + lines = []
> +
> + lines.append(rst_header())
> + lines.append(rst_label("specs"))
> + lines.append(rst_title("Netlink Family Specifications"))
> + lines.append(rst_toctree(1))
> +
> + index_fname = os.path.basename(output)
> + base, ext = os.path.splitext(index_fname)
> +
> + if not index_dir:
> + index_dir = os.path.dirname(output)
> +
> + logging.debug(f"Looking for {ext} files in %s", index_dir)
> + for filename in sorted(os.listdir(index_dir)):
> + if not filename.endswith(ext) or filename == index_fname:
> + continue
> + base, ext = os.path.splitext(filename)
> + lines.append(f" {base}\n")
> +
> + return "".join(lines), output
> diff --git a/tools/net/ynl/pyynl/ynl_gen_rst.py b/tools/net/ynl/pyynl/ynl_gen_rst.py
> index b1e5acafb998..38dafe3d9179 100755
> --- a/tools/net/ynl/pyynl/ynl_gen_rst.py
> +++ b/tools/net/ynl/pyynl/ynl_gen_rst.py
> @@ -18,345 +18,17 @@
> 3) Main function and small helpers
> """
>
> -from typing import Any, Dict, List
> import os.path
> import sys
> import argparse
> import logging
> -import yaml
>
> +LIB_DIR = "../../../../scripts/lib"
> +SRC_DIR = os.path.dirname(os.path.realpath(__file__))
>
> -SPACE_PER_LEVEL = 4
> +sys.path.insert(0, os.path.join(SRC_DIR, LIB_DIR))
>
> -
> -# RST Formatters
> -# ==============
> -def headroom(level: int) -> str:
> - """Return space to format"""
> - return " " * (level * SPACE_PER_LEVEL)
> -
> -
> -def bold(text: str) -> str:
> - """Format bold text"""
> - return f"**{text}**"
> -
> -
> -def inline(text: str) -> str:
> - """Format inline text"""
> - return f"``{text}``"
> -
> -
> -def sanitize(text: str) -> str:
> - """Remove newlines and multiple spaces"""
> - # This is useful for some fields that are spread across multiple lines
> - return str(text).replace("\n", " ").strip()
> -
> -
> -def rst_fields(key: str, value: str, level: int = 0) -> str:
> - """Return a RST formatted field"""
> - return headroom(level) + f":{key}: {value}"
> -
> -
> -def rst_definition(key: str, value: Any, level: int = 0) -> str:
> - """Format a single rst definition"""
> - return headroom(level) + key + "\n" + headroom(level + 1) + str(value)
> -
> -
> -def rst_paragraph(paragraph: str, level: int = 0) -> str:
> - """Return a formatted paragraph"""
> - return headroom(level) + paragraph
> -
> -
> -def rst_bullet(item: str, level: int = 0) -> str:
> - """Return a formatted a bullet"""
> - return headroom(level) + f"- {item}"
> -
> -
> -def rst_subsection(title: str) -> str:
> - """Add a sub-section to the document"""
> - return f"{title}\n" + "-" * len(title)
> -
> -
> -def rst_subsubsection(title: str) -> str:
> - """Add a sub-sub-section to the document"""
> - return f"{title}\n" + "~" * len(title)
> -
> -
> -def rst_section(namespace: str, prefix: str, title: str) -> str:
> - """Add a section to the document"""
> - return f".. _{namespace}-{prefix}-{title}:\n\n{title}\n" + "=" * len(title)
> -
> -
> -def rst_subtitle(title: str) -> str:
> - """Add a subtitle to the document"""
> - return "\n" + "-" * len(title) + f"\n{title}\n" + "-" * len(title) + "\n\n"
> -
> -
> -def rst_title(title: str) -> str:
> - """Add a title to the document"""
> - return "=" * len(title) + f"\n{title}\n" + "=" * len(title) + "\n\n"
> -
> -
> -def rst_list_inline(list_: List[str], level: int = 0) -> str:
> - """Format a list using inlines"""
> - return headroom(level) + "[" + ", ".join(inline(i) for i in list_) + "]"
> -
> -
> -def rst_ref(namespace: str, prefix: str, name: str) -> str:
> - """Add a hyperlink to the document"""
> - mappings = {'enum': 'definition',
> - 'fixed-header': 'definition',
> - 'nested-attributes': 'attribute-set',
> - 'struct': 'definition'}
> - if prefix in mappings:
> - prefix = mappings[prefix]
> - return f":ref:`{namespace}-{prefix}-{name}`"
> -
> -
> -def rst_header() -> str:
> - """The headers for all the auto generated RST files"""
> - lines = []
> -
> - lines.append(rst_paragraph(".. SPDX-License-Identifier: GPL-2.0"))
> - lines.append(rst_paragraph(".. NOTE: This document was auto-generated.\n\n"))
> -
> - return "\n".join(lines)
> -
> -
> -def rst_toctree(maxdepth: int = 2) -> str:
> - """Generate a toctree RST primitive"""
> - lines = []
> -
> - lines.append(".. toctree::")
> - lines.append(f" :maxdepth: {maxdepth}\n\n")
> -
> - return "\n".join(lines)
> -
> -
> -def rst_label(title: str) -> str:
> - """Return a formatted label"""
> - return f".. _{title}:\n\n"
> -
> -
> -# Parsers
> -# =======
> -
> -
> -def parse_mcast_group(mcast_group: List[Dict[str, Any]]) -> str:
> - """Parse 'multicast' group list and return a formatted string"""
> - lines = []
> - for group in mcast_group:
> - lines.append(rst_bullet(group["name"]))
> -
> - return "\n".join(lines)
> -
> -
> -def parse_do(do_dict: Dict[str, Any], level: int = 0) -> str:
> - """Parse 'do' section and return a formatted string"""
> - lines = []
> - for key in do_dict.keys():
> - lines.append(rst_paragraph(bold(key), level + 1))
> - if key in ['request', 'reply']:
> - lines.append(parse_do_attributes(do_dict[key], level + 1) + "\n")
> - else:
> - lines.append(headroom(level + 2) + do_dict[key] + "\n")
> -
> - return "\n".join(lines)
> -
> -
> -def parse_do_attributes(attrs: Dict[str, Any], level: int = 0) -> str:
> - """Parse 'attributes' section"""
> - if "attributes" not in attrs:
> - return ""
> - lines = [rst_fields("attributes", rst_list_inline(attrs["attributes"]), level + 1)]
> -
> - return "\n".join(lines)
> -
> -
> -def parse_operations(operations: List[Dict[str, Any]], namespace: str) -> str:
> - """Parse operations block"""
> - preprocessed = ["name", "doc", "title", "do", "dump", "flags"]
> - linkable = ["fixed-header", "attribute-set"]
> - lines = []
> -
> - for operation in operations:
> - lines.append(rst_section(namespace, 'operation', operation["name"]))
> - lines.append(rst_paragraph(operation["doc"]) + "\n")
> -
> - for key in operation.keys():
> - if key in preprocessed:
> - # Skip the special fields
> - continue
> - value = operation[key]
> - if key in linkable:
> - value = rst_ref(namespace, key, value)
> - lines.append(rst_fields(key, value, 0))
> - if 'flags' in operation:
> - lines.append(rst_fields('flags', rst_list_inline(operation['flags'])))
> -
> - if "do" in operation:
> - lines.append(rst_paragraph(":do:", 0))
> - lines.append(parse_do(operation["do"], 0))
> - if "dump" in operation:
> - lines.append(rst_paragraph(":dump:", 0))
> - lines.append(parse_do(operation["dump"], 0))
> -
> - # New line after fields
> - lines.append("\n")
> -
> - return "\n".join(lines)
> -
> -
> -def parse_entries(entries: List[Dict[str, Any]], level: int) -> str:
> - """Parse a list of entries"""
> - ignored = ["pad"]
> - lines = []
> - for entry in entries:
> - if isinstance(entry, dict):
> - # entries could be a list or a dictionary
> - field_name = entry.get("name", "")
> - if field_name in ignored:
> - continue
> - type_ = entry.get("type")
> - if type_:
> - field_name += f" ({inline(type_)})"
> - lines.append(
> - rst_fields(field_name, sanitize(entry.get("doc", "")), level)
> - )
> - elif isinstance(entry, list):
> - lines.append(rst_list_inline(entry, level))
> - else:
> - lines.append(rst_bullet(inline(sanitize(entry)), level))
> -
> - lines.append("\n")
> - return "\n".join(lines)
> -
> -
> -def parse_definitions(defs: Dict[str, Any], namespace: str) -> str:
> - """Parse definitions section"""
> - preprocessed = ["name", "entries", "members"]
> - ignored = ["render-max"] # This is not printed
> - lines = []
> -
> - for definition in defs:
> - lines.append(rst_section(namespace, 'definition', definition["name"]))
> - for k in definition.keys():
> - if k in preprocessed + ignored:
> - continue
> - lines.append(rst_fields(k, sanitize(definition[k]), 0))
> -
> - # Field list needs to finish with a new line
> - lines.append("\n")
> - if "entries" in definition:
> - lines.append(rst_paragraph(":entries:", 0))
> - lines.append(parse_entries(definition["entries"], 1))
> - if "members" in definition:
> - lines.append(rst_paragraph(":members:", 0))
> - lines.append(parse_entries(definition["members"], 1))
> -
> - return "\n".join(lines)
> -
> -
> -def parse_attr_sets(entries: List[Dict[str, Any]], namespace: str) -> str:
> - """Parse attribute from attribute-set"""
> - preprocessed = ["name", "type"]
> - linkable = ["enum", "nested-attributes", "struct", "sub-message"]
> - ignored = ["checks"]
> - lines = []
> -
> - for entry in entries:
> - lines.append(rst_section(namespace, 'attribute-set', entry["name"]))
> - for attr in entry["attributes"]:
> - type_ = attr.get("type")
> - attr_line = attr["name"]
> - if type_:
> - # Add the attribute type in the same line
> - attr_line += f" ({inline(type_)})"
> -
> - lines.append(rst_subsubsection(attr_line))
> -
> - for k in attr.keys():
> - if k in preprocessed + ignored:
> - continue
> - if k in linkable:
> - value = rst_ref(namespace, k, attr[k])
> - else:
> - value = sanitize(attr[k])
> - lines.append(rst_fields(k, value, 0))
> - lines.append("\n")
> -
> - return "\n".join(lines)
> -
> -
> -def parse_sub_messages(entries: List[Dict[str, Any]], namespace: str) -> str:
> - """Parse sub-message definitions"""
> - lines = []
> -
> - for entry in entries:
> - lines.append(rst_section(namespace, 'sub-message', entry["name"]))
> - for fmt in entry["formats"]:
> - value = fmt["value"]
> -
> - lines.append(rst_bullet(bold(value)))
> - for attr in ['fixed-header', 'attribute-set']:
> - if attr in fmt:
> - lines.append(rst_fields(attr,
> - rst_ref(namespace, attr, fmt[attr]),
> - 1))
> - lines.append("\n")
> -
> - return "\n".join(lines)
> -
> -
> -def parse_yaml(obj: Dict[str, Any]) -> str:
> - """Format the whole YAML into a RST string"""
> - lines = []
> -
> - # Main header
> -
> - family = obj['name']
> -
> - lines.append(rst_header())
> - lines.append(rst_label("netlink-" + family))
> -
> - title = f"Family ``{family}`` netlink specification"
> - lines.append(rst_title(title))
> - lines.append(rst_paragraph(".. contents:: :depth: 3\n"))
> -
> - if "doc" in obj:
> - lines.append(rst_subtitle("Summary"))
> - lines.append(rst_paragraph(obj["doc"], 0))
> -
> - # Operations
> - if "operations" in obj:
> - lines.append(rst_subtitle("Operations"))
> - lines.append(parse_operations(obj["operations"]["list"], family))
> -
> - # Multicast groups
> - if "mcast-groups" in obj:
> - lines.append(rst_subtitle("Multicast groups"))
> - lines.append(parse_mcast_group(obj["mcast-groups"]["list"]))
> -
> - # Definitions
> - if "definitions" in obj:
> - lines.append(rst_subtitle("Definitions"))
> - lines.append(parse_definitions(obj["definitions"], family))
> -
> - # Attributes set
> - if "attribute-sets" in obj:
> - lines.append(rst_subtitle("Attribute sets"))
> - lines.append(parse_attr_sets(obj["attribute-sets"], family))
> -
> - # Sub-messages
> - if "sub-messages" in obj:
> - lines.append(rst_subtitle("Sub-messages"))
> - lines.append(parse_sub_messages(obj["sub-messages"], family))
> -
> - return "\n".join(lines)
> -
> -
> -# Main functions
> -# ==============
> +from netlink_yml_parser import parse_yaml_file, generate_main_index_rst
>
>
> def parse_arguments() -> argparse.Namespace:
> @@ -393,50 +65,24 @@ def parse_arguments() -> argparse.Namespace:
> return args
>
>
> -def parse_yaml_file(filename: str) -> str:
> - """Transform the YAML specified by filename into an RST-formatted string"""
> - with open(filename, "r", encoding="utf-8") as spec_file:
> - yaml_data = yaml.safe_load(spec_file)
> - content = parse_yaml(yaml_data)
> -
> - return content
> -
> -
> def write_to_rstfile(content: str, filename: str) -> None:
> """Write the generated content into an RST file"""
> logging.debug("Saving RST file to %s", filename)
>
> - dir = os.path.dirname(filename)
> - os.makedirs(dir, exist_ok=True)
> + directory = os.path.dirname(filename)
> + os.makedirs(directory, exist_ok=True)
This would be easier to review if this patch was only the code move,
without unrelated code changes.
> with open(filename, "w", encoding="utf-8") as rst_file:
> rst_file.write(content)
>
>
> -def generate_main_index_rst(output: str, index_dir: str) -> None:
> +def write_index_rst(output: str, index_dir: str) -> None:
It would simplify the patch series if we can avoid refactoring
generate_main_index_rst when it needs to be removed. Can you remove
the code earlier in the series?
> """Generate the `networking_spec/index` content and write to the file"""
> - lines = []
>
> - lines.append(rst_header())
> - lines.append(rst_label("specs"))
> - lines.append(rst_title("Netlink Family Specifications"))
> - lines.append(rst_toctree(1))
> -
> - index_fname = os.path.basename(output)
> - base, ext = os.path.splitext(index_fname)
> -
> - if not index_dir:
> - index_dir = os.path.dirname(output)
> -
> - logging.debug(f"Looking for {ext} files in %s", index_dir)
> - for filename in sorted(os.listdir(index_dir)):
> - if not filename.endswith(ext) or filename == index_fname:
> - continue
> - base, ext = os.path.splitext(filename)
> - lines.append(f" {base}\n")
> + msg = generate_main_index_rst(output, index_dir)
>
> logging.debug("Writing an index file at %s", output)
> - write_to_rstfile("".join(lines), output)
> + write_to_rstfile(msg, output)
>
>
> def main() -> None:
> @@ -457,7 +103,7 @@ def main() -> None:
>
> if args.index:
> # Generate the index RST file
> - generate_main_index_rst(args.output, args.input_dir)
> + write_index_rst(args.output, args.input_dir)
>
>
> if __name__ == "__main__":
> --
> 2.49.0
>
^ permalink raw reply [flat|nested] 29+ messages in thread
* Re: [PATCH v4 06/14] scripts: lib: netlink_yml_parser.py: use classes
2025-06-14 8:56 ` [PATCH v4 06/14] scripts: lib: netlink_yml_parser.py: use classes Mauro Carvalho Chehab
@ 2025-06-14 14:11 ` Donald Hunter
0 siblings, 0 replies; 29+ messages in thread
From: Donald Hunter @ 2025-06-14 14:11 UTC (permalink / raw)
To: Mauro Carvalho Chehab
Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
Breno Leitao, David S. Miller, Eric Dumazet,
Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
linux-kernel, lkmm, netdev, peterz, stern
On Sat, 14 Jun 2025 at 09:56, Mauro Carvalho Chehab
<mchehab+huawei@kernel.org> wrote:
>
> As we'll be importing netlink parser into a Sphinx extension,
> move all functions and global variables inside two classes:
>
> - RstFormatters, containing ReST formatter logic, which are
> YAML independent;
> - NetlinkYamlParser: contains the actual parser classes. That's
Please update the commit description to match the code.
> the only class that needs to be imported by the script or by
> a Sphinx extension.
>
> With that, we won't pollute Sphinx namespace, avoiding any
> potential clashes.
>
> Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
* Re: [PATCH v4 07/14] tools: ynl_gen_rst.py: move index.rst generator to the script
2025-06-14 8:56 ` [PATCH v4 07/14] tools: ynl_gen_rst.py: move index.rst generator to the script Mauro Carvalho Chehab
@ 2025-06-14 14:15 ` Donald Hunter
2025-06-14 15:35 ` Mauro Carvalho Chehab
0 siblings, 1 reply; 29+ messages in thread
From: Donald Hunter @ 2025-06-14 14:15 UTC (permalink / raw)
To: Mauro Carvalho Chehab
Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
Breno Leitao, David S. Miller, Eric Dumazet,
Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
linux-kernel, lkmm, netdev, peterz, stern
On Sat, 14 Jun 2025 at 09:56, Mauro Carvalho Chehab
<mchehab+huawei@kernel.org> wrote:
>
> The index.rst generator doesn't really belong to the parsing
> function. Move it to the command line tool, as it won't be
> used elsewhere.
>
> While here, make it more generic, allowing it to handle either
> .yaml or .rst as input files.
I think this patch can be dropped from the series, if instead you
remove the index generation code before refactoring into a library.
> Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
> ---
> scripts/lib/netlink_yml_parser.py | 101 ++++++++---------------------
> tools/net/ynl/pyynl/ynl_gen_rst.py | 28 +++++++-
* Re: [PATCH v4 12/14] MAINTAINERS: add maintainers for netlink_yml_parser.py
2025-06-14 8:56 ` [PATCH v4 12/14] MAINTAINERS: add maintainers for netlink_yml_parser.py Mauro Carvalho Chehab
@ 2025-06-14 14:22 ` Donald Hunter
2025-06-14 15:32 ` Mauro Carvalho Chehab
0 siblings, 1 reply; 29+ messages in thread
From: Donald Hunter @ 2025-06-14 14:22 UTC (permalink / raw)
To: Mauro Carvalho Chehab
Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
Breno Leitao, David S. Miller, Eric Dumazet,
Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
linux-kernel, lkmm, netdev, peterz, stern
On Sat, 14 Jun 2025 at 09:56, Mauro Carvalho Chehab
<mchehab+huawei@kernel.org> wrote:
>
> The parsing code from tools/net/ynl/pyynl/ynl_gen_rst.py was moved
> to scripts/lib/netlink_yml_parser.py. Its maintainership
> is done by Netlink maintainers. Yet, as it is used by Sphinx
> build system, add it also to linux-doc maintainers, as changes
> there might affect documentation builds. So, linux-docs ML
> should ideally be C/C on changes to it.
This patch can be dropped from the series when you move the library
code to tools/net/ynl/pyynl/lib.
> Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
> ---
> MAINTAINERS | 2 ++
> 1 file changed, 2 insertions(+)
>
> diff --git a/MAINTAINERS b/MAINTAINERS
> index a92290fffa16..2c0b13e5d8fc 100644
> --- a/MAINTAINERS
> +++ b/MAINTAINERS
> @@ -7202,6 +7202,7 @@ F: scripts/get_abi.py
> F: scripts/kernel-doc*
> F: scripts/lib/abi/*
> F: scripts/lib/kdoc/*
> +F: scripts/lib/netlink_yml_parser.py
> F: scripts/sphinx-pre-install
> X: Documentation/ABI/
> X: Documentation/admin-guide/media/
> @@ -27314,6 +27315,7 @@ M: Jakub Kicinski <kuba@kernel.org>
> F: Documentation/netlink/
> F: Documentation/userspace-api/netlink/intro-specs.rst
> F: Documentation/userspace-api/netlink/specs.rst
> +F: scripts/lib/netlink_yml_parser.py
> F: tools/net/ynl/
>
> YEALINK PHONE DRIVER
> --
> 2.49.0
>
* Re: [PATCH v4 04/14] tools: ynl_gen_rst.py: make the index parser more generic
2025-06-14 13:41 ` Donald Hunter
@ 2025-06-14 14:58 ` Mauro Carvalho Chehab
0 siblings, 0 replies; 29+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-14 14:58 UTC (permalink / raw)
To: Donald Hunter
Cc: Linux Doc Mailing List, Jonathan Corbet, Akira Yokosawa,
Breno Leitao, David S. Miller, Eric Dumazet,
Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
linux-kernel, lkmm, netdev, peterz, stern
On Sat, 14 Jun 2025 14:41:29 +0100,
Donald Hunter <donald.hunter@gmail.com> wrote:
> On Sat, 14 Jun 2025 at 09:56, Mauro Carvalho Chehab
> <mchehab+huawei@kernel.org> wrote:
> >
> > It is not a good practice to store build-generated files
> > inside $(srctree), as one may be using O=<BUILDDIR> and even
> > have the Kernel on a read-only directory.
> >
> > Change the YAML generation for netlink files to allow it
> > to parse data based on the source or on the object tree.
> >
> > Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
> > ---
> > tools/net/ynl/pyynl/ynl_gen_rst.py | 22 ++++++++++++++++------
> > 1 file changed, 16 insertions(+), 6 deletions(-)
>
> It looks like this patch is no longer required since this script
> doesn't get run by `make htmldocs` any more.
>
> Instead, I think there is cleanup work to remove unused code like
> `generate_main_index_rst`
It is too early to drop it in this series, as only this patch:
[PATCH v4 09/14] docs: use parser_yaml extension to handle Netlink specs
stops using it.
> This whole script may be unnecessary now, unless we want a simple way
> to run YnlDocGenerator separately from the main doc build.
It is up to you to keep or drop it after patch 9. Yet, in my experience with
kernel_doc.py and get_abi.py, it is a lot easier to test the parser via
a simple command line script, without having Sphinx parallel build, complex
doc build logic and Sphinx exception handling in place.
My suggestion is to keep ynl_gen_rst.py, removing generate_main_index_rst
in a cleanup patch after patch 9.
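The shape of such a thin command line wrapper can be sketched as follows. This is only an illustration of the CLI plumbing being discussed: parse_yaml_file() here is a local stub standing in for the real library call, so the sketch stays self-contained.

```python
#!/usr/bin/env python3
"""Sketch of a thin CLI wrapper around the parser library, so ReST
generation can be tested without a full Sphinx build in the way."""
import argparse


def parse_yaml_file(path: str) -> str:
    """Stub: the real library function converts the YAML spec to ReST"""
    return f".. generated from {path}\n"


def main(argv=None) -> None:
    parser = argparse.ArgumentParser(description="Render a netlink spec as ReST")
    parser.add_argument("-i", "--input", required=True, help="YAML spec file")
    parser.add_argument("-o", "--output", required=True, help="output ReST file")
    args = parser.parse_args(argv)

    # all the real work happens in the library; the CLI only does I/O
    content = parse_yaml_file(args.input)
    with open(args.output, "w", encoding="utf-8") as rst_file:
        rst_file.write(content)


if __name__ == "__main__":
    main()
```

Keeping the entry point this small means the library does all the work, so the same code path is exercised by both the CLI and the Sphinx extension.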
Regards,
Mauro
* Re: [PATCH v4 12/14] MAINTAINERS: add maintainers for netlink_yml_parser.py
2025-06-14 14:22 ` Donald Hunter
@ 2025-06-14 15:32 ` Mauro Carvalho Chehab
2025-06-14 17:37 ` Jakub Kicinski
0 siblings, 1 reply; 29+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-14 15:32 UTC (permalink / raw)
To: Donald Hunter, Jonathan Corbet
Cc: Linux Doc Mailing List, Akira Yokosawa, Breno Leitao,
David S. Miller, Eric Dumazet, Ignacio Encinas Rubio, Jan Stancek,
Marco Elver, Paolo Abeni, Ruben Wauters, Shuah Khan, joel,
linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz, stern
Hi Donald, Jon,
On Sat, 14 Jun 2025 15:22:16 +0100,
Donald Hunter <donald.hunter@gmail.com> wrote:
> On Sat, 14 Jun 2025 at 09:56, Mauro Carvalho Chehab
> <mchehab+huawei@kernel.org> wrote:
> >
> > The parsing code from tools/net/ynl/pyynl/ynl_gen_rst.py was moved
> > to scripts/lib/netlink_yml_parser.py. Its maintainership
> > is done by Netlink maintainers. Yet, as it is used by Sphinx
> > build system, add it also to linux-doc maintainers, as changes
> > there might affect documentation builds. So, linux-docs ML
> > should ideally be C/C on changes to it.
>
> This patch can be dropped from the series when you move the library
> code to tools/net/ynl/pyynl/lib.
>
> > Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
> > ---
> > MAINTAINERS | 2 ++
> > 1 file changed, 2 insertions(+)
> >
> > diff --git a/MAINTAINERS b/MAINTAINERS
> > index a92290fffa16..2c0b13e5d8fc 100644
> > --- a/MAINTAINERS
> > +++ b/MAINTAINERS
> > @@ -7202,6 +7202,7 @@ F: scripts/get_abi.py
> > F: scripts/kernel-doc*
> > F: scripts/lib/abi/*
> > F: scripts/lib/kdoc/*
> > +F: scripts/lib/netlink_yml_parser.py
> > F: scripts/sphinx-pre-install
> > X: Documentation/ABI/
> > X: Documentation/admin-guide/media/
Adding an entry that would Cc linux-doc for the parser is
important, as problems there will affect documentation builds,
no matter where it is located. Perhaps one option would be to
create a separate MAINTAINERS entry for it, like:
YAML NETLINK (YNL) DOC GENERATOR
M: Donald Hunter <donald.hunter@gmail.com>
M: Jakub Kicinski <kuba@kernel.org>
F: <python_lib_location>/netlink_yml_parser.py
L: linux-doc@vger.kernel.org
to ensure that changes to it would be C/C to linux-doc.
> > @@ -27314,6 +27315,7 @@ M: Jakub Kicinski <kuba@kernel.org>
> > F: Documentation/netlink/
> > F: Documentation/userspace-api/netlink/intro-specs.rst
> > F: Documentation/userspace-api/netlink/specs.rst
> > +F: scripts/lib/netlink_yml_parser.py
> > F: tools/net/ynl/
With regards to the location itself, as I said earlier, it is up to
Jon and you to decide.
My preference is to have all Python libraries in the entire kernel
inside scripts/lib (or at some other common location), no matter where
the caller Python command or in-kernel Sphinx extensions are located.
There is also a slight advantage in placing them at the same location:
if we end up adding parsers for other subsystems to parse_html.py, having
all of them in the same directory means we don't need to do something
like:
lib_paths = [
"tools/net/ynl/pyynl/lib",
"foo",
"bar",
...
]
for d in lib_paths:
    sys.path.insert(0, os.path.join(srctree, d))
from netlink_yml_parser import YnlDocGenerator # pylint: disable=C0413
from foo_yml_parser import FooYamlDocGenerator # pylint: disable=C0413
from bar_yml_parser import BarYamlDocGenerator # pylint: disable=C0413
...
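By contrast, with a single common directory the extension boilerplate collapses to one path insertion. A minimal sketch, assuming the common location is scripts/lib; deriving the tree root from an environment variable here is purely for illustration:

```python
import os
import sys

# illustrative only: a real Sphinx extension would derive srctree from
# its own location or from the Sphinx configuration, not an env variable
srctree = os.environ.get("srctree", ".")

# a single insert covers every library kept in the common location
sys.path.insert(0, os.path.join(srctree, "scripts", "lib"))

# after which each parser becomes a plain import, e.g.:
# from netlink_yml_parser import YnlDocGenerator
```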
Thanks,
Mauro
* Re: [PATCH v4 07/14] tools: ynl_gen_rst.py: move index.rst generator to the script
2025-06-14 14:15 ` Donald Hunter
@ 2025-06-14 15:35 ` Mauro Carvalho Chehab
0 siblings, 0 replies; 29+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-14 15:35 UTC (permalink / raw)
To: Donald Hunter, Jonathan Corbet
Cc: Linux Doc Mailing List, Akira Yokosawa, Breno Leitao,
David S. Miller, Eric Dumazet, Ignacio Encinas Rubio, Jan Stancek,
Marco Elver, Paolo Abeni, Ruben Wauters, Shuah Khan, joel,
linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz, stern
On Sat, 14 Jun 2025 15:15:25 +0100,
Donald Hunter <donald.hunter@gmail.com> wrote:
> On Sat, 14 Jun 2025 at 09:56, Mauro Carvalho Chehab
> <mchehab+huawei@kernel.org> wrote:
> >
> > The index.rst generator doesn't really belong to the parsing
> > function. Move it to the command line tool, as it won't be
> > used elsewhere.
> >
> > While here, make it more generic, allowing it to handle either
> > .yaml or .rst as input files.
>
> I think this patch can be dropped from the series, if instead you
> remove the index generation code before refactoring into a library.
Works for me. I'll wait for your and Jon's comments regarding the
location before sending a new version.
Regards,
Mauro
>
> > Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
> > ---
> > scripts/lib/netlink_yml_parser.py | 101 ++++++++---------------------
> > tools/net/ynl/pyynl/ynl_gen_rst.py | 28 +++++++-
Thanks,
Mauro
* Re: [PATCH v4 12/14] MAINTAINERS: add maintainers for netlink_yml_parser.py
2025-06-14 15:32 ` Mauro Carvalho Chehab
@ 2025-06-14 17:37 ` Jakub Kicinski
2025-06-14 18:56 ` Mauro Carvalho Chehab
0 siblings, 1 reply; 29+ messages in thread
From: Jakub Kicinski @ 2025-06-14 17:37 UTC (permalink / raw)
To: Mauro Carvalho Chehab
Cc: Donald Hunter, Jonathan Corbet, Linux Doc Mailing List,
Akira Yokosawa, Breno Leitao, David S. Miller, Eric Dumazet,
Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
linux-kernel, lkmm, netdev, peterz, stern
On Sat, 14 Jun 2025 17:32:35 +0200 Mauro Carvalho Chehab wrote:
> > > @@ -27314,6 +27315,7 @@ M: Jakub Kicinski <kuba@kernel.org>
> > > F: Documentation/netlink/
> > > F: Documentation/userspace-api/netlink/intro-specs.rst
> > > F: Documentation/userspace-api/netlink/specs.rst
> > > +F: scripts/lib/netlink_yml_parser.py
> > > F: tools/net/ynl/
>
> With regards to the location itself, as I said earlier, it is up to
> Jon and you to decide.
>
> My preference is to have all Python libraries at the entire Kernel
> inside scripts/lib (or at some other common location), no matter where
> the caller Python command or in-kernel Sphinx extensions are located.
I understand that from the PoV of ease of maintenance of the docs.
Is it fair to say there is a trade off here between ease of maintenance
for docs maintainers and encouraging people to integrate with kernel
docs in novel ways?
* Re: [PATCH v4 12/14] MAINTAINERS: add maintainers for netlink_yml_parser.py
2025-06-14 17:37 ` Jakub Kicinski
@ 2025-06-14 18:56 ` Mauro Carvalho Chehab
2025-06-14 19:46 ` Jakub Kicinski
0 siblings, 1 reply; 29+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-14 18:56 UTC (permalink / raw)
To: Jakub Kicinski
Cc: Donald Hunter, Jonathan Corbet, Linux Doc Mailing List,
Akira Yokosawa, Breno Leitao, David S. Miller, Eric Dumazet,
Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
linux-kernel, lkmm, netdev, peterz, stern
On Sat, 14 Jun 2025 10:37:00 -0700,
Jakub Kicinski <kuba@kernel.org> wrote:
> On Sat, 14 Jun 2025 17:32:35 +0200 Mauro Carvalho Chehab wrote:
> > > > @@ -27314,6 +27315,7 @@ M: Jakub Kicinski <kuba@kernel.org>
> > > > F: Documentation/netlink/
> > > > F: Documentation/userspace-api/netlink/intro-specs.rst
> > > > F: Documentation/userspace-api/netlink/specs.rst
> > > > +F: scripts/lib/netlink_yml_parser.py
> > > > F: tools/net/ynl/
> >
> > With regards to the location itself, as I said earlier, it is up to
> > Jon and you to decide.
> >
> > My preference is to have all Python libraries at the entire Kernel
> > inside scripts/lib (or at some other common location), no matter where
> > the caller Python command or in-kernel Sphinx extensions are located.
>
> I understand that from the PoV of ease of maintenance of the docs.
> Is it fair to say there is a trade off here between ease of maintenance
> for docs maintainers and encouraging people to integrate with kernel
> docs in novel ways?
Placing it elsewhere won't make much difference for doc maintainers and
developers.
I'm more interested in having a single place where Python libraries
can be placed. Eventually, some classes might be re-used in the future
by multiple scripts and subsystems, when it makes sense, just like we
already do with the kernel's kAPIs. This also helps when checking the
minimal Python version required by the kernel when updating it at:
Documentation/process/changes.rst
And writing patches documenting it like:
d2b239099cf0 ("docs: changes: update Sphinx minimal version to 3.4.3")
5e25b972a22b ("docs: changes: update Python minimal version")
Properly setting the minimal Python version is important especially to
check if the minimal version set at changes.rst is compatible with
the Makefile build targets:
$ pip install --user vermin
...
$ vermin -v scripts/lib/
Detecting python files..
Analyzing 9 files using 24 processes..
!2, 3.6 /new_devel/v4l/docs/scripts/lib/abi/abi_parser.py
!2, 3.6 /new_devel/v4l/docs/scripts/lib/abi/abi_regex.py
~2, ~3 /new_devel/v4l/docs/scripts/lib/abi/helpers.py
!2, 3.6 /new_devel/v4l/docs/scripts/lib/abi/system_symbols.py
!2, 3.6 /new_devel/v4l/docs/scripts/lib/kdoc/kdoc_files.py
!2, 3.6 /new_devel/v4l/docs/scripts/lib/kdoc/kdoc_output.py
!2, 3.6 /new_devel/v4l/docs/scripts/lib/kdoc/kdoc_parser.py
2.3, 3.0 /new_devel/v4l/docs/scripts/lib/kdoc/kdoc_re.py
!2, 3.6 /new_devel/v4l/docs/scripts/lib/netlink_yml_parser.py
Tips:
- You're using potentially backported modules: argparse, typing
If so, try using the following for better results: --backport argparse --backport typing
- Since '# novm' or '# novermin' weren't used, a speedup can be achieved using: --no-parse-comments
(disable using: --no-tips)
Minimum required versions: 3.6
Incompatible versions: 2
Thanks,
Mauro
^ permalink raw reply [flat|nested] 29+ messages in thread
* Re: [PATCH v4 12/14] MAINTAINERS: add maintainers for netlink_yml_parser.py
2025-06-14 18:56 ` Mauro Carvalho Chehab
@ 2025-06-14 19:46 ` Jakub Kicinski
2025-06-16 10:51 ` Mauro Carvalho Chehab
2025-06-19 20:06 ` Jonathan Corbet
0 siblings, 2 replies; 29+ messages in thread
From: Jakub Kicinski @ 2025-06-14 19:46 UTC (permalink / raw)
To: Mauro Carvalho Chehab
Cc: Donald Hunter, Jonathan Corbet, Linux Doc Mailing List,
Akira Yokosawa, Breno Leitao, David S. Miller, Eric Dumazet,
Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
linux-kernel, lkmm, netdev, peterz, stern
On Sat, 14 Jun 2025 20:56:09 +0200 Mauro Carvalho Chehab wrote:
> > I understand that from the PoV of ease of maintenance of the docs.
> > Is it fair to say there is a trade off here between ease of maintenance
> > for docs maintainers and encouraging people to integrate with kernel
> > docs in novel ways?
>
> Placing elsewhere won't make much difference from doc maintainers and
> developers.
I must be missing your point. Clearly it makes a difference to Donald,
who is a maintainer of the docs in question.
> I'm more interested on having a single place where python libraries
> could be placed.
Me too, especially for selftests. But it's not clear to me that
scripts/ is the right location. I thought purely user space code
should live in tools/, and the bulk of YNL is for user space.
> Eventually, some classes might be re-used in the future
> by multiple scripts and subsystems, when it makes sense, just like we do
> already with Kernel's kAPIs. This also helps when checking what is the
> Python's minimal version that are required by the Kernel when updating
> it at:
I think this is exactly the same point Donald is making, but from the YNL
perspective. The hope is to share more code between the ReST generator,
the existing C generator and the Python library. The latter two are already
based on a shared spec model.
* Re: [PATCH v4 12/14] MAINTAINERS: add maintainers for netlink_yml_parser.py
2025-06-14 19:46 ` Jakub Kicinski
@ 2025-06-16 10:51 ` Mauro Carvalho Chehab
2025-06-19 20:06 ` Jonathan Corbet
1 sibling, 0 replies; 29+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-16 10:51 UTC (permalink / raw)
To: Jakub Kicinski
Cc: Donald Hunter, Jonathan Corbet, Linux Doc Mailing List,
Akira Yokosawa, Breno Leitao, David S. Miller, Eric Dumazet,
Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
linux-kernel, lkmm, netdev, peterz, stern
On Sat, 14 Jun 2025 12:46:49 -0700,
Jakub Kicinski <kuba@kernel.org> wrote:
> On Sat, 14 Jun 2025 20:56:09 +0200 Mauro Carvalho Chehab wrote:
> > > I understand that from the PoV of ease of maintenance of the docs.
> > > Is it fair to say there is a trade off here between ease of maintenance
> > > for docs maintainers and encouraging people to integrate with kernel
> > > docs in novel ways?
> >
> > Placing elsewhere won't make much difference from doc maintainers and
> > developers.
>
> I must be missing your point. Clearly it makes a difference to Donald,
> who is a maintainer of the docs in question.
Heh, I was just saying that I missed your point ;-)
See, you said that "there is a trade off here between ease of maintenance
for docs maintainers and encouraging people to integrate with kernel
docs in novel ways".
I can't see how being easy/hard to maintain or even "integrate with
kernel docs in novel ways" would be affected by the script location.
Wherever it is located, there should be MAINTAINERS entries that would
point to the YAML and network maintainers:
$ ./scripts/get_maintainer.pl tools/net/ynl/pyynl/ynl_gen_rst.py --nogit --nogit-blame --nogit-fallback
Donald Hunter <donald.hunter@gmail.com> (maintainer:YAML NETLINK (YNL))
Jakub Kicinski <kuba@kernel.org> (maintainer:YAML NETLINK (YNL))
"David S. Miller" <davem@davemloft.net> (maintainer:NETWORKING [GENERAL])
Eric Dumazet <edumazet@google.com> (maintainer:NETWORKING [GENERAL])
Paolo Abeni <pabeni@redhat.com> (maintainer:NETWORKING [GENERAL])
Simon Horman <horms@kernel.org> (reviewer:NETWORKING [GENERAL])
netdev@vger.kernel.org (open list:NETWORKING [GENERAL])
linux-kernel@vger.kernel.org (open list)
YAML NETLINK (YNL) status: Unknown
(do they all apply to YNL doc parser?)
Plus having doc ML/Maintainer on it:
Jonathan Corbet <corbet@lwn.net> (maintainer:DOCUMENTATION)
linux-doc@vger.kernel.org (open list:DOCUMENTATION)
So, at least the file called by the Sphinx class should be in the
linux-doc entry in the MAINTAINERS file.
The rationale is that linux-doc and Jon should be Cc'd, just in case some
change there ends up causing build issues with a toolchain version
that is officially supported, as documented at
Documentation/process/changes.rst, e.g. currently whatever is there is
expected to be compatible with:
====================== =============== ========================================
Program Minimal version Command to check the version
====================== =============== ========================================
...
Sphinx\ [#f1]_ 3.4.3 sphinx-build --version
...
Python (optional) 3.9.x python3 --version
...
This is independent if the YNL classes are either at scripts/lib
or at tools/net/ynl/pyynl/lib.
>
> > I'm more interested on having a single place where python libraries
> > could be placed.
>
> Me too, especially for selftests. But it's not clear to me that
> scripts/ is the right location. I thought purely user space code
> should live in tools/ and bulk of YNL is for user space.
Several scripts under scripts/ are meant to run outside build
time. One clear example is:
$ ./scripts/get_abi.py undefined
That basically checks if the userspace sysfs API is properly
documented, by reading the machine's sysfs nodes and comparing them
with the uAPI documentation. Such a tool can also be used to check if
the ABI documentation Python classes are working as expected.
So, it is a mix of kernel build time and userspace.
There are also pure userspace tools like those two:
./scripts/get_dvb_firmware
./scripts/extract_xc3028.pl
Both extract firmware files from some other OS and write them as
Linux firmware files to be stored under /lib/firmware. They are
userspace-only tools.
-
From my side, I don't care where Python classes would be placed,
but I prefer having them on a single common place. It could be:
/scripts/lib
/tools/lib
/python/lib
eventually with their own sub-directories on it, like what we have
today:
${some_prefix}/kdoc
${some_prefix}/abi
In the case of netlink, it could be:
${some_prefix}/netlink
Yet, IMO, we should not have a different location for userspace
and non-userspace, as it is very hard to draw the borders in several
cases, like the ABI toolset.
> > Eventually, some classes might be re-used in the future
> > by multiple scripts and subsystems, when it makes sense, just like we do
> > already with Kernel's kAPIs. This also helps when checking what is the
> > Python's minimal version that are required by the Kernel when updating
> > it at:
>
> I think this is exactly the same point Donald is making, but from YNL
> perspective. The hope is to share more code between the ReST generator,
> the existing C generator and Python library. The later two are already
> based on a shared spec model.
That makes perfect sense to me. Yet, this doesn't prevent having
a:
${some_prefix}/ynl
directory where you would place Netlink YNL parsing, where the prefix
would be either:
- /scripts/lib
- /tools/lib
- /python/lib
- something else
It may even use some common classes under:
${some_prefix}/${some_common_prefix}
---
Now, seeing your comments, maybe the main point is whether it is OK to
add userspace libraries to scripts/lib or not. IMO, using "/scripts/lib"
is OK, no matter if the script is kernel-build related or "pure userspace",
but if there is no consensus, we could migrate what we have to
"python/lib" or to some other place.
Thanks,
Mauro
* Re: [PATCH v4 12/14] MAINTAINERS: add maintainers for netlink_yml_parser.py
2025-06-14 19:46 ` Jakub Kicinski
2025-06-16 10:51 ` Mauro Carvalho Chehab
@ 2025-06-19 20:06 ` Jonathan Corbet
2025-06-20 15:31 ` Mauro Carvalho Chehab
1 sibling, 1 reply; 29+ messages in thread
From: Jonathan Corbet @ 2025-06-19 20:06 UTC (permalink / raw)
To: Jakub Kicinski, Mauro Carvalho Chehab
Cc: Donald Hunter, Linux Doc Mailing List, Akira Yokosawa,
Breno Leitao, David S. Miller, Eric Dumazet,
Ignacio Encinas Rubio, Jan Stancek, Marco Elver, Paolo Abeni,
Ruben Wauters, Shuah Khan, joel, linux-kernel-mentees,
linux-kernel, lkmm, netdev, peterz, stern
Jakub Kicinski <kuba@kernel.org> writes:
> On Sat, 14 Jun 2025 20:56:09 +0200 Mauro Carvalho Chehab wrote:
>> I'm more interested on having a single place where python libraries
>> could be placed.
>
> Me too, especially for selftests. But it's not clear to me that
> scripts/ is the right location. I thought purely user space code
> should live in tools/ and bulk of YNL is for user space.
I've been out wandering the woods and canyons with no connectivity for a
bit, so missed this whole discussion, sorry.
Mauro and I had talked about the proper home for Python libraries when
he reworked kernel-doc; we ended up with them under scripts/, which I
didn't find entirely pleasing. If you were to ask me today, I'd say
they should be under lib/python, but tomorrow I might say something
else...
In truth, I don't think it matters much, but I *do* think we should have
a single location from which to import kernel-specific Python code.
Spreading it throughout the tree just isn't going to lead to joy.
Thanks,
jon
* Re: [PATCH v4 12/14] MAINTAINERS: add maintainers for netlink_yml_parser.py
2025-06-19 20:06 ` Jonathan Corbet
@ 2025-06-20 15:31 ` Mauro Carvalho Chehab
0 siblings, 0 replies; 29+ messages in thread
From: Mauro Carvalho Chehab @ 2025-06-20 15:31 UTC (permalink / raw)
To: Jonathan Corbet
Cc: Jakub Kicinski, Donald Hunter, Linux Doc Mailing List,
Akira Yokosawa, Breno Leitao, David S. Miller, Eric Dumazet,
Jan Stancek, Marco Elver, Paolo Abeni, Ruben Wauters, Shuah Khan,
joel, linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz,
stern
On Thu, 19 Jun 2025 14:06:58 -0600,
Jonathan Corbet <corbet@lwn.net> wrote:
> Jakub Kicinski <kuba@kernel.org> writes:
>
> > On Sat, 14 Jun 2025 20:56:09 +0200 Mauro Carvalho Chehab wrote:
>
> >> I'm more interested on having a single place where python libraries
> >> could be placed.
> >
> > Me too, especially for selftests. But it's not clear to me that
> > scripts/ is the right location. I thought purely user space code
> > should live in tools/ and bulk of YNL is for user space.
>
> I've been out wandering the woods and canyons with no connectivity for a
> bit, so missed this whole discussion, sorry.
Sounds fun!
> Mauro and I had talked about the proper home for Python libraries when
> he reworked kernel-doc; we ended up with them under scripts/, which I
> didn't find entirely pleasing. If you were to ask me today, I'd say
> they should be under lib/python, but tomorrow I might say something
> else...
Yeah, I guess you proposed lib/python before... I could be wrong though.
Anyway, at least for me lib/python sounds like a better alternative than
scripts. I wouldn't mind tools/lib/python or some other place.
> In truth, I don't think it matters much, but I *do* think we should have
> a single location from which to import kernel-specific Python code.
> Spreading it throughout the tree just isn't going to lead to joy.
We're aligned in that regard: IMO, we need a single place within
the kernel for classes that might be shared.
As I commented on one of the PRs, maybe the series could be merged
with what Donald proposed (tools/net/ynl/pyynl/lib/doc_generator.py),
while we're still discussing. So, let's focus on getting it reviewed
and merged without needing to wait for a broader discussion
about its permanent location.
We can later shift the code once we reach an agreement.
-
To start the discussions about a permanent location, in the specific
case of YNL, we currently have there:
$ tree -d tools/net/ynl/ -I __pycache__
tools/net/ynl/
├── generated
├── lib
├── pyynl
│ └── lib
└── samples
where pyynl has the executables and pyynl/lib the Python libraries.
What I would suggest is to move what is under "pyynl/lib"
to "{prefix}/ynl", where "{prefix}" can be "lib/python",
"tools/lib/python", "scripts/lib" or whatever other location
we agree on.
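To make the idea concrete, here is a minimal sketch of how a tool could
locate the shared library after such a move. This is purely illustrative:
the "lib/python" prefix and the "ynl" package name below are assumptions
from this discussion, not anything merged; the helper function name is
hypothetical.

```python
import os
import sys

def add_shared_libdir(srctree, prefix="lib/python"):
    """Prepend the shared kernel Python library dir to sys.path.

    'prefix' is hypothetical: "lib/python", "tools/lib/python" and
    "scripts/lib" have all been floated as candidates in this thread.
    """
    libdir = os.path.join(srctree, prefix)
    if libdir not in sys.path:
        sys.path.insert(0, libdir)
    return libdir

# A consumer script would then do something like:
#   add_shared_libdir(os.environ.get("srctree", "."))
#   from ynl import doc_generator   # name after the proposed move
```

Whatever prefix wins, the point is that every in-tree tool would resolve
imports from that one location instead of hard-coding paths scattered
around tools/ and scripts/.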
For now, I placed the latest version of my doc patch series
under:
https://github.com/mchehab/linux/tree/netlink_v8
to keep them in a central place on one of my scratch trees.
Today I sent an initial patch series with some non-YAML related
patches to the linux-doc ML for review. I have another set of
patches after it, which I'm planning to send on Monday. At the
end comes the YAML parser submission.
Regards,
Mauro
^ permalink raw reply [flat|nested] 29+ messages in thread
end of thread, other threads:[~2025-06-20 15:31 UTC | newest]
Thread overview: 29+ messages
2025-06-14 8:55 [PATCH v4 00/14] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
2025-06-14 8:55 ` [PATCH v4 01/14] tools: ynl_gen_rst.py: create a top-level reference Mauro Carvalho Chehab
2025-06-14 8:55 ` [PATCH v4 02/14] docs: netlink: netlink-raw.rst: use :ref: instead of :doc: Mauro Carvalho Chehab
2025-06-14 8:55 ` [PATCH v4 03/14] docs: netlink: don't ignore generated rst files Mauro Carvalho Chehab
2025-06-14 8:55 ` [PATCH v4 04/14] tools: ynl_gen_rst.py: make the index parser more generic Mauro Carvalho Chehab
2025-06-14 13:41 ` Donald Hunter
2025-06-14 14:58 ` Mauro Carvalho Chehab
2025-06-14 8:55 ` [PATCH v4 05/14] tools: ynl_gen_rst.py: Split library from command line tool Mauro Carvalho Chehab
2025-06-14 14:09 ` Donald Hunter
2025-06-14 8:56 ` [PATCH v4 06/14] scripts: lib: netlink_yml_parser.py: use classes Mauro Carvalho Chehab
2025-06-14 14:11 ` Donald Hunter
2025-06-14 8:56 ` [PATCH v4 07/14] tools: ynl_gen_rst.py: move index.rst generator to the script Mauro Carvalho Chehab
2025-06-14 14:15 ` Donald Hunter
2025-06-14 15:35 ` Mauro Carvalho Chehab
2025-06-14 8:56 ` [PATCH v4 08/14] docs: sphinx: add a parser for yaml files for Netlink specs Mauro Carvalho Chehab
2025-06-14 8:56 ` [PATCH v4 09/14] docs: use parser_yaml extension to handle " Mauro Carvalho Chehab
2025-06-14 8:56 ` [PATCH v4 10/14] docs: conf.py: don't handle yaml files outside " Mauro Carvalho Chehab
2025-06-14 8:56 ` [PATCH v4 11/14] docs: uapi: netlink: update netlink specs link Mauro Carvalho Chehab
2025-06-14 8:56 ` [PATCH v4 12/14] MAINTAINERS: add maintainers for netlink_yml_parser.py Mauro Carvalho Chehab
2025-06-14 14:22 ` Donald Hunter
2025-06-14 15:32 ` Mauro Carvalho Chehab
2025-06-14 17:37 ` Jakub Kicinski
2025-06-14 18:56 ` Mauro Carvalho Chehab
2025-06-14 19:46 ` Jakub Kicinski
2025-06-16 10:51 ` Mauro Carvalho Chehab
2025-06-19 20:06 ` Jonathan Corbet
2025-06-20 15:31 ` Mauro Carvalho Chehab
2025-06-14 8:56 ` [PATCH v4 13/14] docs: Makefile: disable check rules on make cleandocs Mauro Carvalho Chehab
2025-06-14 8:56 ` [PATCH v4 14/14] docs: conf.py: properly handle include and exclude patterns Mauro Carvalho Chehab