* [PATCH v10 01/14] docs: netlink: netlink-raw.rst: use :ref: instead of :doc:
2025-07-28 16:01 [PATCH v10 00/14] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
@ 2025-07-28 16:01 ` Mauro Carvalho Chehab
2025-07-28 16:01 ` [PATCH v10 02/14] tools: ynl_gen_rst.py: Split library from command line tool Mauro Carvalho Chehab
` (13 subsequent siblings)
14 siblings, 0 replies; 20+ messages in thread
From: Mauro Carvalho Chehab @ 2025-07-28 16:01 UTC (permalink / raw)
To: Linux Doc Mailing List
Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
David S. Miller, Donald Hunter, Eric Dumazet,
Ignacio Encinas Rubio, Jakub Kicinski, Jan Stancek,
Jonathan Corbet, Marco Elver, Paolo Abeni, Randy Dunlap,
Ruben Wauters, Shuah Khan, Simon Horman, joel,
linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz, stern
Currently, the rt documents are referenced with:

Documentation/userspace-api/netlink/netlink-raw.rst: :doc:`rt-link<../../networking/netlink_spec/rt-link>`
Documentation/userspace-api/netlink/netlink-raw.rst: :doc:`tc<../../networking/netlink_spec/tc>`
Documentation/userspace-api/netlink/netlink-raw.rst: :doc:`tc<../../networking/netlink_spec/tc>`

:doc: references with relative paths don't always work, as they can
break when O= is used. They are also hard to maintain and may break if
we change the way the .rst files are generated from YAML. It is better
to use a reference to the netlink family instead.

So, replace them with the Sphinx cross-reference labels that are
created by ynl_gen_rst.py.
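As a quick illustration (a minimal sketch, not part of the patch), the
label that the new :ref: targets resolve to is emitted by the
rst_label() helper touched below:

    # sketch only, mirroring the rst_label() helper from ynl_gen_rst.py
    def rst_label(title: str) -> str:
        """Return a formatted label"""
        return f".. _{title}:\n\n"

    print(rst_label("netlink-" + "rt-link"), end="")
    # prints ".. _netlink-rt-link:" plus a blank line, which is the
    # target referenced by :ref:`rt-link<netlink-rt-link>` in
    # netlink-raw.rst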
Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
Reviewed-by: Donald Hunter <donald.hunter@gmail.com>
---
Documentation/userspace-api/netlink/netlink-raw.rst | 6 +++---
tools/net/ynl/pyynl/ynl_gen_rst.py | 5 +++--
2 files changed, 6 insertions(+), 5 deletions(-)
diff --git a/Documentation/userspace-api/netlink/netlink-raw.rst b/Documentation/userspace-api/netlink/netlink-raw.rst
index 31fc91020eb3..aae296c170c5 100644
--- a/Documentation/userspace-api/netlink/netlink-raw.rst
+++ b/Documentation/userspace-api/netlink/netlink-raw.rst
@@ -62,8 +62,8 @@ Sub-messages
------------
Several raw netlink families such as
-:doc:`rt-link<../../networking/netlink_spec/rt-link>` and
-:doc:`tc<../../networking/netlink_spec/tc>` use attribute nesting as an
+:ref:`rt-link<netlink-rt-link>` and
+:ref:`tc<netlink-tc>` use attribute nesting as an
abstraction to carry module specific information.
Conceptually it looks as follows::
@@ -162,7 +162,7 @@ then this is an error.
Nested struct definitions
-------------------------
-Many raw netlink families such as :doc:`tc<../../networking/netlink_spec/tc>`
+Many raw netlink families such as :ref:`tc<netlink-tc>`
make use of nested struct definitions. The ``netlink-raw`` schema makes it
possible to embed a struct within a struct definition using the ``struct``
property. For example, the following struct definition embeds the
diff --git a/tools/net/ynl/pyynl/ynl_gen_rst.py b/tools/net/ynl/pyynl/ynl_gen_rst.py
index 0cb6348e28d3..7bfb8ceeeefc 100755
--- a/tools/net/ynl/pyynl/ynl_gen_rst.py
+++ b/tools/net/ynl/pyynl/ynl_gen_rst.py
@@ -314,10 +314,11 @@ def parse_yaml(obj: Dict[str, Any]) -> str:
# Main header
- lines.append(rst_header())
-
family = obj['name']
+ lines.append(rst_header())
+ lines.append(rst_label("netlink-" + family))
+
title = f"Family ``{family}`` netlink specification"
lines.append(rst_title(title))
lines.append(rst_paragraph(".. contents:: :depth: 3\n"))
--
2.49.0
* [PATCH v10 02/14] tools: ynl_gen_rst.py: Split library from command line tool
2025-07-28 16:01 [PATCH v10 00/14] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
2025-07-28 16:01 ` [PATCH v10 01/14] docs: netlink: netlink-raw.rst: use :ref: instead of :doc: Mauro Carvalho Chehab
@ 2025-07-28 16:01 ` Mauro Carvalho Chehab
2025-07-28 16:01 ` [PATCH v10 03/14] docs: netlink: index.rst: add a netlink index file Mauro Carvalho Chehab
` (12 subsequent siblings)
14 siblings, 0 replies; 20+ messages in thread
From: Mauro Carvalho Chehab @ 2025-07-28 16:01 UTC (permalink / raw)
To: Linux Doc Mailing List
Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
David S. Miller, Donald Hunter, Eric Dumazet,
Ignacio Encinas Rubio, Jakub Kicinski, Jan Stancek,
Jonathan Corbet, Marco Elver, Paolo Abeni, Randy Dunlap,
Ruben Wauters, Shuah Khan, Simon Horman, joel,
linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz, stern
As we'll be using the Netlink specs parser inside a Sphinx
extension, move the library part out of the command line tool.

While here, change the code that generates an index file so that it
accepts inputs with both .rst and .yaml extensions. With that, the
tool can easily be tested with:

    tools/net/ynl/pyynl/ynl_gen_rst.py -x -o Documentation/netlink/specs/foo.rst

without needing to first generate a temporary directory with the
.rst files.
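As a usage sketch of the split (the spec file name below is only an
example, not something this patch adds), the library can now be driven
directly from Python:

    import pathlib
    import sys

    # make tools/net/ynl/pyynl importable, like the command line tool does
    sys.path.append(pathlib.Path("tools/net/ynl/pyynl").resolve().as_posix())

    from lib import YnlDocGenerator   # exported by lib/__init__.py in this patch

    gen = YnlDocGenerator()
    rst = gen.parse_yaml_file("Documentation/netlink/specs/foo.yaml")  # example spec
    print(rst.splitlines()[0])        # ".. SPDX-License-Identifier: GPL-2.0"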
Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
Reviewed-by: Donald Hunter <donald.hunter@gmail.com>
---
tools/net/ynl/pyynl/lib/__init__.py | 2 +
tools/net/ynl/pyynl/lib/doc_generator.py | 382 +++++++++++++++++++++++
tools/net/ynl/pyynl/ynl_gen_rst.py | 375 +---------------------
3 files changed, 400 insertions(+), 359 deletions(-)
create mode 100644 tools/net/ynl/pyynl/lib/doc_generator.py
diff --git a/tools/net/ynl/pyynl/lib/__init__.py b/tools/net/ynl/pyynl/lib/__init__.py
index 71518b9842ee..5f266ebe4526 100644
--- a/tools/net/ynl/pyynl/lib/__init__.py
+++ b/tools/net/ynl/pyynl/lib/__init__.py
@@ -4,6 +4,8 @@ from .nlspec import SpecAttr, SpecAttrSet, SpecEnumEntry, SpecEnumSet, \
SpecFamily, SpecOperation, SpecSubMessage, SpecSubMessageFormat
from .ynl import YnlFamily, Netlink, NlError
+from .doc_generator import YnlDocGenerator
+
__all__ = ["SpecAttr", "SpecAttrSet", "SpecEnumEntry", "SpecEnumSet",
"SpecFamily", "SpecOperation", "SpecSubMessage", "SpecSubMessageFormat",
"YnlFamily", "Netlink", "NlError"]
diff --git a/tools/net/ynl/pyynl/lib/doc_generator.py b/tools/net/ynl/pyynl/lib/doc_generator.py
new file mode 100644
index 000000000000..80e468086693
--- /dev/null
+++ b/tools/net/ynl/pyynl/lib/doc_generator.py
@@ -0,0 +1,382 @@
+#!/usr/bin/env python3
+# SPDX-License-Identifier: GPL-2.0
+# -*- coding: utf-8; mode: python -*-
+
+"""
+ Class to auto generate the documentation for Netlink specifications.
+
+ :copyright: Copyright (C) 2023 Breno Leitao <leitao@debian.org>
+ :license: GPL Version 2, June 1991 see linux/COPYING for details.
+
+ This class performs extensive parsing to the Linux kernel's netlink YAML
+ spec files, in an effort to avoid needing to heavily mark up the original
+ YAML file.
+
+ This code is split in two classes:
+ 1) RST formatters: Use to convert a string to a RST output
+ 2) YAML Netlink (YNL) doc generator: Generate docs from YAML data
+"""
+
+from typing import Any, Dict, List
+import os.path
+import sys
+import argparse
+import logging
+import yaml
+
+
+# ==============
+# RST Formatters
+# ==============
+class RstFormatters:
+ SPACE_PER_LEVEL = 4
+
+ @staticmethod
+ def headroom(level: int) -> str:
+ """Return space to format"""
+ return " " * (level * RstFormatters.SPACE_PER_LEVEL)
+
+
+ @staticmethod
+ def bold(text: str) -> str:
+ """Format bold text"""
+ return f"**{text}**"
+
+
+ @staticmethod
+ def inline(text: str) -> str:
+ """Format inline text"""
+ return f"``{text}``"
+
+
+ @staticmethod
+ def sanitize(text: str) -> str:
+ """Remove newlines and multiple spaces"""
+ # This is useful for some fields that are spread across multiple lines
+ return str(text).replace("\n", " ").strip()
+
+
+ def rst_fields(self, key: str, value: str, level: int = 0) -> str:
+ """Return a RST formatted field"""
+ return self.headroom(level) + f":{key}: {value}"
+
+
+ def rst_definition(self, key: str, value: Any, level: int = 0) -> str:
+ """Format a single rst definition"""
+ return self.headroom(level) + key + "\n" + self.headroom(level + 1) + str(value)
+
+
+ def rst_paragraph(self, paragraph: str, level: int = 0) -> str:
+ """Return a formatted paragraph"""
+ return self.headroom(level) + paragraph
+
+
+ def rst_bullet(self, item: str, level: int = 0) -> str:
+ """Return a formatted a bullet"""
+ return self.headroom(level) + f"- {item}"
+
+
+ @staticmethod
+ def rst_subsection(title: str) -> str:
+ """Add a sub-section to the document"""
+ return f"{title}\n" + "-" * len(title)
+
+
+ @staticmethod
+ def rst_subsubsection(title: str) -> str:
+ """Add a sub-sub-section to the document"""
+ return f"{title}\n" + "~" * len(title)
+
+
+ @staticmethod
+ def rst_section(namespace: str, prefix: str, title: str) -> str:
+ """Add a section to the document"""
+ return f".. _{namespace}-{prefix}-{title}:\n\n{title}\n" + "=" * len(title)
+
+
+ @staticmethod
+ def rst_subtitle(title: str) -> str:
+ """Add a subtitle to the document"""
+ return "\n" + "-" * len(title) + f"\n{title}\n" + "-" * len(title) + "\n\n"
+
+
+ @staticmethod
+ def rst_title(title: str) -> str:
+ """Add a title to the document"""
+ return "=" * len(title) + f"\n{title}\n" + "=" * len(title) + "\n\n"
+
+
+ def rst_list_inline(self, list_: List[str], level: int = 0) -> str:
+ """Format a list using inlines"""
+ return self.headroom(level) + "[" + ", ".join(self.inline(i) for i in list_) + "]"
+
+
+ @staticmethod
+ def rst_ref(namespace: str, prefix: str, name: str) -> str:
+ """Add a hyperlink to the document"""
+ mappings = {'enum': 'definition',
+ 'fixed-header': 'definition',
+ 'nested-attributes': 'attribute-set',
+ 'struct': 'definition'}
+ if prefix in mappings:
+ prefix = mappings[prefix]
+ return f":ref:`{namespace}-{prefix}-{name}`"
+
+
+ def rst_header(self) -> str:
+ """The headers for all the auto generated RST files"""
+ lines = []
+
+ lines.append(self.rst_paragraph(".. SPDX-License-Identifier: GPL-2.0"))
+ lines.append(self.rst_paragraph(".. NOTE: This document was auto-generated.\n\n"))
+
+ return "\n".join(lines)
+
+
+ @staticmethod
+ def rst_toctree(maxdepth: int = 2) -> str:
+ """Generate a toctree RST primitive"""
+ lines = []
+
+ lines.append(".. toctree::")
+ lines.append(f" :maxdepth: {maxdepth}\n\n")
+
+ return "\n".join(lines)
+
+
+ @staticmethod
+ def rst_label(title: str) -> str:
+ """Return a formatted label"""
+ return f".. _{title}:\n\n"
+
+# =======
+# Parsers
+# =======
+class YnlDocGenerator:
+
+ fmt = RstFormatters()
+
+ def parse_mcast_group(self, mcast_group: List[Dict[str, Any]]) -> str:
+ """Parse 'multicast' group list and return a formatted string"""
+ lines = []
+ for group in mcast_group:
+ lines.append(self.fmt.rst_bullet(group["name"]))
+
+ return "\n".join(lines)
+
+
+ def parse_do(self, do_dict: Dict[str, Any], level: int = 0) -> str:
+ """Parse 'do' section and return a formatted string"""
+ lines = []
+ for key in do_dict.keys():
+ lines.append(self.fmt.rst_paragraph(self.fmt.bold(key), level + 1))
+ if key in ['request', 'reply']:
+ lines.append(self.parse_do_attributes(do_dict[key], level + 1) + "\n")
+ else:
+ lines.append(self.fmt.headroom(level + 2) + do_dict[key] + "\n")
+
+ return "\n".join(lines)
+
+
+ def parse_do_attributes(self, attrs: Dict[str, Any], level: int = 0) -> str:
+ """Parse 'attributes' section"""
+ if "attributes" not in attrs:
+ return ""
+ lines = [self.fmt.rst_fields("attributes", self.fmt.rst_list_inline(attrs["attributes"]), level + 1)]
+
+ return "\n".join(lines)
+
+
+ def parse_operations(self, operations: List[Dict[str, Any]], namespace: str) -> str:
+ """Parse operations block"""
+ preprocessed = ["name", "doc", "title", "do", "dump", "flags"]
+ linkable = ["fixed-header", "attribute-set"]
+ lines = []
+
+ for operation in operations:
+ lines.append(self.fmt.rst_section(namespace, 'operation', operation["name"]))
+ lines.append(self.fmt.rst_paragraph(operation["doc"]) + "\n")
+
+ for key in operation.keys():
+ if key in preprocessed:
+ # Skip the special fields
+ continue
+ value = operation[key]
+ if key in linkable:
+ value = self.fmt.rst_ref(namespace, key, value)
+ lines.append(self.fmt.rst_fields(key, value, 0))
+ if 'flags' in operation:
+ lines.append(self.fmt.rst_fields('flags', self.fmt.rst_list_inline(operation['flags'])))
+
+ if "do" in operation:
+ lines.append(self.fmt.rst_paragraph(":do:", 0))
+ lines.append(self.parse_do(operation["do"], 0))
+ if "dump" in operation:
+ lines.append(self.fmt.rst_paragraph(":dump:", 0))
+ lines.append(self.parse_do(operation["dump"], 0))
+
+ # New line after fields
+ lines.append("\n")
+
+ return "\n".join(lines)
+
+
+ def parse_entries(self, entries: List[Dict[str, Any]], level: int) -> str:
+ """Parse a list of entries"""
+ ignored = ["pad"]
+ lines = []
+ for entry in entries:
+ if isinstance(entry, dict):
+ # entries could be a list or a dictionary
+ field_name = entry.get("name", "")
+ if field_name in ignored:
+ continue
+ type_ = entry.get("type")
+ if type_:
+ field_name += f" ({self.fmt.inline(type_)})"
+ lines.append(
+ self.fmt.rst_fields(field_name, self.fmt.sanitize(entry.get("doc", "")), level)
+ )
+ elif isinstance(entry, list):
+ lines.append(self.fmt.rst_list_inline(entry, level))
+ else:
+ lines.append(self.fmt.rst_bullet(self.fmt.inline(self.fmt.sanitize(entry)), level))
+
+ lines.append("\n")
+ return "\n".join(lines)
+
+
+ def parse_definitions(self, defs: Dict[str, Any], namespace: str) -> str:
+ """Parse definitions section"""
+ preprocessed = ["name", "entries", "members"]
+ ignored = ["render-max"] # This is not printed
+ lines = []
+
+ for definition in defs:
+ lines.append(self.fmt.rst_section(namespace, 'definition', definition["name"]))
+ for k in definition.keys():
+ if k in preprocessed + ignored:
+ continue
+ lines.append(self.fmt.rst_fields(k, self.fmt.sanitize(definition[k]), 0))
+
+ # Field list needs to finish with a new line
+ lines.append("\n")
+ if "entries" in definition:
+ lines.append(self.fmt.rst_paragraph(":entries:", 0))
+ lines.append(self.parse_entries(definition["entries"], 1))
+ if "members" in definition:
+ lines.append(self.fmt.rst_paragraph(":members:", 0))
+ lines.append(self.parse_entries(definition["members"], 1))
+
+ return "\n".join(lines)
+
+
+ def parse_attr_sets(self, entries: List[Dict[str, Any]], namespace: str) -> str:
+ """Parse attribute from attribute-set"""
+ preprocessed = ["name", "type"]
+ linkable = ["enum", "nested-attributes", "struct", "sub-message"]
+ ignored = ["checks"]
+ lines = []
+
+ for entry in entries:
+ lines.append(self.fmt.rst_section(namespace, 'attribute-set', entry["name"]))
+ for attr in entry["attributes"]:
+ type_ = attr.get("type")
+ attr_line = attr["name"]
+ if type_:
+ # Add the attribute type in the same line
+ attr_line += f" ({self.fmt.inline(type_)})"
+
+ lines.append(self.fmt.rst_subsubsection(attr_line))
+
+ for k in attr.keys():
+ if k in preprocessed + ignored:
+ continue
+ if k in linkable:
+ value = self.fmt.rst_ref(namespace, k, attr[k])
+ else:
+ value = self.fmt.sanitize(attr[k])
+ lines.append(self.fmt.rst_fields(k, value, 0))
+ lines.append("\n")
+
+ return "\n".join(lines)
+
+
+ def parse_sub_messages(self, entries: List[Dict[str, Any]], namespace: str) -> str:
+ """Parse sub-message definitions"""
+ lines = []
+
+ for entry in entries:
+ lines.append(self.fmt.rst_section(namespace, 'sub-message', entry["name"]))
+ for fmt in entry["formats"]:
+ value = fmt["value"]
+
+ lines.append(self.fmt.rst_bullet(self.fmt.bold(value)))
+ for attr in ['fixed-header', 'attribute-set']:
+ if attr in fmt:
+ lines.append(self.fmt.rst_fields(attr,
+ self.fmt.rst_ref(namespace, attr, fmt[attr]),
+ 1))
+ lines.append("\n")
+
+ return "\n".join(lines)
+
+
+ def parse_yaml(self, obj: Dict[str, Any]) -> str:
+ """Format the whole YAML into a RST string"""
+ lines = []
+
+ # Main header
+
+ family = obj['name']
+
+ lines.append(self.fmt.rst_header())
+ lines.append(self.fmt.rst_label("netlink-" + family))
+
+ title = f"Family ``{family}`` netlink specification"
+ lines.append(self.fmt.rst_title(title))
+ lines.append(self.fmt.rst_paragraph(".. contents:: :depth: 3\n"))
+
+ if "doc" in obj:
+ lines.append(self.fmt.rst_subtitle("Summary"))
+ lines.append(self.fmt.rst_paragraph(obj["doc"], 0))
+
+ # Operations
+ if "operations" in obj:
+ lines.append(self.fmt.rst_subtitle("Operations"))
+ lines.append(self.parse_operations(obj["operations"]["list"], family))
+
+ # Multicast groups
+ if "mcast-groups" in obj:
+ lines.append(self.fmt.rst_subtitle("Multicast groups"))
+ lines.append(self.parse_mcast_group(obj["mcast-groups"]["list"]))
+
+ # Definitions
+ if "definitions" in obj:
+ lines.append(self.fmt.rst_subtitle("Definitions"))
+ lines.append(self.parse_definitions(obj["definitions"], family))
+
+ # Attributes set
+ if "attribute-sets" in obj:
+ lines.append(self.fmt.rst_subtitle("Attribute sets"))
+ lines.append(self.parse_attr_sets(obj["attribute-sets"], family))
+
+ # Sub-messages
+ if "sub-messages" in obj:
+ lines.append(self.fmt.rst_subtitle("Sub-messages"))
+ lines.append(self.parse_sub_messages(obj["sub-messages"], family))
+
+ return "\n".join(lines)
+
+
+ # Main functions
+ # ==============
+
+
+ def parse_yaml_file(self, filename: str) -> str:
+ """Transform the YAML specified by filename into an RST-formatted string"""
+ with open(filename, "r", encoding="utf-8") as spec_file:
+ yaml_data = yaml.safe_load(spec_file)
+ content = self.parse_yaml(yaml_data)
+
+ return content
diff --git a/tools/net/ynl/pyynl/ynl_gen_rst.py b/tools/net/ynl/pyynl/ynl_gen_rst.py
index 7bfb8ceeeefc..010315fad498 100755
--- a/tools/net/ynl/pyynl/ynl_gen_rst.py
+++ b/tools/net/ynl/pyynl/ynl_gen_rst.py
@@ -10,354 +10,17 @@
This script performs extensive parsing to the Linux kernel's netlink YAML
spec files, in an effort to avoid needing to heavily mark up the original
- YAML file.
-
- This code is split in three big parts:
- 1) RST formatters: Use to convert a string to a RST output
- 2) Parser helpers: Functions to parse the YAML data structure
- 3) Main function and small helpers
+ YAML file. It uses the library code from scripts/lib.
"""
-from typing import Any, Dict, List
import os.path
+import pathlib
import sys
import argparse
import logging
-import yaml
-
-
-SPACE_PER_LEVEL = 4
-
-
-# RST Formatters
-# ==============
-def headroom(level: int) -> str:
- """Return space to format"""
- return " " * (level * SPACE_PER_LEVEL)
-
-
-def bold(text: str) -> str:
- """Format bold text"""
- return f"**{text}**"
-
-
-def inline(text: str) -> str:
- """Format inline text"""
- return f"``{text}``"
-
-
-def sanitize(text: str) -> str:
- """Remove newlines and multiple spaces"""
- # This is useful for some fields that are spread across multiple lines
- return str(text).replace("\n", " ").strip()
-
-
-def rst_fields(key: str, value: str, level: int = 0) -> str:
- """Return a RST formatted field"""
- return headroom(level) + f":{key}: {value}"
-
-
-def rst_definition(key: str, value: Any, level: int = 0) -> str:
- """Format a single rst definition"""
- return headroom(level) + key + "\n" + headroom(level + 1) + str(value)
-
-
-def rst_paragraph(paragraph: str, level: int = 0) -> str:
- """Return a formatted paragraph"""
- return headroom(level) + paragraph
-
-
-def rst_bullet(item: str, level: int = 0) -> str:
- """Return a formatted a bullet"""
- return headroom(level) + f"- {item}"
-
-
-def rst_subsection(title: str) -> str:
- """Add a sub-section to the document"""
- return f"{title}\n" + "-" * len(title)
-
-
-def rst_subsubsection(title: str) -> str:
- """Add a sub-sub-section to the document"""
- return f"{title}\n" + "~" * len(title)
-
-
-def rst_section(namespace: str, prefix: str, title: str) -> str:
- """Add a section to the document"""
- return f".. _{namespace}-{prefix}-{title}:\n\n{title}\n" + "=" * len(title)
-
-
-def rst_subtitle(title: str) -> str:
- """Add a subtitle to the document"""
- return "\n" + "-" * len(title) + f"\n{title}\n" + "-" * len(title) + "\n\n"
-
-
-def rst_title(title: str) -> str:
- """Add a title to the document"""
- return "=" * len(title) + f"\n{title}\n" + "=" * len(title) + "\n\n"
-
-
-def rst_list_inline(list_: List[str], level: int = 0) -> str:
- """Format a list using inlines"""
- return headroom(level) + "[" + ", ".join(inline(i) for i in list_) + "]"
-
-
-def rst_ref(namespace: str, prefix: str, name: str) -> str:
- """Add a hyperlink to the document"""
- mappings = {'enum': 'definition',
- 'fixed-header': 'definition',
- 'nested-attributes': 'attribute-set',
- 'struct': 'definition'}
- if prefix in mappings:
- prefix = mappings[prefix]
- return f":ref:`{namespace}-{prefix}-{name}`"
-
-
-def rst_header() -> str:
- """The headers for all the auto generated RST files"""
- lines = []
-
- lines.append(rst_paragraph(".. SPDX-License-Identifier: GPL-2.0"))
- lines.append(rst_paragraph(".. NOTE: This document was auto-generated.\n\n"))
-
- return "\n".join(lines)
-
-
-def rst_toctree(maxdepth: int = 2) -> str:
- """Generate a toctree RST primitive"""
- lines = []
-
- lines.append(".. toctree::")
- lines.append(f" :maxdepth: {maxdepth}\n\n")
-
- return "\n".join(lines)
-
-
-def rst_label(title: str) -> str:
- """Return a formatted label"""
- return f".. _{title}:\n\n"
-
-
-# Parsers
-# =======
-
-
-def parse_mcast_group(mcast_group: List[Dict[str, Any]]) -> str:
- """Parse 'multicast' group list and return a formatted string"""
- lines = []
- for group in mcast_group:
- lines.append(rst_bullet(group["name"]))
-
- return "\n".join(lines)
-
-
-def parse_do(do_dict: Dict[str, Any], level: int = 0) -> str:
- """Parse 'do' section and return a formatted string"""
- lines = []
- for key in do_dict.keys():
- lines.append(rst_paragraph(bold(key), level + 1))
- if key in ['request', 'reply']:
- lines.append(parse_do_attributes(do_dict[key], level + 1) + "\n")
- else:
- lines.append(headroom(level + 2) + do_dict[key] + "\n")
-
- return "\n".join(lines)
-
-
-def parse_do_attributes(attrs: Dict[str, Any], level: int = 0) -> str:
- """Parse 'attributes' section"""
- if "attributes" not in attrs:
- return ""
- lines = [rst_fields("attributes", rst_list_inline(attrs["attributes"]), level + 1)]
-
- return "\n".join(lines)
-
-
-def parse_operations(operations: List[Dict[str, Any]], namespace: str) -> str:
- """Parse operations block"""
- preprocessed = ["name", "doc", "title", "do", "dump", "flags"]
- linkable = ["fixed-header", "attribute-set"]
- lines = []
-
- for operation in operations:
- lines.append(rst_section(namespace, 'operation', operation["name"]))
- lines.append(rst_paragraph(operation["doc"]) + "\n")
-
- for key in operation.keys():
- if key in preprocessed:
- # Skip the special fields
- continue
- value = operation[key]
- if key in linkable:
- value = rst_ref(namespace, key, value)
- lines.append(rst_fields(key, value, 0))
- if 'flags' in operation:
- lines.append(rst_fields('flags', rst_list_inline(operation['flags'])))
-
- if "do" in operation:
- lines.append(rst_paragraph(":do:", 0))
- lines.append(parse_do(operation["do"], 0))
- if "dump" in operation:
- lines.append(rst_paragraph(":dump:", 0))
- lines.append(parse_do(operation["dump"], 0))
-
- # New line after fields
- lines.append("\n")
-
- return "\n".join(lines)
-
-
-def parse_entries(entries: List[Dict[str, Any]], level: int) -> str:
- """Parse a list of entries"""
- ignored = ["pad"]
- lines = []
- for entry in entries:
- if isinstance(entry, dict):
- # entries could be a list or a dictionary
- field_name = entry.get("name", "")
- if field_name in ignored:
- continue
- type_ = entry.get("type")
- if type_:
- field_name += f" ({inline(type_)})"
- lines.append(
- rst_fields(field_name, sanitize(entry.get("doc", "")), level)
- )
- elif isinstance(entry, list):
- lines.append(rst_list_inline(entry, level))
- else:
- lines.append(rst_bullet(inline(sanitize(entry)), level))
-
- lines.append("\n")
- return "\n".join(lines)
-
-
-def parse_definitions(defs: Dict[str, Any], namespace: str) -> str:
- """Parse definitions section"""
- preprocessed = ["name", "entries", "members"]
- ignored = ["render-max"] # This is not printed
- lines = []
-
- for definition in defs:
- lines.append(rst_section(namespace, 'definition', definition["name"]))
- for k in definition.keys():
- if k in preprocessed + ignored:
- continue
- lines.append(rst_fields(k, sanitize(definition[k]), 0))
-
- # Field list needs to finish with a new line
- lines.append("\n")
- if "entries" in definition:
- lines.append(rst_paragraph(":entries:", 0))
- lines.append(parse_entries(definition["entries"], 1))
- if "members" in definition:
- lines.append(rst_paragraph(":members:", 0))
- lines.append(parse_entries(definition["members"], 1))
-
- return "\n".join(lines)
-
-
-def parse_attr_sets(entries: List[Dict[str, Any]], namespace: str) -> str:
- """Parse attribute from attribute-set"""
- preprocessed = ["name", "type"]
- linkable = ["enum", "nested-attributes", "struct", "sub-message"]
- ignored = ["checks"]
- lines = []
-
- for entry in entries:
- lines.append(rst_section(namespace, 'attribute-set', entry["name"]))
- for attr in entry["attributes"]:
- type_ = attr.get("type")
- attr_line = attr["name"]
- if type_:
- # Add the attribute type in the same line
- attr_line += f" ({inline(type_)})"
-
- lines.append(rst_subsubsection(attr_line))
-
- for k in attr.keys():
- if k in preprocessed + ignored:
- continue
- if k in linkable:
- value = rst_ref(namespace, k, attr[k])
- else:
- value = sanitize(attr[k])
- lines.append(rst_fields(k, value, 0))
- lines.append("\n")
-
- return "\n".join(lines)
-
-
-def parse_sub_messages(entries: List[Dict[str, Any]], namespace: str) -> str:
- """Parse sub-message definitions"""
- lines = []
-
- for entry in entries:
- lines.append(rst_section(namespace, 'sub-message', entry["name"]))
- for fmt in entry["formats"]:
- value = fmt["value"]
-
- lines.append(rst_bullet(bold(value)))
- for attr in ['fixed-header', 'attribute-set']:
- if attr in fmt:
- lines.append(rst_fields(attr,
- rst_ref(namespace, attr, fmt[attr]),
- 1))
- lines.append("\n")
-
- return "\n".join(lines)
-
-
-def parse_yaml(obj: Dict[str, Any]) -> str:
- """Format the whole YAML into a RST string"""
- lines = []
-
- # Main header
-
- family = obj['name']
-
- lines.append(rst_header())
- lines.append(rst_label("netlink-" + family))
-
- title = f"Family ``{family}`` netlink specification"
- lines.append(rst_title(title))
- lines.append(rst_paragraph(".. contents:: :depth: 3\n"))
-
- if "doc" in obj:
- lines.append(rst_subtitle("Summary"))
- lines.append(rst_paragraph(obj["doc"], 0))
-
- # Operations
- if "operations" in obj:
- lines.append(rst_subtitle("Operations"))
- lines.append(parse_operations(obj["operations"]["list"], family))
-
- # Multicast groups
- if "mcast-groups" in obj:
- lines.append(rst_subtitle("Multicast groups"))
- lines.append(parse_mcast_group(obj["mcast-groups"]["list"]))
-
- # Definitions
- if "definitions" in obj:
- lines.append(rst_subtitle("Definitions"))
- lines.append(parse_definitions(obj["definitions"], family))
-
- # Attributes set
- if "attribute-sets" in obj:
- lines.append(rst_subtitle("Attribute sets"))
- lines.append(parse_attr_sets(obj["attribute-sets"], family))
-
- # Sub-messages
- if "sub-messages" in obj:
- lines.append(rst_subtitle("Sub-messages"))
- lines.append(parse_sub_messages(obj["sub-messages"], family))
-
- return "\n".join(lines)
-
-
-# Main functions
-# ==============
+sys.path.append(pathlib.Path(__file__).resolve().parent.as_posix())
+from lib import YnlDocGenerator # pylint: disable=C0413
def parse_arguments() -> argparse.Namespace:
"""Parse arguments from user"""
@@ -392,15 +55,6 @@ def parse_arguments() -> argparse.Namespace:
return args
-def parse_yaml_file(filename: str) -> str:
- """Transform the YAML specified by filename into an RST-formatted string"""
- with open(filename, "r", encoding="utf-8") as spec_file:
- yaml_data = yaml.safe_load(spec_file)
- content = parse_yaml(yaml_data)
-
- return content
-
-
def write_to_rstfile(content: str, filename: str) -> None:
"""Write the generated content into an RST file"""
logging.debug("Saving RST file to %s", filename)
@@ -409,21 +63,22 @@ def write_to_rstfile(content: str, filename: str) -> None:
rst_file.write(content)
-def generate_main_index_rst(output: str) -> None:
+def generate_main_index_rst(parser: YnlDocGenerator, output: str) -> None:
"""Generate the `networking_spec/index` content and write to the file"""
lines = []
- lines.append(rst_header())
- lines.append(rst_label("specs"))
- lines.append(rst_title("Netlink Family Specifications"))
- lines.append(rst_toctree(1))
+ lines.append(parser.fmt.rst_header())
+ lines.append(parser.fmt.rst_label("specs"))
+ lines.append(parser.fmt.rst_title("Netlink Family Specifications"))
+ lines.append(parser.fmt.rst_toctree(1))
index_dir = os.path.dirname(output)
logging.debug("Looking for .rst files in %s", index_dir)
for filename in sorted(os.listdir(index_dir)):
- if not filename.endswith(".rst") or filename == "index.rst":
+ base, ext = os.path.splitext(filename)
+ if filename == "index.rst" or ext not in [".rst", ".yaml"]:
continue
- lines.append(f" {filename.replace('.rst', '')}\n")
+ lines.append(f" {base}\n")
logging.debug("Writing an index file at %s", output)
write_to_rstfile("".join(lines), output)
@@ -434,10 +89,12 @@ def main() -> None:
args = parse_arguments()
+ parser = YnlDocGenerator()
+
if args.input:
logging.debug("Parsing %s", args.input)
try:
- content = parse_yaml_file(os.path.join(args.input))
+ content = parser.parse_yaml_file(os.path.join(args.input))
except Exception as exception:
logging.warning("Failed to parse %s.", args.input)
logging.warning(exception)
@@ -447,7 +104,7 @@ def main() -> None:
if args.index:
# Generate the index RST file
- generate_main_index_rst(args.output)
+ generate_main_index_rst(parser, args.output)
if __name__ == "__main__":
--
2.49.0
* [PATCH v10 03/14] docs: netlink: index.rst: add a netlink index file
2025-07-28 16:01 [PATCH v10 00/14] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
2025-07-28 16:01 ` [PATCH v10 01/14] docs: netlink: netlink-raw.rst: use :ref: instead of :doc: Mauro Carvalho Chehab
2025-07-28 16:01 ` [PATCH v10 02/14] tools: ynl_gen_rst.py: Split library from command line tool Mauro Carvalho Chehab
@ 2025-07-28 16:01 ` Mauro Carvalho Chehab
2025-07-28 16:01 ` [PATCH v10 04/14] tools: ynl_gen_rst.py: cleanup coding style Mauro Carvalho Chehab
` (11 subsequent siblings)
14 siblings, 0 replies; 20+ messages in thread
From: Mauro Carvalho Chehab @ 2025-07-28 16:01 UTC (permalink / raw)
To: Linux Doc Mailing List
Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
David S. Miller, Donald Hunter, Eric Dumazet,
Ignacio Encinas Rubio, Jakub Kicinski, Jan Stancek,
Jonathan Corbet, Marco Elver, Paolo Abeni, Randy Dunlap,
Ruben Wauters, Shuah Khan, Simon Horman, joel,
linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz, stern
Instead of generating the index file, use a :glob: toctree to
automatically include all documents generated from the YAML specs.
Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
Reviewed-by: Donald Hunter <donald.hunter@gmail.com>
---
Documentation/netlink/specs/index.rst | 13 +++++++++++++
1 file changed, 13 insertions(+)
create mode 100644 Documentation/netlink/specs/index.rst
diff --git a/Documentation/netlink/specs/index.rst b/Documentation/netlink/specs/index.rst
new file mode 100644
index 000000000000..7f7cf4a096f2
--- /dev/null
+++ b/Documentation/netlink/specs/index.rst
@@ -0,0 +1,13 @@
+.. SPDX-License-Identifier: GPL-2.0
+
+.. _specs:
+
+=============================
+Netlink Family Specifications
+=============================
+
+.. toctree::
+ :maxdepth: 1
+ :glob:
+
+ *
--
2.49.0
* [PATCH v10 04/14] tools: ynl_gen_rst.py: cleanup coding style
2025-07-28 16:01 [PATCH v10 00/14] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
` (2 preceding siblings ...)
2025-07-28 16:01 ` [PATCH v10 03/14] docs: netlink: index.rst: add a netlink index file Mauro Carvalho Chehab
@ 2025-07-28 16:01 ` Mauro Carvalho Chehab
2025-07-28 16:01 ` [PATCH v10 05/14] docs: sphinx: add a parser for yaml files for Netlink specs Mauro Carvalho Chehab
` (10 subsequent siblings)
14 siblings, 0 replies; 20+ messages in thread
From: Mauro Carvalho Chehab @ 2025-07-28 16:01 UTC (permalink / raw)
To: Linux Doc Mailing List
Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
David S. Miller, Donald Hunter, Eric Dumazet,
Ignacio Encinas Rubio, Jakub Kicinski, Jan Stancek,
Jonathan Corbet, Marco Elver, Paolo Abeni, Randy Dunlap,
Ruben Wauters, Shuah Khan, Simon Horman, joel,
linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz, stern
Clean up some coding style issues pointed out by pylint and flake8.

No functional changes.
Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
Reviewed-by: Breno Leitao <leitao@debian.org>
Reviewed-by: Donald Hunter <donald.hunter@gmail.com>
---
tools/net/ynl/pyynl/lib/doc_generator.py | 72 ++++++++----------------
1 file changed, 25 insertions(+), 47 deletions(-)
diff --git a/tools/net/ynl/pyynl/lib/doc_generator.py b/tools/net/ynl/pyynl/lib/doc_generator.py
index 80e468086693..474b2b78c7bc 100644
--- a/tools/net/ynl/pyynl/lib/doc_generator.py
+++ b/tools/net/ynl/pyynl/lib/doc_generator.py
@@ -18,17 +18,12 @@
"""
from typing import Any, Dict, List
-import os.path
-import sys
-import argparse
-import logging
import yaml
-# ==============
-# RST Formatters
-# ==============
class RstFormatters:
+ """RST Formatters"""
+
SPACE_PER_LEVEL = 4
@staticmethod
@@ -36,81 +31,67 @@ class RstFormatters:
"""Return space to format"""
return " " * (level * RstFormatters.SPACE_PER_LEVEL)
-
@staticmethod
def bold(text: str) -> str:
"""Format bold text"""
return f"**{text}**"
-
@staticmethod
def inline(text: str) -> str:
"""Format inline text"""
return f"``{text}``"
-
@staticmethod
def sanitize(text: str) -> str:
"""Remove newlines and multiple spaces"""
# This is useful for some fields that are spread across multiple lines
return str(text).replace("\n", " ").strip()
-
def rst_fields(self, key: str, value: str, level: int = 0) -> str:
"""Return a RST formatted field"""
return self.headroom(level) + f":{key}: {value}"
-
def rst_definition(self, key: str, value: Any, level: int = 0) -> str:
"""Format a single rst definition"""
return self.headroom(level) + key + "\n" + self.headroom(level + 1) + str(value)
-
def rst_paragraph(self, paragraph: str, level: int = 0) -> str:
"""Return a formatted paragraph"""
return self.headroom(level) + paragraph
-
def rst_bullet(self, item: str, level: int = 0) -> str:
"""Return a formatted a bullet"""
return self.headroom(level) + f"- {item}"
-
@staticmethod
def rst_subsection(title: str) -> str:
"""Add a sub-section to the document"""
return f"{title}\n" + "-" * len(title)
-
@staticmethod
def rst_subsubsection(title: str) -> str:
"""Add a sub-sub-section to the document"""
return f"{title}\n" + "~" * len(title)
-
@staticmethod
def rst_section(namespace: str, prefix: str, title: str) -> str:
"""Add a section to the document"""
return f".. _{namespace}-{prefix}-{title}:\n\n{title}\n" + "=" * len(title)
-
@staticmethod
def rst_subtitle(title: str) -> str:
"""Add a subtitle to the document"""
return "\n" + "-" * len(title) + f"\n{title}\n" + "-" * len(title) + "\n\n"
-
@staticmethod
def rst_title(title: str) -> str:
"""Add a title to the document"""
return "=" * len(title) + f"\n{title}\n" + "=" * len(title) + "\n\n"
-
def rst_list_inline(self, list_: List[str], level: int = 0) -> str:
"""Format a list using inlines"""
return self.headroom(level) + "[" + ", ".join(self.inline(i) for i in list_) + "]"
-
@staticmethod
def rst_ref(namespace: str, prefix: str, name: str) -> str:
"""Add a hyperlink to the document"""
@@ -122,7 +103,6 @@ class RstFormatters:
prefix = mappings[prefix]
return f":ref:`{namespace}-{prefix}-{name}`"
-
def rst_header(self) -> str:
"""The headers for all the auto generated RST files"""
lines = []
@@ -132,7 +112,6 @@ class RstFormatters:
return "\n".join(lines)
-
@staticmethod
def rst_toctree(maxdepth: int = 2) -> str:
"""Generate a toctree RST primitive"""
@@ -143,16 +122,13 @@ class RstFormatters:
return "\n".join(lines)
-
@staticmethod
def rst_label(title: str) -> str:
"""Return a formatted label"""
return f".. _{title}:\n\n"
-# =======
-# Parsers
-# =======
class YnlDocGenerator:
+ """YAML Netlink specs Parser"""
fmt = RstFormatters()
@@ -164,7 +140,6 @@ class YnlDocGenerator:
return "\n".join(lines)
-
def parse_do(self, do_dict: Dict[str, Any], level: int = 0) -> str:
"""Parse 'do' section and return a formatted string"""
lines = []
@@ -177,16 +152,16 @@ class YnlDocGenerator:
return "\n".join(lines)
-
def parse_do_attributes(self, attrs: Dict[str, Any], level: int = 0) -> str:
"""Parse 'attributes' section"""
if "attributes" not in attrs:
return ""
- lines = [self.fmt.rst_fields("attributes", self.fmt.rst_list_inline(attrs["attributes"]), level + 1)]
+ lines = [self.fmt.rst_fields("attributes",
+ self.fmt.rst_list_inline(attrs["attributes"]),
+ level + 1)]
return "\n".join(lines)
-
def parse_operations(self, operations: List[Dict[str, Any]], namespace: str) -> str:
"""Parse operations block"""
preprocessed = ["name", "doc", "title", "do", "dump", "flags"]
@@ -194,7 +169,8 @@ class YnlDocGenerator:
lines = []
for operation in operations:
- lines.append(self.fmt.rst_section(namespace, 'operation', operation["name"]))
+ lines.append(self.fmt.rst_section(namespace, 'operation',
+ operation["name"]))
lines.append(self.fmt.rst_paragraph(operation["doc"]) + "\n")
for key in operation.keys():
@@ -206,7 +182,8 @@ class YnlDocGenerator:
value = self.fmt.rst_ref(namespace, key, value)
lines.append(self.fmt.rst_fields(key, value, 0))
if 'flags' in operation:
- lines.append(self.fmt.rst_fields('flags', self.fmt.rst_list_inline(operation['flags'])))
+ lines.append(self.fmt.rst_fields('flags',
+ self.fmt.rst_list_inline(operation['flags'])))
if "do" in operation:
lines.append(self.fmt.rst_paragraph(":do:", 0))
@@ -220,7 +197,6 @@ class YnlDocGenerator:
return "\n".join(lines)
-
def parse_entries(self, entries: List[Dict[str, Any]], level: int) -> str:
"""Parse a list of entries"""
ignored = ["pad"]
@@ -235,17 +211,19 @@ class YnlDocGenerator:
if type_:
field_name += f" ({self.fmt.inline(type_)})"
lines.append(
- self.fmt.rst_fields(field_name, self.fmt.sanitize(entry.get("doc", "")), level)
+ self.fmt.rst_fields(field_name,
+ self.fmt.sanitize(entry.get("doc", "")),
+ level)
)
elif isinstance(entry, list):
lines.append(self.fmt.rst_list_inline(entry, level))
else:
- lines.append(self.fmt.rst_bullet(self.fmt.inline(self.fmt.sanitize(entry)), level))
+ lines.append(self.fmt.rst_bullet(self.fmt.inline(self.fmt.sanitize(entry)),
+ level))
lines.append("\n")
return "\n".join(lines)
-
def parse_definitions(self, defs: Dict[str, Any], namespace: str) -> str:
"""Parse definitions section"""
preprocessed = ["name", "entries", "members"]
@@ -270,7 +248,6 @@ class YnlDocGenerator:
return "\n".join(lines)
-
def parse_attr_sets(self, entries: List[Dict[str, Any]], namespace: str) -> str:
"""Parse attribute from attribute-set"""
preprocessed = ["name", "type"]
@@ -279,7 +256,8 @@ class YnlDocGenerator:
lines = []
for entry in entries:
- lines.append(self.fmt.rst_section(namespace, 'attribute-set', entry["name"]))
+ lines.append(self.fmt.rst_section(namespace, 'attribute-set',
+ entry["name"]))
for attr in entry["attributes"]:
type_ = attr.get("type")
attr_line = attr["name"]
@@ -301,13 +279,13 @@ class YnlDocGenerator:
return "\n".join(lines)
-
def parse_sub_messages(self, entries: List[Dict[str, Any]], namespace: str) -> str:
"""Parse sub-message definitions"""
lines = []
for entry in entries:
- lines.append(self.fmt.rst_section(namespace, 'sub-message', entry["name"]))
+ lines.append(self.fmt.rst_section(namespace, 'sub-message',
+ entry["name"]))
for fmt in entry["formats"]:
value = fmt["value"]
@@ -315,13 +293,14 @@ class YnlDocGenerator:
for attr in ['fixed-header', 'attribute-set']:
if attr in fmt:
lines.append(self.fmt.rst_fields(attr,
- self.fmt.rst_ref(namespace, attr, fmt[attr]),
- 1))
+ self.fmt.rst_ref(namespace,
+ attr,
+ fmt[attr]),
+ 1))
lines.append("\n")
return "\n".join(lines)
-
def parse_yaml(self, obj: Dict[str, Any]) -> str:
"""Format the whole YAML into a RST string"""
lines = []
@@ -344,7 +323,8 @@ class YnlDocGenerator:
# Operations
if "operations" in obj:
lines.append(self.fmt.rst_subtitle("Operations"))
- lines.append(self.parse_operations(obj["operations"]["list"], family))
+ lines.append(self.parse_operations(obj["operations"]["list"],
+ family))
# Multicast groups
if "mcast-groups" in obj:
@@ -368,11 +348,9 @@ class YnlDocGenerator:
return "\n".join(lines)
-
# Main functions
# ==============
-
def parse_yaml_file(self, filename: str) -> str:
"""Transform the YAML specified by filename into an RST-formatted string"""
with open(filename, "r", encoding="utf-8") as spec_file:
--
2.49.0
* [PATCH v10 05/14] docs: sphinx: add a parser for yaml files for Netlink specs
2025-07-28 16:01 [PATCH v10 00/14] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
` (3 preceding siblings ...)
2025-07-28 16:01 ` [PATCH v10 04/14] tools: ynl_gen_rst.py: cleanup coding style Mauro Carvalho Chehab
@ 2025-07-28 16:01 ` Mauro Carvalho Chehab
2025-07-28 16:01 ` [PATCH v10 06/14] docs: use parser_yaml extension to handle " Mauro Carvalho Chehab
` (9 subsequent siblings)
14 siblings, 0 replies; 20+ messages in thread
From: Mauro Carvalho Chehab @ 2025-07-28 16:01 UTC (permalink / raw)
To: Linux Doc Mailing List
Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
David S. Miller, Donald Hunter, Eric Dumazet,
Ignacio Encinas Rubio, Jakub Kicinski, Jan Stancek,
Jonathan Corbet, Marco Elver, Paolo Abeni, Randy Dunlap,
Ruben Wauters, Shuah Khan, Simon Horman, joel,
linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz, stern
Add a simple sphinx.Parser to handle yaml files and add the code to
handle Netlink specs. All other yaml files are ignored.

The code was written in a way that makes it easy to parse yaml for
different subsystems, and even for different parts of Netlink.
All it takes to hook up a different parser is to add an import line
similar to:

    from doc_generator import YnlDocGenerator

add the corresponding parser somewhere in the extension:

    netlink_parser = YnlDocGenerator()

and then add logic inside parse() to handle different doc outputs,
depending on the file location, similar to:

    if "/netlink/specs/" in fname:
        msg = self.netlink_parser.parse_yaml_file(fname)
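A sketch of how a second subsystem could later plug into the same
dispatcher (only the netlink branch exists in this series; DtDocGenerator
and the devicetree path check are hypothetical, and rst_parse() is the
helper added by this patch):

    def parse(self, inputstring, document):
        """Check if a YAML is meant to be parsed."""
        fname = document.current_source

        # Handle netlink yaml specs (this is what the patch adds)
        if "/netlink/specs/" in fname:
            msg = self.netlink_parser.parse_yaml_file(fname)
            self.rst_parse(inputstring, document, msg)

        # elif "/devicetree/bindings/" in fname:           # hypothetical
        #     msg = self.dt_parser.parse_yaml_file(fname)   # hypothetical
        #     self.rst_parse(inputstring, document, msg)

        # All other yaml files are ignored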
Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
Reviewed-by: Donald Hunter <donald.hunter@gmail.com>
---
Documentation/sphinx/parser_yaml.py | 104 ++++++++++++++++++++++++++++
1 file changed, 104 insertions(+)
create mode 100755 Documentation/sphinx/parser_yaml.py
diff --git a/Documentation/sphinx/parser_yaml.py b/Documentation/sphinx/parser_yaml.py
new file mode 100755
index 000000000000..fa2e6da17617
--- /dev/null
+++ b/Documentation/sphinx/parser_yaml.py
@@ -0,0 +1,104 @@
+# SPDX-License-Identifier: GPL-2.0
+# Copyright 2025 Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
+
+"""
+Sphinx extension for processing YAML files
+"""
+
+import os
+import re
+import sys
+
+from pprint import pformat
+
+from docutils.parsers.rst import Parser as RSTParser
+from docutils.statemachine import ViewList
+
+from sphinx.util import logging
+from sphinx.parsers import Parser
+
+srctree = os.path.abspath(os.environ["srctree"])
+sys.path.insert(0, os.path.join(srctree, "tools/net/ynl/pyynl/lib"))
+
+from doc_generator import YnlDocGenerator # pylint: disable=C0413
+
+logger = logging.getLogger(__name__)
+
+class YamlParser(Parser):
+ """
+ Kernel parser for YAML files.
+
+ This is a simple sphinx.Parser to handle yaml files inside the
+ Kernel tree that will be part of the built documentation.
+
+ The actual parser function is not contained here: the code was
+ written in a way that parsing yaml for different subsystems
+ can be done from a single dispatcher.
+
+ All it takes to parse YAML files is to add an import line:
+
+ from some_parser_code import NewYamlGenerator
+
+ To this module. Then add an instance of the parser with:
+
+ new_parser = NewYamlGenerator()
+
+ and add logic inside parse() to handle it based on the path,
+ like this:
+
+ if "/foo" in fname:
+ msg = self.new_parser.parse_yaml_file(fname)
+ """
+
+ supported = ('yaml', )
+
+ netlink_parser = YnlDocGenerator()
+
+ def rst_parse(self, inputstring, document, msg):
+ """
+ Receives a ReST content that was previously converted by the
+ YAML parser, adding it to the document tree.
+ """
+
+ self.setup_parse(inputstring, document)
+
+ result = ViewList()
+
+ try:
+ # Parse message with RSTParser
+ for i, line in enumerate(msg.split('\n')):
+ result.append(line, document.current_source, i)
+
+ rst_parser = RSTParser()
+ rst_parser.parse('\n'.join(result), document)
+
+ except Exception as e:
+ document.reporter.error("YAML parsing error: %s" % pformat(e))
+
+ self.finish_parse()
+
+ # Overrides docutils.parsers.Parser. See sphinx.parsers.RSTParser
+ def parse(self, inputstring, document):
+ """Check if a YAML is meant to be parsed."""
+
+ fname = document.current_source
+
+ # Handle netlink yaml specs
+ if "/netlink/specs/" in fname:
+ msg = self.netlink_parser.parse_yaml_file(fname)
+ self.rst_parse(inputstring, document, msg)
+
+ # All other yaml files are ignored
+
+def setup(app):
+ """Setup function for the Sphinx extension."""
+
+ # Add YAML parser
+ app.add_source_parser(YamlParser)
+ app.add_source_suffix('.yaml', 'yaml')
+
+ return {
+ 'version': '1.0',
+ 'parallel_read_safe': True,
+ 'parallel_write_safe': True,
+ }
--
2.49.0
* [PATCH v10 06/14] docs: use parser_yaml extension to handle Netlink specs
2025-07-28 16:01 [PATCH v10 00/14] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
` (4 preceding siblings ...)
2025-07-28 16:01 ` [PATCH v10 05/14] docs: sphinx: add a parser for yaml files for Netlink specs Mauro Carvalho Chehab
@ 2025-07-28 16:01 ` Mauro Carvalho Chehab
2025-08-28 16:50 ` Kory Maincent
2025-07-28 16:02 ` [PATCH v10 07/14] docs: uapi: netlink: update netlink specs link Mauro Carvalho Chehab
` (8 subsequent siblings)
14 siblings, 1 reply; 20+ messages in thread
From: Mauro Carvalho Chehab @ 2025-07-28 16:01 UTC (permalink / raw)
To: Linux Doc Mailing List
Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
David S. Miller, Donald Hunter, Eric Dumazet,
Ignacio Encinas Rubio, Jakub Kicinski, Jan Stancek,
Jonathan Corbet, Marco Elver, Paolo Abeni, Randy Dunlap,
Ruben Wauters, Shuah Khan, Simon Horman, joel,
linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz, stern
Instead of manually calling ynl_gen_rst.py, use a Sphinx extension.
This way, no .rst files are written to the kernel source directories.

We are using a toctree with the :glob: property here, so there is no
need to touch the netlink/specs/index.rst file every time a Netlink
spec is added, renamed or removed.
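Conceptually, what the htmldocs build now does for each spec, instead
of writing .rst files into $(srctree), is equivalent to this sketch
(the spec names are just examples; the real work happens inside
parser_yaml.py and docutils):

    from lib import YnlDocGenerator   # tools/net/ynl/pyynl/lib, on sys.path

    gen = YnlDocGenerator()
    for spec in ("rt-link.yaml", "tc.yaml"):   # matched by the :glob: toctree
        rst = gen.parse_yaml_file(f"Documentation/netlink/specs/{spec}")
        # parser_yaml.py hands this string straight to the RST parser in
        # memory, so nothing is written back to the source tree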
Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
Reviewed-by: Donald Hunter <donald.hunter@gmail.com>
---
Documentation/Makefile | 17 ----------------
Documentation/conf.py | 20 ++++++++++++++-----
Documentation/networking/index.rst | 2 +-
.../networking/netlink_spec/readme.txt | 4 ----
4 files changed, 16 insertions(+), 27 deletions(-)
delete mode 100644 Documentation/networking/netlink_spec/readme.txt
diff --git a/Documentation/Makefile b/Documentation/Makefile
index b98477df5ddf..820f07e0afe6 100644
--- a/Documentation/Makefile
+++ b/Documentation/Makefile
@@ -104,22 +104,6 @@ quiet_cmd_sphinx = SPHINX $@ --> file://$(abspath $(BUILDDIR)/$3/$4)
cp $(if $(patsubst /%,,$(DOCS_CSS)),$(abspath $(srctree)/$(DOCS_CSS)),$(DOCS_CSS)) $(BUILDDIR)/$3/_static/; \
fi
-YNL_INDEX:=$(srctree)/Documentation/networking/netlink_spec/index.rst
-YNL_RST_DIR:=$(srctree)/Documentation/networking/netlink_spec
-YNL_YAML_DIR:=$(srctree)/Documentation/netlink/specs
-YNL_TOOL:=$(srctree)/tools/net/ynl/pyynl/ynl_gen_rst.py
-
-YNL_RST_FILES_TMP := $(patsubst %.yaml,%.rst,$(wildcard $(YNL_YAML_DIR)/*.yaml))
-YNL_RST_FILES := $(patsubst $(YNL_YAML_DIR)%,$(YNL_RST_DIR)%, $(YNL_RST_FILES_TMP))
-
-$(YNL_INDEX): $(YNL_RST_FILES)
- $(Q)$(YNL_TOOL) -o $@ -x
-
-$(YNL_RST_DIR)/%.rst: $(YNL_YAML_DIR)/%.yaml $(YNL_TOOL)
- $(Q)$(YNL_TOOL) -i $< -o $@
-
-htmldocs texinfodocs latexdocs epubdocs xmldocs: $(YNL_INDEX)
-
htmldocs:
@$(srctree)/scripts/sphinx-pre-install --version-check
@+$(foreach var,$(SPHINXDIRS),$(call loop_cmd,sphinx,html,$(var),,$(var)))
@@ -186,7 +170,6 @@ refcheckdocs:
$(Q)cd $(srctree);scripts/documentation-file-ref-check
cleandocs:
- $(Q)rm -f $(YNL_INDEX) $(YNL_RST_FILES)
$(Q)rm -rf $(BUILDDIR)
$(Q)$(MAKE) BUILDDIR=$(abspath $(BUILDDIR)) $(build)=Documentation/userspace-api/media clean
diff --git a/Documentation/conf.py b/Documentation/conf.py
index 700516238d3f..f9828f3862f9 100644
--- a/Documentation/conf.py
+++ b/Documentation/conf.py
@@ -42,6 +42,15 @@ exclude_patterns = []
dyn_include_patterns = []
dyn_exclude_patterns = ["output"]
+# Currently, only netlink/specs has a parser for yaml.
+# Prefer using include patterns if available, as it is faster
+if has_include_patterns:
+ dyn_include_patterns.append("netlink/specs/*.yaml")
+else:
+ dyn_exclude_patterns.append("netlink/*.yaml")
+ dyn_exclude_patterns.append("devicetree/bindings/**.yaml")
+ dyn_exclude_patterns.append("core-api/kho/bindings/**.yaml")
+
# Properly handle include/exclude patterns
# ----------------------------------------
@@ -102,12 +111,12 @@ extensions = [
"kernel_include",
"kfigure",
"maintainers_include",
+ "parser_yaml",
"rstFlatTable",
"sphinx.ext.autosectionlabel",
"sphinx.ext.ifconfig",
"translations",
]
-
# Since Sphinx version 3, the C function parser is more pedantic with regards
# to type checking. Due to that, having macros at c:function cause problems.
# Those needed to be escaped by using c_id_attributes[] array
@@ -204,10 +213,11 @@ else:
# Add any paths that contain templates here, relative to this directory.
templates_path = ["sphinx/templates"]
-# The suffix(es) of source filenames.
-# You can specify multiple suffix as a list of string:
-# source_suffix = ['.rst', '.md']
-source_suffix = '.rst'
+# The suffixes of source filenames that will be automatically parsed
+source_suffix = {
+ ".rst": "restructuredtext",
+ ".yaml": "yaml",
+}
# The encoding of source files.
# source_encoding = 'utf-8-sig'
diff --git a/Documentation/networking/index.rst b/Documentation/networking/index.rst
index ac90b82f3ce9..b7a4969e9bc9 100644
--- a/Documentation/networking/index.rst
+++ b/Documentation/networking/index.rst
@@ -57,7 +57,7 @@ Contents:
filter
generic-hdlc
generic_netlink
- netlink_spec/index
+ ../netlink/specs/index
gen_stats
gtp
ila
diff --git a/Documentation/networking/netlink_spec/readme.txt b/Documentation/networking/netlink_spec/readme.txt
deleted file mode 100644
index 030b44aca4e6..000000000000
--- a/Documentation/networking/netlink_spec/readme.txt
+++ /dev/null
@@ -1,4 +0,0 @@
-SPDX-License-Identifier: GPL-2.0
-
-This file is populated during the build of the documentation (htmldocs) by the
-tools/net/ynl/pyynl/ynl_gen_rst.py script.
--
2.49.0
* Re: [PATCH v10 06/14] docs: use parser_yaml extension to handle Netlink specs
2025-07-28 16:01 ` [PATCH v10 06/14] docs: use parser_yaml extension to handle " Mauro Carvalho Chehab
@ 2025-08-28 16:50 ` Kory Maincent
2025-08-28 17:02 ` Kory Maincent
0 siblings, 1 reply; 20+ messages in thread
From: Kory Maincent @ 2025-08-28 16:50 UTC (permalink / raw)
To: Mauro Carvalho Chehab
Cc: Linux Doc Mailing List, Akira Yokosawa,
Breno Leitao, David S. Miller, Donald Hunter, Eric Dumazet,
Ignacio Encinas Rubio, Jakub Kicinski, Jan Stancek,
Jonathan Corbet, Marco Elver, Paolo Abeni, Randy Dunlap,
Ruben Wauters, Shuah Khan, Simon Horman, joel,
linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz, stern
On Mon, 28 Jul 2025 18:01:59 +0200,
Mauro Carvalho Chehab <mchehab+huawei@kernel.org> wrote:
> Instead of manually calling ynl_gen_rst.py, use a Sphinx extension.
> This way, no .rst files are written to the kernel source directories.
>
> We are using a toctree with the :glob: property here, so there is no
> need to touch the netlink/specs/index.rst file every time a Netlink
> spec is added, renamed or removed.
...
> diff --git a/Documentation/networking/index.rst
> b/Documentation/networking/index.rst index ac90b82f3ce9..b7a4969e9bc9 100644
> --- a/Documentation/networking/index.rst
> +++ b/Documentation/networking/index.rst
> @@ -57,7 +57,7 @@ Contents:
> filter
> generic-hdlc
> generic_netlink
> - netlink_spec/index
> + ../netlink/specs/index
Faced a doc build warning that says netlink_spec/index.rst is not used.

$ git grep netlink_spec
Documentation/networking/mptcp.rst:netlink_spec/mptcp_pm.rst.
Documentation/translations/zh_CN/networking/index.rst:* netlink_spec/index

I think we can remove the whole directory and change these, right?

Also got a doc build warning that says netlink/specs/index does not
exist even though it exists. Maybe a Sphinx parsing issue?!
Regards,
--
Köry Maincent, Bootlin
Embedded Linux and kernel engineering
https://bootlin.com
* Re: [PATCH v10 06/14] docs: use parser_yaml extension to handle Netlink specs
2025-08-28 16:50 ` Kory Maincent
@ 2025-08-28 17:02 ` Kory Maincent
0 siblings, 0 replies; 20+ messages in thread
From: Kory Maincent @ 2025-08-28 17:02 UTC (permalink / raw)
To: Mauro Carvalho Chehab
Cc: Linux Doc Mailing List, Akira Yokosawa,
Breno Leitao, David S. Miller, Donald Hunter, Eric Dumazet,
Ignacio Encinas Rubio, Jakub Kicinski, Jan Stancek,
Jonathan Corbet, Marco Elver, Paolo Abeni, Randy Dunlap,
Ruben Wauters, Shuah Khan, Simon Horman, joel,
linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz, stern
On Thu, 28 Aug 2025 18:50:37 +0200,
Kory Maincent <kory.maincent@bootlin.com> wrote:
> On Mon, 28 Jul 2025 18:01:59 +0200,
> Mauro Carvalho Chehab <mchehab+huawei@kernel.org> wrote:
>
> > Instead of manually calling ynl_gen_rst.py, use a Sphinx extension.
> > This way, no .rst files are written to the kernel source directories.
> >
> > We are using a toctree with the :glob: property here, so there is no
> > need to touch the netlink/specs/index.rst file every time a Netlink
> > spec is added, renamed or removed.
>
> ...
>
> > diff --git a/Documentation/networking/index.rst
> > b/Documentation/networking/index.rst index ac90b82f3ce9..b7a4969e9bc9 100644
> > --- a/Documentation/networking/index.rst
> > +++ b/Documentation/networking/index.rst
> > @@ -57,7 +57,7 @@ Contents:
> > filter
> > generic-hdlc
> > generic_netlink
> > - netlink_spec/index
> > + ../netlink/specs/index
>
> Faced a doc build warning that says netlink_spec/index.rst is not used.
>
> $ git grep netlink_spec
> Documentation/networking/mptcp.rst:netlink_spec/mptcp_pm.rst.
> Documentation/translations/zh_CN/networking/index.rst:* netlink_spec/index
>
> I think we can remove the whole directory and change these, right?
Oops, just saw that it was some local leftover. This first warning goes
away with a clean tree.

> Also got a doc build warning that says netlink/specs/index does not
> exist even though it exists. Maybe a Sphinx parsing issue?!
>
> Regards,
--
Köry Maincent, Bootlin
Embedded Linux and kernel engineering
https://bootlin.com
* [PATCH v10 07/14] docs: uapi: netlink: update netlink specs link
2025-07-28 16:01 [PATCH v10 00/14] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
` (5 preceding siblings ...)
2025-07-28 16:01 ` [PATCH v10 06/14] docs: use parser_yaml extension to handle " Mauro Carvalho Chehab
@ 2025-07-28 16:02 ` Mauro Carvalho Chehab
2025-07-28 16:02 ` [PATCH v10 08/14] tools: ynl_gen_rst.py: drop support for generating index files Mauro Carvalho Chehab
` (7 subsequent siblings)
14 siblings, 0 replies; 20+ messages in thread
From: Mauro Carvalho Chehab @ 2025-07-28 16:02 UTC (permalink / raw)
To: Linux Doc Mailing List
Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
David S. Miller, Donald Hunter, Eric Dumazet,
Ignacio Encinas Rubio, Jakub Kicinski, Jan Stancek,
Jonathan Corbet, Marco Elver, Paolo Abeni, Randy Dunlap,
Ruben Wauters, Shuah Khan, Simon Horman, joel,
linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz, stern
With the recent parser_yaml extension, and the removal of the
auto-generated ReST source files, the location of netlink
specs changed.
Update the uAPI documentation accordingly.
Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
Reviewed-by: Donald Hunter <donald.hunter@gmail.com>
---
Documentation/userspace-api/netlink/index.rst | 2 +-
Documentation/userspace-api/netlink/specs.rst | 2 +-
2 files changed, 2 insertions(+), 2 deletions(-)
diff --git a/Documentation/userspace-api/netlink/index.rst b/Documentation/userspace-api/netlink/index.rst
index c1b6765cc963..83ae25066591 100644
--- a/Documentation/userspace-api/netlink/index.rst
+++ b/Documentation/userspace-api/netlink/index.rst
@@ -18,4 +18,4 @@ Netlink documentation for users.
See also:
- :ref:`Documentation/core-api/netlink.rst <kernel_netlink>`
- - :ref:`Documentation/networking/netlink_spec/index.rst <specs>`
+ - :ref:`Documentation/netlink/specs/index.rst <specs>`
diff --git a/Documentation/userspace-api/netlink/specs.rst b/Documentation/userspace-api/netlink/specs.rst
index 1b50d97d8d7c..debb4bfca5c4 100644
--- a/Documentation/userspace-api/netlink/specs.rst
+++ b/Documentation/userspace-api/netlink/specs.rst
@@ -15,7 +15,7 @@ kernel headers directly.
Internally kernel uses the YAML specs to generate:
- the C uAPI header
- - documentation of the protocol as a ReST file - see :ref:`Documentation/networking/netlink_spec/index.rst <specs>`
+ - documentation of the protocol as a ReST file - see :ref:`Documentation/netlink/specs/index.rst <specs>`
- policy tables for input attribute validation
- operation tables
--
2.49.0
^ permalink raw reply related [flat|nested] 20+ messages in thread
* [PATCH v10 08/14] tools: ynl_gen_rst.py: drop support for generating index files
2025-07-28 16:01 [PATCH v10 00/14] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
` (6 preceding siblings ...)
2025-07-28 16:02 ` [PATCH v10 07/14] docs: uapi: netlink: update netlink specs link Mauro Carvalho Chehab
@ 2025-07-28 16:02 ` Mauro Carvalho Chehab
2025-07-28 16:02 ` [PATCH v10 09/14] docs: netlink: remove obsolete .gitignore from unused directory Mauro Carvalho Chehab
` (6 subsequent siblings)
14 siblings, 0 replies; 20+ messages in thread
From: Mauro Carvalho Chehab @ 2025-07-28 16:02 UTC (permalink / raw)
To: Message-ID :, Linux Doc Mailing List
Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
David S. Miller, Donald Hunter, Eric Dumazet,
Ignacio Encinas Rubio, Jakub Kicinski, Jan Stancek,
Jonathan Corbet, Marco Elver, Paolo Abeni, Randy Dunlap,
Ruben Wauters, Shuah Khan, Simon Horman, joel,
linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz, stern
As we're now using an index file with a glob, there's no need
to generate index files anymore.
Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
Reviewed-by: Donald Hunter <donald.hunter@gmail.com>
---
tools/net/ynl/pyynl/ynl_gen_rst.py | 28 ----------------------------
1 file changed, 28 deletions(-)
diff --git a/tools/net/ynl/pyynl/ynl_gen_rst.py b/tools/net/ynl/pyynl/ynl_gen_rst.py
index 010315fad498..90ae19aac89d 100755
--- a/tools/net/ynl/pyynl/ynl_gen_rst.py
+++ b/tools/net/ynl/pyynl/ynl_gen_rst.py
@@ -31,9 +31,6 @@ def parse_arguments() -> argparse.Namespace:
# Index and input are mutually exclusive
group = parser.add_mutually_exclusive_group()
- group.add_argument(
- "-x", "--index", action="store_true", help="Generate the index page"
- )
group.add_argument("-i", "--input", help="YAML file name")
args = parser.parse_args()
@@ -63,27 +60,6 @@ def write_to_rstfile(content: str, filename: str) -> None:
rst_file.write(content)
-def generate_main_index_rst(parser: YnlDocGenerator, output: str) -> None:
- """Generate the `networking_spec/index` content and write to the file"""
- lines = []
-
- lines.append(parser.fmt.rst_header())
- lines.append(parser.fmt.rst_label("specs"))
- lines.append(parser.fmt.rst_title("Netlink Family Specifications"))
- lines.append(parser.fmt.rst_toctree(1))
-
- index_dir = os.path.dirname(output)
- logging.debug("Looking for .rst files in %s", index_dir)
- for filename in sorted(os.listdir(index_dir)):
- base, ext = os.path.splitext(filename)
- if filename == "index.rst" or ext not in [".rst", ".yaml"]:
- continue
- lines.append(f" {base}\n")
-
- logging.debug("Writing an index file at %s", output)
- write_to_rstfile("".join(lines), output)
-
-
def main() -> None:
"""Main function that reads the YAML files and generates the RST files"""
@@ -102,10 +78,6 @@ def main() -> None:
write_to_rstfile(content, args.output)
- if args.index:
- # Generate the index RST file
- generate_main_index_rst(parser, args.output)
-
if __name__ == "__main__":
main()
--
2.49.0
^ permalink raw reply related [flat|nested] 20+ messages in thread
* [PATCH v10 09/14] docs: netlink: remove obsolete .gitignore from unused directory
2025-07-28 16:01 [PATCH v10 00/14] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
` (7 preceding siblings ...)
2025-07-28 16:02 ` [PATCH v10 08/14] tools: ynl_gen_rst.py: drop support for generating index files Mauro Carvalho Chehab
@ 2025-07-28 16:02 ` Mauro Carvalho Chehab
2025-07-28 16:02 ` [PATCH v10 10/14] MAINTAINERS: add netlink_yml_parser.py to linux-doc Mauro Carvalho Chehab
` (5 subsequent siblings)
14 siblings, 0 replies; 20+ messages in thread
From: Mauro Carvalho Chehab @ 2025-07-28 16:02 UTC (permalink / raw)
To: Message-ID :, Linux Doc Mailing List
Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
David S. Miller, Donald Hunter, Eric Dumazet,
Ignacio Encinas Rubio, Jakub Kicinski, Jan Stancek,
Jonathan Corbet, Marco Elver, Paolo Abeni, Randy Dunlap,
Ruben Wauters, Shuah Khan, Simon Horman, joel,
linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz, stern
The previous code was generating ReST source files
under Documentation/networking/netlink_spec/. With the
Sphinx YAML parser, this is now gone. So, stop ignoring
*.rst files inside the netlink specs directory.
Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
Reviewed-by: Donald Hunter <donald.hunter@gmail.com>
---
Documentation/networking/netlink_spec/.gitignore | 1 -
1 file changed, 1 deletion(-)
delete mode 100644 Documentation/networking/netlink_spec/.gitignore
diff --git a/Documentation/networking/netlink_spec/.gitignore b/Documentation/networking/netlink_spec/.gitignore
deleted file mode 100644
index 30d85567b592..000000000000
--- a/Documentation/networking/netlink_spec/.gitignore
+++ /dev/null
@@ -1 +0,0 @@
-*.rst
--
2.49.0
^ permalink raw reply related [flat|nested] 20+ messages in thread
* [PATCH v10 10/14] MAINTAINERS: add netlink_yml_parser.py to linux-doc
2025-07-28 16:01 [PATCH v10 00/14] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
` (8 preceding siblings ...)
2025-07-28 16:02 ` [PATCH v10 09/14] docs: netlink: remove obsolete .gitignore from unused directory Mauro Carvalho Chehab
@ 2025-07-28 16:02 ` Mauro Carvalho Chehab
2025-07-28 16:02 ` [PATCH v10 11/14] tools: netlink_yml_parser.py: add line numbers to parsed data Mauro Carvalho Chehab
` (4 subsequent siblings)
14 siblings, 0 replies; 20+ messages in thread
From: Mauro Carvalho Chehab @ 2025-07-28 16:02 UTC (permalink / raw)
To: Message-ID :, Linux Doc Mailing List
Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
David S. Miller, Donald Hunter, Eric Dumazet,
Ignacio Encinas Rubio, Jakub Kicinski, Jan Stancek,
Jonathan Corbet, Marco Elver, Paolo Abeni, Randy Dunlap,
Ruben Wauters, Shuah Khan, Simon Horman, joel,
linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz, stern
The documentation build depends on the parsing code
at ynl_gen_rst.py. Ensure that changes to it will be Cc'ed
to the linux-doc ML and maintainers by adding an entry for
it. This way, if a change there affects the build, or the
minimal Python version required, doc developers will know
in advance.
Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
Reviewed-by: Breno Leitao <leitao@debian.org>
Reviewed-by: Donald Hunter <donald.hunter@gmail.com>
---
MAINTAINERS | 1 +
1 file changed, 1 insertion(+)
diff --git a/MAINTAINERS b/MAINTAINERS
index b02127967322..f509a16054ec 100644
--- a/MAINTAINERS
+++ b/MAINTAINERS
@@ -7202,6 +7202,7 @@ F: scripts/get_abi.py
F: scripts/kernel-doc*
F: scripts/lib/abi/*
F: scripts/lib/kdoc/*
+F: tools/net/ynl/pyynl/lib/doc_generator.py
F: scripts/sphinx-pre-install
X: Documentation/ABI/
X: Documentation/admin-guide/media/
--
2.49.0
^ permalink raw reply related [flat|nested] 20+ messages in thread
* [PATCH v10 11/14] tools: netlink_yml_parser.py: add line numbers to parsed data
2025-07-28 16:01 [PATCH v10 00/14] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
` (9 preceding siblings ...)
2025-07-28 16:02 ` [PATCH v10 10/14] MAINTAINERS: add netlink_yml_parser.py to linux-doc Mauro Carvalho Chehab
@ 2025-07-28 16:02 ` Mauro Carvalho Chehab
2025-07-28 16:02 ` [PATCH v10 12/14] docs: parser_yaml.py: add support for line numbers from the parser Mauro Carvalho Chehab
` (3 subsequent siblings)
14 siblings, 0 replies; 20+ messages in thread
From: Mauro Carvalho Chehab @ 2025-07-28 16:02 UTC (permalink / raw)
To: Message-ID :, Linux Doc Mailing List
Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
David S. Miller, Donald Hunter, Eric Dumazet,
Ignacio Encinas Rubio, Jakub Kicinski, Jan Stancek,
Jonathan Corbet, Marco Elver, Paolo Abeni, Randy Dunlap,
Ruben Wauters, Shuah Khan, Simon Horman, joel,
linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz, stern
When something goes wrong, we want the Sphinx error to point to
the right line number in the original source, not in the
processed ReST data.
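As a rough illustration of the idea (a standalone sketch, not the code from
this patch; the key name and the sample spec below are made up):

    import yaml

    LINE_KEY = '__lineno__'   # illustrative key name

    class NumberedSafeLoader(yaml.SafeLoader):
        """SafeLoader variant that records where each mapping starts."""
        def construct_mapping(self, node, *args, **kwargs):
            mapping = super().construct_mapping(node, *args, **kwargs)
            # start_mark.line is 0-based in PyYAML
            mapping[LINE_KEY] = node.start_mark.line
            return mapping

    SPEC = ("name: dummy-family\n"
            "operations:\n"
            "  list:\n"
            "    - name: get\n"
            "      doc: made-up operation\n")

    data = yaml.load(SPEC, Loader=NumberedSafeLoader)
    print(data[LINE_KEY])                           # 0: top-level mapping
    print(data['operations']['list'][0][LINE_KEY])  # 3: the 'get' operation

The extra key rides along with the parsed data, so later formatting stages
can emit it as a line marker next to whatever they generate from that mapping.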
Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
tools/net/ynl/pyynl/lib/doc_generator.py | 34 ++++++++++++++++++++++--
1 file changed, 32 insertions(+), 2 deletions(-)
diff --git a/tools/net/ynl/pyynl/lib/doc_generator.py b/tools/net/ynl/pyynl/lib/doc_generator.py
index 474b2b78c7bc..658759a527a6 100644
--- a/tools/net/ynl/pyynl/lib/doc_generator.py
+++ b/tools/net/ynl/pyynl/lib/doc_generator.py
@@ -20,6 +20,16 @@
from typing import Any, Dict, List
import yaml
+LINE_STR = '__lineno__'
+
+class NumberedSafeLoader(yaml.SafeLoader): # pylint: disable=R0901
+ """Override the SafeLoader class to add line number to parsed data"""
+
+ def construct_mapping(self, node, *args, **kwargs):
+ mapping = super().construct_mapping(node, *args, **kwargs)
+ mapping[LINE_STR] = node.start_mark.line
+
+ return mapping
class RstFormatters:
"""RST Formatters"""
@@ -127,6 +137,11 @@ class RstFormatters:
"""Return a formatted label"""
return f".. _{title}:\n\n"
+ @staticmethod
+ def rst_lineno(lineno: int) -> str:
+ """Return a lineno comment"""
+ return f".. LINENO {lineno}\n"
+
class YnlDocGenerator:
"""YAML Netlink specs Parser"""
@@ -144,6 +159,9 @@ class YnlDocGenerator:
"""Parse 'do' section and return a formatted string"""
lines = []
for key in do_dict.keys():
+ if key == LINE_STR:
+ lines.append(self.fmt.rst_lineno(do_dict[key]))
+ continue
lines.append(self.fmt.rst_paragraph(self.fmt.bold(key), level + 1))
if key in ['request', 'reply']:
lines.append(self.parse_do_attributes(do_dict[key], level + 1) + "\n")
@@ -174,6 +192,10 @@ class YnlDocGenerator:
lines.append(self.fmt.rst_paragraph(operation["doc"]) + "\n")
for key in operation.keys():
+ if key == LINE_STR:
+ lines.append(self.fmt.rst_lineno(operation[key]))
+ continue
+
if key in preprocessed:
# Skip the special fields
continue
@@ -233,6 +255,9 @@ class YnlDocGenerator:
for definition in defs:
lines.append(self.fmt.rst_section(namespace, 'definition', definition["name"]))
for k in definition.keys():
+ if k == LINE_STR:
+ lines.append(self.fmt.rst_lineno(definition[k]))
+ continue
if k in preprocessed + ignored:
continue
lines.append(self.fmt.rst_fields(k, self.fmt.sanitize(definition[k]), 0))
@@ -268,6 +293,9 @@ class YnlDocGenerator:
lines.append(self.fmt.rst_subsubsection(attr_line))
for k in attr.keys():
+ if k == LINE_STR:
+ lines.append(self.fmt.rst_lineno(attr[k]))
+ continue
if k in preprocessed + ignored:
continue
if k in linkable:
@@ -306,6 +334,8 @@ class YnlDocGenerator:
lines = []
# Main header
+ lineno = obj.get('__lineno__', 0)
+ lines.append(self.fmt.rst_lineno(lineno))
family = obj['name']
@@ -354,7 +384,7 @@ class YnlDocGenerator:
def parse_yaml_file(self, filename: str) -> str:
"""Transform the YAML specified by filename into an RST-formatted string"""
with open(filename, "r", encoding="utf-8") as spec_file:
- yaml_data = yaml.safe_load(spec_file)
- content = self.parse_yaml(yaml_data)
+ numbered_yaml = yaml.load(spec_file, Loader=NumberedSafeLoader)
+ content = self.parse_yaml(numbered_yaml)
return content
--
2.49.0
^ permalink raw reply related [flat|nested] 20+ messages in thread
* [PATCH v10 12/14] docs: parser_yaml.py: add support for line numbers from the parser
2025-07-28 16:01 [PATCH v10 00/14] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
` (10 preceding siblings ...)
2025-07-28 16:02 ` [PATCH v10 11/14] tools: netlink_yml_parser.py: add line numbers to parsed data Mauro Carvalho Chehab
@ 2025-07-28 16:02 ` Mauro Carvalho Chehab
2025-07-28 16:02 ` [PATCH v10 13/14] docs: parser_yaml.py: fix backward compatibility with old docutils Mauro Carvalho Chehab
` (2 subsequent siblings)
14 siblings, 0 replies; 20+ messages in thread
From: Mauro Carvalho Chehab @ 2025-07-28 16:02 UTC (permalink / raw)
To: Message-ID :, Linux Doc Mailing List
Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
David S. Miller, Donald Hunter, Eric Dumazet,
Ignacio Encinas Rubio, Jakub Kicinski, Jan Stancek,
Jonathan Corbet, Marco Elver, Paolo Abeni, Randy Dunlap,
Ruben Wauters, Shuah Khan, Simon Horman, joel,
linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz, stern
Instead of printing line numbers from the temporary converted ReST
file, get them from the original source.
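Roughly, the generator leaves ".. LINENO <n>" markers in the ReST it emits and
the Sphinx side keeps the last marker seen as the reported line. A minimal
standalone sketch of that consumer side (file name and content are made up):

    import re

    RE_LINENO = re.compile(r"\.\. LINENO ([0-9]+)$")

    def map_lines(generated_rst, source="dummy.yaml"):
        """Yield (text, source, lineno) tuples, dropping the marker lines."""
        lineno = 0
        for line in generated_rst.split("\n"):
            match = RE_LINENO.match(line)
            if match:
                lineno = int(match.group(1))
                continue
            yield line, source, lineno

    rst = (".. LINENO 12\n"
           "Family ``dummy`` netlink specification\n"
           ".. LINENO 40\n"
           "Some attribute documentation")

    for text, src, n in map_lines(rst):
        print(n, text)
    # 12 Family ``dummy`` netlink specification
    # 40 Some attribute documentation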
Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
Documentation/sphinx/parser_yaml.py | 12 ++++++++++--
tools/net/ynl/pyynl/lib/doc_generator.py | 16 ++++++++++++----
2 files changed, 22 insertions(+), 6 deletions(-)
diff --git a/Documentation/sphinx/parser_yaml.py b/Documentation/sphinx/parser_yaml.py
index fa2e6da17617..8288e2ff7c7c 100755
--- a/Documentation/sphinx/parser_yaml.py
+++ b/Documentation/sphinx/parser_yaml.py
@@ -54,6 +54,8 @@ class YamlParser(Parser):
netlink_parser = YnlDocGenerator()
+ re_lineno = re.compile(r"\.\. LINENO ([0-9]+)$")
+
def rst_parse(self, inputstring, document, msg):
"""
Receives a ReST content that was previously converted by the
@@ -66,8 +68,14 @@ class YamlParser(Parser):
try:
# Parse message with RSTParser
- for i, line in enumerate(msg.split('\n')):
- result.append(line, document.current_source, i)
+ lineoffset = 0;
+ for line in msg.split('\n'):
+ match = self.re_lineno.match(line)
+ if match:
+ lineoffset = int(match.group(1))
+ continue
+
+ result.append(line, document.current_source, lineoffset)
rst_parser = RSTParser()
rst_parser.parse('\n'.join(result), document)
diff --git a/tools/net/ynl/pyynl/lib/doc_generator.py b/tools/net/ynl/pyynl/lib/doc_generator.py
index 658759a527a6..403abf1a2eda 100644
--- a/tools/net/ynl/pyynl/lib/doc_generator.py
+++ b/tools/net/ynl/pyynl/lib/doc_generator.py
@@ -158,9 +158,11 @@ class YnlDocGenerator:
def parse_do(self, do_dict: Dict[str, Any], level: int = 0) -> str:
"""Parse 'do' section and return a formatted string"""
lines = []
+ if LINE_STR in do_dict:
+ lines.append(self.fmt.rst_lineno(do_dict[LINE_STR]))
+
for key in do_dict.keys():
if key == LINE_STR:
- lines.append(self.fmt.rst_lineno(do_dict[key]))
continue
lines.append(self.fmt.rst_paragraph(self.fmt.bold(key), level + 1))
if key in ['request', 'reply']:
@@ -187,13 +189,15 @@ class YnlDocGenerator:
lines = []
for operation in operations:
+ if LINE_STR in operation:
+ lines.append(self.fmt.rst_lineno(operation[LINE_STR]))
+
lines.append(self.fmt.rst_section(namespace, 'operation',
operation["name"]))
lines.append(self.fmt.rst_paragraph(operation["doc"]) + "\n")
for key in operation.keys():
if key == LINE_STR:
- lines.append(self.fmt.rst_lineno(operation[key]))
continue
if key in preprocessed:
@@ -253,10 +257,12 @@ class YnlDocGenerator:
lines = []
for definition in defs:
+ if LINE_STR in definition:
+ lines.append(self.fmt.rst_lineno(definition[LINE_STR]))
+
lines.append(self.fmt.rst_section(namespace, 'definition', definition["name"]))
for k in definition.keys():
if k == LINE_STR:
- lines.append(self.fmt.rst_lineno(definition[k]))
continue
if k in preprocessed + ignored:
continue
@@ -284,6 +290,9 @@ class YnlDocGenerator:
lines.append(self.fmt.rst_section(namespace, 'attribute-set',
entry["name"]))
for attr in entry["attributes"]:
+ if LINE_STR in attr:
+ lines.append(self.fmt.rst_lineno(attr[LINE_STR]))
+
type_ = attr.get("type")
attr_line = attr["name"]
if type_:
@@ -294,7 +303,6 @@ class YnlDocGenerator:
for k in attr.keys():
if k == LINE_STR:
- lines.append(self.fmt.rst_lineno(attr[k]))
continue
if k in preprocessed + ignored:
continue
--
2.49.0
^ permalink raw reply related [flat|nested] 20+ messages in thread
* [PATCH v10 13/14] docs: parser_yaml.py: fix backward compatibility with old docutils
2025-07-28 16:01 [PATCH v10 00/14] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
` (11 preceding siblings ...)
2025-07-28 16:02 ` [PATCH v10 12/14] docs: parser_yaml.py: add support for line numbers from the parser Mauro Carvalho Chehab
@ 2025-07-28 16:02 ` Mauro Carvalho Chehab
2025-07-28 16:02 ` [PATCH v10 14/14] sphinx: parser_yaml.py: fix line numbers information Mauro Carvalho Chehab
2025-08-11 17:28 ` [PATCH v10 00/14] Don't generate netlink .rst files inside $(srctree) Jonathan Corbet
14 siblings, 0 replies; 20+ messages in thread
From: Mauro Carvalho Chehab @ 2025-07-28 16:02 UTC (permalink / raw)
To: Message-ID :, Linux Doc Mailing List
Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
David S. Miller, Donald Hunter, Eric Dumazet,
Ignacio Encinas Rubio, Jakub Kicinski, Jan Stancek,
Jonathan Corbet, Marco Elver, Paolo Abeni, Randy Dunlap,
Ruben Wauters, Shuah Khan, Simon Horman, joel,
linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz, stern
As reported by Akira, older docutils versions are not compatible
with the way some Sphinx versions send tab_width. Add code to
address it.
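A toy sketch of the shim's behaviour, using a SimpleNamespace as a stand-in
for the docutils settings object (so this is only an approximation of the
real settings type):

    from types import SimpleNamespace

    def ensure_tab_width(settings, default=8):
        """Give settings objects lacking a tab_width a sane default."""
        if "tab_width" not in vars(settings):
            settings.tab_width = default
        return settings.tab_width

    print(ensure_tab_width(SimpleNamespace()))             # 8, attribute was absent
    print(ensure_tab_width(SimpleNamespace(tab_width=4)))  # 4, existing value kept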
Reported-by: Akira Yokosawa <akiyks@gmail.com>
Closes: https://lore.kernel.org/linux-doc/598b2cb7-2fd7-4388-96ba-2ddf0ab55d2a@gmail.com/
Tested-by: Akira Yokosawa <akiyks@gmail.com>
Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
Documentation/sphinx/parser_yaml.py | 4 ++++
1 file changed, 4 insertions(+)
diff --git a/Documentation/sphinx/parser_yaml.py b/Documentation/sphinx/parser_yaml.py
index 8288e2ff7c7c..1602b31f448e 100755
--- a/Documentation/sphinx/parser_yaml.py
+++ b/Documentation/sphinx/parser_yaml.py
@@ -77,6 +77,10 @@ class YamlParser(Parser):
result.append(line, document.current_source, lineoffset)
+ # Fix backward compatibility with docutils < 0.17.1
+ if "tab_width" not in vars(document.settings):
+ document.settings.tab_width = 8
+
rst_parser = RSTParser()
rst_parser.parse('\n'.join(result), document)
--
2.49.0
^ permalink raw reply related [flat|nested] 20+ messages in thread
* [PATCH v10 14/14] sphinx: parser_yaml.py: fix line numbers information
2025-07-28 16:01 [PATCH v10 00/14] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
` (12 preceding siblings ...)
2025-07-28 16:02 ` [PATCH v10 13/14] docs: parser_yaml.py: fix backward compatibility with old docutils Mauro Carvalho Chehab
@ 2025-07-28 16:02 ` Mauro Carvalho Chehab
2025-08-11 17:28 ` [PATCH v10 00/14] Don't generate netlink .rst files inside $(srctree) Jonathan Corbet
14 siblings, 0 replies; 20+ messages in thread
From: Mauro Carvalho Chehab @ 2025-07-28 16:02 UTC (permalink / raw)
To: Message-ID :, Linux Doc Mailing List
Cc: Mauro Carvalho Chehab, Akira Yokosawa, Breno Leitao,
David S. Miller, Donald Hunter, Eric Dumazet,
Ignacio Encinas Rubio, Jakub Kicinski, Jan Stancek,
Jonathan Corbet, Marco Elver, Paolo Abeni, Randy Dunlap,
Ruben Wauters, Shuah Khan, Simon Horman, joel,
linux-kernel-mentees, linux-kernel, lkmm, netdev, peterz, stern
As reported by Donald, this code:
rst_parser = RSTParser()
rst_parser.parse('\n'.join(result), document)
breaks line number tracking. As an alternative, I tested a variant of it:
rst_parser.parse(result, document)
but the line numbers were still not preserved. As Donald noted,
standard Parser classes don't have a direct mechanism to preserve
line numbers from ViewList().
So, instead, let's use an approach similar to what we already do in
kerneldoc.py: call the docutils statemachine directly.
I double-checked when states and statemachine were introduced:
both were back in 2002. I also tested the doc build with docutils 0.16
and 0.21.2. It worked with both, so it seems to be stable enough
for our needs.
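A rough standalone sketch of the same pattern (content and file name are made
up; it assumes docutils >= 0.18 for frontend.get_default_settings()):

    from docutils import frontend, utils
    from docutils.parsers.rst import Parser as RSTParser, states
    from docutils.statemachine import ViewList

    settings = frontend.get_default_settings(RSTParser)
    document = utils.new_document("dummy.yaml", settings)

    # Each entry carries its text plus the (source, line) pair it came from.
    content = ViewList()
    for text, lineno in [("Family ``dummy``", 3),
                         ("================", 3),
                         ("", 3),
                         ("Some made-up description.", 10)]:
        content.append(text, "dummy.yaml", lineno)

    sm = states.RSTStateMachine(state_classes=states.state_classes,
                                initial_state='Body',
                                debug=document.reporter.debug_flag)
    sm.run(content, document)   # diagnostics now point at dummy.yaml lines
    print(document.astext())

Since the state machine consumes the ViewList entries directly, the per-line
(source, offset) information survives, which is exactly what joining the lines
into a single string was throwing away.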
Reported-by: Donald Hunter <donald.hunter@gmail.com>
Closes: https://lore.kernel.org/linux-doc/m24ivk78ng.fsf@gmail.com/T/#u
Signed-off-by: Mauro Carvalho Chehab <mchehab+huawei@kernel.org>
---
Documentation/sphinx/parser_yaml.py | 21 ++++++++++++++-------
1 file changed, 14 insertions(+), 7 deletions(-)
diff --git a/Documentation/sphinx/parser_yaml.py b/Documentation/sphinx/parser_yaml.py
index 1602b31f448e..634d84a202fc 100755
--- a/Documentation/sphinx/parser_yaml.py
+++ b/Documentation/sphinx/parser_yaml.py
@@ -11,7 +11,9 @@ import sys
from pprint import pformat
+from docutils import statemachine
from docutils.parsers.rst import Parser as RSTParser
+from docutils.parsers.rst import states
from docutils.statemachine import ViewList
from sphinx.util import logging
@@ -56,6 +58,8 @@ class YamlParser(Parser):
re_lineno = re.compile(r"\.\. LINENO ([0-9]+)$")
+ tab_width = 8
+
def rst_parse(self, inputstring, document, msg):
"""
Receives a ReST content that was previously converted by the
@@ -66,10 +70,18 @@ class YamlParser(Parser):
result = ViewList()
+ self.statemachine = states.RSTStateMachine(state_classes=states.state_classes,
+ initial_state='Body',
+ debug=document.reporter.debug_flag)
+
try:
# Parse message with RSTParser
lineoffset = 0;
- for line in msg.split('\n'):
+
+ lines = statemachine.string2lines(msg, self.tab_width,
+ convert_whitespace=True)
+
+ for line in lines:
match = self.re_lineno.match(line)
if match:
lineoffset = int(match.group(1))
@@ -77,12 +89,7 @@ class YamlParser(Parser):
result.append(line, document.current_source, lineoffset)
- # Fix backward compatibility with docutils < 0.17.1
- if "tab_width" not in vars(document.settings):
- document.settings.tab_width = 8
-
- rst_parser = RSTParser()
- rst_parser.parse('\n'.join(result), document)
+ self.statemachine.run(result, document)
except Exception as e:
document.reporter.error("YAML parsing error: %s" % pformat(e))
--
2.49.0
^ permalink raw reply related [flat|nested] 20+ messages in thread
* Re: [PATCH v10 00/14] Don't generate netlink .rst files inside $(srctree)
2025-07-28 16:01 [PATCH v10 00/14] Don't generate netlink .rst files inside $(srctree) Mauro Carvalho Chehab
` (13 preceding siblings ...)
2025-07-28 16:02 ` [PATCH v10 14/14] sphinx: parser_yaml.py: fix line numbers information Mauro Carvalho Chehab
@ 2025-08-11 17:28 ` Jonathan Corbet
2025-08-12 0:56 ` Jakub Kicinski
14 siblings, 1 reply; 20+ messages in thread
From: Jonathan Corbet @ 2025-08-11 17:28 UTC (permalink / raw)
To: Mauro Carvalho Chehab, Message-ID :, Linux Doc Mailing List
Cc: Mauro Carvalho Chehab, linux-kernel, Akira Yokosawa,
David S. Miller, Ignacio Encinas Rubio, Marco Elver, Shuah Khan,
Donald Hunter, Eric Dumazet, Jan Stancek, Paolo Abeni,
Ruben Wauters, joel, linux-kernel-mentees, lkmm, netdev, peterz,
stern, Breno Leitao, Randy Dunlap, Jakub Kicinski, Simon Horman
Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:
> Hi Jon,
>
> That's the v10 version of the parser-yaml series, addressing a couple of
> issues raised by Donald.
>
> It should apply cleanly on the top of docs-next, as I just rebased on
> the top of docs/docs-next.
>
> Please merge it via your tree, as I have another patch series that will
> depend on this one.
I intend to do that shortly unless I hear objections; are the netdev
folks OK with this work going through docs-next?
Thanks,
jon
^ permalink raw reply [flat|nested] 20+ messages in thread
* Re: [PATCH v10 00/14] Don't generate netlink .rst files inside $(srctree)
2025-08-11 17:28 ` [PATCH v10 00/14] Don't generate netlink .rst files inside $(srctree) Jonathan Corbet
@ 2025-08-12 0:56 ` Jakub Kicinski
2025-08-12 9:29 ` Mauro Carvalho Chehab
0 siblings, 1 reply; 20+ messages in thread
From: Jakub Kicinski @ 2025-08-12 0:56 UTC (permalink / raw)
To: Jonathan Corbet
Cc: Mauro Carvalho Chehab, Linux Doc Mailing List, linux-kernel,
Akira Yokosawa, David S. Miller, Ignacio Encinas Rubio,
Marco Elver, Shuah Khan, Donald Hunter, Eric Dumazet, Jan Stancek,
Paolo Abeni, Ruben Wauters, joel, linux-kernel-mentees, lkmm,
netdev, peterz, stern, Breno Leitao, Randy Dunlap, Simon Horman
On Mon, 11 Aug 2025 11:28:45 -0600 Jonathan Corbet wrote:
> Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:
> > That's the v10 version of the parser-yaml series, addressing a couple of
> > issues raised by Donald.
> >
> > It should apply cleanly on the top of docs-next, as I just rebased on
> > the top of docs/docs-next.
> >
> > Please merge it via your tree, as I have another patch series that will
> > depend on this one.
>
> I intend to do that shortly unless I hear objections; are the netdev
> folks OK with this work going through docs-next?
No objections.
Would you be willing to apply these on top of -rc1, and create a merge
commit? YNL is fairly active; if there's a conflict, we may be testing
our luck if Linus has to resolve Python conflicts.
Happy to do that on our end (we have a script:)), or perhaps Mauro could
apply and send us both a PR?
^ permalink raw reply [flat|nested] 20+ messages in thread
* Re: [PATCH v10 00/14] Don't generate netlink .rst files inside $(srctree)
2025-08-12 0:56 ` Jakub Kicinski
@ 2025-08-12 9:29 ` Mauro Carvalho Chehab
0 siblings, 0 replies; 20+ messages in thread
From: Mauro Carvalho Chehab @ 2025-08-12 9:29 UTC (permalink / raw)
To: Jakub Kicinski
Cc: Jonathan Corbet, Linux Doc Mailing List, linux-kernel,
Akira Yokosawa, David S. Miller, Ignacio Encinas Rubio,
Marco Elver, Shuah Khan, Donald Hunter, Eric Dumazet, Jan Stancek,
Paolo Abeni, Ruben Wauters, joel, linux-kernel-mentees, lkmm,
netdev, peterz, stern, Breno Leitao, Randy Dunlap, Simon Horman
On Mon, 11 Aug 2025 17:56:55 -0700
Jakub Kicinski <kuba@kernel.org> wrote:
> On Mon, 11 Aug 2025 11:28:45 -0600 Jonathan Corbet wrote:
> > Mauro Carvalho Chehab <mchehab+huawei@kernel.org> writes:
> > > That's the v10 version of the parser-yaml series, addressing a couple of
> > > issues raised by Donald.
> > >
> > > It should apply cleanly on the top of docs-next, as I just rebased on
> > > the top of docs/docs-next.
> > >
> > > Please merge it via your tree, as I have another patch series that will
> > > depend on this one.
> >
> > I intend to do that shortly unless I hear objections; are the netdev
> > folks OK with this work going through docs-next?
>
> No objections.
>
> Would you be willing to apply these on top of -rc1, and create a merge
> commit? YNL is fairly active, if there's a conflict we may be testing
> our luck if Linus has to resolve Python conflicts.
>
> Happy to do that on our end (we have a script:)), or perhaps Mauro could
> apply and send us both a PR?
Whatever works best for you. In case you decide on the PR, I'm sending
one right now for you both. It is on a stable signed tag:
https://git.kernel.org/pub/scm/linux/kernel/git/mchehab/linux-docs.git/tag/?h=docs/v6.17-1
So, if both of you pick from it, we should be able to avoid conflicts during
the next merge window, as I don't have any other patches changing the Sphinx
YAML parser plugin or the netlink tools.
It should be noted that I have a patch series that does a major
cleanup of Documentation/Makefile. It is based on top of this series.
So, it's better to have this series merged via linux-docs to avoid merge
conflicts in linux-next and during the next merge window.
Thanks,
Mauro
^ permalink raw reply [flat|nested] 20+ messages in thread