public inbox for openembedded-core@lists.openembedded.org
* [PATCH 1/3] classes/vex: remove
@ 2026-03-31 13:24 Ross Burton
  2026-03-31 13:24 ` [PATCH 2/3] classes/sbom-cve-check: remove references to vex.bbclass Ross Burton
                   ` (2 more replies)
  0 siblings, 3 replies; 12+ messages in thread
From: Ross Burton @ 2026-03-31 13:24 UTC (permalink / raw)
  To: openembedded-core

This class exists to provide information to external CVE tooling,
and uses a non-standard, OpenEmbedded-specific format[1].

However, the SPDX 3 output can contain all of the needed information,
in a standardised format.

I'm unaware of any active users of this class beyond sbom-cve-check,
which can also read the data from the SPDX if SPDX_INCLUDE_VEX has been
set.
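
For reference, a minimal local.conf sketch of the SPDX-based replacement
(class and variable names as they appear in oe-core's create-spdx-3.0 and
sbom-cve-check classes; treat this as illustrative, not authoritative):

```conf
# Generate SPDX 3 SBOMs and embed the VEX data that vex.bbclass
# used to emit, so tooling such as sbom-cve-check can read it.
INHERIT += "create-spdx-3.0"
SPDX_INCLUDE_VEX = "all"
```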

So that we don't have to maintain this class for the lifetime of the
Wrynose LTS, delete it.

[1] oe-core 6352ad93a72 ("vex.bbclass: add a new class")

Signed-off-by: Ross Burton <ross.burton@arm.com>
---
 meta/classes/vex.bbclass | 303 ---------------------------------------
 1 file changed, 303 deletions(-)
 delete mode 100644 meta/classes/vex.bbclass

diff --git a/meta/classes/vex.bbclass b/meta/classes/vex.bbclass
deleted file mode 100644
index c57b8209c23..00000000000
--- a/meta/classes/vex.bbclass
+++ /dev/null
@@ -1,303 +0,0 @@
-#
-# Copyright OpenEmbedded Contributors
-#
-# SPDX-License-Identifier: MIT
-#
-
-# This class is used to generate metadata needed by external
-# tools to check for vulnerabilities, for example CVEs.
-#
-# In order to use this class just inherit the class in the
-# local.conf file and it will add the generate_vex task for
-# every recipe. If an image is build it will generate a report
-# in DEPLOY_DIR_IMAGE for all the packages used, it will also
-# generate a file for all recipes used in the build.
-#
-# Variables use CVE_CHECK prefix to keep compatibility with
-# the cve-check class
-#
-# Example:
-#   bitbake -c generate_vex openssl
-#   bitbake core-image-sato
-#   bitbake -k -c generate_vex universe
-#
-# The product name that the CVE database uses defaults to BPN, but may need to
-# be overriden per recipe (for example tiff.bb sets CVE_PRODUCT=libtiff).
-CVE_PRODUCT ??= "${BPN}"
-CVE_VERSION ??= "${PV}"
-
-CVE_CHECK_SUMMARY_DIR ?= "${LOG_DIR}/cve"
-
-CVE_CHECK_SUMMARY_FILE_NAME_JSON = "cve-summary.json"
-CVE_CHECK_SUMMARY_INDEX_PATH = "${CVE_CHECK_SUMMARY_DIR}/cve-summary-index.txt"
-
-CVE_CHECK_DIR ??= "${DEPLOY_DIR}/cve"
-CVE_CHECK_RECIPE_FILE_JSON ?= "${CVE_CHECK_DIR}/${PN}_cve.json"
-CVE_CHECK_MANIFEST_JSON ?= "${IMGDEPLOYDIR}/${IMAGE_NAME}.vex.json"
-
-# Skip CVE Check for packages (PN)
-CVE_CHECK_SKIP_RECIPE ?= ""
-
-# Replace NVD DB check status for a given CVE. Each of CVE has to be mentioned
-# separately with optional detail and description for this status.
-#
-# CVE_STATUS[CVE-1234-0001] = "not-applicable-platform: Issue only applies on Windows"
-# CVE_STATUS[CVE-1234-0002] = "fixed-version: Fixed externally"
-#
-# Settings the same status and reason for multiple CVEs is possible
-# via CVE_STATUS_GROUPS variable.
-#
-# CVE_STATUS_GROUPS = "CVE_STATUS_WIN CVE_STATUS_PATCHED"
-#
-# CVE_STATUS_WIN = "CVE-1234-0001 CVE-1234-0003"
-# CVE_STATUS_WIN[status] = "not-applicable-platform: Issue only applies on Windows"
-# CVE_STATUS_PATCHED = "CVE-1234-0002 CVE-1234-0004"
-# CVE_STATUS_PATCHED[status] = "fixed-version: Fixed externally"
-#
-# All possible CVE statuses could be found in cve-check-map.conf
-# CVE_CHECK_STATUSMAP[not-applicable-platform] = "Ignored"
-# CVE_CHECK_STATUSMAP[fixed-version] = "Patched"
-#
-# CVE_CHECK_IGNORE is deprecated and CVE_STATUS has to be used instead.
-# Keep CVE_CHECK_IGNORE until other layers migrate to new variables
-CVE_CHECK_IGNORE ?= ""
-
-# Layers to be excluded
-CVE_CHECK_LAYER_EXCLUDELIST ??= ""
-
-# Layers to be included
-CVE_CHECK_LAYER_INCLUDELIST ??= ""
-
-
-# set to "alphabetical" for version using single alphabetical character as increment release
-CVE_VERSION_SUFFIX ??= ""
-
-python () {
-    if bb.data.inherits_class("cve-check", d):
-        raise bb.parse.SkipRecipe("Skipping recipe: found incompatible combination of cve-check and vex enabled at the same time.")
-
-    from oe.cve_check import extend_cve_status
-    extend_cve_status(d)
-}
-
-def generate_json_report(d, out_path, link_path):
-    if os.path.exists(d.getVar("CVE_CHECK_SUMMARY_INDEX_PATH")):
-        import json
-        from oe.cve_check import cve_check_merge_jsons, update_symlinks
-
-        bb.note("Generating JSON CVE summary")
-        index_file = d.getVar("CVE_CHECK_SUMMARY_INDEX_PATH")
-        summary = {"version":"1", "package": []}
-        with open(index_file) as f:
-            filename = f.readline()
-            while filename:
-                with open(filename.rstrip()) as j:
-                    data = json.load(j)
-                    cve_check_merge_jsons(summary, data)
-                filename = f.readline()
-
-        summary["package"].sort(key=lambda d: d['name'])
-
-        with open(out_path, "w") as f:
-            json.dump(summary, f, indent=2)
-
-        update_symlinks(out_path, link_path)
-
-python vex_save_summary_handler () {
-    import shutil
-    import datetime
-    from oe.cve_check import update_symlinks
-
-    cvelogpath = d.getVar("CVE_CHECK_SUMMARY_DIR")
-
-    bb.utils.mkdirhier(cvelogpath)
-    timestamp = datetime.datetime.now().strftime('%Y%m%d%H%M%S')
-
-    json_summary_link_name = os.path.join(cvelogpath, d.getVar("CVE_CHECK_SUMMARY_FILE_NAME_JSON"))
-    json_summary_name = os.path.join(cvelogpath, "cve-summary-%s.json" % (timestamp))
-    generate_json_report(d, json_summary_name, json_summary_link_name)
-    bb.plain("Complete CVE JSON report summary created at: %s" % json_summary_link_name)
-}
-
-addhandler vex_save_summary_handler
-vex_save_summary_handler[eventmask] = "bb.event.BuildCompleted"
-
-python do_generate_vex () {
-    """
-    Generate metadata needed for vulnerability checking for
-    the current recipe
-    """
-    from oe.cve_check import get_patched_cves
-
-    try:
-        patched_cves = get_patched_cves(d)
-        cves_status = []
-        products = d.getVar("CVE_PRODUCT").split()
-        for product in products:
-            if ":" in product:
-                _, product = product.split(":", 1)
-            cves_status.append([product, False])
-
-    except FileNotFoundError:
-        bb.fatal("Failure in searching patches")
-
-    cve_write_data_json(d, patched_cves, cves_status)
-}
-
-addtask generate_vex before do_build
-
-python vex_cleanup () {
-    """
-    Delete the file used to gather all the CVE information.
-    """
-    bb.utils.remove(e.data.getVar("CVE_CHECK_SUMMARY_INDEX_PATH"))
-}
-
-addhandler vex_cleanup
-vex_cleanup[eventmask] = "bb.event.BuildCompleted"
-
-python vex_write_rootfs_manifest () {
-    """
-    Create VEX/CVE manifest when building an image
-    """
-
-    import json
-    from oe.rootfs import image_list_installed_packages
-    from oe.cve_check import cve_check_merge_jsons, update_symlinks
-
-    deploy_file_json = d.getVar("CVE_CHECK_RECIPE_FILE_JSON")
-    if os.path.exists(deploy_file_json):
-        bb.utils.remove(deploy_file_json)
-
-    # Create a list of relevant recipies
-    recipies = set()
-    for pkg in list(image_list_installed_packages(d)):
-        pkg_info = os.path.join(d.getVar('PKGDATA_DIR'),
-                                'runtime-reverse', pkg)
-        pkg_data = oe.packagedata.read_pkgdatafile(pkg_info)
-        recipies.add(pkg_data["PN"])
-
-    bb.note("Writing rootfs VEX manifest")
-    deploy_dir = d.getVar("IMGDEPLOYDIR")
-    link_name = d.getVar("IMAGE_LINK_NAME")
-
-    json_data = {"version":"1", "package": []}
-    text_data = ""
-
-    save_pn = d.getVar("PN")
-
-    for pkg in recipies:
-        # To be able to use the CVE_CHECK_RECIPE_FILE_JSON variable we have to evaluate
-        # it with the different PN names set each time.
-        d.setVar("PN", pkg)
-
-        pkgfilepath = d.getVar("CVE_CHECK_RECIPE_FILE_JSON")
-        if os.path.exists(pkgfilepath):
-            with open(pkgfilepath) as j:
-                data = json.load(j)
-                cve_check_merge_jsons(json_data, data)
-        else:
-            bb.warn("Missing cve file for %s" % pkg)
-
-    d.setVar("PN", save_pn)
-
-    link_path = os.path.join(deploy_dir, "%s.vex.json" % link_name)
-    manifest_name = d.getVar("CVE_CHECK_MANIFEST_JSON")
-
-    with open(manifest_name, "w") as f:
-        json.dump(json_data, f, indent=2)
-
-    update_symlinks(manifest_name, link_path)
-    bb.plain("Image VEX JSON report stored in: %s" % manifest_name)
-}
-
-ROOTFS_POSTPROCESS_COMMAND:prepend = "vex_write_rootfs_manifest; "
-do_rootfs[recrdeptask] += "do_generate_vex "
-do_populate_sdk[recrdeptask] += "do_generate_vex "
-
-def cve_write_data_json(d, cve_data, cve_status):
-    """
-    Prepare CVE data for the JSON format, then write it.
-    Done for each recipe.
-    """
-
-    from oe.cve_check import get_cpe_ids
-    import json
-
-    output = {"version":"1", "package": []}
-    nvd_link = "https://nvd.nist.gov/vuln/detail/"
-
-    fdir_name  = d.getVar("FILE_DIRNAME")
-    layer = fdir_name.split("/")[-3]
-
-    include_layers = d.getVar("CVE_CHECK_LAYER_INCLUDELIST").split()
-    exclude_layers = d.getVar("CVE_CHECK_LAYER_EXCLUDELIST").split()
-
-    if exclude_layers and layer in exclude_layers:
-        return
-
-    if include_layers and layer not in include_layers:
-        return
-
-    product_data = []
-    for s in cve_status:
-        p = {"product": s[0], "cvesInRecord": "Yes"}
-        if s[1] == False:
-            p["cvesInRecord"] = "No"
-        product_data.append(p)
-    product_data = list({p['product']:p for p in product_data}.values())
-
-    package_version = "%s%s" % (d.getVar("EXTENDPE"), d.getVar("PV"))
-    cpes = get_cpe_ids(d.getVar("CVE_PRODUCT"), d.getVar("CVE_VERSION"))
-    package_data = {
-        "name" : d.getVar("PN"),
-        "layer" : layer,
-        "version" : package_version,
-        "products": product_data,
-        "cpes": cpes
-    }
-
-    cve_list = []
-
-    for cve in sorted(cve_data):
-        issue_link = "%s%s" % (nvd_link, cve)
-
-        cve_item = {
-            "id" : cve,
-            "status" : cve_data[cve]["abbrev-status"],
-            "link": issue_link,
-        }
-        if 'NVD-summary' in cve_data[cve]:
-            cve_item["summary"] = cve_data[cve]["NVD-summary"]
-            cve_item["scorev2"] = cve_data[cve]["NVD-scorev2"]
-            cve_item["scorev3"] = cve_data[cve]["NVD-scorev3"]
-            cve_item["scorev4"] = cve_data[cve]["NVD-scorev4"]
-            cve_item["vector"] = cve_data[cve]["NVD-vector"]
-            cve_item["vectorString"] = cve_data[cve]["NVD-vectorString"]
-        if 'status' in cve_data[cve]:
-            cve_item["detail"] = cve_data[cve]["status"]
-        if 'justification' in cve_data[cve]:
-            cve_item["description"] = cve_data[cve]["justification"]
-        if 'resource' in cve_data[cve]:
-            cve_item["patch-file"] = cve_data[cve]["resource"]
-        cve_list.append(cve_item)
-
-    package_data["issue"] = cve_list
-    output["package"].append(package_data)
-
-    deploy_file = d.getVar("CVE_CHECK_RECIPE_FILE_JSON")
-
-    write_string = json.dumps(output, indent=2)
-
-    cvelogpath = d.getVar("CVE_CHECK_SUMMARY_DIR")
-    index_path = d.getVar("CVE_CHECK_SUMMARY_INDEX_PATH")
-    bb.utils.mkdirhier(cvelogpath)
-    bb.utils.mkdirhier(os.path.dirname(deploy_file))
-    fragment_file = os.path.basename(deploy_file)
-    fragment_path = os.path.join(cvelogpath, fragment_file)
-    with open(fragment_path, "w") as f:
-        f.write(write_string)
-    with open(deploy_file, "w") as f:
-        f.write(write_string)
-    with open(index_path, "a+") as f:
-        f.write("%s\n" % fragment_path)
-- 
2.43.0




* [PATCH 2/3] classes/sbom-cve-check: remove references to vex.bbclass
  2026-03-31 13:24 [PATCH 1/3] classes/vex: remove Ross Burton
@ 2026-03-31 13:24 ` Ross Burton
  2026-03-31 13:24 ` [PATCH 3/3] classes/cve-check: remove class Ross Burton
  2026-03-31 14:43 ` [OE-core] [PATCH 1/3] classes/vex: remove Marta Rybczynska
  2 siblings, 0 replies; 12+ messages in thread
From: Ross Burton @ 2026-03-31 13:24 UTC (permalink / raw)
  To: openembedded-core

This class has been deleted now, so remove the references to it and the
file that it writes.

This is effectively a no-op change, as the recommended way to run
sbom-cve-check is with SPDX_INCLUDE_VEX="all", which includes all of the
data in the SPDX that the vex class would have generated.

Signed-off-by: Ross Burton <ross.burton@arm.com>
---
 meta/classes-recipe/sbom-cve-check.bbclass | 7 -------
 1 file changed, 7 deletions(-)

diff --git a/meta/classes-recipe/sbom-cve-check.bbclass b/meta/classes-recipe/sbom-cve-check.bbclass
index a5dc262c6aa..2a8429d0b1c 100644
--- a/meta/classes-recipe/sbom-cve-check.bbclass
+++ b/meta/classes-recipe/sbom-cve-check.bbclass
@@ -48,7 +48,6 @@ python do_sbom_cve_check() {
         bb.fatal("Cannot execute sbom-cve-check missing create-spdx-3.0 inherit.")
 
     sbom_path = d.expand("${DEPLOY_DIR_IMAGE}/${IMAGE_LINK_NAME}.spdx.json")
-    vex_manifest_path = d.expand("${DEPLOY_DIR_IMAGE}/${IMAGE_LINK_NAME}.vex.json")
     dl_db_dir = d.getVar("SBOM_CVE_CHECK_DEPLOY_DB_DIR")
     deploy_dir = d.getVar("SBOM_CVE_CHECK_DEPLOYDIR")
     img_link_name = d.getVar("IMAGE_LINK_NAME")
@@ -72,12 +71,6 @@ python do_sbom_cve_check() {
         "--disable-auto-updates"
     ]
 
-    # Assume that SPDX_INCLUDE_VEX is set globally to "all", and not only for the
-    # image recipe, which is very unlikely. This is not an issue to include the
-    # VEX manifest even if not needed.
-    if bb.data.inherits_class("vex", d) and d.getVar("SPDX_INCLUDE_VEX") != "all":
-        cmd_args.extend(["--yocto-vex-manifest", vex_manifest_path])
-
     for export_file in export_files:
         cmd_args.extend(
             ["--export-type", export_file[0], "--export-path", export_file[1]]
-- 
2.43.0




* [PATCH 3/3] classes/cve-check: remove class
  2026-03-31 13:24 [PATCH 1/3] classes/vex: remove Ross Burton
  2026-03-31 13:24 ` [PATCH 2/3] classes/sbom-cve-check: remove references to vex.bbclass Ross Burton
@ 2026-03-31 13:24 ` Ross Burton
  2026-03-31 13:48   ` [OE-core] " Marko, Peter
  2026-03-31 14:43 ` [OE-core] [PATCH 1/3] classes/vex: remove Marta Rybczynska
  2 siblings, 1 reply; 12+ messages in thread
From: Ross Burton @ 2026-03-31 13:24 UTC (permalink / raw)
  To: openembedded-core

It's been long known that the cve-check class in oe-core is not that
usable in the real world; for more details see "Future of CVE scanning
in Yocto"[1].  That mail proposed an alternative direction that included
a CVE scanning tool that can be run both during the build and afterwards,
so that periodic scans of a previously built image are possible.

Last year, Bootlin wrote sbom-cve-check[2], and I compared it to my
proposal in "Comparing cve-check with sbom-cve-check"[3], concluding
that it is likely the missing piece.

Support for sbom-cve-check has been merged into oe-core, and the
cve-check class is now obsolete. So that we don't have to maintain it for
the four-year lifecycle of the Wrynose release, delete it.
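
As a sketch of the replacement workflow (assuming sbom-cve-check is
inherited in the image recipe; the task name matches do_sbom_cve_check
from sbom-cve-check.bbclass, and the image name is only an example):

```conf
# Image recipe (or bbappend) sketch: inherit the SBOM-based checker.
inherit sbom-cve-check

# Command-line usage:
#   bitbake core-image-minimal -c sbom_cve_check
```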

This patch also deletes the database fetcher recipes, and the test cases
that were specific to cve-check.  Note that the oe.cve_check library
still exists as this is used by the SPDX classes.

[1] https://lore.kernel.org/openembedded-core/7D6E419E-A7AE-4324-966C-3552C586E452@arm.com/
[2] https://github.com/bootlin/sbom-cve-check
[3] https://lore.kernel.org/openembedded-core/2CD10DD9-FB2A-4B10-B98A-85918EB6B4B7@arm.com/

Signed-off-by: Ross Burton <ross.burton@arm.com>
---
 meta/classes/cve-check.bbclass                | 570 ------------------
 meta/conf/distro/include/maintainers.inc      |   1 -
 meta/conf/documentation.conf                  |   2 -
 meta/lib/oeqa/selftest/cases/cve_check.py     | 172 ------
 .../recipes-core/meta/cve-update-db-native.bb | 421 -------------
 .../meta/cve-update-nvd2-native.bb            | 422 -------------
 6 files changed, 1588 deletions(-)
 delete mode 100644 meta/classes/cve-check.bbclass
 delete mode 100644 meta/recipes-core/meta/cve-update-db-native.bb
 delete mode 100644 meta/recipes-core/meta/cve-update-nvd2-native.bb

diff --git a/meta/classes/cve-check.bbclass b/meta/classes/cve-check.bbclass
deleted file mode 100644
index c63ebd56e16..00000000000
--- a/meta/classes/cve-check.bbclass
+++ /dev/null
@@ -1,570 +0,0 @@
-#
-# Copyright OpenEmbedded Contributors
-#
-# SPDX-License-Identifier: MIT
-#
-
-# This class is used to check recipes against public CVEs.
-#
-# In order to use this class just inherit the class in the
-# local.conf file and it will add the cve_check task for
-# every recipe. The task can be used per recipe, per image,
-# or using the special cases "world" and "universe". The
-# cve_check task will print a warning for every unpatched
-# CVE found and generate a file in the recipe WORKDIR/cve
-# directory. If an image is build it will generate a report
-# in DEPLOY_DIR_IMAGE for all the packages used.
-#
-# Example:
-#   bitbake -c cve_check openssl
-#   bitbake core-image-sato
-#   bitbake -k -c cve_check universe
-#
-# DISCLAIMER
-#
-# This class/tool is meant to be used as support and not
-# the only method to check against CVEs. Running this tool
-# doesn't guarantee your packages are free of CVEs.
-
-# The product name that the CVE database uses defaults to BPN, but may need to
-# be overriden per recipe (for example tiff.bb sets CVE_PRODUCT=libtiff).
-CVE_PRODUCT ??= "${BPN}"
-CVE_VERSION ??= "${PV}"
-
-# Possible database sources: NVD1, NVD2, FKIE
-NVD_DB_VERSION ?= "FKIE"
-
-# Use different file names for each database source, as they synchronize at different moments, so may be slightly different
-CVE_CHECK_DB_FILENAME ?= "${@'nvdcve_2-2.db' if d.getVar('NVD_DB_VERSION') == 'NVD2' else 'nvdcve_1-3.db' if d.getVar('NVD_DB_VERSION') == 'NVD1' else 'nvdfkie_1-1.db'}"
-CVE_CHECK_DB_FETCHER ?= "${@'cve-update-nvd2-native' if d.getVar('NVD_DB_VERSION') == 'NVD2' else 'cve-update-db-native'}"
-CVE_CHECK_DB_DIR ?= "${STAGING_DIR}/CVE_CHECK"
-CVE_CHECK_DB_FILE ?= "${CVE_CHECK_DB_DIR}/${CVE_CHECK_DB_FILENAME}"
-CVE_CHECK_DB_FILE_LOCK ?= "${CVE_CHECK_DB_FILE}.lock"
-
-CVE_CHECK_SUMMARY_DIR ?= "${LOG_DIR}/cve"
-CVE_CHECK_SUMMARY_FILE_NAME ?= "cve-summary"
-CVE_CHECK_SUMMARY_FILE_NAME_JSON = "cve-summary.json"
-CVE_CHECK_SUMMARY_INDEX_PATH = "${CVE_CHECK_SUMMARY_DIR}/cve-summary-index.txt"
-
-CVE_CHECK_LOG_JSON ?= "${T}/cve.json"
-
-CVE_CHECK_DIR ??= "${DEPLOY_DIR}/cve"
-CVE_CHECK_RECIPE_FILE_JSON ?= "${CVE_CHECK_DIR}/${PN}_cve.json"
-CVE_CHECK_MANIFEST_JSON_SUFFIX ?= "json"
-CVE_CHECK_MANIFEST_JSON ?= "${IMGDEPLOYDIR}/${IMAGE_NAME}.${CVE_CHECK_MANIFEST_JSON_SUFFIX}"
-CVE_CHECK_COPY_FILES ??= "1"
-CVE_CHECK_CREATE_MANIFEST ??= "1"
-
-# Report Patched or Ignored CVEs
-CVE_CHECK_REPORT_PATCHED ??= "1"
-
-CVE_CHECK_SHOW_WARNINGS ??= "1"
-
-# Provide JSON output
-CVE_CHECK_FORMAT_JSON ??= "1"
-
-# Check for packages without CVEs (no issues or missing product name)
-CVE_CHECK_COVERAGE ??= "1"
-
-# Skip CVE Check for packages (PN)
-CVE_CHECK_SKIP_RECIPE ?= ""
-
-# Replace NVD DB check status for a given CVE. Each of CVE has to be mentioned
-# separately with optional detail and description for this status.
-#
-# CVE_STATUS[CVE-1234-0001] = "not-applicable-platform: Issue only applies on Windows"
-# CVE_STATUS[CVE-1234-0002] = "fixed-version: Fixed externally"
-#
-# Settings the same status and reason for multiple CVEs is possible
-# via CVE_STATUS_GROUPS variable.
-#
-# CVE_STATUS_GROUPS = "CVE_STATUS_WIN CVE_STATUS_PATCHED"
-#
-# CVE_STATUS_WIN = "CVE-1234-0001 CVE-1234-0003"
-# CVE_STATUS_WIN[status] = "not-applicable-platform: Issue only applies on Windows"
-# CVE_STATUS_PATCHED = "CVE-1234-0002 CVE-1234-0004"
-# CVE_STATUS_PATCHED[status] = "fixed-version: Fixed externally"
-#
-# All possible CVE statuses could be found in cve-check-map.conf
-# CVE_CHECK_STATUSMAP[not-applicable-platform] = "Ignored"
-# CVE_CHECK_STATUSMAP[fixed-version] = "Patched"
-#
-# CVE_CHECK_IGNORE is deprecated and CVE_STATUS has to be used instead.
-# Keep CVE_CHECK_IGNORE until other layers migrate to new variables
-CVE_CHECK_IGNORE ?= ""
-
-# Layers to be excluded
-CVE_CHECK_LAYER_EXCLUDELIST ??= ""
-
-# Layers to be included
-CVE_CHECK_LAYER_INCLUDELIST ??= ""
-
-
-# set to "alphabetical" for version using single alphabetical character as increment release
-CVE_VERSION_SUFFIX ??= ""
-
-python () {
-    from oe.cve_check import extend_cve_status
-    extend_cve_status(d)
-
-    nvd_database_type = d.getVar("NVD_DB_VERSION")
-    if nvd_database_type not in ("NVD1", "NVD2", "FKIE"):
-        bb.erroronce("Malformed NVD_DB_VERSION, must be one of: NVD1, NVD2, FKIE. Defaulting to NVD2")
-        d.setVar("NVD_DB_VERSION", "NVD2")
-}
-
-def generate_json_report(d, out_path, link_path):
-    if os.path.exists(d.getVar("CVE_CHECK_SUMMARY_INDEX_PATH")):
-        import json
-        from oe.cve_check import cve_check_merge_jsons, update_symlinks
-
-        bb.note("Generating JSON CVE summary")
-        index_file = d.getVar("CVE_CHECK_SUMMARY_INDEX_PATH")
-        summary = {"version":"1", "package": []}
-        with open(index_file) as f:
-            filename = f.readline()
-            while filename:
-                with open(filename.rstrip()) as j:
-                    data = json.load(j)
-                    cve_check_merge_jsons(summary, data)
-                filename = f.readline()
-
-        summary["package"].sort(key=lambda d: d['name'])
-
-        with open(out_path, "w") as f:
-            json.dump(summary, f, indent=2)
-
-        update_symlinks(out_path, link_path)
-
-python cve_save_summary_handler () {
-    import shutil
-    import datetime
-    from oe.cve_check import update_symlinks
-
-    cve_summary_name = d.getVar("CVE_CHECK_SUMMARY_FILE_NAME")
-    cvelogpath = d.getVar("CVE_CHECK_SUMMARY_DIR")
-    bb.utils.mkdirhier(cvelogpath)
-
-    timestamp = datetime.datetime.now().strftime('%Y%m%d%H%M%S')
-
-    if d.getVar("CVE_CHECK_FORMAT_JSON") == "1":
-        json_summary_link_name = os.path.join(cvelogpath, d.getVar("CVE_CHECK_SUMMARY_FILE_NAME_JSON"))
-        json_summary_name = os.path.join(cvelogpath, "%s-%s.json" % (cve_summary_name, timestamp))
-        generate_json_report(d, json_summary_name, json_summary_link_name)
-        bb.plain("Complete CVE JSON report summary created at: %s" % json_summary_link_name)
-}
-
-addhandler cve_save_summary_handler
-cve_save_summary_handler[eventmask] = "bb.event.BuildCompleted"
-
-python do_cve_check () {
-    """
-    Check recipe for patched and unpatched CVEs
-    """
-    from oe.cve_check import get_patched_cves
-
-    with bb.utils.fileslocked([d.getVar("CVE_CHECK_DB_FILE_LOCK")], shared=True):
-        if os.path.exists(d.getVar("CVE_CHECK_DB_FILE")):
-            try:
-                patched_cves = get_patched_cves(d)
-            except FileNotFoundError:
-                bb.fatal("Failure in searching patches")
-            cve_data, status = check_cves(d, patched_cves)
-            if len(cve_data) or (d.getVar("CVE_CHECK_COVERAGE") == "1" and status):
-                get_cve_info(d, cve_data)
-                cve_write_data(d, cve_data, status)
-        else:
-            bb.note("No CVE database found, skipping CVE check")
-
-}
-
-addtask cve_check before do_build
-do_cve_check[depends] = "${CVE_CHECK_DB_FETCHER}:do_unpack"
-do_cve_check[nostamp] = "1"
-
-python cve_check_cleanup () {
-    """
-    Delete the file used to gather all the CVE information.
-    """
-    bb.utils.remove(e.data.getVar("CVE_CHECK_SUMMARY_INDEX_PATH"))
-}
-
-addhandler cve_check_cleanup
-cve_check_cleanup[eventmask] = "bb.event.BuildCompleted"
-
-python cve_check_write_rootfs_manifest () {
-    """
-    Create CVE manifest when building an image
-    """
-
-    import shutil
-    import json
-    from oe.rootfs import image_list_installed_packages
-    from oe.cve_check import cve_check_merge_jsons, update_symlinks
-
-    if d.getVar("CVE_CHECK_COPY_FILES") == "1":
-        deploy_file_json = d.getVar("CVE_CHECK_RECIPE_FILE_JSON")
-        if os.path.exists(deploy_file_json):
-            bb.utils.remove(deploy_file_json)
-
-    # Create a list of relevant recipies
-    recipies = set()
-    for pkg in list(image_list_installed_packages(d)):
-        pkg_info = os.path.join(d.getVar('PKGDATA_DIR'),
-                                'runtime-reverse', pkg)
-        pkg_data = oe.packagedata.read_pkgdatafile(pkg_info)
-        recipies.add(pkg_data["PN"])
-
-    bb.note("Writing rootfs CVE manifest")
-    deploy_dir = d.getVar("IMGDEPLOYDIR")
-    link_name = d.getVar("IMAGE_LINK_NAME")
-
-    json_data = {"version":"1", "package": []}
-    text_data = ""
-    enable_json = d.getVar("CVE_CHECK_FORMAT_JSON") == "1"
-
-    save_pn = d.getVar("PN")
-
-    for pkg in recipies:
-        # To be able to use the CVE_CHECK_RECIPE_FILE_JSON variable we have to evaluate
-        # it with the different PN names set each time.
-        d.setVar("PN", pkg)
-
-        if enable_json:
-            pkgfilepath = d.getVar("CVE_CHECK_RECIPE_FILE_JSON")
-            if os.path.exists(pkgfilepath):
-                with open(pkgfilepath) as j:
-                    data = json.load(j)
-                    cve_check_merge_jsons(json_data, data)
-
-    d.setVar("PN", save_pn)
-
-    if enable_json:
-        manifest_name_suffix = d.getVar("CVE_CHECK_MANIFEST_JSON_SUFFIX")
-        manifest_name = d.getVar("CVE_CHECK_MANIFEST_JSON")
-
-        with open(manifest_name, "w") as f:
-            json.dump(json_data, f, indent=2)
-
-        if link_name:
-            link_path = os.path.join(deploy_dir, "%s.%s" % (link_name, manifest_name_suffix))
-            update_symlinks(manifest_name, link_path)
-
-        bb.plain("Image CVE JSON report stored in: %s" % manifest_name)
-}
-
-ROOTFS_POSTPROCESS_COMMAND:prepend = "${@'cve_check_write_rootfs_manifest ' if d.getVar('CVE_CHECK_CREATE_MANIFEST') == '1' else ''}"
-do_rootfs[recrdeptask] += "${@'do_cve_check' if d.getVar('CVE_CHECK_CREATE_MANIFEST') == '1' else ''}"
-do_populate_sdk[recrdeptask] += "${@'do_cve_check' if d.getVar('CVE_CHECK_CREATE_MANIFEST') == '1' else ''}"
-
-def cve_is_ignored(d, cve_data, cve):
-    if cve not in cve_data:
-        return False
-    if cve_data[cve]['abbrev-status'] == "Ignored":
-        return True
-    return False
-
-def cve_is_patched(d, cve_data, cve):
-    if cve not in cve_data:
-        return False
-    if cve_data[cve]['abbrev-status'] == "Patched":
-        return True
-    return False
-
-def cve_update(d, cve_data, cve, entry):
-    # If no entry, just add it
-    if cve not in cve_data:
-        cve_data[cve] = entry
-        return
-    # If we are updating, there might be change in the status
-    bb.debug(1, "Trying CVE entry update for %s from %s to %s" % (cve, cve_data[cve]['abbrev-status'], entry['abbrev-status']))
-    if cve_data[cve]['abbrev-status'] == "Unknown":
-        cve_data[cve] = entry
-        return
-    if cve_data[cve]['abbrev-status'] == entry['abbrev-status']:
-        return
-    # Update like in {'abbrev-status': 'Patched', 'status': 'version-not-in-range'} to {'abbrev-status': 'Unpatched', 'status': 'version-in-range'}
-    if entry['abbrev-status'] == "Unpatched" and cve_data[cve]['abbrev-status'] == "Patched":
-        if entry['status'] == "version-in-range" and cve_data[cve]['status'] == "version-not-in-range":
-            # New result from the scan, vulnerable
-            cve_data[cve] = entry
-            bb.debug(1, "CVE entry %s update from Patched to Unpatched from the scan result" % cve)
-            return
-    if entry['abbrev-status'] == "Patched" and cve_data[cve]['abbrev-status'] == "Unpatched":
-        if entry['status'] == "version-not-in-range" and cve_data[cve]['status'] == "version-in-range":
-            # Range does not match the scan, but we already have a vulnerable match, ignore
-            bb.debug(1, "CVE entry %s update from Patched to Unpatched from the scan result - not applying" % cve)
-            return
-    # If we have an "Ignored", it has a priority
-    if cve_data[cve]['abbrev-status'] == "Ignored":
-        bb.debug(1, "CVE %s not updating because Ignored" % cve)
-        return
-    bb.warn("Unhandled CVE entry update for %s from %s to %s" % (cve, cve_data[cve], entry))
-
-def check_cves(d, cve_data):
-    """
-    Connect to the NVD database and find unpatched cves.
-    """
-    from oe.cve_check import Version, convert_cve_version, decode_cve_status
-
-    pn = d.getVar("PN")
-    real_pv = d.getVar("PV")
-    suffix = d.getVar("CVE_VERSION_SUFFIX")
-
-    cves_status = []
-    cves_in_recipe = False
-    # CVE_PRODUCT can contain more than one product (eg. curl/libcurl)
-    products = d.getVar("CVE_PRODUCT").split()
-    # If this has been unset then we're not scanning for CVEs here (for example, image recipes)
-    if not products:
-        return ([], [])
-    pv = d.getVar("CVE_VERSION").split("+git")[0]
-
-    # If the recipe has been skipped/ignored we return empty lists
-    if pn in d.getVar("CVE_CHECK_SKIP_RECIPE").split():
-        bb.note("Recipe has been skipped by cve-check")
-        return ([], [])
-
-    import sqlite3
-    db_file = d.expand("file:${CVE_CHECK_DB_FILE}?mode=ro")
-    conn = sqlite3.connect(db_file, uri=True)
-
-    # For each of the known product names (e.g. curl has CPEs using curl and libcurl)...
-    for product in products:
-        cves_in_product = False
-        if ":" in product:
-            vendor, product = product.split(":", 1)
-        else:
-            vendor = "%"
-
-        # Find all relevant CVE IDs.
-        cve_cursor = conn.execute("SELECT DISTINCT ID FROM PRODUCTS WHERE PRODUCT IS ? AND VENDOR LIKE ?", (product, vendor))
-        for cverow in cve_cursor:
-            cve = cverow[0]
-
-            # Write status once only for each product
-            if not cves_in_product:
-                cves_status.append([product, True])
-                cves_in_product = True
-                cves_in_recipe = True
-
-            if cve_is_ignored(d, cve_data, cve):
-                bb.note("%s-%s ignores %s" % (product, pv, cve))
-                continue
-            elif cve_is_patched(d, cve_data, cve):
-                bb.note("%s has been patched" % (cve))
-                continue
-
-            vulnerable = False
-            ignored = False
-
-            product_cursor = conn.execute("SELECT * FROM PRODUCTS WHERE ID IS ? AND PRODUCT IS ? AND VENDOR LIKE ?", (cve, product, vendor))
-            for row in product_cursor:
-                (_, _, _, version_start, operator_start, version_end, operator_end) = row
-                #bb.debug(2, "Evaluating row " + str(row))
-                if cve_is_ignored(d, cve_data, cve):
-                    ignored = True
-
-                version_start = convert_cve_version(version_start)
-                version_end = convert_cve_version(version_end)
-
-                if (operator_start == '=' and pv == version_start) or version_start == '-':
-                    vulnerable = True
-                else:
-                    if operator_start:
-                        try:
-                            vulnerable_start =  (operator_start == '>=' and Version(pv,suffix) >= Version(version_start,suffix))
-                            vulnerable_start |= (operator_start == '>' and Version(pv,suffix) > Version(version_start,suffix))
-                        except:
-                            bb.warn("%s: Failed to compare %s %s %s for %s" %
-                                    (product, pv, operator_start, version_start, cve))
-                            vulnerable_start = False
-                    else:
-                        vulnerable_start = False
-
-                    if operator_end:
-                        try:
-                            vulnerable_end  = (operator_end == '<=' and Version(pv,suffix) <= Version(version_end,suffix) )
-                            vulnerable_end |= (operator_end == '<' and Version(pv,suffix) < Version(version_end,suffix) )
-                        except:
-                            bb.warn("%s: Failed to compare %s %s %s for %s" %
-                                    (product, pv, operator_end, version_end, cve))
-                            vulnerable_end = False
-                    else:
-                        vulnerable_end = False
-
-                    if operator_start and operator_end:
-                        vulnerable = vulnerable_start and vulnerable_end
-                    else:
-                        vulnerable = vulnerable_start or vulnerable_end
-
-                if vulnerable:
-                    if ignored:
-                        bb.note("%s is ignored in %s-%s" % (cve, pn, real_pv))
-                        cve_update(d, cve_data, cve, {"abbrev-status": "Ignored"})
-                    else:
-                        bb.note("%s-%s is vulnerable to %s" % (pn, real_pv, cve))
-                        cve_update(d, cve_data, cve, {"abbrev-status": "Unpatched", "status": "version-in-range"})
-                    break
-            product_cursor.close()
-
-            if not vulnerable:
-                bb.note("%s-%s is not vulnerable to %s" % (pn, real_pv, cve))
-                cve_update(d, cve_data, cve, {"abbrev-status": "Patched", "status": "version-not-in-range"})
-        cve_cursor.close()
-
-        if not cves_in_product:
-            bb.note("No CVE records found for product %s, pn %s" % (product, pn))
-            cves_status.append([product, False])
-
-    conn.close()
-
-    if not cves_in_recipe:
-        bb.note("No CVE records for products in recipe %s" % (pn))
-
-    if d.getVar("CVE_CHECK_SHOW_WARNINGS") == "1":
-        unpatched_cves = [cve for cve in cve_data if cve_data[cve]["abbrev-status"] == "Unpatched"]
-        if unpatched_cves:
-            bb.warn("Found unpatched CVE (%s)" % " ".join(unpatched_cves))
-
-    return (cve_data, cves_status)
-
-def get_cve_info(d, cve_data):
-    """
-    Get CVE information from the database.
-    """
-
-    import sqlite3
-
-    db_file = d.expand("file:${CVE_CHECK_DB_FILE}?mode=ro")
-    conn = sqlite3.connect(db_file, uri=True)
-
-    for cve in cve_data:
-        cursor = conn.execute("SELECT * FROM NVD WHERE ID IS ?", (cve,))
-        for row in cursor:
-            # The CVE itself has been added already
-            if row[0] not in cve_data:
-                bb.note("CVE record %s not present" % row[0])
-                continue
-            #cve_data[row[0]] = {}
-            cve_data[row[0]]["NVD-summary"] = row[1]
-            cve_data[row[0]]["NVD-scorev2"] = row[2]
-            cve_data[row[0]]["NVD-scorev3"] = row[3]
-            cve_data[row[0]]["NVD-scorev4"] = row[4]
-            cve_data[row[0]]["NVD-modified"] = row[5]
-            cve_data[row[0]]["NVD-vector"] = row[6]
-            cve_data[row[0]]["NVD-vectorString"] = row[7]
-        cursor.close()
-    conn.close()
-
-def cve_check_write_json_output(d, output, direct_file, deploy_file, manifest_file):
-    """
-    Write CVE information in the JSON format: to WORKDIR; and to
-    CVE_CHECK_DIR, if CVE manifest if enabled, write fragment
-    files that will be assembled at the end in cve_check_write_rootfs_manifest.
-    """
-
-    import json
-
-    write_string = json.dumps(output, indent=2)
-    with open(direct_file, "w") as f:
-        bb.note("Writing file %s with CVE information" % direct_file)
-        f.write(write_string)
-
-    if d.getVar("CVE_CHECK_COPY_FILES") == "1":
-        bb.utils.mkdirhier(os.path.dirname(deploy_file))
-        with open(deploy_file, "w") as f:
-            f.write(write_string)
-
-    if d.getVar("CVE_CHECK_CREATE_MANIFEST") == "1":
-        cvelogpath = d.getVar("CVE_CHECK_SUMMARY_DIR")
-        index_path = d.getVar("CVE_CHECK_SUMMARY_INDEX_PATH")
-        bb.utils.mkdirhier(cvelogpath)
-        fragment_file = os.path.basename(deploy_file)
-        fragment_path = os.path.join(cvelogpath, fragment_file)
-        with open(fragment_path, "w") as f:
-            f.write(write_string)
-        with open(index_path, "a+") as f:
-            f.write("%s\n" % fragment_path)
-
-def cve_write_data_json(d, cve_data, cve_status):
-    """
-    Prepare CVE data for the JSON format, then write it.
-    """
-
-    output = {"version":"1", "package": []}
-    nvd_link = "https://nvd.nist.gov/vuln/detail/"
-
-    fdir_name  = d.getVar("FILE_DIRNAME")
-    layer = fdir_name.split("/")[-3]
-
-    include_layers = d.getVar("CVE_CHECK_LAYER_INCLUDELIST").split()
-    exclude_layers = d.getVar("CVE_CHECK_LAYER_EXCLUDELIST").split()
-
-    report_all = d.getVar("CVE_CHECK_REPORT_PATCHED") == "1"
-
-    if exclude_layers and layer in exclude_layers:
-        return
-
-    if include_layers and layer not in include_layers:
-        return
-
-    product_data = []
-    for s in cve_status:
-        p = {"product": s[0], "cvesInRecord": "Yes"}
-        if s[1] == False:
-            p["cvesInRecord"] = "No"
-        product_data.append(p)
-
-    package_version = "%s%s" % (d.getVar("EXTENDPE"), d.getVar("PV"))
-    package_data = {
-        "name" : d.getVar("PN"),
-        "layer" : layer,
-        "version" : package_version,
-        "products": product_data
-    }
-
-    cve_list = []
-
-    for cve in sorted(cve_data):
-        if not report_all and (cve_data[cve]["abbrev-status"] == "Patched" or cve_data[cve]["abbrev-status"] == "Ignored"):
-            continue
-        issue_link = "%s%s" % (nvd_link, cve)
-
-        cve_item = {
-            "id" : cve,
-            "status" : cve_data[cve]["abbrev-status"],
-            "link": issue_link,
-        }
-        if 'NVD-summary' in cve_data[cve]:
-            cve_item["summary"] = cve_data[cve]["NVD-summary"]
-            cve_item["scorev2"] = cve_data[cve]["NVD-scorev2"]
-            cve_item["scorev3"] = cve_data[cve]["NVD-scorev3"]
-            cve_item["scorev4"] = cve_data[cve]["NVD-scorev4"]
-            cve_item["modified"] = cve_data[cve]["NVD-modified"]
-            cve_item["vector"] = cve_data[cve]["NVD-vector"]
-            cve_item["vectorString"] = cve_data[cve]["NVD-vectorString"]
-        if 'status' in cve_data[cve]:
-            cve_item["detail"] = cve_data[cve]["status"]
-        if 'justification' in cve_data[cve]:
-            cve_item["description"] = cve_data[cve]["justification"]
-        if 'resource' in cve_data[cve]:
-            cve_item["patch-file"] = cve_data[cve]["resource"]
-        cve_list.append(cve_item)
-
-    package_data["issue"] = cve_list
-    output["package"].append(package_data)
-
-    direct_file = d.getVar("CVE_CHECK_LOG_JSON")
-    deploy_file = d.getVar("CVE_CHECK_RECIPE_FILE_JSON")
-    manifest_file = d.getVar("CVE_CHECK_SUMMARY_FILE_NAME_JSON")
-
-    cve_check_write_json_output(d, output, direct_file, deploy_file, manifest_file)
-
-def cve_write_data(d, cve_data, status):
-    """
-    Write CVE data in each enabled format.
-    """
-
-    if d.getVar("CVE_CHECK_FORMAT_JSON") == "1":
-        cve_write_data_json(d, cve_data, status)
diff --git a/meta/conf/distro/include/maintainers.inc b/meta/conf/distro/include/maintainers.inc
index 1bd43211e23..a429320b88b 100644
--- a/meta/conf/distro/include/maintainers.inc
+++ b/meta/conf/distro/include/maintainers.inc
@@ -140,7 +140,6 @@ RECIPE_MAINTAINER:pn-cryptodev-module = "Robert Yang <liezhi.yang@windriver.com>
 RECIPE_MAINTAINER:pn-cryptodev-tests = "Robert Yang <liezhi.yang@windriver.com>"
 RECIPE_MAINTAINER:pn-cups = "Chen Qi <Qi.Chen@windriver.com>"
 RECIPE_MAINTAINER:pn-curl = "Robert Joslyn <robert.joslyn@redrectangle.org>"
-RECIPE_MAINTAINER:pn-cve-update-nvd2-native = "Ross Burton <ross.burton@arm.com>"
 RECIPE_MAINTAINER:pn-db = "Unassigned <unassigned@yoctoproject.org>"
 RECIPE_MAINTAINER:pn-dbus = "Chen Qi <Qi.Chen@windriver.com>"
 RECIPE_MAINTAINER:pn-dbus-glib = "Chen Qi <Qi.Chen@windriver.com>"
diff --git a/meta/conf/documentation.conf b/meta/conf/documentation.conf
index 1853676fa06..9d429ba9a31 100644
--- a/meta/conf/documentation.conf
+++ b/meta/conf/documentation.conf
@@ -121,8 +121,6 @@ CONFLICT_MACHINE_FEATURES[doc] = "When a recipe inherits the features_check clas
 CORE_IMAGE_EXTRA_INSTALL[doc] = "Specifies the list of packages to be added to the image. You should only set this variable in the conf/local.conf file in the Build Directory."
 COREBASE[doc] = "Specifies the parent directory of the OpenEmbedded Core Metadata layer (i.e. meta)."
 CONF_VERSION[doc] = "Tracks the version of local.conf.  Increased each time build/conf/ changes incompatibly."
-CVE_CHECK_LAYER_EXCLUDELIST[doc] = "Defines which layers to exclude from cve-check scanning"
-CVE_CHECK_LAYER_INCLUDELIST[doc] = "Defines which layers to include during cve-check scanning"
 
 #D
 
diff --git a/meta/lib/oeqa/selftest/cases/cve_check.py b/meta/lib/oeqa/selftest/cases/cve_check.py
index 511e4b81b41..891a7de3317 100644
--- a/meta/lib/oeqa/selftest/cases/cve_check.py
+++ b/meta/lib/oeqa/selftest/cases/cve_check.py
@@ -4,10 +4,7 @@
 # SPDX-License-Identifier: MIT
 #
 
-import json
-import os
 from oeqa.selftest.case import OESelftestTestCase
-from oeqa.utils.commands import bitbake, get_bb_vars
 
 class CVECheck(OESelftestTestCase):
 
@@ -325,172 +322,3 @@ class CVECheck(OESelftestTestCase):
             ),
             {"CVE-2019-6461", "CVE-2019-6462", "CVE-2019-6463", "CVE-2019-6464"},
         )
-
-    def test_recipe_report_json(self):
-        config = """
-INHERIT += "cve-check"
-CVE_CHECK_FORMAT_JSON = "1"
-"""
-        self.write_config(config)
-
-        vars = get_bb_vars(["CVE_CHECK_SUMMARY_DIR", "CVE_CHECK_SUMMARY_FILE_NAME_JSON"])
-        summary_json = os.path.join(vars["CVE_CHECK_SUMMARY_DIR"], vars["CVE_CHECK_SUMMARY_FILE_NAME_JSON"])
-        recipe_json = os.path.join(vars["CVE_CHECK_SUMMARY_DIR"], "m4-native_cve.json")
-
-        try:
-            os.remove(summary_json)
-            os.remove(recipe_json)
-        except FileNotFoundError:
-            pass
-
-        bitbake("m4-native -c cve_check")
-
-        def check_m4_json(filename):
-            with open(filename) as f:
-                report = json.load(f)
-            self.assertEqual(report["version"], "1")
-            self.assertEqual(len(report["package"]), 1)
-            package = report["package"][0]
-            self.assertEqual(package["name"], "m4-native")
-            found_cves = { issue["id"]: issue["status"] for issue in package["issue"]}
-            self.assertIn("CVE-2008-1687", found_cves)
-            self.assertEqual(found_cves["CVE-2008-1687"], "Patched")
-
-        self.assertExists(summary_json)
-        check_m4_json(summary_json)
-        self.assertExists(recipe_json)
-        check_m4_json(recipe_json)
-
-
-    def test_image_json(self):
-        config = """
-INHERIT += "cve-check"
-CVE_CHECK_FORMAT_JSON = "1"
-"""
-        self.write_config(config)
-
-        vars = get_bb_vars(["CVE_CHECK_DIR", "CVE_CHECK_SUMMARY_DIR", "CVE_CHECK_SUMMARY_FILE_NAME_JSON"])
-        report_json = os.path.join(vars["CVE_CHECK_SUMMARY_DIR"], vars["CVE_CHECK_SUMMARY_FILE_NAME_JSON"])
-        print(report_json)
-        try:
-            os.remove(report_json)
-        except FileNotFoundError:
-            pass
-
-        bitbake("core-image-minimal-initramfs")
-        self.assertExists(report_json)
-
-        # Check that the summary report lists at least one package
-        with open(report_json) as f:
-            report = json.load(f)
-        self.assertEqual(report["version"], "1")
-        self.assertGreater(len(report["package"]), 1)
-
-        # Check that a random recipe wrote a recipe report to deploy/cve/
-        recipename = report["package"][0]["name"]
-        recipe_report = os.path.join(vars["CVE_CHECK_DIR"], recipename + "_cve.json")
-        self.assertExists(recipe_report)
-        with open(recipe_report) as f:
-            report = json.load(f)
-        self.assertEqual(report["version"], "1")
-        self.assertEqual(len(report["package"]), 1)
-        self.assertEqual(report["package"][0]["name"], recipename)
-
-
-    def test_recipe_report_json_unpatched(self):
-        config = """
-INHERIT += "cve-check"
-CVE_CHECK_FORMAT_JSON = "1"
-CVE_CHECK_REPORT_PATCHED = "0"
-"""
-        self.write_config(config)
-
-        vars = get_bb_vars(["CVE_CHECK_SUMMARY_DIR", "CVE_CHECK_SUMMARY_FILE_NAME_JSON"])
-        summary_json = os.path.join(vars["CVE_CHECK_SUMMARY_DIR"], vars["CVE_CHECK_SUMMARY_FILE_NAME_JSON"])
-        recipe_json = os.path.join(vars["CVE_CHECK_SUMMARY_DIR"], "m4-native_cve.json")
-
-        try:
-            os.remove(summary_json)
-            os.remove(recipe_json)
-        except FileNotFoundError:
-            pass
-
-        bitbake("m4-native -c cve_check")
-
-        def check_m4_json(filename):
-            with open(filename) as f:
-                report = json.load(f)
-            self.assertEqual(report["version"], "1")
-            self.assertEqual(len(report["package"]), 1)
-            package = report["package"][0]
-            self.assertEqual(package["name"], "m4-native")
-            #m4 had only Patched CVEs, so the issues array will be empty
-            self.assertEqual(package["issue"], [])
-
-        self.assertExists(summary_json)
-        check_m4_json(summary_json)
-        self.assertExists(recipe_json)
-        check_m4_json(recipe_json)
-
-
-    def test_recipe_report_json_ignored(self):
-        config = """
-INHERIT += "cve-check"
-CVE_CHECK_FORMAT_JSON = "1"
-CVE_CHECK_REPORT_PATCHED = "1"
-"""
-        self.write_config(config)
-
-        vars = get_bb_vars(["CVE_CHECK_SUMMARY_DIR", "CVE_CHECK_SUMMARY_FILE_NAME_JSON"])
-        summary_json = os.path.join(vars["CVE_CHECK_SUMMARY_DIR"], vars["CVE_CHECK_SUMMARY_FILE_NAME_JSON"])
-        recipe_json = os.path.join(vars["CVE_CHECK_SUMMARY_DIR"], "logrotate_cve.json")
-
-        try:
-            os.remove(summary_json)
-            os.remove(recipe_json)
-        except FileNotFoundError:
-            pass
-
-        bitbake("logrotate -c cve_check")
-
-        def check_m4_json(filename):
-            with open(filename) as f:
-                report = json.load(f)
-            self.assertEqual(report["version"], "1")
-            self.assertEqual(len(report["package"]), 1)
-            package = report["package"][0]
-            self.assertEqual(package["name"], "logrotate")
-            found_cves = {}
-            for issue in package["issue"]:
-                found_cves[issue["id"]] = {
-                    "status" : issue["status"],
-                    "detail" : issue["detail"] if "detail" in issue else "",
-                    "description" : issue["description"] if "description" in issue else ""
-                }
-            # m4 CVE should not be in logrotate
-            self.assertNotIn("CVE-2008-1687", found_cves)
-            # logrotate has both Patched and Ignored CVEs
-            detail = "version-not-in-range"
-            self.assertIn("CVE-2011-1098", found_cves)
-            self.assertEqual(found_cves["CVE-2011-1098"]["status"], "Patched")
-            self.assertEqual(found_cves["CVE-2011-1098"]["detail"], detail)
-            self.assertEqual(len(found_cves["CVE-2011-1098"]["description"]), 0)
-            detail = "not-applicable-platform"
-            description = "CVE is debian, gentoo or SUSE specific on the way logrotate was installed/used"
-            self.assertIn("CVE-2011-1548", found_cves)
-            self.assertEqual(found_cves["CVE-2011-1548"]["status"], "Ignored")
-            self.assertEqual(found_cves["CVE-2011-1548"]["detail"], detail)
-            self.assertEqual(found_cves["CVE-2011-1548"]["description"], description)
-            self.assertIn("CVE-2011-1549", found_cves)
-            self.assertEqual(found_cves["CVE-2011-1549"]["status"], "Ignored")
-            self.assertEqual(found_cves["CVE-2011-1549"]["detail"], detail)
-            self.assertEqual(found_cves["CVE-2011-1549"]["description"], description)
-            self.assertIn("CVE-2011-1550", found_cves)
-            self.assertEqual(found_cves["CVE-2011-1550"]["status"], "Ignored")
-            self.assertEqual(found_cves["CVE-2011-1550"]["detail"], detail)
-            self.assertEqual(found_cves["CVE-2011-1550"]["description"], description)
-
-        self.assertExists(summary_json)
-        check_m4_json(summary_json)
-        self.assertExists(recipe_json)
-        check_m4_json(recipe_json)
diff --git a/meta/recipes-core/meta/cve-update-db-native.bb b/meta/recipes-core/meta/cve-update-db-native.bb
deleted file mode 100644
index 01f942dcdbf..00000000000
--- a/meta/recipes-core/meta/cve-update-db-native.bb
+++ /dev/null
@@ -1,421 +0,0 @@
-SUMMARY = "Updates the NVD CVE database"
-LICENSE = "MIT"
-
-INHIBIT_DEFAULT_DEPS = "1"
-
-inherit native
-
-deltask do_patch
-deltask do_configure
-deltask do_compile
-deltask do_install
-deltask do_populate_sysroot
-
-NVDCVE_URL ?= "https://nvd.nist.gov/feeds/json/cve/1.1/nvdcve-1.1-"
-FKIE_URL ?= "https://github.com/fkie-cad/nvd-json-data-feeds/releases/latest/download/CVE-"
-
-# CVE database update interval, in seconds. By default: once a day (23*60*60).
-# Use 0 to force the update
-# Use a negative value to skip the update
-CVE_DB_UPDATE_INTERVAL ?= "82800"
-
-# Timeout for blocking socket operations, such as the connection attempt.
-CVE_SOCKET_TIMEOUT ?= "60"
-
-CVE_CHECK_DB_DLDIR_FILE ?= "${DL_DIR}/CVE_CHECK2/${CVE_CHECK_DB_FILENAME}"
-CVE_CHECK_DB_DLDIR_LOCK ?= "${CVE_CHECK_DB_DLDIR_FILE}.lock"
-CVE_CHECK_DB_TEMP_FILE ?= "${CVE_CHECK_DB_FILE}.tmp"
-
-python () {
-    if not bb.data.inherits_class("cve-check", d):
-        raise bb.parse.SkipRecipe("Skip recipe when cve-check class is not loaded.")
-}
-
-python do_fetch() {
-    """
-    Update NVD database with json data feed
-    """
-    import bb.utils
-    import bb.progress
-    import shutil
-
-    bb.utils.export_proxies(d)
-
-    db_file = d.getVar("CVE_CHECK_DB_DLDIR_FILE")
-    db_dir = os.path.dirname(db_file)
-    db_tmp_file = d.getVar("CVE_CHECK_DB_TEMP_FILE")
-
-    cleanup_db_download(db_tmp_file)
-
-    # The NVD database changes once a day, so no need to update more frequently
-    # Allow the user to force-update
-    try:
-        import time
-        update_interval = int(d.getVar("CVE_DB_UPDATE_INTERVAL"))
-        if update_interval < 0:
-            bb.note("CVE database update skipped")
-            if not os.path.exists(db_file):
-                bb.error("CVE database %s not present, database fetch/update skipped" % db_file)
-            return
-        curr_time = time.time()
-        database_time = os.path.getmtime(db_file)
-        bb.note("Current time: %s; DB time: %s" % (time.ctime(curr_time), time.ctime(database_time)))
-        if curr_time < database_time:
-            bb.warn("Database time is in the future, force DB update")
-        elif curr_time - database_time < update_interval:
-            bb.note("CVE database recently updated, skipping")
-            return
-
-    except OSError:
-        pass
-
-    if bb.utils.to_boolean(d.getVar("BB_NO_NETWORK")):
-        bb.error("BB_NO_NETWORK attempted to disable fetch, this recipe uses CVE_DB_UPDATE_INTERVAL to control download, set to '-1' to disable fetch or update")
-
-    bb.utils.mkdirhier(db_dir)
-    bb.utils.mkdirhier(os.path.dirname(db_tmp_file))
-    if os.path.exists(db_file):
-        shutil.copy2(db_file, db_tmp_file)
-
-    if update_db_file(db_tmp_file, d):
-        # Update downloaded correctly, we can swap files. To avoid potential
-        # NFS caching issues, ensure that the destination file has a new inode
-        # number. We do this in two steps as the downloads directory may be on
-        # a different filesystem to tmpdir we're working in.
-        new_file = "%s.new" % (db_file)
-        shutil.move(db_tmp_file, new_file)
-        os.rename(new_file, db_file)
-    else:
-        # Update failed, do not modify the database
-        bb.warn("CVE database update failed")
-        os.remove(db_tmp_file)
-}
-
-do_fetch[lockfiles] += "${CVE_CHECK_DB_DLDIR_LOCK}"
-do_fetch[file-checksums] = ""
-do_fetch[vardeps] = ""
-
-python do_unpack() {
-    import shutil
-    shutil.copyfile(d.getVar("CVE_CHECK_DB_DLDIR_FILE"), d.getVar("CVE_CHECK_DB_FILE"))
-}
-do_unpack[lockfiles] += "${CVE_CHECK_DB_DLDIR_LOCK} ${CVE_CHECK_DB_FILE_LOCK}"
-
-def cleanup_db_download(db_tmp_file):
-    """
-    Cleanup the download space from possible failed downloads
-    """
-
-    # Clean up the temporary file downloads; we can remove both the journal
-    # and the temporary database
-    if os.path.exists("{0}-journal".format(db_tmp_file)):
-        os.remove("{0}-journal".format(db_tmp_file))
-    if os.path.exists(db_tmp_file):
-        os.remove(db_tmp_file)
-
-def db_file_names(d, year, is_nvd):
-    if is_nvd:
-        year_url = d.getVar('NVDCVE_URL') + str(year)
-        meta_url = year_url + ".meta"
-        json_url = year_url + ".json.gz"
-        return json_url, meta_url
-    year_url = d.getVar('FKIE_URL') + str(year)
-    meta_url = year_url + ".meta"
-    json_url = year_url + ".json.xz"
-    return json_url, meta_url
-
-def host_db_name(d, is_nvd):
-    if is_nvd:
-        return "nvd.nist.gov"
-    return "github.com"
-
-def db_decompress(d, data, is_nvd):
-    import gzip, lzma
-
-    if is_nvd:
-        return gzip.decompress(data).decode('utf-8')
-    # otherwise
-    return lzma.decompress(data)
-
-def update_db_file(db_tmp_file, d):
-    """
-    Update the given database file
-    """
-    import bb.progress
-    import bb.utils
-    from datetime import date
-    import sqlite3
-    import urllib
-
-    YEAR_START = 2002
-    cve_socket_timeout = int(d.getVar("CVE_SOCKET_TIMEOUT"))
-    is_nvd = d.getVar("NVD_DB_VERSION") == "NVD1"
-
-    # Connect to database
-    conn = sqlite3.connect(db_tmp_file)
-    initialize_db(conn)
-
-    with bb.progress.ProgressHandler(d) as ph, open(os.path.join(d.getVar("TMPDIR"), 'cve_check'), 'a') as cve_f:
-        total_years = date.today().year + 1 - YEAR_START
-        for i, year in enumerate(range(YEAR_START, date.today().year + 1)):
-            bb.note("Updating %d" % year)
-            ph.update((float(i + 1) / total_years) * 100)
-            json_url, meta_url = db_file_names(d, year, is_nvd)
-
-            # Retrieve meta last modified date
-            try:
-                response = urllib.request.urlopen(meta_url, timeout=cve_socket_timeout)
-            except urllib.error.URLError as e:
-                cve_f.write('Warning: CVE db update error, Unable to fetch CVE data.\n\n')
-                bb.warn("Failed to fetch CVE data (%s)" % e)
-                import socket
-                result = socket.getaddrinfo(host_db_name(d, is_nvd), 443, proto=socket.IPPROTO_TCP)
-                bb.warn("Host IPs are %s" % (", ".join(t[4][0] for t in result)))
-                return False
-
-            if response:
-                for line in response.read().decode("utf-8").splitlines():
-                    key, value = line.split(":", 1)
-                    if key == "lastModifiedDate":
-                        last_modified = value
-                        break
-                else:
-                    bb.warn("Cannot parse CVE metadata, update failed")
-                    return False
-
-            # Compare with current db last modified date
-            cursor = conn.execute("select DATE from META where YEAR = ?", (year,))
-            meta = cursor.fetchone()
-            cursor.close()
-
-            if not meta or meta[0] != last_modified:
-                bb.note("Updating entries")
-                # Clear products table entries corresponding to current year
-                conn.execute("delete from PRODUCTS where ID like ?", ('CVE-%d%%' % year,)).close()
-
-                # Update db with current year json file
-                try:
-                    response = urllib.request.urlopen(json_url, timeout=cve_socket_timeout)
-                    if response:
-                        update_db(d, conn, db_decompress(d, response.read(), is_nvd))
-                    conn.execute("insert or replace into META values (?, ?)", [year, last_modified]).close()
-                except urllib.error.URLError as e:
-                    cve_f.write('Warning: CVE db update error, CVE data is outdated.\n\n')
-                    bb.warn("Cannot parse CVE data (%s), update failed" % e.reason)
-                    return False
-            else:
-                bb.debug(2, "Already up to date (last modified %s)" % last_modified)
-            # Update success, set the date to cve_check file.
-            if year == date.today().year:
-                cve_f.write('CVE database update : %s\n\n' % date.today())
-
-        conn.commit()
-        conn.close()
-        return True
-
-def initialize_db(conn):
-    with conn:
-        c = conn.cursor()
-
-        c.execute("CREATE TABLE IF NOT EXISTS META (YEAR INTEGER UNIQUE, DATE TEXT)")
-
-        c.execute("CREATE TABLE IF NOT EXISTS NVD (ID TEXT UNIQUE, SUMMARY TEXT, \
-            SCOREV2 TEXT, SCOREV3 TEXT, SCOREV4 TEXT, MODIFIED INTEGER, VECTOR TEXT, VECTORSTRING TEXT)")
-
-        c.execute("CREATE TABLE IF NOT EXISTS PRODUCTS (ID TEXT, \
-            VENDOR TEXT, PRODUCT TEXT, VERSION_START TEXT, OPERATOR_START TEXT, \
-            VERSION_END TEXT, OPERATOR_END TEXT)")
-        c.execute("CREATE INDEX IF NOT EXISTS PRODUCT_ID_IDX on PRODUCTS(ID);")
-
-        c.close()
-
-def parse_node_and_insert(conn, node, cveId, is_nvd):
-    # Parse children node if needed
-    for child in node.get('children', ()):
-        parse_node_and_insert(conn, child, cveId, is_nvd)
-
-    def cpe_generator(is_nvd):
-        match_string = "cpeMatch"
-        cpe_string = 'criteria'
-        if is_nvd:
-            match_string = "cpe_match"
-            cpe_string = 'cpe23Uri'
-
-        for cpe in node.get(match_string, ()):
-            if not cpe['vulnerable']:
-                return
-            cpe23 = cpe.get(cpe_string)
-            if not cpe23:
-                return
-            cpe23 = cpe23.split(':')
-            if len(cpe23) < 6:
-                return
-            vendor = cpe23[3]
-            product = cpe23[4]
-            version = cpe23[5]
-
-            if cpe23[6] == '*' or cpe23[6] == '-':
-                version_suffix = ""
-            else:
-                version_suffix = "_" + cpe23[6]
-
-            if version != '*' and version != '-':
-                # Version is defined, this is a '=' match
-                yield [cveId, vendor, product, version + version_suffix, '=', '', '']
-            elif version == '-':
-                # no version information is available
-                yield [cveId, vendor, product, version, '', '', '']
-            else:
-                # Parse start version, end version and operators
-                op_start = ''
-                op_end = ''
-                v_start = ''
-                v_end = ''
-
-                if 'versionStartIncluding' in cpe:
-                    op_start = '>='
-                    v_start = cpe['versionStartIncluding']
-
-                if 'versionStartExcluding' in cpe:
-                    op_start = '>'
-                    v_start = cpe['versionStartExcluding']
-
-                if 'versionEndIncluding' in cpe:
-                    op_end = '<='
-                    v_end = cpe['versionEndIncluding']
-
-                if 'versionEndExcluding' in cpe:
-                    op_end = '<'
-                    v_end = cpe['versionEndExcluding']
-
-                if op_start or op_end or v_start or v_end:
-                    yield [cveId, vendor, product, v_start, op_start, v_end, op_end]
-                else:
-                    # There is no version information, expressed differently.
-                    # Save processing by representing it as -.
-                    yield [cveId, vendor, product, '-', '', '', '']
-
-    conn.executemany("insert into PRODUCTS values (?, ?, ?, ?, ?, ?, ?)", cpe_generator(is_nvd)).close()
-
-def update_db_nvdjson(conn, jsondata):
-    import json
-    root = json.loads(jsondata)
-
-    for elt in root['CVE_Items']:
-        if not elt['impact']:
-            continue
-
-        accessVector = None
-        vectorString = None
-        cvssv2 = 0.0
-        cvssv3 = 0.0
-        cvssv4 = 0.0
-        cveId = elt['cve']['CVE_data_meta']['ID']
-        cveDesc = elt['cve']['description']['description_data'][0]['value']
-        date = elt['lastModifiedDate']
-        try:
-            accessVector = elt['impact']['baseMetricV2']['cvssV2']['accessVector']
-            vectorString = elt['impact']['baseMetricV2']['cvssV2']['vectorString']
-            cvssv2 = elt['impact']['baseMetricV2']['cvssV2']['baseScore']
-        except KeyError:
-            cvssv2 = 0.0
-        try:
-            accessVector = accessVector or elt['impact']['baseMetricV3']['cvssV3']['attackVector']
-            vectorString = vectorString or elt['impact']['baseMetricV3']['cvssV3']['vectorString']
-            cvssv3 = elt['impact']['baseMetricV3']['cvssV3']['baseScore']
-        except KeyError:
-            accessVector = accessVector or "UNKNOWN"
-            cvssv3 = 0.0
-
-        conn.execute("insert or replace into NVD values (?, ?, ?, ?, ?, ?, ?, ?)",
-                [cveId, cveDesc, cvssv2, cvssv3, cvssv4, date, accessVector, vectorString]).close()
-
-        configurations = elt['configurations']['nodes']
-        for config in configurations:
-            parse_node_and_insert(conn, config, cveId, True)
-
-def get_metric_entry(metric):
-    primaries = [c for c in metric if c['type'] == "Primary"]
-    secondaries = [c for c in metric if c['type'] == "Secondary"]
-    if len(primaries) > 0:
-        return primaries[0]
-    elif len(secondaries) > 0:
-        return secondaries[0]
-    return None
-
-def update_db_fkie(conn, jsondata):
-    import json
-    root = json.loads(jsondata)
-
-    for elt in root['cve_items']:
-        if 'vulnStatus' not in elt or elt['vulnStatus'] == 'Rejected':
-            continue
-
-        if 'configurations' not in elt:
-            continue
-
-        accessVector = None
-        vectorString = None
-        cvssv2 = 0.0
-        cvssv3 = 0.0
-        cvssv4 = 0.0
-        cveId = elt['id']
-        cveDesc = elt['descriptions'][0]['value']
-        date = elt['lastModified']
-        try:
-            if 'cvssMetricV2' in elt['metrics']:
-                entry = get_metric_entry(elt['metrics']['cvssMetricV2'])
-                if entry:
-                    accessVector = entry['cvssData']['accessVector']
-                    vectorString = entry['cvssData']['vectorString']
-                    cvssv2 = entry['cvssData']['baseScore']
-        except KeyError:
-            cvssv2 = 0.0
-        try:
-            if 'cvssMetricV30' in elt['metrics']:
-                entry = get_metric_entry(elt['metrics']['cvssMetricV30'])
-                if entry:
-                    accessVector = entry['cvssData']['attackVector']
-                    vectorString = entry['cvssData']['vectorString']
-                    cvssv3 = entry['cvssData']['baseScore']
-        except KeyError:
-            accessVector = accessVector or "UNKNOWN"
-            cvssv3 = 0.0
-        try:
-            if 'cvssMetricV31' in elt['metrics']:
-                entry = get_metric_entry(elt['metrics']['cvssMetricV31'])
-                if entry:
-                    accessVector = entry['cvssData']['attackVector']
-                    vectorString = entry['cvssData']['vectorString']
-                    cvssv3 = entry['cvssData']['baseScore']
-        except KeyError:
-            accessVector = accessVector or "UNKNOWN"
-            cvssv3 = 0.0
-        try:
-            if 'cvssMetricV40' in elt['metrics']:
-                entry = get_metric_entry(elt['metrics']['cvssMetricV40'])
-                if entry:
-                    accessVector = entry['cvssData']['attackVector']
-                    vectorString = entry['cvssData']['vectorString']
-                    cvssv4 = entry['cvssData']['baseScore']
-        except KeyError:
-            accessVector = accessVector or "UNKNOWN"
-            cvssv4 = 0.0
-
-        conn.execute("insert or replace into NVD values (?, ?, ?, ?, ?, ?, ?, ?)",
-                [cveId, cveDesc, cvssv2, cvssv3, cvssv4, date, accessVector, vectorString]).close()
-
-        for config in elt['configurations']:
-            # This is suboptimal as it doesn't handle AND/OR and negate, but is better than nothing
-            for node in config.get("nodes") or []:
-                parse_node_and_insert(conn, node, cveId, False)
-
-def update_db(d, conn, jsondata):
-    if (d.getVar("NVD_DB_VERSION") == "FKIE"):
-        return update_db_fkie(conn, jsondata)
-    else:
-        return update_db_nvdjson(conn, jsondata)
-
-do_fetch[nostamp] = "1"
-
-EXCLUDE_FROM_WORLD = "1"
diff --git a/meta/recipes-core/meta/cve-update-nvd2-native.bb b/meta/recipes-core/meta/cve-update-nvd2-native.bb
deleted file mode 100644
index 41c34ba0d01..00000000000
--- a/meta/recipes-core/meta/cve-update-nvd2-native.bb
+++ /dev/null
@@ -1,422 +0,0 @@
-SUMMARY = "Updates the NVD CVE database"
-LICENSE = "MIT"
-
-# Important note:
-# This product uses the NVD API but is not endorsed or certified by the NVD.
-
-INHIBIT_DEFAULT_DEPS = "1"
-
-inherit native
-
-deltask do_patch
-deltask do_configure
-deltask do_compile
-deltask do_install
-deltask do_populate_sysroot
-
-NVDCVE_URL ?= "https://services.nvd.nist.gov/rest/json/cves/2.0"
-
-# If you have an NVD API key (https://nvd.nist.gov/developers/request-an-api-key)
-# then set this to get higher rate limits.
-NVDCVE_API_KEY ?= ""
-
-# CVE database update interval, in seconds. By default: once a day (23*60*60).
-# Use 0 to force the update
-# Use a negative value to skip the update
-CVE_DB_UPDATE_INTERVAL ?= "82800"
-
-# CVE database incremental update age threshold, in seconds. If the database is
-# older than this threshold, do a full re-download, else, do an incremental
-# update. By default: the maximum allowed value from NVD: 120 days (120*24*60*60)
-# Use 0 to force a full download.
-CVE_DB_INCR_UPDATE_AGE_THRES ?= "10368000"
-
-# Number of attempts for each HTTP query to the NVD server before giving up
-CVE_DB_UPDATE_ATTEMPTS ?= "5"
-
-CVE_CHECK_DB_DLDIR_FILE ?= "${DL_DIR}/CVE_CHECK2/${CVE_CHECK_DB_FILENAME}"
-CVE_CHECK_DB_DLDIR_LOCK ?= "${CVE_CHECK_DB_DLDIR_FILE}.lock"
-CVE_CHECK_DB_TEMP_FILE ?= "${CVE_CHECK_DB_FILE}.tmp"
-
-python () {
-    if not bb.data.inherits_class("cve-check", d):
-        raise bb.parse.SkipRecipe("Skip recipe when cve-check class is not loaded.")
-}
-
-python do_fetch() {
-    """
-    Update NVD database with API 2.0
-    """
-    import bb.utils
-    import bb.progress
-    import shutil
-
-    bb.utils.export_proxies(d)
-
-    db_file = d.getVar("CVE_CHECK_DB_DLDIR_FILE")
-    db_dir = os.path.dirname(db_file)
-    db_tmp_file = d.getVar("CVE_CHECK_DB_TEMP_FILE")
-
-    cleanup_db_download(db_tmp_file)
-    # By default let's update the whole database (since time 0)
-    database_time = 0
-
-    # The NVD database changes once a day, so no need to update more frequently
-    # Allow the user to force-update
-    try:
-        import time
-        update_interval = int(d.getVar("CVE_DB_UPDATE_INTERVAL"))
-        if update_interval < 0:
-            bb.note("CVE database update skipped")
-            if not os.path.exists(db_file):
-                bb.error("CVE database %s not present, database fetch/update skipped" % db_file)
-            return
-        curr_time = time.time()
-        database_time = os.path.getmtime(db_file)
-        bb.note("Current time: %s; DB time: %s" % (time.ctime(curr_time), time.ctime(database_time)))
-        if curr_time < database_time:
-            bb.warn("Database time is in the future, force DB update")
-            database_time = 0
-        elif curr_time - database_time < update_interval:
-            bb.note("CVE database recently updated, skipping")
-            return
-
-    except OSError:
-        pass
-
-    if bb.utils.to_boolean(d.getVar("BB_NO_NETWORK")):
-        bb.error("BB_NO_NETWORK attempted to disable fetch, this recipe uses CVE_DB_UPDATE_INTERVAL to control download, set to '-1' to disable fetch or update")
-
-    bb.utils.mkdirhier(db_dir)
-    bb.utils.mkdirhier(os.path.dirname(db_tmp_file))
-    if os.path.exists(db_file):
-        shutil.copy2(db_file, db_tmp_file)
-
-    if update_db_file(db_tmp_file, d, database_time):
-        # Update downloaded correctly, we can swap files. To avoid potential
-        # NFS caching issues, ensure that the destination file has a new inode
-        # number. We do this in two steps as the downloads directory may be on
-        # a different filesystem to tmpdir we're working in.
-        new_file = "%s.new" % (db_file)
-        shutil.move(db_tmp_file, new_file)
-        os.rename(new_file, db_file)
-    else:
-        # Update failed, do not modify the database
-        bb.warn("CVE database update failed")
-        os.remove(db_tmp_file)
-}
-
-do_fetch[lockfiles] += "${CVE_CHECK_DB_DLDIR_LOCK}"
-do_fetch[file-checksums] = ""
-do_fetch[vardeps] = ""
-
-python do_unpack() {
-    import shutil
-    shutil.copyfile(d.getVar("CVE_CHECK_DB_DLDIR_FILE"), d.getVar("CVE_CHECK_DB_FILE"))
-}
-do_unpack[lockfiles] += "${CVE_CHECK_DB_DLDIR_LOCK} ${CVE_CHECK_DB_FILE_LOCK}"
-
-def cleanup_db_download(db_tmp_file):
-    """
-    Cleanup the download space from possible failed downloads
-    """
-
-    # Clean-up the temporary file downloads, we can remove both journal
-    # and the temporary database
-    if os.path.exists("{0}-journal".format(db_tmp_file)):
-        os.remove("{0}-journal".format(db_tmp_file))
-    if os.path.exists(db_tmp_file):
-        os.remove(db_tmp_file)
-
-def nvd_request_wait(attempt, min_wait):
-    return min(((2 * attempt) + min_wait), 30)
-
-def nvd_request_next(url, attempts, api_key, args, min_wait):
-    """
-    Request next part of the NVD database
-    NVD API documentation: https://nvd.nist.gov/developers/vulnerabilities
-    """
-
-    import urllib.request
-    import urllib.parse
-    import gzip
-    import http
-    import time
-
-    request = urllib.request.Request(url + "?" + urllib.parse.urlencode(args))
-    if api_key:
-        request.add_header("apiKey", api_key)
-    bb.note("Requesting %s" % request.full_url)
-
-    for attempt in range(attempts):
-        try:
-            r = urllib.request.urlopen(request)
-
-            if (r.headers['content-encoding'] == 'gzip'):
-                buf = r.read()
-                raw_data = gzip.decompress(buf)
-            else:
-                raw_data = r.read().decode("utf-8")
-
-            r.close()
-
-        except Exception as e:
-            wait_time = nvd_request_wait(attempt, min_wait)
-            bb.note("CVE database: received error (%s)" % (e))
-            bb.note("CVE database: retrying download after %d seconds. attempted (%d/%d)" % (wait_time, attempt+1, attempts))
-            time.sleep(wait_time)
-            pass
-        else:
-            return raw_data
-    else:
-        # We failed at all attempts
-        return None
-
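The retry policy that `nvd_request_wait()`/`nvd_request_next()` above implement can be sketched in isolation; `fetch_with_retries` below is our own illustrative name (not part of the recipe), and sleeping is stubbed out so the logic is testable:

```python
def nvd_request_wait(attempt, min_wait):
    # Linear backoff as in the recipe above, capped at 30 seconds
    return min(((2 * attempt) + min_wait), 30)

def fetch_with_retries(fetch, attempts, min_wait, sleep=lambda s: None):
    """Call fetch() up to `attempts` times, backing off between failures.

    Returns the fetched data, or None if every attempt raised."""
    for attempt in range(attempts):
        try:
            return fetch()
        except Exception:
            # A real caller passes time.sleep here
            sleep(nvd_request_wait(attempt, min_wait))
    return None
```

For context, the recipe uses a base wait of 2 seconds when an API key is set and 6 seconds otherwise, per NVD's recommendation.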
-def update_db_file(db_tmp_file, d, database_time):
-    """
-    Update the given database file
-    """
-    import bb.progress
-    import bb.utils
-    import datetime
-    import sqlite3
-    import json
-    import time
-
-    # Connect to database
-    conn = sqlite3.connect(db_tmp_file)
-    initialize_db(conn)
-
-    req_args = {'startIndex': 0}
-
-    incr_update_threshold = int(d.getVar("CVE_DB_INCR_UPDATE_AGE_THRES"))
-    if database_time != 0:
-        database_date = datetime.datetime.fromtimestamp(database_time, tz=datetime.timezone.utc)
-        today_date = datetime.datetime.now(tz=datetime.timezone.utc)
-        delta = today_date - database_date
-        if incr_update_threshold == 0:
-            bb.note("CVE database: forced full update")
-        elif delta < datetime.timedelta(seconds=incr_update_threshold):
-            bb.note("CVE database: performing partial update")
-            # The maximum range for time is 120 days
-            if delta > datetime.timedelta(days=120):
-                bb.error("CVE database: Trying to do an incremental update on a larger than supported range")
-            req_args['lastModStartDate'] = database_date.isoformat()
-            req_args['lastModEndDate'] = today_date.isoformat()
-        else:
-            bb.note("CVE database: file too old, forcing a full update")
-    else:
-        bb.note("CVE database: no preexisting database, do a full download")
-
-    with bb.progress.ProgressHandler(d) as ph, open(os.path.join(d.getVar("TMPDIR"), 'cve_check'), 'a') as cve_f:
-
-        bb.note("Updating entries")
-        index = 0
-        url = d.getVar("NVDCVE_URL")
-        api_key = d.getVar("NVDCVE_API_KEY") or None
-        attempts = int(d.getVar("CVE_DB_UPDATE_ATTEMPTS"))
-
-        # Recommended by NVD
-        wait_time = 6
-        if api_key:
-            wait_time = 2
-
-        while True:
-            req_args['startIndex'] = index
-            raw_data = nvd_request_next(url, attempts, api_key, req_args, wait_time)
-            if raw_data is None:
-                # We haven't managed to download data
-                return False
-
-            # hack for json5 style responses
-            if raw_data[-3:] == ',]}':
-                bb.note("Removing trailing ',' from nvd response")
-                raw_data = raw_data[:-3] + ']}'
-
-            data = json.loads(raw_data)
-
-            index = data["startIndex"]
-            total = data["totalResults"]
-            per_page = data["resultsPerPage"]
-            bb.note("Got %d entries" % per_page)
-            for cve in data["vulnerabilities"]:
-                update_db(conn, cve)
-
-            index += per_page
-            ph.update((float(index) / (total+1)) * 100)
-            if index >= total:
-                break
-
-            # Recommended by NVD
-            time.sleep(wait_time)
-
-        # Update succeeded, record the date in the cve_check file.
-        cve_f.write('CVE database update : %s\n\n' % datetime.date.today())
-
-    conn.commit()
-    conn.close()
-    return True
-
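The paging loop in `update_db_file()` above walks the API using `startIndex`, `resultsPerPage` and `totalResults`; the index arithmetic can be sketched on its own (the function name is ours, and this is simplified: the recipe necessarily issues its first request before it knows `totalResults`):

```python
def page_indices(total, per_page):
    """Yield the startIndex values the NVD 2.0 paging loop requests."""
    index = 0
    while index < total:
        yield index
        index += per_page
```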
-def initialize_db(conn):
-    with conn:
-        c = conn.cursor()
-
-        c.execute("CREATE TABLE IF NOT EXISTS META (YEAR INTEGER UNIQUE, DATE TEXT)")
-
-        c.execute("CREATE TABLE IF NOT EXISTS NVD (ID TEXT UNIQUE, SUMMARY TEXT, \
-            SCOREV2 TEXT, SCOREV3 TEXT, SCOREV4 TEXT, MODIFIED INTEGER, VECTOR TEXT, VECTORSTRING TEXT)")
-
-        c.execute("CREATE TABLE IF NOT EXISTS PRODUCTS (ID TEXT, \
-            VENDOR TEXT, PRODUCT TEXT, VERSION_START TEXT, OPERATOR_START TEXT, \
-            VERSION_END TEXT, OPERATOR_END TEXT)")
-        c.execute("CREATE INDEX IF NOT EXISTS PRODUCT_ID_IDX on PRODUCTS(ID);")
-
-        c.close()
-
-def parse_node_and_insert(conn, node, cveId):
-
-    def cpe_generator():
-        for cpe in node.get('cpeMatch', ()):
-            if not cpe['vulnerable']:
-                return
-            cpe23 = cpe.get('criteria')
-            if not cpe23:
-                return
-            cpe23 = cpe23.split(':')
-            if len(cpe23) < 6:
-                return
-            vendor = cpe23[3]
-            product = cpe23[4]
-            version = cpe23[5]
-
-            if cpe23[6] == '*' or cpe23[6] == '-':
-                version_suffix = ""
-            else:
-                version_suffix = "_" + cpe23[6]
-
-            if version != '*' and version != '-':
-                # Version is defined, this is a '=' match
-                yield [cveId, vendor, product, version + version_suffix, '=', '', '']
-            elif version == '-':
-                # no version information is available
-                yield [cveId, vendor, product, version, '', '', '']
-            else:
-                # Parse start version, end version and operators
-                op_start = ''
-                op_end = ''
-                v_start = ''
-                v_end = ''
-
-                if 'versionStartIncluding' in cpe:
-                    op_start = '>='
-                    v_start = cpe['versionStartIncluding']
-
-                if 'versionStartExcluding' in cpe:
-                    op_start = '>'
-                    v_start = cpe['versionStartExcluding']
-
-                if 'versionEndIncluding' in cpe:
-                    op_end = '<='
-                    v_end = cpe['versionEndIncluding']
-
-                if 'versionEndExcluding' in cpe:
-                    op_end = '<'
-                    v_end = cpe['versionEndExcluding']
-
-                if op_start or op_end or v_start or v_end:
-                    yield [cveId, vendor, product, v_start, op_start, v_end, op_end]
-                else:
-                    # There is no version information, expressed differently.
-                    # Save processing by representing as -.
-                    yield [cveId, vendor, product, '-', '', '', '']
-
-    conn.executemany("insert into PRODUCTS values (?, ?, ?, ?, ?, ?, ?)", cpe_generator()).close()
-
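The field extraction that `cpe_generator()` above performs on CPE 2.3 criteria strings can be tried standalone; `split_cpe23` is an illustrative helper of ours, not part of the recipe:

```python
def split_cpe23(criteria):
    # cpe:2.3:<part>:<vendor>:<product>:<version>:<update>:...
    fields = criteria.split(':')
    if len(fields) < 7:
        return None
    vendor, product, version, update = fields[3:7]
    # '*' or '-' in the update field means "any"/"n/a": no suffix is appended
    version_suffix = "" if update in ("*", "-") else "_" + update
    return {"vendor": vendor, "product": product,
            "version": version, "suffix": version_suffix}
```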
-def update_db(conn, elt):
-    """
-    Update a single entry in the on-disk database
-    """
-
-    accessVector = None
-    vectorString = None
-    cveId = elt['cve']['id']
-    if elt['cve'].get('vulnStatus') == "Rejected":
-        c = conn.cursor()
-        c.execute("delete from PRODUCTS where ID = ?;", [cveId])
-        c.execute("delete from NVD where ID = ?;", [cveId])
-        c.close()
-        return
-    cveDesc = ""
-    for desc in elt['cve']['descriptions']:
-        if desc['lang'] == 'en':
-            cveDesc = desc['value']
-    date = elt['cve']['lastModified']
-
-    # Extract maximum CVSS scores from all sources (Primary and Secondary)
-    cvssv2 = 0.0
-    try:
-        # Iterate through all cvssMetricV2 entries and find the maximum score
-        for metric in elt['cve']['metrics']['cvssMetricV2']:
-            score = metric['cvssData']['baseScore']
-            if score > cvssv2:
-                cvssv2 = score
-                accessVector = metric['cvssData']['accessVector']
-                vectorString = metric['cvssData']['vectorString']
-    except KeyError:
-        pass
-
-    cvssv3 = 0.0
-    try:
-        # Iterate through all cvssMetricV30 entries and find the maximum score
-        for metric in elt['cve']['metrics']['cvssMetricV30']:
-            score = metric['cvssData']['baseScore']
-            if score > cvssv3:
-                cvssv3 = score
-                accessVector = accessVector or metric['cvssData']['attackVector']
-                vectorString = vectorString or metric['cvssData']['vectorString']
-    except KeyError:
-        pass
-
-    try:
-        # Iterate through all cvssMetricV31 entries and find the maximum score
-        for metric in elt['cve']['metrics']['cvssMetricV31']:
-            score = metric['cvssData']['baseScore']
-            if score > cvssv3:
-                cvssv3 = score
-                accessVector = accessVector or metric['cvssData']['attackVector']
-                vectorString = vectorString or metric['cvssData']['vectorString']
-    except KeyError:
-        pass
-
-    cvssv4 = 0.0
-    try:
-        # Iterate through all cvssMetricV40 entries and find the maximum score
-        for metric in elt['cve']['metrics']['cvssMetricV40']:
-            score = metric['cvssData']['baseScore']
-            if score > cvssv4:
-                cvssv4 = score
-                accessVector = accessVector or metric['cvssData']['attackVector']
-                vectorString = vectorString or metric['cvssData']['vectorString']
-    except KeyError:
-        pass
-
-    accessVector = accessVector or "UNKNOWN"
-    vectorString = vectorString or "UNKNOWN"
-
-    conn.execute("insert or replace into NVD values (?, ?, ?, ?, ?, ?, ?, ?)",
-                [cveId, cveDesc, cvssv2, cvssv3, cvssv4, date, accessVector, vectorString]).close()
-
-    try:
-        # Remove any pre-existing CVE configuration. Even for partial database
-        # update, those will be repopulated. This ensures that old
-        # configuration is not kept for an updated CVE.
-        conn.execute("delete from PRODUCTS where ID = ?", [cveId]).close()
-        for config in elt['cve']['configurations']:
-            # This is suboptimal as it doesn't handle AND/OR and negate, but is better than nothing
-            for node in config["nodes"]:
-                parse_node_and_insert(conn, node, cveId)
-    except KeyError:
-        bb.note("CVE %s has no configurations" % cveId)
-
-do_fetch[nostamp] = "1"
-
-EXCLUDE_FROM_WORLD = "1"
-- 
2.43.0



^ permalink raw reply related	[flat|nested] 12+ messages in thread
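Both removed classes emit JSON reports of the shape `{"version": "1", "package": [...]}` (see `cve_save_summary_handler` and `cve_check_write_rootfs_manifest` in the patch below). Merging such reports outside of BitBake can be sketched as follows; unlike `oe.cve_check.cve_check_merge_jsons`, this simplified version concatenates package entries rather than merging duplicates:

```python
def merge_cve_reports(reports):
    """Naively combine several cve-check JSON reports into one summary,
    sorting packages by name as the deleted class did."""
    summary = {"version": "1", "package": []}
    for report in reports:
        summary["package"].extend(report.get("package", []))
    summary["package"].sort(key=lambda p: p["name"])
    return summary
```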

* RE: [OE-core] [PATCH 3/3] classes/cve-check: remove class
  2026-03-31 13:24 ` [PATCH 3/3] classes/cve-check: remove class Ross Burton
@ 2026-03-31 13:48   ` Marko, Peter
  0 siblings, 0 replies; 12+ messages in thread
From: Marko, Peter @ 2026-03-31 13:48 UTC (permalink / raw)
  To: ross.burton@arm.com, openembedded-core@lists.openembedded.org

Please adapt build scripts on AB before merging this.
Peter

> -----Original Message-----
> From: openembedded-core@lists.openembedded.org <openembedded-
> core@lists.openembedded.org> On Behalf Of Ross Burton via
> lists.openembedded.org
> Sent: Tuesday, March 31, 2026 3:24 PM
> To: openembedded-core@lists.openembedded.org
> Subject: [OE-core] [PATCH 3/3] classes/cve-check: remove class
> 
> It's been long known that the cve-check class in oe-core is not that
> usable in the real world, for more details see "Future of CVE scanning
> in Yocto"[1].  This mail proposed an alternative direction that included
> a CVE scanning tool that can be run both during the build and afterwards,
> so that periodic scans of a previously built image are possible.
> 
> Last year, Bootlin wrote sbom-cve-check[2] and I compared this to my
> proposal in "Comparing cve-check with sbom-cve-check"[3], concluding
> that this is likely the missing piece.
> 
> Support for sbom-cve-check has been merged into oe-core, and the
> cve-check class is now obsolete. So that we don't have to maintain it for
> the four-year lifecycle of the Wrynose release, delete it.
> 
> This patch also deletes the database fetcher recipes, and the test cases
> that were specific to cve-check.  Note that the oe.cve_check library
> still exists as this is used by the SPDX classes.
> 
> [1] https://lore.kernel.org/openembedded-core/7D6E419E-A7AE-4324-966C-3552C586E452@arm.com/
> [2] https://github.com/bootlin/sbom-cve-check
> [3] https://lore.kernel.org/openembedded-core/2CD10DD9-FB2A-4B10-B98A-85918EB6B4B7@arm.com/
> 
> Signed-off-by: Ross Burton <ross.burton@arm.com>
> ---
>  meta/classes/cve-check.bbclass                | 570 ------------------
>  meta/conf/distro/include/maintainers.inc      |   1 -
>  meta/conf/documentation.conf                  |   2 -
>  meta/lib/oeqa/selftest/cases/cve_check.py     | 172 ------
>  .../recipes-core/meta/cve-update-db-native.bb | 421 -------------
>  .../meta/cve-update-nvd2-native.bb            | 422 -------------
>  6 files changed, 1588 deletions(-)
>  delete mode 100644 meta/classes/cve-check.bbclass
>  delete mode 100644 meta/recipes-core/meta/cve-update-db-native.bb
>  delete mode 100644 meta/recipes-core/meta/cve-update-nvd2-native.bb
> 
> diff --git a/meta/classes/cve-check.bbclass b/meta/classes/cve-check.bbclass
> deleted file mode 100644
> index c63ebd56e16..00000000000
> --- a/meta/classes/cve-check.bbclass
> +++ /dev/null
> @@ -1,570 +0,0 @@
> -#
> -# Copyright OpenEmbedded Contributors
> -#
> -# SPDX-License-Identifier: MIT
> -#
> -
> -# This class is used to check recipes against public CVEs.
> -#
> -# In order to use this class just inherit the class in the
> -# local.conf file and it will add the cve_check task for
> -# every recipe. The task can be used per recipe, per image,
> -# or using the special cases "world" and "universe". The
> -# cve_check task will print a warning for every unpatched
> -# CVE found and generate a file in the recipe WORKDIR/cve
> -# directory. If an image is build it will generate a report
> -# in DEPLOY_DIR_IMAGE for all the packages used.
> -#
> -# Example:
> -#   bitbake -c cve_check openssl
> -#   bitbake core-image-sato
> -#   bitbake -k -c cve_check universe
> -#
> -# DISCLAIMER
> -#
> -# This class/tool is meant to be used as support and not
> -# the only method to check against CVEs. Running this tool
> -# doesn't guarantee your packages are free of CVEs.
> -
> -# The product name that the CVE database uses defaults to BPN, but may need to
> -# be overridden per recipe (for example tiff.bb sets CVE_PRODUCT=libtiff).
> -CVE_PRODUCT ??= "${BPN}"
> -CVE_VERSION ??= "${PV}"
> -
> -# Possible database sources: NVD1, NVD2, FKIE
> -NVD_DB_VERSION ?= "FKIE"
> -
> -# Use different file names for each database source, as they synchronize at
> -# different moments, so may be slightly different
> -CVE_CHECK_DB_FILENAME ?= "${@'nvdcve_2-2.db' if d.getVar('NVD_DB_VERSION') == 'NVD2' else 'nvdcve_1-3.db' if d.getVar('NVD_DB_VERSION') == 'NVD1' else 'nvdfkie_1-1.db'}"
> -CVE_CHECK_DB_FETCHER ?= "${@'cve-update-nvd2-native' if d.getVar('NVD_DB_VERSION') == 'NVD2' else 'cve-update-db-native'}"
> -CVE_CHECK_DB_DIR ?= "${STAGING_DIR}/CVE_CHECK"
> -CVE_CHECK_DB_FILE ?= "${CVE_CHECK_DB_DIR}/${CVE_CHECK_DB_FILENAME}"
> -CVE_CHECK_DB_FILE_LOCK ?= "${CVE_CHECK_DB_FILE}.lock"
> -
> -CVE_CHECK_SUMMARY_DIR ?= "${LOG_DIR}/cve"
> -CVE_CHECK_SUMMARY_FILE_NAME ?= "cve-summary"
> -CVE_CHECK_SUMMARY_FILE_NAME_JSON = "cve-summary.json"
> -CVE_CHECK_SUMMARY_INDEX_PATH = "${CVE_CHECK_SUMMARY_DIR}/cve-summary-index.txt"
> -
> -CVE_CHECK_LOG_JSON ?= "${T}/cve.json"
> -
> -CVE_CHECK_DIR ??= "${DEPLOY_DIR}/cve"
> -CVE_CHECK_RECIPE_FILE_JSON ?= "${CVE_CHECK_DIR}/${PN}_cve.json"
> -CVE_CHECK_MANIFEST_JSON_SUFFIX ?= "json"
> -CVE_CHECK_MANIFEST_JSON ?= "${IMGDEPLOYDIR}/${IMAGE_NAME}.${CVE_CHECK_MANIFEST_JSON_SUFFIX}"
> -CVE_CHECK_COPY_FILES ??= "1"
> -CVE_CHECK_CREATE_MANIFEST ??= "1"
> -
> -# Report Patched or Ignored CVEs
> -CVE_CHECK_REPORT_PATCHED ??= "1"
> -
> -CVE_CHECK_SHOW_WARNINGS ??= "1"
> -
> -# Provide JSON output
> -CVE_CHECK_FORMAT_JSON ??= "1"
> -
> -# Check for packages without CVEs (no issues or missing product name)
> -CVE_CHECK_COVERAGE ??= "1"
> -
> -# Skip CVE Check for packages (PN)
> -CVE_CHECK_SKIP_RECIPE ?= ""
> -
> -# Replace NVD DB check status for a given CVE. Each CVE has to be mentioned
> -# separately with optional detail and description for this status.
> -#
> -# CVE_STATUS[CVE-1234-0001] = "not-applicable-platform: Issue only applies on Windows"
> -# CVE_STATUS[CVE-1234-0002] = "fixed-version: Fixed externally"
> -#
> -# Setting the same status and reason for multiple CVEs is possible
> -# via the CVE_STATUS_GROUPS variable.
> -#
> -# CVE_STATUS_GROUPS = "CVE_STATUS_WIN CVE_STATUS_PATCHED"
> -#
> -# CVE_STATUS_WIN = "CVE-1234-0001 CVE-1234-0003"
> -# CVE_STATUS_WIN[status] = "not-applicable-platform: Issue only applies on Windows"
> -# CVE_STATUS_PATCHED = "CVE-1234-0002 CVE-1234-0004"
> -# CVE_STATUS_PATCHED[status] = "fixed-version: Fixed externally"
> -#
> -# All possible CVE statuses can be found in cve-check-map.conf
> -# CVE_CHECK_STATUSMAP[not-applicable-platform] = "Ignored"
> -# CVE_CHECK_STATUSMAP[fixed-version] = "Patched"
> -#
> -# CVE_CHECK_IGNORE is deprecated and CVE_STATUS has to be used instead.
> -# Keep CVE_CHECK_IGNORE until other layers migrate to new variables
> -CVE_CHECK_IGNORE ?= ""
> -
> -# Layers to be excluded
> -CVE_CHECK_LAYER_EXCLUDELIST ??= ""
> -
> -# Layers to be included
> -CVE_CHECK_LAYER_INCLUDELIST ??= ""
> -
> -
> -# set to "alphabetical" for versions using a single alphabetical character as the release increment
> -CVE_VERSION_SUFFIX ??= ""
> -
> -python () {
> -    from oe.cve_check import extend_cve_status
> -    extend_cve_status(d)
> -
> -    nvd_database_type = d.getVar("NVD_DB_VERSION")
> -    if nvd_database_type not in ("NVD1", "NVD2", "FKIE"):
> -        bb.erroronce("Malformed NVD_DB_VERSION, must be one of: NVD1, NVD2, FKIE. Defaulting to NVD2")
> -        d.setVar("NVD_DB_VERSION", "NVD2")
> -}
> -
> -def generate_json_report(d, out_path, link_path):
> -    if os.path.exists(d.getVar("CVE_CHECK_SUMMARY_INDEX_PATH")):
> -        import json
> -        from oe.cve_check import cve_check_merge_jsons, update_symlinks
> -
> -        bb.note("Generating JSON CVE summary")
> -        index_file = d.getVar("CVE_CHECK_SUMMARY_INDEX_PATH")
> -        summary = {"version":"1", "package": []}
> -        with open(index_file) as f:
> -            filename = f.readline()
> -            while filename:
> -                with open(filename.rstrip()) as j:
> -                    data = json.load(j)
> -                    cve_check_merge_jsons(summary, data)
> -                filename = f.readline()
> -
> -        summary["package"].sort(key=lambda d: d['name'])
> -
> -        with open(out_path, "w") as f:
> -            json.dump(summary, f, indent=2)
> -
> -        update_symlinks(out_path, link_path)
> -
> -python cve_save_summary_handler () {
> -    import shutil
> -    import datetime
> -    from oe.cve_check import update_symlinks
> -
> -    cve_summary_name = d.getVar("CVE_CHECK_SUMMARY_FILE_NAME")
> -    cvelogpath = d.getVar("CVE_CHECK_SUMMARY_DIR")
> -    bb.utils.mkdirhier(cvelogpath)
> -
> -    timestamp = datetime.datetime.now().strftime('%Y%m%d%H%M%S')
> -
> -    if d.getVar("CVE_CHECK_FORMAT_JSON") == "1":
> -        json_summary_link_name = os.path.join(cvelogpath, d.getVar("CVE_CHECK_SUMMARY_FILE_NAME_JSON"))
> -        json_summary_name = os.path.join(cvelogpath, "%s-%s.json" % (cve_summary_name, timestamp))
> -        generate_json_report(d, json_summary_name, json_summary_link_name)
> -        bb.plain("Complete CVE JSON report summary created at: %s" % json_summary_link_name)
> -}
> -
> -addhandler cve_save_summary_handler
> -cve_save_summary_handler[eventmask] = "bb.event.BuildCompleted"
> -
> -python do_cve_check () {
> -    """
> -    Check recipe for patched and unpatched CVEs
> -    """
> -    from oe.cve_check import get_patched_cves
> -
> -    with bb.utils.fileslocked([d.getVar("CVE_CHECK_DB_FILE_LOCK")], shared=True):
> -        if os.path.exists(d.getVar("CVE_CHECK_DB_FILE")):
> -            try:
> -                patched_cves = get_patched_cves(d)
> -            except FileNotFoundError:
> -                bb.fatal("Failure in searching patches")
> -            cve_data, status = check_cves(d, patched_cves)
> -            if len(cve_data) or (d.getVar("CVE_CHECK_COVERAGE") == "1" and status):
> -                get_cve_info(d, cve_data)
> -                cve_write_data(d, cve_data, status)
> -        else:
> -            bb.note("No CVE database found, skipping CVE check")
> -
> -}
> -
> -addtask cve_check before do_build
> -do_cve_check[depends] = "${CVE_CHECK_DB_FETCHER}:do_unpack"
> -do_cve_check[nostamp] = "1"
> -
> -python cve_check_cleanup () {
> -    """
> -    Delete the file used to gather all the CVE information.
> -    """
> -    bb.utils.remove(e.data.getVar("CVE_CHECK_SUMMARY_INDEX_PATH"))
> -}
> -
> -addhandler cve_check_cleanup
> -cve_check_cleanup[eventmask] = "bb.event.BuildCompleted"
> -
> -python cve_check_write_rootfs_manifest () {
> -    """
> -    Create CVE manifest when building an image
> -    """
> -
> -    import shutil
> -    import json
> -    from oe.rootfs import image_list_installed_packages
> -    from oe.cve_check import cve_check_merge_jsons, update_symlinks
> -
> -    if d.getVar("CVE_CHECK_COPY_FILES") == "1":
> -        deploy_file_json = d.getVar("CVE_CHECK_RECIPE_FILE_JSON")
> -        if os.path.exists(deploy_file_json):
> -            bb.utils.remove(deploy_file_json)
> -
> -    # Create a list of relevant recipes
> -    recipies = set()
> -    for pkg in list(image_list_installed_packages(d)):
> -        pkg_info = os.path.join(d.getVar('PKGDATA_DIR'),
> -                                'runtime-reverse', pkg)
> -        pkg_data = oe.packagedata.read_pkgdatafile(pkg_info)
> -        recipies.add(pkg_data["PN"])
> -
> -    bb.note("Writing rootfs CVE manifest")
> -    deploy_dir = d.getVar("IMGDEPLOYDIR")
> -    link_name = d.getVar("IMAGE_LINK_NAME")
> -
> -    json_data = {"version":"1", "package": []}
> -    text_data = ""
> -    enable_json = d.getVar("CVE_CHECK_FORMAT_JSON") == "1"
> -
> -    save_pn = d.getVar("PN")
> -
> -    for pkg in recipies:
> -        # To be able to use the CVE_CHECK_RECIPE_FILE_JSON variable we have to evaluate
> -        # it with the different PN names set each time.
> -        d.setVar("PN", pkg)
> -
> -        if enable_json:
> -            pkgfilepath = d.getVar("CVE_CHECK_RECIPE_FILE_JSON")
> -            if os.path.exists(pkgfilepath):
> -                with open(pkgfilepath) as j:
> -                    data = json.load(j)
> -                    cve_check_merge_jsons(json_data, data)
> -
> -    d.setVar("PN", save_pn)
> -
> -    if enable_json:
> -        manifest_name_suffix = d.getVar("CVE_CHECK_MANIFEST_JSON_SUFFIX")
> -        manifest_name = d.getVar("CVE_CHECK_MANIFEST_JSON")
> -
> -        with open(manifest_name, "w") as f:
> -            json.dump(json_data, f, indent=2)
> -
> -        if link_name:
> -            link_path = os.path.join(deploy_dir, "%s.%s" % (link_name, manifest_name_suffix))
> -            update_symlinks(manifest_name, link_path)
> -
> -        bb.plain("Image CVE JSON report stored in: %s" % manifest_name)
> -}
> -
> -ROOTFS_POSTPROCESS_COMMAND:prepend = "${@'cve_check_write_rootfs_manifest ' if d.getVar('CVE_CHECK_CREATE_MANIFEST') == '1' else ''}"
> -do_rootfs[recrdeptask] += "${@'do_cve_check' if d.getVar('CVE_CHECK_CREATE_MANIFEST') == '1' else ''}"
> -do_populate_sdk[recrdeptask] += "${@'do_cve_check' if d.getVar('CVE_CHECK_CREATE_MANIFEST') == '1' else ''}"
> -
> -def cve_is_ignored(d, cve_data, cve):
> -    if cve not in cve_data:
> -        return False
> -    if cve_data[cve]['abbrev-status'] == "Ignored":
> -        return True
> -    return False
> -
> -def cve_is_patched(d, cve_data, cve):
> -    if cve not in cve_data:
> -        return False
> -    if cve_data[cve]['abbrev-status'] == "Patched":
> -        return True
> -    return False
> -
> -def cve_update(d, cve_data, cve, entry):
> -    # If no entry, just add it
> -    if cve not in cve_data:
> -        cve_data[cve] = entry
> -        return
> -    # If we are updating, there might be change in the status
> -    bb.debug(1, "Trying CVE entry update for %s from %s to %s" % (cve, cve_data[cve]['abbrev-status'], entry['abbrev-status']))
> -    if cve_data[cve]['abbrev-status'] == "Unknown":
> -        cve_data[cve] = entry
> -        return
> -    if cve_data[cve]['abbrev-status'] == entry['abbrev-status']:
> -        return
> -    # Update like in {'abbrev-status': 'Patched', 'status': 'version-not-in-range'} to
> {'abbrev-status': 'Unpatched', 'status': 'version-in-range'}
> -    if entry['abbrev-status'] == "Unpatched" and cve_data[cve]['abbrev-status'] ==
> "Patched":
> -        if entry['status'] == "version-in-range" and cve_data[cve]['status'] == "version-
> not-in-range":
> -            # New result from the scan, vulnerable
> -            cve_data[cve] = entry
> -            bb.debug(1, "CVE entry %s update from Patched to Unpatched from the
> scan result" % cve)
> -            return
> -    if entry['abbrev-status'] == "Patched" and cve_data[cve]['abbrev-status'] == "Unpatched":
> -        if entry['status'] == "version-not-in-range" and cve_data[cve]['status'] == "version-in-range":
> -            # Range does not match the scan, but we already have a vulnerable match, ignore
> -            bb.debug(1, "CVE entry %s update from Patched to Unpatched from the scan result - not applying" % cve)
> -            return
> -    # If we have an "Ignored", it has a priority
> -    if cve_data[cve]['abbrev-status'] == "Ignored":
> -        bb.debug(1, "CVE %s not updating because Ignored" % cve)
> -        return
> -    bb.warn("Unhandled CVE entry update for %s from %s to %s" % (cve, cve_data[cve], entry))
> -
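The merge precedence in cve_update() is subtle enough that anyone porting it to external tooling may want it captured; here is a minimal BitBake-free sketch (hypothetical helper name `merge_cve_entry`, same rules as the deleted function):

```python
def merge_cve_entry(cve_data, cve, entry):
    # No existing entry: just record the new one
    if cve not in cve_data:
        cve_data[cve] = entry
        return
    old = cve_data[cve]
    # "Unknown" carries no information, so any concrete status replaces it
    if old["abbrev-status"] == "Unknown":
        cve_data[cve] = entry
        return
    if old["abbrev-status"] == entry["abbrev-status"]:
        return
    # A new vulnerable range match overrides an earlier not-in-range result
    if entry["abbrev-status"] == "Unpatched" and old["abbrev-status"] == "Patched" \
            and entry.get("status") == "version-in-range" \
            and old.get("status") == "version-not-in-range":
        cve_data[cve] = entry
        return
    # The reverse never downgrades an existing vulnerable match
    if entry["abbrev-status"] == "Patched" and old["abbrev-status"] == "Unpatched" \
            and entry.get("status") == "version-not-in-range" \
            and old.get("status") == "version-in-range":
        return
    # An explicit "Ignored" always wins
    if old["abbrev-status"] == "Ignored":
        return
    # Anything else was left untouched (the class warned here)

data = {}
merge_cve_entry(data, "CVE-2024-0001", {"abbrev-status": "Patched", "status": "version-not-in-range"})
merge_cve_entry(data, "CVE-2024-0001", {"abbrev-status": "Unpatched", "status": "version-in-range"})
print(data["CVE-2024-0001"]["abbrev-status"])  # Unpatched
```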
> -def check_cves(d, cve_data):
> -    """
> -    Connect to the NVD database and find unpatched cves.
> -    """
> -    from oe.cve_check import Version, convert_cve_version, decode_cve_status
> -
> -    pn = d.getVar("PN")
> -    real_pv = d.getVar("PV")
> -    suffix = d.getVar("CVE_VERSION_SUFFIX")
> -
> -    cves_status = []
> -    cves_in_recipe = False
> -    # CVE_PRODUCT can contain more than one product (eg. curl/libcurl)
> -    products = d.getVar("CVE_PRODUCT").split()
> -    # If this has been unset then we're not scanning for CVEs here (for example, image recipes)
> -    if not products:
> -        return ([], [])
> -    pv = d.getVar("CVE_VERSION").split("+git")[0]
> -
> -    # If the recipe has been skipped/ignored we return empty lists
> -    if pn in d.getVar("CVE_CHECK_SKIP_RECIPE").split():
> -        bb.note("Recipe has been skipped by cve-check")
> -        return ([], [])
> -
> -    import sqlite3
> -    db_file = d.expand("file:${CVE_CHECK_DB_FILE}?mode=ro")
> -    conn = sqlite3.connect(db_file, uri=True)
> -
> -    # For each of the known product names (e.g. curl has CPEs using curl and libcurl)...
> -    for product in products:
> -        cves_in_product = False
> -        if ":" in product:
> -            vendor, product = product.split(":", 1)
> -        else:
> -            vendor = "%"
> -
> -        # Find all relevant CVE IDs.
> -        cve_cursor = conn.execute("SELECT DISTINCT ID FROM PRODUCTS WHERE PRODUCT IS ? AND VENDOR LIKE ?", (product, vendor))
> -        for cverow in cve_cursor:
> -            cve = cverow[0]
> -
> -            # Write status once only for each product
> -            if not cves_in_product:
> -                cves_status.append([product, True])
> -                cves_in_product = True
> -                cves_in_recipe = True
> -
> -            if cve_is_ignored(d, cve_data, cve):
> -                bb.note("%s-%s ignores %s" % (product, pv, cve))
> -                continue
> -            elif cve_is_patched(d, cve_data, cve):
> -                bb.note("%s has been patched" % (cve))
> -                continue
> -
> -            vulnerable = False
> -            ignored = False
> -
> -            product_cursor = conn.execute("SELECT * FROM PRODUCTS WHERE ID IS ? AND PRODUCT IS ? AND VENDOR LIKE ?", (cve, product, vendor))
> -            for row in product_cursor:
> -                (_, _, _, version_start, operator_start, version_end, operator_end) = row
> -                #bb.debug(2, "Evaluating row " + str(row))
> -                if cve_is_ignored(d, cve_data, cve):
> -                    ignored = True
> -
> -                version_start = convert_cve_version(version_start)
> -                version_end = convert_cve_version(version_end)
> -
> -                if (operator_start == '=' and pv == version_start) or version_start == '-':
> -                    vulnerable = True
> -                else:
> -                    if operator_start:
> -                        try:
> -                            vulnerable_start =  (operator_start == '>=' and Version(pv,suffix) >= Version(version_start,suffix))
> -                            vulnerable_start |= (operator_start == '>' and Version(pv,suffix) > Version(version_start,suffix))
> -                        except:
> -                            bb.warn("%s: Failed to compare %s %s %s for %s" %
> -                                    (product, pv, operator_start, version_start, cve))
> -                            vulnerable_start = False
> -                    else:
> -                        vulnerable_start = False
> -
> -                    if operator_end:
> -                        try:
> -                            vulnerable_end  = (operator_end == '<=' and Version(pv,suffix) <= Version(version_end,suffix) )
> -                            vulnerable_end |= (operator_end == '<' and Version(pv,suffix) < Version(version_end,suffix) )
> -                        except:
> -                            bb.warn("%s: Failed to compare %s %s %s for %s" %
> -                                    (product, pv, operator_end, version_end, cve))
> -                            vulnerable_end = False
> -                    else:
> -                        vulnerable_end = False
> -
> -                    if operator_start and operator_end:
> -                        vulnerable = vulnerable_start and vulnerable_end
> -                    else:
> -                        vulnerable = vulnerable_start or vulnerable_end
> -
> -                if vulnerable:
> -                    if ignored:
> -                        bb.note("%s is ignored in %s-%s" % (cve, pn, real_pv))
> -                        cve_update(d, cve_data, cve, {"abbrev-status": "Ignored"})
> -                    else:
> -                        bb.note("%s-%s is vulnerable to %s" % (pn, real_pv, cve))
> -                        cve_update(d, cve_data, cve, {"abbrev-status": "Unpatched", "status": "version-in-range"})
> -                    break
> -            product_cursor.close()
> -
> -            if not vulnerable:
> -                bb.note("%s-%s is not vulnerable to %s" % (pn, real_pv, cve))
> -                cve_update(d, cve_data, cve, {"abbrev-status": "Patched", "status": "version-not-in-range"})
> -        cve_cursor.close()
> -
> -        if not cves_in_product:
> -            bb.note("No CVE records found for product %s, pn %s" % (product, pn))
> -            cves_status.append([product, False])
> -
> -    conn.close()
> -
> -    if not cves_in_recipe:
> -        bb.note("No CVE records for products in recipe %s" % (pn))
> -
> -    if d.getVar("CVE_CHECK_SHOW_WARNINGS") == "1":
> -        unpatched_cves = [cve for cve in cve_data if cve_data[cve]["abbrev-status"] == "Unpatched"]
> -        if unpatched_cves:
> -            bb.warn("Found unpatched CVE (%s)" % " ".join(unpatched_cves))
> -
> -    return (cve_data, cves_status)
> -
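For reference, the range evaluation in check_cves() boils down to the following. This uses a naive dotted-integer tuple as a stand-in for oe.cve_check.Version, which also understands suffix schemes (alpha/beta/etc.) that this toy comparison does not:

```python
def ver(v):
    # Naive stand-in for oe.cve_check.Version: plain dotted integers only
    return tuple(int(x) for x in v.split("."))

def version_in_range(pv, operator_start, version_start, operator_end, version_end):
    vulnerable_start = False
    vulnerable_end = False
    if operator_start:
        vulnerable_start = (operator_start == ">=" and ver(pv) >= ver(version_start)) \
                        or (operator_start == ">" and ver(pv) > ver(version_start))
    if operator_end:
        vulnerable_end = (operator_end == "<=" and ver(pv) <= ver(version_end)) \
                      or (operator_end == "<" and ver(pv) < ver(version_end))
    # Both bounds present: both must match; a single bound decides on its own
    if operator_start and operator_end:
        return vulnerable_start and vulnerable_end
    return vulnerable_start or vulnerable_end

print(version_in_range("7.61.0", ">=", "7.20.0", "<", "7.66.0"))  # True
print(version_in_range("7.68.0", ">=", "7.20.0", "<", "7.66.0"))  # False
```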
> -def get_cve_info(d, cve_data):
> -    """
> -    Get CVE information from the database.
> -    """
> -
> -    import sqlite3
> -
> -    db_file = d.expand("file:${CVE_CHECK_DB_FILE}?mode=ro")
> -    conn = sqlite3.connect(db_file, uri=True)
> -
> -    for cve in cve_data:
> -        cursor = conn.execute("SELECT * FROM NVD WHERE ID IS ?", (cve,))
> -        for row in cursor:
> -            # The CVE itself has been added already
> -            if row[0] not in cve_data:
> -                bb.note("CVE record %s not present" % row[0])
> -                continue
> -            #cve_data[row[0]] = {}
> -            cve_data[row[0]]["NVD-summary"] = row[1]
> -            cve_data[row[0]]["NVD-scorev2"] = row[2]
> -            cve_data[row[0]]["NVD-scorev3"] = row[3]
> -            cve_data[row[0]]["NVD-scorev4"] = row[4]
> -            cve_data[row[0]]["NVD-modified"] = row[5]
> -            cve_data[row[0]]["NVD-vector"] = row[6]
> -            cve_data[row[0]]["NVD-vectorString"] = row[7]
> -        cursor.close()
> -    conn.close()
> -
> -def cve_check_write_json_output(d, output, direct_file, deploy_file, manifest_file):
> -    """
> -    Write CVE information in the JSON format: to WORKDIR; and to
> -    CVE_CHECK_DIR, if CVE manifest is enabled, write fragment
> -    files that will be assembled at the end in cve_check_write_rootfs_manifest.
> -    """
> -
> -    import json
> -
> -    write_string = json.dumps(output, indent=2)
> -    with open(direct_file, "w") as f:
> -        bb.note("Writing file %s with CVE information" % direct_file)
> -        f.write(write_string)
> -
> -    if d.getVar("CVE_CHECK_COPY_FILES") == "1":
> -        bb.utils.mkdirhier(os.path.dirname(deploy_file))
> -        with open(deploy_file, "w") as f:
> -            f.write(write_string)
> -
> -    if d.getVar("CVE_CHECK_CREATE_MANIFEST") == "1":
> -        cvelogpath = d.getVar("CVE_CHECK_SUMMARY_DIR")
> -        index_path = d.getVar("CVE_CHECK_SUMMARY_INDEX_PATH")
> -        bb.utils.mkdirhier(cvelogpath)
> -        fragment_file = os.path.basename(deploy_file)
> -        fragment_path = os.path.join(cvelogpath, fragment_file)
> -        with open(fragment_path, "w") as f:
> -            f.write(write_string)
> -        with open(index_path, "a+") as f:
> -            f.write("%s\n" % fragment_path)
> -
> -def cve_write_data_json(d, cve_data, cve_status):
> -    """
> -    Prepare CVE data for the JSON format, then write it.
> -    """
> -
> -    output = {"version":"1", "package": []}
> -    nvd_link = "https://nvd.nist.gov/vuln/detail/"
> -
> -    fdir_name  = d.getVar("FILE_DIRNAME")
> -    layer = fdir_name.split("/")[-3]
> -
> -    include_layers = d.getVar("CVE_CHECK_LAYER_INCLUDELIST").split()
> -    exclude_layers = d.getVar("CVE_CHECK_LAYER_EXCLUDELIST").split()
> -
> -    report_all = d.getVar("CVE_CHECK_REPORT_PATCHED") == "1"
> -
> -    if exclude_layers and layer in exclude_layers:
> -        return
> -
> -    if include_layers and layer not in include_layers:
> -        return
> -
> -    product_data = []
> -    for s in cve_status:
> -        p = {"product": s[0], "cvesInRecord": "Yes"}
> -        if s[1] == False:
> -            p["cvesInRecord"] = "No"
> -        product_data.append(p)
> -
> -    package_version = "%s%s" % (d.getVar("EXTENDPE"), d.getVar("PV"))
> -    package_data = {
> -        "name" : d.getVar("PN"),
> -        "layer" : layer,
> -        "version" : package_version,
> -        "products": product_data
> -    }
> -
> -    cve_list = []
> -
> -    for cve in sorted(cve_data):
> -        if not report_all and (cve_data[cve]["abbrev-status"] == "Patched" or cve_data[cve]["abbrev-status"] == "Ignored"):
> -            continue
> -        issue_link = "%s%s" % (nvd_link, cve)
> -
> -        cve_item = {
> -            "id" : cve,
> -            "status" : cve_data[cve]["abbrev-status"],
> -            "link": issue_link,
> -        }
> -        if 'NVD-summary' in cve_data[cve]:
> -            cve_item["summary"] = cve_data[cve]["NVD-summary"]
> -            cve_item["scorev2"] = cve_data[cve]["NVD-scorev2"]
> -            cve_item["scorev3"] = cve_data[cve]["NVD-scorev3"]
> -            cve_item["scorev4"] = cve_data[cve]["NVD-scorev4"]
> -            cve_item["modified"] = cve_data[cve]["NVD-modified"]
> -            cve_item["vector"] = cve_data[cve]["NVD-vector"]
> -            cve_item["vectorString"] = cve_data[cve]["NVD-vectorString"]
> -        if 'status' in cve_data[cve]:
> -            cve_item["detail"] = cve_data[cve]["status"]
> -        if 'justification' in cve_data[cve]:
> -            cve_item["description"] = cve_data[cve]["justification"]
> -        if 'resource' in cve_data[cve]:
> -            cve_item["patch-file"] = cve_data[cve]["resource"]
> -        cve_list.append(cve_item)
> -
> -    package_data["issue"] = cve_list
> -    output["package"].append(package_data)
> -
> -    direct_file = d.getVar("CVE_CHECK_LOG_JSON")
> -    deploy_file = d.getVar("CVE_CHECK_RECIPE_FILE_JSON")
> -    manifest_file = d.getVar("CVE_CHECK_SUMMARY_FILE_NAME_JSON")
> -
> -    cve_check_write_json_output(d, output, direct_file, deploy_file, manifest_file)
> -
> -def cve_write_data(d, cve_data, status):
> -    """
> -    Write CVE data in each enabled format.
> -    """
> -
> -    if d.getVar("CVE_CHECK_FORMAT_JSON") == "1":
> -        cve_write_data_json(d, cve_data, status)
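For anyone migrating a consumer to the SPDX output, the JSON this class used to emit had the following shape (field names taken from cve_write_data_json above; the package, version, and CVE values here are illustrative only):

```json
{
  "version": "1",
  "package": [
    {
      "name": "curl",
      "layer": "meta",
      "version": "8.7.1",
      "products": [
        { "product": "curl", "cvesInRecord": "Yes" },
        { "product": "libcurl", "cvesInRecord": "Yes" }
      ],
      "issue": [
        {
          "id": "CVE-2024-2398",
          "status": "Patched",
          "link": "https://nvd.nist.gov/vuln/detail/CVE-2024-2398",
          "detail": "version-not-in-range"
        }
      ]
    }
  ]
}
```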
> diff --git a/meta/conf/distro/include/maintainers.inc b/meta/conf/distro/include/maintainers.inc
> index 1bd43211e23..a429320b88b 100644
> --- a/meta/conf/distro/include/maintainers.inc
> +++ b/meta/conf/distro/include/maintainers.inc
> @@ -140,7 +140,6 @@ RECIPE_MAINTAINER:pn-cryptodev-module = "Robert Yang <liezhi.yang@windriver.com>
>  RECIPE_MAINTAINER:pn-cryptodev-tests = "Robert Yang <liezhi.yang@windriver.com>"
>  RECIPE_MAINTAINER:pn-cups = "Chen Qi <Qi.Chen@windriver.com>"
>  RECIPE_MAINTAINER:pn-curl = "Robert Joslyn <robert.joslyn@redrectangle.org>"
> -RECIPE_MAINTAINER:pn-cve-update-nvd2-native = "Ross Burton <ross.burton@arm.com>"
>  RECIPE_MAINTAINER:pn-db = "Unassigned <unassigned@yoctoproject.org>"
>  RECIPE_MAINTAINER:pn-dbus = "Chen Qi <Qi.Chen@windriver.com>"
>  RECIPE_MAINTAINER:pn-dbus-glib = "Chen Qi <Qi.Chen@windriver.com>"
> diff --git a/meta/conf/documentation.conf b/meta/conf/documentation.conf
> index 1853676fa06..9d429ba9a31 100644
> --- a/meta/conf/documentation.conf
> +++ b/meta/conf/documentation.conf
> @@ -121,8 +121,6 @@ CONFLICT_MACHINE_FEATURES[doc] = "When a recipe inherits the features_check clas
>  CORE_IMAGE_EXTRA_INSTALL[doc] = "Specifies the list of packages to be added to the image. You should only set this variable in the conf/local.conf file in the Build Directory."
>  COREBASE[doc] = "Specifies the parent directory of the OpenEmbedded Core Metadata layer (i.e. meta)."
>  CONF_VERSION[doc] = "Tracks the version of local.conf.  Increased each time build/conf/ changes incompatibly."
> -CVE_CHECK_LAYER_EXCLUDELIST[doc] = "Defines which layers to exclude from cve-check scanning"
> -CVE_CHECK_LAYER_INCLUDELIST[doc] = "Defines which layers to include during cve-check scanning"
> 
>  #D
> 
> diff --git a/meta/lib/oeqa/selftest/cases/cve_check.py b/meta/lib/oeqa/selftest/cases/cve_check.py
> index 511e4b81b41..891a7de3317 100644
> --- a/meta/lib/oeqa/selftest/cases/cve_check.py
> +++ b/meta/lib/oeqa/selftest/cases/cve_check.py
> @@ -4,10 +4,7 @@
>  # SPDX-License-Identifier: MIT
>  #
> 
> -import json
> -import os
>  from oeqa.selftest.case import OESelftestTestCase
> -from oeqa.utils.commands import bitbake, get_bb_vars
> 
>  class CVECheck(OESelftestTestCase):
> 
> @@ -325,172 +322,3 @@ class CVECheck(OESelftestTestCase):
>              ),
>              {"CVE-2019-6461", "CVE-2019-6462", "CVE-2019-6463", "CVE-2019-6464"},
>          )
> -
> -    def test_recipe_report_json(self):
> -        config = """
> -INHERIT += "cve-check"
> -CVE_CHECK_FORMAT_JSON = "1"
> -"""
> -        self.write_config(config)
> -
> -        vars = get_bb_vars(["CVE_CHECK_SUMMARY_DIR", "CVE_CHECK_SUMMARY_FILE_NAME_JSON"])
> -        summary_json = os.path.join(vars["CVE_CHECK_SUMMARY_DIR"], vars["CVE_CHECK_SUMMARY_FILE_NAME_JSON"])
> -        recipe_json = os.path.join(vars["CVE_CHECK_SUMMARY_DIR"], "m4-native_cve.json")
> -
> -        try:
> -            os.remove(summary_json)
> -            os.remove(recipe_json)
> -        except FileNotFoundError:
> -            pass
> -
> -        bitbake("m4-native -c cve_check")
> -
> -        def check_m4_json(filename):
> -            with open(filename) as f:
> -                report = json.load(f)
> -            self.assertEqual(report["version"], "1")
> -            self.assertEqual(len(report["package"]), 1)
> -            package = report["package"][0]
> -            self.assertEqual(package["name"], "m4-native")
> -            found_cves = { issue["id"]: issue["status"] for issue in package["issue"]}
> -            self.assertIn("CVE-2008-1687", found_cves)
> -            self.assertEqual(found_cves["CVE-2008-1687"], "Patched")
> -
> -        self.assertExists(summary_json)
> -        check_m4_json(summary_json)
> -        self.assertExists(recipe_json)
> -        check_m4_json(recipe_json)
> -
> -
> -    def test_image_json(self):
> -        config = """
> -INHERIT += "cve-check"
> -CVE_CHECK_FORMAT_JSON = "1"
> -"""
> -        self.write_config(config)
> -
> -        vars = get_bb_vars(["CVE_CHECK_DIR", "CVE_CHECK_SUMMARY_DIR", "CVE_CHECK_SUMMARY_FILE_NAME_JSON"])
> -        report_json = os.path.join(vars["CVE_CHECK_SUMMARY_DIR"], vars["CVE_CHECK_SUMMARY_FILE_NAME_JSON"])
> -        print(report_json)
> -        try:
> -            os.remove(report_json)
> -        except FileNotFoundError:
> -            pass
> -
> -        bitbake("core-image-minimal-initramfs")
> -        self.assertExists(report_json)
> -
> -        # Check that the summary report lists at least one package
> -        with open(report_json) as f:
> -            report = json.load(f)
> -        self.assertEqual(report["version"], "1")
> -        self.assertGreater(len(report["package"]), 1)
> -
> -        # Check that a random recipe wrote a recipe report to deploy/cve/
> -        recipename = report["package"][0]["name"]
> -        recipe_report = os.path.join(vars["CVE_CHECK_DIR"], recipename + "_cve.json")
> -        self.assertExists(recipe_report)
> -        with open(recipe_report) as f:
> -            report = json.load(f)
> -        self.assertEqual(report["version"], "1")
> -        self.assertEqual(len(report["package"]), 1)
> -        self.assertEqual(report["package"][0]["name"], recipename)
> -
> -
> -    def test_recipe_report_json_unpatched(self):
> -        config = """
> -INHERIT += "cve-check"
> -CVE_CHECK_FORMAT_JSON = "1"
> -CVE_CHECK_REPORT_PATCHED = "0"
> -"""
> -        self.write_config(config)
> -
> -        vars = get_bb_vars(["CVE_CHECK_SUMMARY_DIR", "CVE_CHECK_SUMMARY_FILE_NAME_JSON"])
> -        summary_json = os.path.join(vars["CVE_CHECK_SUMMARY_DIR"], vars["CVE_CHECK_SUMMARY_FILE_NAME_JSON"])
> -        recipe_json = os.path.join(vars["CVE_CHECK_SUMMARY_DIR"], "m4-native_cve.json")
> -
> -        try:
> -            os.remove(summary_json)
> -            os.remove(recipe_json)
> -        except FileNotFoundError:
> -            pass
> -
> -        bitbake("m4-native -c cve_check")
> -
> -        def check_m4_json(filename):
> -            with open(filename) as f:
> -                report = json.load(f)
> -            self.assertEqual(report["version"], "1")
> -            self.assertEqual(len(report["package"]), 1)
> -            package = report["package"][0]
> -            self.assertEqual(package["name"], "m4-native")
> -            #m4 had only Patched CVEs, so the issues array will be empty
> -            self.assertEqual(package["issue"], [])
> -
> -        self.assertExists(summary_json)
> -        check_m4_json(summary_json)
> -        self.assertExists(recipe_json)
> -        check_m4_json(recipe_json)
> -
> -
> -    def test_recipe_report_json_ignored(self):
> -        config = """
> -INHERIT += "cve-check"
> -CVE_CHECK_FORMAT_JSON = "1"
> -CVE_CHECK_REPORT_PATCHED = "1"
> -"""
> -        self.write_config(config)
> -
> -        vars = get_bb_vars(["CVE_CHECK_SUMMARY_DIR", "CVE_CHECK_SUMMARY_FILE_NAME_JSON"])
> -        summary_json = os.path.join(vars["CVE_CHECK_SUMMARY_DIR"], vars["CVE_CHECK_SUMMARY_FILE_NAME_JSON"])
> -        recipe_json = os.path.join(vars["CVE_CHECK_SUMMARY_DIR"], "logrotate_cve.json")
> -
> -        try:
> -            os.remove(summary_json)
> -            os.remove(recipe_json)
> -        except FileNotFoundError:
> -            pass
> -
> -        bitbake("logrotate -c cve_check")
> -
> -        def check_m4_json(filename):
> -            with open(filename) as f:
> -                report = json.load(f)
> -            self.assertEqual(report["version"], "1")
> -            self.assertEqual(len(report["package"]), 1)
> -            package = report["package"][0]
> -            self.assertEqual(package["name"], "logrotate")
> -            found_cves = {}
> -            for issue in package["issue"]:
> -                found_cves[issue["id"]] = {
> -                    "status" : issue["status"],
> -                    "detail" : issue["detail"] if "detail" in issue else "",
> -                    "description" : issue["description"] if "description" in issue else ""
> -                }
> -            # m4 CVE should not be in logrotate
> -            self.assertNotIn("CVE-2008-1687", found_cves)
> -            # logrotate has both Patched and Ignored CVEs
> -            detail = "version-not-in-range"
> -            self.assertIn("CVE-2011-1098", found_cves)
> -            self.assertEqual(found_cves["CVE-2011-1098"]["status"], "Patched")
> -            self.assertEqual(found_cves["CVE-2011-1098"]["detail"], detail)
> -            self.assertEqual(len(found_cves["CVE-2011-1098"]["description"]), 0)
> -            detail = "not-applicable-platform"
> -            description = "CVE is debian, gentoo or SUSE specific on the way logrotate was installed/used"
> -            self.assertIn("CVE-2011-1548", found_cves)
> -            self.assertEqual(found_cves["CVE-2011-1548"]["status"], "Ignored")
> -            self.assertEqual(found_cves["CVE-2011-1548"]["detail"], detail)
> -            self.assertEqual(found_cves["CVE-2011-1548"]["description"], description)
> -            self.assertIn("CVE-2011-1549", found_cves)
> -            self.assertEqual(found_cves["CVE-2011-1549"]["status"], "Ignored")
> -            self.assertEqual(found_cves["CVE-2011-1549"]["detail"], detail)
> -            self.assertEqual(found_cves["CVE-2011-1549"]["description"], description)
> -            self.assertIn("CVE-2011-1550", found_cves)
> -            self.assertEqual(found_cves["CVE-2011-1550"]["status"], "Ignored")
> -            self.assertEqual(found_cves["CVE-2011-1550"]["detail"], detail)
> -            self.assertEqual(found_cves["CVE-2011-1550"]["description"], description)
> -
> -        self.assertExists(summary_json)
> -        check_m4_json(summary_json)
> -        self.assertExists(recipe_json)
> -        check_m4_json(recipe_json)
> diff --git a/meta/recipes-core/meta/cve-update-db-native.bb b/meta/recipes-core/meta/cve-update-db-native.bb
> deleted file mode 100644
> index 01f942dcdbf..00000000000
> --- a/meta/recipes-core/meta/cve-update-db-native.bb
> +++ /dev/null
> @@ -1,421 +0,0 @@
> -SUMMARY = "Updates the NVD CVE database"
> -LICENSE = "MIT"
> -
> -INHIBIT_DEFAULT_DEPS = "1"
> -
> -inherit native
> -
> -deltask do_patch
> -deltask do_configure
> -deltask do_compile
> -deltask do_install
> -deltask do_populate_sysroot
> -
> -NVDCVE_URL ?= "https://nvd.nist.gov/feeds/json/cve/1.1/nvdcve-1.1-"
> -FKIE_URL ?= "https://github.com/fkie-cad/nvd-json-data-feeds/releases/latest/download/CVE-"
> -
> -# CVE database update interval, in seconds. By default: once a day (23*60*60).
> -# Use 0 to force the update
> -# Use a negative value to skip the update
> -CVE_DB_UPDATE_INTERVAL ?= "82800"
> -
> -# Timeout for blocking socket operations, such as the connection attempt.
> -CVE_SOCKET_TIMEOUT ?= "60"
> -
> -CVE_CHECK_DB_DLDIR_FILE ?= "${DL_DIR}/CVE_CHECK2/${CVE_CHECK_DB_FILENAME}"
> -CVE_CHECK_DB_DLDIR_LOCK ?= "${CVE_CHECK_DB_DLDIR_FILE}.lock"
> -CVE_CHECK_DB_TEMP_FILE ?= "${CVE_CHECK_DB_FILE}.tmp"
> -
> -python () {
> -    if not bb.data.inherits_class("cve-check", d):
> -        raise bb.parse.SkipRecipe("Skip recipe when cve-check class is not loaded.")
> -}
> -
> -python do_fetch() {
> -    """
> -    Update NVD database with json data feed
> -    """
> -    import bb.utils
> -    import bb.progress
> -    import shutil
> -
> -    bb.utils.export_proxies(d)
> -
> -    db_file = d.getVar("CVE_CHECK_DB_DLDIR_FILE")
> -    db_dir = os.path.dirname(db_file)
> -    db_tmp_file = d.getVar("CVE_CHECK_DB_TEMP_FILE")
> -
> -    cleanup_db_download(db_tmp_file)
> -
> -    # The NVD database changes once a day, so no need to update more frequently
> -    # Allow the user to force-update
> -    try:
> -        import time
> -        update_interval = int(d.getVar("CVE_DB_UPDATE_INTERVAL"))
> -        if update_interval < 0:
> -            bb.note("CVE database update skipped")
> -            if not os.path.exists(db_file):
> -                bb.error("CVE database %s not present, database fetch/update skipped" % db_file)
> -            return
> -        curr_time = time.time()
> -        database_time = os.path.getmtime(db_file)
> -        bb.note("Current time: %s; DB time: %s" % (time.ctime(curr_time), time.ctime(database_time)))
> -        if curr_time < database_time:
> -            bb.warn("Database time is in the future, force DB update")
> -        elif curr_time - database_time < update_interval:
> -            bb.note("CVE database recently updated, skipping")
> -            return
> -
> -    except OSError:
> -        pass
> -
> -    if bb.utils.to_boolean(d.getVar("BB_NO_NETWORK")):
> -        bb.error("BB_NO_NETWORK attempted to disable fetch, this recipe uses CVE_DB_UPDATE_INTERVAL to control download, set to '-1' to disable fetch or update")
> -
> -    bb.utils.mkdirhier(db_dir)
> -    bb.utils.mkdirhier(os.path.dirname(db_tmp_file))
> -    if os.path.exists(db_file):
> -        shutil.copy2(db_file, db_tmp_file)
> -
> -    if update_db_file(db_tmp_file, d):
> -        # Update downloaded correctly, we can swap files. To avoid potential
> -        # NFS caching issues, ensure that the destination file has a new inode
> -        # number. We do this in two steps as the downloads directory may be on
> -        # a different filesystem to tmpdir we're working in.
> -        new_file = "%s.new" % (db_file)
> -        shutil.move(db_tmp_file, new_file)
> -        os.rename(new_file, db_file)
> -    else:
> -        # Update failed, do not modify the database
> -        bb.warn("CVE database update failed")
> -        os.remove(db_tmp_file)
> -}
> -
> -do_fetch[lockfiles] += "${CVE_CHECK_DB_DLDIR_LOCK}"
> -do_fetch[file-checksums] = ""
> -do_fetch[vardeps] = ""
> -
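The two-step swap at the end of do_fetch above is worth calling out for anyone reimplementing this: shutil.move copes with DL_DIR living on a different filesystem from the temp file, and the final same-filesystem os.rename gives the destination a fresh inode, which sidesteps stale NFS caches. A standalone sketch (hypothetical helper name):

```python
import os
import shutil
import tempfile

def atomic_replace(tmp_path, dest_path):
    # Step 1: shutil.move may copy across filesystems; it lands the file
    # next to the destination on the destination's filesystem
    new_path = "%s.new" % dest_path
    shutil.move(tmp_path, new_path)
    # Step 2: a same-filesystem rename is atomic and gives dest a new inode
    os.rename(new_path, dest_path)

workdir = tempfile.mkdtemp()
tmp = os.path.join(workdir, "db.tmp")
dest = os.path.join(workdir, "db.sqlite")
with open(tmp, "w") as f:
    f.write("updated")
with open(dest, "w") as f:
    f.write("old")
atomic_replace(tmp, dest)
print(open(dest).read())  # updated
```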
> -python do_unpack() {
> -    import shutil
> -    shutil.copyfile(d.getVar("CVE_CHECK_DB_DLDIR_FILE"), d.getVar("CVE_CHECK_DB_FILE"))
> -}
> -do_unpack[lockfiles] += "${CVE_CHECK_DB_DLDIR_LOCK} ${CVE_CHECK_DB_FILE_LOCK}"
> -
> -def cleanup_db_download(db_tmp_file):
> -    """
> -    Cleanup the download space from possible failed downloads
> -    """
> -
> -    # Clean-up the temporary file downloads, we can remove both journal
> -    # and the temporary database
> -    if os.path.exists("{0}-journal".format(db_tmp_file)):
> -        os.remove("{0}-journal".format(db_tmp_file))
> -    if os.path.exists(db_tmp_file):
> -        os.remove(db_tmp_file)
> -
> -def db_file_names(d, year, is_nvd):
> -    if is_nvd:
> -        year_url = d.getVar('NVDCVE_URL') + str(year)
> -        meta_url = year_url + ".meta"
> -        json_url = year_url + ".json.gz"
> -        return json_url, meta_url
> -    year_url = d.getVar('FKIE_URL') + str(year)
> -    meta_url = year_url + ".meta"
> -    json_url = year_url + ".json.xz"
> -    return json_url, meta_url
> -
> -def host_db_name(d, is_nvd):
> -    if is_nvd:
> -        return "nvd.nist.gov"
> -    return "github.com"
> -
> -def db_decompress(d, data, is_nvd):
> -    import gzip, lzma
> -
> -    if is_nvd:
> -        return gzip.decompress(data).decode('utf-8')
> -    # otherwise
> -    return lzma.decompress(data)
> -
> -def update_db_file(db_tmp_file, d):
> -    """
> -    Update the given database file
> -    """
> -    import bb.progress
> -    import bb.utils
> -    from datetime import date
> -    import sqlite3
> -    import urllib
> -
> -    YEAR_START = 2002
> -    cve_socket_timeout = int(d.getVar("CVE_SOCKET_TIMEOUT"))
> -    is_nvd = d.getVar("NVD_DB_VERSION") == "NVD1"
> -
> -    # Connect to database
> -    conn = sqlite3.connect(db_tmp_file)
> -    initialize_db(conn)
> -
> -    with bb.progress.ProgressHandler(d) as ph, open(os.path.join(d.getVar("TMPDIR"), 'cve_check'), 'a') as cve_f:
> -        total_years = date.today().year + 1 - YEAR_START
> -        for i, year in enumerate(range(YEAR_START, date.today().year + 1)):
> -            bb.note("Updating %d" % year)
> -            ph.update((float(i + 1) / total_years) * 100)
> -            json_url, meta_url = db_file_names(d, year, is_nvd)
> -
> -            # Retrieve meta last modified date
> -            try:
> -                response = urllib.request.urlopen(meta_url, timeout=cve_socket_timeout)
> -            except urllib.error.URLError as e:
> -                cve_f.write('Warning: CVE db update error, Unable to fetch CVE data.\n\n')
> -                bb.warn("Failed to fetch CVE data (%s)" % e)
> -                import socket
> -                result = socket.getaddrinfo(host_db_name(d, is_nvd), 443, proto=socket.IPPROTO_TCP)
> -                bb.warn("Host IPs are %s" % (", ".join(t[4][0] for t in result)))
> -                return False
> -
> -            if response:
> -                for line in response.read().decode("utf-8").splitlines():
> -                    key, value = line.split(":", 1)
> -                    if key == "lastModifiedDate":
> -                        last_modified = value
> -                        break
> -                else:
> -                    bb.warn("Cannot parse CVE metadata, update failed")
> -                    return False
> -
> -            # Compare with current db last modified date
> -            cursor = conn.execute("select DATE from META where YEAR = ?", (year,))
> -            meta = cursor.fetchone()
> -            cursor.close()
> -
> -            if not meta or meta[0] != last_modified:
> -                bb.note("Updating entries")
> -                # Clear products table entries corresponding to current year
> -                conn.execute("delete from PRODUCTS where ID like ?", ('CVE-%d%%' % year,)).close()
> -
> -                # Update db with current year json file
> -                try:
> -                    response = urllib.request.urlopen(json_url, timeout=cve_socket_timeout)
> -                    if response:
> -                        update_db(d, conn, db_decompress(d, response.read(), is_nvd))
> -                    conn.execute("insert or replace into META values (?, ?)", [year, last_modified]).close()
> -                except urllib.error.URLError as e:
> -                    cve_f.write('Warning: CVE db update error, CVE data is outdated.\n\n')
> -                    bb.warn("Cannot parse CVE data (%s), update failed" % e.reason)
> -                    return False
> -            else:
> -                bb.debug(2, "Already up to date (last modified %s)" % last_modified)
> -            # Update success, set the date to cve_check file.
> -            if year == date.today().year:
> -                cve_f.write('CVE database update : %s\n\n' % date.today())
> -
> -        conn.commit()
> -        conn.close()
> -        return True
> -
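The .meta handling in update_db_file is line-oriented key:value parsing, as the loop above shows; both the NVD and FKIE feeds publish a lastModifiedDate key among others. A self-contained sketch (the payload below is illustrative, not a real feed excerpt):

```python
# Illustrative .meta payload; real files carry more keys (size, sha256, ...)
meta_text = "lastModifiedDate:2024-05-01T00:00:00-04:00\nsize:212992\n"

last_modified = None
for line in meta_text.splitlines():
    key, value = line.split(":", 1)
    if key == "lastModifiedDate":
        last_modified = value
        break
else:
    # Mirrors the class: no lastModifiedDate key means the update fails
    raise ValueError("Cannot parse CVE metadata")

print(last_modified)  # 2024-05-01T00:00:00-04:00
```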
> -def initialize_db(conn):
> -    with conn:
> -        c = conn.cursor()
> -
> -        c.execute("CREATE TABLE IF NOT EXISTS META (YEAR INTEGER UNIQUE, DATE TEXT)")
> -
> -        c.execute("CREATE TABLE IF NOT EXISTS NVD (ID TEXT UNIQUE, SUMMARY TEXT, \
> -            SCOREV2 TEXT, SCOREV3 TEXT, SCOREV4 TEXT, MODIFIED INTEGER, VECTOR TEXT, VECTORSTRING TEXT)")
> -
> -        c.execute("CREATE TABLE IF NOT EXISTS PRODUCTS (ID TEXT, \
> -            VENDOR TEXT, PRODUCT TEXT, VERSION_START TEXT, OPERATOR_START TEXT, \
> -            VERSION_END TEXT, OPERATOR_END TEXT)")
> -        c.execute("CREATE INDEX IF NOT EXISTS PRODUCT_ID_IDX on PRODUCTS(ID);")
> -
> -        c.close()
> -
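The PRODUCTS schema above, together with the exact lookup query check_cves ran, can be exercised against an in-memory database; the inserted row is hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE IF NOT EXISTS PRODUCTS (ID TEXT, "
             "VENDOR TEXT, PRODUCT TEXT, VERSION_START TEXT, OPERATOR_START TEXT, "
             "VERSION_END TEXT, OPERATOR_END TEXT)")
conn.execute("CREATE INDEX IF NOT EXISTS PRODUCT_ID_IDX on PRODUCTS(ID);")

# Hypothetical row: curl vulnerable in [7.20.0, 7.66.0)
conn.execute("INSERT INTO PRODUCTS VALUES (?, ?, ?, ?, ?, ?, ?)",
             ("CVE-2024-0001", "haxx", "curl", "7.20.0", ">=", "7.66.0", "<"))

# The '%' vendor wildcard is what CVE_PRODUCT entries without a vendor: prefix use
rows = conn.execute("SELECT DISTINCT ID FROM PRODUCTS WHERE PRODUCT IS ? AND VENDOR LIKE ?",
                    ("curl", "%")).fetchall()
print(rows)  # [('CVE-2024-0001',)]
```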
> -def parse_node_and_insert(conn, node, cveId, is_nvd):
> -    # Parse children node if needed
> -    for child in node.get('children', ()):
> -        parse_node_and_insert(conn, child, cveId, is_nvd)
> -
> -    def cpe_generator(is_nvd):
> -        match_string = "cpeMatch"
> -        cpe_string = 'criteria'
> -        if is_nvd:
> -            match_string = "cpe_match"
> -            cpe_string = 'cpe23Uri'
> -
> -        for cpe in node.get(match_string, ()):
> -            if not cpe['vulnerable']:
> -                return
> -            cpe23 = cpe.get(cpe_string)
> -            if not cpe23:
> -                return
> -            cpe23 = cpe23.split(':')
> -            if len(cpe23) < 6:
> -                return
> -            vendor = cpe23[3]
> -            product = cpe23[4]
> -            version = cpe23[5]
> -
> -            if cpe23[6] == '*' or cpe23[6] == '-':
> -                version_suffix = ""
> -            else:
> -                version_suffix = "_" + cpe23[6]
> -
> -            if version != '*' and version != '-':
> -                # Version is defined, this is a '=' match
> -                yield [cveId, vendor, product, version + version_suffix, '=', '', '']
> -            elif version == '-':
> -                # no version information is available
> -                yield [cveId, vendor, product, version, '', '', '']
> -            else:
> -                # Parse start version, end version and operators
> -                op_start = ''
> -                op_end = ''
> -                v_start = ''
> -                v_end = ''
> -
> -                if 'versionStartIncluding' in cpe:
> -                    op_start = '>='
> -                    v_start = cpe['versionStartIncluding']
> -
> -                if 'versionStartExcluding' in cpe:
> -                    op_start = '>'
> -                    v_start = cpe['versionStartExcluding']
> -
> -                if 'versionEndIncluding' in cpe:
> -                    op_end = '<='
> -                    v_end = cpe['versionEndIncluding']
> -
> -                if 'versionEndExcluding' in cpe:
> -                    op_end = '<'
> -                    v_end = cpe['versionEndExcluding']
> -
> -                if op_start or op_end or v_start or v_end:
> -                    yield [cveId, vendor, product, v_start, op_start, v_end, op_end]
> -                else:
> -                    # This is no version information, expressed differently.
> -                    # Save processing by representing as -.
> -                    yield [cveId, vendor, product, '-', '', '', '']
> -
> -    conn.executemany("insert into PRODUCTS values (?, ?, ?, ?, ?, ?, ?)", cpe_generator(is_nvd)).close()
> -
> -def update_db_nvdjson(conn, jsondata):
> -    import json
> -    root = json.loads(jsondata)
> -
> -    for elt in root['CVE_Items']:
> -        if not elt['impact']:
> -            continue
> -
> -        accessVector = None
> -        vectorString = None
> -        cvssv2 = 0.0
> -        cvssv3 = 0.0
> -        cvssv4 = 0.0
> -        cveId = elt['cve']['CVE_data_meta']['ID']
> -        cveDesc = elt['cve']['description']['description_data'][0]['value']
> -        date = elt['lastModifiedDate']
> -        try:
> -            accessVector = elt['impact']['baseMetricV2']['cvssV2']['accessVector']
> -            vectorString = elt['impact']['baseMetricV2']['cvssV2']['vectorString']
> -            cvssv2 = elt['impact']['baseMetricV2']['cvssV2']['baseScore']
> -        except KeyError:
> -            cvssv2 = 0.0
> -        try:
> -            accessVector = accessVector or elt['impact']['baseMetricV3']['cvssV3']['attackVector']
> -            vectorString = vectorString or elt['impact']['baseMetricV3']['cvssV3']['vectorString']
> -            cvssv3 = elt['impact']['baseMetricV3']['cvssV3']['baseScore']
> -        except KeyError:
> -            accessVector = accessVector or "UNKNOWN"
> -            cvssv3 = 0.0
> -
> -        conn.execute("insert or replace into NVD values (?, ?, ?, ?, ?, ?, ?, ?)",
> -                [cveId, cveDesc, cvssv2, cvssv3, cvssv4, date, accessVector, vectorString]).close()
> -
> -        configurations = elt['configurations']['nodes']
> -        for config in configurations:
> -            parse_node_and_insert(conn, config, cveId, True)
> -
> -def get_metric_entry(metric):
> -    primaries = [c for c in metric if c['type'] == "Primary"]
> -    secondaries = [c for c in metric if c['type'] == "Secondary"]
> -    if len(primaries) > 0:
> -        return primaries[0]
> -    elif len(secondaries) > 0:
> -        return secondaries[0]
> -    return None
> -
> -def update_db_fkie(conn, jsondata):
> -    import json
> -    root = json.loads(jsondata)
> -
> -    for elt in root['cve_items']:
> -        if 'vulnStatus' not in elt or elt['vulnStatus'] == 'Rejected':
> -            continue
> -
> -        if 'configurations' not in elt:
> -            continue
> -
> -        accessVector = None
> -        vectorString = None
> -        cvssv2 = 0.0
> -        cvssv3 = 0.0
> -        cvssv4 = 0.0
> -        cveId = elt['id']
> -        cveDesc = elt['descriptions'][0]['value']
> -        date = elt['lastModified']
> -        try:
> -            if 'cvssMetricV2' in elt['metrics']:
> -                entry = get_metric_entry(elt['metrics']['cvssMetricV2'])
> -                if entry:
> -                    accessVector = entry['cvssData']['accessVector']
> -                    vectorString = entry['cvssData']['vectorString']
> -                    cvssv2 = entry['cvssData']['baseScore']
> -        except KeyError:
> -            cvssv2 = 0.0
> -        try:
> -            if 'cvssMetricV30' in elt['metrics']:
> -                entry = get_metric_entry(elt['metrics']['cvssMetricV30'])
> -                if entry:
> -                    accessVector = entry['cvssData']['attackVector']
> -                    vectorString = entry['cvssData']['vectorString']
> -                    cvssv3 = entry['cvssData']['baseScore']
> -        except KeyError:
> -            accessVector = accessVector or "UNKNOWN"
> -            cvssv3 = 0.0
> -        try:
> -            if 'cvssMetricV31' in elt['metrics']:
> -                entry = get_metric_entry(elt['metrics']['cvssMetricV31'])
> -                if entry:
> -                    accessVector = entry['cvssData']['attackVector']
> -                    vectorString = entry['cvssData']['vectorString']
> -                    cvssv3 = entry['cvssData']['baseScore']
> -        except KeyError:
> -            accessVector = accessVector or "UNKNOWN"
> -            cvssv3 = 0.0
> -        try:
> -            if 'cvssMetricV40' in elt['metrics']:
> -                entry = get_metric_entry(elt['metrics']['cvssMetricV40'])
> -                if entry:
> -                    accessVector = entry['cvssData']['attackVector']
> -                    vectorString = entry['cvssData']['vectorString']
> -                    cvssv4 = entry['cvssData']['baseScore']
> -        except KeyError:
> -            accessVector = accessVector or "UNKNOWN"
> -            cvssv4 = 0.0
> -
> -        conn.execute("insert or replace into NVD values (?, ?, ?, ?, ?, ?, ?, ?)",
> -                [cveId, cveDesc, cvssv2, cvssv3, cvssv4, date, accessVector, vectorString]).close()
> -
> -        for config in elt['configurations']:
> -            # This is suboptimal as it doesn't handle AND/OR and negate, but is better than nothing
> -            for node in config.get("nodes") or []:
> -                parse_node_and_insert(conn, node, cveId, False)
> -
> -def update_db(d, conn, jsondata):
> -    if (d.getVar("NVD_DB_VERSION") == "FKIE"):
> -        return update_db_fkie(conn, jsondata)
> -    else:
> -        return update_db_nvdjson(conn, jsondata)
> -
> -do_fetch[nostamp] = "1"
> -
> -EXCLUDE_FROM_WORLD = "1"
> diff --git a/meta/recipes-core/meta/cve-update-nvd2-native.bb b/meta/recipes-core/meta/cve-update-nvd2-native.bb
> deleted file mode 100644
> index 41c34ba0d01..00000000000
> --- a/meta/recipes-core/meta/cve-update-nvd2-native.bb
> +++ /dev/null
> @@ -1,422 +0,0 @@
> -SUMMARY = "Updates the NVD CVE database"
> -LICENSE = "MIT"
> -
> -# Important note:
> -# This product uses the NVD API but is not endorsed or certified by the NVD.
> -
> -INHIBIT_DEFAULT_DEPS = "1"
> -
> -inherit native
> -
> -deltask do_patch
> -deltask do_configure
> -deltask do_compile
> -deltask do_install
> -deltask do_populate_sysroot
> -
> -NVDCVE_URL ?= "https://services.nvd.nist.gov/rest/json/cves/2.0"
> -
> -# If you have a NVD API key (https://nvd.nist.gov/developers/request-an-api-key)
> -# then setting this to get higher rate limits.
> -NVDCVE_API_KEY ?= ""
> -
> -# CVE database update interval, in seconds. By default: once a day (23*60*60).
> -# Use 0 to force the update
> -# Use a negative value to skip the update
> -CVE_DB_UPDATE_INTERVAL ?= "82800"
> -
> -# CVE database incremental update age threshold, in seconds. If the database is
> -# older than this threshold, do a full re-download, else, do an incremental
> -# update. By default: the maximum allowed value from NVD: 120 days (120*24*60*60)
> -# Use 0 to force a full download.
> -CVE_DB_INCR_UPDATE_AGE_THRES ?= "10368000"
> -
> -# Number of attempts for each http query to nvd server before giving up
> -CVE_DB_UPDATE_ATTEMPTS ?= "5"
> -
> -CVE_CHECK_DB_DLDIR_FILE ?= "${DL_DIR}/CVE_CHECK2/${CVE_CHECK_DB_FILENAME}"
> -CVE_CHECK_DB_DLDIR_LOCK ?= "${CVE_CHECK_DB_DLDIR_FILE}.lock"
> -CVE_CHECK_DB_TEMP_FILE ?= "${CVE_CHECK_DB_FILE}.tmp"
> -
> -python () {
> -    if not bb.data.inherits_class("cve-check", d):
> -        raise bb.parse.SkipRecipe("Skip recipe when cve-check class is not loaded.")
> -}
> -
> -python do_fetch() {
> -    """
> -    Update NVD database with API 2.0
> -    """
> -    import bb.utils
> -    import bb.progress
> -    import shutil
> -
> -    bb.utils.export_proxies(d)
> -
> -    db_file = d.getVar("CVE_CHECK_DB_DLDIR_FILE")
> -    db_dir = os.path.dirname(db_file)
> -    db_tmp_file = d.getVar("CVE_CHECK_DB_TEMP_FILE")
> -
> -    cleanup_db_download(db_tmp_file)
> -    # By default let's update the whole database (since time 0)
> -    database_time = 0
> -
> -    # The NVD database changes once a day, so no need to update more frequently
> -    # Allow the user to force-update
> -    try:
> -        import time
> -        update_interval = int(d.getVar("CVE_DB_UPDATE_INTERVAL"))
> -        if update_interval < 0:
> -            bb.note("CVE database update skipped")
> -            if not os.path.exists(db_file):
> -                bb.error("CVE database %s not present, database fetch/update skipped" % db_file)
> -            return
> -        curr_time = time.time()
> -        database_time = os.path.getmtime(db_file)
> -        bb.note("Current time: %s; DB time: %s" % (time.ctime(curr_time), time.ctime(database_time)))
> -        if curr_time < database_time:
> -            bb.warn("Database time is in the future, force DB update")
> -            database_time = 0
> -        elif curr_time - database_time < update_interval:
> -            bb.note("CVE database recently updated, skipping")
> -            return
> -
> -    except OSError:
> -        pass
> -
> -    if bb.utils.to_boolean(d.getVar("BB_NO_NETWORK")):
> -        bb.error("BB_NO_NETWORK attempted to disable fetch, this recipe uses CVE_DB_UPDATE_INTERVAL to control download, set to '-1' to disable fetch or update")
> -
> -    bb.utils.mkdirhier(db_dir)
> -    bb.utils.mkdirhier(os.path.dirname(db_tmp_file))
> -    if os.path.exists(db_file):
> -        shutil.copy2(db_file, db_tmp_file)
> -
> -    if update_db_file(db_tmp_file, d, database_time):
> -        # Update downloaded correctly, we can swap files. To avoid potential
> -        # NFS caching issues, ensure that the destination file has a new inode
> -        # number. We do this in two steps as the downloads directory may be on
> -        # a different filesystem to tmpdir we're working in.
> -        new_file = "%s.new" % (db_file)
> -        shutil.move(db_tmp_file, new_file)
> -        os.rename(new_file, db_file)
> -    else:
> -        # Update failed, do not modify the database
> -        bb.warn("CVE database update failed")
> -        os.remove(db_tmp_file)
> -}
> -
> -do_fetch[lockfiles] += "${CVE_CHECK_DB_DLDIR_LOCK}"
> -do_fetch[file-checksums] = ""
> -do_fetch[vardeps] = ""
> -
> -python do_unpack() {
> -    import shutil
> -    shutil.copyfile(d.getVar("CVE_CHECK_DB_DLDIR_FILE"), d.getVar("CVE_CHECK_DB_FILE"))
> -}
> -do_unpack[lockfiles] += "${CVE_CHECK_DB_DLDIR_LOCK} ${CVE_CHECK_DB_FILE_LOCK}"
> -
> -def cleanup_db_download(db_tmp_file):
> -    """
> -    Cleanup the download space from possible failed downloads
> -    """
> -
> -    # Clean-up the temporary file downloads, we can remove both journal
> -    # and the temporary database
> -    if os.path.exists("{0}-journal".format(db_tmp_file)):
> -        os.remove("{0}-journal".format(db_tmp_file))
> -    if os.path.exists(db_tmp_file):
> -        os.remove(db_tmp_file)
> -
> -def nvd_request_wait(attempt, min_wait):
> -    return min(((2 * attempt) + min_wait), 30)
> -
> -def nvd_request_next(url, attempts, api_key, args, min_wait):
> -    """
> -    Request next part of the NVD database
> -    NVD API documentation: https://nvd.nist.gov/developers/vulnerabilities
> -    """
> -
> -    import urllib.request
> -    import urllib.parse
> -    import gzip
> -    import http
> -    import time
> -
> -    request = urllib.request.Request(url + "?" + urllib.parse.urlencode(args))
> -    if api_key:
> -        request.add_header("apiKey", api_key)
> -    bb.note("Requesting %s" % request.full_url)
> -
> -    for attempt in range(attempts):
> -        try:
> -            r = urllib.request.urlopen(request)
> -
> -            if (r.headers['content-encoding'] == 'gzip'):
> -                buf = r.read()
> -                raw_data = gzip.decompress(buf)
> -            else:
> -                raw_data = r.read().decode("utf-8")
> -
> -            r.close()
> -
> -        except Exception as e:
> -            wait_time = nvd_request_wait(attempt, min_wait)
> -            bb.note("CVE database: received error (%s)" % (e))
> -            bb.note("CVE database: retrying download after %d seconds. attempted (%d/%d)" % (wait_time, attempt+1, attempts))
> -            time.sleep(wait_time)
> -            pass
> -        else:
> -            return raw_data
> -    else:
> -        # We failed at all attempts
> -        return None
> -
> -def update_db_file(db_tmp_file, d, database_time):
> -    """
> -    Update the given database file
> -    """
> -    import bb.progress
> -    import bb.utils
> -    import datetime
> -    import sqlite3
> -    import json
> -
> -    # Connect to database
> -    conn = sqlite3.connect(db_tmp_file)
> -    initialize_db(conn)
> -
> -    req_args = {'startIndex': 0}
> -
> -    incr_update_threshold = int(d.getVar("CVE_DB_INCR_UPDATE_AGE_THRES"))
> -    if database_time != 0:
> -        database_date = datetime.datetime.fromtimestamp(database_time, tz=datetime.timezone.utc)
> -        today_date = datetime.datetime.now(tz=datetime.timezone.utc)
> -        delta = today_date - database_date
> -        if incr_update_threshold == 0:
> -            bb.note("CVE database: forced full update")
> -        elif delta < datetime.timedelta(seconds=incr_update_threshold):
> -            bb.note("CVE database: performing partial update")
> -            # The maximum range for time is 120 days
> -            if delta > datetime.timedelta(days=120):
> -                bb.error("CVE database: Trying to do an incremental update on a larger than supported range")
> -            req_args['lastModStartDate'] = database_date.isoformat()
> -            req_args['lastModEndDate'] = today_date.isoformat()
> -        else:
> -            bb.note("CVE database: file too old, forcing a full update")
> -    else:
> -        bb.note("CVE database: no preexisting database, do a full download")
> -
> -    with bb.progress.ProgressHandler(d) as ph, open(os.path.join(d.getVar("TMPDIR"), 'cve_check'), 'a') as cve_f:
> -
> -        bb.note("Updating entries")
> -        index = 0
> -        url = d.getVar("NVDCVE_URL")
> -        api_key = d.getVar("NVDCVE_API_KEY") or None
> -        attempts = int(d.getVar("CVE_DB_UPDATE_ATTEMPTS"))
> -
> -        # Recommended by NVD
> -        wait_time = 6
> -        if api_key:
> -            wait_time = 2
> -
> -        while True:
> -            req_args['startIndex'] = index
> -            raw_data = nvd_request_next(url, attempts, api_key, req_args, wait_time)
> -            if raw_data is None:
> -                # We haven't managed to download data
> -                return False
> -
> -            # hack for json5 style responses
> -            if raw_data[-3:] == ',]}':
> -                bb.note("Removing trailing ',' from nvd response")
> -                raw_data = raw_data[:-3] + ']}'
> -
> -            data = json.loads(raw_data)
> -
> -            index = data["startIndex"]
> -            total = data["totalResults"]
> -            per_page = data["resultsPerPage"]
> -            bb.note("Got %d entries" % per_page)
> -            for cve in data["vulnerabilities"]:
> -                update_db(conn, cve)
> -
> -            index += per_page
> -            ph.update((float(index) / (total+1)) * 100)
> -            if index >= total:
> -                break
> -
> -            # Recommended by NVD
> -            time.sleep(wait_time)
> -
> -        # Update success, set the date to cve_check file.
> -        cve_f.write('CVE database update : %s\n\n' % datetime.date.today())
> -
> -    conn.commit()
> -    conn.close()
> -    return True
> -
> -def initialize_db(conn):
> -    with conn:
> -        c = conn.cursor()
> -
> -        c.execute("CREATE TABLE IF NOT EXISTS META (YEAR INTEGER UNIQUE, DATE TEXT)")
> -
> -        c.execute("CREATE TABLE IF NOT EXISTS NVD (ID TEXT UNIQUE, SUMMARY TEXT, \
> -            SCOREV2 TEXT, SCOREV3 TEXT, SCOREV4 TEXT, MODIFIED INTEGER, VECTOR TEXT, VECTORSTRING TEXT)")
> -
> -        c.execute("CREATE TABLE IF NOT EXISTS PRODUCTS (ID TEXT, \
> -            VENDOR TEXT, PRODUCT TEXT, VERSION_START TEXT, OPERATOR_START TEXT, \
> -            VERSION_END TEXT, OPERATOR_END TEXT)")
> -        c.execute("CREATE INDEX IF NOT EXISTS PRODUCT_ID_IDX on PRODUCTS(ID);")
> -
> -        c.close()
> -
> -def parse_node_and_insert(conn, node, cveId):
> -
> -    def cpe_generator():
> -        for cpe in node.get('cpeMatch', ()):
> -            if not cpe['vulnerable']:
> -                return
> -            cpe23 = cpe.get('criteria')
> -            if not cpe23:
> -                return
> -            cpe23 = cpe23.split(':')
> -            if len(cpe23) < 6:
> -                return
> -            vendor = cpe23[3]
> -            product = cpe23[4]
> -            version = cpe23[5]
> -
> -            if cpe23[6] == '*' or cpe23[6] == '-':
> -                version_suffix = ""
> -            else:
> -                version_suffix = "_" + cpe23[6]
> -
> -            if version != '*' and version != '-':
> -                # Version is defined, this is a '=' match
> -                yield [cveId, vendor, product, version + version_suffix, '=', '', '']
> -            elif version == '-':
> -                # no version information is available
> -                yield [cveId, vendor, product, version, '', '', '']
> -            else:
> -                # Parse start version, end version and operators
> -                op_start = ''
> -                op_end = ''
> -                v_start = ''
> -                v_end = ''
> -
> -                if 'versionStartIncluding' in cpe:
> -                    op_start = '>='
> -                    v_start = cpe['versionStartIncluding']
> -
> -                if 'versionStartExcluding' in cpe:
> -                    op_start = '>'
> -                    v_start = cpe['versionStartExcluding']
> -
> -                if 'versionEndIncluding' in cpe:
> -                    op_end = '<='
> -                    v_end = cpe['versionEndIncluding']
> -
> -                if 'versionEndExcluding' in cpe:
> -                    op_end = '<'
> -                    v_end = cpe['versionEndExcluding']
> -
> -                if op_start or op_end or v_start or v_end:
> -                    yield [cveId, vendor, product, v_start, op_start, v_end, op_end]
> -                else:
> -                    # This is no version information, expressed differently.
> -                    # Save processing by representing as -.
> -                    yield [cveId, vendor, product, '-', '', '', '']
> -
> -    conn.executemany("insert into PRODUCTS values (?, ?, ?, ?, ?, ?, ?)", cpe_generator()).close()
> -
> -def update_db(conn, elt):
> -    """
> -    Update a single entry in the on-disk database
> -    """
> -
> -    accessVector = None
> -    vectorString = None
> -    cveId = elt['cve']['id']
> -    if elt['cve'].get('vulnStatus') == "Rejected":
> -        c = conn.cursor()
> -        c.execute("delete from PRODUCTS where ID = ?;", [cveId])
> -        c.execute("delete from NVD where ID = ?;", [cveId])
> -        c.close()
> -        return
> -    cveDesc = ""
> -    for desc in elt['cve']['descriptions']:
> -        if desc['lang'] == 'en':
> -            cveDesc = desc['value']
> -    date = elt['cve']['lastModified']
> -
> -    # Extract maximum CVSS scores from all sources (Primary and Secondary)
> -    cvssv2 = 0.0
> -    try:
> -        # Iterate through all cvssMetricV2 entries and find the maximum score
> -        for metric in elt['cve']['metrics']['cvssMetricV2']:
> -            score = metric['cvssData']['baseScore']
> -            if score > cvssv2:
> -                cvssv2 = score
> -                accessVector = metric['cvssData']['accessVector']
> -                vectorString = metric['cvssData']['vectorString']
> -    except KeyError:
> -        pass
> -
> -    cvssv3 = 0.0
> -    try:
> -        # Iterate through all cvssMetricV30 entries and find the maximum score
> -        for metric in elt['cve']['metrics']['cvssMetricV30']:
> -            score = metric['cvssData']['baseScore']
> -            if score > cvssv3:
> -                cvssv3 = score
> -                accessVector = accessVector or metric['cvssData']['attackVector']
> -                vectorString = vectorString or metric['cvssData']['vectorString']
> -    except KeyError:
> -        pass
> -
> -    try:
> -        # Iterate through all cvssMetricV31 entries and find the maximum score
> -        for metric in elt['cve']['metrics']['cvssMetricV31']:
> -            score = metric['cvssData']['baseScore']
> -            if score > cvssv3:
> -                cvssv3 = score
> -                accessVector = accessVector or metric['cvssData']['attackVector']
> -                vectorString = vectorString or metric['cvssData']['vectorString']
> -    except KeyError:
> -        pass
> -
> -    cvssv4 = 0.0
> -    try:
> -        # Iterate through all cvssMetricV40 entries and find the maximum score
> -        for metric in elt['cve']['metrics']['cvssMetricV40']:
> -            score = metric['cvssData']['baseScore']
> -            if score > cvssv4:
> -                cvssv4 = score
> -                accessVector = accessVector or metric['cvssData']['attackVector']
> -                vectorString = vectorString or metric['cvssData']['vectorString']
> -    except KeyError:
> -        pass
> -
> -    accessVector = accessVector or "UNKNOWN"
> -    vectorString = vectorString or "UNKNOWN"
> -
> -    conn.execute("insert or replace into NVD values (?, ?, ?, ?, ?, ?, ?, ?)",
> -                [cveId, cveDesc, cvssv2, cvssv3, cvssv4, date, accessVector, vectorString]).close()
> -
> -    try:
> -        # Remove any pre-existing CVE configuration. Even for partial database
> -        # update, those will be repopulated. This ensures that old
> -        # configuration is not kept for an updated CVE.
> -        conn.execute("delete from PRODUCTS where ID = ?", [cveId]).close()
> -        for config in elt['cve']['configurations']:
> -            # This is suboptimal as it doesn't handle AND/OR and negate, but is better than nothing
> -            for node in config["nodes"]:
> -                parse_node_and_insert(conn, node, cveId)
> -    except KeyError:
> -        bb.note("CVE %s has no configurations" % cveId)
> -
> -do_fetch[nostamp] = "1"
> -
> -EXCLUDE_FROM_WORLD = "1"
> --
> 2.43.0



^ permalink raw reply	[flat|nested] 12+ messages in thread
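[Editorial note on the schema in the deleted code: the PRODUCTS rows store a CVE ID plus vendor/product and operator/version bounds ('>='/'>' for the start, '<='/'<' for the end, or a single '='). A consumer could evaluate those bounds against a package version roughly as follows. This is a hypothetical sketch only: the helper names and the naive dotted-version comparison are illustrative, not oe-core code (the real cve-check uses a richer version comparison).]

```python
# Hypothetical sketch of matching a version against a PRODUCTS-style row.
# version_key() is a deliberately naive dotted-version comparison for
# illustration; it is NOT how oe-core compares versions.

def version_key(v):
    # Split "1.2.10" into a comparable integer tuple; non-numeric
    # components fall back to 0 in this sketch.
    return tuple(int(p) if p.isdigit() else 0 for p in v.split("."))

OPS = {
    ">=": lambda a, b: version_key(a) >= version_key(b),
    ">":  lambda a, b: version_key(a) >  version_key(b),
    "<=": lambda a, b: version_key(a) <= version_key(b),
    "<":  lambda a, b: version_key(a) <  version_key(b),
    "=":  lambda a, b: version_key(a) == version_key(b),
}

def row_matches(version, v_start, op_start, v_end, op_end):
    """Return True if `version` falls inside the row's version range."""
    if op_start == "=":
        # Exact-version rows store the version in the start column
        return OPS["="](version, v_start)
    if op_start and not OPS[op_start](version, v_start):
        return False
    if op_end and not OPS[op_end](version, v_end):
        return False
    return True
```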

* Re: [OE-core] [PATCH 1/3] classes/vex: remove
  2026-03-31 13:24 [PATCH 1/3] classes/vex: remove Ross Burton
  2026-03-31 13:24 ` [PATCH 2/3] classes/sbom-cve-check: remove references to vex.bbclass Ross Burton
  2026-03-31 13:24 ` [PATCH 3/3] classes/cve-check: remove class Ross Burton
@ 2026-03-31 14:43 ` Marta Rybczynska
  2026-03-31 15:04   ` Ross Burton
  2026-04-01 12:29   ` Daniel Turull
  2 siblings, 2 replies; 12+ messages in thread
From: Marta Rybczynska @ 2026-03-31 14:43 UTC (permalink / raw)
  To: ross.burton; +Cc: openembedded-core


On Tue, Mar 31, 2026 at 3:24 PM Ross Burton via lists.openembedded.org
<ross.burton=arm.com@lists.openembedded.org> wrote:

> This class existed as a provider of information for external CVE tooling,
> and uses a non-standard format that is OpenEmbedded-specific[1].
>
> However, the SPDX 3 output can contain all of this needed information,
> in a format that is standardised.
>
> I'm unaware of any active users of this class beyond sbom-cve-check,
> which can also read the data from the SPDX if SPDX_INCLUDE_VEX has been
> set.
>
> So that we don't have to maintain this class for the lifetime of the
> Wrynose LTS, delete it.
>
> [1] oe-core 6352ad93a72 ("vex.bbclass: add a new class")
>
>
For the record, I do not agree with this removal. SPDX3 has still not
reached mature usage outside of the LF ecosystem.

Kind regards,
Marta


* Re: [OE-core] [PATCH 1/3] classes/vex: remove
  2026-03-31 14:43 ` [OE-core] [PATCH 1/3] classes/vex: remove Marta Rybczynska
@ 2026-03-31 15:04   ` Ross Burton
  2026-03-31 15:13     ` Marta Rybczynska
  2026-04-01 12:29   ` Daniel Turull
  1 sibling, 1 reply; 12+ messages in thread
From: Ross Burton @ 2026-03-31 15:04 UTC (permalink / raw)
  To: rybczynska@gmail.com; +Cc: openembedded-core@lists.openembedded.org

On 31 Mar 2026, at 15:43, Marta Rybczynska via lists.openembedded.org <rybczynska=gmail.com@lists.openembedded.org> wrote:
> For the record, I do not agree with this removal. SPDX3 has still not reached mature usage outside of the LF ecosystem.

Is there an ecosystem of tooling that uses the Yocto-specific JSON format that vex.bbclass generates?

Ross


* Re: [OE-core] [PATCH 1/3] classes/vex: remove
  2026-03-31 15:04   ` Ross Burton
@ 2026-03-31 15:13     ` Marta Rybczynska
  2026-03-31 15:19       ` Ross Burton
  0 siblings, 1 reply; 12+ messages in thread
From: Marta Rybczynska @ 2026-03-31 15:13 UTC (permalink / raw)
  To: Ross Burton; +Cc: openembedded-core@lists.openembedded.org


On Tue, Mar 31, 2026 at 5:05 PM Ross Burton <Ross.Burton@arm.com> wrote:

> On 31 Mar 2026, at 15:43, Marta Rybczynska via lists.openembedded.org
> <rybczynska=gmail.com@lists.openembedded.org> wrote:
> > For the record, I do not agree with this removal. SPDX3 has still not
> reached mature usage outside of the LF ecosystem.
>
> Is there an ecosystem of tooling that uses the Yocto-specific JSON format
> that vex.bbclass generates?


Yes, there are tools that support the YP cve-check JSON format as input
and, by extension, they can consume the vex.bbclass output too.

Not sure how many of them want to manifest themselves; this is usually
internal tooling.

Kind regards,
Marta
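
[Editorial note: tooling of this kind typically just walks the cve-check/vex JSON report. A minimal sketch follows; the field names ("package", "issue", "id", "status") are assumptions based on typical cve-check output, not a documented schema — verify against the output of your release.]

```python
import json

# Hypothetical sketch of a cve-check/vex JSON consumer.
# Field names are assumed from typical cve-check output.

def unpatched_cves(report_text):
    """Map package name -> list of CVE IDs still marked Unpatched."""
    report = json.loads(report_text)
    result = {}
    for pkg in report.get("package", []):
        open_ids = [issue["id"] for issue in pkg.get("issue", [])
                    if issue.get("status") == "Unpatched"]
        if open_ids:
            result[pkg["name"]] = open_ids
    return result

# Example input shaped like a (hypothetical) cve-check summary
example = """
{
  "version": "1",
  "package": [
    {"name": "openssl", "version": "3.2.0",
     "issue": [
       {"id": "CVE-2024-0001", "status": "Unpatched"},
       {"id": "CVE-2023-0002", "status": "Patched"}
     ]}
  ]
}
"""
```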


* Re: [OE-core] [PATCH 1/3] classes/vex: remove
  2026-03-31 15:13     ` Marta Rybczynska
@ 2026-03-31 15:19       ` Ross Burton
  0 siblings, 0 replies; 12+ messages in thread
From: Ross Burton @ 2026-03-31 15:19 UTC (permalink / raw)
  To: Marta Rybczynska; +Cc: openembedded-core@lists.openembedded.org

On 31 Mar 2026, at 16:13, Marta Rybczynska <rybczynska@gmail.com> wrote:
> On Tue, Mar 31, 2026 at 5:05 PM Ross Burton <Ross.Burton@arm.com> wrote:
> On 31 Mar 2026, at 15:43, Marta Rybczynska via lists.openembedded.org <rybczynska=gmail.com@lists.openembedded.org> wrote:
> > For the record, I do not agree with this removal. SPDX3 has still not reached mature usage outside of the LF ecosystem.
> 
> Is there an ecosystem of tooling that uses the Yocto-specific JSON format that vex.bbclass generates?
> 
> Yes there are tools that support YP cve-check JSON format as input and, by extension, they can use the vex.bbclass too.
> 
> Not sure how many of them want to manifest, this is usually internal tooling.

IMHO, if these users don’t want to move to SPDX, then they can copy the vex.bbclass from current master into their distro.

Ross


* RE: [OE-core] [PATCH 1/3] classes/vex: remove
  2026-03-31 14:43 ` [OE-core] [PATCH 1/3] classes/vex: remove Marta Rybczynska
  2026-03-31 15:04   ` Ross Burton
@ 2026-04-01 12:29   ` Daniel Turull
  2026-04-02 14:42     ` Ross Burton
  1 sibling, 1 reply; 12+ messages in thread
From: Daniel Turull @ 2026-04-01 12:29 UTC (permalink / raw)
  To: rybczynska@gmail.com, ross.burton@arm.com
  Cc: openembedded-core@lists.openembedded.org


Hi,

The kernel scripts to check CVEs uses the vex output as input.
https://git.openembedded.org/openembedded-core/tree/scripts/contrib/improve_kernel_cve_report.py

Daniel

From: openembedded-core@lists.openembedded.org <openembedded-core@lists.openembedded.org> On Behalf Of Marta Rybczynska via lists.openembedded.org
Sent: Tuesday, 31 March 2026 16:43
To: ross.burton@arm.com
Cc: openembedded-core@lists.openembedded.org
Subject: Re: [OE-core] [PATCH 1/3] classes/vex: remove



On Tue, Mar 31, 2026 at 3:24 PM Ross Burton via lists.openembedded.org <ross.burton=arm.com@lists.openembedded.org> wrote:
This class existed as a provider of information for external CVE tooling,
and uses a non-standard format that is OpenEmbedded-specific[1].

However, the SPDX 3 output can contain all of this needed information,
in a format that is standardised.

I'm unaware of any active users of this class beyond sbom-cve-check,
which can also read the data from the SPDX if SPDX_INCLUDE_VEX has been
set.

So that we don't have to maintain this class for the lifetime of the
Wrynose LTS, delete it.

[1] oe-core 6352ad93a72 ("vex.bbclass: add a new class")

For the record, I do not agree with this removal. SPDX3 has still not reached mature usage outside of the LF ecosystem.

Kind regards,
Marta


* Re: [OE-core] [PATCH 1/3] classes/vex: remove
  2026-04-01 12:29   ` Daniel Turull
@ 2026-04-02 14:42     ` Ross Burton
  2026-04-02 14:53       ` Benjamin Robin
  0 siblings, 1 reply; 12+ messages in thread
From: Ross Burton @ 2026-04-02 14:42 UTC (permalink / raw)
  To: Daniel Turull
  Cc: rybczynska@gmail.com, openembedded-core@lists.openembedded.org,
	Benjamin Robin

On 1 Apr 2026, at 13:29, Daniel Turull <daniel.turull@ericsson.com> wrote:
>  The kernel scripts to check CVEs uses the vex output as input.
> https://git.openembedded.org/openembedded-core/tree/scripts/contrib/improve_kernel_cve_report.py

I believe this functionality is also superseded by sbom-cve-check, as the recommended configuration fragment sets SPDX_INCLUDE_COMPILED_SOURCES:pn-linux-yocto = "1".

Would you be able to verify this? We might be able to deprecate/remove this script too in master.

Ross
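
[Editorial note: the configuration referred to here would look roughly like the following local.conf sketch. The variable names come from this thread; the SPDX_INCLUDE_VEX value and the inherited class names are assumptions — check the actual fragment shipped in oe-core.]

```conf
# Hypothetical local.conf sketch; the real recommended fragment may differ.
INHERIT += "create-spdx sbom-cve-check"

# Emit VEX data in the SPDX output so sbom-cve-check can consume it
# (value is an assumption; see the create-spdx class documentation)
SPDX_INCLUDE_VEX = "current"

# Record which kernel sources were actually compiled, so kernel CVE
# applicability can be judged per-configuration
SPDX_INCLUDE_COMPILED_SOURCES:pn-linux-yocto = "1"
```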



* Re: [OE-core] [PATCH 1/3] classes/vex: remove
  2026-04-02 14:42     ` Ross Burton
@ 2026-04-02 14:53       ` Benjamin Robin
  2026-04-07  7:39         ` Daniel Turull
  0 siblings, 1 reply; 12+ messages in thread
From: Benjamin Robin @ 2026-04-02 14:53 UTC (permalink / raw)
  To: Daniel Turull, Ross Burton
  Cc: rybczynska@gmail.com, openembedded-core@lists.openembedded.org

Hello,

On Thursday, April 2, 2026 at 4:42 PM, Ross Burton wrote:
> On 1 Apr 2026, at 13:29, Daniel Turull <daniel.turull@ericsson.com> wrote:
> >  The kernel scripts to check CVEs uses the vex output as input.
> > https://git.openembedded.org/openembedded-core/tree/scripts/contrib/improve_kernel_cve_report.py
> 
> I believe this functionality is also superseded by sbom-cve-check, as the recommended configuration fragment sets SPDX_INCLUDE_COMPILED_SOURCES:pn-linux-yocto = "1".
> 
> Would you be able to verify this? If so, we might be able to deprecate/remove this script too in master.
> 
> Ross

Currently, sbom-cve-check does not fully handle Linux kernel CVEs correctly.
Special processing is required when the information originates from the
kernel CNA, as many kernel CVEs are incorrectly marked as vulnerable.

Additionally, sbom-cve-check does not yet provide an assessment as detailed
as improve_kernel_cve_report.py.

The first limitation is planned to be addressed in the very near future
(within this month). And for the second point, I hope I can address it
at the same time.

-- 
Benjamin Robin, Bootlin
Embedded Linux and Kernel engineering
https://bootlin.com





^ permalink raw reply	[flat|nested] 12+ messages in thread

* RE: [OE-core] [PATCH 1/3] classes/vex: remove
  2026-04-02 14:53       ` Benjamin Robin
@ 2026-04-07  7:39         ` Daniel Turull
  0 siblings, 0 replies; 12+ messages in thread
From: Daniel Turull @ 2026-04-07  7:39 UTC (permalink / raw)
  To: Benjamin Robin, Ross Burton
  Cc: rybczynska@gmail.com, openembedded-core@lists.openembedded.org

Hi,
While the vex output is optional for improve_kernel_cve_report.py (it improves the quality of the report but is not a blocker), Benjamin also notes that the script is still not integrated with sbom-cve-check.

I could also add the functionality to read from the SPDX 3 SBOM, but it won't make it for the next release.

I think it is a bit too late in the cycle to keep removing functionality. I can keep vex.bbclass in our meta-layers, but this will reduce the collaborative work around it.

Best regards,
Daniel

> -----Original Message-----
> From: Benjamin Robin <benjamin.robin@bootlin.com>
> Sent: Thursday, 2 April 2026 16:54
> To: Daniel Turull <daniel.turull@ericsson.com>; Ross Burton
> <Ross.Burton@arm.com>
> Cc: rybczynska@gmail.com; openembedded-core@lists.openembedded.org
> Subject: Re: [OE-core] [PATCH 1/3] classes/vex: remove
> 
> Hello,
> 
> On Thursday, April 2, 2026 at 4:42 PM, Ross Burton wrote:
> > On 1 Apr 2026, at 13:29, Daniel Turull <daniel.turull@ericsson.com> wrote:
> > >  The kernel scripts to check CVEs uses the vex output as input.
> > > https://git.openembedded.org/openembedded-core/tree/scripts/contrib/improve_kernel_cve_report.py
> >
> > I believe this functionality is also superseded by sbom-cve-check, as the
> > recommended configuration fragment sets
> > SPDX_INCLUDE_COMPILED_SOURCES:pn-linux-yocto = "1".
> >
> > Would you be able to verify this? If so, we might be able to deprecate/remove
> > this script too in master.
> >
> > Ross
> 
> Currently, sbom-cve-check does not fully handle Linux kernel CVEs correctly.
> Special processing is required when the information originates from the
> kernel CNA, as many kernel CVEs are incorrectly marked as vulnerable.
> 
> Additionally, sbom-cve-check does not yet provide an assessment as detailed
> as improve_kernel_cve_report.py.
> 
> The first limitation is planned to be addressed in the very near future (within
> this month). And for the second point, I hope I can address it at the same
> time.
> 
> --
> Benjamin Robin, Bootlin
> Embedded Linux and Kernel engineering
> https://bootlin.com
> 
> 


^ permalink raw reply	[flat|nested] 12+ messages in thread

end of thread, other threads:[~2026-04-07  7:40 UTC | newest]

Thread overview: 12+ messages (download: mbox.gz / follow: Atom feed)
-- links below jump to the message on this page --
2026-03-31 13:24 [PATCH 1/3] classes/vex: remove Ross Burton
2026-03-31 13:24 ` [PATCH 2/3] classes/sbom-cve-check: remove references to vex.bbclass Ross Burton
2026-03-31 13:24 ` [PATCH 3/3] classes/cve-check: remove class Ross Burton
2026-03-31 13:48   ` [OE-core] " Marko, Peter
2026-03-31 14:43 ` [OE-core] [PATCH 1/3] classes/vex: remove Marta Rybczynska
2026-03-31 15:04   ` Ross Burton
2026-03-31 15:13     ` Marta Rybczynska
2026-03-31 15:19       ` Ross Burton
2026-04-01 12:29   ` Daniel Turull
2026-04-02 14:42     ` Ross Burton
2026-04-02 14:53       ` Benjamin Robin
2026-04-07  7:39         ` Daniel Turull

This is a public inbox, see mirroring instructions
for how to clone and mirror all data and code used for this inbox