path: root/poky/meta/classes/create-spdx.bbclass
author     Andrew Geissler <geissonator@yahoo.com>    2021-09-25 00:47:35 +0300
committer  Andrew Geissler <geissonator@yahoo.com>    2021-10-05 22:27:21 +0300
commit     5199d831602da71945df7cef62eb3c01183cf20e (patch)
tree       0e4c75d4ac0f346489cb92fa4ccccf0a0aa56fe1 /poky/meta/classes/create-spdx.bbclass
parent     e9e982486160a1d724bf30f21167d72dfbcb84ce (diff)
download   openbmc-5199d831602da71945df7cef62eb3c01183cf20e.tar.xz
subtree updates
meta-security: 1f18c623e9..de6712a806: Armin Kuster (8): cryfs: drop recipe trousers: set precise BSD license ibmtpm2tss: set precise BSD license ibmswtpm2: set precise BSD license opendnssec: set precise BSD license checksec: set precise BSD license isic: set precise BSD license tpm-quote-tools: Update SRC_URI Christer Fletcher (1): dm-verity-img.bbclass: Expose --data-block-size for configuration Kai Kang (1): sssd: 2.5.1 -> 2.5.2 meta-raspberrypi: a6fa6b3aec..9eb4879cf4: Andrew Penner (1): rpi-cmdline: Support ethernet over USB Khem Raj (2): linux-raspberrypi: Update to 5.10.63 raspberrypi-firmware: Update to latest meta-openembedded: e4a3c66505..cff8331f96: Armin Kuster (21): python3-cycler: set precise BSD license python3-dill: set precise BSD license python3-ipython-genutils: set precise BSD license python3-traitlets: set precise BSD license python3-parallax: set precise BSD license python3-ipython:set precise BSD license python3-mpmath: set precise BSD license python3-sympy: set precise BSD license python3-sqlparse: set precise BSD license python3-webencodings: set precise BSD license python3-pyperclip:set precise BSD license python3-geojson: set precise BSD license python3-aenum: set precise BSD license python3-gnupg: set precise BSD license python3-kiwisolver: set precise BSD license python3-jdcal: set precise BSD license python3-send2trash: set precise BSD license python3-flask-bootstrap: Update LICENSES autossh: set precise BSD licenses jemalloc: set precise BSD license gpsd-machine-conf: set precise BSD license Bruce Ashfield (1): vboxguestdrivers: fix build against 5.14+ Ed Tanous (1): Boost-url Move to latest version Khem Raj (57): gdm: Add polkit to required distro features python3-lxml: Inherit pkgconfig python3-icu: Inherit pkgconfig python3-h5py: Inherit pkgconfig python3-pyparted: Inherit pkgconfig python3-systemd: Inherit pkgconfig rp-pppoe: Add configure cached variable via recipe site: Remove local site files postfix: Inherit pkgconfig emacs: Inherit pkgconfig libgnt: Inherit pkgconfig libgnt: Inherit pkgconfig portaudio-v19: Inherit pkgconfig sshfs-fuse: Inherit pkgconfig appstream-glib: Inherit pkgconfig volume-key: Inherit pkgconfig kronosnet: Inherit pkgconfig rrdtool: Inherit pkgconfig libbytesize: Inherit pkgconfig dlt-daemon: Inherit pkgconfig libmypaint: Inherit pkgconfig libubox: Inherit pkgconfig xfsprogs: Inherit pkgconfig pavucontrol: Inherit pkgconfig blueman: Inherit pkgconfig mimic: Inherit pkgconfig libchamplain: Inherit pkgconfig gst-shark: Inherit pkgconfig zchunk: Inherit pkgconfig libvdpau: Inherit pkgconfig tigervnc: Inherit pkgconfig mpc: Inherit pkgconfig avro-c: Inherit pkgconfig udevil: Inherit pkgconfig remmina: Inherit pkgconfig transmission: Inherit pkgconfig libuvc: Inherit pkgconfig crda: Inherit pkgconfig wxwidgets: Inherit pkgconfig mdbus2: Inherit pkgconfig firewalld: Inherit pkgconfig renderdoc: Inherit pkgconfig fetchmail: Inherit pkgconfig ncmpc: Inherit pkgconfig yad: Inherit pkgconfig mscgen: Inherit pkgconfig libldb: Inherit pkgconfig pahole: Inherit missing pkgconfig gerbera: Inherit pkgconfig xfce4-datetime-setter: Inherit pkgconfig libblockdev: Inherit pkgconfig ntopng: Inherit pkgconfig mosquitto: Inherit pkgconfig samba: Inherit pkgconfig fio: Upgrade to 3.28 rdma-core: Inherit pkgconfig postfix: Add missing dependency on m4 Marek Vasut (1): dstat: Add missing python-six runtime dependency Matteo Croce (1): pahole: call python via env in the shebang Pascal Bach (1): poco: update to 1.11.0 Peter Kjellerstedt (1): libiio: Make 
libiio-python3 depend on python3-core Pierre-Jean Texier (1): cppzmq: upgrade 4.8.0 -> 4.8.1 Sakib Sajal (3): bats: source files from correct directory gd: upgrade 2.3.2 -> 2.3.3 lmdb: replace tag with commit id in SRCREV Trevor Woerner (2): vk-gl-cts: allow the user to specify the target vk-gl-cts: fix soname linking Yi Zhao (2): samba: upgrade 4.14.5 -> 4.14.7 net-snmp: remove perllocal.pod when enable packageconfig[perl] jan (1): netdata: Fixed the recipe. wangmy (3): byacc: upgrade 20200910 -> 20210808 nghttp2: upgrade 1.44.0 -> 1.45.1 apache2: upgrade 2.4.48 -> 2.4.49 zangrc (5): python3-beautifulsoup4: upgrade 4.9.3 -> 4.10.0 python3-bitarray: upgrade 2.3.3 -> 2.3.4 python3-decorator: upgrade 5.0.9 -> 5.1.0 python3-grpcio-tools: upgrade 1.39.0 -> 1.40.0 python3-grpcio: upgrade 1.39.0 -> 1.40.0 zhengruoqin (5): python3-openpyxl: upgrade 3.0.7 -> 3.0.8 python3-pandas: upgrade 1.3.2 -> 1.3.3 python3-pulsectl: upgrade 21.5.18 -> 21.9.1 protobuf: upgrade 3.17.3 -> 3.18.0 span-lite: upgrade 0.10.0 -> 0.10.1 poky: 359e1cb62f..06dcace68b: Alexander Kanavin (13): lttng: update 2.12 -> 2.13.0 core-image-ptest-all: bump RAM requirement to 4G bitbake: bitbake: drop old rules for python warnings bitbake: bitbake: correct the collections vs collections.abc deprecation bitbake: bitbake: fix regexp deprecation warnings bitbake: bitbake: do not import imp in layerindexlib bitbake: bitbake: adjust parser error check for python 3.10 compatibility bitbake: bitbake: correct deprecation warning in process.py bitbake: bitbake: enable python warnings at the first opportunity meta: correct collections vs collections.abc deprecation wic: keep rootfs_size as integer cpan-base.bbclass: use raw string for regexp testimage: symlink the task log and qemu console log to tmp/log/oeqa Armin Kuster (2): apr: Security fix for CVE-2021-35940 tar: ignore node-tar CVEs Bruce Ashfield (11): linux-yocto/5.13: update to v5.13.13 linux-yocto/5.13: update to v5.13.15 linux-yocto/5.10: update to v5.10.61 linux-yocto/5.10: update to v5.10.63 yocto-bsp/5.10: update to v5.10.63 yocto-bsp/5.13: update to v5.13.15 libc-headers: bump to v5.14 linux-yocto: introduce 5.14 reference kernel systemtap: update to 4.5-latest conf/machine: bump qemu preferred versions to 5.14 poky: set default kernel to 5.14 Changqing Li (1): lttng-ust: fix do_compile error when PACKAGECONFIG examples is enabled Chanho Park (1): binutils: inherit pkgconfig to address libdebuginfod depdency Claudius Heine (1): rng-tools: add systemd-udev-settle wants to service Daniel Ammann (1): bitbake: fetch2/wget: Enable ftps Daniel Wagenknecht (2): mirrors.bbclass: provide additional rule for git repo fallbacks mirrors.bbclass: remove redundant server-specific mirrors Denys Dmytriyenko (1): readline: correct pkg-config dependency for termcap Hsia-Jun(Randy) Li (1): cross-canadian: make android pass target sys check Jon Mason (6): Update mailing list address README: update mailing list address dev-manual: update mailing list address core-image-sato: Fix runqemu error for qemuarmv5 machine/qemuarm*: use virtio graphics testimage: remove aarch64 xorg exclusion Joshua Watt (17): Add SPDX licenses classes/package: Add extended packaged data classes/create-spdx: Add class classes/create-spdx: Change creator classes/create-spdx: Add SHA1 to index file classes/create-spdx: Add index to DEPLOYDIR classes/create-spdx: Add runtime dependency mapping classes/create-spdx: Add NOASSERTION for unknown debug sources classes/create-spdx: Fix another creator classes/create-spdx: Fix up 
license reporting classes/create-spdx: Speed up hash calculations classes/create-spdx: Fix file:// in downloadLocation classes/create-spdx: Add special exception for Public Domain license classes/create-spdx: Collect all task dependencies classes/create-spdx: Skip package processing for native recipes classes/create-spdx: Comment out placeholder license warning bitbake: cooker: Allow upstream for local hash equivalence server Kai Kang (2): perl: fix CVE-2021-36770 rust-common.bbclass: make sure ccache exist Kevin Hao (1): meta-yocto-bsp: Update the default kernel to v5.14 Khem Raj (3): vim: Add packageconfig for sound notification support site: Drop caching libIDL_cv_long_long_format site: Drop ORBit2 relared cached variables Konrad Weihmann (1): expat: pull from github releases Kristian Klausen (3): systemd: Add homed PACKAGECONFIG wic: Add extra-space argument systemd: Add tpm2 PACKAGECONFIG Mark Hatle (3): reproducible_build: Remove BUILD_REPRODUCIBLE_BINARIES checking externalsrc: Work with reproducible_build tcf-agent: Move to the latest master version Markus Volk (1): util-linux: disable raw Martin Jansa (3): default-distrovars.inc: Set BBINCLUDELOGS to empty to disable printing failed task output multiple times bitbake: bitbake.conf: fix vars_from_file() call qemu-native: add direct dependency on ninja-native and meson-native Michael Halstead (1): releases: update to include 3.3.3 Michael Opdenacker (9): dev-manual: explicit that devpyshell is a task bitbake: bitbake-user-manual: replace "file name" by "filename" manuals: replace Freenode by Libera Chat as IRC host manuals: delete unmaintained history sections ref-manual: document UPSTREAM_CHECK_COMMITS and UPSTREAM_VERSION_UNKNOWN ref-manual: remove checkpkg task ref-manual: improve "devtool check-upgrade-status" details ref-manual: improve documentation for RECIPE_NO_UPDATE_REASON ref-manual: update "devtool check-upgrade-status" output Mingli Yu (6): coreutils: add pkgconfig for selinux findutils: add pkgconfig for selinux tar: add pkgconfig for selinux multilib.bbclass: add RDEPENDS related check back insane.bbclass: add FILERDEPENDS related check back python3: fix multilib qa issue Peter Bergin (1): systemd: add packageconfig for wheel-group Peter Kjellerstedt (2): common-licenses, licenses.conf: Remove duplicate licenses create-spdx.bbclass: Search all license directories for licenses Quentin Schulz (3): bitbake: doc: bitbake-user-manual-execution: remove mention to long-gone BBHASHDEPS variable conf/mips: mips16e: prepend override to MACHINEOVERRIDES bitbake: doc: bitbake-user-manual-fetching: S should be set to WORKDIR/git for git fetcher Randy MacLeod (1): tcmode-default: add rust to the default toolchains Ranjitsinh Rathod (1): rpm: Handle proper return value to avoid major issues Richard Purdie (67): oeqa/runtime/parselogs: Make DVD ata error apply to all qemux86 machines tcl: Exclude CVE-2021-35331 from checks xdg-utils: Add fix for CVE-2020-27748 build-appliance-image: Update to master head revision utils: Drop unused variable staging_install from oe_libinstall utils: Drop obsolete oe_machinstall function flex: Add CVE-2019-6293 to exclusions for checks go: Exclude CVE-2021-29923 from report list bitbake: runqueue: Avoid deadlock avoidance task graph corruption bitbake: runqueue: Fix issues with multiconfig deferred task deadlock messages oeqa/oescripts: Fix after tar recipe changes pseudo: Update with fcntl and glibc 2.34 fixes bitbake: persist_data: Drop deprecated/unused function bitbake: parse_py: Drop deprecated 
function reference bitbake: build: Match markup to real function name bitbake: build: Handle SystemExit in python tasks correctly bitbake: process: Don't include logs in error message if piping them bitbake: build: Avoid duplicating logs in verbose mode bitbake: data_smart: Make ExpansionErrors more readable bitbake: build: Catch and error upon circular task references bitbake: data_smart: Improve error display for handled exceptions bitbake: fetch2: Add recursion guard bitbake: cookerdata: Improve missing core layer error message bitbake: cookerdata: Show error for no BBLAYERS in bblayers.conf bitbake: runqueue: Clean up task stats handling Revert "default-distrovars.inc: Set BBINCLUDELOGS to empty to disable printing failed task output multiple times" bitbake.conf: Ensure XZ_THREADS doesn't change sstate checksums sstate: Avoid problems with recipes using SRCPV when fetching sstate local.conf.sample: Update sstate mirror entry with new hash equivalence setting useradd: Ensure preinst data is expanded correctly in pkgdata package: Fix pkgdata determinism issues sstate: Ensure SDE is accounted for in package task timestamps bash: Ensure deterministic build sstatesig: Allow exclusion of the root directory for do_package bitbake: bitbake-worker: Improve error handling bitbake: runqueue/knotty: Improve UI handling of setscene task counting bitbake: fetch2/git: Avoid races over mirror tarball creation README: Update email address for Bruce bitbake: cookerdata: Show a readable error for invalid multiconfig name bitbake: fetch2/git: Use os.rename instead of mv bitbake: tests/fetch2: Fix quoting warning bitbake: data_smart: Don't add None to ExpansionError varlist bitbake: fetch2/svn: Allow peg-revision functionality to be disabled vim: Backport fix for CVE-2021-3770 libgcrypt: Upgrade 1.9.3 -> 1.9.4 sqlite3: Exclude CVE-2021-36690 from cve checks recipes: Add missing pkgconfig inherit lttng-tools: Add missing DEPENDS on bison-native cross: Drop unused do_install pybootchart: Avoid divide by zero bitbake: tests/fetch2: Use our own git server for dtc test repo scripts/oe-publish-sdk: Disable git gc to avoid build errors image/qemu: Add explict depends for qemu-helper addto_recipe_sysroot task siteinfo/autotools: Ensure task checksums reflect site files package_ipk/deb/rpm: Drop recursive do_build task dependencies reproducible_build/package_XXX: Ensure SDE task is in dependency chain populate_sdk_base/images: Drop use of 'meta' class and hence do_build dependencies buildtools-tarball/uninative-tarball/meta-ide-support: Drop useless meta class meta: Drop useless class staging: Mark deploy an sstate task sstate: Ensure deploy tasks don't pull in toolchains sstate: Avoid deploy_source_date_epoch sstate when unneeded ssate: Cleanup directtasks handling bitbake: build: Ensure python stdout/stderr is logged correctly bitbake: build: Make exception printing clearer bitbake: build: Fix log flushing race oeqa/selftest: Add tests for bitbake shell/python task output Robert P. J. 
Day (16): dev-manual: pass False to d.getVar() for devpyshell example ref-manual: add missing "${PN}-src" to default PACKAGES list dev-manual: small number of minor aesthetic tweaks dev-manual: various pedantic nitpickery dev-manual: drop "three" since there are four requirements ref-manual: update SYSROOT_DIRS_* variable entries README: update manual list and names, online docs URL image_types_wic.bbclass: alphabetize list of WICVARS systemd: '${systemd_unitdir}/system' => '${systemd_system_unitdir}' ref-manual: render options in monospace to show quotes properly ref-manual: remove mention of obsolete devtool "--any-recipe" option ref-manual: correct typo in "classes" section, "${BPN}/{PV}" ref-manual: add potential of parallelism to defn of "Task" ref-manual: couple minor tweaks to Chapter 1 dev-manual: emphasize that new layers live outside of poky dev-manual: update output of "wic list images" Robert Yang (1): assimp: Remove it Ross Burton (40): lz4: remove redundant BSD license python3-numpy: remove redundant BSD license quota: remove BSD license nfs-utils: set precise BSD license dtc: set precise BSD license acpica: set precise BSD license libevent: set precise BSD license openssh: remove redundant BSD license python3-packaging: fix license statement iputils: set precise BSD license libx11-compose-data: set precise BSD license webkitgtk: set precise BSD license libwpe: set precise BSD license wpebackend-fdo: set precise BSD license common-licenses: add missing SPDX licences dev-manual/common-tasks: sync libxpm fragment with the recipe lsof: correct LICENSE selftest/python-async-test: set precise BSD license lsof: add upstream check xinetd: correct LICENSE oeqa/recipeutils: update for license change to python-async-test libxfont: set precise BSD license valgrind: set precise BSD license shadow-sysroot: sync license with shadow ovmf: set precise BSD license ppp: set precise BSD license ffmpeg: update LICENSE hdparm: set correct license recipetool/create_buildsys_python: treat BSD as BSD-3-Clause oeqa/selftest/recipetool: update for license changes create-spdx: transform license list into a dict for faster lookups create-spdx: remove redundant test create-spdx: embed unknown license texts create-spdx: don't duplicate license texts in each package create-spdx: handle CLOSED license ffmpeg: fix LICENSE avahi: remove obsolete intltool-native dependency shared-mime-info: use a more concise description libsoup-2.4: remove obsolete intltool dependency oeqa/target/ssh: don't assume target_dumper is set Sakib Sajal (1): go: upgrade 1.16.5 -> 1.16.7 Saul Wold (2): classes/create-spdx: extend DocumentRef to include name create-spdx: remove trailing comma Scott Weaver (3): bitbake: bitbake: fetch2: fix premirror URI when downloadfilename defined bitbake: bitbake: tests/fetch: add downloadfilename tests bitbake: bitbake: tests/fetch: add and fix npm tests Steve Sakoman (1): connman: add CVE_PRODUCT Tom Rini (1): common-tasks: Add an example of using bbappends to add a file Trevor Woerner (1): hello-mod/hello.c: convert to module_init/module_exit Valentin Danaila (1): bitbake: fetch2/s3: allow to switch profile from environment variable Vyacheslav Yurkov (1): ref-manual: add overlayfs class Signed-off-by: Andrew Geissler <geissonator@yahoo.com> Change-Id: I194b13991cbaac7ae9e20cc2b552b508ab879905
Diffstat (limited to 'poky/meta/classes/create-spdx.bbclass')
-rw-r--r--  poky/meta/classes/create-spdx.bbclass  931
1 file changed, 931 insertions, 0 deletions
diff --git a/poky/meta/classes/create-spdx.bbclass b/poky/meta/classes/create-spdx.bbclass
new file mode 100644
index 000000000..3c73c21c0
--- /dev/null
+++ b/poky/meta/classes/create-spdx.bbclass
@@ -0,0 +1,931 @@
+#
+# SPDX-License-Identifier: GPL-2.0-only
+#
+
+DEPLOY_DIR_SPDX ??= "${DEPLOY_DIR}/spdx/${MACHINE}"
+
+# The product name that the CVE database uses. Defaults to BPN, but may need to
+# be overridden per recipe (for example tiff.bb sets CVE_PRODUCT=libtiff).
+CVE_PRODUCT ??= "${BPN}"
+CVE_VERSION ??= "${PV}"
+
+SPDXDIR ??= "${WORKDIR}/spdx"
+SPDXDEPLOY = "${SPDXDIR}/deploy"
+SPDXWORK = "${SPDXDIR}/work"
+
+SPDXRUNTIMEDEPLOY = "${SPDXDIR}/runtime-deploy"
+
+SPDX_INCLUDE_SOURCES ??= "0"
+SPDX_INCLUDE_PACKAGED ??= "0"
+SPDX_ARCHIVE_SOURCES ??= "0"
+SPDX_ARCHIVE_PACKAGED ??= "0"
+
+SPDX_UUID_NAMESPACE ??= "sbom.openembedded.org"
+SPDX_NAMESPACE_PREFIX ??= "http://spdx.org/spdxdoc"
+
+SPDX_LICENSES ??= "${COREBASE}/meta/files/spdx-licenses.json"
+
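+# Image builds depend on the kernel's SPDX document so it is generated before
+# do_image_complete runs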
+do_image_complete[depends] = "virtual/kernel:do_create_spdx"
+
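+# Build a document namespace URI from SPDX_NAMESPACE_PREFIX and a UUIDv5
+# derived from SPDX_UUID_NAMESPACE and the document name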
+def get_doc_namespace(d, doc):
+ import uuid
+ namespace_uuid = uuid.uuid5(uuid.NAMESPACE_DNS, d.getVar("SPDX_UUID_NAMESPACE"))
+ return "%s/%s-%s" % (d.getVar("SPDX_NAMESPACE_PREFIX"), doc.name, str(uuid.uuid5(namespace_uuid, doc.name)))
+
+
+def is_work_shared(d):
+ pn = d.getVar('PN')
+ return bb.data.inherits_class('kernel', d) or pn.startswith('gcc-source')
+
+
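+# Load the SPDX license list JSON once at parse time and cache it in the
+# datastore, keyed by license ID for fast lookups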
+python() {
+ import json
+ if d.getVar("SPDX_LICENSE_DATA"):
+ return
+
+ with open(d.getVar("SPDX_LICENSES"), "r") as f:
+ data = json.load(f)
+ # Transform the license array to a dictionary
+ data["licenses"] = {l["licenseId"]: l for l in data["licenses"]}
+ d.setVar("SPDX_LICENSE_DATA", data)
+}
+
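+# Convert an OE LICENSE expression into an SPDX license expression. Licenses
+# that are not on the SPDX license list are added to the document as extracted
+# licensing info under a LicenseRef- identifier.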
+def convert_license_to_spdx(lic, document, d, existing={}):
+ from pathlib import Path
+ import oe.spdx
+
+ available_licenses = d.getVar("AVAILABLE_LICENSES").split()
+ license_data = d.getVar("SPDX_LICENSE_DATA")
+ extracted = {}
+
+ def add_extracted_license(ident, name):
+ nonlocal document
+
+ if name in extracted:
+ return
+
+ extracted_info = oe.spdx.SPDXExtractedLicensingInfo()
+ extracted_info.name = name
+ extracted_info.licenseId = ident
+ extracted_info.extractedText = None
+
+ if name == "PD":
+ # Special-case this.
+ extracted_info.extractedText = "Software released to the public domain"
+ elif name in available_licenses:
+ # This license can be found in COMMON_LICENSE_DIR or LICENSE_PATH
+ for directory in [d.getVar('COMMON_LICENSE_DIR')] + d.getVar('LICENSE_PATH').split():
+ try:
+ with (Path(directory) / name).open(errors="replace") as f:
+ extracted_info.extractedText = f.read()
+ break
+ except FileNotFoundError:
+ pass
+ if extracted_info.extractedText is None:
+ # Error out, as the license was in available_licenses so should
+ # be on disk somewhere.
+ bb.error("Cannot find text for license %s" % name)
+ else:
+ # If it's not SPDX, or PD, or in available licenses, then NO_GENERIC_LICENSE must be set
+ filename = d.getVarFlag('NO_GENERIC_LICENSE', name)
+ if filename:
+ filename = d.expand("${S}/" + filename)
+ with open(filename, errors="replace") as f:
+ extracted_info.extractedText = f.read()
+ else:
+ bb.error("Cannot find any text for license %s" % name)
+
+ extracted[name] = extracted_info
+ document.hasExtractedLicensingInfos.append(extracted_info)
+
+ def convert(l):
+ if l == "(" or l == ")":
+ return l
+
+ if l == "&":
+ return "AND"
+
+ if l == "|":
+ return "OR"
+
+ if l == "CLOSED":
+ return "NONE"
+
+ spdx_license = d.getVarFlag("SPDXLICENSEMAP", l) or l
+ if spdx_license in license_data["licenses"]:
+ return spdx_license
+
+ try:
+ spdx_license = existing[l]
+ except KeyError:
+ spdx_license = "LicenseRef-" + l
+ add_extracted_license(spdx_license, l)
+
+ return spdx_license
+
+ lic_split = lic.replace("(", " ( ").replace(")", " ) ").split()
+
+ return ' '.join(convert(l) for l in lic_split)
+
+
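+# Decide whether the source of this recipe should be processed. Recipes whose
+# patched source is not available at this point (glibc-locale, libtool-cross,
+# the gcc family and a few others) are skipped.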
+def process_sources(d):
+ pn = d.getVar('PN')
+ assume_provided = (d.getVar("ASSUME_PROVIDED") or "").split()
+ if pn in assume_provided:
+ for p in d.getVar("PROVIDES").split():
+ if p != pn:
+ pn = p
+ break
+
+ # glibc-locale: do_fetch, do_unpack and do_patch tasks have been deleted,
+ # so avoid archiving source here.
+ if pn.startswith('glibc-locale'):
+ return False
+ if d.getVar('PN') == "libtool-cross":
+ return False
+ if d.getVar('PN') == "libgcc-initial":
+ return False
+ if d.getVar('PN') == "shadow-sysroot":
+ return False
+
+    # We just archive gcc-source for all the gcc-related recipes
+    if d.getVar('BPN') in ['gcc', 'libgcc']:
+        bb.debug(1, 'spdx: There is a bug in the scan of %s, so do nothing' % pn)
+ return False
+
+ return True
+
+
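+# Walk topdir and add an SPDXFile with SHA1 and SHA256 checksums for every
+# regular file, optionally appending each file to the given tar archive, and
+# update the package verification code of spdx_pkg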
+def add_package_files(d, doc, spdx_pkg, topdir, get_spdxid, get_types, *, archive=None, ignore_dirs=[], ignore_top_level_dirs=[]):
+ from pathlib import Path
+ import oe.spdx
+ import hashlib
+
+ source_date_epoch = d.getVar("SOURCE_DATE_EPOCH")
+ if source_date_epoch:
+ source_date_epoch = int(source_date_epoch)
+
+ sha1s = []
+ spdx_files = []
+
+ file_counter = 1
+ for subdir, dirs, files in os.walk(topdir):
+ dirs[:] = [d for d in dirs if d not in ignore_dirs]
+ if subdir == str(topdir):
+ dirs[:] = [d for d in dirs if d not in ignore_top_level_dirs]
+
+ for file in files:
+ filepath = Path(subdir) / file
+ filename = str(filepath.relative_to(topdir))
+
+ if filepath.is_file() and not filepath.is_symlink():
+ spdx_file = oe.spdx.SPDXFile()
+ spdx_file.SPDXID = get_spdxid(file_counter)
+ for t in get_types(filepath):
+ spdx_file.fileTypes.append(t)
+ spdx_file.fileName = filename
+
+ if archive is not None:
+ with filepath.open("rb") as f:
+ info = archive.gettarinfo(fileobj=f)
+ info.name = filename
+ info.uid = 0
+ info.gid = 0
+ info.uname = "root"
+ info.gname = "root"
+
+ if source_date_epoch is not None and info.mtime > source_date_epoch:
+ info.mtime = source_date_epoch
+
+ archive.addfile(info, f)
+
+ sha1 = bb.utils.sha1_file(filepath)
+ sha1s.append(sha1)
+ spdx_file.checksums.append(oe.spdx.SPDXChecksum(
+ algorithm="SHA1",
+ checksumValue=sha1,
+ ))
+ spdx_file.checksums.append(oe.spdx.SPDXChecksum(
+ algorithm="SHA256",
+ checksumValue=bb.utils.sha256_file(filepath),
+ ))
+
+ doc.files.append(spdx_file)
+ doc.add_relationship(spdx_pkg, "CONTAINS", spdx_file)
+ spdx_pkg.hasFiles.append(spdx_file.SPDXID)
+
+ spdx_files.append(spdx_file)
+
+ file_counter += 1
+
+ sha1s.sort()
+ verifier = hashlib.sha1()
+ for v in sha1s:
+ verifier.update(v.encode("utf-8"))
+ spdx_pkg.packageVerificationCode.packageVerificationCodeValue = verifier.hexdigest()
+
+ return spdx_files
+
+
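+# Map the debug sources recorded for each packaged file back to source files
+# in dependency SPDX documents (matched by SHA256) and record GENERATED_FROM
+# relationships, falling back to NOASSERTION when no source can be found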
+def add_package_sources_from_debug(d, package_doc, spdx_package, package, package_files, sources):
+ from pathlib import Path
+ import hashlib
+ import oe.packagedata
+ import oe.spdx
+
+ debug_search_paths = [
+ Path(d.getVar('PKGD')),
+ Path(d.getVar('STAGING_DIR_TARGET')),
+ Path(d.getVar('STAGING_DIR_NATIVE')),
+ ]
+
+ pkg_data = oe.packagedata.read_subpkgdata_extended(package, d)
+
+ if pkg_data is None:
+ return
+
+ for file_path, file_data in pkg_data["files_info"].items():
+        if "debugsrc" not in file_data:
+ continue
+
+ for pkg_file in package_files:
+ if file_path.lstrip("/") == pkg_file.fileName.lstrip("/"):
+ break
+ else:
+ bb.fatal("No package file found for %s" % str(file_path))
+ continue
+
+ for debugsrc in file_data["debugsrc"]:
+ ref_id = "NOASSERTION"
+ for search in debug_search_paths:
+ debugsrc_path = search / debugsrc.lstrip("/")
+ if not debugsrc_path.exists():
+ continue
+
+ file_sha256 = bb.utils.sha256_file(debugsrc_path)
+
+ if file_sha256 in sources:
+ source_file = sources[file_sha256]
+
+ doc_ref = package_doc.find_external_document_ref(source_file.doc.documentNamespace)
+ if doc_ref is None:
+ doc_ref = oe.spdx.SPDXExternalDocumentRef()
+ doc_ref.externalDocumentId = "DocumentRef-dependency-" + source_file.doc.name
+ doc_ref.spdxDocument = source_file.doc.documentNamespace
+ doc_ref.checksum.algorithm = "SHA1"
+ doc_ref.checksum.checksumValue = source_file.doc_sha1
+ package_doc.externalDocumentRefs.append(doc_ref)
+
+ ref_id = "%s:%s" % (doc_ref.externalDocumentId, source_file.file.SPDXID)
+ else:
+ bb.debug(1, "Debug source %s with SHA256 %s not found in any dependency" % (str(debugsrc_path), file_sha256))
+ break
+ else:
+ bb.debug(1, "Debug source %s not found" % debugsrc)
+
+ package_doc.add_relationship(pkg_file, "GENERATED_FROM", ref_id, comment=debugsrc)
+
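+# Reference the SPDX document of every do_create_spdx build dependency and
+# record a BUILD_DEPENDENCY_OF relationship to this recipe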
+def collect_dep_recipes(d, doc, spdx_recipe):
+ from pathlib import Path
+ import oe.sbom
+ import oe.spdx
+
+ deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX"))
+
+ dep_recipes = []
+ taskdepdata = d.getVar("BB_TASKDEPDATA", False)
+ deps = sorted(set(
+ dep[0] for dep in taskdepdata.values() if
+ dep[1] == "do_create_spdx" and dep[0] != d.getVar("PN")
+ ))
+ for dep_pn in deps:
+ dep_recipe_path = deploy_dir_spdx / "recipes" / ("recipe-%s.spdx.json" % dep_pn)
+
+ spdx_dep_doc, spdx_dep_sha1 = oe.sbom.read_doc(dep_recipe_path)
+
+ for pkg in spdx_dep_doc.packages:
+ if pkg.name == dep_pn:
+ spdx_dep_recipe = pkg
+ break
+ else:
+ continue
+
+ dep_recipes.append(oe.sbom.DepRecipe(spdx_dep_doc, spdx_dep_sha1, spdx_dep_recipe))
+
+ dep_recipe_ref = oe.spdx.SPDXExternalDocumentRef()
+ dep_recipe_ref.externalDocumentId = "DocumentRef-dependency-" + spdx_dep_doc.name
+ dep_recipe_ref.spdxDocument = spdx_dep_doc.documentNamespace
+ dep_recipe_ref.checksum.algorithm = "SHA1"
+ dep_recipe_ref.checksum.checksumValue = spdx_dep_sha1
+
+ doc.externalDocumentRefs.append(dep_recipe_ref)
+
+ doc.add_relationship(
+ "%s:%s" % (dep_recipe_ref.externalDocumentId, spdx_dep_recipe.SPDXID),
+ "BUILD_DEPENDENCY_OF",
+ spdx_recipe
+ )
+
+ return dep_recipes
+
+collect_dep_recipes[vardepsexclude] += "BB_TASKDEPDATA"
+
+
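+# Build a map of SHA256 checksum to source file across all dependency SPDX
+# documents; used to resolve debug source references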
+def collect_dep_sources(d, dep_recipes):
+ import oe.sbom
+
+ sources = {}
+ for dep in dep_recipes:
+ recipe_files = set(dep.recipe.hasFiles)
+
+ for spdx_file in dep.doc.files:
+ if spdx_file.SPDXID not in recipe_files:
+ continue
+
+ if "SOURCE" in spdx_file.fileTypes:
+ for checksum in spdx_file.checksums:
+ if checksum.algorithm == "SHA256":
+ sources[checksum.checksumValue] = oe.sbom.DepSource(dep.doc, dep.doc_sha1, dep.recipe, spdx_file)
+ break
+
+ return sources
+
+
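+# Main per-recipe task: write a recipe-<PN> SPDX document describing the
+# recipe and (optionally) its patched sources, plus one document per binary
+# package, all linked together with external document references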
+python do_create_spdx() {
+ from datetime import datetime, timezone
+ import oe.sbom
+ import oe.spdx
+ import uuid
+ from pathlib import Path
+ from contextlib import contextmanager
+ import oe.cve_check
+
+ @contextmanager
+ def optional_tarfile(name, guard, mode="w"):
+ import tarfile
+ import bb.compress.zstd
+
+ num_threads = int(d.getVar("BB_NUMBER_THREADS"))
+
+ if guard:
+ name.parent.mkdir(parents=True, exist_ok=True)
+ with bb.compress.zstd.open(name, mode=mode + "b", num_threads=num_threads) as f:
+ with tarfile.open(fileobj=f, mode=mode + "|") as tf:
+ yield tf
+ else:
+ yield None
+
+
+ deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX"))
+ spdx_workdir = Path(d.getVar("SPDXWORK"))
+ include_packaged = d.getVar("SPDX_INCLUDE_PACKAGED") == "1"
+ include_sources = d.getVar("SPDX_INCLUDE_SOURCES") == "1"
+ archive_sources = d.getVar("SPDX_ARCHIVE_SOURCES") == "1"
+ archive_packaged = d.getVar("SPDX_ARCHIVE_PACKAGED") == "1"
+ is_native = bb.data.inherits_class("native", d)
+
+ creation_time = datetime.now(tz=timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
+
+ doc = oe.spdx.SPDXDocument()
+
+ doc.name = "recipe-" + d.getVar("PN")
+ doc.documentNamespace = get_doc_namespace(d, doc)
+ doc.creationInfo.created = creation_time
+ doc.creationInfo.comment = "This document was created by analyzing recipe files during the build."
+ doc.creationInfo.licenseListVersion = d.getVar("SPDX_LICENSE_DATA")["licenseListVersion"]
+ doc.creationInfo.creators.append("Tool: OpenEmbedded Core create-spdx.bbclass")
+ doc.creationInfo.creators.append("Organization: OpenEmbedded ()")
+ doc.creationInfo.creators.append("Person: N/A ()")
+
+ recipe = oe.spdx.SPDXPackage()
+ recipe.name = d.getVar("PN")
+ recipe.versionInfo = d.getVar("PV")
+ recipe.SPDXID = oe.sbom.get_recipe_spdxid(d)
+
+ for s in d.getVar('SRC_URI').split():
+ if not s.startswith("file://"):
+ recipe.downloadLocation = s
+ break
+ else:
+ recipe.downloadLocation = "NOASSERTION"
+
+ homepage = d.getVar("HOMEPAGE")
+ if homepage:
+ recipe.homepage = homepage
+
+ license = d.getVar("LICENSE")
+ if license:
+ recipe.licenseDeclared = convert_license_to_spdx(license, doc, d)
+
+ summary = d.getVar("SUMMARY")
+ if summary:
+ recipe.summary = summary
+
+ description = d.getVar("DESCRIPTION")
+ if description:
+ recipe.description = description
+
+ # Some CVEs may be patched during the build process without incrementing the version number,
+ # so querying for CVEs based on the CPE id can lead to false positives. To account for this,
+    # save the CVEs fixed by patches to the source information field in the SPDX document.
+ patched_cves = oe.cve_check.get_patched_cves(d)
+ patched_cves = list(patched_cves)
+ patched_cves = ' '.join(patched_cves)
+ if patched_cves:
+ recipe.sourceInfo = "CVEs fixed: " + patched_cves
+
+ cpe_ids = oe.cve_check.get_cpe_ids(d.getVar("CVE_PRODUCT"), d.getVar("CVE_VERSION"))
+ if cpe_ids:
+ for cpe_id in cpe_ids:
+ cpe = oe.spdx.SPDXExternalReference()
+ cpe.referenceCategory = "SECURITY"
+ cpe.referenceType = "http://spdx.org/rdf/references/cpe23Type"
+ cpe.referenceLocator = cpe_id
+ recipe.externalRefs.append(cpe)
+
+ doc.packages.append(recipe)
+ doc.add_relationship(doc, "DESCRIBES", recipe)
+
+ if process_sources(d) and include_sources:
+ recipe_archive = deploy_dir_spdx / "recipes" / (doc.name + ".tar.zst")
+ with optional_tarfile(recipe_archive, archive_sources) as archive:
+ spdx_get_src(d)
+
+ add_package_files(
+ d,
+ doc,
+ recipe,
+ spdx_workdir,
+ lambda file_counter: "SPDXRef-SourceFile-%s-%d" % (d.getVar("PN"), file_counter),
+ lambda filepath: ["SOURCE"],
+ ignore_dirs=[".git"],
+ ignore_top_level_dirs=["temp"],
+ archive=archive,
+ )
+
+ if archive is not None:
+ recipe.packageFileName = str(recipe_archive.name)
+
+ dep_recipes = collect_dep_recipes(d, doc, recipe)
+
+ doc_sha1 = oe.sbom.write_doc(d, doc, "recipes")
+ dep_recipes.append(oe.sbom.DepRecipe(doc, doc_sha1, recipe))
+
+ recipe_ref = oe.spdx.SPDXExternalDocumentRef()
+ recipe_ref.externalDocumentId = "DocumentRef-recipe-" + recipe.name
+ recipe_ref.spdxDocument = doc.documentNamespace
+ recipe_ref.checksum.algorithm = "SHA1"
+ recipe_ref.checksum.checksumValue = doc_sha1
+
+ sources = collect_dep_sources(d, dep_recipes)
+ found_licenses = {license.name:recipe_ref.externalDocumentId + ":" + license.licenseId for license in doc.hasExtractedLicensingInfos}
+
+ if not is_native:
+ bb.build.exec_func("read_subpackage_metadata", d)
+
+ pkgdest = Path(d.getVar("PKGDEST"))
+ for package in d.getVar("PACKAGES").split():
+ if not oe.packagedata.packaged(package, d):
+ continue
+
+ package_doc = oe.spdx.SPDXDocument()
+ pkg_name = d.getVar("PKG:%s" % package) or package
+ package_doc.name = pkg_name
+ package_doc.documentNamespace = get_doc_namespace(d, package_doc)
+ package_doc.creationInfo.created = creation_time
+ package_doc.creationInfo.comment = "This document was created by analyzing packages created during the build."
+ package_doc.creationInfo.licenseListVersion = d.getVar("SPDX_LICENSE_DATA")["licenseListVersion"]
+ package_doc.creationInfo.creators.append("Tool: OpenEmbedded Core create-spdx.bbclass")
+ package_doc.creationInfo.creators.append("Organization: OpenEmbedded ()")
+ package_doc.creationInfo.creators.append("Person: N/A ()")
+ package_doc.externalDocumentRefs.append(recipe_ref)
+
+ package_license = d.getVar("LICENSE:%s" % package) or d.getVar("LICENSE")
+
+ spdx_package = oe.spdx.SPDXPackage()
+
+ spdx_package.SPDXID = oe.sbom.get_package_spdxid(pkg_name)
+ spdx_package.name = pkg_name
+ spdx_package.versionInfo = d.getVar("PV")
+ spdx_package.licenseDeclared = convert_license_to_spdx(package_license, package_doc, d, found_licenses)
+
+ package_doc.packages.append(spdx_package)
+
+ package_doc.add_relationship(spdx_package, "GENERATED_FROM", "%s:%s" % (recipe_ref.externalDocumentId, recipe.SPDXID))
+ package_doc.add_relationship(package_doc, "DESCRIBES", spdx_package)
+
+ package_archive = deploy_dir_spdx / "packages" / (package_doc.name + ".tar.zst")
+ with optional_tarfile(package_archive, archive_packaged) as archive:
+ package_files = add_package_files(
+ d,
+ package_doc,
+ spdx_package,
+ pkgdest / package,
+ lambda file_counter: oe.sbom.get_packaged_file_spdxid(pkg_name, file_counter),
+ lambda filepath: ["BINARY"],
+ archive=archive,
+ )
+
+ if archive is not None:
+ spdx_package.packageFileName = str(package_archive.name)
+
+ add_package_sources_from_debug(d, package_doc, spdx_package, package, package_files, sources)
+
+ oe.sbom.write_doc(d, package_doc, "packages")
+}
+# NOTE: depending on do_unpack is a hack that is necessary to get its dependencies in order to archive the source
+addtask do_create_spdx after do_package do_packagedata do_unpack before do_build do_rm_work
+
+SSTATETASKS += "do_create_spdx"
+do_create_spdx[sstate-inputdirs] = "${SPDXDEPLOY}"
+do_create_spdx[sstate-outputdirs] = "${DEPLOY_DIR_SPDX}"
+
+python do_create_spdx_setscene () {
+ sstate_setscene(d)
+}
+addtask do_create_spdx_setscene
+
+do_create_spdx[dirs] = "${SPDXDEPLOY} ${SPDXWORK}"
+do_create_spdx[cleandirs] = "${SPDXDEPLOY} ${SPDXWORK}"
+do_create_spdx[depends] += "${PATCHDEPENDENCY}"
+do_create_spdx[deptask] = "do_create_spdx"
+
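+# Map every runtime provide (RPROVIDES plus the package name itself) to the
+# package that provides it, for this recipe and all of its task dependencies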
+def collect_package_providers(d):
+ from pathlib import Path
+ import oe.sbom
+ import oe.spdx
+ import json
+
+ deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX"))
+
+ providers = {}
+
+ taskdepdata = d.getVar("BB_TASKDEPDATA", False)
+ deps = sorted(set(
+ dep[0] for dep in taskdepdata.values() if dep[0] != d.getVar("PN")
+ ))
+ deps.append(d.getVar("PN"))
+
+ for dep_pn in deps:
+ recipe_data = oe.packagedata.read_pkgdata(dep_pn, d)
+
+ for pkg in recipe_data.get("PACKAGES", "").split():
+
+ pkg_data = oe.packagedata.read_subpkgdata_dict(pkg, d)
+ rprovides = set(n for n, _ in bb.utils.explode_dep_versions2(pkg_data.get("RPROVIDES", "")).items())
+ rprovides.add(pkg)
+
+ for r in rprovides:
+ providers[r] = pkg
+
+ return providers
+
+collect_package_providers[vardepsexclude] += "BB_TASKDEPDATA"
+
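+# For each package, generate a runtime-<package> SPDX document that amends the
+# package document with RUNTIME_DEPENDENCY_OF relationships derived from
+# RDEPENDS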
+python do_create_runtime_spdx() {
+ from datetime import datetime, timezone
+ import oe.sbom
+ import oe.spdx
+ import oe.packagedata
+ from pathlib import Path
+
+ deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX"))
+ spdx_deploy = Path(d.getVar("SPDXRUNTIMEDEPLOY"))
+ is_native = bb.data.inherits_class("native", d)
+
+ creation_time = datetime.now(tz=timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
+
+ providers = collect_package_providers(d)
+
+ if not is_native:
+ bb.build.exec_func("read_subpackage_metadata", d)
+
+ dep_package_cache = {}
+
+ pkgdest = Path(d.getVar("PKGDEST"))
+ for package in d.getVar("PACKAGES").split():
+ localdata = bb.data.createCopy(d)
+ pkg_name = d.getVar("PKG:%s" % package) or package
+ localdata.setVar("PKG", pkg_name)
+ localdata.setVar('OVERRIDES', d.getVar("OVERRIDES", False) + ":" + package)
+
+ if not oe.packagedata.packaged(package, localdata):
+ continue
+
+ pkg_spdx_path = deploy_dir_spdx / "packages" / (pkg_name + ".spdx.json")
+
+ package_doc, package_doc_sha1 = oe.sbom.read_doc(pkg_spdx_path)
+
+ for p in package_doc.packages:
+ if p.name == pkg_name:
+ spdx_package = p
+ break
+ else:
+ bb.fatal("Package '%s' not found in %s" % (pkg_name, pkg_spdx_path))
+
+ runtime_doc = oe.spdx.SPDXDocument()
+ runtime_doc.name = "runtime-" + pkg_name
+ runtime_doc.documentNamespace = get_doc_namespace(localdata, runtime_doc)
+ runtime_doc.creationInfo.created = creation_time
+ runtime_doc.creationInfo.comment = "This document was created by analyzing package runtime dependencies."
+ runtime_doc.creationInfo.licenseListVersion = d.getVar("SPDX_LICENSE_DATA")["licenseListVersion"]
+ runtime_doc.creationInfo.creators.append("Tool: OpenEmbedded Core create-spdx.bbclass")
+ runtime_doc.creationInfo.creators.append("Organization: OpenEmbedded ()")
+ runtime_doc.creationInfo.creators.append("Person: N/A ()")
+
+ package_ref = oe.spdx.SPDXExternalDocumentRef()
+ package_ref.externalDocumentId = "DocumentRef-package-" + package
+ package_ref.spdxDocument = package_doc.documentNamespace
+ package_ref.checksum.algorithm = "SHA1"
+ package_ref.checksum.checksumValue = package_doc_sha1
+
+ runtime_doc.externalDocumentRefs.append(package_ref)
+
+ runtime_doc.add_relationship(
+ runtime_doc.SPDXID,
+ "AMENDS",
+ "%s:%s" % (package_ref.externalDocumentId, package_doc.SPDXID)
+ )
+
+ deps = bb.utils.explode_dep_versions2(localdata.getVar("RDEPENDS") or "")
+ seen_deps = set()
+ for dep, _ in deps.items():
+ if dep in seen_deps:
+ continue
+
+ dep = providers[dep]
+
+ if not oe.packagedata.packaged(dep, localdata):
+ continue
+
+ dep_pkg_data = oe.packagedata.read_subpkgdata_dict(dep, d)
+ dep_pkg = dep_pkg_data["PKG"]
+
+ if dep in dep_package_cache:
+ (dep_spdx_package, dep_package_ref) = dep_package_cache[dep]
+ else:
+ dep_path = deploy_dir_spdx / "packages" / ("%s.spdx.json" % dep_pkg)
+
+ spdx_dep_doc, spdx_dep_sha1 = oe.sbom.read_doc(dep_path)
+
+ for pkg in spdx_dep_doc.packages:
+ if pkg.name == dep_pkg:
+ dep_spdx_package = pkg
+ break
+ else:
+ bb.fatal("Package '%s' not found in %s" % (dep_pkg, dep_path))
+
+ dep_package_ref = oe.spdx.SPDXExternalDocumentRef()
+ dep_package_ref.externalDocumentId = "DocumentRef-runtime-dependency-" + spdx_dep_doc.name
+ dep_package_ref.spdxDocument = spdx_dep_doc.documentNamespace
+ dep_package_ref.checksum.algorithm = "SHA1"
+ dep_package_ref.checksum.checksumValue = spdx_dep_sha1
+
+ dep_package_cache[dep] = (dep_spdx_package, dep_package_ref)
+
+ runtime_doc.externalDocumentRefs.append(dep_package_ref)
+
+ runtime_doc.add_relationship(
+ "%s:%s" % (dep_package_ref.externalDocumentId, dep_spdx_package.SPDXID),
+ "RUNTIME_DEPENDENCY_OF",
+ "%s:%s" % (package_ref.externalDocumentId, spdx_package.SPDXID)
+ )
+ seen_deps.add(dep)
+
+ oe.sbom.write_doc(d, runtime_doc, "runtime", spdx_deploy)
+}
+
+addtask do_create_runtime_spdx after do_create_spdx before do_build do_rm_work
+SSTATETASKS += "do_create_runtime_spdx"
+do_create_runtime_spdx[sstate-inputdirs] = "${SPDXRUNTIMEDEPLOY}"
+do_create_runtime_spdx[sstate-outputdirs] = "${DEPLOY_DIR_SPDX}"
+
+python do_create_runtime_spdx_setscene () {
+ sstate_setscene(d)
+}
+addtask do_create_runtime_spdx_setscene
+
+do_create_runtime_spdx[dirs] = "${SPDXRUNTIMEDEPLOY}"
+do_create_runtime_spdx[cleandirs] = "${SPDXRUNTIMEDEPLOY}"
+do_create_runtime_spdx[rdeptask] = "do_create_spdx"
+
+def spdx_get_src(d):
+ """
+    Save the patched source of the recipe in SPDXWORK.
+ """
+ import shutil
+ spdx_workdir = d.getVar('SPDXWORK')
+ spdx_sysroot_native = d.getVar('STAGING_DIR_NATIVE')
+ pn = d.getVar('PN')
+
+ workdir = d.getVar("WORKDIR")
+
+ try:
+        # The kernel class functions require the kernel source to be in work-shared, so we don't change WORKDIR
+        if not is_work_shared(d):
+            # Change WORKDIR so that do_unpack and do_patch run in another dir.
+            d.setVar('WORKDIR', spdx_workdir)
+            # Restore the original path to the recipe's native sysroot (it's relative to WORKDIR).
+            d.setVar('STAGING_DIR_NATIVE', spdx_sysroot_native)
+
+            # Changing 'WORKDIR' also changes 'B'; create the 'B' directory since
+            # some of the following tasks may require it to exist (for example,
+            # some recipes' do_patch expects 'B' to be present).
+ bb.utils.mkdirhier(d.getVar('B'))
+
+ bb.build.exec_func('do_unpack', d)
+ # Copy source of kernel to spdx_workdir
+ if is_work_shared(d):
+ d.setVar('WORKDIR', spdx_workdir)
+ d.setVar('STAGING_DIR_NATIVE', spdx_sysroot_native)
+ src_dir = spdx_workdir + "/" + d.getVar('PN')+ "-" + d.getVar('PV') + "-" + d.getVar('PR')
+ bb.utils.mkdirhier(src_dir)
+            if bb.data.inherits_class('kernel', d):
+ share_src = d.getVar('STAGING_KERNEL_DIR')
+ cmd_copy_share = "cp -rf " + share_src + "/* " + src_dir + "/"
+ cmd_copy_kernel_result = os.popen(cmd_copy_share).read()
+ bb.note("cmd_copy_kernel_result = " + cmd_copy_kernel_result)
+
+ git_path = src_dir + "/.git"
+ if os.path.exists(git_path):
+                shutil.rmtree(git_path)
+
+ # Make sure gcc and kernel sources are patched only once
+ if not (d.getVar('SRC_URI') == "" or is_work_shared(d)):
+ bb.build.exec_func('do_patch', d)
+
+    # Some userland recipes have no source.
+    if not os.path.exists(spdx_workdir):
+ bb.utils.mkdirhier(spdx_workdir)
+ finally:
+ d.setVar("WORKDIR", workdir)
+
+do_rootfs[recrdeptask] += "do_create_spdx do_create_runtime_spdx"
+
+ROOTFS_POSTUNINSTALL_COMMAND =+ "image_combine_spdx ; "
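+# Collect the SPDX documents of every package installed in the image (and
+# their runtime documents), link them from a top-level image SPDX document,
+# and bundle everything into a .spdx.tar.zst archive with an index.json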
+python image_combine_spdx() {
+ import os
+ import oe.spdx
+ import oe.sbom
+ import io
+ import json
+ from oe.rootfs import image_list_installed_packages
+ from datetime import timezone, datetime
+ from pathlib import Path
+ import tarfile
+ import bb.compress.zstd
+
+ creation_time = datetime.now(tz=timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
+ image_name = d.getVar("IMAGE_NAME")
+ image_link_name = d.getVar("IMAGE_LINK_NAME")
+
+ deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX"))
+ imgdeploydir = Path(d.getVar("IMGDEPLOYDIR"))
+ source_date_epoch = d.getVar("SOURCE_DATE_EPOCH")
+
+ doc = oe.spdx.SPDXDocument()
+ doc.name = image_name
+ doc.documentNamespace = get_doc_namespace(d, doc)
+ doc.creationInfo.created = creation_time
+ doc.creationInfo.comment = "This document was created by analyzing the source of the Yocto recipe during the build."
+ doc.creationInfo.licenseListVersion = d.getVar("SPDX_LICENSE_DATA")["licenseListVersion"]
+ doc.creationInfo.creators.append("Tool: OpenEmbedded Core create-spdx.bbclass")
+ doc.creationInfo.creators.append("Organization: OpenEmbedded ()")
+ doc.creationInfo.creators.append("Person: N/A ()")
+
+ image = oe.spdx.SPDXPackage()
+ image.name = d.getVar("PN")
+ image.versionInfo = d.getVar("PV")
+ image.SPDXID = oe.sbom.get_image_spdxid(image_name)
+
+ doc.packages.append(image)
+
+ spdx_package = oe.spdx.SPDXPackage()
+
+ packages = image_list_installed_packages(d)
+
+ for name in sorted(packages.keys()):
+ pkg_spdx_path = deploy_dir_spdx / "packages" / (name + ".spdx.json")
+ pkg_doc, pkg_doc_sha1 = oe.sbom.read_doc(pkg_spdx_path)
+
+ for p in pkg_doc.packages:
+ if p.name == name:
+ pkg_ref = oe.spdx.SPDXExternalDocumentRef()
+ pkg_ref.externalDocumentId = "DocumentRef-%s" % pkg_doc.name
+ pkg_ref.spdxDocument = pkg_doc.documentNamespace
+ pkg_ref.checksum.algorithm = "SHA1"
+ pkg_ref.checksum.checksumValue = pkg_doc_sha1
+
+ doc.externalDocumentRefs.append(pkg_ref)
+ doc.add_relationship(image, "CONTAINS", "%s:%s" % (pkg_ref.externalDocumentId, p.SPDXID))
+ break
+ else:
+ bb.fatal("Unable to find package with name '%s' in SPDX file %s" % (name, pkg_spdx_path))
+
+ runtime_spdx_path = deploy_dir_spdx / "runtime" / ("runtime-" + name + ".spdx.json")
+ runtime_doc, runtime_doc_sha1 = oe.sbom.read_doc(runtime_spdx_path)
+
+ runtime_ref = oe.spdx.SPDXExternalDocumentRef()
+ runtime_ref.externalDocumentId = "DocumentRef-%s" % runtime_doc.name
+ runtime_ref.spdxDocument = runtime_doc.documentNamespace
+ runtime_ref.checksum.algorithm = "SHA1"
+ runtime_ref.checksum.checksumValue = runtime_doc_sha1
+
+ # "OTHER" isn't ideal here, but I can't find a relationship that makes sense
+ doc.externalDocumentRefs.append(runtime_ref)
+ doc.add_relationship(
+ image,
+ "OTHER",
+ "%s:%s" % (runtime_ref.externalDocumentId, runtime_doc.SPDXID),
+ comment="Runtime dependencies for %s" % name
+ )
+
+ image_spdx_path = imgdeploydir / (image_name + ".spdx.json")
+
+ with image_spdx_path.open("wb") as f:
+ doc.to_json(f, sort_keys=True)
+
+ image_spdx_link = imgdeploydir / (image_link_name + ".spdx.json")
+ image_spdx_link.symlink_to(os.path.relpath(image_spdx_path, image_spdx_link.parent))
+
+ num_threads = int(d.getVar("BB_NUMBER_THREADS"))
+
+ visited_docs = set()
+
+ index = {"documents": []}
+
+ spdx_tar_path = imgdeploydir / (image_name + ".spdx.tar.zst")
+ with bb.compress.zstd.open(spdx_tar_path, "w", num_threads=num_threads) as f:
+ with tarfile.open(fileobj=f, mode="w|") as tar:
+ def collect_spdx_document(path):
+ nonlocal tar
+ nonlocal deploy_dir_spdx
+ nonlocal source_date_epoch
+ nonlocal index
+
+ if path in visited_docs:
+ return
+
+ visited_docs.add(path)
+
+ with path.open("rb") as f:
+ doc, sha1 = oe.sbom.read_doc(f)
+ f.seek(0)
+
+ if doc.documentNamespace in visited_docs:
+ return
+
+ bb.note("Adding SPDX document %s" % path)
+ visited_docs.add(doc.documentNamespace)
+ info = tar.gettarinfo(fileobj=f)
+
+ info.name = doc.name + ".spdx.json"
+ info.uid = 0
+ info.gid = 0
+ info.uname = "root"
+ info.gname = "root"
+
+ if source_date_epoch is not None and info.mtime > int(source_date_epoch):
+ info.mtime = int(source_date_epoch)
+
+ tar.addfile(info, f)
+
+ index["documents"].append({
+ "filename": info.name,
+ "documentNamespace": doc.documentNamespace,
+ "sha1": sha1,
+ })
+
+ for ref in doc.externalDocumentRefs:
+ ref_path = deploy_dir_spdx / "by-namespace" / ref.spdxDocument.replace("/", "_")
+ collect_spdx_document(ref_path)
+
+ collect_spdx_document(image_spdx_path)
+
+ index["documents"].sort(key=lambda x: x["filename"])
+
+ index_str = io.BytesIO(json.dumps(index, sort_keys=True).encode("utf-8"))
+
+ info = tarfile.TarInfo()
+ info.name = "index.json"
+ info.size = len(index_str.getvalue())
+ info.uid = 0
+ info.gid = 0
+ info.uname = "root"
+ info.gname = "root"
+
+ tar.addfile(info, fileobj=index_str)
+
+ def make_image_link(target_path, suffix):
+ link = imgdeploydir / (image_link_name + suffix)
+ link.symlink_to(os.path.relpath(target_path, link.parent))
+
+ make_image_link(spdx_tar_path, ".spdx.tar.zst")
+
+ spdx_index_path = imgdeploydir / (image_name + ".spdx.index.json")
+ with spdx_index_path.open("w") as f:
+ json.dump(index, f, sort_keys=True)
+
+ make_image_link(spdx_index_path, ".spdx.index.json")
+}
+