author    | Patrick Williams <patrick@stwcx.xyz> | 2022-11-17 16:29:11 +0300
committer | Patrick Williams <patrick@stwcx.xyz> | 2022-11-19 05:11:25 +0300
commit    | 7784c4292cd9e369da612736deb0691153d1b786 (patch)
tree      | 31368891eb9ce67a004712b5269b7b9b0f0a3c89 /poky/meta/classes
parent    | 4a4f35a2d76c72c60405581c889e2835ca49b641 (diff)
download  | openbmc-7784c4292cd9e369da612736deb0691153d1b786.tar.xz
subtree updates
- Remove systemd patches for object-manager due to upstream fix.
meta-arm: 3b7347cd67..d5f132b199:
Abdellatif El Khlifi (2):
kas: corstone1000: set branches to langdale
arm-bsp/documentation: corstone1000: 2022.11.10 RC: update the user guide
Anton Antonov (1):
arm-bsp/fvp-base: Enable virtio-rng support and unset preferred 5.15 kernel
Emekcan (2):
arm-bsp/trusted-services: add checks for null attributes in smm gateway
arm-bsp/trusted-services: Fix GetNextVariable max_name_len in smm gateway
Jon Mason (3):
arm/sbsa-acs: update to the latest version
arm/hafnium: cleanup the patches
arm/gn: update to the latest SHA
Luca Fancellu (1):
arm,arm-bsp/recipes-kernel: don't use PN in arm-ffa-transport.inc
Peter Hoyes (5):
arm/fvp: Join cli arguments in verbose logging
arm/lib: Factor out asyncio in FVPRunner
arm/lib: Decouple console parsing from the FVPRunner
arm/oeqa: Log the FVP output in OEFVPSSHTarget
runfvp: Fix verbose output when using --console
Ross Burton (1):
arm/linux-arm64-ack: fix buildpaths in the perf Python module
Rui Miguel Silva (3):
arm/trusted-services: check before applying patches
arm-bsp/trusted-services: psa test setup corstone1000
arm-bsp/trusted-firmware-m: adjust ps assets for corstone1000
Vishnu Banavath (2):
arm-bsp/documentation: corstone1000: 2022.11.10 RC: update the release notes
arm-bsp/documentation: corstone1000: 2022.11.10 RC: update the change log
meta-raspberrypi: a305f4804b..93dadf336c:
Andrei Gherzan (2):
ci: Bump actions/checkout to v3
ci: Fix dco-check job with newer git versions
Martin Jansa (1):
raspberrypi4-64: drop DEFAULTTUNE assignment
poky: 482c493cf6..44bb88cc86:
Alex Kiernan (1):
rust: update 1.64.0 -> 1.65.0
Alexander Kanavin (74):
man-pages: upgrade 5.13 -> 6.01
piglit: upgrade to latest revision
lsof: upgrade 4.96.3 -> 4.96.4
ffmpeg: upgrade 5.1.1 -> 5.1.2
ccache: upgrade 4.6.3 -> 4.7.2
python3-pip: upgrade 22.2.2 -> 22.3
ltp: upgrade 20220527 -> 20220930
alsa-utils: upgrade 1.2.7 -> 1.2.8
alsa-ucm-conf: upgrade 1.2.7.2 -> 1.2.8
libbsd: upgrade 0.11.6 -> 0.11.7
libunistring: upgrade 1.0 -> 1.1
puzzles: upgrade to latest revision
libsoup: upgrade 3.2.0 -> 3.2.1
linux-firmware: upgrade 20220913 -> 20221012
python3-git: upgrade 3.1.28 -> 3.1.29
xwayland: upgrade 22.1.3 -> 22.1.4
strace: upgrade 5.19 -> 6.0
python3-dtschema: upgrade 2022.8.3 -> 2022.9
fontconfig: upgrade 2.14.0 -> 2.14.1
python3-setuptools: upgrade 65.0.2 -> 65.5.0
taglib: upgrade 1.12 -> 1.13
nghttp2: upgrade 1.49.0 -> 1.50.0
python3-wheel: upgrade 0.37.1 -> 0.38.0
libffi: upgrade 3.4.2 -> 3.4.4
libical: upgrade 3.0.15 -> 3.0.16
mtd-utils: upgrade 2.1.4 -> 2.1.5
repo: upgrade 2.29.3 -> 2.29.5
libidn2: upgrade 2.3.3 -> 2.3.4
makedepend: upgrade 1.0.6 -> 1.0.7
diffoscope: upgrade 221 -> 224
mmc-utils: upgrade to latest revision
libsoup-2.4: upgrade 2.74.2 -> 2.74.3
gdk-pixbuf: upgrade 2.42.9 -> 2.42.10
harfbuzz: upgrade 5.3.0 -> 5.3.1
netbase: upgrade 6.3 -> 6.4
mpg123: upgrade 1.30.2 -> 1.31.1
sudo: upgrade 1.9.11p3 -> 1.9.12
alsa-lib: upgrade 1.2.7.2 -> 1.2.8
pango: upgrade 1.50.10 -> 1.50.11
pixman: upgrade 0.40.0 -> 0.42.2
vulkan: upgrade 1.3.224.1 -> 1.3.231.1
gstreamer1.0: upgrade 1.20.3 -> 1.20.4
shaderc: upgrade 2022.2 -> 2022.3
selftest: add a copy of previous mtd-utils version to meta-selftest
python3: correctly adjust include paths in sysconfigdata
vala: install vapigen-wrapper into /usr/bin/crosscripts and stage only that
sanity.bbclass: do not check for presence of distutils
pango: replace a recipe fix with an upstream submitted patch
libpciaccess: update 0.16 -> 0.17
libxinerama: update 1.1.4 -> 1.1.5
libxkbfile: update 1.1.0 -> 1.1.1
libxmu: update 1.1.3 -> 1.1.4
libxrender: update 0.9.10 -> 0.9.11
libxshmfence: update 1.3 -> 1.3.1
libxtst: update 1.2.3 -> 1.2.4
libxxf86vm: update 1.1.4 -> 1.1.5
xcb-util: update to latest revisions
xf86-input-vmmouse: update 13.1.0 -> 13.2.0
gnomebase.bbclass: return the whole version for tarball directory if it is a number
adwaita-icon-theme: update 42.0 -> 43
libepoxy: convert to git
libepoxy: update 1.5.9 -> 1.5.10
rgb: update 1.0.6 -> 1.1.0
meson: update 0.63.3 -> 0.64.0
systemd: update 251.4 -> 251.8
libxext: update 1.3.4 -> 1.3.5
gettext: update 0.21 -> 0.21.1
glib-2.0: update 2.72.3 -> 2.74.1
glib-networking: update 2.72.2 -> 2.74.0
readline: update 8.1.2 -> 8.2
llvm: update 15.0.1 -> 15.0.4
make: update 4.3 -> 4.4
bash: update 5.1.16 -> 5.2.9
mesa: do not rely on native llvm-config in target sysroot
Atanas Bunchev (1):
qemu.rst: audio: reference to Command-Line options
Benjamin Szőke (1):
image_types: Add 7-Zip support in conversion types and commands
Changhyeok Bae (1):
repo: upgrade 2.29.5 -> 2.29.9
Chase Qi (1):
libc-test: add libc testsuite for musl
Christoph Lauer (1):
populate_sdk_base: add zip options
David Bagonyi (1):
gpgme: Allow setuptools3-base to be excluded from the inherit list
Diego Sueiro (1):
kernel.bbclass: Include randstruct seed assets in STAGING_KERNEL_BUILDDIR
Etienne Cordonnier (1):
mirrors.bbclass: use shallow tarball for nativesdk-binutils
Jordan Crouse (2):
spirv-tools: Correctly set the prefix in exported cmake packages
vulkan-loader: Allow headless targets to build the loader
Jose Quaresma (3):
sstatesig: skip the rm_work task signature
rm_work: exclude the SSTATETASKS from the rm_work tasks signature
sstate: Allow optimisation of do_deploy_archives task dependencies
Joshua Watt (2):
classes: create-spdx: Move to version specific class
scripts: convert-overrides: Allow command-line customizations
Kai Kang (1):
libuv: fixup SRC_URI
Konrad Weihmann (1):
create-spdx: default share_src for shared sources
Lee Chee Yang (1):
migration guides: add release notes for 4.0.5
Leon Anavi (2):
get_module_deps3.py: Check attribute '__file__'
python3-manifest.json: Fix re in core
Mark Asselstine (2):
bitbake: data: drop unused __expand_var_regexp__ and __expand_python_regexp__
bitbake: data_smart: allow python snippets to include a dictionary
Markus Volk (4):
webkitgtk: use libsoup-3.0 by default
epiphany: use libsoup-3.0 by default
gstreamer1.0-plugins-good: use libsoup-3.0 by default
libinput: upgrade 1.19.4 -> 1.21.0
Martin Jansa (1):
cargo.bbclass: avoid calling which ${RUSTC} with undefined ${RUSTC}
Michael Opdenacker (10):
ref-manual: terms.rst: add SBOM and SPDX terms
ref-manual: variables.rst: document spdx-create class variables
dev-manual: common-tasks.rst: add section about SPDX / SBOM generation
ref-manual: classes.rst: expand documentation of create-spdx class
ref-manual: terms.rst: add reference to new SBOM/SPDX section in dev manual
manuals: document "mime-xdg" class and MIME_XDG_PACKAGES
manuals: add shortcut for Wikipedia links
ref-manual/variables.rst: expand BB_NUMBER_THREADS description
ref-manual/variables.rst: expand PARALLEL_MAKE description
release-notes: use oe_git and yocto_git macros
Nathan Rossi (4):
oeqa/selftest/lic_checksum: Cleanup changes to emptytest include
oeqa/selftest/minidebuginfo: Create selftest for minidebuginfo
glibc-locale: Do not INHIBIT_DEFAULT_DEPS
package: Fix handling of minidebuginfo with newer binutils
Niko Mauno (1):
systemd: Consider PACKAGECONFIG in RRECOMMENDS
Paulo Neves (1):
manuals: remove xterm requirements
Pavel Zhukov (1):
bitbake: gitsm: Fix regression in gitsm submodule path parsing
Peter Kjellerstedt (1):
pango: Make it build with ptest disabled
Peter Marko (2):
systemd: add group render to udev package
meta-selftest/staticids: add render group for systemd
Quentin Schulz (3):
docs: ref-manual: classes: fix section name for github-releases
docs: ref-manual: classes: add missing closing parenthesis
docs: poky.yaml.in: remove pylint3 from Ubuntu/Debian host dependencies
Richard Purdie (7):
bitbake.conf: Drop export of SOURCE_DATE_EPOCH_FALLBACK
gcc-shared-source: Fix source date epoch handling
gcc-source: Fix gengtypes race
gcc-source: Drop gengtype manipulation
gcc-source: Ensure deploy_source_date_epoch sstate hash doesn't change
sanity: Drop data finalize call
bitbake: data/data_smart/build: Clean up datastore finalize/update_data references
Robert Yang (1):
bitbake: gitsm.py: process_submodules(): Set nobranch=1 for url
Ross Burton (19):
insane: add codeload.github.com to src-uri-bad check
populate_sdk_ext: use ConfigParser instead of SafeConfigParser
stress-ng: improve makefile use
linux-firmware: don't put the firmware into the sysroot
oeqa/qemurunner: update exception class for QMP API changes
oeqa/core/decorator: add decorators to skip based on HOST_ARCH
oeqa/selftest/buildoptions: skip test_read_only_image on qemuarm64
oeqa/selftest/efibootpartition: improve test
oeqa/selftest/imagefeatures: remove hardcoded MACHINE in test_image_gen_debugfs
oeqa/selftest/imagefeatures: don't use wic images in test_hypervisor_fmts
oeqa/selftest/imagefeatures: set a .wks in test_fs_types
oeqa/selftest/overlayfs: overlayfs: skip x86-specific tests
oeqa/selftest/package: generalise test_gdb_hardlink_debug()
oeqa/selftest/package: improve test_preserve_ownership
oeqa/selftest/runqemu: don't hardcode qemux86-64
oeqa/selftest/runtime_test: only run the virgl tests on qemux86-64
oeqa/selftest/wic: skip more tests on aarch64
oeqa/selftest/wic: use skipIfNotArch instead of custom decorator
classes/testexport: move to classes-recipe
Sergei Zhmylev (1):
wic: make ext2/3/4 images reproducible
Tim Orling (4):
python3-typing-extensions: upgrade 4.3.0 -> 4.4.0
bitbake: toaster: fixtures/README: django 1.8 -> 3.2
bitbake: toaster: fixtures/gen_fixtures.py: update branches
bitbake: toaster: Add refreshed oe-core and poky fixtures
Ulrich Ölmann (1):
dev-manual: common-tasks.rst: fix typos
Wang Mingyu (33):
bind: upgrade 9.18.7 -> 9.18.8
libedit: upgrade 20210910-3.1 -> 20221030-3.1
mtools: upgrade 4.0.41 -> 4.0.42
diffstat: upgrade 1.64 -> 1.65
inetutils: upgrade 2.3 -> 2.4
orc: upgrade 0.4.32 -> 0.4.33
socat: upgrade 1.7.4.3 -> 1.7.4.4
libxcrypt: upgrade 4.4.28 -> 4.4.30
python3-babel: upgrade 2.10.3 -> 2.11.0
python3-hatch-fancy-pypi-readme: upgrade 22.7.0 -> 22.8.0
python3-hatchling upgrade: 1.11.0 -> 1.11.1
gi-docgen: upgrade 2022.1 -> 2022.2
libdrm: upgrade 2.4.113 -> 2.4.114
mmc-utils: upgrade to latest revision
mobile-broadband-provider-info: upgrade 20220725 -> 20221107
libsdl2: upgrade 2.24.1 -> 2.24.2
mesa: upgrade 22.2.2 -> 22.2.3
python3-dtschema: upgrade 2022.9 -> 2022.11
python3-flit-core: upgrade 3.7.1 -> 3.8.0
python3-pip: update 22.3 -> 22.3.1
python3-psutil: upgrade 5.9.3 -> 5.9.4
python3-setuptools: upgrade 65.5.0 -> 65.5.1
python3-sphinx-rtd-theme: upgrade 1.1.0 -> 1.1.1
python3-subunit: upgrade 1.4.0 -> 1.4.1
python3-wheel: upgrade 0.38.0 -> 0.38.4
sed: update 4.8 -> 4.9
sudo: upgrade 1.9.12 -> 1.9.12p1
sysstat: upgrade 12.6.0 -> 12.6.1
babeltrace: upgrade 1.5.8 -> 1.5.11
iso-codes: upgrade 4.11.0 -> 4.12.0
libsoup: upgrade 3.2.1 -> 3.2.2
wayland-protocols: upgrade 1.27 -> 1.28
xwayland: upgrade 22.1.4 -> 22.1.5
zhengruoqin (5):
python3-jsonschema: upgrade 4.16.0 -> 4.17.0
python3-pyrsistent: upgrade 0.18.1 -> 0.19.2
python3-numpy: upgrade 1.23.3 -> 1.23.4
python3-sphinx-rtd-theme: upgrade 1.0.0 -> 1.1.0
python3-pbr: upgrade 5.10.0 -> 5.11.0
meta-openembedded: 6ebff843cc..d04444509a:
Armin Kuster (1):
gst-editing-services: fix typo in LICENSE field.
Chen Pei (1):
python3-brotli: Add new recipe for 1.0.9
Kory Maincent (1):
openocd: fix build error
Leon Anavi (6):
python3-automat: Upgrade 20.2.0 -> 22.10.0
python3-asttokens: Upgrade 2.0.8 -> 2.1.0
python3-zeroconf: Upgrade 0.39.2 -> 0.39.4
python3-imageio: Upgrade 2.22.2 -> 2.22.3
python3-httplib: Upgrade 0.20.4 -> 0.21.0
python3-twisted: Upgrade 22.8.0 -> 22.10.0
Markus Volk (6):
pugixml: upgrade 1.12 -> 1.13
geary: update 40.0 -> 43.0
rest: upgrade 0.8.1 -> 0.9.0
gnome-online-accounts: update 3.44.0 -> 3.46.0
yelp: use libsoup-3.0 by default
surf: use libsoup-3.0 by default
Martin Jansa (1):
monkey: use git fetcher
Randy MacLeod (1):
nftables: use automake ptest output format
Sakib Sajal (1):
minio: add recipe for minio client
Tim Orling (5):
libcompress-raw-bzip2-perl: upgrade 2.096 -> 2.201
libcompress-raw-lzma-perl: upgrade 2.096 -> 2.201
libcompress-raw-zlib-perl: upgrade 2.096 -> 2.202
libio-compress-lzma-perl: upgrade 2.096 -> 2.201
libio-compress-perl: upgrade 2.096 -> 2.201
Wang Mingyu (43):
python3-lazy-object-proxy: upgrade 1.7.1 -> 1.8.0
python3-luma-oled: upgrade 3.8.1 -> 3.9.0
python3-nmap: upgrade 1.5.4 -> 1.6.0
python3-pint: upgrade 0.20 -> 0.20.1
python3-protobuf: upgrade 4.21.8 -> 4.21.9
python3-pytest-benchmark: upgrade 3.4.1 -> 4.0.0
python3-pytest-html: upgrade 3.1.1 -> 3.2.0
python3-pytest-xdist: upgrade 2.5.0 -> 3.0.2
python3-requests-toolbelt: upgrade 0.10.0 -> 0.10.1
python3-websockets: upgrade 10.3 -> 10.4
fetchmail: Fix buildpaths warning.
libxpresent: upgrade 1.0.0 -> 1.0.1
xkbprint: upgrade 1.0.5 -> 1.0.6
xmlsec1: upgrade 1.2.34 -> 1.2.36
openwsman: Change download branch from master to main.
hwdata: upgrade 0.363 -> 0.364
lcms: upgrade 2.13.1 -> 2.14
libdbd-sqlite-perl: upgrade 1.70 -> 1.72
mosh: upgrade 1.3.2 -> 1.4.0
xfstests: upgrade 2022.10.09 -> 2022.10.30
ulogd2: upgrade 2.0.7 -> 2.0.8
cli11: upgrade 2.3.0 -> 2.3.1
ctags: upgrade 5.9.20221023.0 -> 5.9.20221106.0
valijson: upgrade 0.7 -> 1.0
openvpn: upgrade 2.5.7 -> 2.5.8
poco: upgrade 1.12.3 -> 1.12.4
poppler: upgrade 22.10.0 -> 22.11.0
satyr: upgrade 0.39 -> 0.40
ser2net: upgrade 4.3.8 -> 4.3.9
stunnel: upgrade 5.66 -> 5.67
wolfssl: upgrade 5.5.2 -> 5.5.3
tio: upgrade 2.2 -> 2.3
uhubctl: upgrade 2.4.0 -> 2.5.0
zabbix: upgrade 6.2.3 -> 6.2.4
python3-spidev: upgrade 3.5 -> 3.6
python3-gevent: upgrade 22.10.1 -> 22.10.2
python3-google-auth: upgrade 2.13.0 -> 2.14.0
python3-greenlet: upgrade 1.1.3.post0 -> 2.0.0
python3-robotframework: upgrade 6.0 -> 6.0.1
python3-regex: upgrade 2022.9.13 -> 2022.10.31
python3-pillow: upgrade 9.2.0 -> 9.3.0
python3-paramiko: upgrade 2.11.0 -> 2.12.0
python3-jsonref: upgrade 0.3.0 -> 1.0.1
leimaohui (1):
samba: Fix install conflict with multilib enabled.
zhengrq.fnst@fujitsu.com (5):
python3-sqlalchemy: upgrade 1.4.42 -> 1.4.43
python3-websocket-client: upgrade 1.4.1 -> 1.4.2
python3-termcolor: upgrade 2.0.1 -> 2.1.0
python3-zopeinterface: upgrade 5.5.0 -> 5.5.1
python3-tqdm: upgrade 4.64.0 -> 4.64.1
Signed-off-by: Patrick Williams <patrick@stwcx.xyz>
Change-Id: I0a8f95b57a7b9433fe59a9055a4dae58694c1759
Diffstat (limited to 'poky/meta/classes')
-rw-r--r-- | poky/meta/classes/create-spdx-2.2.bbclass | 1026
-rw-r--r-- | poky/meta/classes/create-spdx.bbclass     | 1023
-rw-r--r-- | poky/meta/classes/rm_work.bbclass         |    2
-rw-r--r-- | poky/meta/classes/testexport.bbclass      |  180
4 files changed, 1031 insertions, 1200 deletions
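For context, the create-spdx machinery added below is opt-in. A minimal local.conf sketch for enabling it (illustrative assumption, not part of this commit; the variable names and their defaults come from the class itself):

  INHERIT += "create-spdx"
  SPDX_PRETTY = "1"            # indent the generated .spdx.json files (class default "0")
  SPDX_INCLUDE_SOURCES = "1"   # describe source files in the per-recipe documents (class default "0")
  SPDX_ARCHIVE_PACKAGED = "1"  # also archive packaged files into .tar.zst files (class default "0")

With the class inherited, per-recipe and per-package SPDX documents are written under ${DEPLOY_DIR}/spdx/${MACHINE} (DEPLOY_DIR_SPDX), and image builds combine them into an <image>.spdx.tar.zst archive plus an index file.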
diff --git a/poky/meta/classes/create-spdx-2.2.bbclass b/poky/meta/classes/create-spdx-2.2.bbclass
new file mode 100644
index 0000000000..f0513af083
--- /dev/null
+++ b/poky/meta/classes/create-spdx-2.2.bbclass
@@ -0,0 +1,1026 @@
+#
+# Copyright OpenEmbedded Contributors
+#
+# SPDX-License-Identifier: GPL-2.0-only
+#
+
+DEPLOY_DIR_SPDX ??= "${DEPLOY_DIR}/spdx/${MACHINE}"
+
+# The product name that the CVE database uses.  Defaults to BPN, but may need to
+# be overriden per recipe (for example tiff.bb sets CVE_PRODUCT=libtiff).
+CVE_PRODUCT ??= "${BPN}"
+CVE_VERSION ??= "${PV}"
+
+SPDXDIR ??= "${WORKDIR}/spdx"
+SPDXDEPLOY = "${SPDXDIR}/deploy"
+SPDXWORK = "${SPDXDIR}/work"
+
+SPDX_TOOL_NAME ??= "oe-spdx-creator"
+SPDX_TOOL_VERSION ??= "1.0"
+
+SPDXRUNTIMEDEPLOY = "${SPDXDIR}/runtime-deploy"
+
+SPDX_INCLUDE_SOURCES ??= "0"
+SPDX_ARCHIVE_SOURCES ??= "0"
+SPDX_ARCHIVE_PACKAGED ??= "0"
+
+SPDX_UUID_NAMESPACE ??= "sbom.openembedded.org"
+SPDX_NAMESPACE_PREFIX ??= "http://spdx.org/spdxdoc"
+SPDX_PRETTY ??= "0"
+
+SPDX_LICENSES ??= "${COREBASE}/meta/files/spdx-licenses.json"
+
+SPDX_ORG ??= "OpenEmbedded ()"
+SPDX_SUPPLIER ??= "Organization: ${SPDX_ORG}"
+SPDX_SUPPLIER[doc] = "The SPDX PackageSupplier field for SPDX packages created from \
+    this recipe. For SPDX documents create using this class during the build, this \
+    is the contact information for the person or organization who is doing the \
+    build."
+
+def extract_licenses(filename):
+    import re
+
+    lic_regex = re.compile(rb'^\W*SPDX-License-Identifier:\s*([ \w\d.()+-]+?)(?:\s+\W*)?$', re.MULTILINE)
+
+    try:
+        with open(filename, 'rb') as f:
+            size = min(15000, os.stat(filename).st_size)
+            txt = f.read(size)
+            licenses = re.findall(lic_regex, txt)
+            if licenses:
+                ascii_licenses = [lic.decode('ascii') for lic in licenses]
+                return ascii_licenses
+    except Exception as e:
+        bb.warn(f"Exception reading {filename}: {e}")
+    return None
+
+def get_doc_namespace(d, doc):
+    import uuid
+    namespace_uuid = uuid.uuid5(uuid.NAMESPACE_DNS, d.getVar("SPDX_UUID_NAMESPACE"))
+    return "%s/%s-%s" % (d.getVar("SPDX_NAMESPACE_PREFIX"), doc.name, str(uuid.uuid5(namespace_uuid, doc.name)))
+
+def create_annotation(d, comment):
+    from datetime import datetime, timezone
+
+    creation_time = datetime.now(tz=timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
+    annotation = oe.spdx.SPDXAnnotation()
+    annotation.annotationDate = creation_time
+    annotation.annotationType = "OTHER"
+    annotation.annotator = "Tool: %s - %s" % (d.getVar("SPDX_TOOL_NAME"), d.getVar("SPDX_TOOL_VERSION"))
+    annotation.comment = comment
+    return annotation
+
+def recipe_spdx_is_native(d, recipe):
+    return any(a.annotationType == "OTHER" and
+      a.annotator == "Tool: %s - %s" % (d.getVar("SPDX_TOOL_NAME"), d.getVar("SPDX_TOOL_VERSION")) and
+      a.comment == "isNative" for a in recipe.annotations)
+
+def is_work_shared_spdx(d):
+    return bb.data.inherits_class('kernel', d) or ('work-shared' in d.getVar('WORKDIR'))
+
+def get_json_indent(d):
+    if d.getVar("SPDX_PRETTY") == "1":
+        return 2
+    return None
+
+python() {
+    import json
+    if d.getVar("SPDX_LICENSE_DATA"):
+        return
+
+    with open(d.getVar("SPDX_LICENSES"), "r") as f:
+        data = json.load(f)
+        # Transform the license array to a dictionary
+        data["licenses"] = {l["licenseId"]: l for l in data["licenses"]}
+        d.setVar("SPDX_LICENSE_DATA", data)
+}
+
+def convert_license_to_spdx(lic, document, d, existing={}):
+    from pathlib import Path
+    import oe.spdx
+
+    license_data = d.getVar("SPDX_LICENSE_DATA")
+    extracted = {}
+
+    def add_extracted_license(ident, name):
+        nonlocal document
+
+        if name in extracted:
+            return
+
+        extracted_info = oe.spdx.SPDXExtractedLicensingInfo()
+        extracted_info.name = name
+        extracted_info.licenseId = ident
+        extracted_info.extractedText = None
+
+        if name == "PD":
+            # Special-case this.
+            extracted_info.extractedText = "Software released to the public domain"
+        else:
+            # Seach for the license in COMMON_LICENSE_DIR and LICENSE_PATH
+            for directory in [d.getVar('COMMON_LICENSE_DIR')] + (d.getVar('LICENSE_PATH') or '').split():
+                try:
+                    with (Path(directory) / name).open(errors="replace") as f:
+                        extracted_info.extractedText = f.read()
+                        break
+                except FileNotFoundError:
+                    pass
+            if extracted_info.extractedText is None:
+                # If it's not SPDX or PD, then NO_GENERIC_LICENSE must be set
+                filename = d.getVarFlag('NO_GENERIC_LICENSE', name)
+                if filename:
+                    filename = d.expand("${S}/" + filename)
+                    with open(filename, errors="replace") as f:
+                        extracted_info.extractedText = f.read()
+                else:
+                    bb.error("Cannot find any text for license %s" % name)
+
+        extracted[name] = extracted_info
+        document.hasExtractedLicensingInfos.append(extracted_info)
+
+    def convert(l):
+        if l == "(" or l == ")":
+            return l
+
+        if l == "&":
+            return "AND"
+
+        if l == "|":
+            return "OR"
+
+        if l == "CLOSED":
+            return "NONE"
+
+        spdx_license = d.getVarFlag("SPDXLICENSEMAP", l) or l
+        if spdx_license in license_data["licenses"]:
+            return spdx_license
+
+        try:
+            spdx_license = existing[l]
+        except KeyError:
+            spdx_license = "LicenseRef-" + l
+            add_extracted_license(spdx_license, l)
+
+        return spdx_license
+
+    lic_split = lic.replace("(", " ( ").replace(")", " ) ").split()
+
+    return ' '.join(convert(l) for l in lic_split)
+
+def process_sources(d):
+    pn = d.getVar('PN')
+    assume_provided = (d.getVar("ASSUME_PROVIDED") or "").split()
+    if pn in assume_provided:
+        for p in d.getVar("PROVIDES").split():
+            if p != pn:
+                pn = p
+                break
+
+    # glibc-locale: do_fetch, do_unpack and do_patch tasks have been deleted,
+    # so avoid archiving source here.
+    if pn.startswith('glibc-locale'):
+        return False
+    if d.getVar('PN') == "libtool-cross":
+        return False
+    if d.getVar('PN') == "libgcc-initial":
+        return False
+    if d.getVar('PN') == "shadow-sysroot":
+        return False
+
+    # We just archive gcc-source for all the gcc related recipes
+    if d.getVar('BPN') in ['gcc', 'libgcc']:
+        bb.debug(1, 'spdx: There is bug in scan of %s is, do nothing' % pn)
+        return False
+
+    return True
+
+
+def add_package_files(d, doc, spdx_pkg, topdir, get_spdxid, get_types, *, archive=None, ignore_dirs=[], ignore_top_level_dirs=[]):
+    from pathlib import Path
+    import oe.spdx
+    import hashlib
+
+    source_date_epoch = d.getVar("SOURCE_DATE_EPOCH")
+    if source_date_epoch:
+        source_date_epoch = int(source_date_epoch)
+
+    sha1s = []
+    spdx_files = []
+
+    file_counter = 1
+    for subdir, dirs, files in os.walk(topdir):
+        dirs[:] = [d for d in dirs if d not in ignore_dirs]
+        if subdir == str(topdir):
+            dirs[:] = [d for d in dirs if d not in ignore_top_level_dirs]
+
+        for file in files:
+            filepath = Path(subdir) / file
+            filename = str(filepath.relative_to(topdir))
+
+            if not filepath.is_symlink() and filepath.is_file():
+                spdx_file = oe.spdx.SPDXFile()
+                spdx_file.SPDXID = get_spdxid(file_counter)
+                for t in get_types(filepath):
+                    spdx_file.fileTypes.append(t)
+                spdx_file.fileName = filename
+
+                if archive is not None:
+                    with filepath.open("rb") as f:
+                        info = archive.gettarinfo(fileobj=f)
+                        info.name = filename
+                        info.uid = 0
+                        info.gid = 0
+                        info.uname = "root"
+                        info.gname = "root"
+
+                        if source_date_epoch is not None and info.mtime > source_date_epoch:
+                            info.mtime = source_date_epoch
+
+                        archive.addfile(info, f)
+
+                sha1 = bb.utils.sha1_file(filepath)
+                sha1s.append(sha1)
+                spdx_file.checksums.append(oe.spdx.SPDXChecksum(
+                    algorithm="SHA1",
+                    checksumValue=sha1,
+                ))
+                spdx_file.checksums.append(oe.spdx.SPDXChecksum(
+                    algorithm="SHA256",
+                    checksumValue=bb.utils.sha256_file(filepath),
+                ))
+
+                if "SOURCE" in spdx_file.fileTypes:
+                    extracted_lics = extract_licenses(filepath)
+                    if extracted_lics:
+                        spdx_file.licenseInfoInFiles = extracted_lics
+
+                doc.files.append(spdx_file)
+                doc.add_relationship(spdx_pkg, "CONTAINS", spdx_file)
+                spdx_pkg.hasFiles.append(spdx_file.SPDXID)
+
+                spdx_files.append(spdx_file)
+
+                file_counter += 1
+
+    sha1s.sort()
+    verifier = hashlib.sha1()
+    for v in sha1s:
+        verifier.update(v.encode("utf-8"))
+    spdx_pkg.packageVerificationCode.packageVerificationCodeValue = verifier.hexdigest()
+
+    return spdx_files
+
+
+def add_package_sources_from_debug(d, package_doc, spdx_package, package, package_files, sources):
+    from pathlib import Path
+    import hashlib
+    import oe.packagedata
+    import oe.spdx
+
+    debug_search_paths = [
+        Path(d.getVar('PKGD')),
+        Path(d.getVar('STAGING_DIR_TARGET')),
+        Path(d.getVar('STAGING_DIR_NATIVE')),
+        Path(d.getVar('STAGING_KERNEL_DIR')),
+    ]
+
+    pkg_data = oe.packagedata.read_subpkgdata_extended(package, d)
+
+    if pkg_data is None:
+        return
+
+    for file_path, file_data in pkg_data["files_info"].items():
+        if not "debugsrc" in file_data:
+            continue
+
+        for pkg_file in package_files:
+            if file_path.lstrip("/") == pkg_file.fileName.lstrip("/"):
+                break
+        else:
+            bb.fatal("No package file found for %s" % str(file_path))
+            continue
+
+        for debugsrc in file_data["debugsrc"]:
+            ref_id = "NOASSERTION"
+            for search in debug_search_paths:
+                if debugsrc.startswith("/usr/src/kernel"):
+                    debugsrc_path = search / debugsrc.replace('/usr/src/kernel/', '')
+                else:
+                    debugsrc_path = search / debugsrc.lstrip("/")
+                if not debugsrc_path.exists():
+                    continue
+
+                file_sha256 = bb.utils.sha256_file(debugsrc_path)
+
+                if file_sha256 in sources:
+                    source_file = sources[file_sha256]
+
+                    doc_ref = package_doc.find_external_document_ref(source_file.doc.documentNamespace)
+                    if doc_ref is None:
+                        doc_ref = oe.spdx.SPDXExternalDocumentRef()
+                        doc_ref.externalDocumentId = "DocumentRef-dependency-" + source_file.doc.name
+                        doc_ref.spdxDocument = source_file.doc.documentNamespace
+                        doc_ref.checksum.algorithm = "SHA1"
+                        doc_ref.checksum.checksumValue = source_file.doc_sha1
+                        package_doc.externalDocumentRefs.append(doc_ref)
+
+                    ref_id = "%s:%s" % (doc_ref.externalDocumentId, source_file.file.SPDXID)
+                else:
+                    bb.debug(1, "Debug source %s with SHA256 %s not found in any dependency" % (str(debugsrc_path), file_sha256))
+                break
+            else:
+                bb.debug(1, "Debug source %s not found" % debugsrc)
+
+            package_doc.add_relationship(pkg_file, "GENERATED_FROM", ref_id, comment=debugsrc)
+
+def collect_dep_recipes(d, doc, spdx_recipe):
+    from pathlib import Path
+    import oe.sbom
+    import oe.spdx
+
+    deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX"))
+
+    dep_recipes = []
+    taskdepdata = d.getVar("BB_TASKDEPDATA", False)
+    deps = sorted(set(
+        dep[0] for dep in taskdepdata.values() if
+        dep[1] == "do_create_spdx" and dep[0] != d.getVar("PN")
+    ))
+    for dep_pn in deps:
+        dep_recipe_path = deploy_dir_spdx / "recipes" / ("recipe-%s.spdx.json" % dep_pn)
+
+        spdx_dep_doc, spdx_dep_sha1 = oe.sbom.read_doc(dep_recipe_path)
+
+        for pkg in spdx_dep_doc.packages:
+            if pkg.name == dep_pn:
+                spdx_dep_recipe = pkg
+                break
+        else:
+            continue
+
+        dep_recipes.append(oe.sbom.DepRecipe(spdx_dep_doc, spdx_dep_sha1, spdx_dep_recipe))
+
+        dep_recipe_ref = oe.spdx.SPDXExternalDocumentRef()
+        dep_recipe_ref.externalDocumentId = "DocumentRef-dependency-" + spdx_dep_doc.name
+        dep_recipe_ref.spdxDocument = spdx_dep_doc.documentNamespace
+        dep_recipe_ref.checksum.algorithm = "SHA1"
+        dep_recipe_ref.checksum.checksumValue = spdx_dep_sha1
+
+        doc.externalDocumentRefs.append(dep_recipe_ref)
+
+        doc.add_relationship(
+            "%s:%s" % (dep_recipe_ref.externalDocumentId, spdx_dep_recipe.SPDXID),
+            "BUILD_DEPENDENCY_OF",
+            spdx_recipe
+        )
+
+    return dep_recipes
+
+collect_dep_recipes[vardepsexclude] += "BB_TASKDEPDATA"
+
+
+def collect_dep_sources(d, dep_recipes):
+    import oe.sbom
+
+    sources = {}
+    for dep in dep_recipes:
+        # Don't collect sources from native recipes as they
+        # match non-native sources also.
+        if recipe_spdx_is_native(d, dep.recipe):
+            continue
+        recipe_files = set(dep.recipe.hasFiles)
+
+        for spdx_file in dep.doc.files:
+            if spdx_file.SPDXID not in recipe_files:
+                continue
+
+            if "SOURCE" in spdx_file.fileTypes:
+                for checksum in spdx_file.checksums:
+                    if checksum.algorithm == "SHA256":
+                        sources[checksum.checksumValue] = oe.sbom.DepSource(dep.doc, dep.doc_sha1, dep.recipe, spdx_file)
+                        break
+
+    return sources
+
+
+python do_create_spdx() {
+    from datetime import datetime, timezone
+    import oe.sbom
+    import oe.spdx
+    import uuid
+    from pathlib import Path
+    from contextlib import contextmanager
+    import oe.cve_check
+
+    @contextmanager
+    def optional_tarfile(name, guard, mode="w"):
+        import tarfile
+        import bb.compress.zstd
+
+        num_threads = int(d.getVar("BB_NUMBER_THREADS"))
+
+        if guard:
+            name.parent.mkdir(parents=True, exist_ok=True)
+            with bb.compress.zstd.open(name, mode=mode + "b", num_threads=num_threads) as f:
+                with tarfile.open(fileobj=f, mode=mode + "|") as tf:
+                    yield tf
+        else:
+            yield None
+
+
+    deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX"))
+    spdx_workdir = Path(d.getVar("SPDXWORK"))
+    include_sources = d.getVar("SPDX_INCLUDE_SOURCES") == "1"
+    archive_sources = d.getVar("SPDX_ARCHIVE_SOURCES") == "1"
+    archive_packaged = d.getVar("SPDX_ARCHIVE_PACKAGED") == "1"
+
+    creation_time = datetime.now(tz=timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
+
+    doc = oe.spdx.SPDXDocument()
+
+    doc.name = "recipe-" + d.getVar("PN")
+    doc.documentNamespace = get_doc_namespace(d, doc)
+    doc.creationInfo.created = creation_time
+    doc.creationInfo.comment = "This document was created by analyzing recipe files during the build."
+    doc.creationInfo.licenseListVersion = d.getVar("SPDX_LICENSE_DATA")["licenseListVersion"]
+    doc.creationInfo.creators.append("Tool: OpenEmbedded Core create-spdx.bbclass")
+    doc.creationInfo.creators.append("Organization: %s" % d.getVar("SPDX_ORG"))
+    doc.creationInfo.creators.append("Person: N/A ()")
+
+    recipe = oe.spdx.SPDXPackage()
+    recipe.name = d.getVar("PN")
+    recipe.versionInfo = d.getVar("PV")
+    recipe.SPDXID = oe.sbom.get_recipe_spdxid(d)
+    recipe.supplier = d.getVar("SPDX_SUPPLIER")
+    if bb.data.inherits_class("native", d) or bb.data.inherits_class("cross", d):
+        recipe.annotations.append(create_annotation(d, "isNative"))
+
+    for s in d.getVar('SRC_URI').split():
+        if not s.startswith("file://"):
+            s = s.split(';')[0]
+            recipe.downloadLocation = s
+            break
+    else:
+        recipe.downloadLocation = "NOASSERTION"
+
+    homepage = d.getVar("HOMEPAGE")
+    if homepage:
+        recipe.homepage = homepage
+
+    license = d.getVar("LICENSE")
+    if license:
+        recipe.licenseDeclared = convert_license_to_spdx(license, doc, d)
+
+    summary = d.getVar("SUMMARY")
+    if summary:
+        recipe.summary = summary
+
+    description = d.getVar("DESCRIPTION")
+    if description:
+        recipe.description = description
+
+    # Some CVEs may be patched during the build process without incrementing the version number,
+    # so querying for CVEs based on the CPE id can lead to false positives. To account for this,
+    # save the CVEs fixed by patches to source information field in the SPDX.
+    patched_cves = oe.cve_check.get_patched_cves(d)
+    patched_cves = list(patched_cves)
+    patched_cves = ' '.join(patched_cves)
+    if patched_cves:
+        recipe.sourceInfo = "CVEs fixed: " + patched_cves
+
+    cpe_ids = oe.cve_check.get_cpe_ids(d.getVar("CVE_PRODUCT"), d.getVar("CVE_VERSION"))
+    if cpe_ids:
+        for cpe_id in cpe_ids:
+            cpe = oe.spdx.SPDXExternalReference()
+            cpe.referenceCategory = "SECURITY"
+            cpe.referenceType = "http://spdx.org/rdf/references/cpe23Type"
+            cpe.referenceLocator = cpe_id
+            recipe.externalRefs.append(cpe)
+
+    doc.packages.append(recipe)
+    doc.add_relationship(doc, "DESCRIBES", recipe)
+
+    if process_sources(d) and include_sources:
+        recipe_archive = deploy_dir_spdx / "recipes" / (doc.name + ".tar.zst")
+        with optional_tarfile(recipe_archive, archive_sources) as archive:
+            spdx_get_src(d)
+
+            add_package_files(
+                d,
+                doc,
+                recipe,
+                spdx_workdir,
+                lambda file_counter: "SPDXRef-SourceFile-%s-%d" % (d.getVar("PN"), file_counter),
+                lambda filepath: ["SOURCE"],
+                ignore_dirs=[".git"],
+                ignore_top_level_dirs=["temp"],
+                archive=archive,
+            )
+
+            if archive is not None:
+                recipe.packageFileName = str(recipe_archive.name)
+
+    dep_recipes = collect_dep_recipes(d, doc, recipe)
+
+    doc_sha1 = oe.sbom.write_doc(d, doc, "recipes", indent=get_json_indent(d))
+    dep_recipes.append(oe.sbom.DepRecipe(doc, doc_sha1, recipe))
+
+    recipe_ref = oe.spdx.SPDXExternalDocumentRef()
+    recipe_ref.externalDocumentId = "DocumentRef-recipe-" + recipe.name
+    recipe_ref.spdxDocument = doc.documentNamespace
+    recipe_ref.checksum.algorithm = "SHA1"
+    recipe_ref.checksum.checksumValue = doc_sha1
+
+    sources = collect_dep_sources(d, dep_recipes)
+    found_licenses = {license.name:recipe_ref.externalDocumentId + ":" + license.licenseId for license in doc.hasExtractedLicensingInfos}
+
+    if not recipe_spdx_is_native(d, recipe):
+        bb.build.exec_func("read_subpackage_metadata", d)
+
+        pkgdest = Path(d.getVar("PKGDEST"))
+        for package in d.getVar("PACKAGES").split():
+            if not oe.packagedata.packaged(package, d):
+                continue
+
+            package_doc = oe.spdx.SPDXDocument()
+            pkg_name = d.getVar("PKG:%s" % package) or package
+            package_doc.name = pkg_name
+            package_doc.documentNamespace = get_doc_namespace(d, package_doc)
+            package_doc.creationInfo.created = creation_time
+            package_doc.creationInfo.comment = "This document was created by analyzing packages created during the build."
+            package_doc.creationInfo.licenseListVersion = d.getVar("SPDX_LICENSE_DATA")["licenseListVersion"]
+            package_doc.creationInfo.creators.append("Tool: OpenEmbedded Core create-spdx.bbclass")
+            package_doc.creationInfo.creators.append("Organization: %s" % d.getVar("SPDX_ORG"))
+            package_doc.creationInfo.creators.append("Person: N/A ()")
+            package_doc.externalDocumentRefs.append(recipe_ref)
+
+            package_license = d.getVar("LICENSE:%s" % package) or d.getVar("LICENSE")
+
+            spdx_package = oe.spdx.SPDXPackage()
+
+            spdx_package.SPDXID = oe.sbom.get_package_spdxid(pkg_name)
+            spdx_package.name = pkg_name
+            spdx_package.versionInfo = d.getVar("PV")
+            spdx_package.licenseDeclared = convert_license_to_spdx(package_license, package_doc, d, found_licenses)
+            spdx_package.supplier = d.getVar("SPDX_SUPPLIER")
+
+            package_doc.packages.append(spdx_package)
+
+            package_doc.add_relationship(spdx_package, "GENERATED_FROM", "%s:%s" % (recipe_ref.externalDocumentId, recipe.SPDXID))
+            package_doc.add_relationship(package_doc, "DESCRIBES", spdx_package)
+
+            package_archive = deploy_dir_spdx / "packages" / (package_doc.name + ".tar.zst")
+            with optional_tarfile(package_archive, archive_packaged) as archive:
+                package_files = add_package_files(
+                    d,
+                    package_doc,
+                    spdx_package,
+                    pkgdest / package,
+                    lambda file_counter: oe.sbom.get_packaged_file_spdxid(pkg_name, file_counter),
+                    lambda filepath: ["BINARY"],
+                    ignore_top_level_dirs=['CONTROL', 'DEBIAN'],
+                    archive=archive,
+                )
+
+                if archive is not None:
+                    spdx_package.packageFileName = str(package_archive.name)
+
+            add_package_sources_from_debug(d, package_doc, spdx_package, package, package_files, sources)
+
+            oe.sbom.write_doc(d, package_doc, "packages", indent=get_json_indent(d))
+}
+# NOTE: depending on do_unpack is a hack that is necessary to get it's dependencies for archive the source
+addtask do_create_spdx after do_package do_packagedata do_unpack before do_populate_sdk do_build do_rm_work
+
+SSTATETASKS += "do_create_spdx"
+do_create_spdx[sstate-inputdirs] = "${SPDXDEPLOY}"
+do_create_spdx[sstate-outputdirs] = "${DEPLOY_DIR_SPDX}"
+
+python do_create_spdx_setscene () {
+    sstate_setscene(d)
+}
+addtask do_create_spdx_setscene
+
+do_create_spdx[dirs] = "${SPDXWORK}"
+do_create_spdx[cleandirs] = "${SPDXDEPLOY} ${SPDXWORK}"
+do_create_spdx[depends] += "${PATCHDEPENDENCY}"
+do_create_spdx[deptask] = "do_create_spdx"
+
+def collect_package_providers(d):
+    from pathlib import Path
+    import oe.sbom
+    import oe.spdx
+    import json
+
+    deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX"))
+
+    providers = {}
+
+    taskdepdata = d.getVar("BB_TASKDEPDATA", False)
+    deps = sorted(set(
+        dep[0] for dep in taskdepdata.values() if dep[0] != d.getVar("PN")
+    ))
+    deps.append(d.getVar("PN"))
+
+    for dep_pn in deps:
+        recipe_data = oe.packagedata.read_pkgdata(dep_pn, d)
+
+        for pkg in recipe_data.get("PACKAGES", "").split():
+
+            pkg_data = oe.packagedata.read_subpkgdata_dict(pkg, d)
+            rprovides = set(n for n, _ in bb.utils.explode_dep_versions2(pkg_data.get("RPROVIDES", "")).items())
+            rprovides.add(pkg)
+
+            for r in rprovides:
+                providers[r] = pkg
+
+    return providers
+
+collect_package_providers[vardepsexclude] += "BB_TASKDEPDATA"
+
+python do_create_runtime_spdx() {
+    from datetime import datetime, timezone
+    import oe.sbom
+    import oe.spdx
+    import oe.packagedata
+    from pathlib import Path
+
+    deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX"))
+    spdx_deploy = Path(d.getVar("SPDXRUNTIMEDEPLOY"))
+    is_native = bb.data.inherits_class("native", d) or bb.data.inherits_class("cross", d)
+
+    creation_time = datetime.now(tz=timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
+
+    providers = collect_package_providers(d)
+
+    if not is_native:
+        bb.build.exec_func("read_subpackage_metadata", d)
+
+        dep_package_cache = {}
+
+        pkgdest = Path(d.getVar("PKGDEST"))
+        for package in d.getVar("PACKAGES").split():
+            localdata = bb.data.createCopy(d)
+            pkg_name = d.getVar("PKG:%s" % package) or package
+            localdata.setVar("PKG", pkg_name)
+            localdata.setVar('OVERRIDES', d.getVar("OVERRIDES", False) + ":" + package)
+
+            if not oe.packagedata.packaged(package, localdata):
+                continue
+
+            pkg_spdx_path = deploy_dir_spdx / "packages" / (pkg_name + ".spdx.json")
+
+            package_doc, package_doc_sha1 = oe.sbom.read_doc(pkg_spdx_path)
+
+            for p in package_doc.packages:
+                if p.name == pkg_name:
+                    spdx_package = p
+                    break
+            else:
+                bb.fatal("Package '%s' not found in %s" % (pkg_name, pkg_spdx_path))
+
+            runtime_doc = oe.spdx.SPDXDocument()
+            runtime_doc.name = "runtime-" + pkg_name
+            runtime_doc.documentNamespace = get_doc_namespace(localdata, runtime_doc)
+            runtime_doc.creationInfo.created = creation_time
+            runtime_doc.creationInfo.comment = "This document was created by analyzing package runtime dependencies."
+            runtime_doc.creationInfo.licenseListVersion = d.getVar("SPDX_LICENSE_DATA")["licenseListVersion"]
+            runtime_doc.creationInfo.creators.append("Tool: OpenEmbedded Core create-spdx.bbclass")
+            runtime_doc.creationInfo.creators.append("Organization: %s" % d.getVar("SPDX_ORG"))
+            runtime_doc.creationInfo.creators.append("Person: N/A ()")
+
+            package_ref = oe.spdx.SPDXExternalDocumentRef()
+            package_ref.externalDocumentId = "DocumentRef-package-" + package
+            package_ref.spdxDocument = package_doc.documentNamespace
+            package_ref.checksum.algorithm = "SHA1"
+            package_ref.checksum.checksumValue = package_doc_sha1
+
+            runtime_doc.externalDocumentRefs.append(package_ref)
+
+            runtime_doc.add_relationship(
+                runtime_doc.SPDXID,
+                "AMENDS",
+                "%s:%s" % (package_ref.externalDocumentId, package_doc.SPDXID)
+            )
+
+            deps = bb.utils.explode_dep_versions2(localdata.getVar("RDEPENDS") or "")
+            seen_deps = set()
+            for dep, _ in deps.items():
+                if dep in seen_deps:
+                    continue
+
+                if dep not in providers:
+                    continue
+
+                dep = providers[dep]
+
+                if not oe.packagedata.packaged(dep, localdata):
+                    continue
+
+                dep_pkg_data = oe.packagedata.read_subpkgdata_dict(dep, d)
+                dep_pkg = dep_pkg_data["PKG"]
+
+                if dep in dep_package_cache:
+                    (dep_spdx_package, dep_package_ref) = dep_package_cache[dep]
+                else:
+                    dep_path = deploy_dir_spdx / "packages" / ("%s.spdx.json" % dep_pkg)
+
+                    spdx_dep_doc, spdx_dep_sha1 = oe.sbom.read_doc(dep_path)
+
+                    for pkg in spdx_dep_doc.packages:
+                        if pkg.name == dep_pkg:
+                            dep_spdx_package = pkg
+                            break
+                    else:
+                        bb.fatal("Package '%s' not found in %s" % (dep_pkg, dep_path))
+
+                    dep_package_ref = oe.spdx.SPDXExternalDocumentRef()
+                    dep_package_ref.externalDocumentId = "DocumentRef-runtime-dependency-" + spdx_dep_doc.name
+                    dep_package_ref.spdxDocument = spdx_dep_doc.documentNamespace
+                    dep_package_ref.checksum.algorithm = "SHA1"
+                    dep_package_ref.checksum.checksumValue = spdx_dep_sha1
+
+                    dep_package_cache[dep] = (dep_spdx_package, dep_package_ref)
+
+                    runtime_doc.externalDocumentRefs.append(dep_package_ref)
+
+                runtime_doc.add_relationship(
+                    "%s:%s" % (dep_package_ref.externalDocumentId, dep_spdx_package.SPDXID),
+                    "RUNTIME_DEPENDENCY_OF",
+                    "%s:%s" % (package_ref.externalDocumentId, spdx_package.SPDXID)
+                )
+                seen_deps.add(dep)
+
+            oe.sbom.write_doc(d, runtime_doc, "runtime", spdx_deploy, indent=get_json_indent(d))
+}
+
+addtask do_create_runtime_spdx after do_create_spdx before do_build do_rm_work
+SSTATETASKS += "do_create_runtime_spdx"
+do_create_runtime_spdx[sstate-inputdirs] = "${SPDXRUNTIMEDEPLOY}"
+do_create_runtime_spdx[sstate-outputdirs] = "${DEPLOY_DIR_SPDX}"
+
+python do_create_runtime_spdx_setscene () {
+    sstate_setscene(d)
+}
+addtask do_create_runtime_spdx_setscene
+
+do_create_runtime_spdx[dirs] = "${SPDXRUNTIMEDEPLOY}"
+do_create_runtime_spdx[cleandirs] = "${SPDXRUNTIMEDEPLOY}"
+do_create_runtime_spdx[rdeptask] = "do_create_spdx"
+
+def spdx_get_src(d):
+    """
+    save patched source of the recipe in SPDX_WORKDIR.
+    """
+    import shutil
+    spdx_workdir = d.getVar('SPDXWORK')
+    spdx_sysroot_native = d.getVar('STAGING_DIR_NATIVE')
+    pn = d.getVar('PN')
+
+    workdir = d.getVar("WORKDIR")
+
+    try:
+        # The kernel class functions require it to be on work-shared, so we dont change WORKDIR
+        if not is_work_shared_spdx(d):
+            # Change the WORKDIR to make do_unpack do_patch run in another dir.
+            d.setVar('WORKDIR', spdx_workdir)
+            # Restore the original path to recipe's native sysroot (it's relative to WORKDIR).
+            d.setVar('STAGING_DIR_NATIVE', spdx_sysroot_native)
+
+            # The changed 'WORKDIR' also caused 'B' changed, create dir 'B' for the
+            # possibly requiring of the following tasks (such as some recipes's
+            # do_patch required 'B' existed).
+            bb.utils.mkdirhier(d.getVar('B'))
+
+            bb.build.exec_func('do_unpack', d)
+        # Copy source of kernel to spdx_workdir
+        if is_work_shared_spdx(d):
+            share_src = d.getVar('WORKDIR')
+            d.setVar('WORKDIR', spdx_workdir)
+            d.setVar('STAGING_DIR_NATIVE', spdx_sysroot_native)
+            src_dir = spdx_workdir + "/" + d.getVar('PN')+ "-" + d.getVar('PV') + "-" + d.getVar('PR')
+            bb.utils.mkdirhier(src_dir)
+            if bb.data.inherits_class('kernel',d):
+                share_src = d.getVar('STAGING_KERNEL_DIR')
+            cmd_copy_share = "cp -rf " + share_src + "/* " + src_dir + "/"
+            cmd_copy_shared_res = os.popen(cmd_copy_share).read()
+            bb.note("cmd_copy_shared_result = " + cmd_copy_shared_res)
+
+            git_path = src_dir + "/.git"
+            if os.path.exists(git_path):
+                shutils.rmtree(git_path)
+
+        # Make sure gcc and kernel sources are patched only once
+        if not (d.getVar('SRC_URI') == "" or is_work_shared_spdx(d)):
+            bb.build.exec_func('do_patch', d)
+
+        # Some userland has no source.
+        if not os.path.exists( spdx_workdir ):
+            bb.utils.mkdirhier(spdx_workdir)
+    finally:
+        d.setVar("WORKDIR", workdir)
+
+do_rootfs[recrdeptask] += "do_create_spdx do_create_runtime_spdx"
+
+ROOTFS_POSTUNINSTALL_COMMAND =+ "image_combine_spdx ; "
+
+do_populate_sdk[recrdeptask] += "do_create_spdx do_create_runtime_spdx"
+POPULATE_SDK_POST_HOST_COMMAND:append:task-populate-sdk = " sdk_host_combine_spdx; "
+POPULATE_SDK_POST_TARGET_COMMAND:append:task-populate-sdk = " sdk_target_combine_spdx; "
+
+python image_combine_spdx() {
+    import os
+    import oe.sbom
+    from pathlib import Path
+    from oe.rootfs import image_list_installed_packages
+
+    image_name = d.getVar("IMAGE_NAME")
+    image_link_name = d.getVar("IMAGE_LINK_NAME")
+    imgdeploydir = Path(d.getVar("IMGDEPLOYDIR"))
+    img_spdxid = oe.sbom.get_image_spdxid(image_name)
+    packages = image_list_installed_packages(d)
+
+    combine_spdx(d, image_name, imgdeploydir, img_spdxid, packages)
+
+    def make_image_link(target_path, suffix):
+        if image_link_name:
+            link = imgdeploydir / (image_link_name + suffix)
+            if link != target_path:
+                link.symlink_to(os.path.relpath(target_path, link.parent))
+
+    image_spdx_path = imgdeploydir / (image_name + ".spdx.json")
+    make_image_link(image_spdx_path, ".spdx.json")
+    spdx_tar_path = imgdeploydir / (image_name + ".spdx.tar.zst")
+    make_image_link(spdx_tar_path, ".spdx.tar.zst")
+    spdx_index_path = imgdeploydir / (image_name + ".spdx.index.json")
+    make_image_link(spdx_index_path, ".spdx.index.json")
+}
+
+python sdk_host_combine_spdx() {
+    sdk_combine_spdx(d, "host")
+}
+
+python sdk_target_combine_spdx() {
+    sdk_combine_spdx(d, "target")
+}
+
+def sdk_combine_spdx(d, sdk_type):
+    import oe.sbom
+    from pathlib import Path
+    from oe.sdk import sdk_list_installed_packages
+
+    sdk_name = d.getVar("SDK_NAME") + "-" + sdk_type
+    sdk_deploydir = Path(d.getVar("SDKDEPLOYDIR"))
+    sdk_spdxid = oe.sbom.get_sdk_spdxid(sdk_name)
+    sdk_packages = sdk_list_installed_packages(d, sdk_type == "target")
+    combine_spdx(d, sdk_name, sdk_deploydir, sdk_spdxid, sdk_packages)
+
+def combine_spdx(d, rootfs_name, rootfs_deploydir, rootfs_spdxid, packages):
+    import os
+    import oe.spdx
+    import oe.sbom
+    import io
+    import json
+    from datetime import timezone, datetime
+    from pathlib import Path
+    import tarfile
+    import bb.compress.zstd
+
+    creation_time = datetime.now(tz=timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
+    deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX"))
+    source_date_epoch = d.getVar("SOURCE_DATE_EPOCH")
+
+    doc = oe.spdx.SPDXDocument()
+    doc.name = rootfs_name
+    doc.documentNamespace = get_doc_namespace(d, doc)
+    doc.creationInfo.created = creation_time
+    doc.creationInfo.comment = "This document was created by analyzing the source of the Yocto recipe during the build."
+    doc.creationInfo.licenseListVersion = d.getVar("SPDX_LICENSE_DATA")["licenseListVersion"]
+    doc.creationInfo.creators.append("Tool: OpenEmbedded Core create-spdx.bbclass")
+    doc.creationInfo.creators.append("Organization: %s" % d.getVar("SPDX_ORG"))
+    doc.creationInfo.creators.append("Person: N/A ()")
+
+    image = oe.spdx.SPDXPackage()
+    image.name = d.getVar("PN")
+    image.versionInfo = d.getVar("PV")
+    image.SPDXID = rootfs_spdxid
+    image.supplier = d.getVar("SPDX_SUPPLIER")
+
+    doc.packages.append(image)
+
+    for name in sorted(packages.keys()):
+        pkg_spdx_path = deploy_dir_spdx / "packages" / (name + ".spdx.json")
+        pkg_doc, pkg_doc_sha1 = oe.sbom.read_doc(pkg_spdx_path)
+
+        for p in pkg_doc.packages:
+            if p.name == name:
+                pkg_ref = oe.spdx.SPDXExternalDocumentRef()
+                pkg_ref.externalDocumentId = "DocumentRef-%s" % pkg_doc.name
+                pkg_ref.spdxDocument = pkg_doc.documentNamespace
+                pkg_ref.checksum.algorithm = "SHA1"
+                pkg_ref.checksum.checksumValue = pkg_doc_sha1
+
+                doc.externalDocumentRefs.append(pkg_ref)
+                doc.add_relationship(image, "CONTAINS", "%s:%s" % (pkg_ref.externalDocumentId, p.SPDXID))
+                break
+        else:
+            bb.fatal("Unable to find package with name '%s' in SPDX file %s" % (name, pkg_spdx_path))
+
+        runtime_spdx_path = deploy_dir_spdx / "runtime" / ("runtime-" + name + ".spdx.json")
+        runtime_doc, runtime_doc_sha1 = oe.sbom.read_doc(runtime_spdx_path)
+
+        runtime_ref = oe.spdx.SPDXExternalDocumentRef()
+        runtime_ref.externalDocumentId = "DocumentRef-%s" % runtime_doc.name
+        runtime_ref.spdxDocument = runtime_doc.documentNamespace
+        runtime_ref.checksum.algorithm = "SHA1"
+        runtime_ref.checksum.checksumValue = runtime_doc_sha1
+
+        # "OTHER" isn't ideal here, but I can't find a relationship that makes sense
+        doc.externalDocumentRefs.append(runtime_ref)
+        doc.add_relationship(
+            image,
+            "OTHER",
+            "%s:%s" % (runtime_ref.externalDocumentId, runtime_doc.SPDXID),
+            comment="Runtime dependencies for %s" % name
+        )
+
+    image_spdx_path = rootfs_deploydir / (rootfs_name + ".spdx.json")
+
+    with image_spdx_path.open("wb") as f:
+        doc.to_json(f, sort_keys=True, indent=get_json_indent(d))
+
+    num_threads = int(d.getVar("BB_NUMBER_THREADS"))
+
+    visited_docs = set()
+
+    index = {"documents": []}
+
+    spdx_tar_path = rootfs_deploydir / (rootfs_name + ".spdx.tar.zst")
+    with bb.compress.zstd.open(spdx_tar_path, "w", num_threads=num_threads) as f:
+        with tarfile.open(fileobj=f, mode="w|") as tar:
+            def collect_spdx_document(path):
+                nonlocal tar
+                nonlocal deploy_dir_spdx
+                nonlocal source_date_epoch
+                nonlocal index
+
+                if path in visited_docs:
+                    return
+
+                visited_docs.add(path)
+
+                with path.open("rb") as f:
+                    doc, sha1 = oe.sbom.read_doc(f)
+                    f.seek(0)
+
+                    if doc.documentNamespace in visited_docs:
+                        return
+
+                    bb.note("Adding SPDX document %s" % path)
+                    visited_docs.add(doc.documentNamespace)
+                    info = tar.gettarinfo(fileobj=f)
+
+                    info.name = doc.name + ".spdx.json"
+                    info.uid = 0
+                    info.gid = 0
+                    info.uname = "root"
+                    info.gname = "root"
+
+                    if source_date_epoch is not None and info.mtime > int(source_date_epoch):
+                        info.mtime = int(source_date_epoch)
+
+                    tar.addfile(info, f)
+
+                    index["documents"].append({
+                        "filename": info.name,
+                        "documentNamespace": doc.documentNamespace,
+                        "sha1": sha1,
+                    })
+
+                for ref in doc.externalDocumentRefs:
+                    ref_path = deploy_dir_spdx / "by-namespace" / ref.spdxDocument.replace("/", "_")
+                    collect_spdx_document(ref_path)
+
+            collect_spdx_document(image_spdx_path)
+
+            index["documents"].sort(key=lambda x: x["filename"])
+
+            index_str = io.BytesIO(json.dumps(
+                index,
+                sort_keys=True,
+                indent=get_json_indent(d),
+            ).encode("utf-8"))
+
+            info = tarfile.TarInfo()
+            info.name = "index.json"
+            info.size = len(index_str.getvalue())
+            info.uid = 0
+            info.gid = 0
+            info.uname = "root"
+            info.gname = "root"
+
+            tar.addfile(info, fileobj=index_str)
+
+    spdx_index_path = rootfs_deploydir / (rootfs_name + ".spdx.index.json")
+    with spdx_index_path.open("w") as f:
+        json.dump(index, f, sort_keys=True, indent=get_json_indent(d))
diff --git a/poky/meta/classes/create-spdx.bbclass b/poky/meta/classes/create-spdx.bbclass
index af6afcc653..19c6c0ff0b 100644
--- a/poky/meta/classes/create-spdx.bbclass
+++ b/poky/meta/classes/create-spdx.bbclass
@@ -3,1023 +3,6 @@
 #
 # SPDX-License-Identifier: GPL-2.0-only
 #
-
-DEPLOY_DIR_SPDX ??= "${DEPLOY_DIR}/spdx/${MACHINE}"
-
-# The product name that the CVE database uses.  Defaults to BPN, but may need to
-# be overriden per recipe (for example tiff.bb sets CVE_PRODUCT=libtiff).
-CVE_PRODUCT ??= "${BPN}"
-CVE_VERSION ??= "${PV}"
-
-SPDXDIR ??= "${WORKDIR}/spdx"
-SPDXDEPLOY = "${SPDXDIR}/deploy"
-SPDXWORK = "${SPDXDIR}/work"
-
-SPDX_TOOL_NAME ??= "oe-spdx-creator"
-SPDX_TOOL_VERSION ??= "1.0"
-
-SPDXRUNTIMEDEPLOY = "${SPDXDIR}/runtime-deploy"
-
-SPDX_INCLUDE_SOURCES ??= "0"
-SPDX_ARCHIVE_SOURCES ??= "0"
-SPDX_ARCHIVE_PACKAGED ??= "0"
-
-SPDX_UUID_NAMESPACE ??= "sbom.openembedded.org"
-SPDX_NAMESPACE_PREFIX ??= "http://spdx.org/spdxdoc"
-SPDX_PRETTY ??= "0"
-
-SPDX_LICENSES ??= "${COREBASE}/meta/files/spdx-licenses.json"
-
-SPDX_ORG ??= "OpenEmbedded ()"
-SPDX_SUPPLIER ??= "Organization: ${SPDX_ORG}"
-SPDX_SUPPLIER[doc] = "The SPDX PackageSupplier field for SPDX packages created from \
-    this recipe. For SPDX documents create using this class during the build, this \
-    is the contact information for the person or organization who is doing the \
-    build."
- -def extract_licenses(filename): - import re - - lic_regex = re.compile(rb'^\W*SPDX-License-Identifier:\s*([ \w\d.()+-]+?)(?:\s+\W*)?$', re.MULTILINE) - - try: - with open(filename, 'rb') as f: - size = min(15000, os.stat(filename).st_size) - txt = f.read(size) - licenses = re.findall(lic_regex, txt) - if licenses: - ascii_licenses = [lic.decode('ascii') for lic in licenses] - return ascii_licenses - except Exception as e: - bb.warn(f"Exception reading {filename}: {e}") - return None - -def get_doc_namespace(d, doc): - import uuid - namespace_uuid = uuid.uuid5(uuid.NAMESPACE_DNS, d.getVar("SPDX_UUID_NAMESPACE")) - return "%s/%s-%s" % (d.getVar("SPDX_NAMESPACE_PREFIX"), doc.name, str(uuid.uuid5(namespace_uuid, doc.name))) - -def create_annotation(d, comment): - from datetime import datetime, timezone - - creation_time = datetime.now(tz=timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ") - annotation = oe.spdx.SPDXAnnotation() - annotation.annotationDate = creation_time - annotation.annotationType = "OTHER" - annotation.annotator = "Tool: %s - %s" % (d.getVar("SPDX_TOOL_NAME"), d.getVar("SPDX_TOOL_VERSION")) - annotation.comment = comment - return annotation - -def recipe_spdx_is_native(d, recipe): - return any(a.annotationType == "OTHER" and - a.annotator == "Tool: %s - %s" % (d.getVar("SPDX_TOOL_NAME"), d.getVar("SPDX_TOOL_VERSION")) and - a.comment == "isNative" for a in recipe.annotations) - -def is_work_shared_spdx(d): - return bb.data.inherits_class('kernel', d) or ('work-shared' in d.getVar('WORKDIR')) - -def get_json_indent(d): - if d.getVar("SPDX_PRETTY") == "1": - return 2 - return None - -python() { - import json - if d.getVar("SPDX_LICENSE_DATA"): - return - - with open(d.getVar("SPDX_LICENSES"), "r") as f: - data = json.load(f) - # Transform the license array to a dictionary - data["licenses"] = {l["licenseId"]: l for l in data["licenses"]} - d.setVar("SPDX_LICENSE_DATA", data) -} - -def convert_license_to_spdx(lic, document, d, existing={}): - from pathlib import Path - import oe.spdx - - license_data = d.getVar("SPDX_LICENSE_DATA") - extracted = {} - - def add_extracted_license(ident, name): - nonlocal document - - if name in extracted: - return - - extracted_info = oe.spdx.SPDXExtractedLicensingInfo() - extracted_info.name = name - extracted_info.licenseId = ident - extracted_info.extractedText = None - - if name == "PD": - # Special-case this. 
- extracted_info.extractedText = "Software released to the public domain" - else: - # Seach for the license in COMMON_LICENSE_DIR and LICENSE_PATH - for directory in [d.getVar('COMMON_LICENSE_DIR')] + (d.getVar('LICENSE_PATH') or '').split(): - try: - with (Path(directory) / name).open(errors="replace") as f: - extracted_info.extractedText = f.read() - break - except FileNotFoundError: - pass - if extracted_info.extractedText is None: - # If it's not SPDX or PD, then NO_GENERIC_LICENSE must be set - filename = d.getVarFlag('NO_GENERIC_LICENSE', name) - if filename: - filename = d.expand("${S}/" + filename) - with open(filename, errors="replace") as f: - extracted_info.extractedText = f.read() - else: - bb.error("Cannot find any text for license %s" % name) - - extracted[name] = extracted_info - document.hasExtractedLicensingInfos.append(extracted_info) - - def convert(l): - if l == "(" or l == ")": - return l - - if l == "&": - return "AND" - - if l == "|": - return "OR" - - if l == "CLOSED": - return "NONE" - - spdx_license = d.getVarFlag("SPDXLICENSEMAP", l) or l - if spdx_license in license_data["licenses"]: - return spdx_license - - try: - spdx_license = existing[l] - except KeyError: - spdx_license = "LicenseRef-" + l - add_extracted_license(spdx_license, l) - - return spdx_license - - lic_split = lic.replace("(", " ( ").replace(")", " ) ").split() - - return ' '.join(convert(l) for l in lic_split) - -def process_sources(d): - pn = d.getVar('PN') - assume_provided = (d.getVar("ASSUME_PROVIDED") or "").split() - if pn in assume_provided: - for p in d.getVar("PROVIDES").split(): - if p != pn: - pn = p - break - - # glibc-locale: do_fetch, do_unpack and do_patch tasks have been deleted, - # so avoid archiving source here. - if pn.startswith('glibc-locale'): - return False - if d.getVar('PN') == "libtool-cross": - return False - if d.getVar('PN') == "libgcc-initial": - return False - if d.getVar('PN') == "shadow-sysroot": - return False - - # We just archive gcc-source for all the gcc related recipes - if d.getVar('BPN') in ['gcc', 'libgcc']: - bb.debug(1, 'spdx: There is bug in scan of %s is, do nothing' % pn) - return False - - return True - - -def add_package_files(d, doc, spdx_pkg, topdir, get_spdxid, get_types, *, archive=None, ignore_dirs=[], ignore_top_level_dirs=[]): - from pathlib import Path - import oe.spdx - import hashlib - - source_date_epoch = d.getVar("SOURCE_DATE_EPOCH") - if source_date_epoch: - source_date_epoch = int(source_date_epoch) - - sha1s = [] - spdx_files = [] - - file_counter = 1 - for subdir, dirs, files in os.walk(topdir): - dirs[:] = [d for d in dirs if d not in ignore_dirs] - if subdir == str(topdir): - dirs[:] = [d for d in dirs if d not in ignore_top_level_dirs] - - for file in files: - filepath = Path(subdir) / file - filename = str(filepath.relative_to(topdir)) - - if not filepath.is_symlink() and filepath.is_file(): - spdx_file = oe.spdx.SPDXFile() - spdx_file.SPDXID = get_spdxid(file_counter) - for t in get_types(filepath): - spdx_file.fileTypes.append(t) - spdx_file.fileName = filename - - if archive is not None: - with filepath.open("rb") as f: - info = archive.gettarinfo(fileobj=f) - info.name = filename - info.uid = 0 - info.gid = 0 - info.uname = "root" - info.gname = "root" - - if source_date_epoch is not None and info.mtime > source_date_epoch: - info.mtime = source_date_epoch - - archive.addfile(info, f) - - sha1 = bb.utils.sha1_file(filepath) - sha1s.append(sha1) - spdx_file.checksums.append(oe.spdx.SPDXChecksum( - algorithm="SHA1", - 
checksumValue=sha1, - )) - spdx_file.checksums.append(oe.spdx.SPDXChecksum( - algorithm="SHA256", - checksumValue=bb.utils.sha256_file(filepath), - )) - - if "SOURCE" in spdx_file.fileTypes: - extracted_lics = extract_licenses(filepath) - if extracted_lics: - spdx_file.licenseInfoInFiles = extracted_lics - - doc.files.append(spdx_file) - doc.add_relationship(spdx_pkg, "CONTAINS", spdx_file) - spdx_pkg.hasFiles.append(spdx_file.SPDXID) - - spdx_files.append(spdx_file) - - file_counter += 1 - - sha1s.sort() - verifier = hashlib.sha1() - for v in sha1s: - verifier.update(v.encode("utf-8")) - spdx_pkg.packageVerificationCode.packageVerificationCodeValue = verifier.hexdigest() - - return spdx_files - - -def add_package_sources_from_debug(d, package_doc, spdx_package, package, package_files, sources): - from pathlib import Path - import hashlib - import oe.packagedata - import oe.spdx - - debug_search_paths = [ - Path(d.getVar('PKGD')), - Path(d.getVar('STAGING_DIR_TARGET')), - Path(d.getVar('STAGING_DIR_NATIVE')), - Path(d.getVar('STAGING_KERNEL_DIR')), - ] - - pkg_data = oe.packagedata.read_subpkgdata_extended(package, d) - - if pkg_data is None: - return - - for file_path, file_data in pkg_data["files_info"].items(): - if not "debugsrc" in file_data: - continue - - for pkg_file in package_files: - if file_path.lstrip("/") == pkg_file.fileName.lstrip("/"): - break - else: - bb.fatal("No package file found for %s" % str(file_path)) - continue - - for debugsrc in file_data["debugsrc"]: - ref_id = "NOASSERTION" - for search in debug_search_paths: - if debugsrc.startswith("/usr/src/kernel"): - debugsrc_path = search / debugsrc.replace('/usr/src/kernel/', '') - else: - debugsrc_path = search / debugsrc.lstrip("/") - if not debugsrc_path.exists(): - continue - - file_sha256 = bb.utils.sha256_file(debugsrc_path) - - if file_sha256 in sources: - source_file = sources[file_sha256] - - doc_ref = package_doc.find_external_document_ref(source_file.doc.documentNamespace) - if doc_ref is None: - doc_ref = oe.spdx.SPDXExternalDocumentRef() - doc_ref.externalDocumentId = "DocumentRef-dependency-" + source_file.doc.name - doc_ref.spdxDocument = source_file.doc.documentNamespace - doc_ref.checksum.algorithm = "SHA1" - doc_ref.checksum.checksumValue = source_file.doc_sha1 - package_doc.externalDocumentRefs.append(doc_ref) - - ref_id = "%s:%s" % (doc_ref.externalDocumentId, source_file.file.SPDXID) - else: - bb.debug(1, "Debug source %s with SHA256 %s not found in any dependency" % (str(debugsrc_path), file_sha256)) - break - else: - bb.debug(1, "Debug source %s not found" % debugsrc) - - package_doc.add_relationship(pkg_file, "GENERATED_FROM", ref_id, comment=debugsrc) - -def collect_dep_recipes(d, doc, spdx_recipe): - from pathlib import Path - import oe.sbom - import oe.spdx - - deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX")) - - dep_recipes = [] - taskdepdata = d.getVar("BB_TASKDEPDATA", False) - deps = sorted(set( - dep[0] for dep in taskdepdata.values() if - dep[1] == "do_create_spdx" and dep[0] != d.getVar("PN") - )) - for dep_pn in deps: - dep_recipe_path = deploy_dir_spdx / "recipes" / ("recipe-%s.spdx.json" % dep_pn) - - spdx_dep_doc, spdx_dep_sha1 = oe.sbom.read_doc(dep_recipe_path) - - for pkg in spdx_dep_doc.packages: - if pkg.name == dep_pn: - spdx_dep_recipe = pkg - break - else: - continue - - dep_recipes.append(oe.sbom.DepRecipe(spdx_dep_doc, spdx_dep_sha1, spdx_dep_recipe)) - - dep_recipe_ref = oe.spdx.SPDXExternalDocumentRef() - dep_recipe_ref.externalDocumentId = "DocumentRef-dependency-" 
-        dep_recipe_ref.spdxDocument = spdx_dep_doc.documentNamespace
-        dep_recipe_ref.checksum.algorithm = "SHA1"
-        dep_recipe_ref.checksum.checksumValue = spdx_dep_sha1
-
-        doc.externalDocumentRefs.append(dep_recipe_ref)
-
-        doc.add_relationship(
-            "%s:%s" % (dep_recipe_ref.externalDocumentId, spdx_dep_recipe.SPDXID),
-            "BUILD_DEPENDENCY_OF",
-            spdx_recipe
-        )
-
-    return dep_recipes
-
-collect_dep_recipes[vardepsexclude] += "BB_TASKDEPDATA"
-
-
-def collect_dep_sources(d, dep_recipes):
-    import oe.sbom
-
-    sources = {}
-    for dep in dep_recipes:
-        # Don't collect sources from native recipes, as they also match
-        # non-native sources.
-        if recipe_spdx_is_native(d, dep.recipe):
-            continue
-        recipe_files = set(dep.recipe.hasFiles)
-
-        for spdx_file in dep.doc.files:
-            if spdx_file.SPDXID not in recipe_files:
-                continue
-
-            if "SOURCE" in spdx_file.fileTypes:
-                for checksum in spdx_file.checksums:
-                    if checksum.algorithm == "SHA256":
-                        sources[checksum.checksumValue] = oe.sbom.DepSource(dep.doc, dep.doc_sha1, dep.recipe, spdx_file)
-                        break
-
-    return sources
-
-
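The helpers above stitch the per-recipe documents together: collect_dep_recipes() records an external document reference (ID, namespace, SHA1) for every build dependency, and collect_dep_sources() indexes dependency source files by SHA-256. As a minimal sketch of the result, one of the emitted *.spdx.json files can be inspected with nothing but the standard library; the deploy path below is hypothetical, but the field names are exactly the ones written by the code above:

    import json

    # Hypothetical location; real documents land under ${DEPLOY_DIR_SPDX}/recipes/.
    with open("tmp/deploy/spdx/recipes/recipe-zlib.spdx.json") as f:
        doc = json.load(f)

    # Each entry is the (ID, namespace, SHA1) triple set up by collect_dep_recipes().
    for ref in doc.get("externalDocumentRefs", []):
        print(ref["externalDocumentId"], ref["spdxDocument"], ref["checksum"]["checksumValue"])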
-python do_create_spdx() {
-    from datetime import datetime, timezone
-    import oe.sbom
-    import oe.spdx
-    import uuid
-    from pathlib import Path
-    from contextlib import contextmanager
-    import oe.cve_check
-
-    @contextmanager
-    def optional_tarfile(name, guard, mode="w"):
-        import tarfile
-        import bb.compress.zstd
-
-        num_threads = int(d.getVar("BB_NUMBER_THREADS"))
-
-        if guard:
-            name.parent.mkdir(parents=True, exist_ok=True)
-            with bb.compress.zstd.open(name, mode=mode + "b", num_threads=num_threads) as f:
-                with tarfile.open(fileobj=f, mode=mode + "|") as tf:
-                    yield tf
-        else:
-            yield None
-
-
-    deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX"))
-    spdx_workdir = Path(d.getVar("SPDXWORK"))
-    include_sources = d.getVar("SPDX_INCLUDE_SOURCES") == "1"
-    archive_sources = d.getVar("SPDX_ARCHIVE_SOURCES") == "1"
-    archive_packaged = d.getVar("SPDX_ARCHIVE_PACKAGED") == "1"
-
-    creation_time = datetime.now(tz=timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
-
-    doc = oe.spdx.SPDXDocument()
-
-    doc.name = "recipe-" + d.getVar("PN")
-    doc.documentNamespace = get_doc_namespace(d, doc)
-    doc.creationInfo.created = creation_time
-    doc.creationInfo.comment = "This document was created by analyzing recipe files during the build."
-    doc.creationInfo.licenseListVersion = d.getVar("SPDX_LICENSE_DATA")["licenseListVersion"]
-    doc.creationInfo.creators.append("Tool: OpenEmbedded Core create-spdx.bbclass")
-    doc.creationInfo.creators.append("Organization: %s" % d.getVar("SPDX_ORG"))
-    doc.creationInfo.creators.append("Person: N/A ()")
-
-    recipe = oe.spdx.SPDXPackage()
-    recipe.name = d.getVar("PN")
-    recipe.versionInfo = d.getVar("PV")
-    recipe.SPDXID = oe.sbom.get_recipe_spdxid(d)
-    recipe.supplier = d.getVar("SPDX_SUPPLIER")
-    if bb.data.inherits_class("native", d) or bb.data.inherits_class("cross", d):
-        recipe.annotations.append(create_annotation(d, "isNative"))
-
-    for s in d.getVar('SRC_URI').split():
-        if not s.startswith("file://"):
-            s = s.split(';')[0]
-            recipe.downloadLocation = s
-            break
-    else:
-        recipe.downloadLocation = "NOASSERTION"
-
-    homepage = d.getVar("HOMEPAGE")
-    if homepage:
-        recipe.homepage = homepage
-
-    license = d.getVar("LICENSE")
-    if license:
-        recipe.licenseDeclared = convert_license_to_spdx(license, doc, d)
-
-    summary = d.getVar("SUMMARY")
-    if summary:
-        recipe.summary = summary
-
-    description = d.getVar("DESCRIPTION")
-    if description:
-        recipe.description = description
-
-    # Some CVEs may be patched during the build process without incrementing the version
-    # number, so querying for CVEs based on the CPE id can lead to false positives. To
-    # account for this, save the CVEs fixed by patches to the source information field
-    # of the SPDX.
-    patched_cves = oe.cve_check.get_patched_cves(d)
-    patched_cves = list(patched_cves)
-    patched_cves = ' '.join(patched_cves)
-    if patched_cves:
-        recipe.sourceInfo = "CVEs fixed: " + patched_cves
-
-    cpe_ids = oe.cve_check.get_cpe_ids(d.getVar("CVE_PRODUCT"), d.getVar("CVE_VERSION"))
-    if cpe_ids:
-        for cpe_id in cpe_ids:
-            cpe = oe.spdx.SPDXExternalReference()
-            cpe.referenceCategory = "SECURITY"
-            cpe.referenceType = "http://spdx.org/rdf/references/cpe23Type"
-            cpe.referenceLocator = cpe_id
-            recipe.externalRefs.append(cpe)
-
-    doc.packages.append(recipe)
-    doc.add_relationship(doc, "DESCRIBES", recipe)
-
-    if process_sources(d) and include_sources:
-        recipe_archive = deploy_dir_spdx / "recipes" / (doc.name + ".tar.zst")
-        with optional_tarfile(recipe_archive, archive_sources) as archive:
-            spdx_get_src(d)
-
-            add_package_files(
-                d,
-                doc,
-                recipe,
-                spdx_workdir,
-                lambda file_counter: "SPDXRef-SourceFile-%s-%d" % (d.getVar("PN"), file_counter),
-                lambda filepath: ["SOURCE"],
-                ignore_dirs=[".git"],
-                ignore_top_level_dirs=["temp"],
-                archive=archive,
-            )
-
-            if archive is not None:
-                recipe.packageFileName = str(recipe_archive.name)
-
-    dep_recipes = collect_dep_recipes(d, doc, recipe)
-
-    doc_sha1 = oe.sbom.write_doc(d, doc, "recipes", indent=get_json_indent(d))
-    dep_recipes.append(oe.sbom.DepRecipe(doc, doc_sha1, recipe))
-
-    recipe_ref = oe.spdx.SPDXExternalDocumentRef()
-    recipe_ref.externalDocumentId = "DocumentRef-recipe-" + recipe.name
-    recipe_ref.spdxDocument = doc.documentNamespace
-    recipe_ref.checksum.algorithm = "SHA1"
-    recipe_ref.checksum.checksumValue = doc_sha1
-
-    sources = collect_dep_sources(d, dep_recipes)
-    found_licenses = {license.name: recipe_ref.externalDocumentId + ":" + license.licenseId for license in doc.hasExtractedLicensingInfos}
-
-    if not recipe_spdx_is_native(d, recipe):
-        bb.build.exec_func("read_subpackage_metadata", d)
-
-        pkgdest = Path(d.getVar("PKGDEST"))
-        for package in d.getVar("PACKAGES").split():
-            if not oe.packagedata.packaged(package, d):
-                continue
-
-            package_doc = oe.spdx.SPDXDocument()
-            pkg_name = d.getVar("PKG:%s" % package) or package
-            package_doc.name = pkg_name
-            package_doc.documentNamespace = get_doc_namespace(d, package_doc)
-            package_doc.creationInfo.created = creation_time
-            package_doc.creationInfo.comment = "This document was created by analyzing packages created during the build."
-            package_doc.creationInfo.licenseListVersion = d.getVar("SPDX_LICENSE_DATA")["licenseListVersion"]
-            package_doc.creationInfo.creators.append("Tool: OpenEmbedded Core create-spdx.bbclass")
-            package_doc.creationInfo.creators.append("Organization: %s" % d.getVar("SPDX_ORG"))
-            package_doc.creationInfo.creators.append("Person: N/A ()")
-            package_doc.externalDocumentRefs.append(recipe_ref)
-
-            package_license = d.getVar("LICENSE:%s" % package) or d.getVar("LICENSE")
-
-            spdx_package = oe.spdx.SPDXPackage()
-
-            spdx_package.SPDXID = oe.sbom.get_package_spdxid(pkg_name)
-            spdx_package.name = pkg_name
-            spdx_package.versionInfo = d.getVar("PV")
-            spdx_package.licenseDeclared = convert_license_to_spdx(package_license, package_doc, d, found_licenses)
-            spdx_package.supplier = d.getVar("SPDX_SUPPLIER")
-
-            package_doc.packages.append(spdx_package)
-
-            package_doc.add_relationship(spdx_package, "GENERATED_FROM", "%s:%s" % (recipe_ref.externalDocumentId, recipe.SPDXID))
-            package_doc.add_relationship(package_doc, "DESCRIBES", spdx_package)
-
-            package_archive = deploy_dir_spdx / "packages" / (package_doc.name + ".tar.zst")
-            with optional_tarfile(package_archive, archive_packaged) as archive:
-                package_files = add_package_files(
-                    d,
-                    package_doc,
-                    spdx_package,
-                    pkgdest / package,
-                    lambda file_counter: oe.sbom.get_packaged_file_spdxid(pkg_name, file_counter),
-                    lambda filepath: ["BINARY"],
-                    ignore_top_level_dirs=['CONTROL', 'DEBIAN'],
-                    archive=archive,
-                )
-
-                if archive is not None:
-                    spdx_package.packageFileName = str(package_archive.name)
-
-            add_package_sources_from_debug(d, package_doc, spdx_package, package, package_files, sources)
-
-            oe.sbom.write_doc(d, package_doc, "packages", indent=get_json_indent(d))
-}
-# NOTE: depending on do_unpack is a hack that is necessary to get its dependencies so that the source can be archived
-addtask do_create_spdx after do_package do_packagedata do_unpack before do_populate_sdk do_build do_rm_work
-
-SSTATETASKS += "do_create_spdx"
-do_create_spdx[sstate-inputdirs] = "${SPDXDEPLOY}"
-do_create_spdx[sstate-outputdirs] = "${DEPLOY_DIR_SPDX}"
-
-python do_create_spdx_setscene () {
-    sstate_setscene(d)
-}
-addtask do_create_spdx_setscene
-
-do_create_spdx[dirs] = "${SPDXWORK}"
-do_create_spdx[cleandirs] = "${SPDXDEPLOY} ${SPDXWORK}"
-do_create_spdx[depends] += "${PATCHDEPENDENCY}"
-do_create_spdx[deptask] = "do_create_spdx"
-
-def collect_package_providers(d):
-    from pathlib import Path
-    import oe.sbom
-    import oe.spdx
-    import json
-
-    deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX"))
-
-    providers = {}
-
-    taskdepdata = d.getVar("BB_TASKDEPDATA", False)
-    deps = sorted(set(
-        dep[0] for dep in taskdepdata.values() if dep[0] != d.getVar("PN")
-    ))
-    deps.append(d.getVar("PN"))
-
-    for dep_pn in deps:
-        recipe_data = oe.packagedata.read_pkgdata(dep_pn, d)
-
-        for pkg in recipe_data.get("PACKAGES", "").split():
-
-            pkg_data = oe.packagedata.read_subpkgdata_dict(pkg, d)
-            rprovides = set(n for n, _ in bb.utils.explode_dep_versions2(pkg_data.get("RPROVIDES", "")).items())
-            rprovides.add(pkg)
-
-            for r in rprovides:
-                providers[r] = pkg
-
-    return providers
-
-collect_package_providers[vardepsexclude] += "BB_TASKDEPDATA"
-
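collect_package_providers() leans on bb.utils.explode_dep_versions2() to parse RPROVIDES strings. A small sketch of that behaviour, runnable wherever BitBake's lib/ directory is importable (the package names are made up):

    import bb.utils

    # "libfoo (>= 1.2) virtual-foo" becomes a dict keyed by dependency name,
    # each name mapping to a (possibly empty) list of version constraints.
    rprovides = bb.utils.explode_dep_versions2("libfoo (>= 1.2) virtual-foo")
    assert list(rprovides) == ["libfoo", "virtual-foo"]

    # As in the loop above, every provided name maps back to the one concrete
    # package that provides it.
    providers = {name: "libfoo1" for name in rprovides}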
-python do_create_runtime_spdx() {
-    from datetime import datetime, timezone
-    import oe.sbom
-    import oe.spdx
-    import oe.packagedata
-    from pathlib import Path
-
-    deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX"))
-    spdx_deploy = Path(d.getVar("SPDXRUNTIMEDEPLOY"))
-    is_native = bb.data.inherits_class("native", d) or bb.data.inherits_class("cross", d)
-
-    creation_time = datetime.now(tz=timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
-
-    providers = collect_package_providers(d)
-
-    if not is_native:
-        bb.build.exec_func("read_subpackage_metadata", d)
-
-        dep_package_cache = {}
-
-        pkgdest = Path(d.getVar("PKGDEST"))
-        for package in d.getVar("PACKAGES").split():
-            localdata = bb.data.createCopy(d)
-            pkg_name = d.getVar("PKG:%s" % package) or package
-            localdata.setVar("PKG", pkg_name)
-            localdata.setVar('OVERRIDES', d.getVar("OVERRIDES", False) + ":" + package)
-
-            if not oe.packagedata.packaged(package, localdata):
-                continue
-
-            pkg_spdx_path = deploy_dir_spdx / "packages" / (pkg_name + ".spdx.json")
-
-            package_doc, package_doc_sha1 = oe.sbom.read_doc(pkg_spdx_path)
-
-            for p in package_doc.packages:
-                if p.name == pkg_name:
-                    spdx_package = p
-                    break
-            else:
-                bb.fatal("Package '%s' not found in %s" % (pkg_name, pkg_spdx_path))
-
-            runtime_doc = oe.spdx.SPDXDocument()
-            runtime_doc.name = "runtime-" + pkg_name
-            runtime_doc.documentNamespace = get_doc_namespace(localdata, runtime_doc)
-            runtime_doc.creationInfo.created = creation_time
-            runtime_doc.creationInfo.comment = "This document was created by analyzing package runtime dependencies."
-            runtime_doc.creationInfo.licenseListVersion = d.getVar("SPDX_LICENSE_DATA")["licenseListVersion"]
-            runtime_doc.creationInfo.creators.append("Tool: OpenEmbedded Core create-spdx.bbclass")
-            runtime_doc.creationInfo.creators.append("Organization: %s" % d.getVar("SPDX_ORG"))
-            runtime_doc.creationInfo.creators.append("Person: N/A ()")
-
-            package_ref = oe.spdx.SPDXExternalDocumentRef()
-            package_ref.externalDocumentId = "DocumentRef-package-" + package
-            package_ref.spdxDocument = package_doc.documentNamespace
-            package_ref.checksum.algorithm = "SHA1"
-            package_ref.checksum.checksumValue = package_doc_sha1
-
-            runtime_doc.externalDocumentRefs.append(package_ref)
-
-            runtime_doc.add_relationship(
-                runtime_doc.SPDXID,
-                "AMENDS",
-                "%s:%s" % (package_ref.externalDocumentId, package_doc.SPDXID)
-            )
-
-            deps = bb.utils.explode_dep_versions2(localdata.getVar("RDEPENDS") or "")
-            seen_deps = set()
-            for dep, _ in deps.items():
-                if dep in seen_deps:
-                    continue
-
-                if dep not in providers:
-                    continue
-
-                dep = providers[dep]
-
-                if not oe.packagedata.packaged(dep, localdata):
-                    continue
-
-                dep_pkg_data = oe.packagedata.read_subpkgdata_dict(dep, d)
-                dep_pkg = dep_pkg_data["PKG"]
-
-                if dep in dep_package_cache:
-                    (dep_spdx_package, dep_package_ref) = dep_package_cache[dep]
-                else:
-                    dep_path = deploy_dir_spdx / "packages" / ("%s.spdx.json" % dep_pkg)
-
-                    spdx_dep_doc, spdx_dep_sha1 = oe.sbom.read_doc(dep_path)
-
-                    for pkg in spdx_dep_doc.packages:
-                        if pkg.name == dep_pkg:
-                            dep_spdx_package = pkg
-                            break
-                    else:
-                        bb.fatal("Package '%s' not found in %s" % (dep_pkg, dep_path))
-
-                    dep_package_ref = oe.spdx.SPDXExternalDocumentRef()
-                    dep_package_ref.externalDocumentId = "DocumentRef-runtime-dependency-" + spdx_dep_doc.name
-                    dep_package_ref.spdxDocument = spdx_dep_doc.documentNamespace
-                    dep_package_ref.checksum.algorithm = "SHA1"
-                    dep_package_ref.checksum.checksumValue = spdx_dep_sha1
-
-                    dep_package_cache[dep] = (dep_spdx_package, dep_package_ref)
-
-                    runtime_doc.externalDocumentRefs.append(dep_package_ref)
-
-                runtime_doc.add_relationship(
-                    "%s:%s" % (dep_package_ref.externalDocumentId, dep_spdx_package.SPDXID),
-                    "RUNTIME_DEPENDENCY_OF",
-                    "%s:%s" % (package_ref.externalDocumentId, spdx_package.SPDXID)
-                )
-                seen_deps.add(dep)
-
-            oe.sbom.write_doc(d, runtime_doc, "runtime", spdx_deploy, indent=get_json_indent(d))
-}
-
-addtask do_create_runtime_spdx after do_create_spdx before do_build do_rm_work
-SSTATETASKS += "do_create_runtime_spdx"
-do_create_runtime_spdx[sstate-inputdirs] = "${SPDXRUNTIMEDEPLOY}"
-do_create_runtime_spdx[sstate-outputdirs] = "${DEPLOY_DIR_SPDX}"
-
-python do_create_runtime_spdx_setscene () {
-    sstate_setscene(d)
-}
-addtask do_create_runtime_spdx_setscene
-
-do_create_runtime_spdx[dirs] = "${SPDXRUNTIMEDEPLOY}"
-do_create_runtime_spdx[cleandirs] = "${SPDXRUNTIMEDEPLOY}"
-do_create_runtime_spdx[rdeptask] = "do_create_spdx"
-
-def spdx_get_src(d):
-    """
-    Save the patched source of the recipe in SPDX_WORKDIR.
-    """
-    import shutil
-    spdx_workdir = d.getVar('SPDXWORK')
-    spdx_sysroot_native = d.getVar('STAGING_DIR_NATIVE')
-    pn = d.getVar('PN')
-
-    workdir = d.getVar("WORKDIR")
-
-    try:
-        # The kernel class functions require it to be on work-shared, so we don't change WORKDIR
-        if not is_work_shared_spdx(d):
-            # Change the WORKDIR to make do_unpack and do_patch run in another dir.
-            d.setVar('WORKDIR', spdx_workdir)
-            # Restore the original path to the recipe's native sysroot (it's relative to WORKDIR).
-            d.setVar('STAGING_DIR_NATIVE', spdx_sysroot_native)
-
-            # Changing 'WORKDIR' also changes 'B', so create 'B' here in case
-            # later tasks need it to exist (for example, some recipes' do_patch
-            # requires 'B').
-            bb.utils.mkdirhier(d.getVar('B'))
-
-            bb.build.exec_func('do_unpack', d)
-        # Copy the kernel source to spdx_workdir
-        if is_work_shared_spdx(d):
-            d.setVar('WORKDIR', spdx_workdir)
-            d.setVar('STAGING_DIR_NATIVE', spdx_sysroot_native)
-            src_dir = spdx_workdir + "/" + d.getVar('PN') + "-" + d.getVar('PV') + "-" + d.getVar('PR')
-            bb.utils.mkdirhier(src_dir)
-            if bb.data.inherits_class('kernel', d):
-                share_src = d.getVar('STAGING_KERNEL_DIR')
-            cmd_copy_share = "cp -rf " + share_src + "/* " + src_dir + "/"
-            cmd_copy_kernel_result = os.popen(cmd_copy_share).read()
-            bb.note("cmd_copy_kernel_result = " + cmd_copy_kernel_result)
-
-            git_path = src_dir + "/.git"
-            if os.path.exists(git_path):
-                shutil.rmtree(git_path)
-
-        # Make sure gcc and kernel sources are patched only once
-        if not (d.getVar('SRC_URI') == "" or is_work_shared_spdx(d)):
-            bb.build.exec_func('do_patch', d)
-
-        # Some userland recipes have no source.
-        if not os.path.exists(spdx_workdir):
-            bb.utils.mkdirhier(spdx_workdir)
-    finally:
-        d.setVar("WORKDIR", workdir)
-
-do_rootfs[recrdeptask] += "do_create_spdx do_create_runtime_spdx"
-
-ROOTFS_POSTUNINSTALL_COMMAND =+ "image_combine_spdx ; "
-
-do_populate_sdk[recrdeptask] += "do_create_spdx do_create_runtime_spdx"
-POPULATE_SDK_POST_HOST_COMMAND:append:task-populate-sdk = " sdk_host_combine_spdx; "
-POPULATE_SDK_POST_TARGET_COMMAND:append:task-populate-sdk = " sdk_target_combine_spdx; "
-
-python image_combine_spdx() {
-    import os
-    import oe.sbom
-    from pathlib import Path
-    from oe.rootfs import image_list_installed_packages
-
-    image_name = d.getVar("IMAGE_NAME")
-    image_link_name = d.getVar("IMAGE_LINK_NAME")
-    imgdeploydir = Path(d.getVar("IMGDEPLOYDIR"))
-    img_spdxid = oe.sbom.get_image_spdxid(image_name)
-    packages = image_list_installed_packages(d)
-
-    combine_spdx(d, image_name, imgdeploydir, img_spdxid, packages)
-
-    def make_image_link(target_path, suffix):
-        if image_link_name:
-            link = imgdeploydir / (image_link_name + suffix)
-            if link != target_path:
-                link.symlink_to(os.path.relpath(target_path, link.parent))
-
-    image_spdx_path = imgdeploydir / (image_name + ".spdx.json")
-    make_image_link(image_spdx_path, ".spdx.json")
-    spdx_tar_path = imgdeploydir / (image_name + ".spdx.tar.zst")
-    make_image_link(spdx_tar_path, ".spdx.tar.zst")
-    spdx_index_path = imgdeploydir / (image_name + ".spdx.index.json")
-    make_image_link(spdx_index_path, ".spdx.index.json")
-}
-
-python sdk_host_combine_spdx() {
-    sdk_combine_spdx(d, "host")
-}
-
-python sdk_target_combine_spdx() {
-    sdk_combine_spdx(d, "target")
-}
-
-def sdk_combine_spdx(d, sdk_type):
-    import oe.sbom
-    from pathlib import Path
-    from oe.sdk import sdk_list_installed_packages
-
-    sdk_name = d.getVar("SDK_NAME") + "-" + sdk_type
-    sdk_deploydir = Path(d.getVar("SDKDEPLOYDIR"))
-    sdk_spdxid = oe.sbom.get_sdk_spdxid(sdk_name)
-    sdk_packages = sdk_list_installed_packages(d, sdk_type == "target")
-    combine_spdx(d, sdk_name, sdk_deploydir, sdk_spdxid, sdk_packages)
-
-def combine_spdx(d, rootfs_name, rootfs_deploydir, rootfs_spdxid, packages):
-    import os
-    import oe.spdx
-    import oe.sbom
-    import io
-    import json
-    from datetime import timezone, datetime
-    from pathlib import Path
-    import tarfile
-    import bb.compress.zstd
-
-    creation_time = datetime.now(tz=timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
-    deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX"))
-    source_date_epoch = d.getVar("SOURCE_DATE_EPOCH")
-
-    doc = oe.spdx.SPDXDocument()
-    doc.name = rootfs_name
-    doc.documentNamespace = get_doc_namespace(d, doc)
-    doc.creationInfo.created = creation_time
-    doc.creationInfo.comment = "This document was created by analyzing the source of the Yocto recipe during the build."
-    doc.creationInfo.licenseListVersion = d.getVar("SPDX_LICENSE_DATA")["licenseListVersion"]
-    doc.creationInfo.creators.append("Tool: OpenEmbedded Core create-spdx.bbclass")
-    doc.creationInfo.creators.append("Organization: %s" % d.getVar("SPDX_ORG"))
-    doc.creationInfo.creators.append("Person: N/A ()")
-
-    image = oe.spdx.SPDXPackage()
-    image.name = d.getVar("PN")
-    image.versionInfo = d.getVar("PV")
-    image.SPDXID = rootfs_spdxid
-    image.supplier = d.getVar("SPDX_SUPPLIER")
-
-    doc.packages.append(image)
-
-    for name in sorted(packages.keys()):
-        pkg_spdx_path = deploy_dir_spdx / "packages" / (name + ".spdx.json")
-        pkg_doc, pkg_doc_sha1 = oe.sbom.read_doc(pkg_spdx_path)
-
-        for p in pkg_doc.packages:
-            if p.name == name:
-                pkg_ref = oe.spdx.SPDXExternalDocumentRef()
-                pkg_ref.externalDocumentId = "DocumentRef-%s" % pkg_doc.name
-                pkg_ref.spdxDocument = pkg_doc.documentNamespace
-                pkg_ref.checksum.algorithm = "SHA1"
-                pkg_ref.checksum.checksumValue = pkg_doc_sha1
-
-                doc.externalDocumentRefs.append(pkg_ref)
-                doc.add_relationship(image, "CONTAINS", "%s:%s" % (pkg_ref.externalDocumentId, p.SPDXID))
-                break
-        else:
-            bb.fatal("Unable to find package with name '%s' in SPDX file %s" % (name, pkg_spdx_path))
-
-        runtime_spdx_path = deploy_dir_spdx / "runtime" / ("runtime-" + name + ".spdx.json")
-        runtime_doc, runtime_doc_sha1 = oe.sbom.read_doc(runtime_spdx_path)
-
-        runtime_ref = oe.spdx.SPDXExternalDocumentRef()
-        runtime_ref.externalDocumentId = "DocumentRef-%s" % runtime_doc.name
-        runtime_ref.spdxDocument = runtime_doc.documentNamespace
-        runtime_ref.checksum.algorithm = "SHA1"
-        runtime_ref.checksum.checksumValue = runtime_doc_sha1
-
-        # "OTHER" isn't ideal here, but I can't find a relationship that makes sense
-        doc.externalDocumentRefs.append(runtime_ref)
-        doc.add_relationship(
-            image,
-            "OTHER",
-            "%s:%s" % (runtime_ref.externalDocumentId, runtime_doc.SPDXID),
-            comment="Runtime dependencies for %s" % name
-        )
-
-    image_spdx_path = rootfs_deploydir / (rootfs_name + ".spdx.json")
-
-    with image_spdx_path.open("wb") as f:
-        doc.to_json(f, sort_keys=True, indent=get_json_indent(d))
-
-    num_threads = int(d.getVar("BB_NUMBER_THREADS"))
-
-    visited_docs = set()
-
-    index = {"documents": []}
-
-    spdx_tar_path = rootfs_deploydir / (rootfs_name + ".spdx.tar.zst")
-    with bb.compress.zstd.open(spdx_tar_path, "w", num_threads=num_threads) as f:
-        with tarfile.open(fileobj=f, mode="w|") as tar:
-            def collect_spdx_document(path):
-                nonlocal tar
-                nonlocal deploy_dir_spdx
-                nonlocal source_date_epoch
-                nonlocal index
-
-                if path in visited_docs:
-                    return
-
-                visited_docs.add(path)
-
-                with path.open("rb") as f:
-                    doc, sha1 = oe.sbom.read_doc(f)
-                    f.seek(0)
-
-                    if doc.documentNamespace in visited_docs:
-                        return
-
-                    bb.note("Adding SPDX document %s" % path)
-                    visited_docs.add(doc.documentNamespace)
-                    info = tar.gettarinfo(fileobj=f)
-
-                    info.name = doc.name + ".spdx.json"
-                    info.uid = 0
-                    info.gid = 0
-                    info.uname = "root"
-                    info.gname = "root"
-
-                    if source_date_epoch is not None and info.mtime > int(source_date_epoch):
-                        info.mtime = int(source_date_epoch)
-
-                    tar.addfile(info, f)
-
-                    index["documents"].append({
-                        "filename": info.name,
-                        "documentNamespace": doc.documentNamespace,
-                        "sha1": sha1,
-                    })
-
-                for ref in doc.externalDocumentRefs:
-                    ref_path = deploy_dir_spdx / "by-namespace" / ref.spdxDocument.replace("/", "_")
-                    collect_spdx_document(ref_path)
-
-            collect_spdx_document(image_spdx_path)
-
-            index["documents"].sort(key=lambda x: x["filename"])
-
-            index_str = io.BytesIO(json.dumps(
-                index,
-                sort_keys=True,
-                indent=get_json_indent(d),
-            ).encode("utf-8"))
-
-            info = tarfile.TarInfo()
-            info.name = "index.json"
-            info.size = len(index_str.getvalue())
-            info.uid = 0
-            info.gid = 0
-            info.uname = "root"
-            info.gname = "root"
-
-            tar.addfile(info, fileobj=index_str)
-
-    spdx_index_path = rootfs_deploydir / (rootfs_name + ".spdx.index.json")
-    with spdx_index_path.open("w") as f:
-        json.dump(index, f, sort_keys=True, indent=get_json_indent(d))
+# Include this class when you don't care what version of SPDX you get; it will
+# be updated to the latest stable version that is supported
+inherit create-spdx-2.2
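With its body removed, create-spdx is now only a version selector. Enabling SBOM generation is unchanged from a user's perspective; a minimal local.conf sketch, with the optional switches taken from the deleted code above (values illustrative):

    INHERIT += "create-spdx"

    # Optional knobs read by do_create_spdx:
    SPDX_INCLUDE_SOURCES = "1"     # also describe source files in the documents
    SPDX_ARCHIVE_SOURCES = "1"     # archive patched sources as recipe-<PN>.tar.zst
    SPDX_ARCHIVE_PACKAGED = "1"    # archive packaged files alongside the SBOM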
diff --git a/poky/meta/classes/rm_work.bbclass b/poky/meta/classes/rm_work.bbclass
index c493efff2f..4121a13279 100644
--- a/poky/meta/classes/rm_work.bbclass
+++ b/poky/meta/classes/rm_work.bbclass
@@ -112,6 +112,8 @@ do_rm_work () {
         fi
     done
 }
+do_rm_work[vardepsexclude] += "SSTATETASKS"
+
 do_rm_work_all () {
     :
 }
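The added vardepsexclude keeps the SSTATETASKS expansion out of do_rm_work's task signature, presumably so that appending a new sstate task (such as do_create_spdx above) does not invalidate every existing do_rm_work stamp. Using the class is unchanged; a typical local.conf sketch (the excluded recipe is only an example):

    INHERIT += "rm_work"
    # Keep the work directory of selected recipes around for debugging:
    RM_WORK_EXCLUDE += "busybox"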
diff --git a/poky/meta/classes/testexport.bbclass b/poky/meta/classes/testexport.bbclass
deleted file mode 100644
index f7c5242dc5..0000000000
--- a/poky/meta/classes/testexport.bbclass
+++ /dev/null
@@ -1,180 +0,0 @@
-# Copyright (C) 2016 Intel Corporation
-#
-# SPDX-License-Identifier: MIT
-#
-# testexport.bbclass allows one to execute runtime tests outside the OE environment.
-# Most of the tests are commands run on the target image over ssh.
-# To use it, add testexport to the global inherit and call your target image with -c testexport.
-# You can try it out like this:
-# - First build an image, e.g. core-image-sato
-# - Add INHERIT += "testexport" in local.conf
-# - Then bitbake core-image-sato -c testexport. That will generate the directory structure
-#   to execute the runtime tests using runexported.py.
-#
-# For more information on TEST_SUITES check the testimage class.
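Condensed from the header above, the workflow this class implemented was roughly the following (image and suite names are illustrative):

    # local.conf
    INHERIT += "testexport"
    TEST_SUITES = "ping ssh"

    # then, after building the image:
    #   bitbake core-image-sato -c testexport
    # which populates ${TEST_EXPORT_DIR} with the oeqa runner, the selected
    # test cases, and tarballs for transfer to the test host.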
-
-TEST_LOG_DIR ?= "${WORKDIR}/testexport"
-TEST_EXPORT_DIR ?= "${TMPDIR}/testexport/${PN}"
-TEST_EXPORT_PACKAGED_DIR ?= "packages/packaged"
-TEST_EXPORT_EXTRACTED_DIR ?= "packages/extracted"
-
-TEST_TARGET ?= "simpleremote"
-TEST_TARGET_IP ?= ""
-TEST_SERVER_IP ?= ""
-
-require conf/testexport.conf
-
-TEST_EXPORT_SDK_ENABLED ?= "0"
-
-TEST_EXPORT_DEPENDS = ""
-TEST_EXPORT_DEPENDS += "${@bb.utils.contains('IMAGE_PKGTYPE', 'rpm', 'cpio-native:do_populate_sysroot', '', d)}"
-TEST_EXPORT_DEPENDS += "${@bb.utils.contains('TEST_EXPORT_SDK_ENABLED', '1', 'testexport-tarball:do_populate_sdk', '', d)}"
-TEST_EXPORT_LOCK = "${TMPDIR}/testimage.lock"
-
-addtask testexport
-do_testexport[nostamp] = "1"
-do_testexport[depends] += "${TEST_EXPORT_DEPENDS} ${TESTIMAGEDEPENDS}"
-do_testexport[lockfiles] += "${TEST_EXPORT_LOCK}"
-
-python do_testexport() {
-    testexport_main(d)
-}
-
-def testexport_main(d):
-    import json
-    import logging
-
-    from oeqa.runtime.context import OERuntimeTestContext
-    from oeqa.runtime.context import OERuntimeTestContextExecutor
-
-    image_name = ("%s/%s" % (d.getVar('DEPLOY_DIR_IMAGE'),
-                             d.getVar('IMAGE_LINK_NAME')))
-
-    tdname = "%s.testdata.json" % image_name
-    td = json.load(open(tdname, "r"))
-
-    logger = logging.getLogger("BitBake")
-
-    target = OERuntimeTestContextExecutor.getTarget(
-        d.getVar("TEST_TARGET"), None, d.getVar("TEST_TARGET_IP"),
-        d.getVar("TEST_SERVER_IP"))
-
-    host_dumper = OERuntimeTestContextExecutor.getHostDumper(
-        d.getVar("testimage_dump_host"), d.getVar("TESTIMAGE_DUMP_DIR"))
-
-    image_manifest = "%s.manifest" % image_name
-    image_packages = OERuntimeTestContextExecutor.readPackagesManifest(image_manifest)
-
-    extract_dir = d.getVar("TEST_EXTRACTED_DIR")
-
-    tc = OERuntimeTestContext(td, logger, target, host_dumper,
-                              image_packages, extract_dir)
-
-    copy_needed_files(d, tc)
-
-def copy_needed_files(d, tc):
-    import shutil
-    import oe.path
-
-    from oeqa.utils.package_manager import _get_json_file
-    from oeqa.core.utils.test import getSuiteCasesFiles
-
-    export_path = d.getVar('TEST_EXPORT_DIR')
-    corebase_path = d.getVar('COREBASE')
-
-    # Clean everything before starting
-    oe.path.remove(export_path)
-    bb.utils.mkdirhier(os.path.join(export_path, 'lib', 'oeqa'))
-
-    # The sources of the files to copy are relative to the 'COREBASE' directory,
-    # the destination is relative to 'TEST_EXPORT_DIR'.
-    # Because we are squashing the libraries, we need to remove
-    # the layer/script directory.
-    files_to_copy = [ os.path.join('meta', 'lib', 'oeqa', 'core'),
-                      os.path.join('meta', 'lib', 'oeqa', 'runtime'),
-                      os.path.join('meta', 'lib', 'oeqa', 'files'),
-                      os.path.join('meta', 'lib', 'oeqa', 'utils'),
-                      os.path.join('scripts', 'oe-test'),
-                      os.path.join('scripts', 'lib', 'argparse_oe.py'),
-                      os.path.join('scripts', 'lib', 'scriptutils.py'), ]
-
-    for f in files_to_copy:
-        src = os.path.join(corebase_path, f)
-        dst = os.path.join(export_path, f.split('/', 1)[-1])
-        if os.path.isdir(src):
-            oe.path.copytree(src, dst)
-        else:
-            shutil.copy2(src, dst)
-
-    # Remove all cases, then copy over just the ones specified
-    cases_path = os.path.join(export_path, 'lib', 'oeqa', 'runtime', 'cases')
-    oe.path.remove(cases_path)
-    bb.utils.mkdirhier(cases_path)
-    test_paths = get_runtime_paths(d)
-    test_modules = d.getVar('TEST_SUITES').split()
-    tc.loadTests(test_paths, modules=test_modules)
-    for f in getSuiteCasesFiles(tc.suites):
-        shutil.copy2(f, cases_path)
-        json_file = _get_json_file(f)
-        if json_file:
-            shutil.copy2(json_file, cases_path)
-
-    # Copy test data
-    image_name = ("%s/%s" % (d.getVar('DEPLOY_DIR_IMAGE'),
-                             d.getVar('IMAGE_LINK_NAME')))
-    image_manifest = "%s.manifest" % image_name
-    tdname = "%s.testdata.json" % image_name
-    test_data_path = os.path.join(export_path, 'data')
-    bb.utils.mkdirhier(test_data_path)
-    shutil.copy2(image_manifest, os.path.join(test_data_path, 'manifest'))
-    shutil.copy2(tdname, os.path.join(test_data_path, 'testdata.json'))
-
-    for subdir, dirs, files in os.walk(export_path):
-        for dir in dirs:
-            if dir == '__pycache__':
-                shutil.rmtree(os.path.join(subdir, dir))
-
-    # Create a tar file for the common parts of testexport
-    testexport_create_tarball(d, "testexport.tar.gz", d.getVar("TEST_EXPORT_DIR"))
-
-    # Copy packages needed for runtime testing
-    package_extraction(d, tc.suites)
-    test_pkg_dir = d.getVar("TEST_NEEDED_PACKAGES_DIR")
-    if os.path.isdir(test_pkg_dir) and os.listdir(test_pkg_dir):
-        export_pkg_dir = os.path.join(d.getVar("TEST_EXPORT_DIR"), "packages")
-        oe.path.copytree(test_pkg_dir, export_pkg_dir)
-        # Create a tar file for the packages needed by the DUT
-        testexport_create_tarball(d, "testexport_packages_%s.tar.gz" % d.getVar("MACHINE"), export_pkg_dir)
-
-    # Copy the SDK
-    if d.getVar("TEST_EXPORT_SDK_ENABLED") == "1":
-        sdk_deploy = d.getVar("SDK_DEPLOY")
-        tarball_name = "%s.sh" % d.getVar("TEST_EXPORT_SDK_NAME")
-        tarball_path = os.path.join(sdk_deploy, tarball_name)
-        export_sdk_dir = os.path.join(d.getVar("TEST_EXPORT_DIR"),
-                                      d.getVar("TEST_EXPORT_SDK_DIR"))
-        bb.utils.mkdirhier(export_sdk_dir)
-        shutil.copy2(tarball_path, export_sdk_dir)
-
-        # Create a tar file for the SDK
-        testexport_create_tarball(d, "testexport_sdk_%s.tar.gz" % d.getVar("SDK_ARCH"), export_sdk_dir)
-
-    bb.plain("Exported tests to: %s" % export_path)
-
-def testexport_create_tarball(d, tar_name, src_dir):
-
-    import tarfile
-
-    tar_path = os.path.join(d.getVar("TEST_EXPORT_DIR"), tar_name)
-    current_dir = os.getcwd()
-    src_dir = src_dir.rstrip('/')
-    dir_name = os.path.dirname(src_dir)
-    base_name = os.path.basename(src_dir)
-
-    os.chdir(dir_name)
-    tar = tarfile.open(tar_path, "w:gz")
-    tar.add(base_name)
-    tar.close()
-    os.chdir(current_dir)
-
-IMAGE_CLASSES += "testimage"