author     Andrew Geissler <geissonator@yahoo.com>    2022-02-25 23:34:39 +0300
committer  Andrew Geissler <andrew@geissonator.com>   2022-04-01 17:11:17 +0300
commit     7e0e3c0c6a2cd4e76ebca17ed16a37155992025e (patch)
tree       a95a4a4e69705650aae4f048c1fdf90749f551f5 /poky/meta/classes
parent     0b74d07dc0e30403ff5928c63dabfbbd6eb40c49 (diff)
download   openbmc-7e0e3c0c6a2cd4e76ebca17ed16a37155992025e.tar.xz
subtree updates feb 25 2022
poky: 27ff420543..49168f5d55: Ahsan Hussain (1): staging: use relative path in sysroot_stage_dir() Alejandro Hernandez Samaniego (5): core-image-tiny-initramfs: Mark recipe as 32 bit ARM compatible kernel.bbclass: Allow initramfs to be built from a separate multiconfig busybox: Add shell arithmetic to work with poky-tiny newlib: Upgrade 4.1.0 -> 4.2.0 documentation: Add multiconfig initramfs configuration: Alex Stewart (1): sudo: add /etc/sudoers to sudo-lib conffiles Alexander Kanavin (84): ruby: do not parallel install bind: upgrade 9.16.24 -> 9.16.25 ifupdown: upgrade 0.8.36 -> 0.8.37 ethtool: upgrade 5.15 -> 5.16 webkitgtk: upgrade 2.34.3 -> 2.34.4 debianutils: upgrade 5.5 -> 5.7 diffoscope: upgrade 200 -> 201 libbsd: upgrade 0.11.3 -> 0.11.5 libical: upgrade 3.0.12 -> 3.0.13 zstd: update 1.5.0 -> 1.5.2 rust: update 1.58.0 -> 1.58.1 wpa-supplicant: update 2.9 -> 2.10 ltp: update 20210927 -> 20220121 gnutls: update 3.7.2 -> 3.7.3 libusb1: correct SRC_URI gobject-introspection: replace prelink-rtld with objdump -p util-linux: update 2.37.2 -> 2.37.3 cmake: update 3.22.1 -> 3.22.2 git: merge .inc into .bb git: build manpages from source subject to manpages PACKAGECONFIG git: update 2.34.1 -> 2.35.1 python3-pycryptodome: update 3.12.0 -> 3.14.0 at: update 3.2.2 -> 3.2.4 sudo: update 1.9.8p2 -> 1.9.9 seatd: add recipe weston: upgrade 9.0.0 -> 10.0.0 xf86-input-libinput: update 1.2.0 -> 1.2.1 glib-2.0: upgrade 2.70.2 -> 2.70.3 lua: upgrade 5.4.3 -> 5.4.4 mmc-utils: upgrade to latest revision python3-cython: upgrade 0.29.26 -> 0.29.27 python3-hypothesis: upgrade 6.36.0 -> 6.36.1 python3-pip: upgrade 21.3.1 -> 22.0.2 cups: upgrade 2.4.0 -> 2.4.1 stress-ng: upgrade 0.13.10 -> 0.13.11 mesa: upgrade 21.3.4 -> 21.3.5 piglit: upgrade to latest revision puzzles: upgrade to latest revision diffoscope: upgrade 201 -> 202 libcap: upgrade 2.62 -> 2.63 libusb1: upgrade 1.0.24 -> 1.0.25 re2c: upgrade 2.2 -> 3.0 libgpg-error: update 1.43 -> 1.44 harfbuzz: upgrade 3.2.0 -> 3.3.1 qemu: replace a gtk wrapper with directly setting environment from runqemu runqemu: preload uninative libraries when host gl drivers are in use git: restore reproducibility on centos 7 insane.bbclass: use multiprocessing for collecting 'objdump -p' output llvm: update 12.0.1 -> 13.0.1 python3-numpy: update 1.22.1 -> 1.22.2 sstate: additional debugging when fetch fails occur sstate: fix up additional debugging when fetch fails occur ruby: correctly set native/target dependencies core-image-weston-sdk: synchronize with core-image-sato-sdk gstreamer1.0: disable flaky gstbin:test_watch_for_state_change test weston-init: disable systemd watchdog option webkitgtk: drop patch merged upstream man-db: update 2.10.0 -> 2.10.1 webkitgtk: remove rejected patch vulkan: update 1.2.198 -> 1.3.204 vulkan-samples: update to latest revision xkeyboard-config: update 2.34 -> 2.35.1 libgit2: update 1.3.0 -> 1.4.0 util-linux: upgrade 2.37.3 -> 2.37.4 python3-tomli: upgrade 2.0.0 -> 2.0.1 repo: upgrade 2.20 -> 2.21 help2man: upgrade 1.48.5 -> 1.49.1 meson: upgrade 0.61.1 -> 0.61.2 mmc-utils: upgrade to latest revision python3-dtschema: upgrade 2021.12 -> 2022.1 python3-pytest: upgrade 7.0.0 -> 7.0.1 vala: upgrade 0.54.6 -> 0.54.7 gi-docgen: upgrade 2021.8 -> 2022.1 pango: upgrade 1.48.10 -> 1.50.4 piglit: upgrade to latest revision shaderc: upgrade 2022.0 -> 2022.1 gst-examples: upgrade 1.18.5 -> 1.18.6 libical: upgrade 3.0.13 -> 3.0.14 diffoscope: upgrade 202 -> 204 gdb: update 11.1 -> 11.2 weston-init: replace deprecated/disabled fbdev with drm backend 
devtool: explicitly set main or master branches in upgrades when available base/staging: use HOST_PREFIX, not TARGET_PREFIX insane: use HOST_ variables, not TARGET_ to determine the cross system Alexandru Ardelean (1): libsndfile1: bump to version 1.0.31 Andrej Valek (3): busybox: refresh defconfig oeqa: qemu: create missing directory for _write_dump dhcpcd: add option to set DBDIR location Andres Beltran (1): create-spdx: add support for SDKs Andrey Zhizhikin (1): waffle: add wayland-protocols when building with wayland Bruce Ashfield (18): linux-yocto/5.15: update to v5.15.15 linux-yocto/5.10: update to v5.10.92 x86: fix defconfig configuration warnings linux-yocto/5.15: update to v5.15.16 linux-yocto/5.10: update to v5.10.93 linux-libc-headers: update to v5.16 qemuarm64: Add tiny ktype to qemuarm64 bsp lttng-modules: fix build against v5.17+ linux-yocto-dev: update to v5.17+ linux-yocto/5.15: update to v5.15.19 linux-yocto/5.10: update to v5.10.96 lttng-modules: update devupstream to latest 2.13 linux-yocto/5.15: update to v5.15.22 linux-yocto/5.10: update to v5.10.99 linux-yocto/5.15: ppc/riscv: fix build with binutils 2.3.8 linux-yocto/5.10: ppc/riscv: fix build with binutils 2.3.8 linux-yocto/5.10: fix dssall build error with binutils 2.3.8 linux-yocto/5.15: fix dssall build error with binutils 2.3.8 Carlos Rafael Giani (1): libxml2: Backport python3-lxml workaround patch Changhyeok Bae (1): connman: update 1.40 -> 1.41 Changqing Li (2): mdadm: fix testcase 00multipath failure nghttp2: fix for multilib support Chen Qi (1): mdadm: install mdcheck Christian Eggers (5): sdk: fix search for dynamic loader mc: fix build if ncurses have been configured without wide characters curl: configure with '--without-ssl' if ssl is disabled gcsections: add nativesdk-cairo to exclude list dev-manual: update example from kernel.bbclass Daiane Angolini (1): classes/lib/useradd: The option -P is deprecated Daniel Gomez (2): bitbake: contrib: Fix hash server Dockerfile dependencies bitbake: asyncrpc: Fix attribute errors Daniel Müller (1): scripts/runqemu-ifdown: Don't treat the last iptables command as special Denys Dmytriyenko (2): wayland-protocols: upgrade 1.24 -> 1.25 yocto-check-layer: add ability to perform tests from a global bbclass Florian Amstutz (1): devtool: deploy-target: Remove stripped binaries in pseudo context Hongxu Jia (1): glibc: fix create thread failed in unprivileged process Joe Slater (1): virglrenderer: fix CVE-2022-0135 and -0175 Jon Mason (1): linux-yocto-dev: add qemuriscv32 Jose Quaresma (18): icecc.bbclass: replace deprecated bash command substitution spirv-headers: bump to b42ba6 spirv-tools: upgrade 2021.4 -> 2022.1 glslang: upgrade 11.7.1 -> 11.8.0 shaderc: upgrade 2021.3 -> 2021.4 shaderc: upgrade 2021.4 -> 2022.0 gstreamer1.0: upgrade 1.18.5 -> 1.20.0 gstreamer1.0-plugins-base: upgrade 1.18.5 -> 1.20.0 gstreamer1.0-plugins-good: upgrade 1.18.5 -> 1.20.0 gstreamer1.0-plugins-bad: upgrade 1.18.5 -> 1.20.0 gstreamer1.0-plugins-ugly: upgrade 1.18.5 -> 1.20.0 gstreamer1.0-rtsp-server: upgrade 1.18.5 -> 1.20.0 gstreamer1.0-libav: upgrade 1.18.5 -> 1.20.0 gstreamer1.0-vaapi: upgrade 1.18.5 -> 1.20.0 gstreamer1.0-omx: upgrade 1.18.5 -> 1.20.0 gstreamer1.0-python: upgrade 1.18.5 -> 1.20.0 gst-devtools: upgrade 1.18.5 -> 1.20.0 gstreamer1.0: update licenses of all modules Joshua Watt (4): classes/create-spdx: Add packageSupplier field classes/create-spdx: Remove unnecessary package spdx: Add set helper for list properties bitbake: msg: Ensure manually created loggers have the 
once filter Justin Bronder (1): initramfs-framework: unmount automounts before switch_root Kai Kang (2): toolchain-scripts.bbclass: use double quotes for exported PS1 webkitgtk: 2.34.4 -> 2.34.5 Khem Raj (17): ffmpeg: Remove --disable-msa2 mips option systemd: Forward port musl patches ruby: Fix build on riscv/musl musl: Update to latest master libstd-rs: Apply patches to right version of libc image-prelink: Remove bbclass qemuppc64.conf: Remove commented prelink use meta: Remove libsegfault and catchsegv man-db: Fix build with clang diffutils: Link with libbcrypt on mingw binutils: Upgrade to 2.38 release opensbi: Upgrade to 1.0 u-boot: Fix RISCV build with binutils 2.38 libgit2: Upgrade to 1.4.1 grub: Fix build with bintutils 2.38 on riscv boost: Fix build on 32bit arches with 64bit time_t defaults scripts/documentation-audit: Use renamed LICENSE_FLAGS_ACCEPTED variable Konrad Weihmann (2): ruby: fix DEPENDS append gmp: fix EXTRA_OECONF for mipsarchr6 Kory Maincent (1): icu: fix make_icudata dependencies Lee Chee Yang (1): libarchive : update to 3.5.3 LiweiSong (1): qemu: add tpm string section to qemu acpi table Luna Gräfje (1): tune-cortexa72: Fix a misspelt override in PACKAGE_EXTRA_ARCHS Marek Vasut (1): kernel-fitimage: Add missing dependency for UBOOT_ENV Markus Niebel (1): kmscube: depend on virtual/libgbm Markus Volk (2): libical: build gobject and vala introspection seatd: build systemd backend if DISTRO_FEATURE systemd is set Marta Rybczynska (1): bitbake: lib/bb: fix exit when found renamed variables Martin Beeger (1): cmake: remove bogus CMAKE_LDFLAGS_FLAGS definition from toolchain file Martin Jansa (2): systemd: fix DeprecationWarning about regexps icecc.bbclass: fix syntax error Matthias Klein (1): boost: add json lib Michael Halstead (3): uninative: Upgrade to 3.5 releases: update to include 3.1.14 releases: update to include 3.4.2 Michael Opdenacker (8): migration-3.5: mention task specific network access dev-manual: stop mentioning the Angstrom distribution dev-manual: new "working with pre-compiled libraries" section manuals: improve references to classes manuals: propose https for SSTATE_MIRRORS ref-manual: add usage details about ccache.bbclass ref-manual: update TCLIBC description manuals: add 3.4 and 3.4.1 release notes after migration information Oleksandr Kravchuk (1): ell: update to 0.48 Oleksandr Suvorov (1): depmodwrapper-cross: add config directory option Pavel Zhukov (3): systemd: allow to create directory whose path contains symlink systemd: enable KeepConfiguration= when running on network filesystem patch.py: Prevent git repo reinitialization Peter Kjellerstedt (21): sstate: A third fix for for touching files inside pseudo devtool: sdk-update: Remove an unnecessary \n from SSTATE_MIRRORS sstatetests: Correct a typo in a comment glibc-tests: Correct PACKAGE_DEBUG_SPLIT_STYLE test-manual: Correct two references to BB_SKIP_NETTESTS package: Split out package_debug_vars from split_and_strip_files package: Make package_debug_vars() return a dict package: Pass dv (debug_vars) around instead of individual vars bitbake: bitbake-user-manual: Remove unnecessary \n from a PREMIRRORS example bitbake: tests/fetch: Skip the crate tests if network tests are disabled bitbake: tests/fetch: Remove unnecessary \n from mirror variables bitbake: tests/fetch: Improve the verbose messages for skipped tests bitbake: tests/fetch: Unify how git commands are run bitbake: tests/fetch: Only set the Git user name/email if they are not already set bitbake: tests/fetch: Make 
test_npm_premirrors work with the current fetcher bitbake: fetch2: Correct handling of replacing the basename in URIs bitbake: fetch2: npm: Put all downloaded files in the npm2 directory poky.conf: Remove unnecessary \n from PREMIRRORS local.conf.sample: Remove unnecessary \n from the SSTATE_MIRRORS example manuals: Remove unnecessary \n from SSTATE_MIRRORS examples oeqa/selftest/bblogging: Add logging tests for bb.build.exec_func with shell/python code Pgowda (2): glibc : Fix CVE-2021-3998 glibc : Fix CVE-2021-3999 Quentin Schulz (2): docs: point to renamed BB_BASEHASH_IGNORE_VARS variable docs: fix hardcoded link warning messages Richard Purdie (65): lttng-tools: upgrade 2.13.2 -> 2.13.4 gcc: Drop stdlib++ option patch staging: Add extra hash handling code abi_version/sstate: Bump for hash equivalence fix prelink: Drop support for it glibc: Drop prelink patch oqea/runtime/oe_syslog: Improve test vim: Upgrade 4269 -> 4134 binutils: Add fix for CVE-2021-45078 glibc: Upgrade to 2.35 patchelf: Fix corrupted file mode patch buildtools: Allow testsdk to access the network scriptutils: Fix handling of srcuri urls default-distrovars.inc: Switch connectivity check to a yoctoproject.org page bitbake: tests/fetch: Add missing branch param for git urls oeqa/buildtools: Switch to our webserver instead of example.com openssl: Add perl functionality test to do_configure ltp: Disable proc01 test bitbake: fetch2/cooker: Fix source revision handling with floating upstreams bitbake: data_smart: Fix overrides file/line message additions bitbake: cooker: Improve parsing failure from handled exception usability bitbake: msg: Add bb.warnonce() and bb.erroronce() log methods bitbake: data_smart: Add hasOverrides method to public datastore API selftest/bbtests: Add tests for git floating tag resolution oeqa/selftest/bbtests: Update to match changed bitbake output features_check/insane: Use hasOverrides datastore method recipeutils: Add missing get_srcrev() call archiver: Fix typo bitbake: utils: Fix environment decorator logic error bitbake: fetch2: Abstract fetcher environment to a function core-image-testmaster: Rename to core-image-testcontroller scripts: Add a conversion script to use SPDX license names meta/meta-selftest/meta-skeleton: Update LICENSE variable to use SPDX license identifiers oeqa/selftest/bbtests: Update after license changes sstate: Setup fetcher environment in advance expat: Upgrade 2.4.4 -> 2.4.5 expat: Upgrade 2.4.5 -> 2.4.6 perl: Improve and update module RPDEPENDS libxml-parser-perl: Add missing RDEPENDS vim: Upgrade 8.2.4314 -> 8.2.4424 tiff: Add backports for two CVEs from upstream bitbake: utils: Ensure shell function failure in python logging is correct oeqa/selftest/bblogging: Split the test cases up for ease of testing bitbake: data_smart/cookerdata: Add variable remapping support bitbake: data_smart: Allow rename mechanism to show full expressions bitbake: data_smart: Add support to BB_RENAMED_VARIABLES for custom strings bitbake: bitbake: Bump version to 1.53.1 sanity.conf: Require bitbake version 1.53.1 layer.conf: Update to use kirkstone meta/scripts: Change BB_ENV_EXTRA_WHITE -> BB_ENV_PASSTHROUGH_ADDITIONS meta/scripts: Handle bitbake variable renaming bitbake.conf: Add entries for renamed variables meta/scripts: Automated conversion of OE renamed variables icecc: Improve variables/terminology bitbake.conf: Add entries to warn on usage of removed variables layer.conf: Update to kirkstone namespace bitbake: data_smart: Avoid exceptions for non string data bitbake: 
tests/fetch: Update for master -> main change upstream python3targetconfig: Use for nativesdk too licenses: Fix logic error introduced in rename pip_install_wheel: Recompile modified files pip_install_wheel: Use --ignore-installed for pip python3-pip: Don't change shebang python3-pip: Improve reproducibility python3-numpy: Fix pyc determinism issue Robert Joslyn (1): dev-manual/common-tasks: Fix typo Robert Yang (1): bitbake: bitbake: bitbake-worker: Preserve network non-local uid Ross Burton (21): tiff: backport fix for CVE-2022-22844 yocto-check-layer: add debug output for the layers that were found openssl: export OPENSSL_MODULES in the wrapper expat: upgrade to 2.4.4 vim: upgrade to patch 4269 core-image-sato-sdk: allocate more memory when in qemu oeqa/runtime/stap: improve systemtap test systemtap: backport buffer size tuning patches strace: remove obsolete musl-on-MIPS patch strace: skip a number of load-sensitive tests yocto-check-layer: check for duplicate layers when finding layers common-licences: remove ambiguous "BSD" license newlib: fix license checksums oeqa/selftest: test that newlib can build cmake: stop FetchContent from fetching content kernel: make kernel-base recommend kernel-image, not depend poky-tiny: don't skip core-image-base and core-image-full-cmdline poky-tiny: set QB_DEFAULT_FSTYPE correctly coreutils: remove obsolete ignored CVE list cve-check: get_cve_info should open the database read-only oeqa/controllers: update for MasterImageHardwareTarget->Controller... Rudolf J Streif (1): linux-firmware: Add CLM blob to linux-firmware-bcm4373 package Saul Wold (13): recipetool: Fix circular reference in SRC_URI create-spdx: Get SPDX-License-Identifier from source blacklist: Replace class with SKIP_RECIPE variable dnf: Use renamed SKIP_RECIPE varFlag multilib: Use renamed SKIP_RECIPE varFlag imagefeatures: selftest: Change variable to be more descriptive scripts: Add convert-variable-renames script for inclusive language variable renaming meta: Rename LICENSE_FLAGS variable poky-tiny: Use renamed SKIP_RECIPES varFlag Rename LICENSE_FLAGS variable meta: Further LICENSE_FLAGS variable updates package: rename LICENSE_EXCLUSION license.py: rename variables Scott Murray (13): bitbake: bitbake: Rename basehas and taskhash filtering variables bitbake: bitbake: Rename environment filtering variables bitbake: bitbake: Rename configuration hash filtering variable bitbake: bitbake: Rename setscene enforce filtering variable bitbake: bitbake: Rename allowed multiple provider variable bitbake: lib: Replace remaining "blacklist"/"whitelist" usage bitbake: lib/bb: Replace "abort" usage in task handling bitbake: lib/bb: Replace "ABORT" action in BB_DISKMON_DIRS bitbake: bitbake: Replace remaining "abort" usage local.conf/oeqa: Update BB_DISKMON_DIRS use meta-poky: Update BB_DISKMON_DIRS use scripts: fix file writing in convert-variable-renames scripts: fix file writing in convert-spdx-licenses Sean Anderson (1): libpcap: Disable DPDK explicitly Stefan Herbrechtsmeier (12): cve-check: create directory of CVE_CHECK_MANIFEST before copy systemd: Add link-udev-shared PACKAGECONFIG systemd: Minimize udev package size if DISTRO_FEATURES doen't contain sysvinit gcc-target: fix glob to remove gcc-<version> binary gcc-target: move cc1plus to g++ package wic: partition: Support valueless keys in sourceparams wic: rawcopy: Add support for packed images selftest: wic: Remove requirement of syslinux from test_rawcopy_plugin selftest: wic: Add rawcopy plugin unpack test selftest: wic: Disable 
graphic of qemu to support WSL classes: rootfs-postcommands: avoid exception in overlayfs_qa_check files: overlayfs-create-dirs: split ExecStart into two commands Tamizharasan Kumar (2): linux-yocto/5.10: update genericx86* machines to v5.10.99 linux-yocto/5.15: update genericx86* machines to v5.15.22 Tim Orling (43): python3-hypothesis: upgrade 6.35.0 -> 6.36.0 python3-setuptools-scm: upgrade 6.3.2 -> 6.4.2 python3-pyparsing: upgrade 3.0.6 -> 3.0.7 python3-importlib-metadata: upgrade 4.10.0 -> 4.10.1 python3-wheel: move 0.37.1 from meta-python python3-flit-core: add recipe for 3.6.0 python3-flit-core: SUMMARY DESCRIPTION HOMEPAGE python3-flit-core: inherit pip_install_wheel flit_core.bbclass: add helper for newer python packaging python3-wheel: inherit flit_core pip_install_wheel.bbclass: add helper class python3-wheel-native: install ${bindir}/wheel setuptools_build_meta.bbclass: add helper class python3-pip: inherit setuptools_build_meta python3-pip-native: install scripts in ${bindir} python3-attrs: inherit setuptools_build_meta python3-git: inherit setuptools_build_meta python3-pytest: inherit setuptools_build_meta python3-setuptools-scm: inherit setuptools_build_meta python3-zipp: inherit setuptools_build_meta python3-iniconfig: inherit setuptools_build_meta python3-py: inherit setuptools_build_meta python3-pluggy: inherit setuptools_build_meta python3-setuptools: inherit setuptools_base_meta setuptools3.bbclass: refactor for wheels python3-more-itertools: set PIP_INSTALL_PACKAGE meson: inherit setuptools_build_meta python3-libarchive-c: set PIP_INSTALL_PACKAGE python3-smartypants: patch hash bang to python3 python3-scons: merge -native recipe python3-subunit: merge inc; set PIP_INSTALL_PACKAGE python3-magic: set PIP_INSTALL_PACKAGE bmap-tools: set PIP_INSTALL_PACKAGE, BASEVER asciidoc: set PIP_INSTALL_PACKAGE gi-docgen: set PIP_INSTALL_PACKAGE python3-numpy: set PIP_INSTALL_PACKAGE python3-dbusmock: set PIP_INSTALL_PACKAGE python3-mako: inherit setuptools_build_meta python3-packaging: inherit setuptools_build_meta python3-nose: drop recipe disutils*.bbclasses: move to meta-python selftest: drop distutils3 test from recipetool pip_install_wheel: improved wheel filename guess Tom Hochstein (1): xwayland: Add xkbcomp runtime dependency Yi Zhao (2): glibc: unify wordsize.h between arm and aarch64 glibc: fix multilib headers conflict for arm Zoltán Böszörményi (2): qemuboot: Fix build error if UNINATIVE_LOADER is unset gtk-icon-cache: Allow using gtk4 Zygmunt Krynicki (13): bitbake: fetch2/wget: move loop-invariant load of BB_ORIGENV bitbake: cooker: Fix typo "isn't" and "tuples" bitbake: cookerdata: Fix typo "normally" bitbake: daemonize: Fix typo "separate" bitbake: event: Fix typo "asynchronous" and "occasionally" bitbake: fetch2: Fix typo "conform" and "processed" bitbake: fetch2/git: Remove duplicate "the" bitbake: persist_data: Fix typo "committed" bitbake: process: Fix typo: "process" bitbake: progress: Fix typo "wherever" bitbake: tinfoil: Fix typo "receive" and "something" bitbake: utils: Fix typo "dependency" and "spawning" bitbake: wget: Fix grammar "can happen" pgowda (1): gcc : Fix CVE-2021-46195 wangmy (44): libwebp: 1.2.1 -> 1.2.2 python3-libarchive-c: upgrade 3.2 -> 4.0 lighttpd: upgrade 1.4.63 -> 1.4.64 nfs-utils: upgrade 2.5.4 -> 2.6.1 libmodulemd: upgrade 2.13.0 -> 2.14.0 libxcrypt: upgrade 4.4.27 -> 4.4.28 lzip: upgrade 1.22 -> 1.23 libxkbcommon: upgrade 1.3.1 -> 1.4.0 man-db: upgrade 2.9.4 -> 2.10.0 gdbm: upgrade 1.22 -> 1.23 harfbuzz: upgrade 3.3.1 -> 3.3.2 
findutils: upgrade 4.8.0 -> 4.9.0 python3-magic: upgrade 0.4.24 -> 0.4.25 python3-pycryptodome: upgrade 3.14.0 -> 3.14.1 python3-pytest: upgrade 6.2.5 -> 7.0.0 python3-pip: upgrade 22.0.2 -> 22.0.3 python3-pyelftools: upgrade 0.27 -> 0.28 screen: upgrade 4.8.0 -> 4.9.0 ed: upgrade 1.17 -> 1.18 autoconf-archive: upgrade 2021.02.19 -> 2022.02.11 gpgme: upgrade 1.16.0 -> 1.17.0 glib-2.0: upgrade 2.70.3 -> 2.70.4 harfbuzz: upgrade 3.3.2 -> 3.4.0 python3-hypothesis: upgrade 6.36.1 -> 6.36.2 python3-pathlib2: upgrade 2.3.6 -> 2.3.7 python3-pbr: upgrade 5.8.0 -> 5.8.1 python3-ruamel-yaml: upgrade 0.17.20 -> 0.17.21 linux-firmware: upgrade 20211216 -> 20220209 rng-tools: upgrade 6.14 -> 6.15 mesa: upgrade 21.3.5 -> 21.3.6 go: upgrade 1.17.6 -> 1.17.7 libhandy: Use upstream regex to check version of upgrade. libva-utils: upgrade 2.13.0 -> 2.14.0 patchelf: upgrade 0.14.3 -> 0.14.5 quilt: upgrade 0.66 -> 0.67 ruby: upgrade 3.1.0 -> 3.1.1 wireless-regdb: upgrade 2021.08.28 -> 2022.02.18 bind: upgrade 9.16.25 -> 9.16.26 flac: upgrade 1.3.3 -> 1.3.4 init-system-helpers: upgrade 1.60 -> 1.62 libdrm: upgrade 2.4.109 -> 2.4.110 python3-hypothesis: upgrade 6.36.2 -> 6.37.2 python3-markupsafe: upgrade 2.0.1 -> 2.1.0 asciidoc: upgrade 10.1.1 -> 10.1.3 meta-raspberrypi: 836755370f..e39a0a570c: Andrei Gherzan (13): README.md: Add contributing section Move the python3-adafruit recipes depending on meta-oe to dynamic layers README.md: Don't advertise meta-oe dependency docs: Detail the merging process of patches sent through the mailing list ci: Define an action for building a local docker image ci: Define an action for cleaning dangling image ci: Define an action for cleaning/removing an image ci: Introduce workflow for compliance ci: Introduce workflow Yocto operations/builds ci: Add workflow to cancel redundant workflows ci: Add git mirror workflow ci: No need for checkout step in mirror workflow ci: Use the current stable version for the mirror action Aníbal Limón (1): gstreamer1.0-plugins-good: Update bbappend to 1.20 Bernhard Guillon (1): rpi-base.inc: enable i2c-gpio overlay Devendra Tewari (1): linux-raspberrypi: Upgrade to 5.10.83 Khem Raj (7): raspberrypi4-64: Switch to using cortexa72-crc default tune picamera-libs,python3-picamera: Limit visibility to 32 bit rpi machines rpi-gpio: Replace setuptool3 instead of distutils3 python3-adafruit-blinka: Disable on musl linux-raspberrypi: Add recipe for 5.15 LTS kernel rpi-default-versions: Use 5.15 as default kernel layers: Bump to use kirkstone Martin Jansa (4): sdcard_image-rpi: fix DeprecationWarning gstreamer1.0-plugins-bad: remove libmms PACKAGECONFIG and add gpl meta: update variable names meta: update license names Mauro Anjo (1): machine: add Pi Zero 2 W 32bits Michal Toman (1): rpi-base.inc: Add vc4-fkms-v3d-pi4 overlay Mingli Yu (1): xserver-xorg: remove xshmfence configure option Otto Esko (2): recipes-bsp: Add support for gpio-shutdown Add documentation for gpio-shutdown Paul Barker (1): raspberrypi4-64: Switch to cortexa72 tune bhargavthriler (1): python3-picamera: Add picamera library meta-openembedded: 6b63095946..cf0ed42391: Alejandro Hernandez Samaniego (1): remmina: Upgrade to 1.4.23 Alexander Kanavin (4): libvncserver: disable ffmpeg support due to incompatiblility with ffmpeg 5.0 opencv: update 4.5.2 -> 4.5.5 minidlna: update 1.2.1 -> 1.3.0 mpd: update 0.22.9 -> 0.23.5 An?bal Lim?n (2): python3-apt: add new recipe version 2.3.0 unattended-upgrades: add new recipe version 2.6 Andreas Müller (1): All layers: Follow oe-core's variable name 
changes Andrej Valek (1): nodejs: add option to use openssl legacy providers Carlos Rafael Giani (3): pipewire: Upgrade to 0.3.45 wireplumber: Add recipe wireplumber: Improve configuration Changqing Li (1): python3-psutil: fix test failure Christian Eggers (5): ebtables: remove perl from RDEPENDS graphviz: native: create /usr/lib/graphviz/config6 in populate_sysroot boost-sml: add new recipe python3-dt-schema: remove recipe graphviz: added PACKAGECONFIG for librsvg Clément Péron (1): networking: add new netsniff-ng recipe version 0.6.8 Daniel Gomez (1): opencv: Update contrib 4.5.2 -> 4.5.5 Devendra Tewari (1): libcamera: add pkg-config files Fabio Estevam (2): rtc-tools: Add a recipe rtc-tools: Update to 2022.02 Gianfranco Costamagna (4): vboxguestdrivers: upgrade 6.1.30 -> 6.1.32 boinc-client: Update to 7.18.1 mosquitto: upgrade 2.0.12 -> 2.0.14 websocketpp: Apply upstream proposed patch to fix a Scons 4.2.0+ build failure Jan Luebbe (1): snappy: use main branch to fix fetch failure Jan Vermaete (2): netdata: upgrade 1.32.1 -> 1.33.0 netdata: version bump 1.33.0 -> 1.33.1 Justin Bronder (4): yaml-cpp: bump 0.7.0 googlebenchmark: add 1.6.1 python3-pytest-forked: add 1.4.0 python3-pytest-xdist: add 2.5.0 Kai Kang (1): openjpeg: fix CVE-2021-29338 Kartikey Rameshbhai Parmar (1): imagemagick: update SRC_URI branch to main Khem Raj (46): xfce4-datetime-setter: Fix build with meson 0.61+ gerbera: Upgrade to 1.9.2 iotop: Disable LTO with clang/rv64 spdlog: Update the external fmt patch dlt-daemon: Bump to latest revision on master evolution-data-server: Disable g-i data generation gerbera: Fix build with fmt 8.1+ php: Update to 8.1.2 postgresql: Fix build on riscv libcec: Fix type mismatch for return value of LibCecBootloader() gparted: Do not use NULL where boolean is expected python3-pyruvate: Fix build with mips python3-pyruvate: Fix build with riscv64/musl pcp: Disable parallel compile gst-shark: Upgrade to 0.7.3.1 crda: Fix buffer overflow in sscanf open-vm-tools: Fix build with musl openldap: Fix build with musl gimp: Disable vector icons on x86 with clang libjs-jquery-icheck: Use hardcoded SHA for srcrev smarty: Upgrade to 4.1.0 dhcp-relay: Package needed shared libs from bind gimp: Disable vector icons with clang on arm capnproto: Fix build on mips packagegroup-meta-oe: Add googlebenchmark packagegroup-meta-python: Add python3-pytest-forked and python3-pytest-xdist ntopng: Avoid linking libm statically libsigc++-3: Upgrade to 3.2.0 geany-plugins: Fix build with libgit2 1.4+ recipes: Use renamed SKIP_RECIPE varFlag recipes: Use new CVE_CHECK_IGNORE variable meta-oe: Use new variable SYSROOT_DIRS_IGNORE layers: Bump to use kirkstone capnproto: Link with libatomic on rv32 iotop: Disable lto with clang for rv32 glibmm: Add recipe for 2.70.0 cairomm: Add recipe for cairomm 1.16 pangomm: Add recipe for pangomm-2.48 atkmm: Add new recipe for 2.36+ libxml++: Upgrade to 2.42.1 release libxml++-5.0: Add recipe for libxml++ 5.0 Revert "libcamera: add pkg-config files" python3-blinker: Migrate to use pytest instead of nose for testing python3-oauthlib: Drop redundant nose dependency netplan: Add knob to enable tests openldap: Use renamed variable CVE_CHECK_IGNORE Leon Anavi (23): python3-imageio: Upgrade 2.14.0 -> 2.14.1 python3-pandas: Upgrade 1.3.5 -> 1.4.0 python3-aenum: Upgrade 3.1.6 -> 3.1.8 python3-redis: Upgrade 4.0.2 -> 4.1.1 python3-jdatetime: Upgrade 3.8.1 -> 3.8.2 python3-bandit: Upgrade 1.7.1 -> 1.7.2 python3-fasteners: Upgrade 0.17.2 -> 0.17.3 python3-ansi2html: Upgrade 1.6.0 -> 1.7.0 
python3-coverage: Upgrade 6.2 -> 6.3 python3-imageio: Upgrade 2.14.1 -> 2.15.0 python3-humanize: Upgrade 3.13.1 -> 3.14.0 python3-bitarray: Upgrade 2.3.5 -> 2.3.6 python3-itsdangerous: Upgrade 2.0.1 -> 2.1.0 python3-croniter: Upgrade 1.2.0 -> 1.3.4 python3-distro: Upgrade 1.6.0 -> 1.7.0 python3-click: Upgrade 8.0.3 -> 8.0.4 python3-ordered-set: Upgrade 4.0.2 -> 4.1.0 python3-bitarray: Upgrade 2.3.6 -> 2.3.7 python3-pandas: Upgrade 1.4.0 -> 1.4.1 python3-unidiff: Upgrade 0.7.0 -> 0.7.3 python3-langtable: Upgrade to release 0.0.57 python3-cmd2: Upgrade 2.3.3 -> 2.4.0 python3-coverage: Upgrade 6.3 -> 6.3.2 Marek Vasut (1): freerdp: Update to FreeRDP 2.5.0 Mark Jonas (1): mbedtls: Upgrade to 2.28.0 Markus Volk (12): geary: initial add recipe packagegroup-gnome-apps.bb: add geary folks: add PACKAGECONFIG for import_tool and inspect_tool geary: use sha hash for SRCREV; fix identation folks: make some dependencies optional evolution-data-server: try to fix g-i data generation folks: dont build tests as they are not installed anyway gvfs: upgrade 1.48.1 -> 1.49.1 libxfce4util: inherit vala xfconf: inherit vala libxfce4ui: disable vala xfce4-panel: disable vala Martin Jansa (1): Fix DeprecationWarning about regexps Matsunaga-Shinji (1): openldap: add CVE-2015-3276 to allowlist Matthias Klein (1): gpsd-machine-conf: set precise BSD-3-Clause license Mingli Yu (5): plymouth: switch to KillMode=mixed lxdm: remove conflicts setting plymouth: add extra kernel parameter opencv: disable sse4.1 and sse4.2 on x86 plymouth: Add the retain-splash option Oleksandr Kravchuk (4): redis: add recipe for 7.0-rc1 iwd: update to 1.24 fping: update to 5.1 capnproto: update to 0.9.1 Peter Bergin (1): pipewire: fix build error when pipewire-jack is used Peter Griffin (1): libcamera: update meson options to build pipeline handlers & cam utility Randy MacLeod (1): rsyslog: update to 8.2202 Robert Joslyn (1): hwdata: Update to 0.356 Ross Burton (16): python3-jsonpath-rw: set correct license concurrencykit: use precise BSD licence version pkcs11-helper: update homepage pkcs11-helper: set precise BSD license spice: set correct LICENSE poppler-data: set precise BSD license openipmi: use precise BSD license s-nail: add a maintained mail(1) fork minidlna: use precise BSD license smartmontools: use s-nail mailx: remove spice-protocol: upgrade to 1.14.3 libjs-jquery: remove fwupd-efi: upgrade to 1.2 fping: set precise license concurrencykit: enable 32- and 64-bit Arm targets Sakib Sajal (1): nss: uprev v3.73.1 -> v3.74 Sam Van Den Berge (1): libiio: use setuptools functions instead of distutils Thomas Perrot (1): breakpad: fix branch for gtest in SRC_URI Tim Orling (2): python3-wheel: drop; moved to oe-core python3-test-generator: drop recipe Trevor Gamblin (1): phoronix-test-suite: upgrade 9.2.1 -> 10.8.1 Wang Mingyu (39): ndpi: upgrade 4.0 -> 4.2 ntopng: upgrade 5.0 -> 5.2.1 python3-werkzeug: upgrade 2.0.2 -> 2.0.3 python3-twisted: upgrade 21.7.0 -> 22.1.0 python3-natsort: upgrade 8.0.2 -> 8.1.0 xfsdump: upgrade 3.1.9 -> 3.1.10 mm-common: upgrade 1.0.3 -> 1.0.4 fsverity-utils: upgrade 1.4 -> 1.5 libgee: upgrade 0.20.4 -> 0.20.5 libqmi: upgrade 1.30.2 -> 1.30.4 libcrypt-openssl-guess-perl: upgrade 0.14 -> 0.15 gjs: upgrade 1.70.0 -> 1.70.1 dnf-plugin-tui: Fix a bug of multilib libwacom: upgrade 1.12 -> 2.1.0 gedit: upgrade 40.1 -> 41.0 gnome-autoar: upgrade 0.4.2 -> 0.4.3 libwnck3: upgrade 40.0 -> 40.1 iscsi-initiator-utils: upgrade 2.1.5 -> 2.1.6 iotop: upgrade 1.20 -> 1.21 inotify-tools: upgrade 3.21.9.6 -> 3.22.1.0 gnuplot: 
upgrade 5.4.2 -> 5.4.3 libxmlb: upgrade 0.3.6 -> 0.3.7 libgusb: upgrade 0.3.9 -> 0.3.10 monit: upgrade 5.30.0 -> 5.31.0 libjcat: upgrade 0.1.9 -> 0.1.10 libio-socket-ssl-perl: upgrade 2.073 -> 2.074 mpv: upgrade 0.34.0 -> 0.34.1 php: upgrade 8.1.2 -> 8.1.3 nano: upgrade 6.0 -> 6.2 rdma-core: upgrade 38.0 -> 39.0 netplan: upgrade 0.103 -> 0.104 nautilus: upgrade 41.1 -> 41.2 zchunk: upgrade 1.1.16 -> 1.2.0 tree: upgrade 2.0.1 -> 2.0.2 soci: upgrade 4.0.2 -> 4.0.3 remmina: upgrade 1.4.23 -> 1.4.24 wolfssl: upgrade 5.1.0- > 5.2.0 tcpreplay: upgrade 4.4.0 -> 4.4.1 spice-protocol: upgrade 0.14.3 -> 0.14.4 Xu Huan (15): python3-multidict: upgrade 5.2.0 -> 6.0.2 python3-pulsectl upgrade 21.10.5 -> 22.1.3 python3-pyephem: upgrade 4.1.1 -> 4.1.3 python3-pytest-timeout: upgrade 2.0.2 -> 2.1.0 python3-pywbemtools: upgrade 0.9.0 -> 0.9.1 python3-requests-oauthlib: upgrade 1.3.0 -> 1.3.1 python3-sqlalchemy: upgrade 1.4.29 -> 1.4.31 python3-oauthlib: upgrade 3.1.1 -> 3.2.0 python3-pyudev: upgrade 0.22.0 -> 0.23.2 python3-pyopenssl: upgrade 21.0.0 -> 22.0.0 python3-alembic upgrade 1.7.5 -> 1.7.6 python3-autobahn: upgrade 21.11.1 -> 22.1.1 python3-flask: upgrade 2.0.2 -> 2.0.3 python3-imageio: upgrade 2.15.0 -> 2.16.0 python3-jdatetime: upgrade 3.8.2 -> 4.0.0 Yi Zhao (5): phpmyadmin: upgrade 5.1.1 -> 5.1.2 tcpdump: upgrade 4.99.0 -> 4.99.1 tcpslice: upgrade 1.2a3 -> 1.5 tcpreplay: update HOMEPAGE samba: upgrade 4.14.11 -> 4.14.12 Zheng Ruoqin (6): libjs-jquery: Upgrade to 3.3.1. protobuf: upgrade 3.19.3 -> 3.19.4 phpmyadmin: upgrade 5.1.2 -> 5.1.3 postgresql: upgrade 14.1 -> 14.2 pugixml: upgrade 1.11.4 -> 1.12 poppler: upgrade 22.01.0 -> 22.02.0 wangmy (14): fatcat: upgrade 1.1.0 -> 1.1.1 libnma: upgrade 1.8.32 -> 1.8.34 botan: upgrade 2.18.2 -> 2.19.1 cgdb: upgrade 0.7.1 -> 0.8.0 ddrescue: upgrade 1.25 -> 1.26 hostapd: upgrade 2.9 -> 2.10 libcereal: upgrade 1.3.0 -> 1.3.1 ser2net: upgrade 4.3.4 -> 4.3.5 dlt-daemon: upgrade 2.18.7 -> 2.18.8 devilspie2: upgrade 0.43 -> 0.44 opensaf: upgrade 5.21.09 -> 5.22.01 tcpreplay: upgrade 4.3.4 -> 4.4.0 lcms: upgrade 2.12 -> 2.13.1 libcgi-perl: upgrade 4.53 -> 4.54 meta-security: c20b35b527..6cc8dde794: Akshay Bhat (2): meta-hardening: Fix override syntax scap-security-guide: Fix openembedded platform tests Anton Antonov (1): Upgrade parsec-tool to 0.5.1 Armin Kuster (11): google-authenticator-libpam: update to 1.09 packagegroup-security-tpm2.bb: remove dynamic pkgs tpm2-pkcs11_1.7.0: Drop dstat from DPENDS lkrg-module: update to 0.9.2 suricata: update to 6.0.4 tpm2-tss: update to 3.1.0 parsec-service: fix compile issue. layer.conf: Update to use kirkstone recipes: Use renamed SKIP_RECIPE varFlag chipsec: fix WARNING smack: Use new CVE_CHECK_IGNORE variable Patrick Williams (1): tpm2-pkcs11: fix RDEPENDS variable Yi Zhao (1): samhain: upgrade 4.4.3 -> 4.4.6 Signed-off-by: Andrew Geissler <geissonator@yahoo.com> Change-Id: I270425c8a022f2e281a28ea19fdfae47aa375551
Diffstat (limited to 'poky/meta/classes')
-rw-r--r--  poky/meta/classes/archiver.bbclass | 4
-rw-r--r--  poky/meta/classes/base.bbclass | 18
-rw-r--r--  poky/meta/classes/blacklist.bbclass | 20
-rw-r--r--  poky/meta/classes/buildhistory.bbclass | 6
-rw-r--r--  poky/meta/classes/cmake.bbclass | 1
-rw-r--r--  poky/meta/classes/create-spdx.bbclass | 120
-rw-r--r--  poky/meta/classes/cve-check.bbclass | 14
-rw-r--r--  poky/meta/classes/devicetree.bbclass | 2
-rw-r--r--  poky/meta/classes/distutils-common-base.bbclass | 28
-rw-r--r--  poky/meta/classes/distutils3-base.bbclass | 9
-rw-r--r--  poky/meta/classes/distutils3.bbclass | 71
-rw-r--r--  poky/meta/classes/features_check.bbclass | 9
-rw-r--r--  poky/meta/classes/flit_core.bbclass | 16
-rw-r--r--  poky/meta/classes/gobject-introspection.bbclass | 2
-rw-r--r--  poky/meta/classes/gtk-icon-cache.bbclass | 27
-rw-r--r--  poky/meta/classes/icecc.bbclass | 61
-rw-r--r--  poky/meta/classes/image-prelink.bbclass | 76
-rw-r--r--  poky/meta/classes/insane.bbclass | 36
-rw-r--r--  poky/meta/classes/kernel-fitimage.bbclass | 4
-rw-r--r--  poky/meta/classes/kernel.bbclass | 15
-rw-r--r--  poky/meta/classes/license.bbclass | 35
-rw-r--r--  poky/meta/classes/multilib.bbclass | 8
-rw-r--r--  poky/meta/classes/multilib_global.bbclass | 4
-rw-r--r--  poky/meta/classes/package.bbclass | 133
-rw-r--r--  poky/meta/classes/pip_install_wheel.bbclass | 48
-rw-r--r--  poky/meta/classes/populate_sdk_ext.bbclass | 20
-rw-r--r--  poky/meta/classes/python3targetconfig.bbclass | 12
-rw-r--r--  poky/meta/classes/qemuboot.bbclass | 4
-rw-r--r--  poky/meta/classes/rootfs-postcommands.bbclass | 2
-rw-r--r--  poky/meta/classes/sanity.bbclass | 2
-rw-r--r--  poky/meta/classes/setuptools3.bbclass | 20
-rw-r--r--  poky/meta/classes/setuptools_build_meta.bbclass | 18
-rw-r--r--  poky/meta/classes/sstate.bbclass | 71
-rw-r--r--  poky/meta/classes/staging.bbclass | 48
-rw-r--r--  poky/meta/classes/toolchain-scripts.bbclass | 2
-rw-r--r--  poky/meta/classes/uninative.bbclass | 4
-rw-r--r--  poky/meta/classes/useradd-staticids.bbclass | 4
-rw-r--r--  poky/meta/classes/waf.bbclass | 2
-rw-r--r--  poky/meta/classes/yocto-check-layer.bbclass | 16
39 files changed, 522 insertions, 470 deletions
diff --git a/poky/meta/classes/archiver.bbclass b/poky/meta/classes/archiver.bbclass
index 549f3311e4..c19c770d11 100644
--- a/poky/meta/classes/archiver.bbclass
+++ b/poky/meta/classes/archiver.bbclass
@@ -5,7 +5,7 @@
# 1) original (or unpacked) source: ARCHIVER_MODE[src] = "original"
# 2) patched source: ARCHIVER_MODE[src] = "patched" (default)
# 3) configured source: ARCHIVER_MODE[src] = "configured"
-# 4) source mirror: ARCHIVE_MODE[src] = "mirror"
+# 4) source mirror: ARCHIVER_MODE[src] = "mirror"
# 5) The patches between do_unpack and do_patch:
# ARCHIVER_MODE[diff] = "1"
# And you can set the one that you'd like to exclude from the diff:
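For context on the corrected comment above: enabling the mirror mode it documents is done from configuration rather than the class itself. A minimal local.conf sketch, with the combined-mirror setting chosen purely as an illustration:

    INHERIT += "archiver"
    ARCHIVER_MODE[src] = "mirror"
    ARCHIVER_MODE[mirror] = "combined"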
@@ -63,7 +63,7 @@ ARCHIVER_WORKDIR = "${WORKDIR}/archiver-work/"
# When producing a combined mirror directory, allow duplicates for the case
# where multiple recipes use the same SRC_URI.
ARCHIVER_COMBINED_MIRRORDIR = "${ARCHIVER_TOPDIR}/mirror"
-SSTATE_DUPWHITELIST += "${DEPLOY_DIR_SRC}/mirror"
+SSTATE_ALLOW_OVERLAP_FILES += "${DEPLOY_DIR_SRC}/mirror"
do_dumpdata[dirs] = "${ARCHIVER_OUTDIR}"
do_ar_recipe[dirs] = "${ARCHIVER_OUTDIR}"
diff --git a/poky/meta/classes/base.bbclass b/poky/meta/classes/base.bbclass
index 9f1bfe8466..27c1d6168d 100644
--- a/poky/meta/classes/base.bbclass
+++ b/poky/meta/classes/base.bbclass
@@ -71,7 +71,7 @@ def get_base_dep(d):
return ""
return "${BASE_DEFAULT_DEPS}"
-BASE_DEFAULT_DEPS = "virtual/${TARGET_PREFIX}gcc virtual/${TARGET_PREFIX}compilerlibs virtual/libc"
+BASE_DEFAULT_DEPS = "virtual/${HOST_PREFIX}gcc virtual/${HOST_PREFIX}compilerlibs virtual/libc"
BASEDEPENDS = ""
BASEDEPENDS:class-target = "${@get_base_dep(d)}"
@@ -329,7 +329,7 @@ python base_eventhandler() {
source_mirror_fetch = d.getVar('SOURCE_MIRROR_FETCH', False)
if not source_mirror_fetch:
provs = (d.getVar("PROVIDES") or "").split()
- multiwhitelist = (d.getVar("MULTI_PROVIDER_WHITELIST") or "").split()
+ multiwhitelist = (d.getVar("BB_MULTI_PROVIDER_ALLOWED") or "").split()
for p in provs:
if p.startswith("virtual/") and p not in multiwhitelist:
profprov = d.getVar("PREFERRED_PROVIDER_" + p)
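The provider check above now reads the renamed BB_MULTI_PROVIDER_ALLOWED variable. A sketch of how a configuration could extend it, using a placeholder virtual provider name:

    # local.conf sketch; "virtual/my-toolchain" is hypothetical
    BB_MULTI_PROVIDER_ALLOWED += "virtual/my-toolchain"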
@@ -438,6 +438,14 @@ python () {
if os.path.normpath(d.getVar("WORKDIR")) != os.path.normpath(d.getVar("B")):
d.appendVar("PSEUDO_IGNORE_PATHS", ",${B}")
+ # To add a recipe to the skip list , set:
+ # SKIP_RECIPE[pn] = "message"
+ pn = d.getVar('PN')
+ skip_msg = d.getVarFlag('SKIP_RECIPE', pn)
+ if skip_msg:
+ bb.debug(1, "Skipping %s %s" % (pn, skip_msg))
+ raise bb.parse.SkipRecipe("Recipe will be skipped because: %s" % (skip_msg))
+
# Handle PACKAGECONFIG
#
# These take the form:
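The SKIP_RECIPE varFlag introduced in this hunk replaces the old blacklist.bbclass/PNBLACKLIST mechanism (the deleted class appears below). A minimal local.conf sketch with a hypothetical recipe name:

    SKIP_RECIPE[my-vendor-app] = "not needed in this distro"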
@@ -534,9 +542,9 @@ python () {
unmatched_license_flags = check_license_flags(d)
if unmatched_license_flags:
if len(unmatched_license_flags) == 1:
- message = "because it has a restricted license '{0}'. Which is not whitelisted in LICENSE_FLAGS_WHITELIST".format(unmatched_license_flags[0])
+ message = "because it has a restricted license '{0}'. Which is not listed in LICENSE_FLAGS_ACCEPTED".format(unmatched_license_flags[0])
else:
- message = "because it has restricted licenses {0}. Which are not whitelisted in LICENSE_FLAGS_WHITELIST".format(
+ message = "because it has restricted licenses {0}. Which are not listed in LICENSE_FLAGS_ACCEPTED".format(
", ".join("'{0}'".format(f) for f in unmatched_license_flags))
bb.debug(1, "Skipping %s %s" % (pn, message))
raise bb.parse.SkipRecipe(message)
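Accepting a restricted license now goes through the renamed variable checked above. A local.conf sketch; "commercial" is the usual example flag, not a recommendation:

    LICENSE_FLAGS_ACCEPTED += "commercial"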
@@ -615,7 +623,7 @@ python () {
if unskipped_pkgs:
for pkg in skipped_pkgs:
bb.debug(1, "Skipping the package %s at do_rootfs because of incompatible license(s): %s" % (pkg, ' '.join(skipped_pkgs[pkg])))
- d.setVar('LICENSE_EXCLUSION-' + pkg, ' '.join(skipped_pkgs[pkg]))
+ d.setVar('_exclude_incompatible-' + pkg, ' '.join(skipped_pkgs[pkg]))
for pkg in unskipped_pkgs:
bb.debug(1, "Including the package %s" % pkg)
else:
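The per-package exclusion recorded above is driven by INCOMPATIBLE_LICENSE; a local.conf sketch with illustrative SPDX-style license names:

    INCOMPATIBLE_LICENSE = "GPL-3.0-only LGPL-3.0-only"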
diff --git a/poky/meta/classes/blacklist.bbclass b/poky/meta/classes/blacklist.bbclass
deleted file mode 100644
index dc794228ff..0000000000
--- a/poky/meta/classes/blacklist.bbclass
+++ /dev/null
@@ -1,20 +0,0 @@
-# anonymous support class from originally from angstrom
-#
-# To use the blacklist, a distribution should include this
-# class in the INHERIT_DISTRO
-#
-# No longer use ANGSTROM_BLACKLIST, instead use a table of
-# recipes in PNBLACKLIST
-#
-# Features:
-#
-# * To add a package to the blacklist, set:
-# PNBLACKLIST[pn] = "message"
-#
-
-python () {
- blacklist = d.getVarFlag('PNBLACKLIST', d.getVar('PN'))
-
- if blacklist:
- raise bb.parse.SkipRecipe("Recipe is blacklisted: %s" % (blacklist))
-}
diff --git a/poky/meta/classes/buildhistory.bbclass b/poky/meta/classes/buildhistory.bbclass
index daa96f3b63..49797a6701 100644
--- a/poky/meta/classes/buildhistory.bbclass
+++ b/poky/meta/classes/buildhistory.bbclass
@@ -31,7 +31,7 @@ BUILDHISTORY_DIR_PACKAGE = "${BUILDHISTORY_DIR}/packages/${MULTIMACH_TARGET_SYS}
# of failed builds.
#
# The expected usage is via auto.conf, but passing via the command line also works
-# with: BB_ENV_EXTRAWHITE=BUILDHISTORY_RESET BUILDHISTORY_RESET=1
+# with: BB_ENV_PASSTHROUGH_ADDITIONS=BUILDHISTORY_RESET BUILDHISTORY_RESET=1
BUILDHISTORY_RESET ?= ""
BUILDHISTORY_OLD_DIR = "${BUILDHISTORY_DIR}/${@ "old" if "${BUILDHISTORY_RESET}" else ""}"
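Since the comment above recommends auto.conf, a minimal sketch of the intended usage (values illustrative):

    # conf/auto.conf
    INHERIT += "buildhistory"
    BUILDHISTORY_COMMIT = "1"
    BUILDHISTORY_RESET = "1"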
@@ -792,8 +792,8 @@ def buildhistory_get_sdkvars(d):
sdkvars = "DISTRO DISTRO_VERSION SDK_NAME SDK_VERSION SDKMACHINE SDKIMAGE_FEATURES TOOLCHAIN_HOST_TASK TOOLCHAIN_TARGET_TASK BAD_RECOMMENDATIONS NO_RECOMMENDATIONS PACKAGE_EXCLUDE"
if d.getVar('BB_CURRENTTASK') == 'populate_sdk_ext':
# Extensible SDK uses some additional variables
- sdkvars += " SDK_LOCAL_CONF_WHITELIST SDK_LOCAL_CONF_BLACKLIST SDK_INHERIT_BLACKLIST SDK_UPDATE_URL SDK_EXT_TYPE SDK_RECRDEP_TASKS SDK_INCLUDE_PKGDATA SDK_INCLUDE_TOOLCHAIN"
- listvars = "SDKIMAGE_FEATURES BAD_RECOMMENDATIONS PACKAGE_EXCLUDE SDK_LOCAL_CONF_WHITELIST SDK_LOCAL_CONF_BLACKLIST SDK_INHERIT_BLACKLIST"
+ sdkvars += " ESDK_LOCALCONF_ALLOW ESDK_LOCALCONF_REMOVE ESDK_CLASS_INHERIT_DISABLE SDK_UPDATE_URL SDK_EXT_TYPE SDK_RECRDEP_TASKS SDK_INCLUDE_PKGDATA SDK_INCLUDE_TOOLCHAIN"
+ listvars = "SDKIMAGE_FEATURES BAD_RECOMMENDATIONS PACKAGE_EXCLUDE ESDK_LOCALCONF_ALLOW ESDK_LOCALCONF_REMOVE ESDK_CLASS_INHERIT_DISABLE"
return outputvars(sdkvars, listvars, d)
diff --git a/poky/meta/classes/cmake.bbclass b/poky/meta/classes/cmake.bbclass
index 92b9197c48..fac7bbca7a 100644
--- a/poky/meta/classes/cmake.bbclass
+++ b/poky/meta/classes/cmake.bbclass
@@ -189,6 +189,7 @@ cmake_do_configure() {
-DCMAKE_TOOLCHAIN_FILE=${WORKDIR}/toolchain.cmake \
-DCMAKE_NO_SYSTEM_FROM_IMPORTED=1 \
-DCMAKE_EXPORT_NO_PACKAGE_REGISTRY=ON \
+ -DFETCHCONTENT_FULLY_DISCONNECTED=ON \
${EXTRA_OECMAKE} \
-Wno-dev
}
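The new -DFETCHCONTENT_FULLY_DISCONNECTED=ON switch stops CMake's FetchContent module from downloading anything at do_configure time, so fetching stays confined to do_fetch. A recipe that genuinely needs such content could unpack it via SRC_URI and point CMake at the local copy; a sketch, where FMT stands in for the uppercased FetchContent dependency name and ${WORKDIR}/fmt-src for wherever SRC_URI unpacked it:

    # recipe sketch: redirect a FetchContent dependency to a pre-fetched source tree
    EXTRA_OECMAKE += "-DFETCHCONTENT_SOURCE_DIR_FMT=${WORKDIR}/fmt-src"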
diff --git a/poky/meta/classes/create-spdx.bbclass b/poky/meta/classes/create-spdx.bbclass
index eb9535069a..5375ef3e34 100644
--- a/poky/meta/classes/create-spdx.bbclass
+++ b/poky/meta/classes/create-spdx.bbclass
@@ -29,9 +29,31 @@ SPDX_NAMESPACE_PREFIX ??= "http://spdx.org/spdxdoc"
SPDX_LICENSES ??= "${COREBASE}/meta/files/spdx-licenses.json"
SPDX_ORG ??= "OpenEmbedded ()"
+SPDX_SUPPLIER ??= "Organization: ${SPDX_ORG}"
+SPDX_SUPPLIER[doc] = "The SPDX PackageSupplier field for SPDX packages created from \
+ this recipe. For SPDX documents create using this class during the build, this \
+ is the contact information for the person or organization who is doing the \
+ build."
do_image_complete[depends] = "virtual/kernel:do_create_spdx"
+def extract_licenses(filename):
+ import re
+
+ lic_regex = re.compile(b'^\W*SPDX-License-Identifier:\s*([ \w\d.()+-]+?)(?:\s+\W*)?$', re.MULTILINE)
+
+ try:
+ with open(filename, 'rb') as f:
+ size = min(15000, os.stat(filename).st_size)
+ txt = f.read(size)
+ licenses = re.findall(lic_regex, txt)
+ if licenses:
+ ascii_licenses = [lic.decode('ascii') for lic in licenses]
+ return ascii_licenses
+ except Exception as e:
+ bb.warn(f"Exception reading {filename}: {e}")
+ return None
+
def get_doc_namespace(d, doc):
import uuid
namespace_uuid = uuid.uuid5(uuid.NAMESPACE_DNS, d.getVar("SPDX_UUID_NAMESPACE"))
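Builds that want the new PackageSupplier field to identify them rather than the default SPDX_ORG can override the variable documented above; a local.conf sketch with placeholder contact details:

    SPDX_SUPPLIER = "Organization: Example Corp (sbom@example.com)"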
@@ -227,6 +249,11 @@ def add_package_files(d, doc, spdx_pkg, topdir, get_spdxid, get_types, *, archiv
checksumValue=bb.utils.sha256_file(filepath),
))
+ if "SOURCE" in spdx_file.fileTypes:
+ extracted_lics = extract_licenses(filepath)
+ if extracted_lics:
+ spdx_file.licenseInfoInFiles = extracted_lics
+
doc.files.append(spdx_file)
doc.add_relationship(spdx_pkg, "CONTAINS", spdx_file)
spdx_pkg.hasFiles.append(spdx_file.SPDXID)
@@ -425,6 +452,7 @@ python do_create_spdx() {
recipe.name = d.getVar("PN")
recipe.versionInfo = d.getVar("PV")
recipe.SPDXID = oe.sbom.get_recipe_spdxid(d)
+ recipe.packageSupplier = d.getVar("SPDX_SUPPLIER")
if bb.data.inherits_class("native", d) or bb.data.inherits_class("cross", d):
recipe.annotations.append(create_annotation(d, "isNative"))
@@ -534,6 +562,7 @@ python do_create_spdx() {
spdx_package.name = pkg_name
spdx_package.versionInfo = d.getVar("PV")
spdx_package.licenseDeclared = convert_license_to_spdx(package_license, package_doc, d, found_licenses)
+ spdx_package.packageSupplier = d.getVar("SPDX_SUPPLIER")
package_doc.packages.append(spdx_package)
@@ -560,7 +589,7 @@ python do_create_spdx() {
oe.sbom.write_doc(d, package_doc, "packages")
}
# NOTE: depending on do_unpack is a hack that is necessary to get it's dependencies for archive the source
-addtask do_create_spdx after do_package do_packagedata do_unpack before do_build do_rm_work
+addtask do_create_spdx after do_package do_packagedata do_unpack before do_populate_sdk do_build do_rm_work
SSTATETASKS += "do_create_spdx"
do_create_spdx[sstate-inputdirs] = "${SPDXDEPLOY}"
@@ -792,28 +821,77 @@ def spdx_get_src(d):
do_rootfs[recrdeptask] += "do_create_spdx do_create_runtime_spdx"
ROOTFS_POSTUNINSTALL_COMMAND =+ "image_combine_spdx ; "
+
+do_populate_sdk[recrdeptask] += "do_create_spdx do_create_runtime_spdx"
+POPULATE_SDK_POST_HOST_COMMAND:append:task-populate-sdk = " sdk_host_combine_spdx; "
+POPULATE_SDK_POST_TARGET_COMMAND:append:task-populate-sdk = " sdk_target_combine_spdx; "
+
python image_combine_spdx() {
import os
+ import oe.sbom
+ from pathlib import Path
+ from oe.rootfs import image_list_installed_packages
+
+ image_name = d.getVar("IMAGE_NAME")
+ image_link_name = d.getVar("IMAGE_LINK_NAME")
+ imgdeploydir = Path(d.getVar("IMGDEPLOYDIR"))
+ img_spdxid = oe.sbom.get_image_spdxid(image_name)
+ packages = image_list_installed_packages(d)
+
+ combine_spdx(d, image_name, imgdeploydir, img_spdxid, packages)
+
+ if image_link_name:
+ image_spdx_path = imgdeploydir / (image_name + ".spdx.json")
+ image_spdx_link = imgdeploydir / (image_link_name + ".spdx.json")
+ image_spdx_link.symlink_to(os.path.relpath(image_spdx_path, image_spdx_link.parent))
+
+ def make_image_link(target_path, suffix):
+ if image_link_name:
+ link = imgdeploydir / (image_link_name + suffix)
+ link.symlink_to(os.path.relpath(target_path, link.parent))
+
+ spdx_tar_path = imgdeploydir / (image_name + ".spdx.tar.zst")
+ make_image_link(spdx_tar_path, ".spdx.tar.zst")
+ spdx_index_path = imgdeploydir / (image_name + ".spdx.index.json")
+ make_image_link(spdx_index_path, ".spdx.index.json")
+}
+
+python sdk_host_combine_spdx() {
+ sdk_combine_spdx(d, "host")
+}
+
+python sdk_target_combine_spdx() {
+ sdk_combine_spdx(d, "target")
+}
+
+def sdk_combine_spdx(d, sdk_type):
+ import oe.sbom
+ from pathlib import Path
+ from oe.sdk import sdk_list_installed_packages
+
+ sdk_name = d.getVar("SDK_NAME") + "-" + sdk_type
+ sdk_deploydir = Path(d.getVar("SDKDEPLOYDIR"))
+ sdk_spdxid = oe.sbom.get_sdk_spdxid(sdk_name)
+ sdk_packages = sdk_list_installed_packages(d, sdk_type == "target")
+ combine_spdx(d, sdk_name, sdk_deploydir, sdk_spdxid, sdk_packages)
+
+def combine_spdx(d, rootfs_name, rootfs_deploydir, rootfs_spdxid, packages):
+ import os
import oe.spdx
import oe.sbom
import io
import json
- from oe.rootfs import image_list_installed_packages
from datetime import timezone, datetime
from pathlib import Path
import tarfile
import bb.compress.zstd
creation_time = datetime.now(tz=timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
- image_name = d.getVar("IMAGE_NAME")
- image_link_name = d.getVar("IMAGE_LINK_NAME")
-
deploy_dir_spdx = Path(d.getVar("DEPLOY_DIR_SPDX"))
- imgdeploydir = Path(d.getVar("IMGDEPLOYDIR"))
source_date_epoch = d.getVar("SOURCE_DATE_EPOCH")
doc = oe.spdx.SPDXDocument()
- doc.name = image_name
+ doc.name = rootfs_name
doc.documentNamespace = get_doc_namespace(d, doc)
doc.creationInfo.created = creation_time
doc.creationInfo.comment = "This document was created by analyzing the source of the Yocto recipe during the build."
@@ -825,14 +903,11 @@ python image_combine_spdx() {
image = oe.spdx.SPDXPackage()
image.name = d.getVar("PN")
image.versionInfo = d.getVar("PV")
- image.SPDXID = oe.sbom.get_image_spdxid(image_name)
+ image.SPDXID = rootfs_spdxid
+ image.packageSupplier = d.getVar("SPDX_SUPPLIER")
doc.packages.append(image)
- spdx_package = oe.spdx.SPDXPackage()
-
- packages = image_list_installed_packages(d)
-
for name in sorted(packages.keys()):
pkg_spdx_path = deploy_dir_spdx / "packages" / (name + ".spdx.json")
pkg_doc, pkg_doc_sha1 = oe.sbom.read_doc(pkg_spdx_path)
@@ -869,22 +944,18 @@ python image_combine_spdx() {
comment="Runtime dependencies for %s" % name
)
- image_spdx_path = imgdeploydir / (image_name + ".spdx.json")
+ image_spdx_path = rootfs_deploydir / (rootfs_name + ".spdx.json")
with image_spdx_path.open("wb") as f:
doc.to_json(f, sort_keys=True)
- if image_link_name:
- image_spdx_link = imgdeploydir / (image_link_name + ".spdx.json")
- image_spdx_link.symlink_to(os.path.relpath(image_spdx_path, image_spdx_link.parent))
-
num_threads = int(d.getVar("BB_NUMBER_THREADS"))
visited_docs = set()
index = {"documents": []}
- spdx_tar_path = imgdeploydir / (image_name + ".spdx.tar.zst")
+ spdx_tar_path = rootfs_deploydir / (rootfs_name + ".spdx.tar.zst")
with bb.compress.zstd.open(spdx_tar_path, "w", num_threads=num_threads) as f:
with tarfile.open(fileobj=f, mode="w|") as tar:
def collect_spdx_document(path):
@@ -946,17 +1017,6 @@ python image_combine_spdx() {
tar.addfile(info, fileobj=index_str)
- def make_image_link(target_path, suffix):
- if image_link_name:
- link = imgdeploydir / (image_link_name + suffix)
- link.symlink_to(os.path.relpath(target_path, link.parent))
-
- make_image_link(spdx_tar_path, ".spdx.tar.zst")
-
- spdx_index_path = imgdeploydir / (image_name + ".spdx.index.json")
+ spdx_index_path = rootfs_deploydir / (rootfs_name + ".spdx.index.json")
with spdx_index_path.open("w") as f:
json.dump(index, f, sort_keys=True)
-
- make_image_link(spdx_index_path, ".spdx.index.json")
-}
-
diff --git a/poky/meta/classes/cve-check.bbclass b/poky/meta/classes/cve-check.bbclass
index 6c04ff9f09..d715fbf4d8 100644
--- a/poky/meta/classes/cve-check.bbclass
+++ b/poky/meta/classes/cve-check.bbclass
@@ -44,14 +44,14 @@ CVE_CHECK_CREATE_MANIFEST ??= "1"
CVE_CHECK_REPORT_PATCHED ??= "1"
# Whitelist for packages (PN)
-CVE_CHECK_PN_WHITELIST ?= ""
+CVE_CHECK_SKIP_RECIPE ?= ""
# Whitelist for CVE. If a CVE is found, then it is considered patched.
# The value is a string containing space separated CVE values:
#
-# CVE_CHECK_WHITELIST = 'CVE-2014-2524 CVE-2018-1234'
+# CVE_CHECK_IGNORE = 'CVE-2014-2524 CVE-2018-1234'
#
-CVE_CHECK_WHITELIST ?= ""
+CVE_CHECK_IGNORE ?= ""
# Layers to be excluded
CVE_CHECK_LAYER_EXCLUDELIST ??= ""
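With the renames above, skipping a recipe or ignoring an individual CVE looks like the following local.conf sketch (the recipe name is hypothetical; the CVE ID is the one quoted in the comment):

    CVE_CHECK_SKIP_RECIPE += "my-vendor-app"
    CVE_CHECK_IGNORE += "CVE-2014-2524"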
@@ -144,6 +144,7 @@ python cve_check_write_rootfs_manifest () {
manifest_name = d.getVar("CVE_CHECK_MANIFEST")
cve_tmp_file = d.getVar("CVE_CHECK_TMP_FILE")
+ bb.utils.mkdirhier(os.path.dirname(manifest_name))
shutil.copyfile(cve_tmp_file, manifest_name)
if manifest_name and os.path.exists(manifest_name):
@@ -177,11 +178,11 @@ def check_cves(d, patched_cves):
pv = d.getVar("CVE_VERSION").split("+git")[0]
# If the recipe has been whitelisted we return empty lists
- if pn in d.getVar("CVE_CHECK_PN_WHITELIST").split():
+ if pn in d.getVar("CVE_CHECK_SKIP_RECIPE").split():
bb.note("Recipe has been whitelisted, skipping check")
return ([], [], [])
- cve_whitelist = d.getVar("CVE_CHECK_WHITELIST").split()
+ cve_whitelist = d.getVar("CVE_CHECK_IGNORE").split()
import sqlite3
db_file = d.expand("file:${CVE_CHECK_DB_FILE}?mode=ro")
@@ -264,7 +265,8 @@ def get_cve_info(d, cves):
import sqlite3
cve_data = {}
- conn = sqlite3.connect(d.getVar("CVE_CHECK_DB_FILE"))
+ db_file = d.expand("file:${CVE_CHECK_DB_FILE}?mode=ro")
+ conn = sqlite3.connect(db_file, uri=True)
for cve in cves:
for row in conn.execute("SELECT * FROM NVD WHERE ID IS ?", (cve,)):
diff --git a/poky/meta/classes/devicetree.bbclass b/poky/meta/classes/devicetree.bbclass
index 8546c1cf80..7f3b808572 100644
--- a/poky/meta/classes/devicetree.bbclass
+++ b/poky/meta/classes/devicetree.bbclass
@@ -17,7 +17,7 @@ SECTION ?= "bsp"
# The default inclusion of kernel device tree includes and headers means that
# device trees built with them are at least GPLv2 (and in some cases dual
# licensed). Default to GPLv2 if the recipe does not specify a license.
-LICENSE ?= "GPLv2"
+LICENSE ?= "GPL-2.0-only"
LIC_FILES_CHKSUM ?= "file://${COMMON_LICENSE_DIR}/GPL-2.0-only;md5=801f80980d171dd6425610833a22dbe6"
INHIBIT_DEFAULT_DEPS = "1"
diff --git a/poky/meta/classes/distutils-common-base.bbclass b/poky/meta/classes/distutils-common-base.bbclass
deleted file mode 100644
index 59c750a3cf..0000000000
--- a/poky/meta/classes/distutils-common-base.bbclass
+++ /dev/null
@@ -1,28 +0,0 @@
-export STAGING_INCDIR
-export STAGING_LIBDIR
-
-# LDSHARED is the ld *command* used to create shared library
-export LDSHARED = "${CCLD} -shared"
-# LDXXSHARED is the ld *command* used to create shared library of C++
-# objects
-export LDCXXSHARED = "${CXX} -shared"
-# CCSHARED are the C *flags* used to create objects to go into a shared
-# library (module)
-export CCSHARED = "-fPIC -DPIC"
-# LINKFORSHARED are the flags passed to the $(CC) command that links
-# the python executable
-export LINKFORSHARED = "${SECURITY_CFLAGS} -Xlinker -export-dynamic"
-
-FILES:${PN} += "${libdir}/* ${libdir}/${PYTHON_DIR}/*"
-
-FILES:${PN}-staticdev += "\
- ${PYTHON_SITEPACKAGES_DIR}/*.a \
-"
-FILES:${PN}-dev += "\
- ${datadir}/pkgconfig \
- ${libdir}/pkgconfig \
- ${PYTHON_SITEPACKAGES_DIR}/*.la \
-"
-python __anonymous() {
- bb.warn("distutils-common-base.bbclass is deprecated, please use setuptools3-base.bbclass instead")
-}
diff --git a/poky/meta/classes/distutils3-base.bbclass b/poky/meta/classes/distutils3-base.bbclass
deleted file mode 100644
index 850c535bb1..0000000000
--- a/poky/meta/classes/distutils3-base.bbclass
+++ /dev/null
@@ -1,9 +0,0 @@
-DEPENDS:append:class-target = " ${PYTHON_PN}-native ${PYTHON_PN}"
-DEPENDS:append:class-nativesdk = " ${PYTHON_PN}-native ${PYTHON_PN}"
-RDEPENDS:${PN} += "${@['', '${PYTHON_PN}-core']['${CLASSOVERRIDE}' == 'class-target']}"
-
-inherit distutils-common-base python3native python3targetconfig
-
-python __anonymous() {
- bb.warn("distutils3-base.bbclass is deprecated, please use setuptools3-base.bbclass instead")
-}
diff --git a/poky/meta/classes/distutils3.bbclass b/poky/meta/classes/distutils3.bbclass
deleted file mode 100644
index a6d8e8763f..0000000000
--- a/poky/meta/classes/distutils3.bbclass
+++ /dev/null
@@ -1,71 +0,0 @@
-inherit distutils3-base
-
-B = "${WORKDIR}/build"
-distutils_do_configure[cleandirs] = "${B}"
-
-DISTUTILS_BUILD_ARGS ?= ""
-DISTUTILS_INSTALL_ARGS ?= "--root=${D} \
- --prefix=${prefix} \
- --install-lib=${PYTHON_SITEPACKAGES_DIR} \
- --install-data=${datadir}"
-
-DISTUTILS_PYTHON = "python3"
-DISTUTILS_PYTHON:class-native = "nativepython3"
-
-DISTUTILS_SETUP_PATH ?= "${S}"
-
-python __anonymous() {
- bb.warn("distutils3.bbclass is deprecated, please use setuptools3.bbclass instead")
-}
-
-distutils3_do_configure() {
- :
-}
-
-distutils3_do_compile() {
- cd ${DISTUTILS_SETUP_PATH}
- NO_FETCH_BUILD=1 \
- STAGING_INCDIR=${STAGING_INCDIR} \
- STAGING_LIBDIR=${STAGING_LIBDIR} \
- ${STAGING_BINDIR_NATIVE}/${PYTHON_PN}-native/${PYTHON_PN} setup.py \
- build --build-base=${B} ${DISTUTILS_BUILD_ARGS} || \
- bbfatal_log "'${PYTHON_PN} setup.py build ${DISTUTILS_BUILD_ARGS}' execution failed."
-}
-distutils3_do_compile[vardepsexclude] = "MACHINE"
-
-distutils3_do_install() {
- cd ${DISTUTILS_SETUP_PATH}
- install -d ${D}${PYTHON_SITEPACKAGES_DIR}
- STAGING_INCDIR=${STAGING_INCDIR} \
- STAGING_LIBDIR=${STAGING_LIBDIR} \
- PYTHONPATH=${D}${PYTHON_SITEPACKAGES_DIR} \
- ${STAGING_BINDIR_NATIVE}/${PYTHON_PN}-native/${PYTHON_PN} setup.py \
- build --build-base=${B} install --skip-build ${DISTUTILS_INSTALL_ARGS} || \
- bbfatal_log "'${PYTHON_PN} setup.py install ${DISTUTILS_INSTALL_ARGS}' execution failed."
-
- # support filenames with *spaces*
- find ${D} -name "*.py" -exec grep -q ${D} {} \; \
- -exec sed -i -e s:${D}::g {} \;
-
- for i in ${D}${bindir}/* ${D}${sbindir}/*; do
- if [ -f "$i" ]; then
- sed -i -e s:${PYTHON}:${USRBINPATH}/env\ ${DISTUTILS_PYTHON}:g $i
- sed -i -e s:${STAGING_BINDIR_NATIVE}:${bindir}:g $i
- fi
- done
-
- rm -f ${D}${PYTHON_SITEPACKAGES_DIR}/easy-install.pth
-
- #
- # FIXME: Bandaid against wrong datadir computation
- #
- if [ -e ${D}${datadir}/share ]; then
- mv -f ${D}${datadir}/share/* ${D}${datadir}/
- rmdir ${D}${datadir}/share
- fi
-}
-distutils3_do_install[vardepsexclude] = "MACHINE"
-
-EXPORT_FUNCTIONS do_configure do_compile do_install
-
-export LDSHARED="${CCLD} -shared"
diff --git a/poky/meta/classes/features_check.bbclass b/poky/meta/classes/features_check.bbclass
index 205e1b9cd3..3ef6b35baa 100644
--- a/poky/meta/classes/features_check.bbclass
+++ b/poky/meta/classes/features_check.bbclass
@@ -19,12 +19,9 @@ python () {
unused = True
for kind in ['DISTRO', 'MACHINE', 'COMBINED', 'IMAGE']:
- if d.getVar('ANY_OF_' + kind + '_FEATURES') is None and \
- d.overridedata.get('ANY_OF_' + kind + '_FEATURES') is None and \
- d.getVar('REQUIRED_' + kind + '_FEATURES') is None and \
- d.overridedata.get('REQUIRED_' + kind + '_FEATURES') is None and \
- d.getVar('CONFLICT_' + kind + '_FEATURES') is None and \
- d.overridedata.get('CONFLICT_' + kind + '_FEATURES') is None:
+ if d.getVar('ANY_OF_' + kind + '_FEATURES') is None and not d.hasOverrides('ANY_OF_' + kind + '_FEATURES') and \
+ d.getVar('REQUIRED_' + kind + '_FEATURES') is None and not d.hasOverrides('REQUIRED_' + kind + '_FEATURES') and \
+ d.getVar('CONFLICT_' + kind + '_FEATURES') is None and not d.hasOverrides('CONFLICT_' + kind + '_FEATURES'):
continue
unused = False
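
For context, the loop above is looking for recipes that set at least one of the *_FEATURES variables; a minimal sketch of such a consumer (feature names are illustrative):

    inherit features_check
    REQUIRED_DISTRO_FEATURES = "x11 opengl"
    CONFLICT_DISTRO_FEATURES = "wayland"
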
diff --git a/poky/meta/classes/flit_core.bbclass b/poky/meta/classes/flit_core.bbclass
new file mode 100644
index 0000000000..0f2eec85d0
--- /dev/null
+++ b/poky/meta/classes/flit_core.bbclass
@@ -0,0 +1,16 @@
+inherit pip_install_wheel python3native python3-dir
+
+DEPENDS += "python3 python3-flit-core-native python3-pip-native"
+
+do_configure () {
+ mkdir -p ${S}/dist
+ cat > ${S}/build-it.py << EOF
+from flit_core import buildapi
+buildapi.build_wheel('./dist')
+EOF
+}
+
+do_compile () {
+ ${STAGING_BINDIR_NATIVE}/${PYTHON_PN}-native/${PYTHON_PN} ${S}/build-it.py
+}
+
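
A minimal sketch of a recipe that could use the new class, assuming a hypothetical pure-Python module on PyPI whose pyproject.toml declares the flit_core build backend:

    SUMMARY = "Hypothetical flit-built Python module"
    LICENSE = "MIT"
    PYPI_PACKAGE = "example-flit-module"

    inherit pypi flit_core
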
diff --git a/poky/meta/classes/gobject-introspection.bbclass b/poky/meta/classes/gobject-introspection.bbclass
index 4db1b362d9..7bf9feb0d6 100644
--- a/poky/meta/classes/gobject-introspection.bbclass
+++ b/poky/meta/classes/gobject-introspection.bbclass
@@ -29,7 +29,7 @@ EXTRA_OEMESON:prepend:class-nativesdk = "${@['', '${GIRMESONBUILD}'][d.getVar('G
# Generating introspection data depends on a combination of native and target
# introspection tools, and qemu to run the target tools.
-DEPENDS:append:class-target = " gobject-introspection gobject-introspection-native qemu-native prelink-native"
+DEPENDS:append:class-target = " gobject-introspection gobject-introspection-native qemu-native"
# Even though introspection is disabled on -native, gobject-introspection package is still
# needed for m4 macros.
diff --git a/poky/meta/classes/gtk-icon-cache.bbclass b/poky/meta/classes/gtk-icon-cache.bbclass
index 0248ba285e..6808339b90 100644
--- a/poky/meta/classes/gtk-icon-cache.bbclass
+++ b/poky/meta/classes/gtk-icon-cache.bbclass
@@ -1,17 +1,22 @@
FILES:${PN} += "${datadir}/icons/hicolor"
-#gtk+3 reqiure GTK3DISTROFEATURES, DEPENDS on it make all the
+GTKIC_VERSION ??= '3'
+
+GTKPN = "${@ 'gtk4' if d.getVar('GTKIC_VERSION') == '4' else 'gtk+3' }"
+GTKIC_CMD = "${@ 'gtk4-update-icon-cache' if d.getVar('GTKIC_VERSION') == '4' else 'gtk-update-icon-cache' }"
+
+#gtk+3/gtk4 require GTK3DISTROFEATURES; depending on them makes all
#recipes that inherit this class require GTK3DISTROFEATURES
inherit features_check
ANY_OF_DISTRO_FEATURES = "${GTK3DISTROFEATURES}"
-DEPENDS +=" ${@['hicolor-icon-theme', '']['${BPN}' == 'hicolor-icon-theme']} \
- ${@['gdk-pixbuf', '']['${BPN}' == 'gdk-pixbuf']} \
- ${@['gtk+3', '']['${BPN}' == 'gtk+3']} \
- gtk+3-native \
+DEPENDS +=" ${@ '' if d.getVar('BPN') == 'hicolor-icon-theme' else 'hicolor-icon-theme' } \
+ ${@ '' if d.getVar('BPN') == 'gdk-pixbuf' else 'gdk-pixbuf' } \
+ ${@ '' if d.getVar('BPN') == d.getVar('GTKPN') else d.getVar('GTKPN') } \
+ ${GTKPN}-native \
"
-PACKAGE_WRITE_DEPS += "gtk+3-native gdk-pixbuf-native"
+PACKAGE_WRITE_DEPS += "${GTKPN}-native gdk-pixbuf-native"
gtk_icon_cache_postinst() {
if [ "x$D" != "x" ]; then
@@ -25,7 +30,7 @@ else
for icondir in /usr/share/icons/* ; do
if [ -d $icondir ] ; then
- gtk-update-icon-cache -fqt $icondir
+ ${GTKIC_CMD} -fqt $icondir
fi
done
fi
@@ -39,7 +44,7 @@ if [ "x$D" != "x" ]; then
else
for icondir in /usr/share/icons/* ; do
if [ -d $icondir ] ; then
- gtk-update-icon-cache -qt $icondir
+ ${GTKIC_CMD} -qt $icondir
fi
done
fi
@@ -58,13 +63,13 @@ python populate_packages:append () {
rdepends = ' ' + d.getVar('MLPREFIX', False) + "hicolor-icon-theme"
d.appendVar('RDEPENDS:%s' % pkg, rdepends)
- #gtk_icon_cache_postinst depend on gdk-pixbuf and gtk+3
+ #gtk_icon_cache_postinst depend on gdk-pixbuf and gtk+3/gtk4
bb.note("adding gdk-pixbuf dependency to %s" % pkg)
rdepends = ' ' + d.getVar('MLPREFIX', False) + "gdk-pixbuf"
d.appendVar('RDEPENDS:%s' % pkg, rdepends)
- bb.note("adding gtk+3 dependency to %s" % pkg)
- rdepends = ' ' + d.getVar('MLPREFIX', False) + "gtk+3"
+ bb.note("adding %s dependency to %s" % (d.getVar('GTKPN'), pkg))
+ rdepends = ' ' + d.getVar('MLPREFIX', False) + d.getVar('GTKPN')
d.appendVar('RDEPENDS:%s' % pkg, rdepends)
bb.note("adding gtk-icon-cache postinst and postrm scripts to %s" % pkg)
diff --git a/poky/meta/classes/icecc.bbclass b/poky/meta/classes/icecc.bbclass
index 794e9930ad..a550b6af24 100644
--- a/poky/meta/classes/icecc.bbclass
+++ b/poky/meta/classes/icecc.bbclass
@@ -19,22 +19,21 @@
# or the default one provided by icecc-create-env.bb will be used
# (NOTE that this is a modified version of the script and *not the one that comes with icecc*)
#
-# User can specify if specific packages or packages belonging to class should not use icecc to distribute
-# compile jobs to remote machines, but handled locally, by defining ICECC_USER_CLASS_BL and ICECC_USER_PACKAGE_BL
-# with the appropriate values in local.conf. In addition the user can force to enable icecc for packages
-# which set an empty PARALLEL_MAKE variable by defining ICECC_USER_PACKAGE_WL.
+# The user can specify that specific recipes, or recipes belonging to a class, should not use icecc to
+# distribute compile jobs to remote machines but be handled locally, by defining ICECC_CLASS_DISABLE and
+# ICECC_RECIPE_DISABLE with the appropriate values in local.conf. In addition, the user can force icecc
+# to be enabled for recipes which set an empty PARALLEL_MAKE variable by defining ICECC_RECIPE_ENABLE.
#
#########################################################################################
#Error checking is kept to minimum so double check any parameters you pass to the class
###########################################################################################
-BB_HASHBASE_WHITELIST += "ICECC_PARALLEL_MAKE ICECC_DISABLED ICECC_USER_PACKAGE_BL \
- ICECC_USER_CLASS_BL ICECC_USER_PACKAGE_WL ICECC_PATH ICECC_ENV_EXEC \
+BB_BASEHASH_IGNORE_VARS += "ICECC_PARALLEL_MAKE ICECC_DISABLED ICECC_RECIPE_DISABLE \
+ ICECC_CLASS_DISABLE ICECC_RECIPE_ENABLE ICECC_PATH ICECC_ENV_EXEC \
ICECC_CARET_WORKAROUND ICECC_CFLAGS ICECC_ENV_VERSION \
ICECC_DEBUG ICECC_LOGFILE ICECC_REPEAT_RATE ICECC_PREFERRED_HOST \
ICECC_CLANG_REMOTE_CPP ICECC_IGNORE_UNVERIFIED ICECC_TEST_SOCKET \
- ICECC_ENV_DEBUG ICECC_SYSTEM_PACKAGE_BL ICECC_SYSTEM_CLASS_BL \
- ICECC_REMOTE_CPP \
+ ICECC_ENV_DEBUG ICECC_REMOTE_CPP \
"
ICECC_ENV_EXEC ?= "${STAGING_BINDIR_NATIVE}/icecc-create-env"
@@ -66,7 +65,7 @@ CXXFLAGS += "${ICECC_CFLAGS}"
# Debug flags when generating environments
ICECC_ENV_DEBUG ??= ""
-# "system" recipe blacklist contains a list of packages that can not distribute
+# ICECC_RECIPE_DISABLE contains a list of recipes that cannot distribute
# compile tasks for one reason or the other. When adding new entry, please
# document why (how it failed) so that we can re-evaluate it later e.g. when
# there is new version
@@ -79,21 +78,21 @@ ICECC_ENV_DEBUG ??= ""
# inline assembly
# target-sdk-provides-dummy - ${HOST_PREFIX} is empty which triggers the "NULL
# prefix" error.
-ICECC_SYSTEM_PACKAGE_BL += "\
+ICECC_RECIPE_DISABLE += "\
libgcc-initial \
pixman \
systemtap \
target-sdk-provides-dummy \
"
-# "system" classes that should be blacklisted. When adding new entry, please
+# Classes that should not use icecc. When adding a new entry, please
# document why (how it failed) so that we can re-evaluate it later
#
# image - Images aren't compiled, but the testing framework for images captures
# PARALLEL_MAKE as part of the test environment. Many tests won't use
# icecream, but leaving the high level of parallelism can cause them to
# consume an unnecessary amount of resources.
-ICECC_SYSTEM_CLASS_BL += "\
+ICECC_CLASS_DISABLE += "\
image \
"
@@ -141,32 +140,28 @@ def use_icecc(bb,d):
pn = d.getVar('PN')
bpn = d.getVar('BPN')
- # Blacklist/whitelist checks are made against BPN, because there is a good
+ # Enable/disable checks are made against BPN, because there is a good
# chance that if icecc should be skipped for a recipe, it should be skipped
# for all the variants of that recipe. PN is still checked in case a user
# specified a more specific recipe.
check_pn = set([pn, bpn])
- system_class_blacklist = (d.getVar('ICECC_SYSTEM_CLASS_BL') or "").split()
- user_class_blacklist = (d.getVar('ICECC_USER_CLASS_BL') or "none").split()
- package_class_blacklist = system_class_blacklist + user_class_blacklist
+ class_disable = (d.getVar('ICECC_CLASS_DISABLE') or "").split()
- for black in package_class_blacklist:
- if bb.data.inherits_class(black, d):
- bb.debug(1, "%s: class %s found in blacklist, disable icecc" % (pn, black))
+ for bbclass in class_disable:
+ if bb.data.inherits_class(bbclass, d):
+ bb.debug(1, "%s: bbclass %s found in disable, disable icecc" % (pn, bbclass))
return "no"
- system_package_blacklist = (d.getVar('ICECC_SYSTEM_PACKAGE_BL') or "").split()
- user_package_blacklist = (d.getVar('ICECC_USER_PACKAGE_BL') or "").split()
- user_package_whitelist = (d.getVar('ICECC_USER_PACKAGE_WL') or "").split()
- package_blacklist = system_package_blacklist + user_package_blacklist
+ disabled_recipes = (d.getVar('ICECC_RECIPE_DISABLE') or "").split()
+ enabled_recipes = (d.getVar('ICECC_RECIPE_ENABLE') or "").split()
- if check_pn & set(package_blacklist):
- bb.debug(1, "%s: found in blacklist, disable icecc" % pn)
+ if check_pn & set(disabled_recipes):
+ bb.debug(1, "%s: found in disable list, disable icecc" % pn)
return "no"
- if check_pn & set(user_package_whitelist):
- bb.debug(1, "%s: found in whitelist, enable icecc" % pn)
+ if check_pn & set(enabled_recipes):
+ bb.debug(1, "%s: found in enabled recipes list, enable icecc" % pn)
return "yes"
if d.getVar('PARALLEL_MAKE') == "":
@@ -309,7 +304,7 @@ wait_for_file() {
local TIMEOUT=$2
until [ -f "$FILE_TO_TEST" ]
do
- TIME_ELAPSED=`expr $TIME_ELAPSED + 1`
+ TIME_ELAPSED=$(expr $TIME_ELAPSED + 1)
if [ $TIME_ELAPSED -gt $TIMEOUT ]
then
return 1
@@ -362,8 +357,8 @@ set_icecc_env() {
return
fi
- ICE_VERSION=`$ICECC_CC -dumpversion`
- ICECC_VERSION=`echo ${ICECC_VERSION} | sed -e "s/@VERSION@/$ICE_VERSION/g"`
+ ICE_VERSION="$($ICECC_CC -dumpversion)"
+ ICECC_VERSION=$(echo ${ICECC_VERSION} | sed -e "s/@VERSION@/$ICE_VERSION/g")
if [ ! -x "${ICECC_ENV_EXEC}" ]
then
bbwarn "Cannot use icecc: invalid ICECC_ENV_EXEC"
@@ -390,18 +385,18 @@ set_icecc_env() {
chmod 775 $ICE_PATH/$compiler
done
- ICECC_AS="`${ICECC_CC} -print-prog-name=as`"
+ ICECC_AS="$(${ICECC_CC} -print-prog-name=as)"
# for target recipes should return something like:
# /OE/tmp-eglibc/sysroots/x86_64-linux/usr/libexec/arm920tt-oe-linux-gnueabi/gcc/arm-oe-linux-gnueabi/4.8.2/as
# and just "as" for native, if it returns "as" in current directory (for whatever reason) use "as" from PATH
- if [ "`dirname "${ICECC_AS}"`" = "." ]
+ if [ "$(dirname "${ICECC_AS}")" = "." ]
then
ICECC_AS="${ICECC_WHICH_AS}"
fi
if [ ! -f "${ICECC_VERSION}.done" ]
then
- mkdir -p "`dirname "${ICECC_VERSION}"`"
+ mkdir -p "$(dirname "${ICECC_VERSION}")"
# the ICECC_VERSION generation step must be locked by a mutex
# in order to prevent race conditions
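
A minimal local.conf sketch of the renamed icecc knobs (recipe and class names are illustrative):

    # Build these recipes locally instead of distributing their compile jobs
    ICECC_RECIPE_DISABLE += "my-fragile-recipe"
    # Force icecc on for a recipe that sets PARALLEL_MAKE = ""
    ICECC_RECIPE_ENABLE += "my-serial-recipe"
    # Never distribute compile jobs for recipes inheriting this class
    ICECC_CLASS_DISABLE += "my-local-only-class"
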
diff --git a/poky/meta/classes/image-prelink.bbclass b/poky/meta/classes/image-prelink.bbclass
deleted file mode 100644
index 8158eeaf4c..0000000000
--- a/poky/meta/classes/image-prelink.bbclass
+++ /dev/null
@@ -1,76 +0,0 @@
-do_rootfs[depends] += "prelink-native:do_populate_sysroot"
-
-IMAGE_PREPROCESS_COMMAND:append:libc-glibc = " prelink_setup; prelink_image; "
-
-python prelink_setup () {
- oe.utils.write_ld_so_conf(d)
-}
-
-inherit linuxloader
-
-prelink_image () {
-# export PSEUDO_DEBUG=4
-# /bin/env | /bin/grep PSEUDO
-# echo "LD_LIBRARY_PATH=$LD_LIBRARY_PATH"
-# echo "LD_PRELOAD=$LD_PRELOAD"
-
- pre_prelink_size=`du -ks ${IMAGE_ROOTFS} | awk '{size = $1 ; print size }'`
- echo "Size before prelinking $pre_prelink_size."
-
- # The filesystem may not contain sysconfdir so establish what is present
- # to enable cleanup after temporary creation of sysconfdir if needed
- presentdir="${IMAGE_ROOTFS}${sysconfdir}"
- while [ "${IMAGE_ROOTFS}" != "${presentdir}" ] ; do
- [ ! -d "${presentdir}" ] || break
- presentdir=`dirname "${presentdir}"`
- done
-
- mkdir -p "${IMAGE_ROOTFS}${sysconfdir}"
-
- # We need a prelink conf on the filesystem, add one if it's missing
- if [ ! -e ${IMAGE_ROOTFS}${sysconfdir}/prelink.conf ]; then
- cp ${STAGING_ETCDIR_NATIVE}/prelink.conf \
- ${IMAGE_ROOTFS}${sysconfdir}/prelink.conf
- dummy_prelink_conf=true;
- else
- dummy_prelink_conf=false;
- fi
-
- # We need a ld.so.conf with pathnames in,prelink conf on the filesystem, add one if it's missing
- ldsoconf=${IMAGE_ROOTFS}${sysconfdir}/ld.so.conf
- if [ -e $ldsoconf ]; then
- cp $ldsoconf $ldsoconf.prelink
- fi
- cat ${STAGING_DIR_TARGET}${sysconfdir}/ld.so.conf >> $ldsoconf
-
- dynamic_loader=${@get_linuxloader(d)}
-
- # prelink!
- if [ "$REPRODUCIBLE_TIMESTAMP_ROOTFS" = "" ]; then
- export PRELINK_TIMESTAMP=`git log -1 --pretty=%ct `
- else
- export PRELINK_TIMESTAMP=$REPRODUCIBLE_TIMESTAMP_ROOTFS
- fi
- ${STAGING_SBINDIR_NATIVE}/prelink --root ${IMAGE_ROOTFS} -am -N -c ${sysconfdir}/prelink.conf --dynamic-linker $dynamic_loader
-
- # Remove the prelink.conf if we had to add it.
- if [ "$dummy_prelink_conf" = "true" ]; then
- rm -f ${IMAGE_ROOTFS}${sysconfdir}/prelink.conf
- fi
-
- if [ -e $ldsoconf.prelink ]; then
- mv $ldsoconf.prelink $ldsoconf
- else
- rm $ldsoconf
- fi
-
- # Remove any directories temporarily created for sysconfdir
- cleanupdir="${IMAGE_ROOTFS}${sysconfdir}"
- while [ "${presentdir}" != "${cleanupdir}" ] ; do
- rmdir "${cleanupdir}"
- cleanupdir=`dirname ${cleanupdir}`
- done
-
- pre_prelink_size=`du -ks ${IMAGE_ROOTFS} | awk '{size = $1 ; print size }'`
- echo "Size after prelinking $pre_prelink_size."
-}
diff --git a/poky/meta/classes/insane.bbclass b/poky/meta/classes/insane.bbclass
index 11532ecd08..890e865a8f 100644
--- a/poky/meta/classes/insane.bbclass
+++ b/poky/meta/classes/insane.bbclass
@@ -48,7 +48,7 @@ enabled tests are listed here, the do_package_qa task will run under fakeroot."
ALL_QA = "${WARN_QA} ${ERROR_QA}"
-UNKNOWN_CONFIGURE_WHITELIST ?= "--enable-nls --disable-nls --disable-silent-rules --disable-dependency-tracking --with-libtool-sysroot --disable-static"
+UNKNOWN_CONFIGURE_OPT_IGNORE ?= "--enable-nls --disable-nls --disable-silent-rules --disable-dependency-tracking --with-libtool-sysroot --disable-static"
# This is a list of directories that are expected to be empty.
QA_EMPTY_DIRS ?= " \
@@ -325,8 +325,8 @@ def package_qa_check_arch(path,name,d, elf, messages):
if not elf:
return
- target_os = d.getVar('TARGET_OS')
- target_arch = d.getVar('TARGET_ARCH')
+ target_os = d.getVar('HOST_OS')
+ target_arch = d.getVar('HOST_ARCH')
provides = d.getVar('PROVIDES')
bpn = d.getVar('BPN')
@@ -684,26 +684,44 @@ def package_qa_recipe(warnfuncs, errorfuncs, pn, d):
return len(errors) == 0
+def prepopulate_objdump_p(elf, d):
+ output = elf.run_objdump("-p", d)
+ return (elf.name, output)
+
# Walk over all files in a directory and call func
def package_qa_walk(warnfuncs, errorfuncs, package, d):
#if this will throw an exception, then fix the dict above
- target_os = d.getVar('TARGET_OS')
- target_arch = d.getVar('TARGET_ARCH')
+ target_os = d.getVar('HOST_OS')
+ target_arch = d.getVar('HOST_ARCH')
warnings = {}
errors = {}
+ elves = {}
for path in pkgfiles[package]:
elf = None
if os.path.isfile(path):
elf = oe.qa.ELFFile(path)
try:
elf.open()
+ elf.close()
except oe.qa.NotELFFileError:
elf = None
+ if elf:
+ elves[path] = elf
+
+ results = oe.utils.multiprocess_launch(prepopulate_objdump_p, elves.values(), d, extraargs=(d,))
+ for item in results:
+ elves[item[0]].set_objdump("-p", item[1])
+
+ for path in pkgfiles[package]:
+ if path in elves:
+ elves[path].open()
for func in warnfuncs:
- func(path, package, d, elf, warnings)
+ func(path, package, d, elves.get(path), warnings)
for func in errorfuncs:
- func(path, package, d, elf, errors)
+ func(path, package, d, elves.get(path), errors)
+ if path in elves:
+ elves[path].close()
for w in warnings:
oe.qa.handle_error(w, warnings[w], d)
@@ -974,7 +992,7 @@ def package_qa_check_unhandled_features_check(pn, d, messages):
var_set = False
for kind in ['DISTRO', 'MACHINE', 'COMBINED']:
for var in ['ANY_OF_' + kind + '_FEATURES', 'REQUIRED_' + kind + '_FEATURES', 'CONFLICT_' + kind + '_FEATURES']:
- if d.getVar(var) is not None or d.overridedata.get(var) is not None:
+ if d.getVar(var) is not None or d.hasOverrides(var):
var_set = True
if var_set:
oe.qa.handle_error("unhandled-features-check", "%s: recipe doesn't inherit features_check" % pn, d)
@@ -1252,7 +1270,7 @@ Rerun configure task after fixing this."""
options = set()
for line in output.splitlines():
options |= set(line.partition(flag)[2].split())
- whitelist = set(d.getVar("UNKNOWN_CONFIGURE_WHITELIST").split())
+ whitelist = set(d.getVar("UNKNOWN_CONFIGURE_OPT_IGNORE").split())
options -= whitelist
if options:
pn = d.getVar('PN')
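
When a configure script genuinely accepts an option that its --help output does not advertise, the renamed ignore list can be extended from the recipe; a sketch with an illustrative option name:

    UNKNOWN_CONFIGURE_OPT_IGNORE:append = " --enable-custom-feature"
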
diff --git a/poky/meta/classes/kernel-fitimage.bbclass b/poky/meta/classes/kernel-fitimage.bbclass
index b0c971b0eb..c16977c477 100644
--- a/poky/meta/classes/kernel-fitimage.bbclass
+++ b/poky/meta/classes/kernel-fitimage.bbclass
@@ -36,6 +36,10 @@ python __anonymous () {
if image:
d.appendVarFlag('do_assemble_fitimage_initramfs', 'depends', ' ${INITRAMFS_IMAGE}:do_image_complete')
+ ubootenv = d.getVar('UBOOT_ENV')
+ if ubootenv:
+ d.appendVarFlag('do_assemble_fitimage', 'depends', ' virtual/bootloader:do_populate_sysroot')
+
#check if there are any dtb providers
providerdtb = d.getVar("PREFERRED_PROVIDER_virtual/dtb")
if providerdtb:
diff --git a/poky/meta/classes/kernel.bbclass b/poky/meta/classes/kernel.bbclass
index 473e28be47..4f304eb9c7 100644
--- a/poky/meta/classes/kernel.bbclass
+++ b/poky/meta/classes/kernel.bbclass
@@ -30,6 +30,8 @@ INITRAMFS_IMAGE ?= ""
INITRAMFS_IMAGE_NAME ?= "${@['${INITRAMFS_IMAGE}-${MACHINE}', ''][d.getVar('INITRAMFS_IMAGE') == '']}"
INITRAMFS_TASK ?= ""
INITRAMFS_IMAGE_BUNDLE ?= ""
+INITRAMFS_DEPLOY_DIR_IMAGE ?= "${DEPLOY_DIR_IMAGE}"
+INITRAMFS_MULTICONFIG ?= ""
# KERNEL_VERSION is extracted from source code. It is evaluated as
# None for the first parsing, since the code has not been fetched.
@@ -133,7 +135,10 @@ set -e
# the do_bundle_initramfs does nothing, but the INITRAMFS_IMAGE is built
# standalone for use by wic and other tools.
if image:
- d.appendVarFlag('do_bundle_initramfs', 'depends', ' ${INITRAMFS_IMAGE}:do_image_complete')
+ if d.getVar('INITRAMFS_MULTICONFIG'):
+ d.appendVarFlag('do_bundle_initramfs', 'mcdepends', ' mc::${INITRAMFS_MULTICONFIG}:${INITRAMFS_IMAGE}:do_image_complete')
+ else:
+ d.appendVarFlag('do_bundle_initramfs', 'depends', ' ${INITRAMFS_IMAGE}:do_image_complete')
if image and bb.utils.to_boolean(d.getVar('INITRAMFS_IMAGE_BUNDLE')):
bb.build.addtask('do_transform_bundled_initramfs', 'do_deploy', 'do_bundle_initramfs', d)
@@ -240,8 +245,8 @@ copy_initramfs() {
# Find and use the first initramfs image archive type we find
rm -f ${B}/usr/${INITRAMFS_IMAGE_NAME}.cpio
for img in cpio cpio.gz cpio.lz4 cpio.lzo cpio.lzma cpio.xz cpio.zst; do
- if [ -e "${DEPLOY_DIR_IMAGE}/${INITRAMFS_IMAGE_NAME}.$img" ]; then
- cp ${DEPLOY_DIR_IMAGE}/${INITRAMFS_IMAGE_NAME}.$img ${B}/usr/.
+ if [ -e "${INITRAMFS_DEPLOY_DIR_IMAGE}/${INITRAMFS_IMAGE_NAME}.$img" ]; then
+ cp ${INITRAMFS_DEPLOY_DIR_IMAGE}/${INITRAMFS_IMAGE_NAME}.$img ${B}/usr/.
case $img in
*gz)
echo "gzip decompressing image"
@@ -278,7 +283,7 @@ copy_initramfs() {
fi
done
# Verify that the above loop found an initramfs, fail otherwise
- [ -f ${B}/usr/${INITRAMFS_IMAGE_NAME}.cpio ] && echo "Finished copy of initramfs into ./usr" || die "Could not find any ${DEPLOY_DIR_IMAGE}/${INITRAMFS_IMAGE_NAME}.cpio{.gz|.lz4|.lzo|.lzma|.xz|.zst) for bundling; INITRAMFS_IMAGE_NAME might be wrong."
+ [ -f ${B}/usr/${INITRAMFS_IMAGE_NAME}.cpio ] && echo "Finished copy of initramfs into ./usr" || die "Could not find any ${INITRAMFS_DEPLOY_DIR_IMAGE}/${INITRAMFS_IMAGE_NAME}.cpio{.gz|.lz4|.lzo|.lzma|.xz|.zst) for bundling; INITRAMFS_IMAGE_NAME might be wrong."
}
do_bundle_initramfs () {
@@ -650,7 +655,7 @@ FILES:${KERNEL_PACKAGE_NAME}-modules = ""
RDEPENDS:${KERNEL_PACKAGE_NAME} = "${KERNEL_PACKAGE_NAME}-base (= ${EXTENDPKGV})"
# Allow machines to override this dependency if kernel image files are
# not wanted in images as standard
-RDEPENDS:${KERNEL_PACKAGE_NAME}-base ?= "${KERNEL_PACKAGE_NAME}-image (= ${EXTENDPKGV})"
+RRECOMMENDS:${KERNEL_PACKAGE_NAME}-base ?= "${KERNEL_PACKAGE_NAME}-image (= ${EXTENDPKGV})"
PKG:${KERNEL_PACKAGE_NAME}-image = "${KERNEL_PACKAGE_NAME}-image-${@legitimize_package_name(d.getVar('KERNEL_VERSION'))}"
RDEPENDS:${KERNEL_PACKAGE_NAME}-image += "${@oe.utils.conditional('KERNEL_IMAGETYPE', 'vmlinux', '${KERNEL_PACKAGE_NAME}-vmlinux (= ${EXTENDPKGV})', '', d)}"
PKG:${KERNEL_PACKAGE_NAME}-base = "${KERNEL_PACKAGE_NAME}-${@legitimize_package_name(d.getVar('KERNEL_VERSION'))}"
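
A minimal configuration sketch for building the bundled initramfs from a separate multiconfig; the multiconfig name is an assumption, and a matching conf/multiconfig/initramfscfg.conf that sets its own TMPDIR is expected to exist:

    BBMULTICONFIG = "initramfscfg"
    INITRAMFS_IMAGE = "core-image-minimal-initramfs"
    INITRAMFS_IMAGE_BUNDLE = "1"
    INITRAMFS_MULTICONFIG = "initramfscfg"
    INITRAMFS_DEPLOY_DIR_IMAGE = "${TOPDIR}/tmp-initramfscfg/deploy/images/${MACHINE}"
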
diff --git a/poky/meta/classes/license.bbclass b/poky/meta/classes/license.bbclass
index d5480d87e2..dec9867209 100644
--- a/poky/meta/classes/license.bbclass
+++ b/poky/meta/classes/license.bbclass
@@ -341,30 +341,31 @@ def incompatible_license(d, dont_want_licenses, package=None):
def check_license_flags(d):
"""
This function checks if a recipe has any LICENSE_FLAGS that
- aren't whitelisted.
+ aren't acceptable.
- If it does, it returns the all LICENSE_FLAGS missing from the whitelist, or
- all of the LICENSE_FLAGS if there is no whitelist.
+    If it does, it returns all the LICENSE_FLAGS missing from the list
+ of acceptable license flags, or all of the LICENSE_FLAGS if there
+ is no list of acceptable flags.
- If everything is is properly whitelisted, it returns None.
+    If everything is acceptable, it returns None.
"""
- def license_flag_matches(flag, whitelist, pn):
+ def license_flag_matches(flag, acceptlist, pn):
"""
- Return True if flag matches something in whitelist, None if not.
+ Return True if flag matches something in acceptlist, None if not.
- Before we test a flag against the whitelist, we append _${PN}
+ Before we test a flag against the acceptlist, we append _${PN}
to it. We then try to match that string against the
- whitelist. This covers the normal case, where we expect
+ acceptlist. This covers the normal case, where we expect
LICENSE_FLAGS to be a simple string like 'commercial', which
- the user typically matches exactly in the whitelist by
+ the user typically matches exactly in the acceptlist by
explicitly appending the package name e.g 'commercial_foo'.
If we fail the match however, we then split the flag across
'_' and append each fragment and test until we either match or
run out of fragments.
"""
flag_pn = ("%s_%s" % (flag, pn))
- for candidate in whitelist:
+ for candidate in acceptlist:
if flag_pn == candidate:
return True
@@ -375,27 +376,27 @@ def check_license_flags(d):
if flag_cur:
flag_cur += "_"
flag_cur += flagment
- for candidate in whitelist:
+ for candidate in acceptlist:
if flag_cur == candidate:
return True
return False
- def all_license_flags_match(license_flags, whitelist):
+ def all_license_flags_match(license_flags, acceptlist):
""" Return all unmatched flags, None if all flags match """
pn = d.getVar('PN')
- split_whitelist = whitelist.split()
+ split_acceptlist = acceptlist.split()
flags = []
for flag in license_flags.split():
- if not license_flag_matches(flag, split_whitelist, pn):
+ if not license_flag_matches(flag, split_acceptlist, pn):
flags.append(flag)
return flags if flags else None
license_flags = d.getVar('LICENSE_FLAGS')
if license_flags:
- whitelist = d.getVar('LICENSE_FLAGS_WHITELIST')
- if not whitelist:
+ acceptlist = d.getVar('LICENSE_FLAGS_ACCEPTED')
+ if not acceptlist:
return license_flags.split()
- unmatched_flags = all_license_flags_match(license_flags, whitelist)
+ unmatched_flags = all_license_flags_match(license_flags, acceptlist)
if unmatched_flags:
return unmatched_flags
return None
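
Because the matching above also tries each flag with _${PN} appended, either of the following local.conf forms accepts a recipe that sets LICENSE_FLAGS = "commercial" (the recipe name is illustrative):

    # Accept the commercial flag only for one specific recipe
    LICENSE_FLAGS_ACCEPTED = "commercial_my-codec-plugin"
    # ...or accept it for every recipe carrying the flag
    #LICENSE_FLAGS_ACCEPTED = "commercial"
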
diff --git a/poky/meta/classes/multilib.bbclass b/poky/meta/classes/multilib.bbclass
index 4a3e582816..ec2013198c 100644
--- a/poky/meta/classes/multilib.bbclass
+++ b/poky/meta/classes/multilib.bbclass
@@ -65,11 +65,11 @@ python multilib_virtclass_handler () {
override = ":virtclass-multilib-" + variant
- blacklist = e.data.getVarFlag('PNBLACKLIST', e.data.getVar('PN'))
- if blacklist:
+ skip_msg = e.data.getVarFlag('SKIP_RECIPE', e.data.getVar('PN'))
+ if skip_msg:
pn_new = variant + "-" + e.data.getVar('PN')
- if not e.data.getVarFlag('PNBLACKLIST', pn_new):
- e.data.setVarFlag('PNBLACKLIST', pn_new, blacklist)
+ if not e.data.getVarFlag('SKIP_RECIPE', pn_new):
+ e.data.setVarFlag('SKIP_RECIPE', pn_new, skip_msg)
e.data.setVar("MLPREFIX", variant + "-")
e.data.setVar("PN", variant + "-" + e.data.getVar("PN", False))
diff --git a/poky/meta/classes/multilib_global.bbclass b/poky/meta/classes/multilib_global.bbclass
index dae015cdaf..ab8ca0e41d 100644
--- a/poky/meta/classes/multilib_global.bbclass
+++ b/poky/meta/classes/multilib_global.bbclass
@@ -137,14 +137,14 @@ def preferred_ml_updates(d):
prov = prov.replace("virtual/", "")
return "virtual/" + prefix + "-" + prov
- mp = (d.getVar("MULTI_PROVIDER_WHITELIST") or "").split()
+ mp = (d.getVar("BB_MULTI_PROVIDER_ALLOWED") or "").split()
extramp = []
for p in mp:
if p.endswith("-native") or "-crosssdk-" in p or p.startswith(("nativesdk-", "virtual/nativesdk-")) or 'cross-canadian' in p:
continue
for pref in prefixes:
extramp.append(translate_provide(pref, p))
- d.setVar("MULTI_PROVIDER_WHITELIST", " ".join(mp + extramp))
+ d.setVar("BB_MULTI_PROVIDER_ALLOWED", " ".join(mp + extramp))
abisafe = (d.getVar("SIGGEN_EXCLUDERECIPES_ABISAFE") or "").split()
extras = []
diff --git a/poky/meta/classes/package.bbclass b/poky/meta/classes/package.bbclass
index 4927fb99ff..f822258150 100644
--- a/poky/meta/classes/package.bbclass
+++ b/poky/meta/classes/package.bbclass
@@ -367,7 +367,7 @@ def source_info(file, d, fatal=True):
return list(debugsources)
-def splitdebuginfo(file, dvar, debugdir, debuglibdir, debugappend, debugsrcdir, d):
+def splitdebuginfo(file, dvar, dv, d):
# Function to split a single file into two components, one is the stripped
# target system binary, the other contains any debugging information. The
# two files are linked to reference each other.
@@ -378,7 +378,7 @@ def splitdebuginfo(file, dvar, debugdir, debuglibdir, debugappend, debugsrcdir,
import subprocess
src = file[len(dvar):]
- dest = debuglibdir + os.path.dirname(src) + debugdir + "/" + os.path.basename(src) + debugappend
+ dest = dv["libdir"] + os.path.dirname(src) + dv["dir"] + "/" + os.path.basename(src) + dv["append"]
debugfile = dvar + dest
sources = []
@@ -397,7 +397,7 @@ def splitdebuginfo(file, dvar, debugdir, debuglibdir, debugappend, debugsrcdir,
os.chmod(file, newmode)
# We need to extract the debug src information here...
- if debugsrcdir:
+ if dv["srcdir"]:
sources = source_info(file, d)
bb.utils.mkdirhier(os.path.dirname(debugfile))
@@ -412,7 +412,7 @@ def splitdebuginfo(file, dvar, debugdir, debuglibdir, debugappend, debugsrcdir,
return (file, sources)
-def splitstaticdebuginfo(file, dvar, debugstaticdir, debugstaticlibdir, debugstaticappend, debugsrcdir, d):
+def splitstaticdebuginfo(file, dvar, dv, d):
# Unlike the function above, there is no way to split a static library
# into two components. So to get similar results we will copy the unmodified
# static library (containing the debug symbols) into a new directory.
@@ -425,7 +425,7 @@ def splitstaticdebuginfo(file, dvar, debugstaticdir, debugstaticlibdir, debugsta
import shutil
src = file[len(dvar):]
- dest = debugstaticlibdir + os.path.dirname(src) + debugstaticdir + "/" + os.path.basename(src) + debugstaticappend
+ dest = dv["staticlibdir"] + os.path.dirname(src) + dv["staticdir"] + "/" + os.path.basename(src) + dv["staticappend"]
debugfile = dvar + dest
sources = []
@@ -442,7 +442,7 @@ def splitstaticdebuginfo(file, dvar, debugstaticdir, debugstaticlibdir, debugsta
os.chmod(file, newmode)
# We need to extract the debug src information here...
- if debugsrcdir:
+ if dv["srcdir"]:
sources = source_info(file, d)
bb.utils.mkdirhier(os.path.dirname(debugfile))
@@ -455,7 +455,7 @@ def splitstaticdebuginfo(file, dvar, debugstaticdir, debugstaticlibdir, debugsta
return (file, sources)
-def inject_minidebuginfo(file, dvar, debugdir, debuglibdir, debugappend, debugsrcdir, d):
+def inject_minidebuginfo(file, dvar, dv, d):
# Extract just the symbols from debuginfo into minidebuginfo,
# compress it with xz and inject it back into the binary in a .gnu_debugdata section.
# https://sourceware.org/gdb/onlinedocs/gdb/MiniDebugInfo.html
@@ -469,7 +469,7 @@ def inject_minidebuginfo(file, dvar, debugdir, debuglibdir, debugappend, debugsr
minidebuginfodir = d.expand('${WORKDIR}/minidebuginfo')
src = file[len(dvar):]
- dest = debuglibdir + os.path.dirname(src) + debugdir + "/" + os.path.basename(src) + debugappend
+ dest = dv["libdir"] + os.path.dirname(src) + dv["dir"] + "/" + os.path.basename(src) + dv["append"]
debugfile = dvar + dest
minidebugfile = minidebuginfodir + src + '.minidebug'
bb.utils.mkdirhier(os.path.dirname(minidebugfile))
@@ -1065,6 +1065,54 @@ python fixup_perms () {
fix_perms(each_file, fs_perms_table[dir].fmode, fs_perms_table[dir].fuid, fs_perms_table[dir].fgid, dir)
}
+def package_debug_vars(d):
+ # We default to '.debug' style
+ if d.getVar('PACKAGE_DEBUG_SPLIT_STYLE') == 'debug-file-directory':
+ # Single debug-file-directory style debug info
+ debug_vars = {
+ "append": ".debug",
+ "staticappend": "",
+ "dir": "",
+ "staticdir": "",
+ "libdir": "/usr/lib/debug",
+ "staticlibdir": "/usr/lib/debug-static",
+ "srcdir": "/usr/src/debug",
+ }
+ elif d.getVar('PACKAGE_DEBUG_SPLIT_STYLE') == 'debug-without-src':
+ # Original OE-core, a.k.a. ".debug", style debug info, but without sources in /usr/src/debug
+ debug_vars = {
+ "append": "",
+ "staticappend": "",
+ "dir": "/.debug",
+ "staticdir": "/.debug-static",
+ "libdir": "",
+ "staticlibdir": "",
+ "srcdir": "",
+ }
+ elif d.getVar('PACKAGE_DEBUG_SPLIT_STYLE') == 'debug-with-srcpkg':
+ debug_vars = {
+ "append": "",
+ "staticappend": "",
+ "dir": "/.debug",
+ "staticdir": "/.debug-static",
+ "libdir": "",
+ "staticlibdir": "",
+ "srcdir": "/usr/src/debug",
+ }
+ else:
+ # Original OE-core, a.k.a. ".debug", style debug info
+ debug_vars = {
+ "append": "",
+ "staticappend": "",
+ "dir": "/.debug",
+ "staticdir": "/.debug-static",
+ "libdir": "",
+ "staticlibdir": "",
+ "srcdir": "/usr/src/debug",
+ }
+
+ return debug_vars
+
python split_and_strip_files () {
import stat, errno
import subprocess
@@ -1076,42 +1124,7 @@ python split_and_strip_files () {
oldcwd = os.getcwd()
os.chdir(dvar)
- # We default to '.debug' style
- if d.getVar('PACKAGE_DEBUG_SPLIT_STYLE') == 'debug-file-directory':
- # Single debug-file-directory style debug info
- debugappend = ".debug"
- debugstaticappend = ""
- debugdir = ""
- debugstaticdir = ""
- debuglibdir = "/usr/lib/debug"
- debugstaticlibdir = "/usr/lib/debug-static"
- debugsrcdir = "/usr/src/debug"
- elif d.getVar('PACKAGE_DEBUG_SPLIT_STYLE') == 'debug-without-src':
- # Original OE-core, a.k.a. ".debug", style debug info, but without sources in /usr/src/debug
- debugappend = ""
- debugstaticappend = ""
- debugdir = "/.debug"
- debugstaticdir = "/.debug-static"
- debuglibdir = ""
- debugstaticlibdir = ""
- debugsrcdir = ""
- elif d.getVar('PACKAGE_DEBUG_SPLIT_STYLE') == 'debug-with-srcpkg':
- debugappend = ""
- debugstaticappend = ""
- debugdir = "/.debug"
- debugstaticdir = "/.debug-static"
- debuglibdir = ""
- debugstaticlibdir = ""
- debugsrcdir = "/usr/src/debug"
- else:
- # Original OE-core, a.k.a. ".debug", style debug info
- debugappend = ""
- debugstaticappend = ""
- debugdir = "/.debug"
- debugstaticdir = "/.debug-static"
- debuglibdir = ""
- debugstaticlibdir = ""
- debugsrcdir = "/usr/src/debug"
+ dv = package_debug_vars(d)
#
# First lets figure out all of the files we may have to process ... do this only once!
@@ -1132,9 +1145,9 @@ python split_and_strip_files () {
file = os.path.join(root, f)
# Skip debug files
- if debugappend and file.endswith(debugappend):
+ if dv["append"] and file.endswith(dv["append"]):
continue
- if debugdir and debugdir in os.path.dirname(file[len(dvar):]):
+ if dv["dir"] and dv["dir"] in os.path.dirname(file[len(dvar):]):
continue
if file in skipfiles:
@@ -1231,11 +1244,11 @@ python split_and_strip_files () {
# First lets process debug splitting
#
if (d.getVar('INHIBIT_PACKAGE_DEBUG_SPLIT') != '1'):
- results = oe.utils.multiprocess_launch(splitdebuginfo, list(elffiles), d, extraargs=(dvar, debugdir, debuglibdir, debugappend, debugsrcdir, d))
+ results = oe.utils.multiprocess_launch(splitdebuginfo, list(elffiles), d, extraargs=(dvar, dv, d))
- if debugsrcdir and not hostos.startswith("mingw"):
+ if dv["srcdir"] and not hostos.startswith("mingw"):
if (d.getVar('PACKAGE_DEBUG_STATIC_SPLIT') == '1'):
- results = oe.utils.multiprocess_launch(splitstaticdebuginfo, staticlibs, d, extraargs=(dvar, debugstaticdir, debugstaticlibdir, debugstaticappend, debugsrcdir, d))
+ results = oe.utils.multiprocess_launch(splitstaticdebuginfo, staticlibs, d, extraargs=(dvar, dv, d))
else:
for file in staticlibs:
results.append( (file,source_info(file, d)) )
@@ -1254,9 +1267,9 @@ python split_and_strip_files () {
target = inodes[ref][0][len(dvar):]
for file in inodes[ref][1:]:
src = file[len(dvar):]
- dest = debuglibdir + os.path.dirname(src) + debugdir + "/" + os.path.basename(target) + debugappend
+ dest = dv["libdir"] + os.path.dirname(src) + dv["dir"] + "/" + os.path.basename(target) + dv["append"]
fpath = dvar + dest
- ftarget = dvar + debuglibdir + os.path.dirname(target) + debugdir + "/" + os.path.basename(target) + debugappend
+ ftarget = dvar + dv["libdir"] + os.path.dirname(target) + dv["dir"] + "/" + os.path.basename(target) + dv["append"]
bb.utils.mkdirhier(os.path.dirname(fpath))
# Only one hardlink of separated debug info file in each directory
if not os.access(fpath, os.R_OK):
@@ -1266,7 +1279,7 @@ python split_and_strip_files () {
# Create symlinks for all cases we were able to split symbols
for file in symlinks:
src = file[len(dvar):]
- dest = debuglibdir + os.path.dirname(src) + debugdir + "/" + os.path.basename(src) + debugappend
+ dest = dv["libdir"] + os.path.dirname(src) + dv["dir"] + "/" + os.path.basename(src) + dv["append"]
fpath = dvar + dest
# Skip it if the target doesn't exist
try:
@@ -1282,17 +1295,17 @@ python split_and_strip_files () {
lbase = os.path.basename(ltarget)
ftarget = ""
if lpath and lpath != ".":
- ftarget += lpath + debugdir + "/"
- ftarget += lbase + debugappend
+ ftarget += lpath + dv["dir"] + "/"
+ ftarget += lbase + dv["append"]
if lpath.startswith(".."):
ftarget = os.path.join("..", ftarget)
bb.utils.mkdirhier(os.path.dirname(fpath))
#bb.note("Symlink %s -> %s" % (fpath, ftarget))
os.symlink(ftarget, fpath)
- # Process the debugsrcdir if requested...
+ # Process the dv["srcdir"] if requested...
# This copies and places the referenced sources for later debugging...
- copydebugsources(debugsrcdir, sources, d)
+ copydebugsources(dv["srcdir"], sources, d)
#
# End of debug splitting
#
@@ -1316,7 +1329,7 @@ python split_and_strip_files () {
# Build "minidebuginfo" and reinject it back into the stripped binaries
if d.getVar('PACKAGE_MINIDEBUGINFO') == '1':
oe.utils.multiprocess_launch(inject_minidebuginfo, list(elffiles), d,
- extraargs=(dvar, debugdir, debuglibdir, debugappend, debugsrcdir, d))
+ extraargs=(dvar, dv, d))
#
# End of strip
@@ -1455,10 +1468,10 @@ python populate_packages () {
os.umask(oldumask)
os.chdir(workdir)
- # Handle LICENSE_EXCLUSION
+ # Handle excluding packages with incompatible licenses
package_list = []
for pkg in packages:
- licenses = d.getVar('LICENSE_EXCLUSION-' + pkg)
+ licenses = d.getVar('_exclude_incompatible-' + pkg)
if licenses:
msg = "Excluding %s from packaging as it has incompatible license(s): %s" % (pkg, licenses)
oe.qa.handle_error("incompatible-license", msg, d)
@@ -2340,7 +2353,7 @@ def gen_packagevar(d, pkgvars="PACKAGEVARS"):
# Ensure that changes to INCOMPATIBLE_LICENSE re-run do_package for
# affected recipes.
- ret.append('LICENSE_EXCLUSION-%s' % p)
+ ret.append('_exclude_incompatible-%s' % p)
return " ".join(ret)
PACKAGE_PREPROCESS_FUNCS ?= ""
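
The debug-split behaviour is still chosen with a single assignment; package_debug_vars() above merely maps each PACKAGE_DEBUG_SPLIT_STYLE value to one dictionary. A local.conf sketch:

    # Place all debug symbols under /usr/lib/debug instead of per-package .debug dirs
    PACKAGE_DEBUG_SPLIT_STYLE = "debug-file-directory"
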
diff --git a/poky/meta/classes/pip_install_wheel.bbclass b/poky/meta/classes/pip_install_wheel.bbclass
new file mode 100644
index 0000000000..9f9feda6ee
--- /dev/null
+++ b/poky/meta/classes/pip_install_wheel.bbclass
@@ -0,0 +1,48 @@
+DEPENDS:append = " python3-pip-native"
+
+def guess_pip_install_package_name(d):
+ '''https://www.python.org/dev/peps/pep-0491/#escaping-and-unicode'''
+ return (d.getVar('PYPI_PACKAGE') or d.getVar('PN')).replace('-', '_')
+
+PIP_INSTALL_PACKAGE ?= "${@guess_pip_install_package_name(d)}"
+PIP_INSTALL_DIST_PATH ?= "${B}/dist"
+PYPA_WHEEL ??= "${PIP_INSTALL_DIST_PATH}/${PIP_INSTALL_PACKAGE}-${PV}-*.whl"
+
+PIP_INSTALL_ARGS ?= "\
+ -vvvv \
+ --ignore-installed \
+ --no-cache \
+ --no-deps \
+ --no-index \
+ --root=${D} \
+ --prefix=${prefix} \
+"
+
+pip_install_wheel_do_install:prepend () {
+ install -d ${D}${PYTHON_SITEPACKAGES_DIR}
+}
+
+export PYPA_WHEEL
+
+PIP_INSTALL_PYTHON = "python3"
+PIP_INSTALL_PYTHON:class-native = "nativepython3"
+
+pip_install_wheel_do_install () {
+ nativepython3 -m pip install ${PIP_INSTALL_ARGS} ${PYPA_WHEEL} ||
+ bbfatal_log "Failed to pip install wheel. Check the logs."
+
+ for i in ${D}${bindir}/* ${D}${sbindir}/*; do
+ if [ -f "$i" ]; then
+ sed -i -e "1s,#!.*nativepython3,#!${USRBINPATH}/env ${PIP_INSTALL_PYTHON}," $i
+ sed -i -e "s:${PYTHON}:${USRBINPATH}/env\ ${PIP_INSTALL_PYTHON}:g" $i
+ sed -i -e "s:${STAGING_BINDIR_NATIVE}:${bindir}:g" $i
+ # Recompile after modifying it
+ cd ${D}
+ file=`echo $i | sed 's:^${D}/::'`
+ ${STAGING_BINDIR_NATIVE}/python3-native/python3 -c "from py_compile import compile; compile('$file')"
+ cd -
+ fi
+ done
+}
+
+EXPORT_FUNCTIONS do_install
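
The class guesses the wheel name from PYPI_PACKAGE (or PN) with '-' mapped to '_'; when a project's wheel is named differently, the guess can be overridden from the recipe, for example with a hypothetical name:

    PIP_INSTALL_PACKAGE = "Example_Project"
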
diff --git a/poky/meta/classes/populate_sdk_ext.bbclass b/poky/meta/classes/populate_sdk_ext.bbclass
index ef93b6a826..9c9561c5c6 100644
--- a/poky/meta/classes/populate_sdk_ext.bbclass
+++ b/poky/meta/classes/populate_sdk_ext.bbclass
@@ -22,8 +22,8 @@ SDK_INCLUDE_BUILDTOOLS ?= '1'
SDK_RECRDEP_TASKS ?= ""
SDK_CUSTOM_TEMPLATECONF ?= "0"
-SDK_LOCAL_CONF_WHITELIST ?= ""
-SDK_LOCAL_CONF_BLACKLIST ?= "CONF_VERSION \
+ESDK_LOCALCONF_ALLOW ?= ""
+ESDK_LOCALCONF_REMOVE ?= "CONF_VERSION \
BB_NUMBER_THREADS \
BB_NUMBER_PARSE_THREADS \
PARALLEL_MAKE \
@@ -34,7 +34,7 @@ SDK_LOCAL_CONF_BLACKLIST ?= "CONF_VERSION \
TMPDIR \
BB_SERVER_TIMEOUT \
"
-SDK_INHERIT_BLACKLIST ?= "buildhistory icecc"
+ESDK_CLASS_INHERIT_DISABLE ?= "buildhistory icecc"
SDK_UPDATE_URL ?= ""
SDK_TARGETS ?= "${PN}"
@@ -282,7 +282,7 @@ python copy_buildsystem () {
bb.utils.mkdirhier(uninative_outdir)
shutil.copy(uninative_file, uninative_outdir)
- env_whitelist = (d.getVar('BB_ENV_EXTRAWHITE') or '').split()
+ env_whitelist = (d.getVar('BB_ENV_PASSTHROUGH_ADDITIONS') or '').split()
env_whitelist_values = {}
# Create local.conf
@@ -294,8 +294,8 @@ python copy_buildsystem () {
if derivative:
shutil.copyfile(builddir + '/conf/local.conf', baseoutpath + '/conf/local.conf')
else:
- local_conf_whitelist = (d.getVar('SDK_LOCAL_CONF_WHITELIST') or '').split()
- local_conf_blacklist = (d.getVar('SDK_LOCAL_CONF_BLACKLIST') or '').split()
+ local_conf_whitelist = (d.getVar('ESDK_LOCALCONF_ALLOW') or '').split()
+ local_conf_blacklist = (d.getVar('ESDK_LOCALCONF_REMOVE') or '').split()
def handle_var(varname, origvalue, op, newlines):
if varname in local_conf_blacklist or (origvalue.strip().startswith('/') and not varname in local_conf_whitelist):
newlines.append('# Removed original setting of %s\n' % varname)
@@ -338,7 +338,7 @@ python copy_buildsystem () {
f.write('CONF_VERSION = "%s"\n\n' % d.getVar('CONF_VERSION', False))
# Some classes are not suitable for SDK, remove them from INHERIT
- f.write('INHERIT:remove = "%s"\n' % d.getVar('SDK_INHERIT_BLACKLIST', False))
+ f.write('INHERIT:remove = "%s"\n' % d.getVar('ESDK_CLASS_INHERIT_DISABLE', False))
# Bypass the default connectivity check if any
f.write('CONNECTIVITY_CHECK_URIS = ""\n\n')
@@ -354,10 +354,10 @@ python copy_buildsystem () {
f.write('SIGGEN_LOCKEDSIGS_TASKSIG_CHECK = "warn"\n\n')
# We want to be able to set this without a full reparse
- f.write('BB_HASHCONFIG_WHITELIST:append = " SIGGEN_UNLOCKED_RECIPES"\n\n')
+ f.write('BB_HASHCONFIG_IGNORE_VARS:append = " SIGGEN_UNLOCKED_RECIPES"\n\n')
# Set up whitelist for run on install
- f.write('BB_SETSCENE_ENFORCE_WHITELIST = "%:* *:do_shared_workdir *:do_rm_work wic-tools:* *:do_addto_recipe_sysroot"\n\n')
+ f.write('BB_SETSCENE_ENFORCE_IGNORE_TASKS = "%:* *:do_shared_workdir *:do_rm_work wic-tools:* *:do_addto_recipe_sysroot"\n\n')
# Hide the config information from bitbake output (since it's fixed within the SDK)
f.write('BUILDCFG_HEADER = ""\n\n')
@@ -436,7 +436,7 @@ python copy_buildsystem () {
f.write('meta/conf\n')
# Ensure any variables set from the external environment (by way of
- # BB_ENV_EXTRAWHITE) are set in the SDK's configuration
+ # BB_ENV_PASSTHROUGH_ADDITIONS) are set in the SDK's configuration
extralines = []
for name, value in env_whitelist_values.items():
actualvalue = d.getVar(name) or ''
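
A local.conf sketch of the renamed eSDK knobs; the particular variables chosen are illustrative:

    # Drop only icecc from INHERIT inside the eSDK, keeping buildhistory
    ESDK_CLASS_INHERIT_DISABLE = "icecc"
    # Allow a path-valued variable through to the generated eSDK configuration
    ESDK_LOCALCONF_ALLOW += "DL_DIR"
    # Strip an additional variable from the generated local.conf
    ESDK_LOCALCONF_REMOVE += "SSTATE_MIRRORS"
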
diff --git a/poky/meta/classes/python3targetconfig.bbclass b/poky/meta/classes/python3targetconfig.bbclass
index 5c8457acaa..2476858cae 100644
--- a/poky/meta/classes/python3targetconfig.bbclass
+++ b/poky/meta/classes/python3targetconfig.bbclass
@@ -15,3 +15,15 @@ do_compile:prepend:class-target() {
do_install:prepend:class-target() {
export _PYTHON_SYSCONFIGDATA_NAME="_sysconfigdata"
}
+
+do_configure:prepend:class-nativesdk() {
+ export _PYTHON_SYSCONFIGDATA_NAME="_sysconfigdata"
+}
+
+do_compile:prepend:class-nativesdk() {
+ export _PYTHON_SYSCONFIGDATA_NAME="_sysconfigdata"
+}
+
+do_install:prepend:class-nativesdk() {
+ export _PYTHON_SYSCONFIGDATA_NAME="_sysconfigdata"
+}
diff --git a/poky/meta/classes/qemuboot.bbclass b/poky/meta/classes/qemuboot.bbclass
index cc1cbce69d..755d49acd6 100644
--- a/poky/meta/classes/qemuboot.bbclass
+++ b/poky/meta/classes/qemuboot.bbclass
@@ -109,7 +109,7 @@ def qemuboot_vars(d):
build_vars = ['MACHINE', 'TUNE_ARCH', 'DEPLOY_DIR_IMAGE',
'KERNEL_IMAGETYPE', 'IMAGE_NAME', 'IMAGE_LINK_NAME',
'STAGING_DIR_NATIVE', 'STAGING_BINDIR_NATIVE',
- 'STAGING_DIR_HOST', 'SERIAL_CONSOLES']
+ 'STAGING_DIR_HOST', 'SERIAL_CONSOLES', 'UNINATIVE_LOADER']
return build_vars + [k for k in d.keys() if k.startswith('QB_')]
do_write_qemuboot_conf[vardeps] += "${@' '.join(qemuboot_vars(d))}"
@@ -136,6 +136,8 @@ python do_write_qemuboot_conf() {
'qemu-helper-native/1.0-r1/recipe-sysroot-native/usr/bin/')
else:
val = d.getVar(k)
+ if val is None:
+ continue
# we only want to write out relative paths so that we can relocate images
# and still run them
if val.startswith(topdir):
diff --git a/poky/meta/classes/rootfs-postcommands.bbclass b/poky/meta/classes/rootfs-postcommands.bbclass
index 74035c30b7..cd8986d8a9 100644
--- a/poky/meta/classes/rootfs-postcommands.bbclass
+++ b/poky/meta/classes/rootfs-postcommands.bbclass
@@ -380,7 +380,7 @@ python overlayfs_qa_check() {
from oe.overlayfs import mountUnitName
# this is a dumb check for unit existence, not its validity
- overlayMountPoints = d.getVarFlags("OVERLAYFS_MOUNT_POINT")
+ overlayMountPoints = d.getVarFlags("OVERLAYFS_MOUNT_POINT") or {}
imagepath = d.getVar("IMAGE_ROOTFS")
searchpaths = [oe.path.join(imagepath, d.getVar("sysconfdir"), "systemd", "system"),
oe.path.join(imagepath, d.getVar("systemd_system_unitdir"))]
diff --git a/poky/meta/classes/sanity.bbclass b/poky/meta/classes/sanity.bbclass
index f288b4c84c..773902e619 100644
--- a/poky/meta/classes/sanity.bbclass
+++ b/poky/meta/classes/sanity.bbclass
@@ -353,7 +353,7 @@ def check_connectivity(d):
msg += " Please ensure your host's network is configured correctly.\n"
msg += " If your ISP or network is blocking the above URL,\n"
msg += " try with another domain name, for example by setting:\n"
- msg += " CONNECTIVITY_CHECK_URIS = \"https://www.yoctoproject.org/\""
+ msg += " CONNECTIVITY_CHECK_URIS = \"https://www.example.com/\""
msg += " You could also set BB_NO_NETWORK = \"1\" to disable network\n"
msg += " access if all required sources are on local disk.\n"
retval = msg
diff --git a/poky/meta/classes/setuptools3.bbclass b/poky/meta/classes/setuptools3.bbclass
index fd8499d26c..12561340b0 100644
--- a/poky/meta/classes/setuptools3.bbclass
+++ b/poky/meta/classes/setuptools3.bbclass
@@ -1,6 +1,7 @@
-inherit setuptools3-base
+inherit setuptools3-base pip_install_wheel
-B = "${WORKDIR}/build"
+# bdist_wheel builds in ./dist
+#B = "${WORKDIR}/build"
SETUPTOOLS_BUILD_ARGS ?= ""
SETUPTOOLS_INSTALL_ARGS ?= "--root=${D} \
@@ -23,20 +24,15 @@ setuptools3_do_compile() {
STAGING_INCDIR=${STAGING_INCDIR} \
STAGING_LIBDIR=${STAGING_LIBDIR} \
${STAGING_BINDIR_NATIVE}/${PYTHON_PN}-native/${PYTHON_PN} setup.py \
- build --build-base=${B} ${SETUPTOOLS_BUILD_ARGS} || \
- bbfatal_log "'${PYTHON_PN} setup.py build ${SETUPTOOLS_BUILD_ARGS}' execution failed."
+ bdist_wheel ${SETUPTOOLS_BUILD_ARGS} || \
+ bbfatal_log "'${PYTHON_PN} setup.py bdist_wheel ${SETUPTOOLS_BUILD_ARGS}' execution failed."
}
setuptools3_do_compile[vardepsexclude] = "MACHINE"
setuptools3_do_install() {
cd ${SETUPTOOLS_SETUP_PATH}
- install -d ${D}${PYTHON_SITEPACKAGES_DIR}
- STAGING_INCDIR=${STAGING_INCDIR} \
- STAGING_LIBDIR=${STAGING_LIBDIR} \
- PYTHONPATH=${D}${PYTHON_SITEPACKAGES_DIR} \
- ${STAGING_BINDIR_NATIVE}/${PYTHON_PN}-native/${PYTHON_PN} setup.py \
- build --build-base=${B} install --skip-build ${SETUPTOOLS_INSTALL_ARGS} || \
- bbfatal_log "'${PYTHON_PN} setup.py install ${SETUPTOOLS_INSTALL_ARGS}' execution failed."
+
+ pip_install_wheel_do_install
# support filenames with *spaces*
find ${D} -name "*.py" -exec grep -q ${D} {} \; \
@@ -64,5 +60,5 @@ setuptools3_do_install[vardepsexclude] = "MACHINE"
EXPORT_FUNCTIONS do_configure do_compile do_install
export LDSHARED="${CCLD} -shared"
-DEPENDS += "python3-setuptools-native"
+DEPENDS += "python3-setuptools-native python3-wheel-native"
diff --git a/poky/meta/classes/setuptools_build_meta.bbclass b/poky/meta/classes/setuptools_build_meta.bbclass
new file mode 100644
index 0000000000..b1441e65dd
--- /dev/null
+++ b/poky/meta/classes/setuptools_build_meta.bbclass
@@ -0,0 +1,18 @@
+inherit pip_install_wheel setuptools3-base
+
+DEPENDS += "python3 python3-setuptools-native python3-wheel-native"
+
+setuptools_build_meta_do_configure () {
+ mkdir -p ${S}/dist
+ cat > ${S}/build-it.py << EOF
+from setuptools import build_meta
+wheel = build_meta.build_wheel('./dist')
+print(wheel)
+EOF
+}
+
+setuptools_build_meta_do_compile () {
+ nativepython3 ${S}/build-it.py
+}
+
+EXPORT_FUNCTIONS do_configure do_compile
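
A minimal sketch of a recipe for a PEP 517 project whose pyproject.toml declares setuptools.build_meta as the build backend (the package name is hypothetical):

    PYPI_PACKAGE = "example-pep517-project"
    inherit pypi setuptools_build_meta
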
diff --git a/poky/meta/classes/sstate.bbclass b/poky/meta/classes/sstate.bbclass
index b45da4fb23..787172b408 100644
--- a/poky/meta/classes/sstate.bbclass
+++ b/poky/meta/classes/sstate.bbclass
@@ -1,4 +1,4 @@
-SSTATE_VERSION = "7"
+SSTATE_VERSION = "8"
SSTATE_ZSTD_CLEVEL ??= "8"
@@ -50,21 +50,21 @@ SSTATE_EXTRAPATH[vardepvalue] = ""
SSTATE_EXTRAPATHWILDCARD[vardepvalue] = ""
# For multilib rpm the allarch packagegroup files can overwrite (in theory they're identical)
-SSTATE_DUPWHITELIST = "${DEPLOY_DIR}/licenses/"
+SSTATE_ALLOW_OVERLAP_FILES = "${DEPLOY_DIR}/licenses/"
# Avoid docbook/sgml catalog warnings for now
-SSTATE_DUPWHITELIST += "${STAGING_ETCDIR_NATIVE}/sgml ${STAGING_DATADIR_NATIVE}/sgml"
+SSTATE_ALLOW_OVERLAP_FILES += "${STAGING_ETCDIR_NATIVE}/sgml ${STAGING_DATADIR_NATIVE}/sgml"
# sdk-provides-dummy-nativesdk and nativesdk-buildtools-perl-dummy overlap for different SDKMACHINE
-SSTATE_DUPWHITELIST += "${DEPLOY_DIR_RPM}/sdk_provides_dummy_nativesdk/ ${DEPLOY_DIR_IPK}/sdk-provides-dummy-nativesdk/"
-SSTATE_DUPWHITELIST += "${DEPLOY_DIR_RPM}/buildtools_dummy_nativesdk/ ${DEPLOY_DIR_IPK}/buildtools-dummy-nativesdk/"
+SSTATE_ALLOW_OVERLAP_FILES += "${DEPLOY_DIR_RPM}/sdk_provides_dummy_nativesdk/ ${DEPLOY_DIR_IPK}/sdk-provides-dummy-nativesdk/"
+SSTATE_ALLOW_OVERLAP_FILES += "${DEPLOY_DIR_RPM}/buildtools_dummy_nativesdk/ ${DEPLOY_DIR_IPK}/buildtools-dummy-nativesdk/"
# target-sdk-provides-dummy overlaps that allarch is disabled when multilib is used
-SSTATE_DUPWHITELIST += "${COMPONENTS_DIR}/sdk-provides-dummy-target/ ${DEPLOY_DIR_RPM}/sdk_provides_dummy_target/ ${DEPLOY_DIR_IPK}/sdk-provides-dummy-target/"
+SSTATE_ALLOW_OVERLAP_FILES += "${COMPONENTS_DIR}/sdk-provides-dummy-target/ ${DEPLOY_DIR_RPM}/sdk_provides_dummy_target/ ${DEPLOY_DIR_IPK}/sdk-provides-dummy-target/"
# Archive the sources for many architectures in one deploy folder
-SSTATE_DUPWHITELIST += "${DEPLOY_DIR_SRC}"
+SSTATE_ALLOW_OVERLAP_FILES += "${DEPLOY_DIR_SRC}"
# ovmf/grub-efi/systemd-boot/intel-microcode multilib recipes can generate identical overlapping files
-SSTATE_DUPWHITELIST += "${DEPLOY_DIR_IMAGE}/ovmf"
-SSTATE_DUPWHITELIST += "${DEPLOY_DIR_IMAGE}/grub-efi"
-SSTATE_DUPWHITELIST += "${DEPLOY_DIR_IMAGE}/systemd-boot"
-SSTATE_DUPWHITELIST += "${DEPLOY_DIR_IMAGE}/microcode"
+SSTATE_ALLOW_OVERLAP_FILES += "${DEPLOY_DIR_IMAGE}/ovmf"
+SSTATE_ALLOW_OVERLAP_FILES += "${DEPLOY_DIR_IMAGE}/grub-efi"
+SSTATE_ALLOW_OVERLAP_FILES += "${DEPLOY_DIR_IMAGE}/systemd-boot"
+SSTATE_ALLOW_OVERLAP_FILES += "${DEPLOY_DIR_IMAGE}/microcode"
SSTATE_SCAN_FILES ?= "*.la *-config *_config postinst-*"
SSTATE_SCAN_CMD ??= 'find ${SSTATE_BUILDDIR} \( -name "${@"\" -o -name \"".join(d.getVar("SSTATE_SCAN_FILES").split())}" \) -type f'
@@ -94,7 +94,7 @@ SSTATE_ARCHS[vardepsexclude] = "ORIGNATIVELSBSTRING"
SSTATE_MANMACH ?= "${SSTATE_PKGARCH}"
-SSTATECREATEFUNCS = "sstate_hardcode_path"
+SSTATECREATEFUNCS += "sstate_hardcode_path"
SSTATECREATEFUNCS[vardeps] = "SSTATE_SCAN_FILES"
SSTATEPOSTCREATEFUNCS = ""
SSTATEPREINSTFUNCS = ""
@@ -260,7 +260,7 @@ def sstate_install(ss, d):
shareddirs.append(dstdir)
# Check the file list for conflicts against files which already exist
- whitelist = (d.getVar("SSTATE_DUPWHITELIST") or "").split()
+ whitelist = (d.getVar("SSTATE_ALLOW_OVERLAP_FILES") or "").split()
match = []
for f in sharedfiles:
if os.path.exists(f) and not os.path.islink(f):
@@ -296,7 +296,7 @@ def sstate_install(ss, d):
"DISTRO_FEATURES on an existing build directory is not supported - you " \
"should really clean out tmp and rebuild (reusing sstate should be safe). " \
"It could be the overlapping files detected are harmless in which case " \
- "adding them to SSTATE_DUPWHITELIST may be the correct solution. It could " \
+ "adding them to SSTATE_ALLOW_OVERLAP_FILES may be the correct solution. It could " \
"also be your build is including two different conflicting versions of " \
"things (e.g. bluez 4 and bluez 5 and the correct solution for that would " \
"be to resolve the conflict. If in doubt, please ask on the mailing list, " \
@@ -350,7 +350,7 @@ def sstate_install(ss, d):
for lock in locks:
bb.utils.unlockfile(lock)
-sstate_install[vardepsexclude] += "SSTATE_DUPWHITELIST STATE_MANMACH SSTATE_MANFILEPREFIX"
+sstate_install[vardepsexclude] += "SSTATE_ALLOW_OVERLAP_FILES STATE_MANMACH SSTATE_MANFILEPREFIX"
sstate_install[vardeps] += "${SSTATEPOSTINSTFUNCS}"
def sstate_installpkg(ss, d):
@@ -862,14 +862,18 @@ sstate_create_package () {
fi
chmod 0664 $TFILE
# Skip if it was already created by some other process
- if [ ! -e ${SSTATE_PKG} ]; then
+ if [ -h ${SSTATE_PKG} ] && [ ! -e ${SSTATE_PKG} ]; then
+ # There is a symbolic link, but it links to nothing.
+ # Forcefully replace it with the new file.
+ ln -f $TFILE ${SSTATE_PKG} || true
+ elif [ ! -e ${SSTATE_PKG} ]; then
# Move into place using ln to attempt an atomic op.
# Abort if it already exists
- ln $TFILE ${SSTATE_PKG} && rm $TFILE
+ ln $TFILE ${SSTATE_PKG} || true
else
- rm $TFILE
+ touch ${SSTATE_PKG} 2>/dev/null || true
fi
- touch ${SSTATE_PKG} 2>/dev/null || true
+ rm $TFILE
}
python sstate_sign_package () {
@@ -905,7 +909,7 @@ sstate_unpack_package () {
tar -I "$ZSTD" -xvpf ${SSTATE_PKG}
# update .siginfo atime on local/NFS mirror if it is a symbolic link
- [ ! -h ${SSTATE_PKG}.siginfo ] || touch -a ${SSTATE_PKG}.siginfo 2>/dev/null || true
+ [ ! -h ${SSTATE_PKG}.siginfo ] || [ ! -e ${SSTATE_PKG}.siginfo ] || touch -a ${SSTATE_PKG}.siginfo 2>/dev/null || true
# update each symbolic link instead of any referenced file
touch --no-dereference ${SSTATE_PKG} 2>/dev/null || true
[ ! -e ${SSTATE_PKG}.sig ] || touch --no-dereference ${SSTATE_PKG}.sig 2>/dev/null || true
@@ -988,6 +992,8 @@ def sstate_checkhashes(sq_data, d, siginfo=False, currentcount=0, summary=True,
localdata.setVar('SRC_URI', srcuri)
bb.debug(2, "SState: Attempting to fetch %s" % srcuri)
+ import traceback
+
try:
fetcher = bb.fetch2.Fetch(srcuri.split(), localdata2,
connection_cache=thread_worker.connection_cache)
@@ -996,9 +1002,9 @@ def sstate_checkhashes(sq_data, d, siginfo=False, currentcount=0, summary=True,
found.add(tid)
missed.remove(tid)
except bb.fetch2.FetchError as e:
- bb.debug(2, "SState: Unsuccessful fetch test for %s (%s)" % (srcuri, repr(e)))
+ bb.debug(2, "SState: Unsuccessful fetch test for %s (%s)\n%s" % (srcuri, repr(e), traceback.format_exc()))
except Exception as e:
- bb.error("SState: cannot test %s: %s" % (srcuri, repr(e)))
+ bb.error("SState: cannot test %s: %s\n%s" % (srcuri, repr(e), traceback.format_exc()))
if progress:
bb.event.fire(bb.event.ProcessProgress(msg, len(tasklist) - thread_worker.tasks.qsize()), d)
@@ -1016,15 +1022,18 @@ def sstate_checkhashes(sq_data, d, siginfo=False, currentcount=0, summary=True,
msg = "Checking sstate mirror object availability"
bb.event.fire(bb.event.ProcessStarted(msg, len(tasklist)), d)
- bb.event.enable_threadlock()
- pool = oe.utils.ThreadedPool(nproc, len(tasklist),
- worker_init=checkstatus_init, worker_end=checkstatus_end,
- name="sstate_checkhashes-")
- for t in tasklist:
- pool.add_task(checkstatus, t)
- pool.start()
- pool.wait_completion()
- bb.event.disable_threadlock()
+ # Have to setup the fetcher environment here rather than in each thread as it would race
+ fetcherenv = bb.fetch2.get_fetcher_environment(d)
+ with bb.utils.environment(**fetcherenv):
+ bb.event.enable_threadlock()
+ pool = oe.utils.ThreadedPool(nproc, len(tasklist),
+ worker_init=checkstatus_init, worker_end=checkstatus_end,
+ name="sstate_checkhashes-")
+ for t in tasklist:
+ pool.add_task(checkstatus, t)
+ pool.start()
+ pool.wait_completion()
+ bb.event.disable_threadlock()
if progress:
bb.event.fire(bb.event.ProcessFinished(msg), d)
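The sstate_checkhashes change above moves the environment setup out of the worker threads because the process environment is global: exporting proxy and similar fetcher settings from inside each thread would race. A generic, hedged sketch of that pattern (set the environment once under a context manager, then run the pool), with placeholder values rather than the real BitBake fetcher:

import os
from concurrent.futures import ThreadPoolExecutor
from contextlib import contextmanager

@contextmanager
def environment(**values):
    # Minimal stand-in for bb.utils.environment(): set the variables on
    # entry and restore the previous values on exit.
    saved = {k: os.environ.get(k) for k in values}
    os.environ.update(values)
    try:
        yield
    finally:
        for k, v in saved.items():
            if v is None:
                os.environ.pop(k, None)
            else:
                os.environ[k] = v

def check_uri(uri):
    # Placeholder for the per-object availability test.
    return uri, "https_proxy" in os.environ

uris = ["https://sstate.example.invalid/a.tar.zst",
        "https://sstate.example.invalid/b.tar.zst"]
with environment(https_proxy="http://proxy.example.invalid:3128"):
    with ThreadPoolExecutor(max_workers=4) as pool:
        results = list(pool.map(check_uri, uris))
print(results)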
diff --git a/poky/meta/classes/staging.bbclass b/poky/meta/classes/staging.bbclass
index 25f77c7735..ab827766be 100644
--- a/poky/meta/classes/staging.bbclass
+++ b/poky/meta/classes/staging.bbclass
@@ -24,7 +24,7 @@ SYSROOT_DIRS:append:class-cross = " ${SYSROOT_DIRS_NATIVE}"
SYSROOT_DIRS:append:class-crosssdk = " ${SYSROOT_DIRS_NATIVE}"
# These directories will not be staged in the sysroot
-SYSROOT_DIRS_BLACKLIST = " \
+SYSROOT_DIRS_IGNORE = " \
${mandir} \
${docdir} \
${infodir} \
@@ -49,9 +49,10 @@ sysroot_stage_dir() {
fi
mkdir -p "$dest"
+ rdest=$(realpath --relative-to="$src" "$dest")
(
cd $src
- find . -print0 | cpio --null -pdlu $dest
+ find . -print0 | cpio --null -pdlu $rdest
)
}
@@ -64,7 +65,7 @@ sysroot_stage_dirs() {
done
# Remove directories we do not care about
- for dir in ${SYSROOT_DIRS_BLACKLIST}; do
+ for dir in ${SYSROOT_DIRS_IGNORE}; do
rm -rf "$to$dir"
done
}
@@ -103,7 +104,7 @@ python do_populate_sysroot () {
for f in (d.getVar('SYSROOT_PREPROCESS_FUNCS') or '').split():
bb.build.exec_func(f, d)
pn = d.getVar("PN")
- multiprov = d.getVar("MULTI_PROVIDER_WHITELIST").split()
+ multiprov = d.getVar("BB_MULTI_PROVIDER_ALLOWED").split()
provdir = d.expand("${SYSROOT_DESTDIR}${base_prefix}/sysroot-providers/")
bb.utils.mkdirhier(provdir)
for p in d.getVar("PROVIDES").split():
@@ -115,11 +116,11 @@ python do_populate_sysroot () {
}
do_populate_sysroot[vardeps] += "${SYSROOT_PREPROCESS_FUNCS}"
-do_populate_sysroot[vardepsexclude] += "MULTI_PROVIDER_WHITELIST"
+do_populate_sysroot[vardepsexclude] += "BB_MULTI_PROVIDER_ALLOWED"
POPULATESYSROOTDEPS = ""
-POPULATESYSROOTDEPS:class-target = "virtual/${MLPREFIX}${TARGET_PREFIX}binutils:do_populate_sysroot"
-POPULATESYSROOTDEPS:class-nativesdk = "virtual/${TARGET_PREFIX}binutils-crosssdk:do_populate_sysroot"
+POPULATESYSROOTDEPS:class-target = "virtual/${MLPREFIX}${HOST_PREFIX}binutils:do_populate_sysroot"
+POPULATESYSROOTDEPS:class-nativesdk = "virtual/${HOST_PREFIX}binutils-crosssdk:do_populate_sysroot"
do_populate_sysroot[depends] += "${POPULATESYSROOTDEPS}"
SSTATETASKS += "do_populate_sysroot"
@@ -624,3 +625,36 @@ python staging_taskhandler() {
}
staging_taskhandler[eventmask] = "bb.event.RecipeTaskPreProcess"
addhandler staging_taskhandler
+
+
+#
+# Target build output, stored in do_populate_sysroot or do_package, can depend
+# not only upon direct dependencies but also on indirect ones. A good example is
+# linux-libc-headers. The toolchain depends on it but most target recipes do
+# not. Some of its headers are not used by the toolchain build and do not change
+# the toolchain task output, so those task hashes can change without changing the
+# sysroot output of that recipe, yet they can still influence other recipes.
+#
+# A specific example is rtc.h, which can change rtcwake.c in util-linux but is not
+# used in the glibc or gcc build. To account for this, we include the
+# populate_sysroot hashes of dependencies in the task output hashes.
+#
+python target_add_sysroot_deps () {
+ current_task = "do_" + d.getVar("BB_CURRENTTASK")
+ if current_task not in ["do_populate_sysroot", "do_package"]:
+ return
+
+ pn = d.getVar("PN")
+ if pn.endswith("-native"):
+ return
+
+ taskdepdata = d.getVar("BB_TASKDEPDATA", False)
+ deps = {}
+ for dep in taskdepdata.values():
+ if dep[1] == "do_populate_sysroot" and not dep[0].endswith(("-native", "-initial")) and "-cross-" not in dep[0]:
+ deps[dep[0]] = dep[6]
+
+ d.setVar("HASHEQUIV_EXTRA_SIGDATA", "\n".join("%s: %s" % (k, deps[k]) for k in sorted(deps.keys())))
+}
+SSTATECREATEFUNCS += "target_add_sysroot_deps"
+
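To make the comment above concrete: target_add_sysroot_deps collects, for each non-native do_populate_sysroot dependency, the hash recorded in BB_TASKDEPDATA and folds the result into HASHEQUIV_EXTRA_SIGDATA. A small illustration of the string it builds, with invented recipe names and hashes:

deps = {
    "linux-libc-headers": "3f2a1c...",   # hypothetical do_populate_sysroot hash
    "zlib": "9c41d0...",
}
extra_sigdata = "\n".join("%s: %s" % (k, deps[k]) for k in sorted(deps))
print(extra_sigdata)
# linux-libc-headers: 3f2a1c...
# zlib: 9c41d0...

If a dependency such as linux-libc-headers ships a changed header (the rtc.h case above), its sysroot hash changes, this string changes, and the dependent task outputs are no longer treated as equivalent.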
diff --git a/poky/meta/classes/toolchain-scripts.bbclass b/poky/meta/classes/toolchain-scripts.bbclass
index fb6261c91d..8f914cce27 100644
--- a/poky/meta/classes/toolchain-scripts.bbclass
+++ b/poky/meta/classes/toolchain-scripts.bbclass
@@ -8,7 +8,7 @@ TARGET_CC_ARCH:append:libc-musl = " -mmusl"
# default debug prefix map isn't valid in the SDK
DEBUG_PREFIX_MAP = ""
-EXPORT_SDK_PS1 = "${@ 'export PS1=\'%s\'' % d.getVar('SDK_PS1') if d.getVar('SDK_PS1') else ''}"
+EXPORT_SDK_PS1 = "${@ 'export PS1=\\"%s\\"' % d.getVar('SDK_PS1') if d.getVar('SDK_PS1') else ''}"
# This function creates an environment-setup-script for use in a deployable SDK
toolchain_create_sdk_env_script () {
diff --git a/poky/meta/classes/uninative.bbclass b/poky/meta/classes/uninative.bbclass
index 4412d7c567..6a9e862bcd 100644
--- a/poky/meta/classes/uninative.bbclass
+++ b/poky/meta/classes/uninative.bbclass
@@ -9,8 +9,8 @@ UNINATIVE_TARBALL ?= "${BUILD_ARCH}-nativesdk-libc-${UNINATIVE_VERSION}.tar.xz"
#UNINATIVE_CHECKSUM[x86_64] = "dead"
UNINATIVE_DLDIR ?= "${DL_DIR}/uninative/"
-# Enabling uninative will change the following variables so they need to go the parsing white list to prevent multiple recipe parsing
-BB_HASHCONFIG_WHITELIST += "NATIVELSBSTRING SSTATEPOSTUNPACKFUNCS BUILD_LDFLAGS"
+# Enabling uninative will change the following variables, so they need to go into the list of variables ignored during parsing to prevent multiple recipe parsing
+BB_HASHCONFIG_IGNORE_VARS += "NATIVELSBSTRING SSTATEPOSTUNPACKFUNCS BUILD_LDFLAGS"
addhandler uninative_event_fetchloader
uninative_event_fetchloader[eventmask] = "bb.event.BuildStarted"
diff --git a/poky/meta/classes/useradd-staticids.bbclass b/poky/meta/classes/useradd-staticids.bbclass
index 8e2a7fb635..3acf59cd46 100644
--- a/poky/meta/classes/useradd-staticids.bbclass
+++ b/poky/meta/classes/useradd-staticids.bbclass
@@ -174,8 +174,6 @@ def update_useradd_static_config(d):
newparam += ['', ' --non-unique'][uaargs.non_unique]
if uaargs.password != None:
newparam += ['', ' --password %s' % uaargs.password][uaargs.password != None]
- elif uaargs.clear_password:
- newparam += ['', ' --clear-password %s' % uaargs.clear_password][uaargs.clear_password != None]
newparam += ['', ' --root %s' % uaargs.root][uaargs.root != None]
newparam += ['', ' --system'][uaargs.system]
newparam += ['', ' --shell %s' % uaargs.shell][uaargs.shell != None]
@@ -236,8 +234,6 @@ def update_useradd_static_config(d):
newparam += ['', ' --non-unique'][gaargs.non_unique]
if gaargs.password != None:
newparam += ['', ' --password %s' % gaargs.password][gaargs.password != None]
- elif gaargs.clear_password:
- newparam += ['', ' --clear-password %s' % gaargs.clear_password][gaargs.clear_password != None]
newparam += ['', ' --root %s' % gaargs.root][gaargs.root != None]
newparam += ['', ' --system'][gaargs.system]
newparam += ' %s' % gaargs.GROUP
diff --git a/poky/meta/classes/waf.bbclass b/poky/meta/classes/waf.bbclass
index bc594d3c6b..464564afa1 100644
--- a/poky/meta/classes/waf.bbclass
+++ b/poky/meta/classes/waf.bbclass
@@ -39,7 +39,7 @@ def waflock_hash(d):
# directory (e.g. if the source is coming from externalsrc and was previously
# configured elsewhere).
export WAFLOCK = ".lock-waf_oe_${@waflock_hash(d)}_build"
-BB_HASHBASE_WHITELIST += "WAFLOCK"
+BB_BASEHASH_IGNORE_VARS += "WAFLOCK"
python waf_preconfigure() {
import subprocess
diff --git a/poky/meta/classes/yocto-check-layer.bbclass b/poky/meta/classes/yocto-check-layer.bbclass
new file mode 100644
index 0000000000..329d3f8edb
--- /dev/null
+++ b/poky/meta/classes/yocto-check-layer.bbclass
@@ -0,0 +1,16 @@
+#
+# This class is used by the yocto-check-layer script for additional per-recipe tests
+# The first test ensures that the layer has no recipes skipping 'installed-vs-shipped' QA checks
+#
+
+WARN_QA:remove = "installed-vs-shipped"
+ERROR_QA:append = " installed-vs-shipped"
+
+python () {
+ packages = set((d.getVar('PACKAGES') or '').split())
+ for package in packages:
+ skip = set((d.getVar('INSANE_SKIP') or "").split() +
+ (d.getVar('INSANE_SKIP:' + package) or "").split())
+ if 'installed-vs-shipped' in skip:
+ oe.qa.handle_error("installed-vs-shipped", 'Package %s is skipping "installed-vs-shipped" QA test.' % package, d)
+}
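For illustration, the anonymous Python function above fires the QA error whenever any package in a recipe lists 'installed-vs-shipped' in INSANE_SKIP. A standalone sketch of that detection, using an invented package list in place of the datastore:

insane_skip = {
    None: "",                           # recipe-wide INSANE_SKIP
    "foo": "installed-vs-shipped",      # hypothetical package-level override
}
for package in ["foo", "foo-dev", "foo-doc"]:
    skip = set(insane_skip[None].split() + insane_skip.get(package, "").split())
    if "installed-vs-shipped" in skip:
        print('Package %s is skipping "installed-vs-shipped" QA test.' % package)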