author:    Andrew Geissler <geissonator@yahoo.com>  2023-01-13 17:55:19 +0300
committer: Andrew Geissler <geissonator@yahoo.com>  2023-01-13 21:05:42 +0300
commit:    517393d903089f921915530ffa689a7a1d113420
tree:      8bac829cbf7fe13f7f1a4d937e1997d923755ed6 /poky/bitbake/lib
parent:    98ccb1903bfb490dc6075e694664d8f1edef74dd
subtree updates Jan-13-2023
meta-openembedded: d04444509a..cd13881611:
Alex Kiernan (10):
mdns: Upgrade 1310.140.1 -> 1790.40.31
mdns: Set MDNS_VERSIONSTR_NODTS
mdns: Upgrade 1790.40.31 -> 1790.60.25
ostree: Upgrade 2022.5 -> 2022.7
ostree: Use systemd_system_unitdir for systemd units
ostree: Switch to fuse3 which is supported in ostree now
ostree: Fix comments for configuration/ptest
ostree: Handle musl's ERANGE mapping
usbguard: Remove pegtl from DEPENDS
usbguard: Upgrade 1.1.1 -> 1.1.2
Alex Stewart (2):
gvfs: stylize DEPENDS
gvfs: obviate the ssh-client requirement for gvfs
Alexander Kanavin (5):
frr: add a patch to correctly check presence of python from pkg-config
lirc: correctly use PYTHONPATH
libportal: move to oe-core
packagegroup-meta-python: drop python3-strict-rfc3339
nftables: fix builds with latest setuptools
Alexander Stein (1):
dool: Add patch to fix rebuild
Archana Polampalli (1):
Nodejs - Upgrade to 16.18.1
Bartosz Golaszewski (3):
python3-kmod: new package
python3-watchdogdev: new package
packagegroup-meta-python: add missing packages
Bruce Ashfield (1):
zfs: update to 2.1.7
Changqing Li (5):
linuxptp: fix do_compile error
keyutils: fix ptest failed since "+++ Can't Determine Endianness"
graphviz: Do not build tcl support for native
redis: 6.2.7 -> 6.2.8
redis: 7.0.5 -> 7.0.7
Chen Pei (2):
    suitesparse: fix git branch in SRC_URI
botan: upgrade 2.19.2 -> 2.19.3
Chen Qi (4):
    xfce4-verve-plugin: fix do_configure failure about missing libpcre
networkmanager: fix dhcpcd PACKAGECONFIG
networkmanager: install config files into correct place
networkmanager: fix /etc/resolv.conf handling
Christian Eggers (1):
boost-url: remove recipe
Clément Péron (3):
navigation: bump proj to 9.1.0 library
proj: add a packageconfig to build as a static library
proj: avoid leaking host path in libproj
Devendra Tewari (1):
android-tools: Use echo instead of bbnote
Dmitry Baryshkov (1):
nss: fix cross-compilation error
Erwann Roussy (3):
python3-schedutils: add recipe
python3-linux-procfs: add recipe
tuna: add recipe
Fabio Estevam (2):
remmina: Update to 1.4.28
crucible: Upgrade to 2022.12.06
Geoff Parker (1):
python3-yappi: upgrade 1.3.6 -> 1.4.0, python 3.11 compatible
Gerbrand De Laender (1):
python3-aioserial: new package
Gianfranco Costamagna (2):
vbxguestdrivers: upgrade 7.0.2 -> 7.0.4
boinc-client: Update boinc from 7.18.1 to 7.20.4
Gianluigi Spagnuolo (1):
libbpf: add native and nativesdk BBCLASSEXTEND
Hains van den Bosch (2):
python3-twisted: Add python3-asyncio to RDEPENDS
python3-twisted: Add python3-typing-extensions to RDEPENDS
He Zhe (1):
protobuf: upgrade 3.21.5 -> 3.21.10
Jose Quaresma (1):
lshw: bump to 42fef565
Kai Kang (31):
freeradius: fix multilib systemd service start failure
wxwidgets: 3.1.5 -> 3.2.1
python3-attrdict3: add recipe with version 2.0.2
python3-wxgtk4: 4.1.1 -> 4.2.0
xfce4-settings: 4.16.3 -> 4.16.5
python3-m2crypto: fix CVE-2020-25657 and buildpaths qa issue
fixup! wxwidgets: 3.1.5 -> 3.2.1
postfix: fix multilib conflict of sample-main.cf
python3-wxgtk4: replace deprecated inspect.getargspec
libxfce4ui: 4.16.1 -> 4.18.0
thunar-volman: 4.16.0 -> 4.18.0
xfce4-cpufreq-plugin: 1.2.7 -> 1.2.8
xfce4-wavelan-plugin: 0.6.2 -> 0.6.3
xfce4-cpugraph-plugin: 1.2.6 -> 1.2.7
xfce4-sensors-plugin: 1.4.3 -> 1.4.4
thunar-shares-plugin: Bump GLib minimum required to 2.26
xfce4-dev-tools: 4.16.0 -> 4.18.0
libxfce4util: 4.16.0 -> 4.18.0
exo: 4.16.4 -> 4.18.0
garcon: 4.16.1 -> 4.18.0
xfce4-panel: 4.16.3 -> 4.18.0
thunar: 4.16.9 -> 4.18.0
tumbler: 4.16.0 -> 4.18.0
xfconf: 4.16.0 -> 4.18.0
xfce4-appfinder: 4.16.1 -> 4.18.0
xfce4-settings: 4.16.5 -> 4.18.0
xfce4-power-manager: 4.16.0 -> 4.18.0
xfce4-session: 4.16.0 -> 4.18.0
xfwm4: 4.16.1 -> 4.18.0
xfdesktop: 4.16.0 -> 4.18.0
xorg-lib: set XORG_EXT for recipes
Khem Raj (91):
    gnome-text-editor: Add missing libpcre build time dependency
ettercap: Add missing dependency on libpcre
xcb-util-cursor: Update to 0.1.4
lldpd: Use github release assets for SRC_URI
aufs-util: Fix build with large file support enabled systems
volume-key: Inherit python3targetconfig
proj: Enable apps when building native variant
python3-pyproj: Export PROJ_DIR
satyr: Inherit python3targetconfig
rest: Re-add 0.8.1
gfbgraph: Use rest 0.8.1
audit: Inherit python3targetconfig
opensaf: Check for _FILE_OFFSET_BITS instead of __TIMESIZE
flite: Add missing deps on alsa-lib and chrpath
python3-pystemd: Regenerate .c sources using newer cython
libreport: Inherit python3targetconfig
uw-imap: Disable parallelism
gnome-calendar: Upgrade to 43.1
gnome-photos: Upgrade to 43.0
libgweather: Remove 40.0
waf-samba.bbclass: point PYTHON_CONFIG to target python3-config
amtk: Add missing dep on python3-pygments-native
fontforge: Inherit python3targetconfig
tepl: Add missing dep on python3-pygments-native
alsa-oss: Remove recipe
opencv: Check for commercial_ffmpeg as well to enable ffmpeg
opencv: Fix build with ffmpeg 5.1+
fwts: Upgrade to 22.11.00
minio: Disable on mips
sip: Add recipe for 6.7.5
imapfilter: Upgrade to 2.7.6
perfetto: Do not pass TUNE_CCARGS to native/host compiler
stressapptest: Upgrade to latest tip
mariadb: Upgrade to 10.11.1
surf: Depend on gcr3
fatcat: Enable 64bit off_t
stressapptest: Fix build with largefile support and musl
nspr: Upgrade to 4.35
cryptsetup: Upgrade to 2.6.0
libyui,libyui-ncurses: Upgrade to 4.2.3
inotify-tools: Fix build on musl and lfs64
sdbus-c++-libsystemd: Upgrade to 250.9 systemd release
xfsprogs: Upgrade to 6.0.0
drbd,drbd-utils: Upgrade to 9.2.1 and drbd-utils to 9.22.0
libtraceevent: Add recipe
libtracefs: Add recipe
trace-cmd: Remove use of off64_t and lseek64
xfsdump: Add -D_LARGEFILE64_SOURCE on musl
xfstests: Add -D_LARGEFILE64_SOURCE on musl
mariadb: Alias lseek64/open64/ftruncate64 on musl systems
gperftools: Define off64_t on musl
android-tools: Define lseek64 = lseek on musl
php: Add -D_LARGEFILE64_SOURCE to cflags
spice-gtk: Use libucontext for coroutines on musl
wxwidgets: Fix build with musl
wxwidgets: Fix locale on musl
wxwidgets: Set HAVE_LARGEFILE_SUPPORT
python3-wxgtk4: Do not use GetAssertStackTrace with USE_STACKWALKER disabled
f2fs-tools: Upgrade to 1.15.0
trace-cmd: Pass ldflags to compiler
parole: Define DATADIRNAME
abseil-cpp: Replace off64_t with off_t
vsftpd_3.0.5.bb: Define _LARGEFILE64_SOURCE on musl
mozjs-102: Disable mozilla stackwalk on musl
fatresize: Fix build when 64bit time_t is enabled
boinc-client: Fix build when using 64bit time_t
python3-grpcio: Define -D_LARGEFILE64_SOURCE only for musl
gnome-online-accounts: Fix build race seen on musl systems
imagemagick: Do not set ac_cv_sys_file_offset_bits
spdlog: Do not use LFS64 functions with musl
mongodb: Do not use off64_t on musl
dracut: Do not undefine _FILE_OFFSET_BITS
    libcamera: Disable 64bit time_t on glibc targets
    v4l-utils: Disable 64bit time_t on glibc targets
opensaf: Fix the check for __fsblkcnt64_t size
libcereal,poco: Link with -latomic on ppc32 as well
sshpass: Use SPDX identified string for GPLv2
nftables: Upgrade to 1.0.6
mycroft: Check for pulseaudio in distro features
trace-cmd: Build libs before building rest
open-vm-tools: Fix build with 64-bit time_t
libtraceevent: Move plugins into package of its own
trace-cmd: Upgrade to 3.1.5
luajit: Update to latest on v2.1 branch
concurrencykit: Update to 0.7.0
concurrencykit: Set correct PLAT value for riscv32
concurrencykit: Fix build on riscv32 and riscv64
sysbench: Enable only on architectures supporting LuaJIT
packagegroup-meta-oe: Ensure sysbench is included in limited arches
hwloc: Update to 2.9.0
fluentbit: Link with libatomic on ppc32
Lei Maohui (1):
polkit: Fix multilib builds
Leon Anavi (9):
python3-watchdog: Upgrade 2.2.0 -> 2.2.1
python3-zeroconf: Upgrade 0.39.4 -> 0.47.1
python3-croniter: Upgrade 1.3.7 -> 1.3.8
python3-coverage: Upgrade 7.0.1 -> 7.0.3
python3-prompt-toolkit: Upgrade 3.0.31 -> 3.0.36
python3-simplejson: Upgrade 3.18.0 -> 3.18.1
python3-termcolor: Upgrade 2.1.1 -> 2.2.0
python3-cantools: Upgrade 37.2.0 -> 38.0.0
python3-marshmallow: Upgrade 3.18.0 -> 3.19.0
Livin Sunny (1):
libwebsockets: add ipv6 in PACKAGECONFIG
Markus Volk (88):
blueman: add RDEPEND on python3-fcntl
hwdata: add patch to use sysroot prefix for pkgdatadir
pipewire: upgrade 0.3.59 -> 0.3.60
spirv-cross: upgrade; fix build
blueman: upgrade 2.34 -> 2.35
pipewire: upgrade 0.3.60 -> 0.3.61
iwd: upgrade 1.30 -> 2.0
libgdata: use gcr3
libgweather: update 4.0.0 -> 4.2.0
gnome-online-accounts: use gcr3
geary: build with gcr3
gnome-keyring: use gcr3
evolution-data-server: update 3.44.2 -> 3.46.1
gnome-settings-daemon: update 42.1 -> 43.0
libnma: update 1.8.38 -> 1.10.4
geocode-glib: build with libsoup-3.0
gjs: update 1.72.2 -> 1.75.1
gnome-shell: update 42.0 -> 43.1
mutter: update 42.0 -> 43.1
polkit: add recipe for v122
mozjs: update 98 -> 102
appstream-glib: update 0.7.18 -> 0.8.2
gthumb: build with libsoup-3
amtk: update 5.3.1 -> 5.6.1
gedit: update 42.2 -> 43.2
evolution-data-server: remove libgdata dependency
tepl: update 6.0.0 -> 6.2.0
perfetto: pass TUNE_CCARGS to use machine tune
gnome-photos: update dependencies
thunar-archive-plugin: update 0.4.0 -> 0.5.0
libadwaita: remove deprecated sassc-native dependency
gnome-shell: remove deprecated sassc-native dependency
spice-gtk: add missing license information
pipewire: update 0.3.61 -> 0.3.62
gdm: update 42.0 -> 43.0
    gnome-session: update 42.0 -> 43.0
geoclue: update to latest commit to allow to build with libsoup-3.0
gvfs: fix polkit homedir
editorconfig: add recipe
tracker: update 3.4.1 -> 3.4.2
gvfs: fix dependencies
gnome-calculator: update 42.2 -> 43.0.1
tracker-miners: update 3.4.1 -> 3.4.2
gnome-photos: add missing runtime dependency on tracker-miners
gtksourceview5: update 5.4.2 -> 5.6.1
remmina: build with libsoup-3.0
ostree: replace libsoup-2.4 by curl
gnome-text-editor: update 42.2 -> 43.1
gtk4: remove recipe
libxmlb: allow to build native
pipewire: update 0.3.62 -> 0.3.63
gnome-shell-extensions: update SRC_URI and remove sassc-native dep
grilo: update 0.3.14 -> 0.3.15
libstemmer: move recipe to meta-oe
xdg-desktop-portal: add recipe
bubblewrap: import recipe from meta-security
gnome-software: add recipe
basu: import recipe from meta-wayland
xdg-desktop-portal-wlr: add recipe
appstream: add recipe
flatpak: add recipe
flatpak-xdg-utils: add recipe
flatpak: add runtime dependency on flatpak-xdg-utils
wireplumber: update 0.4.12 -> 0.4.13
wireplumber: build with dbus support by default
xdg-desktop-portal-gnome: add recipe
libcloudproviders: add recipe
evince: update 42.3 -> 43.1
libportal: build libportal-gtk4 and vala support
nautilus: update 42.2 -> 43.1
gnome-desktop: update 42.0 -> 43
file-roller: update 3.42.0 -> 43.0
    wireplumber: don't start systemd system service by default
gnome-bluetooth: update 42.4 -> 42.5
gnome-flashback: update 3.44.0 -> 3.46.0
libwnck3: update 40.1 -> 43.0
gnome-panel: update 3.44.0 -> 3.47.1
gnome-terminal: update 3.42.2 -> 3.46.7
dconf-editor: update 3.38.3 -> 43.0
gnome-shell: add missing RDEPENDS
gnome-control-center: update 42.0 -> 43.2
gnome-shell: add runtime dependency on adwaita-icon-theme
xdg-desktop-portal-gtk: add recipe
thunar: add tumbler to RRECOMMENDS
    gnome-terminal: add missing inherit meson
gnome-disk-utility: update 42.0 -> 43.0
eog: add recipe
libdecor: import recipe
Martin Jansa (3):
nss: fix SRC_URI
geoclue: fix polkit files only with modem-gps PACKAGECONFIG
layer.conf: update LAYERSERIES_COMPAT for mickledore
Mathieu Dubois-Briand (2):
nss: Add missing CVE product
nss: Whitelist CVEs related to libnssdbm
Matthias Klein (1):
paho-mqtt-c: upgrade 1.3.11 -> 1.3.12
Max Krummenacher (1):
opencv: follow changed name license_flags_accepted
Mingli Yu (25):
gnome-calculator: add opengl to REQUIRED_DISTRO_FEATURES
waylandpp: add opengl to REQUIRED_DISTRO_FEATURES
libnma: add opengl to REQUIRED_DISTRO_FEATURES
network-manager-applet: add opengl to REQUIRED_DISTRO_FEATURES
gssdp: check opengl is enabled or not
gtksourceview5: add opengl to REQUIRED_DISTRO_FEATURES
gnome-font-viewer: add opengl to REQUIRED_DISTRO_FEATURES
libxfce4ui: check opengl DISTRO_FEATURES
gnome-desktop: add opengl to REQUIRED_DISTRO_FEATURES
ibus: add opengl related check
nautilus: add opengl to REQUIRED_DISTRO_FEATURES
gnome-bluetooth: add opengl to REQUIRED_DISTRO_FEATURES
evince: add opengl to REQUIRED_DISTRO_FEATURES
gnome-calendar: add opengl to REQUIRED_DISTRO_FEATURES
xf86-video-amdgpu: add opengl to REQUIRED_DISTRO_FEATURES
spice-gtk: add opengl to REQUIRED_DISTRO_FEATURES
grail: add opengl to REQUIRED_DISTRO_FEATURES
frame: add opengl to REQUIRED_DISTRO_FEATURES
geis: add opengl to REQUIRED_DISTRO_FEATURES
evolution-data-server: add opengl to REQUIRED_DISTRO_FEATURES
libgweather4: add opengl to REQUIRED_DISTRO_FEATURES
geary: add opengl to REQUIRED_DISTRO_FEATURES
file-roller: add opengl to REQUIRED_DISTRO_FEATURES
gnome-photos: add opengl to REQUIRED_DISTRO_FEATURES
xdg-desktop-portal-wlr: add opengl to REQUIRED_DISTRO_FEATURES
Naveen Saini (3):
opencl-headers: add native and nativesdk
tcsh: add native nativesdk BBCLASSEXTEND
tbb: upgrade 2021.5.0 -> 2021.7.0
Omkar Patil (1):
ntfs-3g-ntfsprogs: Upgrade 2022.5.17 to 2022.10.3
Ovidiu Panait (1):
multipath-tools: upgrade 0.8.4 -> 0.9.3
Peter Bergin (1):
sysbench: Upgrade 0.4.12 -> 1.0.20
Peter Kjellerstedt (4):
chrony: Make it possible to enable editline support again
chrony: Remove the libcap and nss PACKAGECONFIGs
Revert "lldpd: Use github release assets for SRC_URI"
lldpd: Correct the checksum for the tar ball to match 1.0.16
Preeti Sachan (1):
fluidsynth: update SRC_URI to remove non-existing 2.2.x branch
Roger Knecht (1):
python3-rapidjson: add recipe
Sakib Sajal (1):
minio: fix license information
Samuli Piippo (1):
protobuf: stage protoc binary to sysroot
Tim Orling (4):
libio-pty-perl: upgrade 1.16 -> 1.17; enable ptest
libmozilla-ca-perl: add recipe for 20221114
libio-socket-ssl-perl: upgrade 2.075 -> 2.076
libtest-warnings-perl: move to oe-core
Tomasz Żyjewski (2):
python3-binwalk: add recipe for version 2.3.3
python3-uefi-firmware: add recipe for version 1.9
Wang Mingyu (190):
byacc: upgrade 20220128 -> 20221106
libforms: upgrade 1.2.4 -> 1.2.5pre1
libnftnl: upgrade 1.2.3 -> 1.2.4
mpich: upgrade 4.0.2 -> 4.0.3
python3-u-msgpack-python: upgrade 2.7.1 -> 2.7.2
python3-aiosignal: upgrade 1.2.0 -> 1.3.1
python3-eth-hash: upgrade 0.5.0 -> 0.5.1
python3-frozenlist: upgrade 1.3.1 -> 1.3.3
python3-google-auth: upgrade 2.14.0 -> 2.14.1
python3-greenlet: upgrade 2.0.0 -> 2.0.1
python3-imageio: upgrade 2.22.3 -> 2.22.4
python3-pycocotools: upgrade 2.0.5 -> 2.0.6
babl: upgrade 0.1.96 -> 0.1.98
ctags: upgrade 5.9.20221106.0 -> 5.9.20221113.0
gegl: upgrade 0.4.38 -> 0.4.40
freerdp: upgrade 2.8.1 -> 2.9.0
glibmm-2.68: upgrade 2.72.1 -> 2.74.0
googlebenchmark: upgrade 1.7.0 -> 1.7.1
gnome-backgrounds: upgrade 42.0 -> 43
nano: upgrade 6.4 -> 7.0
networkmanager-openvpn: upgrade 1.10.0 -> 1.10.2
python3-django: upgrade 4.1 -> 4.1.3
python3-flask-migrate: upgrade 3.1.0 -> 4.0.0
python3-eth-utils: upgrade 2.0.0 -> 2.1.0
python3-eventlet: upgrade 0.33.1 -> 0.33.2
python3-googleapis-common-protos: upgrade 1.56.4 -> 1.57.0
python3-google-api-python-client: upgrade 2.65.0 -> 2.66.0
python3-pymongo: upgrade 4.3.2 -> 4.3.3
lldpd: upgrade 1.0.15 -> 1.0.16
audit: upgrade 3.0.8 -> 3.0.9
ccid: upgrade 1.5.0 -> 1.5.1
colord: upgrade 1.4.5 -> 1.4.6
ctags: upgrade 5.9.20221113.0 -> 5.9.20221120.0
flatbuffers: upgrade 22.10.26 -> 22.11.23
libglvnd: upgrade 1.5.0 -> 1.6.0
gensio: upgrade 2.5.2 -> 2.6.1
mg: upgrade 20220614 -> 20221112
nbdkit: upgrade 1.33.2 -> 1.33.3
xfstests: upgrade 2022.10.30 -> 2022.11.06
pcsc-lite: upgrade 1.9.8 -> 1.9.9
python3-matplotlib-inline: upgrade 0.1.2 -> 0.1.6
python3-astroid: upgrade 2.12.12 -> 2.12.13
python3-asyncinotify: upgrade 2.0.5 -> 2.0.8
python3-charset-normalizer: upgrade 3.0.0 -> 3.0.1
python3-dateparser: upgrade 1.1.0 -> 1.1.4
python3-can: upgrade 4.0.0 -> 4.1.0
python3-flask-socketio: upgrade 5.3.1 -> 5.3.2
python3-ipython: upgrade 8.2.0 -> 8.6.0
python3-langtable: upgrade 0.0.60 -> 0.0.61
python3-jedi: upgrade 0.18.1 -> 0.18.2
python3-grpcio-tools: upgrade 1.50.0 -> 1.51.0
python3-grpcio: upgrade 1.50.0 -> 1.51.0
python3-networkx: upgrade 2.8.7 -> 2.8.8
python3-pyatspi: upgrade 2.38.2 -> 2.46.0
python3-pandas: upgrade 1.5.1 -> 1.5.2
python3-pybind11-json: upgrade 0.2.11 -> 0.2.13
python3-pychromecast: upgrade 12.1.4 -> 13.0.1
python3-pycodestyle: upgrade 2.9.1 -> 2.10.0
xterm: upgrade 373 -> 377
smarty: upgrade 4.2.1 -> 4.3.0
spdlog: upgrade 1.10.0 -> 1.11.0
python3-pyperf: upgrade 2.4.1 -> 2.5.0
python3-pyflakes: upgrade 2.5.0 -> 3.0.1
python3-pymisp: upgrade 2.4.157 -> 2.4.165.1
capnproto: upgrade 0.10.2 -> 0.10.3
libass: upgrade 0.16.0 -> 0.17.0
ctags: upgrade 5.9.20221120.0 -> 5.9.20221127.0
libio-socket-ssl-perl: upgrade 2.076 -> 2.077
python3-grpcio-tools: upgrade 1.51.0 -> 1.51.1
python3-asyncinotify: upgrade 2.0.8 -> 3.0.1
python3-grpcio: upgrade 1.51.0 -> 1.51.1
opensc: upgrade 0.22.0 -> 0.23.0
python3-ipython: upgrade 8.6.0 -> 8.7.0
ply: upgrade 2.2.0 -> 2.3.0
python3-apt: upgrade 2.3.0 -> 2.5.0
poppler: upgrade 22.11.0 -> 22.12.0
python3-asttokens: upgrade 2.1.0 -> 2.2.0
python3-cbor2: upgrade 5.4.3 -> 5.4.5
python3-geomet: upgrade 0.3.0 -> 1.0.0
python3-google-api-core: upgrade 2.10.2 -> 2.11.0
python3-google-api-python-client: upgrade 2.66.0 -> 2.68.0
python3-path: upgrade 16.5.0 -> 16.6.0
python3-google-auth: upgrade 2.14.1 -> 2.15.0
zabbix: upgrade 6.2.4 -> 6.2.5
xmlsec1: upgrade 1.2.36 -> 1.2.37
smcroute: upgrade 2.5.5 -> 2.5.6
python3-protobuf: upgrade 4.21.9 -> 4.21.10
python3-traitlets: upgrade 5.5.0 -> 5.6.0
python3-twine: upgrade 4.0.1 -> 4.0.2
python3-web3: upgrade 5.31.1 -> 5.31.2
python3-ujson: upgrade 5.5.0 -> 5.6.0
ctags: upgrade 5.9.20221127.0 -> 5.9.20221204.0
dnsmasq: upgrade 2.87 -> 2.88
flatbuffers: upgrade 22.11.23 -> 22.12.06
nbdkit: upgrade 1.33.3 -> 1.33.4
hwdata: upgrade 0.364 -> 0.365
evolution-data-server: update 3.46.1 -> 3.46.2
xfstests: upgrade 2022.11.06 -> 2022.11.27
python3-protobuf: upgrade 4.21.10 -> 4.21.11
python3-traitlets: upgrade 5.6.0 -> 5.7.0
python3-redis: upgrade 4.3.5 -> 4.4.0
python3-web3: upgrade 5.31.2 -> 5.31.3
python3-asttokens: upgrade 2.2.0 -> 2.2.1
python3-cbor2: upgrade 5.4.5 -> 5.4.6
python3-google-api-python-client: upgrade 2.68.0 -> 2.69.0
python3-gmpy2: upgrade 2.1.2 -> 2.1.3
python3-multidict: upgrade 6.0.2 -> 6.0.3
python3-watchdog: upgrade 2.1.9 -> 2.2.0
python3-pychromecast: upgrade 13.0.1 -> 13.0.2
python3-pymisp: upgrade 2.4.165.1 -> 2.4.166
python3-pytest-xdist: upgrade 3.0.2 -> 3.1.0
python3-yarl: upgrade 1.8.1 -> 1.8.2
zabbix: upgrade 6.2.5 -> 6.2.6
python3-yamlloader: upgrade 1.1.0 -> 1.2.2
tio: upgrade 2.3 -> 2.4
ctags: upgrade 5.9.20221204.0 -> 6.0.20221218.0
dash: upgrade 0.5.11.5 -> 0.5.12
nanopb: upgrade 0.4.6.4 -> 0.4.7
libio-socket-ssl-perl: upgrade 2.077 -> 2.078
libfile-slurper-perl: upgrade 0.013 -> 0.014
protobuf: upgrade 3.21.10 -> 3.21.12
python3-alembic: upgrade 1.8.1 -> 1.9.0
nano: upgrade 7.0 -> 7.1
python3-gmpy2: upgrade 2.1.3 -> 2.1.5
python3-eth-account: upgrade 0.7.0 -> 0.8.0
python3-google-api-python-client: upgrade 2.69.0 -> 2.70.0
python3-protobuf: upgrade 4.21.11 -> 4.21.12
python3-pycares: upgrade 4.2.2 -> 4.3.0
python3-pycurl: upgrade 7.45.1 -> 7.45.2
python3-pychromecast: upgrade 13.0.2 -> 13.0.4
python3-pyproj: upgrade 3.4.0 -> 3.4.1
python3-pydicti: upgrade 1.1.6 -> 1.2.0
python3-sentry-sdk: upgrade 1.11.1 -> 1.12.0
python3-traitlets: upgrade 5.7.0 -> 5.7.1
tio: upgrade 2.4 -> 2.5
python3-sqlalchemy: upgrade 1.4.44 -> 1.4.45
xfsdump: upgrade 3.1.11 -> 3.1.12
python3-isort: upgrade 5.10.1 -> 5.11.3
xfstests: upgrade 2022.11.27 -> 2022.12.11
ctags: upgrade 6.0.20221218.0 -> 6.0.20221225.0
gst-editing-services: upgrade 1.20.4 -> 1.20.5
logcheck: upgrade 1.3.24 -> 1.4.0
memtester: upgrade 4.5.1 -> 4.6.0
libmime-types-perl: upgrade 2.22 -> 2.23
metacity: upgrade 3.46.0 -> 3.46.1
python3-alembic: upgrade 1.9.0 -> 1.9.1
xfstests: upgrade 2022.12.11 -> 2022.12.18
python3-cytoolz: upgrade 0.12.0 -> 0.12.1
python3-asgiref: upgrade 3.5.2 -> 3.6.0
python3-autobahn: upgrade 22.7.1 -> 22.12.1
python3-coverage: upgrade 6.5.0 -> 7.0.1
python3-bitarray: upgrade 2.6.0 -> 2.6.1
python3-imageio: upgrade 2.22.4 -> 2.23.0
python3-isort: upgrade 5.11.3 -> 5.11.4
python3-multidict: upgrade 6.0.3 -> 6.0.4
python3-traitlets: upgrade 5.7.1 -> 5.8.0
python3-pymisp: upgrade 2.4.166 -> 2.4.167
python3-sentry-sdk: upgrade 1.12.0 -> 1.12.1
python3-supervisor: upgrade 4.2.4 -> 4.2.5
wolfssl: upgrade 5.5.3 -> 5.5.4
remmina: upgrade 1.4.28 -> 1.4.29
ser2net: upgrade 4.3.10 -> 4.3.11
tesseract: upgrade 5.2.0 -> 5.3.0
network-manager-applet: upgrade 1.26.0 -> 1.30.0
byacc: upgrade 20221106 -> 20221229
ctags: upgrade 6.0.20221225.0 -> 6.0.20230101.0
flashrom: upgrade 1.2 -> 1.2.1
fontforge: upgrade 20220308 -> 20230101
hunspell: upgrade 1.7.1 -> 1.7.2
libmime-types-perl: upgrade 2.23 -> 2.24
libnet-dns-perl: upgrade 1.35 -> 1.36
tepl: upgrade 6.2.0 -> 6.4.0
tcpdump: upgrade 4.99.1 -> 4.99.2
traceroute: upgrade 2.1.0 -> 2.1.1
openwsman: upgrade 2.7.1 -> 2.7.2
pcsc-tools: upgrade 1.6.0 -> 1.6.1
poppler: upgrade 22.12.0 -> 23.01.0
rsnapshot: upgrade 1.4.4 -> 1.4.5
tree: upgrade 2.0.4 -> 2.1.0
python3-bidict: upgrade 0.22.0 -> 0.22.1
python3-bitarray: upgrade 2.6.1 -> 2.6.2
python3-dateparser: upgrade 1.1.4 -> 1.1.5
python3-lz4: upgrade 4.0.2 -> 4.3.2
python3-mock: upgrade 4.0.3 -> 5.0.0
python3-pillow: upgrade 9.3.0 -> 9.4.0
python3-pydantic: upgrade 1.10.2 -> 1.10.4
python3-pyephem: upgrade 4.1.3 -> 4.1.4
python3-xlsxwriter: upgrade 3.0.3 -> 3.0.5
python3-xxhash: upgrade 3.1.0 -> 3.2.0
    dnf-plugins/rpm.py: Fix grammar when RPM_PREFER_ELF_ARCH doesn't exist.
Xiangyu Chen (1):
lldpd: add ptest for lldpd package
Yi Zhao (13):
libpwquality: set correct pam plugin directory
ostree: add runtime dependency bubblewrap for PACKAGECONFIG[selinux]
ostree: fix selinux policy rebuild error on first deployment
frr: upgrade 8.3.1 -> 8.4.1
open-vm-tools: upgrade 12.1.0 -> 12.1.5
libtdb: upgrade 1.4.3 -> 1.4.7
libldb: upgrade 2.3.4 -> 2.6.1
libtalloc: upgrade 2.3.3 -> 2.3.4
libtevent: upgrade 0.10.2 -> 0.13.0
    samba: upgrade 4.14.14 -> 4.17.4
krb5: upgrade 1.17.2 -> 1.20.1
grubby: update to latest git rev
grubby: drop version 8.40
Zheng Qiu (1):
python3-inotify: add ptest
persianpros (1):
samba: Remove samba related PYTHONHASHSEED patches and use export function
zhengrq.fnst@fujitsu.com (15):
python3-pymodbus: upgrade 3.0.0 -> 3.0.2
python3-pywbemtools: upgrade 1.0.1 -> 1.1.0
python3-stevedore: upgrade 4.1.0 -> 4.1.1
ser2net: upgrade 4.3.9 -> 4.3.10
yelp-tools: upgrade 42.0 -> 42.1
python3-python-vlc: upgrade 3.0.16120 -> 3.0.18121
python3-sqlalchemy: upgrade 1.4.43 -> 1.4.44
python3-zopeinterface: upgrade 5.5.1 -> 5.5.2
python3-simplejson: upgrade 3.17.6 -> 3.18.0
python3-pywbemtools: upgrade 1.0.1 -> 1.1.1
python3-redis: upgrade 4.3.4 -> 4.3.5
python3-texttable: upgrade 1.6.4 -> 1.6.7
python3-sentry-sdk: upgrade 1.9.10 -> 1.11.1
python3-twitter: upgrade 4.10.1 -> 4.12.1
python3-termcolor: upgrade 2.1.0 -> 2.1.1
meta-security: 2aa48e6f4e..f991b20f56:
Alex Kiernan (1):
bubblewrap: Update 0.6.2 -> 0.7.0
Armin Kuster (2):
python3-privacyidea: update to 2.7.4
chipsec: update to 1.9.1
Michael Haener (1):
tpm2-tools: update to 5.3
meta-arm: d5f132b199..5c42f084f7:
Adam Johnston (1):
arm/trusted-services: Fix 'no such file' when building libts
Adrian Herrera (2):
atp: decouple m5readfile from m5ops
atp: move m5readfile to meta-gem5
Adrián Herrera Arcila (5):
atp: fix failing test_readme
gem5: support for EXTRAS
atp: separate recipe for gem5 models
atp: fix machine overrides in recipes
ci: add meta-atp to check-layers
David Bagonyi (1):
meta-arm-toolchain: Drop calls to datastore finalize
Diego Sueiro (2):
arm/classes: Introduce apply_local_src_patches bbclass
arm/trusted-firmware-m: Fix local source patches application
Emekcan (1):
arm/fvp: Upgrade Corstone1000 FVP
Emekcan Aras (6):
arm-bsp/documentation: corstone1000: update the user guide
arm/optee: Move optee-3.18 patches
arm/optee: support optee 3.19
arm-bsp/optee-os: Adds 3.19 bbappend
arm-bsp/optee-os: N1SDP support for optee-os 3.19
arm/qemuarm-secureboot: pin optee-os version
Jon Mason (5):
arm-bsp/trusted-services: rename bbappends with git version
arm/trusted-services: limit the ts compatible machines
arm-bsp/trusted-services: add n1sdp support
arm/trusted-firmware-m: update to 1.6.1
CI: define DEFAULT_TAG and CPU_REQUEST
Khem Raj (1):
gn: Replace lfs64 functions with original counterparts
Mohamed Omar Asaker (5):
arm-bsp/trusted-services: corstone1000: Use the stateless platform service calls
arm-bsp/trusted-firmware-m: Bump TFM to v1.7
arm-bsp/trusted-firmware-m: corstone1000: TFM 1.7
arm-bsp/musca_b1: Edit the platform name
arm-bsp/trusted-firmware-m: Remove TF-M 1.6 recipe
Peter Hoyes (3):
arm/fvp: Backport shlex.join from Python 3.8
arm/fvpboot: Disable timing annotation by default
arm/classes: Ensure patch files are sorted in apply_local_src_patches
Robbie Cao (1):
arm/fvp-base-r-aem: upgrade to version 11.20.15
Ross Burton (17):
CI: revert a meta-clang change which breaks pixman (thus, xserver)
CI: add variables needed for k8s runners
CI: add tags to all jobs
CI: no need to install telnet
CI: fix builds with clang
CI: use the .setup fragment in machine-coverage
arm/fvp-base-a-aem: upgrade to 11.20.15
arm-bsp/edk2-firmware: allow clang builds on juno
ci/get-binary-toolchains: rewrite, slightly
arm-bsp/documentation: update fvp-base documentation to use runfvp
CI: use qemuarm64 for pending-updates report job
meta-atp: remove
meta-gem5: remove
arm/fvp-envelope: name the FVP tarballs for checksums
arm/fvp-envelope: update HOMEPAGE
arm/fvp-base-a-aem: add support for aarch64 binaries
CI: don't pin fvp-base jobs to x86-64
poky: 44bb88cc86..0ce159991d:
Alejandro Hernandez Samaniego (6):
baremetal-image: Avoid overriding qemu variables from IMAGE_CLASSES
rust: Enable building rust from stable, beta and nightly channels
rust: Enable baremetal targets
baremetal-helloworld: Enable x86 and x86-64 ports
baremetal-helloworld: Move from skeleton to recipes-extended matching what rust-hello-world is doing
oe-selftest: Add baremetal toolchain test
Alex Kiernan (20):
rust: Install target.json for target rustc
rust: update 1.65.0 -> 1.66.0
oeqa/runtime/rust: Add basic compile/run test
libstd-rs: Merge .inc into .bb
libstd-rs: Move source directory to library/test
rust-llvm: Merge .inc into .bb
rust-llvm: Update LLVM_VERSION to match embedded version
packagegroup-rust-sdk-target: Add Rust SDK target packagegroup
packagegroup-core-sdk: Add SDK toolchain language selection support
rust: Merge .inc into .bb
rust: Move musl-x86 fix for `__stack_chk_fail_local` to rust-source
cargo: Merge .inc into .bb
cargo: Extend DEBUG_PREFIX_MAP to cover vendor
cargo: Include crossbeam-utils patch
cargo: Drop exclude from world
packagegroup-rust-sdk-target: Add cargo
oeqa/runtime/rust: Add cargo test
classes: image: Set empty weak default IMAGE_LINGUAS
default-distrovars: Include "c" in IMAGE_LINGUAS for glibc
rust: Merge all rustc-source patches into rust-source.inc
Alex Stewart (2):
lsof: add update-alternatives logic
opkg: upgrade to version 0.6.1
Alexander Kanavin (155):
elfutils: update 0.187 -> 0.188
rsync: update 3.2.5 -> 3.2.7
swig: update 4.0.2 -> 4.1.0
tcl: update 8.6.11 -> 8.6.12
quota: update 4.06 -> 4.09
shadow: update 4.12.3 -> 4.13
texinfo: update 6.8 -> 7.0
libhandy: update 1.6.3 -> 1.8.0
xf86-input-mouse: update 1.9.3 -> 1.9.4
flac: update 1.4.0 -> 1.4.2
icu: update 71.1 -> 72-1
libgpg-error: update 1.45 -> 1.46
popt: update 1.18 -> 1.19
vte: update 0.68.0 -> 0.70.1
webkitgtk: update 2.36.7 -> 2.38.2
man-db: update 2.10.2 -> 2.11.1
gawk: update 5.1.1 -> 5.2.1
unfs: update 0.9.22 -> 0.10.0
qemu-helper: depend on unfs3 and pseudo directly
runqemu: do not hardcode the ip address of the nfs server when using tap
selftest/runqemu: reenable the nfs rootfs test
glibc-tests: correctly pull in the actual tests when installing -ptest package
python3: fix tests on x86 (32 bit)
ptest-packagelists.inc: do not run valgrind ptests on 32 bit x86
python3: use the standard shell version of python3-config
python3targetconfig.bbclass: use PYTHONPATH to point to the target config
bitbake: fetch2/wget.py: correctly match versioned directories
devtool/upgrade: correctly handle recipes where S is a subdir of upstream tree
python3-numpy: fix upstream version check
python3-poetry-core: update 1.3.2 -> 1.4.0
tcl: update 8.6.12 -> 8.6.13
libnewt: update 0.52.21 -> 0.52.23
libxdmcp: update 1.1.3 -> 1.1.4
libxpm: update 3.5.13 -> 3.5.14
libxrandr: update 1.5.2 -> 1.5.3
bluez: update 5.65 -> 5.66
libxcrypt: update PV to match SRCREV
python3-dbusmock: update 0.28.4 -> 0.28.6
ruby: merge .inc into .bb
ruby: update 3.1.2 -> 3.1.3
ghostscript: update 9.56.1 -> 10.0.0
tzdata: update 2022d -> 2022g
systemtap: upgrade 4.7 -> 4.8
gnupg: upgrade 2.3.7 -> 2.3.8
ptest-packagelists.inc: correctly assign fast and slow tests
ovmf: update edk2-stable202208 -> edk2-stable202211
llvm: update 15.0.4 -> 15.0.6
tcmode-default.inc: set LLVMVERSION to a major version wildcard
cmake: update 3.24.2 -> 3.25.1
python3-native: further tweak to sysconfig.py to find python includes correctly
libslirp: add recipe to continue slirp support in qemu
qemu: update 7.1.0 -> 7.2.0
systemd: update 251.8 -> 252.4
dpkg: update 1.21.9 -> 1.21.13
python3-installer: update 0.5.1 -> 0.6.0
python3: update 3.11.0 -> 3.11.1
weston: update 11.0.0 -> 11.0.1
xhost: update 1.0.8 -> 1.0.9
xinit: update 1.4.1 -> 1.4.2
xkbcomp: update 1.4.5 -> 1.4.6
xprop: update 1.2.5 -> 1.2.6
xset: update 1.2.4 -> 1.2.5
xvinfo: update 1.1.4 -> 1.1.5
xf86-video-vesa: update 2.5.0 -> 2.6.0
libice: update 1.0.10 -> 1.1.1
libxcomposite: update 0.4.5 -> 0.4.6
libxdamage: update 1.1.5 -> 1.1.6
libxres: update 1.2.1 -> 1.2.2
libxscrnsaver: update 1.2.3 -> 1.2.4
libxv: update 1.0.11 -> 1.0.12
jquery: upgrade 3.6.1 -> 3.6.2
libmodule-build-perl: update 0.4231 -> 0.4232
python3-chardet: upgrade 5.0.0 -> 5.1.0
libarchive: upgrade 3.6.1 -> 3.6.2
stress-ng: upgrade 0.15.00 -> 0.15.01
vulkan: upgrade 1.3.231.1 -> 1.3.236.0
Revert "python3-native: further tweak to sysconfig.py to find python includes correctly"
conf/machine/include: add x86-64-v3 tunes (AVX, AVX2, BMI1, BMI2, F16C, FMA, LZCNT, MOVBE, XSAVE)
go: update 1.19.3 -> 1.19.4
vulkan-samples: update to latest revision
boost-build-native: update 1.80.0 -> 1.81.0
qemu: disable sporadically failing test-io-channel-command
devtool: process local files only for the main branch
libportal: add from meta-openembedded/meta-gnome
libportal: convert from gtk-doc to gi-docgen
epiphany: update 42.4 -> 43.0
qemux86-64: build for x86-64-v3 (2013 Haswell and later) rather than Core 2 from 2006
valgrind: disable tests that started failing after switching to x86-64-v3 target
glib-2.0: upgrade 2.74.3 -> 2.74.4
jquery: upgrade 3.6.2 -> 3.6.3
nasm: update 2.15.05 -> 2.16.01
ffmpeg: use nasm patched-in debug-prefix-map option to restore reproducibility
gtk+3: update 3.24.35 -> 3.24.36
libva-utils: update 2.16.0 -> 2.17.0
xcb-util: update 0.4.0 -> 0.4.1
gnupg: update 2.3.8 -> 2.4.0
libksba: update 1.6.2 -> 1.6.3
python3-pycryptodomex: upgrade 3.15.0 -> 3.16.0
piglit: upgrade to latest revision
python3-setuptools-scm: upgrade 7.0.5 -> 7.1.0
python3-attrs: upgrade 22.1.0 -> 22.2.0
webkitgtk: upgrade 2.38.2 -> 2.38.3
linux-firmware: upgrade 20221109 -> 20221214
harfbuzz: upgrade 5.3.1 -> 6.0.0
python3-pytz: upgrade 2022.6 -> 2022.7
strace: upgrade 6.0 -> 6.1
python3-pycryptodome: upgrade 3.15.0 -> 3.16.0
meson: upgrade 0.64.0 -> 1.0.0
xwayland: upgrade 22.1.5 -> 22.1.7
python3-pyrsistent: upgrade 0.19.2 -> 0.19.3
file: upgrade 5.43 -> 5.44
python3-subunit: upgrade 1.4.1 -> 1.4.2
python3-zipp: upgrade 3.10.0 -> 3.11.0
python3-cryptography: upgrade 38.0.3 -> 38.0.4
logrotate: upgrade 3.20.1 -> 3.21.0
python3-importlib-metadata: upgrade 5.0.0 -> 5.2.0
python3-numpy: upgrade 1.23.4 -> 1.24.1
xserver-xorg: upgrade 21.1.4 -> 21.1.6
puzzles: upgrade to latest revision
vte: upgrade 0.70.1 -> 0.70.2
libpsl: upgrade 0.21.1 -> 0.21.2
libtest-fatal-perl: upgrade 0.016 -> 0.017
python3-urllib3: upgrade 1.26.12 -> 1.26.13
python3-cryptography-vectors: upgrade 38.0.3 -> 38.0.4
python3-setuptools: upgrade 65.5.1 -> 65.6.3
libsdl2: upgrade 2.26.0 -> 2.26.1
python3-gitdb: upgrade 4.0.9 -> 4.0.10
diffoscope: upgrade 224 -> 230
python3-mako: upgrade 1.2.3 -> 1.2.4
python3-sphinx: upgrade 5.3.0 -> 6.0.0
libsolv: upgrade 0.7.22 -> 0.7.23
ruby: upgrade 3.1.3 -> 3.2.0
python3-lxml: upgrade 4.9.1 -> 4.9.2
python3-git: upgrade 3.1.29 -> 3.1.30
curl: upgrade 7.86.0 -> 7.87.0
kmscube: upgrade to latest revision
gobject-introspection: upgrade 1.72.0 -> 1.74.0
python3-dtschema: upgrade 2022.11 -> 2022.12
bash: upgrade 5.2.9 -> 5.2.15
kexec-tools: upgrade 2.0.25 -> 2.0.26
python3-jsonschema: upgrade 4.17.0 -> 4.17.3
python3-pycairo: upgrade 1.21.0 -> 1.23.0
nghttp2: upgrade 1.50.0 -> 1.51.0
python3-certifi: upgrade 2022.9.24 -> 2022.12.7
python3-hypothesis: upgrade 6.57.1 -> 6.61.0
libsndfile1: upgrade 1.1.0 -> 1.2.0
repo: upgrade 2.29.9 -> 2.31
libpcap: upgrade 1.10.1 -> 1.10.2
python3-jsonschema: depend on rfc3339-validator in all cases
python3-strict-rfc3339: remove the recipe
elfutils: do not error out on deprecated declarations
gcr3: limit version check to 3.x versions without odd-even rule
ncurses: restore version check as it's now again working due to release of 6.4
tiff: update 4.4.0 -> 4.5.0
qemu: fix recent reproducibility issues
Alexey Smirnov (1):
classes: make TOOLCHAIN more permissive for kernel
Anton Antonov (1):
rust: Do not use default compiler flags defined in CC crate
Antonin Godard (2):
busybox: always start do_compile with orig config files
busybox: rm temporary files if do_compile was interrupted
Atanas Bunchev (1):
qemu.rst: slirp port forwarding details
Bruce Ashfield (30):
linux-yocto-dev: bump to v6.0+
linux-yocto/5.19: update to v5.19.16
linux-yocto/5.15: update to v5.15.74
linux-yocto/5.19: update to v5.19.17
linux-yocto/5.15: update to v5.15.76
linux-yocto/5.19: cfg: intel and vesa updates
kern-tools: integrate ZFS speedup patch
linux-yocto-dev: bump to v6.1
kernel-devsrc: fix for v6.1+
lttng-modules: fix build for v6.1+
linux-yocto/5.19: security.cfg: remove configs which have been dropped
linux-yocto/5.15: update to v5.15.78
linux-yocto/5.19: fix CONFIG_CRYPTO_CCM mismatch warnings
linux-yocto/5.15: fix CONFIG_CRYPTO_CCM mismatch warnings
linux-yocto/5.19: fix elfutils run-backtrace-native-core ptest failure
linux-libc-headers: add 6.x fetch location
linux-libc-headers: bump to 6.1
linux-yocto/5.19: fix perf build with clang
linux-yocto/5.15: ltp and squashfs fixes
linux-yocto: introduce v6.1 reference kernel recipes
linux-yocto/5.15: fix perf build with clang
linux-yocto/5.15: libbpf: Fix build warning on ref_ctr_off
linux-yocto/5.15: update to v5.15.84
linux-yocto/6.1: update to v6.1.1
linux-yocto/5.15: powerpc: Fix reschedule bug in KUAP-unlocked user copy
linux-yocto/5.19: powerpc: Fix reschedule bug in KUAP-unlocked user copy
linux-yocto/6.1: update to v6.1.3
linux-yocto/6.1: cfg: remove CONFIG_ARM_CRYPTO
yocto-bsps/5.15: update to v5.15.78
linux-yocto/5.15: update to v5.15.80
Carlos Alberto Lopez Perez (3):
xwayland: libxshmfence is needed when dri3 is enabled
recipes: Enable nativesdk for gperf, unifdef, gi-docgen and its dependencies
mesa-gl: gallium is required when enabling x11
Changqing Li (2):
base.bbclass: Fix way to check ccache path
sqlite3: upgrade 3.40.0 -> 3.40.1
Charlie Johnston (1):
opkg: ensure opkg uses private gpg.conf when applying keys.
Chee Yang Lee (1):
migration-guides: add release-notes for 4.1.1
Chen Qi (10):
kernel.bbclass: make KERNEL_DEBUG_TIMESTAMPS work at rebuild
resolvconf: make it work
dhcpcd: fix to work with systemd
bitbake: command.py: cleanup bb.cache.parse_recipe
psplash: consider the situation of psplash not exist for systemd
bc: extend to nativesdk
rm_work: adjust dependency to make do_rm_work_all depend on do_rm_work
selftest: allow '-R' and '-r' be used together
dhcpcd: backport two patches to fix runtime error
libseccomp: fix typo in DESCRIPTION
Christian Eggers (1):
boost: add url lib
David Bagonyi (1):
u-boot: Fix u-boot signing when building with multiple u-boot configs
Dmitry Baryshkov (2):
linux-firmware: upgrade 20221012 -> 20221109
linux-firmware: add new fw file to ${PN}-qcom-adreno-a530
Enguerrand de Ribaucourt (1):
bitbake-layers: fix a typo
Enrico Jörns (1):
sstatesig: emit more helpful error message when not finding sstate manifest
Enrico Scholz (1):
sstate: show progress bar again
Fabre Sébastien (1):
u-boot: Add /boot in SYSROOT_DIRS
Frank de Brabander (4):
bitbake: README: Improve explanation about running the testsuite
bitbake: bin/utils: Ensure locale en_US.UTF-8 is available on the system
bitbake: process: log odd unlink events with bitbake.sock
bitbake: README: add required python version for bitbake
Harald Seiler (1):
opkg: Set correct info_dir and status_file in opkg.conf
Jagadeesh Krishnanjanappa (1):
qemuboot.bbclass: make sure runqemu boots bundled initramfs kernel image
Jan Kircher (1):
toolchain-scripts: compatibility with unbound variable protection
Javier Tia (1):
poky.conf: Add Fedora 36 as supported distro
Joe Slater (2):
python3: Fix CVE-2022-37460
libarchive: fix CVE-2022-36227
Jose Quaresma (2):
Revert "gstreamer1.0: disable flaky gstbin:test_watch_for_state_change test"
gstreamer1.0: Fix race conditions in gstbin tests
Joshua Watt (4):
qemu-helper-native: Correctly pass program name as argv[0]
bitbake: cooker: Use event to terminate parser threads
bitbake: cooker: Start sync thread a little earlier
bitbake: bitbake: Convert to argparse
Kai Kang (4):
xorg-lib-common.inc: set default value of XORG_EXT
libx11-compose-data: 1.6.8 -> 1.8.3
libx11: 1.8.1 -> 1.8.3
libsm: 1.2.3 -> 1.2.4
Kasper Revsbech (1):
bitbake: fetch2/wget: handle username/password in uri
Khem Raj (47):
rsync: Delete pedantic errors re-ordering patch
pseudo: Disable LFS on 32bit arches
libxkbcommon: Extend to build native package
iso-codes: Extend to build native packages
xkeyboard-config: Extend to build native package
bluez5: enable position independent executables flag
rpcsvc-proto: Use autoconf knob to enable largefile support
gptfdisk: Enable largefile support functions
libpcre2: Upgrade to 10.42
erofs-utils: Convert from off64_t to off_t
pseudo: Remove 64bit time_t flags
unfs3: Define off64_t in terms of off_t on musl
acpid: Fix largefile enabled build
efivar: Replace off64_t with off_t
ltp: Fix largefile support
acl: Enable largefile support by default
libpciaccess: Do not use 64bit functions for largefile support
mdadm: Use _FILE_OFFSET_BITS to use largefile support
btrfs-tools: Do not use 64bit functions for largefile support
e2fsprogs: Do not use 64bit functions for largefile support
libbsd: Fix build with largefile support
gpgme: Fix build with largefile support
virglrenderer: Replace lseek64 with lseek
nfs-utils: Replace statfs64 with statfs
alsa-utils: Replace off64_t with off_t
lttng-tools: Fix build with largefile support
strace: Add knob to enable largefile support
numactl: Enable largefile support
qemu: Fix build with largefile support
systemd: Fix 252 release build on musl
rust: Do not use open64 on musl in getrandom crate
rust,libstd-rs: Fix build with latest musl
rust-llvm: Fix build on latest musl
cargo: Do not use open64 on musl anymore
llvm: Do not use lseek64
strace: Replace off64_t with off_t in sync_file_range.c test
vulkan-samples: Do not use LFS64 APIs in spdlog
pulseaudio: Do not use 64bit time_t flags
musl: Update to latest on tip of trunk
rust: Fix build with 64bit time_t
stress-ng: Do not enforce gold linker
time64.inc: Add GLIBC_64BIT_TIME_FLAGS on ppc/x86 as well
time64: Remove leading whitespace from GLIBC_64BIT_TIME_FLAGS
mpg123: Enable largefile support
site/powerpc32-linux: Do not cache statvfs64 across glibc and musl
tiff: Add packageconfig knob for webp
site/common-musl: Set ac_cv_sys_file_offset_bits default to 64
Lee Chee Yang (1):
migration-guides: add release-notes for 4.0.6
Luca Boccassi (2):
systemd: refresh patch to remove fuzz introduced by rebase on v252
systemd: ship pcrphase/measure tools and units in systemd-extra-utils
Luis (1):
rm_work.bbclass: use HOSTTOOLS 'rm' binary exclusively
Marek Vasut (5):
bitbake: fetch2/git: Prevent git fetcher from fetching gitlab repository metadata
package_rpm: Fix Linux 6.1.0 perf 1.0 version mistranslation
systemd: Make importd depend on glib-2.0 again
bitbake: bitbake-user-manual: Document override :append, :prepend, :remove order
bitbake: fetch2/git: Clarify the meaning of namespace
Markus Volk (12):
ell: upgrade 0.53 -> 0.54
libsdl2: update 2.24.2 -> 2.26.0
graphene: import from meta-oe
gtk4: import recipe from meta-gnome
gcr: rename gcr -> gcr3
gcr: add recipe for gcr-4, needed to build with gtk4
epiphany: use gcr3
gtk4: add tracker-miners runtime dependency
python3-dbusmock: allow to build native
gtk4: update 4.8.2 -> 4.8.3
gcr3: update 3.40.0 -> 3.41.1
librsvg: enable vapi build
Marta Rybczynska (2):
efibootmgr: update compilation with musl
cve-update-db-native: avoid incomplete updates
Martin Jansa (4):
libxml2: upgrade test data from 20080827 to 20130923
nativesdk-rpm: export RPM_ETCCONFIGDIR and MAGIC in environment like RPM_CONFIGDIR
nativesdk-rpm: don't create wrappers for WRAPPER_TOOLS
tune-x86-64-v3.inc: set QEMU_EXTRAOPTIONS like other tune-* files
Mathieu Dubois-Briand (1):
dbus: Add missing CVE product name
Michael Halstead (1):
uninative: Upgrade to 3.8.1 to include libgcc
Michael Opdenacker (34):
manuals: add missing references to classes
manuals: fix paragraphs with the "inherit" word
ref-manual/classes.rst: remove reference to sip.bbclass
manuals: simplify .gitignore files
manuals: split dev-manual/common-tasks.rst
dev-manual/sbom.rst: minor corrections
bitbake: bitbake-user-manual: update references to Yocto Project manual
bitbake.conf: remove SERIAL_CONSOLE variable
bitbake: bitbake-user-manual: add reference to bitbake git repository
ref-manual: add references to variables only documented in the BitBake manual
manuals: add reference to yocto-docs git repository to page footer
manuals: add missing references to variables
manuals: add missing SPDX license header to source files
manuals: fix double colons
ref-manual/resources.rst: fix formating
ref-manual: update references to release notes
manual: improve documentation about using external toolchains
ref-manual/images.rst: fix unnumbered list
manuals: define proper numbered lists
manuals: final removal of SERIAL_CONSOLE variable
ref-manual/resources.rst: improve description of mailing lists
ref-manual/system-requirements.rst: update buildtools instructions
manuals: create references to buildtools
documentation/poky.yaml.in: update minimum python version to 3.8
manuals: prepare 4.2 migration notes
bitbake: bitbake-user-manual: double colon fix
bitbake: bitbake-user-manual: remove "OEBasic" signature generator
migration-guides: fix 4.2 migration note issues
toaster-manual: fix description of introduction video
ref-manual/classes.rst: remove .bbclass from section titles
manuals: simplify references to classes
migration-1.6.rst: fix redundant reference
ref-manual/system-requirements.rst: recommend buildtools for not supported distros
.gitignore: ignore files generated by Toaster
Mikko Rapeli (5):
qemurunner.py: support setting slirp host IP address
runqemu: limit slirp host port forwarding to localhost 127.0.0.1
qemurunner.py: use IP address from command line
dev-manual/runtime-testing.rst: fix oeqa runtime test path
runqemu: add QB_SETUP_CMD and QB_CLEANUP_CMD
Mingli Yu (8):
tcl: correct the header location in tcl.pc
python3: make tkinter available when enabled
sudo: add selinux and audit PACKAGECONFIG
iproute2: add selinux PACKAGECONFIG
util-linux: add selinux PACKAGECONFIG
cronie: add selinux PACKAGECONFIG
psmisc: add selinux PACKAGECONFIG
gcr: add opengl to REQUIRED_DISTRO_FEATURES
Narpat Mali (2):
ffmpeg: fix for CVE-2022-3964
ffmpeg: fix for CVE-2022-3965
Ola x Nilsson (4):
kbd: Don't build tests
glibc: Add ppoll fortify symbol for 64 bit time_t
insane: Add QA check for 32 bit time and file offset functions
time64.conf: Include to enable 64 bit time flags
Ovidiu Panait (1):
kernel.bbclass: remove empty module directories to prevent QA issues
Patrick Williams (1):
kernel-fitimage: reduce dependency to the cpio
Pavel Zhukov (1):
oeqa/rpm.py: Increase timeout and add debug output
Peter Kjellerstedt (1):
recipes, classes: Avoid adding extra whitespace to PACKAGESPLITFUNCS
Peter Marko (2):
externalsrc: fix lookup for .gitmodules
oeqa/selftest/externalsrc: add test for srctree_hash_files
Petr Kubizňák (1):
harfbuzz: remove bindir only if it exists
Petr Vorel (1):
iputils: update to 20221126
Polampalli, Archana (1):
libpam: fix CVE-2022-28321
Qiu, Zheng (3):
valgrind: remove most hidden tests for arm64
tiff: Security fix for CVE-2022-3970
vim: upgrade 9.0.0820 -> 9.0.0947
Quentin Schulz (4):
cairo: update patch for CVE-2019-6461 with upstream solution
cairo: fix CVE patches assigned wrong CVE number
docs: kernel-dev: faq: update tip on how to not include kernel in image
docs: migration-guides: migration-4.0: specify variable name change for kernel inclusion in image recipe
Randy MacLeod (1):
valgrind: skip the boost_thread test on arm
Ranjitsinh Rathod (1):
curl: Correct LICENSE from MIT-open-group to curl
Ravula Adhitya Siddartha (2):
linux-yocto/5.15: update genericx86* machines to v5.15.78
linux-yocto/5.19: update genericx86* machines to v5.19.17
Richard Purdie (97):
bitbake: cache/cookerdata: Move recipe parsing functions from cache to databuilder
bitbake: cache: Drop broken/unused code
bitbake: cache: Drop unused function
bitbake: server: Ensure cooker profiling works
bitbake: worker/runqueue: Reduce initial data transfer in workerdata
bitbake: cache: Drop support for not saving the cache file
bitbake: runqueue: Add further debug for sstate reuse issues
bitbake: runqueue: Fix race issues around hash equivalence and sstate reuse
bitbake: data/siggen: Switch to use frozensets and optimize
bitbake: data_smart: Add debugging for overrides stability issue
bitbake: utils: Allow to_boolean to support int values
base: Drop do_package base definition
bitbake: data: Drop obsolete pydoc/path code
bitbake: BBHandler: Remove pointless global variable declarations
bitbake: runqueue: Improve error message for missing multiconfig
bitbake: data_smart: Small cache reuse optimization
bitbake.conf: Simplify CACHE setting
oeqa/selftest/tinfoil: Add test for separate config_data with recipe_parse_file()
qemu: Ensure libpng dependency is deterministic
bitbake: data: Tweak code layout
bitbake: cache/siggen: Simplify passing basehash data into the cache
bitbake: siggen/cache: Optionally allow adding siggen hash data to the bitbake cache
bitbake: parse: Add support for addpylib conf file directive and BB_GLOBAL_PYMODULES
bitbake: cookerdata: Ensure layers use LAYERSERIES_COMPAT fairly
base: Switch to use addpylib directive and BB_GLOBAL_PYMODULES
devtool/friends: Use LAYERSERIES_CORENAMES when generating LAYERSERIES_COMPAT entries
scripts/checklayer: Update to match bitbake changes
yocto-check-layer: Allow OE-Core to be tested
bitbake: main: Add timestamp to server retry messages
bitbake: main/server: Add lockfile debugging upon server retry
poky/poky-tiny: Drop largefile mentions
lib/sstatesig: Drop OEBasic siggen
bitbake: siggen: Drop non-multiconfig aware siggen support
bitbake: build/siggen/runqueue: Drop do_setscene references
bitbake: bitbake: Bump minimum python version requirement to 3.8
sanity: Update minimum python version to 3.8
bitbake: main/process: Add extra sockname debugging
Revert "kernel-fitimage: reduce dependency to the cpio"
bitbake: siggen: Directly store datacaches reference
bitbake: bitbake: siggen/runqueue: Switch to using RECIPE_SIGGEN_INFO feature for signature dumping
bitbake: siggen: Add dummy dataCaches from task context/datastore
bitbake: build/siggen: Rework stamps functions
bitbake: siggen: Clarify which fn is meant
bitbake: ast/data/codeparser: Add dependencies from python module functions
bitbake: codeparser/data: Add vardepsexclude support to module dependency code
bitbake.conf: Add module function vardepsexclude entries
time64: Rename to a .inc file to match the others
bitbake: command: Add ping command
bitbake: cache: Allow compression of the data in SiggenRecipeInfo
bitbake: siggen: Minor code improvement
bitbake: server/process: Add bitbake.sock race handling
oeqa/concurrencytest: Add number of failures to summary output
python3-poetry-core: Fix determinism issue breaking reproducibility
bitbake: cache/siggen: Fix cache issues with signature handling
bitbake: event: builtins fix for 'd' deletion
bitbake: cooker: Ensure cache is cleared for partial resets
bitbake: tinfoil: Ensure CommandExit is handled
bitbake: cache: Drop reciever side counting for SiggenRecipeInfo
bitbake: knotty: Avoid looping with tracebacks
bitbake: event: Add enable/disable heartbeat code
bitbake: cooker/cookerdata: Rework the way the datastores are reset
bitbake: server/process: Improve exception and idle function logging
bitbake: command: Tweak finishAsyncCommand ordering to avoid races
bitbake: cooker: Ensure commands clean up any parser processes
bitbake: server/process: Improve idle loop exit code
bitbake: event: Always use threadlock
bitbake: server/process: Add locking around idle functions accesses
bitbake: server/process: Run idle commands in a separate idle thread
bitbake: knotty: Ping the server/cooker periodically
bitbake: cookerdata: Fix cache/reparsing issue
bitbake: cookerdata: Fix previous commit to use a string, not a generator
bitbake: command: Ensure that failure cases call finishAsyncComand
layer.conf: Update to use mickledore as the layer series name
layer.conf: Mark master as compatible with mickledore
bitbake: lib/bb: Update thread/process locks to use a timeout
package: Move fixup_perms function to bb function library
package: Move get_conffiles/files_from_filevars functions to lib
package: Move pkgdata handling functions to oe.packagedata
package: Move emit_pkgdata to packagedata.py
package: Move package functions to function library
package: Drop unused function and obsolete comment
package: Move mapping_rename_hook to packagedata function library
python3-cython: Use PACKAGESPLITFUNCS instead of PACKAGEBUILDPKGD
package: Drop support for PACKAGEBUILDPKGD function customisation
recipes/classes: Drop prepend/append usage with PACKAGESPLITFUNCS
bitbake: cooker: Rework the parsing results submission
bitbake: cooker: Clean up inotify idle handler
uninative-tarball: Add libgcc
patchelf: Add fix submitted upstream for uninative segfaults
bitbake: cooker/command: Drop async command handler indirection via cooker
bitbake: process/cooker/command: Fix currentAsyncCommand locking/races
uninative: Ensure uninative is enabled in all cases for BuildStarted event
qemux86-64: Reduce tuning to core2-64
bitbake: tinfoil: Don't wait for events indefinitely
bitbake: knotty: Improve shutdown handling
bitbake: cooker: Fix exit handling issues
bitbake: server/process: Move heartbeat to idle thread
Robert Andersson (1):
go-crosssdk: avoid host contamination by GOCACHE
Ross Burton (28):
build-appliance-image: Update to master head revision
lib/buildstats: fix parsing of trees with reduced_proc_pressure directories
combo-layer: remove unused import
combo-layer: dont use bb.utils.rename
combo-layer: add sync-revs command
libxml2: upgrade 2.9.14 -> 2.10.3
libxml2: add more testing
python3-packaging: upgrade to 22.0
python3-hatchling: remove python3-tomli DEPENDS
python3-cryptography: remove python3-tomli RDEPENDS
meson: drop redundant is_debianlike() patch
meson: always use meson subcommands
libepoxy: remove upstreamed patch
gtk+3: upgrade 3.24.34 -> 3.24.35
gtk+3: port to Meson
meson: no need to rebuild on install
at-spi2-core: clean up x11 enabling
at-spi2-core: disable API docs if x11 is disabled
gtk+3: fix reproducible builds
lsof: upgrade 4.96.4 -> 4.96.5
pango: upgrade 1.50.11 -> 1.50.12
python3-hatch-vcs: upgrade 0.2.0 -> 0.3.0
python3-hatchling: upgrade 1.11.1 -> 1.12.1
python3-pathspec: upgrade 0.10.1 -> 0.10.3
rm_work: handle non-existent stamps directory
oeqa/selftest/debuginfod: improve testcase
elfutils: disable deprecation errors in all builds, not just native
curl: don't enable debug builds
Ryan Eatmon (1):
go: Update reproducibility patch to fix panic errors
Sandeep Gundlupet Raju (3):
libdrm: Remove libdrm-kms package
kernel-fitimage: Adjust order of dtb/dtbo files
kernel-fitimage: Allow user to select dtb when multiple dtb exists
Saul Wold (1):
at: Change when files are copied
Sergei Zhmylev (1):
oeqa/qemurunner: implement vmdk images support
Tim Orling (7):
python3-hypothesis: upgrade 6.56.4 -> 6.57.1
at-spi2-core: upgrade 2.44.1 -> 2.46.0
mirrors.bbclass: update CPAN_MIRROR
libtry-tiny-perl: add recipe for 0.31
libtest-fatal-perl: add recipe for 0.016
libtest-warnings-perl: move from meta-perl
liburi-perl: upgrade 5.08 -> 5.17
Trevor Woerner (1):
local.conf.sample: update bbclass locations
Vincent Davis Jr (1):
mesa: enable glvnd support
Wang Mingyu (49):
btrfs-tools: upgrade 6.0 -> 6.0.1
libpipeline: upgrade 1.5.6 -> 1.5.7
btrfs-tools: upgrade 6.0.1 -> 6.0.2
bind: upgrade 9.18.8 -> 9.18.9
ccache: upgrade 4.7.2 -> 4.7.4
dropbear: upgrade 2022.82 -> 2022.83
libinput: upgrade 1.21.0 -> 1.22.0
libxft: upgrade 2.3.6 -> 2.3.7
mpfr: upgrade 4.1.0 -> 4.1.1
glib-2.0: upgrade 2.74.1 -> 2.74.3
libxcrypt-compat: upgrade 4.4.30 -> 4.4.33
patchelf: upgrade 0.16.1 -> 0.17.0
pciutils: upgrade 3.8.0 -> 3.9.0
shaderc: upgrade 2022.3 -> 2022.4
sqlite3: upgrade 3.39.4 -> 3.40.0
stress-ng: upgrade 0.14.06 -> 0.15.00
swig: upgrade 4.1.0 -> 4.1.1
texinfo: upgrade 7.0 -> 7.0.1
usbutils: upgrade 014 -> 015
xz: upgrade 5.2.7 -> 5.2.9
wayland-protocols: upgrade 1.28 -> 1.31
gnu-config: upgrade to latest revision
libfontenc: upgrade 1.1.6 -> 1.1.7
libpcre2: upgrade 10.40 -> 10.41
libpng: upgrade 1.6.38 -> 1.6.39
libxau: upgrade 1.0.10 -> 1.0.11
libxkbfile: upgrade 1.1.1 -> 1.1.2
libxshmfence: upgrade 1.3.1 -> 1.3.2
xrandr: upgrade 1.5.1 -> 1.5.2
boost: upgrade 1.80.0 -> 1.81.0
ell: upgrade 0.54 -> 0.55
git: upgrade 2.38.1 -> 2.39.0
help2man: upgrade 1.49.2 -> 1.49.3
iproute2: upgrade 6.0.0 -> 6.1.0
libmpc: upgrade 1.2.1 -> 1.3.1
makedepend: upgrade 1.0.7 -> 1.0.8
psmisc: upgrade 23.5 -> 23.6
xz: upgrade 5.2.9 -> 5.4.0
gstreamer1.0: upgrade 1.20.4 -> 1.20.5
bind: upgrade 9.18.9 -> 9.18.10
btrfs-tools: upgrade 6.0.2 -> 6.1
librepo: upgrade 1.14.5 -> 1.15.1
libsdl2: upgrade 2.26.1 -> 2.26.2
libva-utils: upgrade 2.17.0 -> 2.17.1
libxkbcommon: upgrade 1.4.1 -> 1.5.0
mpfr: upgrade 4.1.1 -> 4.2.0
dpkg: upgrade 1.21.13 -> 1.21.17
rxvt-unicode: upgrade 9.30 -> 9.31
virglrenderer: upgrade 0.10.3 -> 0.10.4
Xiangyu Chen (3):
grub: backport patches to fix CVE-2022-28736
openssh: remove RRECOMMENDS to rng-tools for sshd package
grub2: backport patch to fix CVE-2022-2601 CVE-2022-3775
Yoann Congal (2):
bitbake: Group and reorder options in bitbake help
bitbake: main: Move --buildfile help at the end of "Execution" group
leimaohui (1):
libpng: Enable NEON for aarch64 to ensure consistency with arm32.
pgowda (1):
binutils: Add patch to fix CVE-2022-4285
张忠山 (1):
bitbake: data_smart: Use regex consistently for override matching
meta-raspberrypi: 93dadf336c..896566aa92:
Carlos Alberto Lopez Perez (1):
weston: disable packageconfig options that fail to build with userland drivers
Khem Raj (2):
lirc: Drop upstreamed patch
linux-raspberrypi.inc: Weakly assign COMPATIBLE_MACHINE
Martin Jansa (2):
bluez5: update patches to apply on 5.66 version
layer.conf: update LAYERSERIES_COMPAT for mickledore
Vincent Davis Jr (5):
rpidistro-vlc,rpidistro-ffmpeg: update COMPATIBLE_HOST regex
rpidistro-vlc: upgrade 3.0.12 -> 3.0.17
rpi-default-providers: add libav and libpostproc
rpidistro-ffmpeg: upgrade 4.3.2 -> 4.3.4
rpidistro-ffmpeg: remove --enable-v4l2-request flag
Signed-off-by: Andrew Geissler <geissonator@yahoo.com>
Change-Id: Ied8537beedde0f83790e6e3595057db45f408107
Diffstat (limited to 'poky/bitbake/lib')
33 files changed, 1389 insertions, 1002 deletions
diff --git a/poky/bitbake/lib/bb/__init__.py b/poky/bitbake/lib/bb/__init__.py index 99cb5a0101..4e90964173 100644 --- a/poky/bitbake/lib/bb/__init__.py +++ b/poky/bitbake/lib/bb/__init__.py @@ -12,8 +12,8 @@ __version__ = "2.2.0" import sys -if sys.version_info < (3, 6, 0): - raise RuntimeError("Sorry, python 3.6.0 or later is required for this version of bitbake") +if sys.version_info < (3, 8, 0): + raise RuntimeError("Sorry, python 3.8.0 or later is required for this version of bitbake") class BBHandledException(Exception): diff --git a/poky/bitbake/lib/bb/build.py b/poky/bitbake/lib/bb/build.py index db706d0a3a..5a1727116a 100644 --- a/poky/bitbake/lib/bb/build.py +++ b/poky/bitbake/lib/bb/build.py @@ -772,44 +772,7 @@ def exec_task(fn, task, d, profile = False): event.fire(failedevent, d) return 1 -def stamp_internal(taskname, d, file_name, baseonly=False, noextra=False): - """ - Internal stamp helper function - Makes sure the stamp directory exists - Returns the stamp path+filename - - In the bitbake core, d can be a CacheData and file_name will be set. 
- When called in task context, d will be a data store, file_name will not be set - """ - taskflagname = taskname - if taskname.endswith("_setscene") and taskname != "do_setscene": - taskflagname = taskname.replace("_setscene", "") - - if file_name: - stamp = d.stamp[file_name] - extrainfo = d.stamp_extrainfo[file_name].get(taskflagname) or "" - else: - stamp = d.getVar('STAMP') - file_name = d.getVar('BB_FILENAME') - extrainfo = d.getVarFlag(taskflagname, 'stamp-extra-info') or "" - - if baseonly: - return stamp - if noextra: - extrainfo = "" - - if not stamp: - return - - stamp = bb.parse.siggen.stampfile(stamp, file_name, taskname, extrainfo) - - stampdir = os.path.dirname(stamp) - if cached_mtime_noerror(stampdir) == 0: - bb.utils.mkdirhier(stampdir) - - return stamp - -def stamp_cleanmask_internal(taskname, d, file_name): +def _get_cleanmask(taskname, mcfn): """ Internal stamp helper function to generate stamp cleaning mask Returns the stamp path+filename @@ -817,27 +780,14 @@ def stamp_cleanmask_internal(taskname, d, file_name): In the bitbake core, d can be a CacheData and file_name will be set. 
When called in task context, d will be a data store, file_name will not be set """ - taskflagname = taskname - if taskname.endswith("_setscene") and taskname != "do_setscene": - taskflagname = taskname.replace("_setscene", "") - - if file_name: - stamp = d.stampclean[file_name] - extrainfo = d.stamp_extrainfo[file_name].get(taskflagname) or "" - else: - stamp = d.getVar('STAMPCLEAN') - file_name = d.getVar('BB_FILENAME') - extrainfo = d.getVarFlag(taskflagname, 'stamp-extra-info') or "" - - if not stamp: - return [] - - cleanmask = bb.parse.siggen.stampcleanmask(stamp, file_name, taskname, extrainfo) - - return [cleanmask, cleanmask.replace(taskflagname, taskflagname + "_setscene")] - -def clean_stamp(task, d, file_name = None): - cleanmask = stamp_cleanmask_internal(task, d, file_name) + cleanmask = bb.parse.siggen.stampcleanmask_mcfn(taskname, mcfn) + taskflagname = taskname.replace("_setscene", "") + if cleanmask: + return [cleanmask, cleanmask.replace(taskflagname, taskflagname + "_setscene")] + return [] + +def clean_stamp_mcfn(task, mcfn): + cleanmask = _get_cleanmask(task, mcfn) for mask in cleanmask: for name in glob.glob(mask): # Preserve sigdata files in the stamps directory @@ -847,33 +797,46 @@ def clean_stamp(task, d, file_name = None): if name.endswith('.taint'): continue os.unlink(name) - return -def make_stamp(task, d, file_name = None): +def clean_stamp(task, d): + mcfn = d.getVar('BB_FILENAME') + clean_stamp_mcfn(task, mcfn) + +def make_stamp_mcfn(task, mcfn): + + basestamp = bb.parse.siggen.stampfile_mcfn(task, mcfn) + + stampdir = os.path.dirname(basestamp) + if cached_mtime_noerror(stampdir) == 0: + bb.utils.mkdirhier(stampdir) + + clean_stamp_mcfn(task, mcfn) + + # Remove the file and recreate to force timestamp + # change on broken NFS filesystems + if basestamp: + bb.utils.remove(basestamp) + open(basestamp, "w").close() + +def make_stamp(task, d): """ Creates/updates a stamp for a given task - (d can be a data dict or dataCache) """ - 
clean_stamp(task, d, file_name) + mcfn = d.getVar('BB_FILENAME') - stamp = stamp_internal(task, d, file_name) - # Remove the file and recreate to force timestamp - # change on broken NFS filesystems - if stamp: - bb.utils.remove(stamp) - open(stamp, "w").close() + make_stamp_mcfn(task, mcfn) # If we're in task context, write out a signature file for each task # as it completes - if not task.endswith("_setscene") and task != "do_setscene" and not file_name: - stampbase = stamp_internal(task, d, None, True) - file_name = d.getVar('BB_FILENAME') - bb.parse.siggen.dump_sigtask(file_name, task, stampbase, True) - -def find_stale_stamps(task, d, file_name=None): - current = stamp_internal(task, d, file_name) - current2 = stamp_internal(task + "_setscene", d, file_name) - cleanmask = stamp_cleanmask_internal(task, d, file_name) + if not task.endswith("_setscene"): + stampbase = bb.parse.siggen.stampfile_base(mcfn) + bb.parse.siggen.dump_sigtask(mcfn, task, stampbase, True) + + +def find_stale_stamps(task, mcfn): + current = bb.parse.siggen.stampfile_mcfn(task, mcfn) + current2 = bb.parse.siggen.stampfile_mcfn(task + "_setscene", mcfn) + cleanmask = _get_cleanmask(task, mcfn) found = [] for mask in cleanmask: for name in glob.glob(mask): @@ -887,38 +850,14 @@ def find_stale_stamps(task, d, file_name=None): found.append(name) return found -def del_stamp(task, d, file_name = None): - """ - Removes a stamp for a given task - (d can be a data dict or dataCache) - """ - stamp = stamp_internal(task, d, file_name) - bb.utils.remove(stamp) - -def write_taint(task, d, file_name = None): +def write_taint(task, d): """ Creates a "taint" file which will force the specified task and its dependents to be re-run the next time by influencing the value of its taskhash. - (d can be a data dict or dataCache) - """ - import uuid - if file_name: - taintfn = d.stamp[file_name] + '.' + task + '.taint' - else: - taintfn = d.getVar('STAMP') + '.' 
+ task + '.taint' - bb.utils.mkdirhier(os.path.dirname(taintfn)) - # The specific content of the taint file is not really important, - # we just need it to be random, so a random UUID is used - with open(taintfn, 'w') as taintf: - taintf.write(str(uuid.uuid4())) - -def stampfile(taskname, d, file_name = None, noextra=False): - """ - Return the stamp for a given task - (d can be a data dict or dataCache) """ - return stamp_internal(taskname, d, file_name, noextra=noextra) + mcfn = d.getVar('BB_FILENAME') + bb.parse.siggen.invalidate_task(task, mcfn) def add_tasks(tasklist, d): task_deps = d.getVar('_task_deps', False) diff --git a/poky/bitbake/lib/bb/cache.py b/poky/bitbake/lib/bb/cache.py index 988c596c39..ee924b2d2b 100644 --- a/poky/bitbake/lib/bb/cache.py +++ b/poky/bitbake/lib/bb/cache.py @@ -28,7 +28,7 @@ import shutil logger = logging.getLogger("BitBake.Cache") -__cache_version__ = "154" +__cache_version__ = "155" def getCacheFile(path, filename, mc, data_hash): mcspec = '' @@ -105,7 +105,7 @@ class CoreRecipeInfo(RecipeInfoCommon): self.tasks = metadata.getVar('__BBTASKS', False) - self.basetaskhashes = self.taskvar('BB_BASEHASH', self.tasks, metadata) + self.basetaskhashes = metadata.getVar('__siggen_basehashes', False) or {} self.hashfilename = self.getvar('BB_HASHFILENAME', metadata) self.task_deps = metadata.getVar('_task_deps', False) or {'tasks': [], 'parents': {}} @@ -238,6 +238,106 @@ class CoreRecipeInfo(RecipeInfoCommon): cachedata.fakerootlogs[fn] = self.fakerootlogs cachedata.extradepsfunc[fn] = self.extradepsfunc + +class SiggenRecipeInfo(RecipeInfoCommon): + __slots__ = () + + classname = "SiggenRecipeInfo" + cachefile = "bb_cache_" + classname +".dat" + # we don't want to show this information in graph files so don't set cachefields + #cachefields = [] + + def __init__(self, filename, metadata): + self.siggen_gendeps = metadata.getVar("__siggen_gendeps", False) + self.siggen_varvals = metadata.getVar("__siggen_varvals", False) + 
self.siggen_taskdeps = metadata.getVar("__siggen_taskdeps", False) + + @classmethod + def init_cacheData(cls, cachedata): + cachedata.siggen_taskdeps = {} + cachedata.siggen_gendeps = {} + cachedata.siggen_varvals = {} + + def add_cacheData(self, cachedata, fn): + cachedata.siggen_gendeps[fn] = self.siggen_gendeps + cachedata.siggen_varvals[fn] = self.siggen_varvals + cachedata.siggen_taskdeps[fn] = self.siggen_taskdeps + + # The siggen variable data is large and impacts: + # - bitbake's overall memory usage + # - the amount of data sent over IPC between parsing processes and the server + # - the size of the cache files on disk + # - the size of "sigdata" hash information files on disk + # The data consists of strings (some large) or frozenset lists of variables + # As such, we a) deduplicate the data here and b) pass references to the object at second + access (e.g. over IPC or saving into pickle). + + store = {} + save_map = {} + save_count = 1 + restore_map = {} + restore_count = {} + + @classmethod + def reset(cls): + # Needs to be called before starting new streamed data in a given process + # (e.g. 
writing out the cache again) + cls.save_map = {} + cls.save_count = 1 + cls.restore_map = {} + + @classmethod + def _save(cls, deps): + ret = [] + if not deps: + return deps + for dep in deps: + fs = deps[dep] + if fs is None: + ret.append((dep, None, None)) + elif fs in cls.save_map: + ret.append((dep, None, cls.save_map[fs])) + else: + cls.save_map[fs] = cls.save_count + ret.append((dep, fs, cls.save_count)) + cls.save_count = cls.save_count + 1 + return ret + + @classmethod + def _restore(cls, deps, pid): + ret = {} + if not deps: + return deps + if pid not in cls.restore_map: + cls.restore_map[pid] = {} + map = cls.restore_map[pid] + for dep, fs, mapnum in deps: + if fs is None and mapnum is None: + ret[dep] = None + elif fs is None: + ret[dep] = map[mapnum] + else: + try: + fs = cls.store[fs] + except KeyError: + cls.store[fs] = fs + map[mapnum] = fs + ret[dep] = fs + return ret + + def __getstate__(self): + ret = {} + for key in ["siggen_gendeps", "siggen_taskdeps", "siggen_varvals"]: + ret[key] = self._save(self.__dict__[key]) + ret['pid'] = os.getpid() + return ret + + def __setstate__(self, state): + pid = state['pid'] + for key in ["siggen_gendeps", "siggen_taskdeps", "siggen_varvals"]: + setattr(self, key, self._restore(state[key], pid)) + + def virtualfn2realfn(virtualfn): """ Convert a virtual file name to a real one + the associated subclass keyword @@ -280,75 +380,18 @@ def variant2virtual(realfn, variant): return "mc:" + elems[1] + ":" + realfn return "virtual:" + variant + ":" + realfn -def parse_recipe(bb_data, bbfile, appends, mc=''): - """ - Parse a recipe - """ - - bb_data.setVar("__BBMULTICONFIG", mc) - - bbfile_loc = os.path.abspath(os.path.dirname(bbfile)) - bb.parse.cached_mtime_noerror(bbfile_loc) - - if appends: - bb_data.setVar('__BBAPPEND', " ".join(appends)) - bb_data = bb.parse.handle(bbfile, bb_data) - return bb_data - - -class NoCache(object): - - def __init__(self, databuilder): - self.databuilder = databuilder - self.data = 
databuilder.data - - def loadDataFull(self, virtualfn, appends): - """ - Return a complete set of data for fn. - To do this, we need to parse the file. - """ - logger.debug("Parsing %s (full)" % virtualfn) - (fn, virtual, mc) = virtualfn2realfn(virtualfn) - bb_data = self.load_bbfile(virtualfn, appends, virtonly=True) - return bb_data[virtual] - - def load_bbfile(self, bbfile, appends, virtonly = False, mc=None): - """ - Load and parse one .bb build file - Return the data and whether parsing resulted in the file being skipped - """ - - if virtonly: - (bbfile, virtual, mc) = virtualfn2realfn(bbfile) - bb_data = self.databuilder.mcdata[mc].createCopy() - bb_data.setVar("__ONLYFINALISE", virtual or "default") - datastores = parse_recipe(bb_data, bbfile, appends, mc) - return datastores - - if mc is not None: - bb_data = self.databuilder.mcdata[mc].createCopy() - return parse_recipe(bb_data, bbfile, appends, mc) - - bb_data = self.data.createCopy() - datastores = parse_recipe(bb_data, bbfile, appends) - - for mc in self.databuilder.mcdata: - if not mc: - continue - bb_data = self.databuilder.mcdata[mc].createCopy() - newstores = parse_recipe(bb_data, bbfile, appends, mc) - for ns in newstores: - datastores["mc:%s:%s" % (mc, ns)] = newstores[ns] - - return datastores - -class Cache(NoCache): +# +# Cooker calls cacheValid on its recipe list, then either calls loadCached +# from its main thread or parse from separate processes to generate an up to +# date cache +# +class Cache(object): """ BitBake Cache implementation """ def __init__(self, databuilder, mc, data_hash, caches_array): - super().__init__(databuilder) - data = databuilder.data + self.databuilder = databuilder + self.data = databuilder.data # Pass caches_array information into Cache Constructor # It will be used later for deciding whether we @@ -356,7 +399,7 @@ class Cache(NoCache): self.mc = mc self.logger = PrefixLoggerAdapter("Cache: %s: " % (mc if mc else "default"), logger) self.caches_array = 
caches_array - self.cachedir = data.getVar("CACHE") + self.cachedir = self.data.getVar("CACHE") self.clean = set() self.checked = set() self.depends_cache = {} @@ -366,20 +409,12 @@ class Cache(NoCache): self.filelist_regex = re.compile(r'(?:(?<=:True)|(?<=:False))\s+') if self.cachedir in [None, '']: - self.has_cache = False - self.logger.info("Not using a cache. " - "Set CACHE = <directory> to enable.") - return - - self.has_cache = True + bb.fatal("Please ensure CACHE is set to the cache directory for BitBake to use") def getCacheFile(self, cachefile): return getCacheFile(self.cachedir, cachefile, self.mc, self.data_hash) def prepare_cache(self, progress): - if not self.has_cache: - return 0 - loaded = 0 self.cachefile = self.getCacheFile("bb_cache.dat") @@ -418,9 +453,6 @@ class Cache(NoCache): return loaded def cachesize(self): - if not self.has_cache: - return 0 - cachesize = 0 for cache_class in self.caches_array: cachefile = self.getCacheFile(cache_class.cachefile) @@ -486,7 +518,7 @@ class Cache(NoCache): """Parse the specified filename, returning the recipe information""" self.logger.debug("Parsing %s", filename) infos = [] - datastores = self.load_bbfile(filename, appends, mc=self.mc) + datastores = self.databuilder.parseRecipeVariants(filename, appends, mc=self.mc) depends = [] variants = [] # Process the "real" fn last so we can store variants list @@ -508,43 +540,19 @@ class Cache(NoCache): return infos - def load(self, filename, appends): + def loadCached(self, filename, appends): """Obtain the recipe information for the specified filename, - using cached values if available, otherwise parsing. - - Note that if it does parse to obtain the info, it will not - automatically add the information to the cache or to your - CacheData. 
Use the add or add_info method to do so after - running this, or use loadData instead.""" - cached = self.cacheValid(filename, appends) - if cached: - infos = [] - # info_array item is a list of [CoreRecipeInfo, XXXRecipeInfo] - info_array = self.depends_cache[filename] - for variant in info_array[0].variants: - virtualfn = variant2virtual(filename, variant) - infos.append((virtualfn, self.depends_cache[virtualfn])) - else: - return self.parse(filename, appends, configdata, self.caches_array) - - return cached, infos - - def loadData(self, fn, appends, cacheData): - """Load the recipe info for the specified filename, - parsing and adding to the cache if necessary, and adding - the recipe information to the supplied CacheData instance.""" - skipped, virtuals = 0, 0 + using cached values. + """ - cached, infos = self.load(fn, appends) - for virtualfn, info_array in infos: - if info_array[0].skipped: - self.logger.debug("Skipping %s: %s", virtualfn, info_array[0].skipreason) - skipped += 1 - else: - self.add_info(virtualfn, info_array, cacheData, not cached) - virtuals += 1 + infos = [] + # info_array item is a list of [CoreRecipeInfo, XXXRecipeInfo] + info_array = self.depends_cache[filename] + for variant in info_array[0].variants: + virtualfn = variant2virtual(filename, variant) + infos.append((virtualfn, self.depends_cache[virtualfn])) - return cached, skipped, virtuals + return infos def cacheValid(self, fn, appends): """ @@ -553,10 +561,6 @@ class Cache(NoCache): """ if fn not in self.checked: self.cacheValidUpdate(fn, appends) - - # Is cache enabled? - if not self.has_cache: - return False if fn in self.clean: return True return False @@ -566,10 +570,6 @@ class Cache(NoCache): Is the cache valid for fn? Make thorough (slower) checks including timestamps. """ - # Is cache enabled? 
- if not self.has_cache: - return False - self.checked.add(fn) # File isn't in depends_cache @@ -676,10 +676,6 @@ class Cache(NoCache): Save the cache Called from the parser when complete (or exiting) """ - - if not self.has_cache: - return - if self.cacheclean: self.logger.debug2("Cache is clean, not saving.") return @@ -700,6 +696,7 @@ class Cache(NoCache): p.dump(info) del self.depends_cache + SiggenRecipeInfo.reset() @staticmethod def mtime(cachefile): @@ -722,26 +719,11 @@ class Cache(NoCache): if watcher: watcher(info_array[0].file_depends) - if not self.has_cache: - return - if (info_array[0].skipped or 'SRCREVINACTION' not in info_array[0].pv) and not info_array[0].nocache: if parsed: self.cacheclean = False self.depends_cache[filename] = info_array - def add(self, file_name, data, cacheData, parsed=None): - """ - Save data we need into the cache - """ - - realfn = virtualfn2realfn(file_name)[0] - - info_array = [] - for cache_class in self.caches_array: - info_array.append(cache_class(realfn, data)) - self.add_info(file_name, info_array, cacheData, parsed) - class MulticonfigCache(Mapping): def __init__(self, databuilder, data_hash, caches_array): def progress(p): @@ -778,6 +760,7 @@ class MulticonfigCache(Mapping): loaded = 0 for c in self.__caches.values(): + SiggenRecipeInfo.reset() loaded += c.prepare_cache(progress) previous_progress = current_progress diff --git a/poky/bitbake/lib/bb/codeparser.py b/poky/bitbake/lib/bb/codeparser.py index 9d66d3ae41..ecae7b0808 100644 --- a/poky/bitbake/lib/bb/codeparser.py +++ b/poky/bitbake/lib/bb/codeparser.py @@ -27,6 +27,7 @@ import ast import sys import codegen import logging +import inspect import bb.pysh as pysh import bb.utils, bb.data import hashlib @@ -58,10 +59,39 @@ def check_indent(codestr): return codestr -# A custom getstate/setstate using tuples is actually worth 15% cachesize by -# avoiding duplication of the attribute names! 
+modulecode_deps = {} +def add_module_functions(fn, functions, namespace): + fstat = os.stat(fn) + fixedhash = fn + ":" + str(fstat.st_size) + ":" + str(fstat.st_mtime) + for f in functions: + name = "%s.%s" % (namespace, f) + parser = PythonParser(name, logger) + try: + parser.parse_python(None, filename=fn, lineno=1, fixedhash=fixedhash+f) + #bb.warn("Cached %s" % f) + except KeyError: + lines, lineno = inspect.getsourcelines(functions[f]) + src = "".join(lines) + parser.parse_python(src, filename=fn, lineno=lineno, fixedhash=fixedhash+f) + #bb.warn("Not cached %s" % f) + execs = parser.execs.copy() + # Expand internal module exec references + for e in parser.execs: + if e in functions: + execs.remove(e) + execs.add(namespace + "." + e) + modulecode_deps[name] = [parser.references.copy(), execs, parser.var_execs.copy(), parser.contains.copy()] + #bb.warn("%s: %s\nRefs:%s Execs: %s %s %s" % (name, src, parser.references, parser.execs, parser.var_execs, parser.contains)) + +def update_module_dependencies(d): + for mod in modulecode_deps: + excludes = set((d.getVarFlag(mod, "vardepsexclude") or "").split()) + if excludes: + modulecode_deps[mod] = [modulecode_deps[mod][0] - excludes, modulecode_deps[mod][1] - excludes, modulecode_deps[mod][2] - excludes, modulecode_deps[mod][3]] +# A custom getstate/setstate using tuples is actually worth 15% cachesize by +# avoiding duplication of the attribute names! class SetCache(object): def __init__(self): self.setcache = {} @@ -289,11 +319,17 @@ class PythonParser(): self.unhandled_message = "in call of %s, argument '%s' is not a string literal" self.unhandled_message = "while parsing %s, %s" % (name, self.unhandled_message) - def parse_python(self, node, lineno=0, filename="<string>"): - if not node or not node.strip(): + # For the python module code it is expensive to have the function text so it + uses a different fixedhash to cache against. We can take the hit on obtaining the + text if it isn't in the cache. 
+ def parse_python(self, node, lineno=0, filename="<string>", fixedhash=None): + if not fixedhash and (not node or not node.strip()): return - h = bbhash(str(node)) + if fixedhash: + h = fixedhash + else: + h = bbhash(str(node)) if h in codeparsercache.pythoncache: self.references = set(codeparsercache.pythoncache[h].refs) @@ -311,6 +347,9 @@ class PythonParser(): self.contains[i] = set(codeparsercache.pythoncacheextras[h].contains[i]) return + if fixedhash and not node: + raise KeyError + # Need to parse so take the hit on the real log buffer self.log = BufferedLogger('BitBake.Data.PythonParser', logging.DEBUG, self._log) diff --git a/poky/bitbake/lib/bb/command.py b/poky/bitbake/lib/bb/command.py index ec86885220..9e2cdc5c75 100644 --- a/poky/bitbake/lib/bb/command.py +++ b/poky/bitbake/lib/bb/command.py @@ -51,16 +51,17 @@ class Command: """ A queue of asynchronous commands for bitbake """ - def __init__(self, cooker): + def __init__(self, cooker, process_server): self.cooker = cooker self.cmds_sync = CommandsSync() self.cmds_async = CommandsAsync() self.remotedatastores = None - # FIXME Add lock for this + self.process_server = process_server + # Access with locking using process_server.{get/set/clear}_async_cmd() self.currentAsyncCommand = None - def runCommand(self, commandline, ro_only = False): + def runCommand(self, commandline, process_server, ro_only=False): command = commandline.pop(0) # Ensure cooker is ready for commands @@ -84,7 +85,7 @@ class Command: if not hasattr(command_method, 'readonly') or not getattr(command_method, 'readonly'): return None, "Not able to execute not readonly commands in readonly mode" try: - self.cooker.process_inotify_updates() + self.cooker.process_inotify_updates_apply() if getattr(command_method, 'needconfig', True): self.cooker.updateCacheSync() result = command_method(self, commandline) @@ -99,24 +100,24 @@ class Command: return None, traceback.format_exc() else: return result, None - if self.currentAsyncCommand is not 
None: - return None, "Busy (%s in progress)" % self.currentAsyncCommand[0] if command not in CommandsAsync.__dict__: return None, "No such command" - self.currentAsyncCommand = (command, commandline) - self.cooker.idleCallBackRegister(self.cooker.runCommands, self.cooker) + if not process_server.set_async_cmd((command, commandline)): + return None, "Busy (%s in progress)" % self.process_server.get_async_cmd()[0] + self.cooker.idleCallBackRegister(self.runAsyncCommand, process_server) return True, None - def runAsyncCommand(self): + def runAsyncCommand(self, _, process_server, halt): try: - self.cooker.process_inotify_updates() + self.cooker.process_inotify_updates_apply() if self.cooker.state in (bb.cooker.state.error, bb.cooker.state.shutdown, bb.cooker.state.forceshutdown): # updateCache will trigger a shutdown of the parser # and then raise BBHandledException triggering an exit self.cooker.updateCache() - return False - if self.currentAsyncCommand is not None: - (command, options) = self.currentAsyncCommand + return bb.server.process.idleFinish("Cooker in error state") + cmd = process_server.get_async_cmd() + if cmd is not None: + (command, options) = cmd commandmethod = getattr(CommandsAsync, command) needcache = getattr( commandmethod, "needcache" ) if needcache and self.cooker.state != bb.cooker.state.running: @@ -126,24 +127,21 @@ class Command: commandmethod(self.cmds_async, self, options) return False else: - return False + return bb.server.process.idleFinish("Nothing to do, no async command?") except KeyboardInterrupt as exc: - self.finishAsyncCommand("Interrupted") - return False + return bb.server.process.idleFinish("Interrupted") except SystemExit as exc: arg = exc.args[0] if isinstance(arg, str): - self.finishAsyncCommand(arg) + return bb.server.process.idleFinish(arg) else: - self.finishAsyncCommand("Exited with %s" % arg) - return False + return bb.server.process.idleFinish("Exited with %s" % arg) except Exception as exc: import traceback if 
isinstance(exc, bb.BBHandledException): - self.finishAsyncCommand("") + return bb.server.process.idleFinish("") else: - self.finishAsyncCommand(traceback.format_exc()) - return False + return bb.server.process.idleFinish(traceback.format_exc()) def finishAsyncCommand(self, msg=None, code=None): if msg or msg == "": @@ -152,8 +150,8 @@ class Command: bb.event.fire(CommandExit(code), self.cooker.data) else: bb.event.fire(CommandCompleted(), self.cooker.data) - self.currentAsyncCommand = None self.cooker.finishcommand() + self.process_server.clear_async_cmd() def reset(self): if self.remotedatastores: @@ -166,6 +164,12 @@ class CommandsSync: These must not influence any running synchronous command. """ + def ping(self, command, params): + """ + Allow a UI to check the server is still alive + """ + return "Still alive!" + def stateShutdown(self, command, params): """ Trigger cooker 'shutdown' mode @@ -564,11 +568,10 @@ class CommandsSync: if config_data: # We have to use a different function here if we're passing in a datastore # NOTE: we took a copy above, so we don't do it here again - envdata = bb.cache.parse_recipe(config_data, fn, appendfiles, mc)[''] + envdata = command.cooker.databuilder._parse_recipe(config_data, fn, appendfiles, mc)[''] else: # Use the standard path - parser = bb.cache.NoCache(command.cooker.databuilder) - envdata = parser.loadDataFull(fn, appendfiles) + envdata = command.cooker.databuilder.parseRecipe(fn, appendfiles) idx = command.remotedatastores.store(envdata) return DataStoreConnectionHandle(idx) parseRecipeFile.readonly = True @@ -741,7 +744,7 @@ class CommandsAsync: """ event = params[0] bb.event.fire(eval(event), command.cooker.data) - command.currentAsyncCommand = None + process_server.clear_async_cmd() triggerEvent.needcache = False def resetCooker(self, command, params): diff --git a/poky/bitbake/lib/bb/cooker.py b/poky/bitbake/lib/bb/cooker.py index 1da2f03197..cfaa7cbe6c 100644 --- a/poky/bitbake/lib/bb/cooker.py +++ 
b/poky/bitbake/lib/bb/cooker.py @@ -80,7 +80,7 @@ class SkippedPackage: class CookerFeatures(object): - _feature_list = [HOB_EXTRA_CACHES, BASEDATASTORE_TRACKING, SEND_SANITYEVENTS] = list(range(3)) + _feature_list = [HOB_EXTRA_CACHES, BASEDATASTORE_TRACKING, SEND_SANITYEVENTS, RECIPE_SIGGEN_INFO] = list(range(4)) def __init__(self): self._features=set() @@ -149,7 +149,7 @@ class BBCooker: Manages one bitbake build run """ - def __init__(self, featureSet=None, idleCallBackRegister=None): + def __init__(self, featureSet=None, server=None): self.recipecaches = None self.eventlog = None self.skiplist = {} @@ -163,7 +163,12 @@ class BBCooker: self.configuration = bb.cookerdata.CookerConfiguration() - self.idleCallBackRegister = idleCallBackRegister + self.process_server = server + self.idleCallBackRegister = None + self.waitIdle = None + if server: + self.idleCallBackRegister = server.register_idle_function + self.waitIdle = server.wait_for_idle bb.debug(1, "BBCooker starting %s" % time.time()) sys.stdout.flush() @@ -189,12 +194,6 @@ class BBCooker: self.inotify_modified_files = [] - def _process_inotify_updates(server, cooker, halt): - cooker.process_inotify_updates() - return 1.0 - - self.idleCallBackRegister(_process_inotify_updates, self) - # TOSTOP must not be set or our children will hang when they output try: fd = sys.stdout.fileno() @@ -208,7 +207,7 @@ class BBCooker: except UnsupportedOperation: pass - self.command = bb.command.Command(self) + self.command = bb.command.Command(self, self.process_server) self.state = state.initial self.parser = None @@ -220,6 +219,8 @@ class BBCooker: bb.debug(1, "BBCooker startup complete %s" % time.time()) sys.stdout.flush() + self.inotify_threadlock = threading.Lock() + def init_configdata(self): if not hasattr(self, "data"): self.initConfigurationData() @@ -248,11 +249,18 @@ class BBCooker: self.notifier = pyinotify.Notifier(self.watcher, self.notifications) def process_inotify_updates(self): - for n in 
[self.confignotifier, self.notifier]: - if n and n.check_events(timeout=0): - # read notified events and enqueue them - n.read_events() - n.process_events() + with bb.utils.lock_timeout(self.inotify_threadlock): + for n in [self.confignotifier, self.notifier]: + if n and n.check_events(timeout=0): + # read notified events and enqueue them + n.read_events() + + def process_inotify_updates_apply(self): + with bb.utils.lock_timeout(self.inotify_threadlock): + for n in [self.confignotifier, self.notifier]: + if n and n.check_events(timeout=0): + n.read_events() + n.process_events() def config_notifications(self, event): if event.maskname == "IN_Q_OVERFLOW": @@ -367,12 +375,12 @@ class BBCooker: if CookerFeatures.BASEDATASTORE_TRACKING in self.featureset: self.enableDataTracking() - all_extra_cache_names = [] + caches_name_array = ['bb.cache:CoreRecipeInfo'] # We hardcode all known cache types in a single place, here. if CookerFeatures.HOB_EXTRA_CACHES in self.featureset: - all_extra_cache_names.append("bb.cache_extra:HobRecipeInfo") - - caches_name_array = ['bb.cache:CoreRecipeInfo'] + all_extra_cache_names + caches_name_array.append("bb.cache_extra:HobRecipeInfo") + if CookerFeatures.RECIPE_SIGGEN_INFO in self.featureset: + caches_name_array.append("bb.cache:SiggenRecipeInfo") # At least CoreRecipeInfo will be loaded, so caches_array will never be empty! # This is the entry point, no further check needed! 
@@ -400,9 +408,7 @@ class BBCooker: self.disableDataTracking() for mc in self.databuilder.mcdata.values(): - mc.renameVar("__depends", "__base_depends") self.add_filewatch(mc.getVar("__base_depends", False), self.configwatcher) - mc.setVar("__bbclasstype", "recipe") self.baseconfig_valid = True self.parsecache_valid = False @@ -436,10 +442,8 @@ class BBCooker: upstream=upstream, ) self.hashserv.serve_as_process() - self.data.setVar("BB_HASHSERVE", self.hashservaddr) - self.databuilder.origdata.setVar("BB_HASHSERVE", self.hashservaddr) - self.databuilder.data.setVar("BB_HASHSERVE", self.hashservaddr) for mc in self.databuilder.mcdata: + self.databuilder.mcorigdata[mc].setVar("BB_HASHSERVE", self.hashservaddr) self.databuilder.mcdata[mc].setVar("BB_HASHSERVE", self.hashservaddr) bb.parse.init_parser(self.data) @@ -535,15 +539,6 @@ class BBCooker: logger.debug("Base environment change, triggering reparse") self.reset() - def runCommands(self, server, data, halt): - """ - Run any queued asynchronous command - This is done by the idle handler so it runs in true context rather than - tied to any UI. 
- """ - - return self.command.runAsyncCommand() - def showVersions(self): (latest_versions, preferred_versions, required) = self.findProviders() @@ -617,8 +612,7 @@ class BBCooker: if fn: try: - bb_caches = bb.cache.MulticonfigCache(self.databuilder, self.data_hash, self.caches_array) - envdata = bb_caches[mc].loadDataFull(fn, self.collections[mc].get_file_appends(fn)) + envdata = self.databuilder.parseRecipe(fn, self.collections[mc].get_file_appends(fn)) except Exception as e: parselog.exception("Unable to read %s", fn) raise @@ -1449,10 +1443,12 @@ class BBCooker: self.recipecaches[mc].rundeps[fn] = defaultdict(list) self.recipecaches[mc].runrecs[fn] = defaultdict(list) + bb.parse.siggen.setup_datacache(self.recipecaches) + # Invalidate task for target if force mode active if self.configuration.force: logger.verbose("Invalidate task %s, %s", task, fn) - bb.parse.siggen.invalidate_task(task, self.recipecaches[mc], fn) + bb.parse.siggen.invalidate_task(task, fn) # Setup taskdata structure taskdata = {} @@ -1466,6 +1462,7 @@ class BBCooker: buildname = self.databuilder.mcdata[mc].getVar("BUILDNAME") if fireevents: bb.event.fire(bb.event.BuildStarted(buildname, [item]), self.databuilder.mcdata[mc]) + bb.event.enable_heartbeat() # Execute the runqueue runlist = [[mc, item, task, fn]] @@ -1491,22 +1488,21 @@ class BBCooker: failures += len(exc.args) retval = False except SystemExit as exc: - self.command.finishAsyncCommand(str(exc)) if quietlog: bb.runqueue.logger.setLevel(rqloglevel) - return False + return bb.server.process.idleFinish(str(exc)) if not retval: if fireevents: bb.event.fire(bb.event.BuildCompleted(len(rq.rqdata.runtaskentries), buildname, item, failures, interrupted), self.databuilder.mcdata[mc]) - self.command.finishAsyncCommand(msg) + bb.event.disable_heartbeat() # We trashed self.recipecaches above self.parsecache_valid = False self.configuration.limited_deps = False bb.parse.siggen.reset(self.data) if quietlog: 
bb.runqueue.logger.setLevel(rqloglevel) - return False + return bb.server.process.idleFinish(msg) if retval is True: return True return retval @@ -1536,16 +1532,16 @@ class BBCooker: failures += len(exc.args) retval = False except SystemExit as exc: - self.command.finishAsyncCommand(str(exc)) - return False + return bb.server.process.idleFinish(str(exc)) if not retval: try: for mc in self.multiconfigs: bb.event.fire(bb.event.BuildCompleted(len(rq.rqdata.runtaskentries), buildname, targets, failures, interrupted), self.databuilder.mcdata[mc]) finally: - self.command.finishAsyncCommand(msg) - return False + bb.event.disable_heartbeat() + return bb.server.process.idleFinish(msg) + if retval is True: return True return retval @@ -1577,6 +1573,7 @@ class BBCooker: for mc in self.multiconfigs: bb.event.fire(bb.event.BuildStarted(buildname, ntargets), self.databuilder.mcdata[mc]) + bb.event.enable_heartbeat() rq = bb.runqueue.RunQueue(self, self.data, self.recipecaches, taskdata, runlist) if 'universe' in targets: @@ -1756,22 +1753,26 @@ class BBCooker: if hasattr(self, "data"): bb.event.fire(CookerExit(), self.data) - def shutdown(self, force = False): + def shutdown(self, force=False): if force: self.state = state.forceshutdown else: self.state = state.shutdown if self.parser: - self.parser.shutdown(clean=not force) + self.parser.shutdown(clean=False) self.parser.final_cleanup() def finishcommand(self): + if hasattr(self.parser, 'shutdown'): + self.parser.shutdown(clean=False) + self.parser.final_cleanup() self.state = state.initial def reset(self): if hasattr(bb.parse, "siggen"): bb.parse.siggen.exit() + self.finishcommand() self.initConfigurationData() self.handlePRServ() @@ -1783,8 +1784,9 @@ class BBCooker: if hasattr(self, "data"): self.databuilder.reset() self.data = self.databuilder.data + # In theory tinfoil could have modified the base data before parsing, + # ideally need to track if anything did modify the datastore self.parsecache_valid = False - 
self.baseconfig_valid = False class CookerExit(bb.event.Event): @@ -2092,29 +2094,29 @@ class Parser(multiprocessing.Process): multiprocessing.util.Finalize(None, bb.fetch.fetcher_parse_save, exitpriority=1) pending = [] + havejobs = True try: - while True: - try: - self.quit.get_nowait() - except queue.Empty: - pass - else: + while havejobs or pending: + if self.quit.is_set(): break - if pending: - result = pending.pop() - else: - try: - job = self.jobs.pop() - except IndexError: - break + job = None + try: + job = self.jobs.pop() + except IndexError: + havejobs = False + if job: result = self.parse(*job) # Clear the siggen cache after parsing to control memory usage, its huge bb.parse.siggen.postparsing_clean_cache() - try: - self.results.put(result, timeout=0.25) - except queue.Full: pending.append(result) + + if pending: + try: + result = pending.pop() + self.results.put(result, timeout=0.05) + except queue.Full: + pending.append(result) finally: self.results.close() self.results.join_thread() @@ -2189,13 +2191,15 @@ class CookerParser(object): self.haveshutdown = False self.syncthread = None + bb.cache.SiggenRecipeInfo.reset() + def start(self): self.results = self.load_cached() self.processes = [] if self.toparse: bb.event.fire(bb.event.ParseStarted(self.toparse), self.cfgdata) - self.parser_quit = multiprocessing.Queue(maxsize=self.num_processes) + self.parser_quit = multiprocessing.Event() self.result_queue = multiprocessing.Queue() def chunkify(lst,n): @@ -2227,8 +2231,15 @@ class CookerParser(object): else: bb.error("Parsing halted due to errors, see error messages above") - for process in self.processes: - self.parser_quit.put(None) + def sync_caches(): + for c in self.bb_caches.values(): + bb.cache.SiggenRecipeInfo.reset() + c.sync() + + self.syncthread = threading.Thread(target=sync_caches, name="SyncThread") + self.syncthread.start() + + self.parser_quit.set() # Cleanup the queue before call process.join(), otherwise there might be # deadlocks. 
@@ -2258,18 +2269,9 @@ class CookerParser(object): if hasattr(process, "close"): process.close() - self.parser_quit.close() - # Allow data left in the cancel queue to be discarded - self.parser_quit.cancel_join_thread() - - def sync_caches(): - for c in self.bb_caches.values(): - c.sync() - sync = threading.Thread(target=sync_caches, name="SyncThread") - self.syncthread = sync - sync.start() bb.codeparser.parser_cache_savemerge() + bb.cache.SiggenRecipeInfo.reset() bb.fetch.fetcher_parse_done() if self.cooker.configuration.profile: profiles = [] @@ -2288,8 +2290,8 @@ class CookerParser(object): def load_cached(self): for mc, cache, filename, appends in self.fromcache: - cached, infos = cache.load(filename, appends) - yield not cached, mc, infos + infos = cache.loadCached(filename, appends) + yield False, mc, infos def parse_generator(self): empty = False @@ -2387,6 +2389,7 @@ class CookerParser(object): return True def reparse(self, filename): + bb.cache.SiggenRecipeInfo.reset() to_reparse = set() for mc in self.cooker.multiconfigs: to_reparse.add((mc, filename, self.cooker.collections[mc].get_file_appends(filename))) diff --git a/poky/bitbake/lib/bb/cookerdata.py b/poky/bitbake/lib/bb/cookerdata.py index 8a354fed7c..c6b5658d75 100644 --- a/poky/bitbake/lib/bb/cookerdata.py +++ b/poky/bitbake/lib/bb/cookerdata.py @@ -184,7 +184,7 @@ def catch_parse_error(func): @catch_parse_error def parse_config_file(fn, data, include=True): - return bb.parse.handle(fn, data, include) + return bb.parse.handle(fn, data, include, baseconfig=True) @catch_parse_error def _inherit(bbclass, data): @@ -263,6 +263,7 @@ class CookerDataBuilder(object): self.mcdata = {} def parseBaseConfiguration(self, worker=False): + mcdata = {} data_hash = hashlib.sha256() try: self.data = self.parseConfigurationFiles(self.prefiles, self.postfiles) @@ -288,18 +289,18 @@ class CookerDataBuilder(object): bb.parse.init_parser(self.data) data_hash.update(self.data.get_hash().encode('utf-8')) - 
self.mcdata[''] = self.data + mcdata[''] = self.data multiconfig = (self.data.getVar("BBMULTICONFIG") or "").split() for config in multiconfig: if config[0].isdigit(): bb.fatal("Multiconfig name '%s' is invalid as multiconfigs cannot start with a digit" % config) - mcdata = self.parseConfigurationFiles(self.prefiles, self.postfiles, config) - bb.event.fire(bb.event.ConfigParsed(), mcdata) - self.mcdata[config] = mcdata - data_hash.update(mcdata.get_hash().encode('utf-8')) + parsed_mcdata = self.parseConfigurationFiles(self.prefiles, self.postfiles, config) + bb.event.fire(bb.event.ConfigParsed(), parsed_mcdata) + mcdata[config] = parsed_mcdata + data_hash.update(parsed_mcdata.get_hash().encode('utf-8')) if multiconfig: - bb.event.fire(bb.event.MultiConfigParsed(self.mcdata), self.data) + bb.event.fire(bb.event.MultiConfigParsed(mcdata), self.data) self.data_hash = data_hash.hexdigest() except (SyntaxError, bb.BBHandledException): @@ -311,6 +312,7 @@ class CookerDataBuilder(object): logger.exception("Error parsing configuration files") raise bb.BBHandledException() + bb.codeparser.update_module_dependencies(self.data) # Handle obsolete variable names d = self.data @@ -331,17 +333,23 @@ class CookerDataBuilder(object): if issues: raise bb.BBHandledException() + for mc in mcdata: + mcdata[mc].renameVar("__depends", "__base_depends") + mcdata[mc].setVar("__bbclasstype", "recipe") + # Create a copy so we can reset at a later date when UIs disconnect - self.origdata = self.data - self.data = bb.data.createCopy(self.origdata) - self.mcdata[''] = self.data + self.mcorigdata = mcdata + for mc in mcdata: + self.mcdata[mc] = bb.data.createCopy(mcdata[mc]) + self.data = self.mcdata[''] def reset(self): # We may not have run parseBaseConfiguration() yet - if not hasattr(self, 'origdata'): + if not hasattr(self, 'mcorigdata'): return - self.data = bb.data.createCopy(self.origdata) - self.mcdata[''] = self.data + for mc in self.mcorigdata: + self.mcdata[mc] = 
bb.data.createCopy(self.mcorigdata[mc]) + self.data = self.mcdata[''] def _findLayerConf(self, data): return findConfigFile("bblayers.conf", data) @@ -383,6 +391,8 @@ class CookerDataBuilder(object): parselog.critical("Please check BBLAYERS in %s" % (layerconf)) raise bb.BBHandledException() + layerseries = None + compat_entries = {} for layer in layers: parselog.debug2("Adding layer %s", layer) if 'HOME' in approved and '~' in layer: @@ -395,8 +405,27 @@ class CookerDataBuilder(object): data.expandVarref('LAYERDIR') data.expandVarref('LAYERDIR_RE') + # Sadly we can't have nice things. + # Some layers think they're going to be 'clever' and copy the values from + # another layer, e.g. using ${LAYERSERIES_COMPAT_core}. The whole point of + # this mechanism is to make it clear which releases a layer supports and + # show when a layer master branch is bitrotting and is unmaintained. + # We therefore avoid people doing this here. + collections = (data.getVar('BBFILE_COLLECTIONS') or "").split() + for c in collections: + compat_entry = data.getVar("LAYERSERIES_COMPAT_%s" % c) + if compat_entry: + compat_entries[c] = set(compat_entry.split()) + data.delVar("LAYERSERIES_COMPAT_%s" % c) + if not layerseries: + layerseries = set((data.getVar("LAYERSERIES_CORENAMES") or "").split()) + if layerseries: + data.delVar("LAYERSERIES_CORENAMES") + data.delVar('LAYERDIR_RE') data.delVar('LAYERDIR') + for c in compat_entries: + data.setVar("LAYERSERIES_COMPAT_%s" % c, " ".join(sorted(compat_entries[c]))) bbfiles_dynamic = (data.getVar('BBFILES_DYNAMIC') or "").split() collections = (data.getVar('BBFILE_COLLECTIONS') or "").split() @@ -415,13 +444,15 @@ class CookerDataBuilder(object): if invalid: bb.fatal("BBFILES_DYNAMIC entries must be of the form {!}<collection name>:<filename pattern>, not:\n %s" % "\n ".join(invalid)) - layerseries = set((data.getVar("LAYERSERIES_CORENAMES") or "").split()) collections_tmp = collections[:] for c in collections: collections_tmp.remove(c) if c in 
collections_tmp: bb.fatal("Found duplicated BBFILE_COLLECTIONS '%s', check bblayers.conf or layer.conf to fix it." % c) - compat = set((data.getVar("LAYERSERIES_COMPAT_%s" % c) or "").split()) + + compat = set() + if c in compat_entries: + compat = compat_entries[c] if compat and not layerseries: bb.fatal("No core layer found to work with layer '%s'. Missing entry in bblayers.conf?" % c) if compat and not (compat & layerseries): @@ -430,6 +461,8 @@ class CookerDataBuilder(object): elif not compat and not data.getVar("BB_WORKERCONTEXT"): bb.warn("Layer %s should set LAYERSERIES_COMPAT_%s in its conf/layer.conf file to list the core layer names it is compatible with." % (c, c)) + data.setVar("LAYERSERIES_CORENAMES", " ".join(sorted(layerseries))) + if not data.getVar("BBPATH"): msg = "The BBPATH variable is not set" if not layerconf: @@ -466,3 +499,54 @@ class CookerDataBuilder(object): return data + @staticmethod + def _parse_recipe(bb_data, bbfile, appends, mc=''): + bb_data.setVar("__BBMULTICONFIG", mc) + + bbfile_loc = os.path.abspath(os.path.dirname(bbfile)) + bb.parse.cached_mtime_noerror(bbfile_loc) + + if appends: + bb_data.setVar('__BBAPPEND', " ".join(appends)) + bb_data = bb.parse.handle(bbfile, bb_data) + return bb_data + + def parseRecipeVariants(self, bbfile, appends, virtonly=False, mc=None): + """ + Load and parse one .bb build file + Return the data and whether parsing resulted in the file being skipped + """ + + if virtonly: + (bbfile, virtual, mc) = bb.cache.virtualfn2realfn(bbfile) + bb_data = self.mcdata[mc].createCopy() + bb_data.setVar("__ONLYFINALISE", virtual or "default") + datastores = self._parse_recipe(bb_data, bbfile, appends, mc) + return datastores + + if mc is not None: + bb_data = self.mcdata[mc].createCopy() + return self._parse_recipe(bb_data, bbfile, appends, mc) + + bb_data = self.data.createCopy() + datastores = self._parse_recipe(bb_data, bbfile, appends) + + for mc in self.mcdata: + if not mc: + continue + bb_data = 
self.mcdata[mc].createCopy() + newstores = self._parse_recipe(bb_data, bbfile, appends, mc) + for ns in newstores: + datastores["mc:%s:%s" % (mc, ns)] = newstores[ns] + + return datastores + + def parseRecipe(self, virtualfn, appends): + """ + Return a complete set of data for fn. + To do this, we need to parse the file. + """ + logger.debug("Parsing %s (full)" % virtualfn) + (fn, virtual, mc) = bb.cache.virtualfn2realfn(virtualfn) + bb_data = self.parseRecipeVariants(virtualfn, appends, virtonly=True) + return bb_data[virtual] diff --git a/poky/bitbake/lib/bb/data.py b/poky/bitbake/lib/bb/data.py index 3a6af325f4..841369699e 100644 --- a/poky/bitbake/lib/bb/data.py +++ b/poky/bitbake/lib/bb/data.py @@ -28,11 +28,6 @@ the speed is more critical here. import sys, os, re import hashlib -if sys.argv[0][-5:] == "pydoc": - path = os.path.dirname(os.path.dirname(sys.argv[1])) -else: - path = os.path.dirname(os.path.dirname(sys.argv[0])) -sys.path.insert(0, path) from itertools import groupby from bb import data_smart @@ -266,9 +261,40 @@ def emit_func_python(func, o=sys.__stdout__, d = init()): newdeps |= set((d.getVarFlag(dep, "vardeps") or "").split()) newdeps -= seen -def build_dependencies(key, keys, shelldeps, varflagsexcl, ignored_vars, d): +def build_dependencies(key, keys, mod_funcs, shelldeps, varflagsexcl, ignored_vars, d): + def handle_contains(value, contains, exclusions, d): + newvalue = [] + if value: + newvalue.append(str(value)) + for k in sorted(contains): + if k in exclusions or k in ignored_vars: + continue + l = (d.getVar(k) or "").split() + for item in sorted(contains[k]): + for word in item.split(): + if not word in l: + newvalue.append("\n%s{%s} = Unset" % (k, item)) + break + else: + newvalue.append("\n%s{%s} = Set" % (k, item)) + return "".join(newvalue) + + def handle_remove(value, deps, removes, d): + for r in sorted(removes): + r2 = d.expandWithRefs(r, None) + value += "\n_remove of %s" % r + deps |= r2.references + deps = deps | (keys & 
r2.execs) + return value + deps = set() try: + if key in mod_funcs: + exclusions = set() + moddep = bb.codeparser.modulecode_deps[key] + value = handle_contains("", moddep[3], exclusions, d) + return frozenset((moddep[0] | keys & moddep[1]) - ignored_vars), value + if key[-1] == ']': vf = key[:-1].split('[') if vf[1] == "vardepvalueexclude": @@ -276,36 +302,12 @@ def build_dependencies(key, keys, shelldeps, varflagsexcl, ignored_vars, d): value, parser = d.getVarFlag(vf[0], vf[1], False, retparser=True) deps |= parser.references deps = deps | (keys & parser.execs) - return deps, value + deps -= ignored_vars + return frozenset(deps), value varflags = d.getVarFlags(key, ["vardeps", "vardepvalue", "vardepsexclude", "exports", "postfuncs", "prefuncs", "lineno", "filename"]) or {} vardeps = varflags.get("vardeps") exclusions = varflags.get("vardepsexclude", "").split() - def handle_contains(value, contains, exclusions, d): - newvalue = [] - if value: - newvalue.append(str(value)) - for k in sorted(contains): - if k in exclusions or k in ignored_vars: - continue - l = (d.getVar(k) or "").split() - for item in sorted(contains[k]): - for word in item.split(): - if not word in l: - newvalue.append("\n%s{%s} = Unset" % (k, item)) - break - else: - newvalue.append("\n%s{%s} = Set" % (k, item)) - return "".join(newvalue) - - def handle_remove(value, deps, removes, d): - for r in sorted(removes): - r2 = d.expandWithRefs(r, None) - value += "\n_remove of %s" % r - deps |= r2.references - deps = deps | (keys & r2.execs) - return value - if "vardepvalue" in varflags: value = varflags.get("vardepvalue") elif varflags.get("func"): @@ -359,18 +361,20 @@ def build_dependencies(key, keys, shelldeps, varflagsexcl, ignored_vars, d): deps |= set((vardeps or "").split()) deps -= set(exclusions) + deps -= ignored_vars except bb.parse.SkipRecipe: raise except Exception as e: bb.warn("Exception during build_dependencies for %s" % key) raise - return deps, value + return frozenset(deps), value 
#bb.note("Variable %s references %s and calls %s" % (key, str(deps), str(execs))) #d.setVarFlag(key, "vardeps", deps) def generate_dependencies(d, ignored_vars): - keys = set(key for key in d if not key.startswith("__")) + mod_funcs = set(bb.codeparser.modulecode_deps.keys()) + keys = set(key for key in d if not key.startswith("__")) | mod_funcs shelldeps = set(key for key in d.getVar("__exportlist", False) if d.getVarFlag(key, "export", False) and not d.getVarFlag(key, "unexport", False)) varflagsexcl = d.getVar('BB_SIGNATURE_EXCLUDE_FLAGS') @@ -379,16 +383,16 @@ def generate_dependencies(d, ignored_vars): tasklist = d.getVar('__BBTASKS', False) or [] for task in tasklist: - deps[task], values[task] = build_dependencies(task, keys, shelldeps, varflagsexcl, ignored_vars, d) + deps[task], values[task] = build_dependencies(task, keys, mod_funcs, shelldeps, varflagsexcl, ignored_vars, d) newdeps = deps[task] seen = set() while newdeps: - nextdeps = newdeps - ignored_vars + nextdeps = newdeps seen |= nextdeps newdeps = set() for dep in nextdeps: if dep not in deps: - deps[dep], values[dep] = build_dependencies(dep, keys, shelldeps, varflagsexcl, ignored_vars, d) + deps[dep], values[dep] = build_dependencies(dep, keys, mod_funcs, shelldeps, varflagsexcl, ignored_vars, d) newdeps |= deps[dep] newdeps -= seen #print "For %s: %s" % (task, str(deps[task])) @@ -407,7 +411,6 @@ def generate_dependency_hash(tasklist, gendeps, lookupcache, ignored_vars, fn): else: data = [data] - gendeps[task] -= ignored_vars newdeps = gendeps[task] seen = set() while newdeps: @@ -415,9 +418,6 @@ def generate_dependency_hash(tasklist, gendeps, lookupcache, ignored_vars, fn): seen |= nextdeps newdeps = set() for dep in nextdeps: - if dep in ignored_vars: - continue - gendeps[dep] -= ignored_vars newdeps |= gendeps[dep] newdeps -= seen @@ -429,7 +429,7 @@ def generate_dependency_hash(tasklist, gendeps, lookupcache, ignored_vars, fn): data.append(str(var)) k = fn + ":" + task basehash[k] = 
hashlib.sha256("".join(data).encode("utf-8")).hexdigest() - taskdeps[task] = alldeps + taskdeps[task] = frozenset(seen) return taskdeps, basehash diff --git a/poky/bitbake/lib/bb/data_smart.py b/poky/bitbake/lib/bb/data_smart.py index 5415f2fccf..e2c93597e5 100644 --- a/poky/bitbake/lib/bb/data_smart.py +++ b/poky/bitbake/lib/bb/data_smart.py @@ -92,10 +92,11 @@ def infer_caller_details(loginfo, parent = False, varval = True): loginfo['func'] = func class VariableParse: - def __init__(self, varname, d, val = None): + def __init__(self, varname, d, unexpanded_value = None, val = None): self.varname = varname self.d = d self.value = val + self.unexpanded_value = unexpanded_value self.references = set() self.execs = set() @@ -447,9 +448,9 @@ class DataSmart(MutableMapping): def expandWithRefs(self, s, varname): if not isinstance(s, str): # sanity check - return VariableParse(varname, self, s) + return VariableParse(varname, self, s, s) - varparse = VariableParse(varname, self) + varparse = VariableParse(varname, self, s) while s.find('${') != -1: olds = s @@ -486,12 +487,14 @@ class DataSmart(MutableMapping): return if self.inoverride: return + overrride_stack = [] for count in range(5): self.inoverride = True # Can end up here recursively so setup dummy values self.overrides = [] self.overridesset = set() self.overrides = (self.getVar("OVERRIDES") or "").split(":") or [] + overrride_stack.append(self.overrides) self.overridesset = set(self.overrides) self.inoverride = False self.expand_cache = {} @@ -501,7 +504,7 @@ class DataSmart(MutableMapping): self.overrides = newoverrides self.overridesset = set(self.overrides) else: - bb.fatal("Overrides could not be expanded into a stable state after 5 iterations, overrides must be being referenced by other overridden variables in some recursive fashion. 
Please provide your configuration to bitbake-devel so we can laugh, er, I mean try and understand how to make it work.") + bb.fatal("Overrides could not be expanded into a stable state after 5 iterations, overrides must be being referenced by other overridden variables in some recursive fashion. Please provide your configuration to bitbake-devel so we can laugh, er, I mean try and understand how to make it work. The list of failing override expansions: %s" % "\n".join(str(s) for s in overrride_stack)) def initVar(self, var): self.expand_cache = {} @@ -718,7 +721,7 @@ class DataSmart(MutableMapping): if ':' in var: override = var[var.rfind(':')+1:] shortvar = var[:var.rfind(':')] - while override and override.islower(): + while override and __override_regexp__.match(override): try: if shortvar in self.overridedata: # Force CoW by recreating the list first @@ -773,6 +776,9 @@ class DataSmart(MutableMapping): return None cachename = var + "[" + flag + "]" + if not expand and retparser and cachename in self.expand_cache: + return self.expand_cache[cachename].unexpanded_value, self.expand_cache[cachename] + if expand and cachename in self.expand_cache: return self.expand_cache[cachename].value diff --git a/poky/bitbake/lib/bb/event.py b/poky/bitbake/lib/bb/event.py index 97668601a1..8b05f93e2f 100644 --- a/poky/bitbake/lib/bb/event.py +++ b/poky/bitbake/lib/bb/event.py @@ -68,29 +68,28 @@ _catchall_handlers = {} _eventfilter = None _uiready = False _thread_lock = threading.Lock() -_thread_lock_enabled = False - -if hasattr(__builtins__, '__setitem__'): - builtins = __builtins__ -else: - builtins = __builtins__.__dict__ +_heartbeat_enabled = False def enable_threadlock(): - global _thread_lock_enabled - _thread_lock_enabled = True + # Always needed now + return def disable_threadlock(): - global _thread_lock_enabled - _thread_lock_enabled = False + # Always needed now + return + +def enable_heartbeat(): + global _heartbeat_enabled + _heartbeat_enabled = True + +def 
disable_heartbeat(): + global _heartbeat_enabled + _heartbeat_enabled = False def execute_handler(name, handler, event, d): event.data = d - addedd = False - if 'd' not in builtins: - builtins['d'] = d - addedd = True try: - ret = handler(event) + ret = handler(event, d) except (bb.parse.SkipRecipe, bb.BBHandledException): raise except Exception: @@ -104,8 +103,7 @@ def execute_handler(name, handler, event, d): raise finally: del event.data - if addedd: - del builtins['d'] + def fire_class_handlers(event, d): if isinstance(event, logging.LogRecord): @@ -180,36 +178,30 @@ def print_ui_queue(): def fire_ui_handlers(event, d): global _thread_lock - global _thread_lock_enabled if not _uiready: # No UI handlers registered yet, queue up the messages ui_queue.append(event) return - if _thread_lock_enabled: - _thread_lock.acquire() - - errors = [] - for h in _ui_handlers: - #print "Sending event %s" % event - try: - if not _ui_logfilters[h].filter(event): - continue - # We use pickle here since it better handles object instances - # which xmlrpc's marshaller does not. Events *must* be serializable - # by pickle. - if hasattr(_ui_handlers[h].event, "sendpickle"): - _ui_handlers[h].event.sendpickle((pickle.dumps(event))) - else: - _ui_handlers[h].event.send(event) - except: - errors.append(h) - for h in errors: - del _ui_handlers[h] - - if _thread_lock_enabled: - _thread_lock.release() + with bb.utils.lock_timeout(_thread_lock): + errors = [] + for h in _ui_handlers: + #print "Sending event %s" % event + try: + if not _ui_logfilters[h].filter(event): + continue + # We use pickle here since it better handles object instances + # which xmlrpc's marshaller does not. Events *must* be serializable + # by pickle. 
+ if hasattr(_ui_handlers[h].event, "sendpickle"): + _ui_handlers[h].event.sendpickle((pickle.dumps(event))) + else: + _ui_handlers[h].event.send(event) + except: + errors.append(h) + for h in errors: + del _ui_handlers[h] def fire(event, d): """Fire off an Event""" @@ -253,12 +245,12 @@ def register(name, handler, mask=None, filename=None, lineno=None, data=None): if handler is not None: # handle string containing python code if isinstance(handler, str): - tmp = "def %s(e):\n%s" % (name, handler) + tmp = "def %s(e, d):\n%s" % (name, handler) try: code = bb.methodpool.compile_cache(tmp) if not code: if filename is None: - filename = "%s(e)" % name + filename = "%s(e, d)" % name code = compile(tmp, filename, "exec", ast.PyCF_ONLY_AST) if lineno is not None: ast.increment_lineno(code, lineno-1) @@ -323,21 +315,23 @@ def set_eventfilter(func): _eventfilter = func def register_UIHhandler(handler, mainui=False): - bb.event._ui_handler_seq = bb.event._ui_handler_seq + 1 - _ui_handlers[_ui_handler_seq] = handler - level, debug_domains = bb.msg.constructLogOptions() - _ui_logfilters[_ui_handler_seq] = UIEventFilter(level, debug_domains) - if mainui: - global _uiready - _uiready = _ui_handler_seq - return _ui_handler_seq + with bb.utils.lock_timeout(_thread_lock): + bb.event._ui_handler_seq = bb.event._ui_handler_seq + 1 + _ui_handlers[_ui_handler_seq] = handler + level, debug_domains = bb.msg.constructLogOptions() + _ui_logfilters[_ui_handler_seq] = UIEventFilter(level, debug_domains) + if mainui: + global _uiready + _uiready = _ui_handler_seq + return _ui_handler_seq def unregister_UIHhandler(handlerNum, mainui=False): if mainui: global _uiready _uiready = False - if handlerNum in _ui_handlers: - del _ui_handlers[handlerNum] + with bb.utils.lock_timeout(_thread_lock): + if handlerNum in _ui_handlers: + del _ui_handlers[handlerNum] return def get_uihandler(): diff --git a/poky/bitbake/lib/bb/fetch2/git.py b/poky/bitbake/lib/bb/fetch2/git.py index 578edc5914..2e3d32515f 
100644 --- a/poky/bitbake/lib/bb/fetch2/git.py +++ b/poky/bitbake/lib/bb/fetch2/git.py @@ -44,7 +44,8 @@ Supported SRC_URI options are: - nobranch Don't check the SHA validation for branch. set this option for the recipe - referring to commit which is valid in tag instead of branch. + referring to commit which is valid in any namespace (branch, tag, ...) + instead of branch. The default is "0", set nobranch=1 if needed. - usehead @@ -382,7 +383,11 @@ class Git(FetchMethod): runfetchcmd("%s remote rm origin" % ud.basecmd, d, workdir=ud.clonedir) runfetchcmd("%s remote add --mirror=fetch origin %s" % (ud.basecmd, shlex.quote(repourl)), d, workdir=ud.clonedir) - fetch_cmd = "LANG=C %s fetch -f --progress %s refs/*:refs/*" % (ud.basecmd, shlex.quote(repourl)) + + if ud.nobranch: + fetch_cmd = "LANG=C %s fetch -f --progress %s refs/*:refs/*" % (ud.basecmd, shlex.quote(repourl)) + else: + fetch_cmd = "LANG=C %s fetch -f --progress %s refs/heads/*:refs/heads/* refs/tags/*:refs/tags/*" % (ud.basecmd, shlex.quote(repourl)) if ud.proto.lower() != 'file': bb.fetch2.check_network_access(d, fetch_cmd, ud.url) progresshandler = GitProgressHandler(d) diff --git a/poky/bitbake/lib/bb/fetch2/wget.py b/poky/bitbake/lib/bb/fetch2/wget.py index 821afa5b58..696e918030 100644 --- a/poky/bitbake/lib/bb/fetch2/wget.py +++ b/poky/bitbake/lib/bb/fetch2/wget.py @@ -341,7 +341,8 @@ class Wget(FetchMethod): opener = urllib.request.build_opener(*handlers) try: - uri = ud.url.split(";")[0] + uri_base = ud.url.split(";")[0] + uri = "{}://{}{}".format(urllib.parse.urlparse(uri_base).scheme, ud.host, ud.path) r = urllib.request.Request(uri) r.get_method = lambda: "HEAD" # Some servers (FusionForge, as used on Alioth) require that the @@ -644,10 +645,10 @@ class Wget(FetchMethod): # search for version matches on folders inside the path, like: # "5.7" in http://download.gnome.org/sources/${PN}/5.7/${PN}-${PV}.tar.gz dirver_regex = re.compile(r"(?P<dirver>[^/]*(\d+\.)*\d+([-_]r\d+)*)/") - m = 
dirver_regex.search(path) + m = dirver_regex.findall(path) if m: pn = d.getVar('PN') - dirver = m.group('dirver') + dirver = m[-1][0] dirver_pn_regex = re.compile(r"%s\d?" % (re.escape(pn))) if not dirver_pn_regex.search(dirver): diff --git a/poky/bitbake/lib/bb/main.py b/poky/bitbake/lib/bb/main.py index 93eda3632e..92d8dc0293 100755 --- a/poky/bitbake/lib/bb/main.py +++ b/poky/bitbake/lib/bb/main.py @@ -12,11 +12,12 @@ import os import sys import logging -import optparse +import argparse import warnings import fcntl import time import traceback +import datetime import bb from bb import event @@ -43,18 +44,18 @@ def present_options(optionlist): else: return optionlist[0] -class BitbakeHelpFormatter(optparse.IndentedHelpFormatter): - def format_option(self, option): +class BitbakeHelpFormatter(argparse.HelpFormatter): + def _get_help_string(self, action): # We need to do this here rather than in the text we supply to # add_option() because we don't want to call list_extension_modules() # on every execution (since it imports all of the modules) # Note also that we modify option.help rather than the returned text # - this is so that we don't have to re-format the text ourselves - if option.dest == 'ui': + if action.dest == 'ui': valid_uis = list_extension_modules(bb.ui, 'main') - option.help = option.help.replace('@CHOICES@', present_options(valid_uis)) + return action.help.replace('@CHOICES@', present_options(valid_uis)) - return optparse.IndentedHelpFormatter.format_option(self, option) + return action.help def list_extension_modules(pkg, checkattr): """ @@ -114,180 +115,205 @@ def _showwarning(message, category, filename, lineno, file=None, line=None): warnings.showwarning = _showwarning def create_bitbake_parser(): - parser = optparse.OptionParser( - formatter=BitbakeHelpFormatter(), - version="BitBake Build Tool Core version %s" % bb.__version__, - usage="""%prog [options] [recipename/target recipe:do_task ...] 
- - Executes the specified task (default is 'build') for a given set of target recipes (.bb files). - It is assumed there is a conf/bblayers.conf available in cwd or in BBPATH which - will provide the layer, BBFILES and other configuration information.""") - - parser.add_option("-b", "--buildfile", action="store", dest="buildfile", default=None, - help="Execute tasks from a specific .bb recipe directly. WARNING: Does " - "not handle any dependencies from other recipes.") - - parser.add_option("-k", "--continue", action="store_false", dest="halt", default=True, - help="Continue as much as possible after an error. While the target that " - "failed and anything depending on it cannot be built, as much as " - "possible will be built before stopping.") - - parser.add_option("-f", "--force", action="store_true", dest="force", default=False, - help="Force the specified targets/task to run (invalidating any " - "existing stamp file).") - - parser.add_option("-c", "--cmd", action="store", dest="cmd", - help="Specify the task to execute. The exact options available " - "depend on the metadata. Some examples might be 'compile'" - " or 'populate_sysroot' or 'listtasks' may give a list of " - "the tasks available.") - - parser.add_option("-C", "--clear-stamp", action="store", dest="invalidate_stamp", - help="Invalidate the stamp for the specified task such as 'compile' " - "and then run the default task for the specified target(s).") - - parser.add_option("-r", "--read", action="append", dest="prefile", default=[], - help="Read the specified file before bitbake.conf.") - - parser.add_option("-R", "--postread", action="append", dest="postfile", default=[], - help="Read the specified file after bitbake.conf.") - - parser.add_option("-v", "--verbose", action="store_true", dest="verbose", default=False, - help="Enable tracing of shell tasks (with 'set -x'). " - "Also print bb.note(...) 
messages to stdout (in " - "addition to writing them to ${T}/log.do_<task>).") - - parser.add_option("-D", "--debug", action="count", dest="debug", default=0, - help="Increase the debug level. You can specify this " - "more than once. -D sets the debug level to 1, " - "where only bb.debug(1, ...) messages are printed " - "to stdout; -DD sets the debug level to 2, where " - "both bb.debug(1, ...) and bb.debug(2, ...) " - "messages are printed; etc. Without -D, no debug " - "messages are printed. Note that -D only affects " - "output to stdout. All debug messages are written " - "to ${T}/log.do_taskname, regardless of the debug " - "level.") - - parser.add_option("-q", "--quiet", action="count", dest="quiet", default=0, - help="Output less log message data to the terminal. You can specify this more than once.") - - parser.add_option("-n", "--dry-run", action="store_true", dest="dry_run", default=False, - help="Don't execute, just go through the motions.") - - parser.add_option("-S", "--dump-signatures", action="append", dest="dump_signatures", - default=[], metavar="SIGNATURE_HANDLER", - help="Dump out the signature construction information, with no task " - "execution. The SIGNATURE_HANDLER parameter is passed to the " - "handler. Two common values are none and printdiff but the handler " - "may define more/less. 
none means only dump the signature, printdiff" - " means compare the dumped signature with the cached one.") - - parser.add_option("-p", "--parse-only", action="store_true", - dest="parse_only", default=False, - help="Quit after parsing the BB recipes.") - - parser.add_option("-s", "--show-versions", action="store_true", - dest="show_versions", default=False, - help="Show current and preferred versions of all recipes.") - - parser.add_option("-e", "--environment", action="store_true", - dest="show_environment", default=False, - help="Show the global or per-recipe environment complete with information" - " about where variables were set/changed.") - - parser.add_option("-g", "--graphviz", action="store_true", dest="dot_graph", default=False, - help="Save dependency tree information for the specified " - "targets in the dot syntax.") - - parser.add_option("-I", "--ignore-deps", action="append", - dest="extra_assume_provided", default=[], - help="Assume these dependencies don't exist and are already provided " - "(equivalent to ASSUME_PROVIDED). Useful to make dependency " - "graphs more appealing") - - parser.add_option("-l", "--log-domains", action="append", dest="debug_domains", default=[], - help="Show debug logging for the specified logging domains") - - parser.add_option("-P", "--profile", action="store_true", dest="profile", default=False, - help="Profile the command and save reports.") + parser = argparse.ArgumentParser( + description="""\ + It is assumed there is a conf/bblayers.conf available in cwd or in BBPATH which + will provide the layer, BBFILES and other configuration information. 
+ """, + formatter_class=BitbakeHelpFormatter, + allow_abbrev=False, + add_help=False, # help is manually added below in a specific argument group + ) + + general_group = parser.add_argument_group('General options') + task_group = parser.add_argument_group('Task control options') + exec_group = parser.add_argument_group('Execution control options') + logging_group = parser.add_argument_group('Logging/output control options') + server_group = parser.add_argument_group('Server options') + config_group = parser.add_argument_group('Configuration options') + + general_group.add_argument("targets", nargs="*", metavar="recipename/target", + help="Execute the specified task (default is 'build') for these target " + "recipes (.bb files).") + + general_group.add_argument("-s", "--show-versions", action="store_true", + help="Show current and preferred versions of all recipes.") + + general_group.add_argument("-e", "--environment", action="store_true", + dest="show_environment", + help="Show the global or per-recipe environment complete with information" + " about where variables were set/changed.") + + general_group.add_argument("-g", "--graphviz", action="store_true", dest="dot_graph", + help="Save dependency tree information for the specified " + "targets in the dot syntax.") # @CHOICES@ is substituted out by BitbakeHelpFormatter above - parser.add_option("-u", "--ui", action="store", dest="ui", - default=os.environ.get('BITBAKE_UI', 'knotty'), - help="The user interface to use (@CHOICES@ - default %default).") - - parser.add_option("", "--token", action="store", dest="xmlrpctoken", - default=os.environ.get("BBTOKEN"), - help="Specify the connection token to be used when connecting " - "to a remote server.") - - parser.add_option("", "--revisions-changed", action="store_true", - dest="revisions_changed", default=False, - help="Set the exit code depending on whether upstream floating " - "revisions have changed or not.") - - parser.add_option("", "--server-only", 
action="store_true", - dest="server_only", default=False, - help="Run bitbake without a UI, only starting a server " - "(cooker) process.") - - parser.add_option("-B", "--bind", action="store", dest="bind", default=False, - help="The name/address for the bitbake xmlrpc server to bind to.") - - parser.add_option("-T", "--idle-timeout", type=float, dest="server_timeout", - default=os.getenv("BB_SERVER_TIMEOUT"), - help="Set timeout to unload bitbake server due to inactivity, " - "set to -1 means no unload, " - "default: Environment variable BB_SERVER_TIMEOUT.") - - parser.add_option("", "--no-setscene", action="store_true", - dest="nosetscene", default=False, - help="Do not run any setscene tasks. sstate will be ignored and " - "everything needed, built.") - - parser.add_option("", "--skip-setscene", action="store_true", - dest="skipsetscene", default=False, - help="Skip setscene tasks if they would be executed. Tasks previously " - "restored from sstate will be kept, unlike --no-setscene") - - parser.add_option("", "--setscene-only", action="store_true", - dest="setsceneonly", default=False, - help="Only run setscene tasks, don't run any real tasks.") - - parser.add_option("", "--remote-server", action="store", dest="remote_server", - default=os.environ.get("BBSERVER"), - help="Connect to the specified server.") - - parser.add_option("-m", "--kill-server", action="store_true", - dest="kill_server", default=False, - help="Terminate any running bitbake server.") - - parser.add_option("", "--observe-only", action="store_true", - dest="observe_only", default=False, - help="Connect to a server as an observing-only client.") - - parser.add_option("", "--status-only", action="store_true", - dest="status_only", default=False, - help="Check the status of the remote bitbake server.") - - parser.add_option("-w", "--write-log", action="store", dest="writeeventlog", - default=os.environ.get("BBEVENTLOG"), - help="Writes the event log of the build to a bitbake event json file. 
" - "Use '' (empty string) to assign the name automatically.") - - parser.add_option("", "--runall", action="append", dest="runall", - help="Run the specified task for any recipe in the taskgraph of the specified target (even if it wouldn't otherwise have run).") - - parser.add_option("", "--runonly", action="append", dest="runonly", - help="Run only the specified task within the taskgraph of the specified targets (and any task dependencies those tasks may have).") + general_group.add_argument("-u", "--ui", + default=os.environ.get('BITBAKE_UI', 'knotty'), + help="The user interface to use (@CHOICES@ - default %(default)s).") + + general_group.add_argument("--version", action="store_true", + help="Show programs version and exit.") + + general_group.add_argument('-h', '--help', action='help', + help='Show this help message and exit.') + + + task_group.add_argument("-f", "--force", action="store_true", + help="Force the specified targets/task to run (invalidating any " + "existing stamp file).") + + task_group.add_argument("-c", "--cmd", + help="Specify the task to execute. The exact options available " + "depend on the metadata. 
Some examples might be 'compile'" + " or 'populate_sysroot' or 'listtasks' may give a list of " + "the tasks available.") + + task_group.add_argument("-C", "--clear-stamp", dest="invalidate_stamp", + help="Invalidate the stamp for the specified task such as 'compile' " + "and then run the default task for the specified target(s).") + + task_group.add_argument("--runall", action="append", default=[], + help="Run the specified task for any recipe in the taskgraph of the " + "specified target (even if it wouldn't otherwise have run).") + + task_group.add_argument("--runonly", action="append", + help="Run only the specified task within the taskgraph of the " + "specified targets (and any task dependencies those tasks may have).") + + task_group.add_argument("--no-setscene", action="store_true", + dest="nosetscene", + help="Do not run any setscene tasks. sstate will be ignored and " + "everything needed, built.") + + task_group.add_argument("--skip-setscene", action="store_true", + dest="skipsetscene", + help="Skip setscene tasks if they would be executed. Tasks previously " + "restored from sstate will be kept, unlike --no-setscene.") + + task_group.add_argument("--setscene-only", action="store_true", + dest="setsceneonly", + help="Only run setscene tasks, don't run any real tasks.") + + + exec_group.add_argument("-n", "--dry-run", action="store_true", + help="Don't execute, just go through the motions.") + + exec_group.add_argument("-p", "--parse-only", action="store_true", + help="Quit after parsing the BB recipes.") + + exec_group.add_argument("-k", "--continue", action="store_false", dest="halt", + help="Continue as much as possible after an error. 
While the target that " + "failed and anything depending on it cannot be built, as much as " + "possible will be built before stopping.") + + exec_group.add_argument("-P", "--profile", action="store_true", + help="Profile the command and save reports.") + + exec_group.add_argument("-S", "--dump-signatures", action="append", + default=[], metavar="SIGNATURE_HANDLER", + help="Dump out the signature construction information, with no task " + "execution. The SIGNATURE_HANDLER parameter is passed to the " + "handler. Two common values are none and printdiff but the handler " + "may define more/less. none means only dump the signature, printdiff" + " means compare the dumped signature with the cached one.") + + exec_group.add_argument("--revisions-changed", action="store_true", + help="Set the exit code depending on whether upstream floating " + "revisions have changed or not.") + + exec_group.add_argument("-b", "--buildfile", + help="Execute tasks from a specific .bb recipe directly. WARNING: Does " + "not handle any dependencies from other recipes.") + + logging_group.add_argument("-D", "--debug", action="count", default=0, + help="Increase the debug level. You can specify this " + "more than once. -D sets the debug level to 1, " + "where only bb.debug(1, ...) messages are printed " + "to stdout; -DD sets the debug level to 2, where " + "both bb.debug(1, ...) and bb.debug(2, ...) " + "messages are printed; etc. Without -D, no debug " + "messages are printed. Note that -D only affects " + "output to stdout. All debug messages are written " + "to ${T}/log.do_taskname, regardless of the debug " + "level.") + + logging_group.add_argument("-l", "--log-domains", action="append", dest="debug_domains", + default=[], + help="Show debug logging for the specified logging domains.") + + logging_group.add_argument("-v", "--verbose", action="store_true", + help="Enable tracing of shell tasks (with 'set -x'). " + "Also print bb.note(...) 
messages to stdout (in " + "addition to writing them to ${T}/log.do_<task>).") + + logging_group.add_argument("-q", "--quiet", action="count", default=0, + help="Output less log message data to the terminal. You can specify this " + "more than once.") + + logging_group.add_argument("-w", "--write-log", dest="writeeventlog", + default=os.environ.get("BBEVENTLOG"), + help="Writes the event log of the build to a bitbake event json file. " + "Use '' (empty string) to assign the name automatically.") + + + server_group.add_argument("-B", "--bind", default=False, + help="The name/address for the bitbake xmlrpc server to bind to.") + + server_group.add_argument("-T", "--idle-timeout", type=float, dest="server_timeout", + default=os.getenv("BB_SERVER_TIMEOUT"), + help="Set timeout to unload bitbake server due to inactivity, " + "set to -1 means no unload, " + "default: Environment variable BB_SERVER_TIMEOUT.") + + server_group.add_argument("--remote-server", + default=os.environ.get("BBSERVER"), + help="Connect to the specified server.") + + server_group.add_argument("-m", "--kill-server", action="store_true", + help="Terminate any running bitbake server.") + + server_group.add_argument("--token", dest="xmlrpctoken", + default=os.environ.get("BBTOKEN"), + help="Specify the connection token to be used when connecting " + "to a remote server.") + + server_group.add_argument("--observe-only", action="store_true", + help="Connect to a server as an observing-only client.") + + server_group.add_argument("--status-only", action="store_true", + help="Check the status of the remote bitbake server.") + + server_group.add_argument("--server-only", action="store_true", + help="Run bitbake without a UI, only starting a server " + "(cooker) process.") + + + config_group.add_argument("-r", "--read", action="append", dest="prefile", default=[], + help="Read the specified file before bitbake.conf.") + + config_group.add_argument("-R", "--postread", action="append", dest="postfile", 
default=[], + help="Read the specified file after bitbake.conf.") + + + config_group.add_argument("-I", "--ignore-deps", action="append", + dest="extra_assume_provided", default=[], + help="Assume these dependencies don't exist and are already provided " + "(equivalent to ASSUME_PROVIDED). Useful to make dependency " + "graphs more appealing.") + return parser class BitBakeConfigParameters(cookerdata.ConfigParameters): def parseCommandLine(self, argv=sys.argv): parser = create_bitbake_parser() - options, targets = parser.parse_args(argv) + options = parser.parse_intermixed_args(argv[1:]) + + if options.version: + print("BitBake Build Tool Core version %s" % bb.__version__) + sys.exit(0) if options.quiet and options.verbose: parser.error("options --quiet and --verbose are mutually exclusive") @@ -319,7 +345,7 @@ class BitBakeConfigParameters(cookerdata.ConfigParameters): else: options.xmlrpcinterface = (None, 0) - return options, targets[1:] + return options, options.targets def bitbake_main(configParams, configuration): @@ -384,6 +410,9 @@ def bitbake_main(configParams, configuration): return 1 +def timestamp(): + return datetime.datetime.now().strftime('%H:%M:%S.%f') + def setup_bitbake(configParams, extrafeatures=None): # Ensure logging messages get sent to the UI as events handler = bb.event.LogHandler() @@ -391,6 +420,11 @@ def setup_bitbake(configParams, extrafeatures=None): # In status only mode there are no logs and no UI logger.addHandler(handler) + if configParams.dump_signatures: + if extrafeatures is None: + extrafeatures = [] + extrafeatures.append(bb.cooker.CookerFeatures.RECIPE_SIGGEN_INFO) + if configParams.server_only: featureset = [] ui_module = None @@ -418,7 +452,7 @@ def setup_bitbake(configParams, extrafeatures=None): retries = 8 while retries: try: - topdir, lock = lockBitbake() + topdir, lock, lockfile = lockBitbake() sockname = topdir + "/bitbake.sock" if lock: if configParams.status_only or configParams.kill_server: @@ -429,18 +463,22 @@ 
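Editor's note: the hunk above swaps optparse's `parser.parse_args(argv)` for argparse's `parser.parse_intermixed_args(argv[1:])`, which allows option flags and positional build targets to interleave on the command line. A minimal sketch of the behaviour (the `targets` positional is an assumption here; the hunk reads `options.targets` but its `add_argument` call falls outside this excerpt):

```python
import argparse

# Minimal stand-in for the new argparse-based BitBake parser. The
# "targets" positional is assumed; only -c/--cmd appears in the hunk.
parser = argparse.ArgumentParser()
parser.add_argument("targets", nargs="*")
parser.add_argument("-c", "--cmd")

# parse_intermixed_args() accepts positionals and options in any order,
# matching invocations like: bitbake core-image-minimal -c compile quilt
options = parser.parse_intermixed_args(["core-image-minimal", "-c", "compile", "quilt"])
print(options.targets)  # ['core-image-minimal', 'quilt']
print(options.cmd)      # compile
```

Plain `parse_args()` with a `nargs="*"` positional can reject such intermixed command lines, which is why the conversion uses the intermixed variant.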
def setup_bitbake(configParams, extrafeatures=None): logger.info("Starting bitbake server...") # Clear the event queue since we already displayed messages bb.event.ui_queue = [] - server = bb.server.process.BitBakeServer(lock, sockname, featureset, configParams.server_timeout, configParams.xmlrpcinterface) + server = bb.server.process.BitBakeServer(lock, sockname, featureset, configParams.server_timeout, configParams.xmlrpcinterface, configParams.profile) else: logger.info("Reconnecting to bitbake server...") if not os.path.exists(sockname): - logger.info("Previous bitbake instance shutting down?, waiting to retry...") + logger.info("Previous bitbake instance shutting down?, waiting to retry... (%s)" % timestamp()) + procs = bb.server.process.get_lockfile_process_msg(lockfile) + if procs: + logger.info("Processes holding bitbake.lock (missing socket %s):\n%s" % (sockname, procs)) + logger.info("Directory listing: %s" % (str(os.listdir(topdir)))) i = 0 lock = None # Wait for 5s or until we can get the lock while not lock and i < 50: time.sleep(0.1) - _, lock = lockBitbake() + _, lock, _ = lockBitbake() i += 1 if lock: bb.utils.unlockfile(lock) @@ -459,9 +497,9 @@ def setup_bitbake(configParams, extrafeatures=None): retries -= 1 tryno = 8 - retries if isinstance(e, (bb.server.process.ProcessTimeout, BrokenPipeError, EOFError, SystemExit)): - logger.info("Retrying server connection (#%d)..." % tryno) + logger.info("Retrying server connection (#%d)... (%s)" % (tryno, timestamp())) else: - logger.info("Retrying server connection (#%d)... (%s)" % (tryno, traceback.format_exc())) + logger.info("Retrying server connection (#%d)... (%s, %s)" % (tryno, traceback.format_exc(), timestamp())) if not retries: bb.fatal("Unable to connect to bitbake server, or start one (server startup failures would be in bitbake-cookerdaemon.log).") @@ -490,5 +528,5 @@ def lockBitbake(): bb.error("Unable to find conf/bblayers.conf or conf/bitbake.conf. 
BBPATH is unset and/or not in a build directory?") raise BBMainFatal lockfile = topdir + "/bitbake.lock" - return topdir, bb.utils.lockfile(lockfile, False, False) + return topdir, bb.utils.lockfile(lockfile, False, False), lockfile diff --git a/poky/bitbake/lib/bb/parse/__init__.py b/poky/bitbake/lib/bb/parse/__init__.py index 347609513b..4cd82f115b 100644 --- a/poky/bitbake/lib/bb/parse/__init__.py +++ b/poky/bitbake/lib/bb/parse/__init__.py @@ -99,12 +99,12 @@ def supports(fn, data): return 1 return 0 -def handle(fn, data, include = 0): +def handle(fn, data, include=0, baseconfig=False): """Call the handler that is appropriate for this file""" for h in handlers: if h['supports'](fn, data): with data.inchistory.include(fn): - return h['handle'](fn, data, include) + return h['handle'](fn, data, include, baseconfig) raise ParseError("not a BitBake file", fn) def init(fn, data): diff --git a/poky/bitbake/lib/bb/parse/ast.py b/poky/bitbake/lib/bb/parse/ast.py index 9e0a0f5c98..375ba3cb79 100644 --- a/poky/bitbake/lib/bb/parse/ast.py +++ b/poky/bitbake/lib/bb/parse/ast.py @@ -9,6 +9,7 @@ # SPDX-License-Identifier: GPL-2.0-only # +import sys import bb from bb import methodpool from bb.parse import logger @@ -269,6 +270,41 @@ class BBHandlerNode(AstNode): data.setVarFlag(h, "handler", 1) data.setVar('__BBHANDLERS', bbhands) +class PyLibNode(AstNode): + def __init__(self, filename, lineno, libdir, namespace): + AstNode.__init__(self, filename, lineno) + self.libdir = libdir + self.namespace = namespace + + def eval(self, data): + global_mods = (data.getVar("BB_GLOBAL_PYMODULES") or "").split() + for m in global_mods: + if m not in bb.utils._context: + bb.utils._context[m] = __import__(m) + + libdir = data.expand(self.libdir) + if libdir not in sys.path: + sys.path.append(libdir) + try: + bb.utils._context[self.namespace] = __import__(self.namespace) + toimport = getattr(bb.utils._context[self.namespace], "BBIMPORTS", []) + for i in toimport: + 
bb.utils._context[self.namespace] = __import__(self.namespace + "." + i) + mod = getattr(bb.utils._context[self.namespace], i) + fn = getattr(mod, "__file__") + funcs = {} + for f in dir(mod): + if f.startswith("_"): + continue + fcall = getattr(mod, f) + if not callable(fcall): + continue + funcs[f] = fcall + bb.codeparser.add_module_functions(fn, funcs, "%s.%s" % (self.namespace, i)) + + except AttributeError as e: + bb.error("Error importing OE modules: %s" % str(e)) + class InheritNode(AstNode): def __init__(self, filename, lineno, classes): AstNode.__init__(self, filename, lineno) @@ -320,6 +356,9 @@ def handleDelTask(statements, filename, lineno, m): def handleBBHandlers(statements, filename, lineno, m): statements.append(BBHandlerNode(filename, lineno, m.group(1))) +def handlePyLib(statements, filename, lineno, m): + statements.append(PyLibNode(filename, lineno, m.group(1), m.group(2))) + def handleInherit(statements, filename, lineno, m): classes = m.group(1) statements.append(InheritNode(filename, lineno, classes)) diff --git a/poky/bitbake/lib/bb/parse/parse_py/BBHandler.py b/poky/bitbake/lib/bb/parse/parse_py/BBHandler.py index 18e6868387..4d5b45e1ef 100644 --- a/poky/bitbake/lib/bb/parse/parse_py/BBHandler.py +++ b/poky/bitbake/lib/bb/parse/parse_py/BBHandler.py @@ -101,8 +101,8 @@ def get_statements(filename, absolute_filename, base_name): cached_statements[absolute_filename] = statements return statements -def handle(fn, d, include): - global __func_start_regexp__, __inherit_regexp__, __export_func_regexp__, __addtask_regexp__, __addhandler_regexp__, __infunc__, __body__, __residue__, __classname__ +def handle(fn, d, include, baseconfig=False): + global __infunc__, __body__, __residue__, __classname__ __body__ = [] __infunc__ = [] __classname__ = "" @@ -154,7 +154,7 @@ def handle(fn, d, include): return d def feeder(lineno, s, fn, root, statements, eof=False): - global __func_start_regexp__, __inherit_regexp__, __export_func_regexp__, 
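Editor's note: one subtlety the `PyLibNode.eval` hunk leans on is that `__import__` with a dotted name imports the submodule but *returns the top-level package*, which is why the code re-stores the namespace entry and then reaches each `BBIMPORTS` submodule with `getattr`. A standalone illustration using a stdlib package (not the real `bb.utils._context`):

```python
# __import__("pkg.sub") imports pkg.sub but returns pkg -- the same
# behaviour PyLibNode.eval handles when walking BBIMPORTS entries.
pkg = __import__("json.decoder")
print(pkg.__name__)             # json
sub = getattr(pkg, "decoder")   # fetch the actual submodule
print(sub.__name__)             # json.decoder
```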
__addtask_regexp__, __addhandler_regexp__, __def_regexp__, __python_func_regexp__, __inpython__, __infunc__, __body__, bb, __residue__, __classname__ + global __inpython__, __infunc__, __body__, __residue__, __classname__ # Check tabs in python functions: # - def py_funcname(): covered by __inpython__ @@ -265,7 +265,7 @@ def feeder(lineno, s, fn, root, statements, eof=False): ast.handleInherit(statements, fn, lineno, m) return - return ConfHandler.feeder(lineno, s, fn, statements) + return ConfHandler.feeder(lineno, s, fn, statements, conffile=False) # Add us to the handlers list from .. import handlers diff --git a/poky/bitbake/lib/bb/parse/parse_py/ConfHandler.py b/poky/bitbake/lib/bb/parse/parse_py/ConfHandler.py index 451e68dd66..3076067287 100644 --- a/poky/bitbake/lib/bb/parse/parse_py/ConfHandler.py +++ b/poky/bitbake/lib/bb/parse/parse_py/ConfHandler.py @@ -46,6 +46,7 @@ __require_regexp__ = re.compile( r"require\s+(.+)" ) __export_regexp__ = re.compile( r"export\s+([a-zA-Z0-9\-_+.${}/~]+)$" ) __unset_regexp__ = re.compile( r"unset\s+([a-zA-Z0-9\-_+.${}/~]+)$" ) __unset_flag_regexp__ = re.compile( r"unset\s+([a-zA-Z0-9\-_+.${}/~]+)\[([a-zA-Z0-9\-_+.]+)\]$" ) +__addpylib_regexp__ = re.compile(r"addpylib\s+(.+)\s+(.+)" ) def init(data): return @@ -107,7 +108,7 @@ def include_single_file(parentfn, fn, lineno, data, error_out): # parsing. This turns out to be a hard problem to solve any other way. confFilters = [] -def handle(fn, data, include): +def handle(fn, data, include, baseconfig=False): init(data) if include == 0: @@ -144,7 +145,7 @@ def handle(fn, data, include): # skip comments if s[0] == '#': continue - feeder(lineno, s, abs_fn, statements) + feeder(lineno, s, abs_fn, statements, baseconfig=baseconfig) # DONE WITH PARSING... 
time to evaluate data.setVar('FILE', abs_fn) @@ -157,7 +158,9 @@ def handle(fn, data, include): return data -def feeder(lineno, s, fn, statements): +# baseconfig is set for the bblayers/layer.conf cookerdata config parsing +# The function is also used by BBHandler, conffile would be False +def feeder(lineno, s, fn, statements, baseconfig=False, conffile=True): m = __config_regexp__.match(s) if m: groupd = m.groupdict() @@ -189,6 +192,11 @@ def feeder(lineno, s, fn, statements): ast.handleUnsetFlag(statements, fn, lineno, m) return + m = __addpylib_regexp__.match(s) + if baseconfig and conffile and m: + ast.handlePyLib(statements, fn, lineno, m) + return + raise ParseError("unparsed line: '%s'" % s, fn, lineno); # Add us to the handlers list diff --git a/poky/bitbake/lib/bb/runqueue.py b/poky/bitbake/lib/bb/runqueue.py index 338d1fe36f..ce711b6252 100644 --- a/poky/bitbake/lib/bb/runqueue.py +++ b/poky/bitbake/lib/bb/runqueue.py @@ -155,7 +155,7 @@ class RunQueueScheduler(object): self.stamps = {} for tid in self.rqdata.runtaskentries: (mc, fn, taskname, taskfn) = split_tid_mcfn(tid) - self.stamps[tid] = bb.build.stampfile(taskname, self.rqdata.dataCaches[mc], taskfn, noextra=True) + self.stamps[tid] = bb.parse.siggen.stampfile_mcfn(taskname, taskfn, extrainfo=False) if tid in self.rq.runq_buildable: self.buildable.append(tid) @@ -651,6 +651,8 @@ class RunQueueData: # Nothing to do return 0 + bb.parse.siggen.setup_datacache(self.dataCaches) + self.init_progress_reporter.start() self.init_progress_reporter.next_stage() @@ -698,6 +700,8 @@ class RunQueueData: frommc = mcdependency[1] mcdep = mcdependency[2] deptask = mcdependency[4] + if mcdep not in taskData: + bb.fatal("Multiconfig '%s' is referenced in multiconfig dependency '%s' but not enabled in BBMULTICONFIG?" 
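Editor's note: the new `__addpylib_regexp__` in ConfHandler splits an `addpylib <dir> <namespace>` directive into its two arguments, and the `baseconfig and conffile` guard means the directive is only honoured in base configuration files, not in recipes or classes. Because the first group is greedy, the final whitespace-separated token always becomes the namespace. A quick check of the capture behaviour (the `${LAYERDIR}` path is illustrative):

```python
import re

# Same pattern as the new __addpylib_regexp__ in ConfHandler.py.
addpylib = re.compile(r"addpylib\s+(.+)\s+(.+)")

m = addpylib.match("addpylib ${LAYERDIR}/lib oe")
print(m.group(1))  # ${LAYERDIR}/lib
print(m.group(2))  # oe
```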
% (mcdep, dep)) if mc == frommc: fn = taskData[mcdep].build_targets[pn][0] newdep = '%s:%s' % (fn,deptask) @@ -933,7 +937,7 @@ class RunQueueData: bb.debug(1, "Task %s is marked nostamp, cannot invalidate this task" % taskname) else: logger.verbose("Invalidate task %s, %s", taskname, fn) - bb.parse.siggen.invalidate_task(taskname, self.dataCaches[mc], taskfn) + bb.parse.siggen.invalidate_task(taskname, taskfn) self.target_tids = [] for (mc, target, task, fn) in self.targets: @@ -1243,9 +1247,8 @@ class RunQueueData: return len(self.runtaskentries) def prepare_task_hash(self, tid): - dc = bb.parse.siggen.get_data_caches(self.dataCaches, mc_from_tid(tid)) - bb.parse.siggen.prep_taskhash(tid, self.runtaskentries[tid].depends, dc) - self.runtaskentries[tid].hash = bb.parse.siggen.get_taskhash(tid, self.runtaskentries[tid].depends, dc) + bb.parse.siggen.prep_taskhash(tid, self.runtaskentries[tid].depends, self.dataCaches) + self.runtaskentries[tid].hash = bb.parse.siggen.get_taskhash(tid, self.runtaskentries[tid].depends, self.dataCaches) self.runtaskentries[tid].unihash = bb.parse.siggen.get_unihash(tid) def dump_data(self): @@ -1311,10 +1314,6 @@ class RunQueue: workerpipe = runQueuePipe(worker.stdout, None, self.cfgData, self, rqexec, fakerootlogs=fakerootlogs) workerdata = { - "taskdeps" : self.rqdata.dataCaches[mc].task_deps, - "fakerootenv" : self.rqdata.dataCaches[mc].fakerootenv, - "fakerootdirs" : self.rqdata.dataCaches[mc].fakerootdirs, - "fakerootnoenv" : self.rqdata.dataCaches[mc].fakerootnoenv, "sigdata" : bb.parse.siggen.get_taskdata(), "logdefaultlevel" : bb.msg.loggerDefaultLogLevel, "build_verbose_shell" : self.cooker.configuration.build_verbose_shell, @@ -1399,7 +1398,7 @@ class RunQueue: if taskname is None: taskname = tn - stampfile = bb.build.stampfile(taskname, self.rqdata.dataCaches[mc], taskfn) + stampfile = bb.parse.siggen.stampfile_mcfn(taskname, taskfn) # If the stamp is missing, it's not current if not os.access(stampfile, os.F_OK): @@ 
-1411,7 +1410,7 @@ class RunQueue: logger.debug2("%s.%s is nostamp\n", fn, taskname) return False - if taskname != "do_setscene" and taskname.endswith("_setscene"): + if taskname.endswith("_setscene"): return True if cache is None: @@ -1422,8 +1421,8 @@ class RunQueue: for dep in self.rqdata.runtaskentries[tid].depends: if iscurrent: (mc2, fn2, taskname2, taskfn2) = split_tid_mcfn(dep) - stampfile2 = bb.build.stampfile(taskname2, self.rqdata.dataCaches[mc2], taskfn2) - stampfile3 = bb.build.stampfile(taskname2 + "_setscene", self.rqdata.dataCaches[mc2], taskfn2) + stampfile2 = bb.parse.siggen.stampfile_mcfn(taskname2, taskfn2) + stampfile3 = bb.parse.siggen.stampfile_mcfn(taskname2 + "_setscene", taskfn2) t2 = get_timestamp(stampfile2) t3 = get_timestamp(stampfile3) if t3 and not t2: @@ -1512,7 +1511,7 @@ class RunQueue: if not self.dm_event_handler_registered: res = bb.event.register(self.dm_event_handler_name, - lambda x: self.dm.check(self) if self.state in [runQueueRunning, runQueueCleanUp] else False, + lambda x, y: self.dm.check(self) if self.state in [runQueueRunning, runQueueCleanUp] else False, ('bb.event.HeartbeatEvent',), data=self.cfgData) self.dm_event_handler_registered = True @@ -1609,29 +1608,28 @@ class RunQueue: else: self.rqexe.finish() - def rq_dump_sigfn(self, fn, options): - bb_cache = bb.cache.NoCache(self.cooker.databuilder) - mc = bb.runqueue.mc_from_tid(fn) - the_data = bb_cache.loadDataFull(fn, self.cooker.collections[mc].get_file_appends(fn)) - siggen = bb.parse.siggen - dataCaches = self.rqdata.dataCaches - siggen.dump_sigfn(fn, dataCaches, options) + def _rq_dump_sigtid(self, tids): + for tid in tids: + (mc, fn, taskname, taskfn) = split_tid_mcfn(tid) + dataCaches = self.rqdata.dataCaches + bb.parse.siggen.dump_sigtask(taskfn, taskname, dataCaches[mc].stamp[taskfn], True) def dump_signatures(self, options): - fns = set() - bb.note("Reparsing files to collect dependency data") + if bb.cooker.CookerFeatures.RECIPE_SIGGEN_INFO not in 
self.cooker.featureset: + bb.fatal("The dump signatures functionality needs the RECIPE_SIGGEN_INFO feature enabled") - for tid in self.rqdata.runtaskentries: - fn = fn_from_tid(tid) - fns.add(fn) + bb.note("Writing task signature files") max_process = int(self.cfgData.getVar("BB_NUMBER_PARSE_THREADS") or os.cpu_count() or 1) + def chunkify(l, n): + return [l[i::n] for i in range(n)] + tids = chunkify(list(self.rqdata.runtaskentries), max_process) # We cannot use the real multiprocessing.Pool easily due to some local data # that can't be pickled. This is a cheap multi-process solution. launched = [] - while fns: + while tids: if len(launched) < max_process: - p = Process(target=self.rq_dump_sigfn, args=(fns.pop(), options)) + p = Process(target=self._rq_dump_sigtid, args=(tids.pop(), )) p.start() launched.append(p) for q in launched: @@ -2140,21 +2138,33 @@ class RunQueueExecute: startevent = sceneQueueTaskStarted(task, self.stats, self.rq) bb.event.fire(startevent, self.cfgData) - taskdepdata = self.sq_build_taskdepdata(task) - taskdep = self.rqdata.dataCaches[mc].task_deps[taskfn] - taskhash = self.rqdata.get_task_hash(task) - unihash = self.rqdata.get_task_unihash(task) + runtask = { + 'fn' : taskfn, + 'task' : task, + 'taskname' : taskname, + 'taskhash' : self.rqdata.get_task_hash(task), + 'unihash' : self.rqdata.get_task_unihash(task), + 'quieterrors' : True, + 'appends' : self.cooker.collections[mc].get_file_appends(taskfn), + 'taskdepdata' : self.sq_build_taskdepdata(task), + 'dry_run' : False, + 'taskdep': taskdep, + 'fakerootenv' : self.rqdata.dataCaches[mc].fakerootenv[taskfn], + 'fakerootdirs' : self.rqdata.dataCaches[mc].fakerootdirs[taskfn], + 'fakerootnoenv' : self.rqdata.dataCaches[mc].fakerootnoenv[taskfn] + } + if 'fakeroot' in taskdep and taskname in taskdep['fakeroot'] and not self.cooker.configuration.dry_run: if not mc in self.rq.fakeworker: self.rq.start_fakeworker(self, mc) - self.rq.fakeworker[mc].process.stdin.write(b"<runtask>" + 
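Editor's note: the rewritten `dump_signatures` no longer reparses recipe files; it requires the new `RECIPE_SIGGEN_INFO` cooker feature and splits the task list into `max_process` interleaved slices with `chunkify`, handing each slice to one worker process. The slicing behaves like this:

```python
def chunkify(l, n):
    # l[i::n] takes every n-th element starting at offset i, so element
    # k lands in slice k % n and slice lengths differ by at most one.
    return [l[i::n] for i in range(n)]

print(chunkify(list(range(10)), 3))
# [[0, 3, 6, 9], [1, 4, 7], [2, 5, 8]]
```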
pickle.dumps((taskfn, task, taskname, taskhash, unihash, True, self.cooker.collections[mc].get_file_appends(taskfn), taskdepdata, False)) + b"</runtask>") + self.rq.fakeworker[mc].process.stdin.write(b"<runtask>" + pickle.dumps(runtask) + b"</runtask>") self.rq.fakeworker[mc].process.stdin.flush() else: - self.rq.worker[mc].process.stdin.write(b"<runtask>" + pickle.dumps((taskfn, task, taskname, taskhash, unihash, True, self.cooker.collections[mc].get_file_appends(taskfn), taskdepdata, False)) + b"</runtask>") + self.rq.worker[mc].process.stdin.write(b"<runtask>" + pickle.dumps(runtask) + b"</runtask>") self.rq.worker[mc].process.stdin.flush() - self.build_stamps[task] = bb.build.stampfile(taskname, self.rqdata.dataCaches[mc], taskfn, noextra=True) + self.build_stamps[task] = bb.parse.siggen.stampfile_mcfn(taskname, taskfn, extrainfo=False) self.build_stamps2.append(self.build_stamps[task]) self.sq_running.add(task) self.sq_live.add(task) @@ -2214,18 +2224,30 @@ class RunQueueExecute: self.runq_running.add(task) self.stats.taskActive() if not (self.cooker.configuration.dry_run or self.rqdata.setscene_enforce): - bb.build.make_stamp(taskname, self.rqdata.dataCaches[mc], taskfn) + bb.build.make_stamp_mcfn(taskname, taskfn) self.task_complete(task) return True else: startevent = runQueueTaskStarted(task, self.stats, self.rq) bb.event.fire(startevent, self.cfgData) - taskdepdata = self.build_taskdepdata(task) - taskdep = self.rqdata.dataCaches[mc].task_deps[taskfn] - taskhash = self.rqdata.get_task_hash(task) - unihash = self.rqdata.get_task_unihash(task) + runtask = { + 'fn' : taskfn, + 'task' : task, + 'taskname' : taskname, + 'taskhash' : self.rqdata.get_task_hash(task), + 'unihash' : self.rqdata.get_task_unihash(task), + 'quieterrors' : False, + 'appends' : self.cooker.collections[mc].get_file_appends(taskfn), + 'taskdepdata' : self.build_taskdepdata(task), + 'dry_run' : self.rqdata.setscene_enforce, + 'taskdep': taskdep, + 'fakerootenv' : 
self.rqdata.dataCaches[mc].fakerootenv[taskfn], + 'fakerootdirs' : self.rqdata.dataCaches[mc].fakerootdirs[taskfn], + 'fakerootnoenv' : self.rqdata.dataCaches[mc].fakerootnoenv[taskfn] + } + if 'fakeroot' in taskdep and taskname in taskdep['fakeroot'] and not (self.cooker.configuration.dry_run or self.rqdata.setscene_enforce): if not mc in self.rq.fakeworker: try: @@ -2235,13 +2257,13 @@ class RunQueueExecute: self.rq.state = runQueueFailed self.stats.taskFailed() return True - self.rq.fakeworker[mc].process.stdin.write(b"<runtask>" + pickle.dumps((taskfn, task, taskname, taskhash, unihash, False, self.cooker.collections[mc].get_file_appends(taskfn), taskdepdata, self.rqdata.setscene_enforce)) + b"</runtask>") + self.rq.fakeworker[mc].process.stdin.write(b"<runtask>" + pickle.dumps(runtask) + b"</runtask>") self.rq.fakeworker[mc].process.stdin.flush() else: - self.rq.worker[mc].process.stdin.write(b"<runtask>" + pickle.dumps((taskfn, task, taskname, taskhash, unihash, False, self.cooker.collections[mc].get_file_appends(taskfn), taskdepdata, self.rqdata.setscene_enforce)) + b"</runtask>") + self.rq.worker[mc].process.stdin.write(b"<runtask>" + pickle.dumps(runtask) + b"</runtask>") self.rq.worker[mc].process.stdin.flush() - self.build_stamps[task] = bb.build.stampfile(taskname, self.rqdata.dataCaches[mc], taskfn, noextra=True) + self.build_stamps[task] = bb.parse.siggen.stampfile_mcfn(taskname, taskfn, extrainfo=False) self.build_stamps2.append(self.build_stamps[task]) self.runq_running.add(task) self.stats.taskActive() @@ -2413,8 +2435,7 @@ class RunQueueExecute: if self.rqdata.runtaskentries[p].depends and not self.rqdata.runtaskentries[tid].depends.isdisjoint(total): continue orighash = self.rqdata.runtaskentries[tid].hash - dc = bb.parse.siggen.get_data_caches(self.rqdata.dataCaches, mc_from_tid(tid)) - newhash = bb.parse.siggen.get_taskhash(tid, self.rqdata.runtaskentries[tid].depends, dc) + newhash = bb.parse.siggen.get_taskhash(tid, 
self.rqdata.runtaskentries[tid].depends, self.rqdata.dataCaches) origuni = self.rqdata.runtaskentries[tid].unihash newuni = bb.parse.siggen.get_unihash(tid) # FIXME, need to check it can come from sstate at all for determinism? @@ -2489,17 +2510,6 @@ class RunQueueExecute: self.sq_buildable.remove(tid) if tid in self.sq_running: self.sq_running.remove(tid) - harddepfail = False - for t in self.sqdata.sq_harddeps: - if tid in self.sqdata.sq_harddeps[t] and t in self.scenequeue_notcovered: - harddepfail = True - break - if not harddepfail and self.sqdata.sq_revdeps[tid].issubset(self.scenequeue_covered | self.scenequeue_notcovered): - if tid not in self.sq_buildable: - self.sq_buildable.add(tid) - if not self.sqdata.sq_revdeps[tid]: - self.sq_buildable.add(tid) - if tid in self.sqdata.outrightfail: self.sqdata.outrightfail.remove(tid) if tid in self.scenequeue_notcovered: @@ -2510,7 +2520,7 @@ class RunQueueExecute: self.scenequeue_notneeded.remove(tid) (mc, fn, taskname, taskfn) = split_tid_mcfn(tid) - self.sqdata.stamps[tid] = bb.build.stampfile(taskname + "_setscene", self.rqdata.dataCaches[mc], taskfn, noextra=True) + self.sqdata.stamps[tid] = bb.parse.siggen.stampfile_mcfn(taskname, taskfn, extrainfo=False) if tid in self.stampcache: del self.stampcache[tid] @@ -2518,24 +2528,40 @@ class RunQueueExecute: if tid in self.build_stamps: del self.build_stamps[tid] - update_tasks.append((tid, harddepfail, tid in self.sqdata.valid)) + update_tasks.append(tid) - if update_tasks: + update_tasks2 = [] + for tid in update_tasks: + harddepfail = False + for t in self.sqdata.sq_harddeps: + if tid in self.sqdata.sq_harddeps[t] and t in self.scenequeue_notcovered: + harddepfail = True + break + if not harddepfail and self.sqdata.sq_revdeps[tid].issubset(self.scenequeue_covered | self.scenequeue_notcovered): + if tid not in self.sq_buildable: + self.sq_buildable.add(tid) + if not self.sqdata.sq_revdeps[tid]: + self.sq_buildable.add(tid) + + update_tasks2.append((tid, 
harddepfail, tid in self.sqdata.valid)) + + if update_tasks2: self.sqdone = False for mc in sorted(self.sqdata.multiconfigs): - for tid in sorted([t[0] for t in update_tasks]): + for tid in sorted([t[0] for t in update_tasks2]): if mc_from_tid(tid) != mc: continue h = pending_hash_index(tid, self.rqdata) if h in self.sqdata.hashes and tid != self.sqdata.hashes[h]: self.sq_deferred[tid] = self.sqdata.hashes[h] bb.note("Deferring %s after %s" % (tid, self.sqdata.hashes[h])) - update_scenequeue_data([t[0] for t in update_tasks], self.sqdata, self.rqdata, self.rq, self.cooker, self.stampcache, self, summary=False) + update_scenequeue_data([t[0] for t in update_tasks2], self.sqdata, self.rqdata, self.rq, self.cooker, self.stampcache, self, summary=False) - for (tid, harddepfail, origvalid) in update_tasks: + for (tid, harddepfail, origvalid) in update_tasks2: if tid in self.sqdata.valid and not origvalid: hashequiv_logger.verbose("Setscene task %s became valid" % tid) if harddepfail: + logger.debug2("%s has an unavailable hard dependency so skipping" % (tid)) self.sq_task_failoutright(tid) if changed: @@ -2810,7 +2836,8 @@ def build_scenequeue_data(sqdata, rqdata, rq, cooker, stampcache, sqrq): (mc, fn, taskname, taskfn) = split_tid_mcfn(tid) realtid = tid + "_setscene" idepends = rqdata.taskData[mc].taskentries[realtid].idepends - sqdata.stamps[tid] = bb.build.stampfile(taskname + "_setscene", rqdata.dataCaches[mc], taskfn, noextra=True) + sqdata.stamps[tid] = bb.parse.siggen.stampfile_mcfn(taskname, taskfn, extrainfo=False) + for (depname, idependtask) in idepends: if depname not in rqdata.taskData[mc].build_targets: @@ -2889,7 +2916,7 @@ def build_scenequeue_data(sqdata, rqdata, rq, cooker, stampcache, sqrq): found = {} for tid in rqdata.runq_setscene_tids: (mc, fn, taskname, taskfn) = split_tid_mcfn(tid) - stamps = bb.build.find_stale_stamps(taskname, rqdata.dataCaches[mc], taskfn) + stamps = bb.build.find_stale_stamps(taskname, taskfn) if stamps: if mc not in 
found: found[mc] = {} @@ -2905,7 +2932,7 @@ def check_setscene_stamps(tid, rqdata, rq, stampcache, noexecstamp=False): taskdep = rqdata.dataCaches[mc].task_deps[taskfn] if 'noexec' in taskdep and taskname in taskdep['noexec']: - bb.build.make_stamp(taskname + "_setscene", rqdata.dataCaches[mc], taskfn) + bb.build.make_stamp_mcfn(taskname + "_setscene", taskfn) return True, False if rq.check_stamp_task(tid, taskname + "_setscene", cache=stampcache): @@ -2935,11 +2962,13 @@ def update_scenequeue_data(tids, sqdata, rqdata, rq, cooker, stampcache, sqrq, s if noexec: sqdata.noexec.add(tid) sqrq.sq_task_skip(tid) + logger.debug2("%s is noexec so skipping setscene" % (tid)) continue if stamppresent: sqdata.stamppresent.add(tid) sqrq.sq_task_skip(tid) + logger.debug2("%s has a valid stamp, skipping" % (tid)) continue tocheck.add(tid) @@ -2960,6 +2989,7 @@ def update_scenequeue_data(tids, sqdata, rqdata, rq, cooker, stampcache, sqrq, s if tid in sqrq.sq_deferred: continue sqdata.outrightfail.add(tid) + logger.debug2("%s already handled (fallthrough), skipping" % (tid)) class TaskFailure(Exception): """ diff --git a/poky/bitbake/lib/bb/server/process.py b/poky/bitbake/lib/bb/server/process.py index 5d02c0b9f5..529196b78c 100644 --- a/poky/bitbake/lib/bb/server/process.py +++ b/poky/bitbake/lib/bb/server/process.py @@ -28,6 +28,7 @@ import datetime import pickle import traceback import gc +import stat import bb.server.xmlrpcserver from bb import daemonize from multiprocessing import queues @@ -41,6 +42,39 @@ def serverlog(msg): print(str(os.getpid()) + " " + datetime.datetime.now().strftime('%H:%M:%S.%f') + " " + msg) sys.stdout.flush() +# +# When we have lockfile issues, try and find infomation about which process is +# using the lockfile +# +def get_lockfile_process_msg(lockfile): + # Some systems may not have lsof available + procs = None + try: + procs = subprocess.check_output(["lsof", '-w', lockfile], stderr=subprocess.STDOUT) + except subprocess.CalledProcessError: + # 
File was deleted? + pass + except OSError as e: + if e.errno != errno.ENOENT: + raise + if procs is None: + # Fall back to fuser if lsof is unavailable + try: + procs = subprocess.check_output(["fuser", '-v', lockfile], stderr=subprocess.STDOUT) + except subprocess.CalledProcessError: + # File was deleted? + pass + except OSError as e: + if e.errno != errno.ENOENT: + raise + if procs: + return procs.decode("utf-8") + return None + +class idleFinish(): + def __init__(self, msg): + self.msg = msg + class ProcessServer(): profile_filename = "profile.log" profile_processed_filename = "profile.log.processed" @@ -58,12 +92,19 @@ class ProcessServer(): self.maxuiwait = 30 self.xmlrpc = False + self.idle = None + # Need a lock for _idlefuns changes self._idlefuns = {} + self._idlefuncsLock = threading.Lock() + self.idle_cond = threading.Condition(self._idlefuncsLock) self.bitbake_lock = lock self.bitbake_lock_name = lockname self.sock = sock self.sockname = sockname + # It is possible the directory may be renamed. Cache the inode of the socket file + # so we can tell if things changed. 
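Editor's note: `get_lockfile_process_msg` above tries `lsof` first and falls back to `fuser`, treating both "tool not installed" (the ENOENT `OSError`) and "file already gone" (`CalledProcessError`) as soft failures. The pattern generalizes to any ordered list of diagnostic commands; this sketch is an assumption-level illustration (the missing tool name is made up, `echo` stands in for a real diagnostic, and `FileNotFoundError` is used as the ENOENT subclass of `OSError`), not BitBake code:

```python
import subprocess

def first_tool_output(cmds, path):
    """Run each candidate command against path; return the first
    successful output, tolerating absent tools and vanished files."""
    for cmd in cmds:
        try:
            return subprocess.check_output(cmd + [path],
                                           stderr=subprocess.STDOUT).decode("utf-8")
        except (FileNotFoundError, subprocess.CalledProcessError):
            continue  # tool absent or file deleted: try the next one
    return None

# "no-such-diagnostic-tool" is hypothetical; echo stands in for fuser.
print(first_tool_output([["no-such-diagnostic-tool"], ["echo"]], "bitbake.lock"))
# bitbake.lock
```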
+ self.sockinode = os.stat(self.sockname)[stat.ST_INO] self.server_timeout = server_timeout self.timeout = self.server_timeout @@ -72,7 +113,9 @@ class ProcessServer(): def register_idle_function(self, function, data): """Register a function to be called while the server is idle""" assert hasattr(function, '__call__') - self._idlefuns[function] = data + with bb.utils.lock_timeout(self._idlefuncsLock): + self._idlefuns[function] = data + serverlog("Registering idle function %s" % str(function)) def run(self): @@ -111,6 +154,31 @@ class ProcessServer(): return ret + def _idle_check(self): + return len(self._idlefuns) == 0 and self.cooker.command.currentAsyncCommand is None + + def wait_for_idle(self, timeout=30): + # Wait for the idle loop to have cleared + with bb.utils.lock_timeout(self._idlefuncsLock): + return self.idle_cond.wait_for(self._idle_check, timeout) is not False + + def set_async_cmd(self, cmd): + with bb.utils.lock_timeout(self._idlefuncsLock): + ret = self.idle_cond.wait_for(self._idle_check, 30) + if ret is False: + return False + self.cooker.command.currentAsyncCommand = cmd + return True + + def clear_async_cmd(self): + with bb.utils.lock_timeout(self._idlefuncsLock): + self.cooker.command.currentAsyncCommand = None + self.idle_cond.notify_all() + + def get_async_cmd(self): + with bb.utils.lock_timeout(self._idlefuncsLock): + return self.cooker.command.currentAsyncCommand + def main(self): self.cooker.pre_serve() @@ -125,14 +193,19 @@ class ProcessServer(): fds.append(self.xmlrpc) seendata = False serverlog("Entering server connection loop") + serverlog("Lockfile is: %s\nSocket is %s (%s)" % (self.bitbake_lock_name, self.sockname, os.path.exists(self.sockname))) def disconnect_client(self, fds): - serverlog("Disconnecting Client") + serverlog("Disconnecting Client (socket: %s)" % os.path.exists(self.sockname)) if self.controllersock: fds.remove(self.controllersock) self.controllersock.close() self.controllersock = False if self.haveui: + # Wait 
for the idle loop to have cleared (30s max) + if not self.wait_for_idle(30): + serverlog("Idle loop didn't finish queued commands after 30s, exiting.") + self.quit = True fds.remove(self.command_channel) bb.event.unregister_UIHhandler(self.event_handle, True) self.command_channel_reply.writer.close() @@ -144,7 +217,7 @@ class ProcessServer(): self.cooker.clientComplete() self.haveui = False ready = select.select(fds,[],[],0)[0] - if newconnections: + if newconnections and not self.quit: serverlog("Starting new client") conn = newconnections.pop(-1) fds.append(conn) @@ -216,8 +289,8 @@ class ProcessServer(): continue try: serverlog("Running command %s" % command) - self.command_channel_reply.send(self.cooker.command.runCommand(command)) - serverlog("Command Completed") + self.command_channel_reply.send(self.cooker.command.runCommand(command, self)) + serverlog("Command Completed (socket: %s)" % os.path.exists(self.sockname)) except Exception as e: stack = traceback.format_exc() serverlog('Exception in server main event loop running command %s (%s)' % (command, stack)) @@ -244,16 +317,25 @@ class ProcessServer(): ready = self.idle_commands(.1, fds) - serverlog("Exiting") + if self.idle: + self.idle.join() + + serverlog("Exiting (socket: %s)" % os.path.exists(self.sockname)) # Remove the socket file so we don't get any more connections to avoid races + # The build directory could have been renamed so if the file isn't the one we created + # we shouldn't delete it. try: - os.unlink(self.sockname) - except: - pass + sockinode = os.stat(self.sockname)[stat.ST_INO] + if sockinode == self.sockinode: + os.unlink(self.sockname) + else: + serverlog("bitbake.sock inode mismatch (%s vs %s), not deleting." 
% (sockinode, self.sockinode)) + except Exception as err: + serverlog("Removing socket file '%s' failed (%s)" % (self.sockname, err)) self.sock.close() try: - self.cooker.shutdown(True) + self.cooker.shutdown(True, idle=False) self.cooker.notifier.stop() self.cooker.confignotifier.stop() except: @@ -306,80 +388,90 @@ class ProcessServer(): return if not lock: - # Some systems may not have lsof available - procs = None - try: - procs = subprocess.check_output(["lsof", '-w', lockfile], stderr=subprocess.STDOUT) - except subprocess.CalledProcessError: - # File was deleted? - continue - except OSError as e: - if e.errno != errno.ENOENT: - raise - if procs is None: - # Fall back to fuser if lsof is unavailable - try: - procs = subprocess.check_output(["fuser", '-v', lockfile], stderr=subprocess.STDOUT) - except subprocess.CalledProcessError: - # File was deleted? - continue - except OSError as e: - if e.errno != errno.ENOENT: - raise - + procs = get_lockfile_process_msg(lockfile) msg = ["Delaying shutdown due to active processes which appear to be holding bitbake.lock"] if procs: - msg.append(":\n%s" % str(procs.decode("utf-8"))) + msg.append(":\n%s" % procs) serverlog("".join(msg)) - def idle_commands(self, delay, fds=None): - nextsleep = delay - if not fds: + def idle_thread(self): + def remove_idle_func(function): + with bb.utils.lock_timeout(self._idlefuncsLock): + del self._idlefuns[function] + self.idle_cond.notify_all() + + while not self.quit: + nextsleep = 0.1 fds = [] - for function, data in list(self._idlefuns.items()): - try: - retval = function(self, data, False) - if retval is False: - del self._idlefuns[function] - nextsleep = None - elif retval is True: - nextsleep = None - elif isinstance(retval, float) and nextsleep: - if (retval < nextsleep): - nextsleep = retval - elif nextsleep is None: - continue - else: - fds = fds + retval - except SystemExit: - raise - except Exception as exc: - if not isinstance(exc, bb.BBHandledException): - 
logger.exception('Running idle function') - del self._idlefuns[function] - self.quit = True + self.cooker.process_inotify_updates() + + with bb.utils.lock_timeout(self._idlefuncsLock): + items = list(self._idlefuns.items()) - # Create new heartbeat event? - now = time.time() - if now >= self.next_heartbeat: - # We might have missed heartbeats. Just trigger once in - # that case and continue after the usual delay. - self.next_heartbeat += self.heartbeat_seconds - if self.next_heartbeat <= now: - self.next_heartbeat = now + self.heartbeat_seconds - if hasattr(self.cooker, "data"): - heartbeat = bb.event.HeartbeatEvent(now) + for function, data in items: try: - bb.event.fire(heartbeat, self.cooker.data) + retval = function(self, data, False) + if isinstance(retval, idleFinish): + serverlog("Removing idle function %s at idleFinish" % str(function)) + remove_idle_func(function) + self.cooker.command.finishAsyncCommand(retval.msg) + nextsleep = None + elif retval is False: + serverlog("Removing idle function %s" % str(function)) + remove_idle_func(function) + nextsleep = None + elif retval is True: + nextsleep = None + elif isinstance(retval, float) and nextsleep: + if (retval < nextsleep): + nextsleep = retval + elif nextsleep is None: + continue + else: + fds = fds + retval + except SystemExit: + raise except Exception as exc: if not isinstance(exc, bb.BBHandledException): - logger.exception('Running heartbeat function') + logger.exception('Running idle function') + remove_idle_func(function) + serverlog("Exception %s broke the idle_thread, exiting" % traceback.format_exc()) self.quit = True - if nextsleep and now + nextsleep > self.next_heartbeat: - # Shorten timeout so that we we wake up in time for - # the heartbeat. - nextsleep = self.next_heartbeat - now + + # Create new heartbeat event? + now = time.time() + if bb.event._heartbeat_enabled and now >= self.next_heartbeat: + # We might have missed heartbeats. 
Just trigger once in + # that case and continue after the usual delay. + self.next_heartbeat += self.heartbeat_seconds + if self.next_heartbeat <= now: + self.next_heartbeat = now + self.heartbeat_seconds + if hasattr(self.cooker, "data"): + heartbeat = bb.event.HeartbeatEvent(now) + try: + bb.event.fire(heartbeat, self.cooker.data) + except Exception as exc: + if not isinstance(exc, bb.BBHandledException): + logger.exception('Running heartbeat function') + serverlog("Exception %s broke in idle_thread, exiting" % traceback.format_exc()) + self.quit = True + if nextsleep and bb.event._heartbeat_enabled and now + nextsleep > self.next_heartbeat: + # Shorten timeout so that we we wake up in time for + # the heartbeat. + nextsleep = self.next_heartbeat - now + + if nextsleep is not None: + select.select(fds,[],[],nextsleep)[0] + + def idle_commands(self, delay, fds=None): + nextsleep = delay + if not fds: + fds = [] + + if not self.idle: + self.idle = threading.Thread(target=self.idle_thread) + self.idle.start() if nextsleep is not None: if self.xmlrpc: @@ -448,13 +540,14 @@ start_log_datetime_format = '%Y-%m-%d %H:%M:%S.%f' class BitBakeServer(object): - def __init__(self, lock, sockname, featureset, server_timeout, xmlrpcinterface): + def __init__(self, lock, sockname, featureset, server_timeout, xmlrpcinterface, profile): self.server_timeout = server_timeout self.xmlrpcinterface = xmlrpcinterface self.featureset = featureset self.sockname = sockname self.bitbake_lock = lock + self.profile = profile self.readypipe, self.readypipein = os.pipe() # Place the log in the builddirectory alongside the lock file @@ -518,9 +611,9 @@ class BitBakeServer(object): os.set_inheritable(self.bitbake_lock.fileno(), True) os.set_inheritable(self.readypipein, True) serverscript = os.path.realpath(os.path.dirname(__file__) + "/../../../bin/bitbake-server") - os.execl(sys.executable, "bitbake-server", serverscript, "decafbad", str(self.bitbake_lock.fileno()), str(self.readypipein), 
self.logfile, self.bitbake_lock.name, self.sockname, str(self.server_timeout or 0), str(self.xmlrpcinterface[0]), str(self.xmlrpcinterface[1])) + os.execl(sys.executable, "bitbake-server", serverscript, "decafbad", str(self.bitbake_lock.fileno()), str(self.readypipein), self.logfile, self.bitbake_lock.name, self.sockname, str(self.server_timeout or 0), str(int(self.profile)), str(self.xmlrpcinterface[0]), str(self.xmlrpcinterface[1])) -def execServer(lockfd, readypipeinfd, lockname, sockname, server_timeout, xmlrpcinterface): +def execServer(lockfd, readypipeinfd, lockname, sockname, server_timeout, xmlrpcinterface, profile): import bb.cookerdata import bb.cooker @@ -532,6 +625,7 @@ def execServer(lockfd, readypipeinfd, lockname, sockname, server_timeout, xmlrpc # Create server control socket if os.path.exists(sockname): + serverlog("WARNING: removing existing socket file '%s'" % sockname) os.unlink(sockname) sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) @@ -548,7 +642,8 @@ def execServer(lockfd, readypipeinfd, lockname, sockname, server_timeout, xmlrpc writer = ConnectionWriter(readypipeinfd) try: featureset = [] - cooker = bb.cooker.BBCooker(featureset, server.register_idle_function) + cooker = bb.cooker.BBCooker(featureset, server) + cooker.configuration.profile = profile except bb.BBHandledException: return None writer.send("r") @@ -667,7 +762,7 @@ class BBUIEventQueue: self.t.start() def getEvent(self): - with self.eventQueueLock: + with bb.utils.lock_timeout(self.eventQueueLock): if len(self.eventQueue) == 0: return None @@ -682,7 +777,7 @@ class BBUIEventQueue: return self.getEvent() def queue_event(self, event): - with self.eventQueueLock: + with bb.utils.lock_timeout(self.eventQueueLock): self.eventQueue.append(event) self.eventQueueNotify.set() @@ -718,7 +813,7 @@ class ConnectionReader(object): return self.reader.poll(timeout) def get(self): - with self.rlock: + with bb.utils.lock_timeout(self.rlock): res = self.reader.recv_bytes() return 
multiprocessing.reduction.ForkingPickler.loads(res) @@ -739,7 +834,7 @@ class ConnectionWriter(object): def _send(self, obj): gc.disable() - with self.wlock: + with bb.utils.lock_timeout(self.wlock): self.writer.send_bytes(obj) gc.enable() @@ -752,7 +847,7 @@ class ConnectionWriter(object): # pthread_sigmask block/unblock would be nice but doesn't work, https://bugs.python.org/issue47139 process = multiprocessing.current_process() if process and hasattr(process, "queue_signals"): - with process.signal_threadlock: + with bb.utils.lock_timeout(process.signal_threadlock): process.queue_signals = True self._send(obj) process.queue_signals = False diff --git a/poky/bitbake/lib/bb/server/xmlrpcserver.py b/poky/bitbake/lib/bb/server/xmlrpcserver.py index 01f55538ae..2e65dc34a9 100644 --- a/poky/bitbake/lib/bb/server/xmlrpcserver.py +++ b/poky/bitbake/lib/bb/server/xmlrpcserver.py @@ -118,7 +118,7 @@ class BitBakeXMLRPCServerCommands(): """ Run a cooker command on the server """ - return self.server.cooker.command.runCommand(command, self.server.readonly) + return self.server.cooker.command.runCommand(command, self.server, self.server.readonly) def getEventHandle(self): return self.event_handle diff --git a/poky/bitbake/lib/bb/siggen.py b/poky/bitbake/lib/bb/siggen.py index 07bb529452..0e79404f76 100644 --- a/poky/bitbake/lib/bb/siggen.py +++ b/poky/bitbake/lib/bb/siggen.py @@ -14,6 +14,7 @@ import bb.data import difflib import simplediff import json +import types import bb.compress.zstd from bb.checksum import FileChecksumCache from bb import runqueue @@ -25,13 +26,13 @@ hashequiv_logger = logging.getLogger('BitBake.SigGen.HashEquiv') class SetEncoder(json.JSONEncoder): def default(self, obj): - if isinstance(obj, set): + if isinstance(obj, set) or isinstance(obj, frozenset): return dict(_set_object=list(sorted(obj))) return json.JSONEncoder.default(self, obj) def SetDecoder(dct): if '_set_object' in dct: - return set(dct['_set_object']) + return 
frozenset(dct['_set_object']) return dct def init(d): @@ -53,11 +54,6 @@ class SignatureGenerator(object): """ name = "noop" - # If the derived class supports multiconfig datacaches, set this to True - # The default is False for backward compatibility with derived signature - # generators that do not understand multiconfig caches - supports_multiconfig_datacaches = False - def __init__(self, data): self.basehash = {} self.taskhash = {} @@ -75,6 +71,27 @@ class SignatureGenerator(object): def postparsing_clean_cache(self): return + def setup_datacache(self, datacaches): + self.datacaches = datacaches + + def setup_datacache_from_datastore(self, mcfn, d): + # In task context we have no cache so setup internal data structures + # from the fully parsed data store provided + + mc = d.getVar("__BBMULTICONFIG", False) or "" + tasks = d.getVar('__BBTASKS', False) + + self.datacaches = {} + self.datacaches[mc] = types.SimpleNamespace() + setattr(self.datacaches[mc], "stamp", {}) + self.datacaches[mc].stamp[mcfn] = d.getVar('STAMP') + setattr(self.datacaches[mc], "stamp_extrainfo", {}) + self.datacaches[mc].stamp_extrainfo[mcfn] = {} + for t in tasks: + flag = d.getVarFlag(t, "stamp-extra-info") + if flag: + self.datacaches[mc].stamp_extrainfo[mcfn][t] = flag + def get_unihash(self, tid): return self.taskhash[tid] @@ -89,17 +106,51 @@ class SignatureGenerator(object): """Write/update the file checksum cache onto disk""" return + def stampfile_base(self, mcfn): + mc = bb.runqueue.mc_from_tid(mcfn) + return self.datacaches[mc].stamp[mcfn] + + def stampfile_mcfn(self, taskname, mcfn, extrainfo=True): + mc = bb.runqueue.mc_from_tid(mcfn) + stamp = self.datacaches[mc].stamp[mcfn] + if not stamp: + return + + stamp_extrainfo = "" + if extrainfo: + taskflagname = taskname + if taskname.endswith("_setscene"): + taskflagname = taskname.replace("_setscene", "") + stamp_extrainfo = self.datacaches[mc].stamp_extrainfo[mcfn].get(taskflagname) or "" + + return self.stampfile(stamp, mcfn, 
taskname, stamp_extrainfo) + def stampfile(self, stampbase, file_name, taskname, extrainfo): return ("%s.%s.%s" % (stampbase, taskname, extrainfo)).rstrip('.') + def stampcleanmask_mcfn(self, taskname, mcfn): + mc = bb.runqueue.mc_from_tid(mcfn) + stamp = self.datacaches[mc].stamp[mcfn] + if not stamp: + return [] + + taskflagname = taskname + if taskname.endswith("_setscene"): + taskflagname = taskname.replace("_setscene", "") + stamp_extrainfo = self.datacaches[mc].stamp_extrainfo[mcfn].get(taskflagname) or "" + + return self.stampcleanmask(stamp, mcfn, taskname, stamp_extrainfo) + def stampcleanmask(self, stampbase, file_name, taskname, extrainfo): return ("%s.%s.%s" % (stampbase, taskname, extrainfo)).rstrip('.') - def dump_sigtask(self, fn, task, stampbase, runtime): + def dump_sigtask(self, mcfn, task, stampbase, runtime): return - def invalidate_task(self, task, d, fn): - bb.build.del_stamp(task, d, fn) + def invalidate_task(self, task, mcfn): + mc = bb.runqueue.mc_from_tid(mcfn) + stamp = self.datacaches[mc].stamp[mcfn] + bb.utils.remove(stamp) def dump_sigs(self, dataCache, options): return @@ -128,38 +179,6 @@ class SignatureGenerator(object): def set_setscene_tasks(self, setscene_tasks): return - @classmethod - def get_data_caches(cls, dataCaches, mc): - """ - This function returns the datacaches that should be passed to signature - generator functions. If the signature generator supports multiconfig - caches, the entire dictionary of data caches is sent, otherwise a - special proxy is sent that support both index access to all - multiconfigs, and also direct access for the default multiconfig. - - The proxy class allows code in this class itself to always use - multiconfig aware code (to ease maintenance), but derived classes that - are unaware of multiconfig data caches can still access the default - multiconfig as expected. 
- - Do not override this function in derived classes; it will be removed in - the future when support for multiconfig data caches is mandatory - """ - class DataCacheProxy(object): - def __init__(self): - pass - - def __getitem__(self, key): - return dataCaches[key] - - def __getattr__(self, name): - return getattr(dataCaches[mc], name) - - if cls.supports_multiconfig_datacaches: - return dataCaches - - return DataCacheProxy() - def exit(self): return @@ -172,12 +191,9 @@ class SignatureGeneratorBasic(SignatureGenerator): self.basehash = {} self.taskhash = {} self.unihash = {} - self.taskdeps = {} self.runtaskdeps = {} self.file_checksum_values = {} self.taints = {} - self.gendeps = {} - self.lookupcache = {} self.setscenetasks = set() self.basehash_ignore_vars = set((data.getVar("BB_BASEHASH_IGNORE_VARS") or "").split()) self.taskhash_ignore_tasks = None @@ -201,15 +217,15 @@ class SignatureGeneratorBasic(SignatureGenerator): else: self.twl = None - def _build_data(self, fn, d): + def _build_data(self, mcfn, d): ignore_mismatch = ((d.getVar("BB_HASH_IGNORE_MISMATCH") or '') == '1') tasklist, gendeps, lookupcache = bb.data.generate_dependencies(d, self.basehash_ignore_vars) - taskdeps, basehash = bb.data.generate_dependency_hash(tasklist, gendeps, lookupcache, self.basehash_ignore_vars, fn) + taskdeps, basehash = bb.data.generate_dependency_hash(tasklist, gendeps, lookupcache, self.basehash_ignore_vars, mcfn) for task in tasklist: - tid = fn + ":" + task + tid = mcfn + ":" + task if not ignore_mismatch and tid in self.basehash and self.basehash[tid] != basehash[tid]: bb.error("When reparsing %s, the basehash value changed from %s to %s. The metadata is not deterministic and this needs to be fixed." 
% (tid, self.basehash[tid], basehash[tid])) bb.error("The following commands may help:") @@ -220,11 +236,7 @@ class SignatureGeneratorBasic(SignatureGenerator): bb.error("%s -Sprintdiff\n" % cmd) self.basehash[tid] = basehash[tid] - self.taskdeps[fn] = taskdeps - self.gendeps[fn] = gendeps - self.lookupcache[fn] = lookupcache - - return taskdeps + return taskdeps, gendeps, lookupcache def set_setscene_tasks(self, setscene_tasks): self.setscenetasks = set(setscene_tasks) @@ -232,31 +244,41 @@ class SignatureGeneratorBasic(SignatureGenerator): def finalise(self, fn, d, variant): mc = d.getVar("__BBMULTICONFIG", False) or "" + mcfn = fn if variant or mc: - fn = bb.cache.realfn2virtual(fn, variant, mc) + mcfn = bb.cache.realfn2virtual(fn, variant, mc) try: - taskdeps = self._build_data(fn, d) + taskdeps, gendeps, lookupcache = self._build_data(mcfn, d) except bb.parse.SkipRecipe: raise except: - bb.warn("Error during finalise of %s" % fn) + bb.warn("Error during finalise of %s" % mcfn) raise #Slow but can be useful for debugging mismatched basehashes - #for task in self.taskdeps[fn]: - # self.dump_sigtask(fn, task, d.getVar("STAMP"), False) + #for task in self.taskdeps[mcfn]: + # self.dump_sigtask(mcfn, task, d.getVar("STAMP"), False) + basehashes = {} for task in taskdeps: - d.setVar("BB_BASEHASH:task-%s" % task, self.basehash[fn + ":" + task]) + basehashes[task] = self.basehash[mcfn + ":" + task] - def postparsing_clean_cache(self): - # - # After parsing we can remove some things from memory to reduce our memory footprint - # - self.gendeps = {} - self.lookupcache = {} - self.taskdeps = {} + d.setVar("__siggen_basehashes", basehashes) + d.setVar("__siggen_gendeps", gendeps) + d.setVar("__siggen_varvals", lookupcache) + d.setVar("__siggen_taskdeps", taskdeps) + + def setup_datacache_from_datastore(self, mcfn, d): + super().setup_datacache_from_datastore(mcfn, d) + + mc = bb.runqueue.mc_from_tid(mcfn) + for attr in ["siggen_varvals", "siggen_taskdeps", 
"siggen_gendeps"]: + if not hasattr(self.datacaches[mc], attr): + setattr(self.datacaches[mc], attr, {}) + self.datacaches[mc].siggen_varvals[mcfn] = d.getVar("__siggen_varvals") + self.datacaches[mc].siggen_taskdeps[mcfn] = d.getVar("__siggen_taskdeps") + self.datacaches[mc].siggen_gendeps[mcfn] = d.getVar("__siggen_gendeps") def rundep_check(self, fn, recipename, task, dep, depname, dataCaches): # Return True if we should keep the dependency, False to drop it @@ -279,38 +301,33 @@ class SignatureGeneratorBasic(SignatureGenerator): def prep_taskhash(self, tid, deps, dataCaches): - (mc, _, task, fn) = bb.runqueue.split_tid_mcfn(tid) + (mc, _, task, mcfn) = bb.runqueue.split_tid_mcfn(tid) self.basehash[tid] = dataCaches[mc].basetaskhash[tid] self.runtaskdeps[tid] = [] self.file_checksum_values[tid] = [] - recipename = dataCaches[mc].pkg_fn[fn] + recipename = dataCaches[mc].pkg_fn[mcfn] self.tidtopn[tid] = recipename for dep in sorted(deps, key=clean_basepath): (depmc, _, _, depmcfn) = bb.runqueue.split_tid_mcfn(dep) depname = dataCaches[depmc].pkg_fn[depmcfn] - if not self.supports_multiconfig_datacaches and mc != depmc: - # If the signature generator doesn't understand multiconfig - # data caches, any dependency not in the same multiconfig must - # be skipped for backward compatibility - continue - if not self.rundep_check(fn, recipename, task, dep, depname, dataCaches): + if not self.rundep_check(mcfn, recipename, task, dep, depname, dataCaches): continue if dep not in self.taskhash: bb.fatal("%s is not in taskhash, caller isn't calling in dependency order?" 
% dep) self.runtaskdeps[tid].append(dep) - if task in dataCaches[mc].file_checksums[fn]: + if task in dataCaches[mc].file_checksums[mcfn]: if self.checksum_cache: - checksums = self.checksum_cache.get_checksums(dataCaches[mc].file_checksums[fn][task], recipename, self.localdirsexclude) + checksums = self.checksum_cache.get_checksums(dataCaches[mc].file_checksums[mcfn][task], recipename, self.localdirsexclude) else: - checksums = bb.fetch2.get_file_checksums(dataCaches[mc].file_checksums[fn][task], recipename, self.localdirsexclude) + checksums = bb.fetch2.get_file_checksums(dataCaches[mc].file_checksums[mcfn][task], recipename, self.localdirsexclude) for (f,cs) in checksums: self.file_checksum_values[tid].append((f,cs)) - taskdep = dataCaches[mc].task_deps[fn] + taskdep = dataCaches[mc].task_deps[mcfn] if 'nostamp' in taskdep and task in taskdep['nostamp']: # Nostamp tasks need an implicit taint so that they force any dependent tasks to run if tid in self.taints and self.taints[tid].startswith("nostamp:"): @@ -321,7 +338,7 @@ class SignatureGeneratorBasic(SignatureGenerator): taint = str(uuid.uuid4()) self.taints[tid] = "nostamp:" + taint - taint = self.read_taint(fn, task, dataCaches[mc].stamp[fn]) + taint = self.read_taint(mcfn, task, dataCaches[mc].stamp[mcfn]) if taint: self.taints[tid] = taint logger.warning("%s is tainted from a forced run" % tid) @@ -366,9 +383,9 @@ class SignatureGeneratorBasic(SignatureGenerator): def copy_unitaskhashes(self, targetdir): self.unihash_cache.copyfile(targetdir) - def dump_sigtask(self, fn, task, stampbase, runtime): - - tid = fn + ":" + task + def dump_sigtask(self, mcfn, task, stampbase, runtime): + tid = mcfn + ":" + task + mc = bb.runqueue.mc_from_tid(mcfn) referencestamp = stampbase if isinstance(runtime, str) and runtime.startswith("customfile"): sigfile = stampbase @@ -385,16 +402,16 @@ class SignatureGeneratorBasic(SignatureGenerator): data['task'] = task data['basehash_ignore_vars'] = self.basehash_ignore_vars 
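
The nostamp handling in the hunk above gives each nostamp task a random "nostamp:" taint so dependent tasks always re-run, while reusing any existing taint during a rehash so the value stays stable within a build. A minimal sketch of that idea — the module-level `taints` dict and the `nostamp_taint` helper are simplified stand-ins for illustration, not BitBake's real structures:

```python
import uuid

# Simplified stand-in for SignatureGeneratorBasic's per-task taint cache
taints = {}

def nostamp_taint(tid, rehash=False):
    """Return a 'nostamp:' taint for tid, reusing the existing one on rehash."""
    if rehash and tid in taints and taints[tid].startswith("nostamp:"):
        # Don't reset the taint value upon rehash, so the hash stays stable
        return taints[tid]
    # The specific content is unimportant; it just needs to be unique per run
    taints[tid] = "nostamp:" + str(uuid.uuid4())
    return taints[tid]
```

Because the taint feeds into the task hash, a fresh UUID per (non-rehash) run forces anything depending on a nostamp task to re-execute.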
data['taskhash_ignore_tasks'] = self.taskhash_ignore_tasks - data['taskdeps'] = self.taskdeps[fn][task] + data['taskdeps'] = self.datacaches[mc].siggen_taskdeps[mcfn][task] data['basehash'] = self.basehash[tid] data['gendeps'] = {} data['varvals'] = {} - data['varvals'][task] = self.lookupcache[fn][task] - for dep in self.taskdeps[fn][task]: + data['varvals'][task] = self.datacaches[mc].siggen_varvals[mcfn][task] + for dep in self.datacaches[mc].siggen_taskdeps[mcfn][task]: if dep in self.basehash_ignore_vars: - continue - data['gendeps'][dep] = self.gendeps[fn][dep] - data['varvals'][dep] = self.lookupcache[fn][dep] + continue + data['gendeps'][dep] = self.datacaches[mc].siggen_gendeps[mcfn][dep] + data['varvals'][dep] = self.datacaches[mc].siggen_varvals[mcfn][dep] if runtime and tid in self.taskhash: data['runtaskdeps'] = self.runtaskdeps[tid] @@ -410,7 +427,7 @@ class SignatureGeneratorBasic(SignatureGenerator): data['taskhash'] = self.taskhash[tid] data['unihash'] = self.get_unihash(tid) - taint = self.read_taint(fn, task, referencestamp) + taint = self.read_taint(mcfn, task, referencestamp) if taint: data['taint'] = taint @@ -441,18 +458,6 @@ class SignatureGeneratorBasic(SignatureGenerator): pass raise err - def dump_sigfn(self, fn, dataCaches, options): - if fn in self.taskdeps: - for task in self.taskdeps[fn]: - tid = fn + ":" + task - mc = bb.runqueue.mc_from_tid(tid) - if tid not in self.taskhash: - continue - if dataCaches[mc].basetaskhash[tid] != self.basehash[tid]: - bb.error("Bitbake's cached basehash does not match the one we just generated (%s)!" 
% tid) - bb.error("The mismatched hashes were %s and %s" % (dataCaches[mc].basetaskhash[tid], self.basehash[tid])) - self.dump_sigtask(fn, task, dataCaches[mc].stamp[fn], True) - class SignatureGeneratorBasicHash(SignatureGeneratorBasic): name = "basichash" @@ -463,11 +468,11 @@ class SignatureGeneratorBasicHash(SignatureGeneratorBasic): # If task is not in basehash, then error return self.basehash[tid] - def stampfile(self, stampbase, fn, taskname, extrainfo, clean=False): - if taskname != "do_setscene" and taskname.endswith("_setscene"): - tid = fn + ":" + taskname[:-9] + def stampfile(self, stampbase, mcfn, taskname, extrainfo, clean=False): + if taskname.endswith("_setscene"): + tid = mcfn + ":" + taskname[:-9] else: - tid = fn + ":" + taskname + tid = mcfn + ":" + taskname if clean: h = "*" else: @@ -475,12 +480,23 @@ class SignatureGeneratorBasicHash(SignatureGeneratorBasic): return ("%s.%s.%s.%s" % (stampbase, taskname, h, extrainfo)).rstrip('.') - def stampcleanmask(self, stampbase, fn, taskname, extrainfo): - return self.stampfile(stampbase, fn, taskname, extrainfo, clean=True) + def stampcleanmask(self, stampbase, mcfn, taskname, extrainfo): + return self.stampfile(stampbase, mcfn, taskname, extrainfo, clean=True) + + def invalidate_task(self, task, mcfn): + bb.note("Tainting hash to force rebuild of task %s, %s" % (mcfn, task)) + + mc = bb.runqueue.mc_from_tid(mcfn) + stamp = self.datacaches[mc].stamp[mcfn] + + taintfn = stamp + '.' 
+ task + '.taint' - def invalidate_task(self, task, d, fn): - bb.note("Tainting hash to force rebuild of task %s, %s" % (fn, task)) - bb.build.write_taint(task, d, fn) + import uuid + bb.utils.mkdirhier(os.path.dirname(taintfn)) + # The specific content of the taint file is not really important, + # we just need it to be random, so a random UUID is used + with open(taintfn, 'w') as taintf: + taintf.write(str(uuid.uuid4())) class SignatureGeneratorUniHashMixIn(object): def __init__(self, data): @@ -599,8 +615,8 @@ class SignatureGeneratorUniHashMixIn(object): unihash = d.getVar('BB_UNIHASH') report_taskdata = d.getVar('SSTATE_HASHEQUIV_REPORT_TASKDATA') == '1' tempdir = d.getVar('T') - fn = d.getVar('BB_FILENAME') - tid = fn + ':do_' + task + mcfn = d.getVar('BB_FILENAME') + tid = mcfn + ':do_' + task key = tid + ':' + taskhash if self.setscenetasks and tid not in self.setscenetasks: @@ -659,7 +675,7 @@ class SignatureGeneratorUniHashMixIn(object): if new_unihash != unihash: hashequiv_logger.debug('Task %s unihash changed %s -> %s by server %s' % (taskhash, unihash, new_unihash, self.server)) - bb.event.fire(bb.runqueue.taskUniHashUpdate(fn + ':do_' + task, new_unihash), d) + bb.event.fire(bb.runqueue.taskUniHashUpdate(mcfn + ':do_' + task, new_unihash), d) self.set_unihash(tid, new_unihash) d.setVar('BB_UNIHASH', new_unihash) else: @@ -719,19 +735,12 @@ class SignatureGeneratorTestEquivHash(SignatureGeneratorUniHashMixIn, SignatureG self.server = data.getVar('BB_HASHSERVE') self.method = "sstate_output_hash" -# -# Dummy class used for bitbake-selftest -# -class SignatureGeneratorTestMulticonfigDepends(SignatureGeneratorBasicHash): - name = "TestMulticonfigDepends" - supports_multiconfig_datacaches = True - def dump_this_task(outfile, d): import bb.parse - fn = d.getVar("BB_FILENAME") + mcfn = d.getVar("BB_FILENAME") task = "do_" + d.getVar("BB_CURRENTTASK") - referencestamp = bb.build.stamp_internal(task, d, None, True) - bb.parse.siggen.dump_sigtask(fn, task, 
                                 outfile, "customfile:" + referencestamp)
+        referencestamp = bb.parse.siggen.stampfile_base(mcfn)
+        bb.parse.siggen.dump_sigtask(mcfn, task, outfile, "customfile:" + referencestamp)

 def init_colors(enable_color):
     """Initialise colour dict for passing to compare_sigfiles()"""
@@ -1056,7 +1065,7 @@ def calc_basehash(sigdata):
     basedata = ''

     alldeps = sigdata['taskdeps']
-    for dep in alldeps:
+    for dep in sorted(alldeps):
         basedata = basedata + dep
         val = sigdata['varvals'][dep]
         if val is not None:
diff --git a/poky/bitbake/lib/bb/tests/codeparser.py b/poky/bitbake/lib/bb/tests/codeparser.py
index 71ed382ab8..a508f23bcb 100644
--- a/poky/bitbake/lib/bb/tests/codeparser.py
+++ b/poky/bitbake/lib/bb/tests/codeparser.py
@@ -318,7 +318,7 @@ d.getVar(a(), False)
             "filename": "example.bb",
         })

-        deps, values = bb.data.build_dependencies("FOO", set(self.d.keys()), set(), set(), set(), self.d)
+        deps, values = bb.data.build_dependencies("FOO", set(self.d.keys()), set(), set(), set(), set(), self.d)

         self.assertEqual(deps, set(["somevar", "bar", "something", "inexpand", "test", "test2", "a"]))

@@ -365,7 +365,7 @@ esac
         self.d.setVarFlags("FOO", {"func": True})
         self.setEmptyVars(execs)

-        deps, values = bb.data.build_dependencies("FOO", set(self.d.keys()), set(), set(), set(), self.d)
+        deps, values = bb.data.build_dependencies("FOO", set(self.d.keys()), set(), set(), set(), set(), self.d)

         self.assertEqual(deps, set(["somevar", "inverted"] + execs))

@@ -375,7 +375,7 @@ esac
         self.d.setVar("FOO", "foo=oe_libinstall; eval $foo")
         self.d.setVarFlag("FOO", "vardeps", "oe_libinstall")

-        deps, values = bb.data.build_dependencies("FOO", set(self.d.keys()), set(), set(), set(), self.d)
+        deps, values = bb.data.build_dependencies("FOO", set(self.d.keys()), set(), set(), set(), set(), self.d)

         self.assertEqual(deps, set(["oe_libinstall"]))

@@ -384,7 +384,7 @@ esac
         self.d.setVar("FOO", "foo=oe_libinstall; eval $foo")
         self.d.setVarFlag("FOO", "vardeps", "${@'oe_libinstall'}")

-        deps, values = bb.data.build_dependencies("FOO", set(self.d.keys()), set(), set(), set(), self.d)
+        deps, values = bb.data.build_dependencies("FOO", set(self.d.keys()), set(), set(), set(), set(), self.d)

         self.assertEqual(deps, set(["oe_libinstall"]))

@@ -399,7 +399,7 @@ esac
         # Check dependencies
         self.d.setVar('ANOTHERVAR', expr)
         self.d.setVar('TESTVAR', 'anothervalue testval testval2')
-        deps, values = bb.data.build_dependencies("ANOTHERVAR", set(self.d.keys()), set(), set(), set(), self.d)
+        deps, values = bb.data.build_dependencies("ANOTHERVAR", set(self.d.keys()), set(), set(), set(), set(), self.d)
         self.assertEqual(sorted(values.splitlines()),
                          sorted([expr,
                                  'TESTVAR{anothervalue} = Set',
@@ -418,14 +418,14 @@ esac
         self.d.setVar('ANOTHERVAR', varval)
         self.d.setVar('TESTVAR', 'anothervalue testval testval2')
         self.d.setVar('TESTVAR2', 'testval3')
-        deps, values = bb.data.build_dependencies("ANOTHERVAR", set(self.d.keys()), set(), set(), set(["TESTVAR"]), self.d)
+        deps, values = bb.data.build_dependencies("ANOTHERVAR", set(self.d.keys()), set(), set(), set(), set(["TESTVAR"]), self.d)
         self.assertEqual(sorted(values.splitlines()), sorted([varval]))
         self.assertEqual(deps, set(["TESTVAR2"]))
         self.assertEqual(self.d.getVar('ANOTHERVAR').split(), ['testval3', 'anothervalue'])

         # Check the vardepsexclude flag is handled by contains functionality
         self.d.setVarFlag('ANOTHERVAR', 'vardepsexclude', 'TESTVAR')
-        deps, values = bb.data.build_dependencies("ANOTHERVAR", set(self.d.keys()), set(), set(), set(), self.d)
+        deps, values = bb.data.build_dependencies("ANOTHERVAR", set(self.d.keys()), set(), set(), set(), set(), self.d)
         self.assertEqual(sorted(values.splitlines()), sorted([varval]))
         self.assertEqual(deps, set(["TESTVAR2"]))
         self.assertEqual(self.d.getVar('ANOTHERVAR').split(), ['testval3', 'anothervalue'])
diff --git a/poky/bitbake/lib/bb/tests/color.py b/poky/bitbake/lib/bb/tests/color.py
index 88dd278006..bb70cb393d 100644
--- a/poky/bitbake/lib/bb/tests/color.py
+++ b/poky/bitbake/lib/bb/tests/color.py
@@ -20,7 +20,7 @@ class ProgressWatcher:
     def __init__(self):
         self._reports = []

-    def handle_event(self, event):
+    def handle_event(self, event, d):
         self._reports.append((event.progress, event.rate))

     def reports(self):
diff --git a/poky/bitbake/lib/bb/tests/event.py b/poky/bitbake/lib/bb/tests/event.py
index 9ca7e9bc8e..d959f2d95d 100644
--- a/poky/bitbake/lib/bb/tests/event.py
+++ b/poky/bitbake/lib/bb/tests/event.py
@@ -157,7 +157,7 @@ class EventHandlingTest(unittest.TestCase):
                          self._test_process.event_handler,
                          event,
                          None)
-        self._test_process.event_handler.assert_called_once_with(event)
+        self._test_process.event_handler.assert_called_once_with(event, None)

     def test_fire_class_handlers(self):
         """ Test fire_class_handlers method """
@@ -175,10 +175,10 @@ class EventHandlingTest(unittest.TestCase):
         bb.event.fire_class_handlers(event1, None)
         bb.event.fire_class_handlers(event2, None)
         bb.event.fire_class_handlers(event2, None)
-        expected_event_handler1 = [call(event1)]
-        expected_event_handler2 = [call(event1),
-                                   call(event2),
-                                   call(event2)]
+        expected_event_handler1 = [call(event1, None)]
+        expected_event_handler2 = [call(event1, None),
+                                   call(event2, None),
+                                   call(event2, None)]
         self.assertEqual(self._test_process.event_handler1.call_args_list,
                          expected_event_handler1)
         self.assertEqual(self._test_process.event_handler2.call_args_list,
@@ -205,7 +205,7 @@ class EventHandlingTest(unittest.TestCase):
         bb.event.fire_class_handlers(event2, None)
         bb.event.fire_class_handlers(event2, None)
         expected_event_handler1 = []
-        expected_event_handler2 = [call(event1)]
+        expected_event_handler2 = [call(event1, None)]
         self.assertEqual(self._test_process.event_handler1.call_args_list,
                          expected_event_handler1)
         self.assertEqual(self._test_process.event_handler2.call_args_list,
@@ -223,7 +223,7 @@ class EventHandlingTest(unittest.TestCase):
         self.assertEqual(result, bb.event.Registered)
         bb.event.fire_class_handlers(event1, None)
         bb.event.fire_class_handlers(event2, None)
-        expected = [call(event1), call(event2)]
+        expected = [call(event1, None), call(event2, None)]
         self.assertEqual(self._test_process.event_handler1.call_args_list,
                          expected)
@@ -237,7 +237,7 @@ class EventHandlingTest(unittest.TestCase):
         self.assertEqual(result, bb.event.Registered)
         bb.event.fire_class_handlers(event1, None)
         bb.event.fire_class_handlers(event2, None)
-        expected = [call(event1), call(event2), call(event1)]
+        expected = [call(event1, None), call(event2, None), call(event1, None)]
         self.assertEqual(self._test_process.event_handler1.call_args_list,
                          expected)
@@ -251,7 +251,7 @@ class EventHandlingTest(unittest.TestCase):
         self.assertEqual(result, bb.event.Registered)
         bb.event.fire_class_handlers(event1, None)
         bb.event.fire_class_handlers(event2, None)
-        expected = [call(event1), call(event2), call(event1), call(event2)]
+        expected = [call(event1,None), call(event2, None), call(event1, None), call(event2, None)]
         self.assertEqual(self._test_process.event_handler1.call_args_list,
                          expected)
@@ -359,9 +359,10 @@ class EventHandlingTest(unittest.TestCase):
         event1 = bb.event.ConfigParsed()
         bb.event.fire(event1, None)
-        expected = [call(event1)]
+        expected = [call(event1, None)]
         self.assertEqual(self._test_process.event_handler1.call_args_list,
                          expected)
+        expected = [call(event1)]
         self.assertEqual(self._test_ui1.event.send.call_args_list,
                          expected)
@@ -450,10 +451,9 @@ class EventHandlingTest(unittest.TestCase):
            and disable threadlocks tests """
         bb.event.fire(bb.event.OperationStarted(), None)

-    def test_enable_threadlock(self):
+    def test_event_threadlock(self):
         """ Test enable_threadlock method """
         self._set_threadlock_test_mockups()
-        bb.event.enable_threadlock()
         self._set_and_run_threadlock_test_workers()
         # Calls to UI handlers should be in order as all the registered
         # handlers for the event coming from the first worker should be
@@ -461,20 +461,6 @@ class EventHandlingTest(unittest.TestCase):
         self.assertEqual(self._threadlock_test_calls,
                          ["w1_ui1", "w1_ui2", "w2_ui1", "w2_ui2"])

-    def test_disable_threadlock(self):
-        """ Test disable_threadlock method """
-        self._set_threadlock_test_mockups()
-        bb.event.disable_threadlock()
-        self._set_and_run_threadlock_test_workers()
-        # Calls to UI handlers should be intertwined together. Thanks to the
-        # delay in the registered handlers for the event coming from the first
-        # worker, the event coming from the second worker starts being
-        # processed before finishing handling the first worker event.
-        self.assertEqual(self._threadlock_test_calls,
-                         ["w1_ui1", "w2_ui1", "w1_ui2", "w2_ui2"])
-
-
 class EventClassesTest(unittest.TestCase):
     """ Event classes test class """
diff --git a/poky/bitbake/lib/bb/tests/fetch-testdata/software/libxml2/2.10/index.html b/poky/bitbake/lib/bb/tests/fetch-testdata/software/libxml2/2.10/index.html
new file mode 100644
index 0000000000..4e41af6d6a
--- /dev/null
+++ b/poky/bitbake/lib/bb/tests/fetch-testdata/software/libxml2/2.10/index.html
@@ -0,0 +1,20 @@
+<!DOCTYPE html><html><head><meta http-equiv="content-type" content="text/html; charset=utf-8"><meta name="viewport" content="width=device-width"><style type="text/css">body,html {background:#fff;font-family:"Bitstream Vera Sans","Lucida Grande","Lucida Sans Unicode",Lucidux,Verdana,Lucida,sans-serif;}tr:nth-child(even) {background:#f4f4f4;}th,td {padding:0.1em 0.5em;}th {text-align:left;font-weight:bold;background:#eee;border-bottom:1px solid #aaa;}#list {border:1px solid #aaa;width:100%;}a {color:#a33;}a:hover {color:#e33;}</style>
+
+<title>Index of /sources/libxml2/2.10/</title>
+</head><body><h1>Index of /sources/libxml2/2.10/</h1>
+<table id="list"><thead><tr><th style="width:55%"><a href="?C=N&O=A">File Name</a> <a href="?C=N&O=D"> ↓ </a></th><th style="width:20%"><a href="?C=S&O=A">File Size</a> <a href="?C=S&O=D"> ↓ </a></th><th style="width:25%"><a href="?C=M&O=A">Date</a> <a href="?C=M&O=D"> ↓ </a></th></tr></thead>
+<tbody><tr><td class="link"><a href="../">Parent directory/</a></td><td class="size">-</td><td class="date">-</td></tr>
+<tr><td class="link"><a href="LATEST-IS-2.10.3" title="LATEST-IS-2.10.3">LATEST-IS-2.10.3</a></td><td class="size">2.5 MiB</td><td class="date">2022-Oct-14 12:55</td></tr>
+<tr><td class="link"><a href="libxml2-2.10.0.news" title="libxml2-2.10.0.news">libxml2-2.10.0.news</a></td><td class="size">7.1 KiB</td><td class="date">2022-Aug-17 11:55</td></tr>
+<tr><td class="link"><a href="libxml2-2.10.0.sha256sum" title="libxml2-2.10.0.sha256sum">libxml2-2.10.0.sha256sum</a></td><td class="size">174 B</td><td class="date">2022-Aug-17 11:55</td></tr>
+<tr><td class="link"><a href="libxml2-2.10.0.tar.xz" title="libxml2-2.10.0.tar.xz">libxml2-2.10.0.tar.xz</a></td><td class="size">2.6 MiB</td><td class="date">2022-Aug-17 11:55</td></tr>
+<tr><td class="link"><a href="libxml2-2.10.1.news" title="libxml2-2.10.1.news">libxml2-2.10.1.news</a></td><td class="size">455 B</td><td class="date">2022-Aug-25 11:33</td></tr>
+<tr><td class="link"><a href="libxml2-2.10.1.sha256sum" title="libxml2-2.10.1.sha256sum">libxml2-2.10.1.sha256sum</a></td><td class="size">174 B</td><td class="date">2022-Aug-25 11:33</td></tr>
+<tr><td class="link"><a href="libxml2-2.10.1.tar.xz" title="libxml2-2.10.1.tar.xz">libxml2-2.10.1.tar.xz</a></td><td class="size">2.6 MiB</td><td class="date">2022-Aug-25 11:33</td></tr>
+<tr><td class="link"><a href="libxml2-2.10.2.news" title="libxml2-2.10.2.news">libxml2-2.10.2.news</a></td><td class="size">309 B</td><td class="date">2022-Aug-29 14:56</td></tr>
+<tr><td class="link"><a href="libxml2-2.10.2.sha256sum" title="libxml2-2.10.2.sha256sum">libxml2-2.10.2.sha256sum</a></td><td class="size">174 B</td><td class="date">2022-Aug-29 14:56</td></tr>
+<tr><td class="link"><a href="libxml2-2.10.2.tar.xz" title="libxml2-2.10.2.tar.xz">libxml2-2.10.2.tar.xz</a></td><td class="size">2.5 MiB</td><td class="date">2022-Aug-29 14:56</td></tr>
+<tr><td class="link"><a href="libxml2-2.10.3.news" title="libxml2-2.10.3.news">libxml2-2.10.3.news</a></td><td class="size">294 B</td><td class="date">2022-Oct-14 12:55</td></tr>
+<tr><td class="link"><a href="libxml2-2.10.3.sha256sum" title="libxml2-2.10.3.sha256sum">libxml2-2.10.3.sha256sum</a></td><td class="size">174 B</td><td class="date">2022-Oct-14 12:55</td></tr>
+<tr><td class="link"><a href="libxml2-2.10.3.tar.xz" title="libxml2-2.10.3.tar.xz">libxml2-2.10.3.tar.xz</a></td><td class="size">2.5 MiB</td><td class="date">2022-Oct-14 12:55</td></tr>
+</tbody></table></body></html>
diff --git a/poky/bitbake/lib/bb/tests/fetch-testdata/software/libxml2/2.9/index.html b/poky/bitbake/lib/bb/tests/fetch-testdata/software/libxml2/2.9/index.html
new file mode 100644
index 0000000000..abdfdd0fa2
--- /dev/null
+++ b/poky/bitbake/lib/bb/tests/fetch-testdata/software/libxml2/2.9/index.html
@@ -0,0 +1,40 @@
+<!DOCTYPE html><html><head><meta http-equiv="content-type" content="text/html; charset=utf-8"><meta name="viewport" content="width=device-width"><style type="text/css">body,html {background:#fff;font-family:"Bitstream Vera Sans","Lucida Grande","Lucida Sans Unicode",Lucidux,Verdana,Lucida,sans-serif;}tr:nth-child(even) {background:#f4f4f4;}th,td {padding:0.1em 0.5em;}th {text-align:left;font-weight:bold;background:#eee;border-bottom:1px solid #aaa;}#list {border:1px solid #aaa;width:100%;}a {color:#a33;}a:hover {color:#e33;}</style>
+
+<title>Index of /sources/libxml2/2.9/</title>
+</head><body><h1>Index of /sources/libxml2/2.9/</h1>
+<table id="list"><thead><tr><th style="width:55%"><a href="?C=N&O=A">File Name</a> <a href="?C=N&O=D"> ↓ </a></th><th style="width:20%"><a href="?C=S&O=A">File Size</a> <a href="?C=S&O=D"> ↓ </a></th><th style="width:25%"><a href="?C=M&O=A">Date</a> <a href="?C=M&O=D"> ↓ </a></th></tr></thead>
+<tbody><tr><td class="link"><a href="../">Parent directory/</a></td><td class="size">-</td><td class="date">-</td></tr>
+<tr><td class="link"><a href="LATEST-IS-2.9.14" title="LATEST-IS-2.9.14">LATEST-IS-2.9.14</a></td><td class="size">3.0 MiB</td><td class="date">2022-May-02 12:03</td></tr>
+<tr><td class="link"><a href="libxml2-2.9.0.sha256sum" title="libxml2-2.9.0.sha256sum">libxml2-2.9.0.sha256sum</a></td><td class="size">87 B</td><td class="date">2022-Feb-14 18:27</td></tr>
+<tr><td class="link"><a href="libxml2-2.9.0.tar.xz" title="libxml2-2.9.0.tar.xz">libxml2-2.9.0.tar.xz</a></td><td class="size">3.0 MiB</td><td class="date">2022-Feb-14 18:27</td></tr>
+<tr><td class="link"><a href="libxml2-2.9.1.sha256sum" title="libxml2-2.9.1.sha256sum">libxml2-2.9.1.sha256sum</a></td><td class="size">87 B</td><td class="date">2022-Feb-14 18:28</td></tr>
+<tr><td class="link"><a href="libxml2-2.9.1.tar.xz" title="libxml2-2.9.1.tar.xz">libxml2-2.9.1.tar.xz</a></td><td class="size">3.0 MiB</td><td class="date">2022-Feb-14 18:28</td></tr>
+<tr><td class="link"><a href="libxml2-2.9.10.sha256sum" title="libxml2-2.9.10.sha256sum">libxml2-2.9.10.sha256sum</a></td><td class="size">88 B</td><td class="date">2022-Feb-14 18:42</td></tr>
+<tr><td class="link"><a href="libxml2-2.9.10.tar.xz" title="libxml2-2.9.10.tar.xz">libxml2-2.9.10.tar.xz</a></td><td class="size">3.2 MiB</td><td class="date">2022-Feb-14 18:42</td></tr>
+<tr><td class="link"><a href="libxml2-2.9.11.sha256sum" title="libxml2-2.9.11.sha256sum">libxml2-2.9.11.sha256sum</a></td><td class="size">88 B</td><td class="date">2022-Feb-14 18:43</td></tr>
+<tr><td class="link"><a href="libxml2-2.9.11.tar.xz" title="libxml2-2.9.11.tar.xz">libxml2-2.9.11.tar.xz</a></td><td class="size">3.2 MiB</td><td class="date">2022-Feb-14 18:43</td></tr>
+<tr><td class="link"><a href="libxml2-2.9.12.sha256sum" title="libxml2-2.9.12.sha256sum">libxml2-2.9.12.sha256sum</a></td><td class="size">88 B</td><td class="date">2022-Feb-14 18:45</td></tr>
+<tr><td class="link"><a href="libxml2-2.9.12.tar.xz" title="libxml2-2.9.12.tar.xz">libxml2-2.9.12.tar.xz</a></td><td class="size">3.2 MiB</td><td class="date">2022-Feb-14 18:45</td></tr>
+<tr><td class="link"><a href="libxml2-2.9.13.news" title="libxml2-2.9.13.news">libxml2-2.9.13.news</a></td><td class="size">26.6 KiB</td><td class="date">2022-Feb-20 12:42</td></tr>
+<tr><td class="link"><a href="libxml2-2.9.13.sha256sum" title="libxml2-2.9.13.sha256sum">libxml2-2.9.13.sha256sum</a></td><td class="size">174 B</td><td class="date">2022-Feb-20 12:42</td></tr>
+<tr><td class="link"><a href="libxml2-2.9.13.tar.xz" title="libxml2-2.9.13.tar.xz">libxml2-2.9.13.tar.xz</a></td><td class="size">3.1 MiB</td><td class="date">2022-Feb-20 12:42</td></tr>
+<tr><td class="link"><a href="libxml2-2.9.14.news" title="libxml2-2.9.14.news">libxml2-2.9.14.news</a></td><td class="size">1.0 KiB</td><td class="date">2022-May-02 12:03</td></tr>
+<tr><td class="link"><a href="libxml2-2.9.14.sha256sum" title="libxml2-2.9.14.sha256sum">libxml2-2.9.14.sha256sum</a></td><td class="size">174 B</td><td class="date">2022-May-02 12:03</td></tr>
+<tr><td class="link"><a href="libxml2-2.9.14.tar.xz" title="libxml2-2.9.14.tar.xz">libxml2-2.9.14.tar.xz</a></td><td class="size">3.0 MiB</td><td class="date">2022-May-02 12:03</td></tr>
+<tr><td class="link"><a href="libxml2-2.9.2.sha256sum" title="libxml2-2.9.2.sha256sum">libxml2-2.9.2.sha256sum</a></td><td class="size">87 B</td><td class="date">2022-Feb-14 18:30</td></tr>
+<tr><td class="link"><a href="libxml2-2.9.2.tar.xz" title="libxml2-2.9.2.tar.xz">libxml2-2.9.2.tar.xz</a></td><td class="size">3.2 MiB</td><td class="date">2022-Feb-14 18:30</td></tr>
+<tr><td class="link"><a href="libxml2-2.9.3.sha256sum" title="libxml2-2.9.3.sha256sum">libxml2-2.9.3.sha256sum</a></td><td class="size">87 B</td><td class="date">2022-Feb-14 18:31</td></tr>
+<tr><td class="link"><a href="libxml2-2.9.3.tar.xz" title="libxml2-2.9.3.tar.xz">libxml2-2.9.3.tar.xz</a></td><td class="size">3.2 MiB</td><td class="date">2022-Feb-14 18:31</td></tr>
+<tr><td class="link"><a href="libxml2-2.9.4.sha256sum" title="libxml2-2.9.4.sha256sum">libxml2-2.9.4.sha256sum</a></td><td class="size">87 B</td><td class="date">2022-Feb-14 18:33</td></tr>
+<tr><td class="link"><a href="libxml2-2.9.4.tar.xz" title="libxml2-2.9.4.tar.xz">libxml2-2.9.4.tar.xz</a></td><td class="size">2.9 MiB</td><td class="date">2022-Feb-14 18:33</td></tr>
+<tr><td class="link"><a href="libxml2-2.9.5.sha256sum" title="libxml2-2.9.5.sha256sum">libxml2-2.9.5.sha256sum</a></td><td class="size">87 B</td><td class="date">2022-Feb-14 18:35</td></tr>
+<tr><td class="link"><a href="libxml2-2.9.5.tar.xz" title="libxml2-2.9.5.tar.xz">libxml2-2.9.5.tar.xz</a></td><td class="size">3.0 MiB</td><td class="date">2022-Feb-14 18:35</td></tr>
+<tr><td class="link"><a href="libxml2-2.9.6.sha256sum" title="libxml2-2.9.6.sha256sum">libxml2-2.9.6.sha256sum</a></td><td class="size">87 B</td><td class="date">2022-Feb-14 18:36</td></tr>
+<tr><td class="link"><a href="libxml2-2.9.6.tar.xz" title="libxml2-2.9.6.tar.xz">libxml2-2.9.6.tar.xz</a></td><td class="size">3.0 MiB</td><td class="date">2022-Feb-14 18:36</td></tr>
+<tr><td class="link"><a href="libxml2-2.9.7.sha256sum" title="libxml2-2.9.7.sha256sum">libxml2-2.9.7.sha256sum</a></td><td class="size">87 B</td><td class="date">2022-Feb-14 18:37</td></tr>
+<tr><td class="link"><a href="libxml2-2.9.7.tar.xz" title="libxml2-2.9.7.tar.xz">libxml2-2.9.7.tar.xz</a></td><td class="size">3.0 MiB</td><td class="date">2022-Feb-14 18:37</td></tr>
+<tr><td class="link"><a href="libxml2-2.9.8.sha256sum" title="libxml2-2.9.8.sha256sum">libxml2-2.9.8.sha256sum</a></td><td class="size">87 B</td><td class="date">2022-Feb-14 18:39</td></tr>
+<tr><td class="link"><a href="libxml2-2.9.8.tar.xz" title="libxml2-2.9.8.tar.xz">libxml2-2.9.8.tar.xz</a></td><td class="size">3.0 MiB</td><td class="date">2022-Feb-14 18:39</td></tr>
+<tr><td class="link"><a href="libxml2-2.9.9.sha256sum" title="libxml2-2.9.9.sha256sum">libxml2-2.9.9.sha256sum</a></td><td class="size">87 B</td><td class="date">2022-Feb-14 18:40</td></tr>
+<tr><td class="link"><a href="libxml2-2.9.9.tar.xz" title="libxml2-2.9.9.tar.xz">libxml2-2.9.9.tar.xz</a></td><td class="size">3.0 MiB</td><td class="date">2022-Feb-14 18:40</td></tr>
+</tbody></table></body></html>
diff --git a/poky/bitbake/lib/bb/tests/fetch-testdata/software/libxml2/index.html b/poky/bitbake/lib/bb/tests/fetch-testdata/software/libxml2/index.html
new file mode 100644
index 0000000000..c183e06a55
--- /dev/null
+++ b/poky/bitbake/lib/bb/tests/fetch-testdata/software/libxml2/index.html
@@ -0,0 +1,19 @@
+<!DOCTYPE html><html><head><meta http-equiv="content-type" content="text/html; charset=utf-8"><meta name="viewport" content="width=device-width"><style type="text/css">body,html {background:#fff;font-family:"Bitstream Vera Sans","Lucida Grande","Lucida Sans Unicode",Lucidux,Verdana,Lucida,sans-serif;}tr:nth-child(even) {background:#f4f4f4;}th,td {padding:0.1em 0.5em;}th {text-align:left;font-weight:bold;background:#eee;border-bottom:1px solid #aaa;}#list {border:1px solid #aaa;width:100%;}a {color:#a33;}a:hover {color:#e33;}</style>
+
+<title>Index of /sources/libxml2/</title>
+</head><body><h1>Index of /sources/libxml2/</h1>
+<table id="list"><thead><tr><th style="width:55%"><a href="?C=N&O=A">File Name</a> <a href="?C=N&O=D"> ↓ </a></th><th style="width:20%"><a href="?C=S&O=A">File Size</a> <a href="?C=S&O=D"> ↓ </a></th><th style="width:25%"><a href="?C=M&O=A">Date</a> <a href="?C=M&O=D"> ↓ </a></th></tr></thead>
+<tbody><tr><td class="link"><a href="../">Parent directory/</a></td><td class="size">-</td><td class="date">-</td></tr>
+<tr><td class="link"><a href="2.0/" title="2.0">2.0/</a></td><td class="size">-</td><td class="date">2009-Jul-14 13:04</td></tr>
+<tr><td class="link"><a href="2.1/" title="2.1">2.1/</a></td><td class="size">-</td><td class="date">2009-Jul-14 13:04</td></tr>
+<tr><td class="link"><a href="2.10/" title="2.10">2.10/</a></td><td class="size">-</td><td class="date">2022-Oct-14 12:55</td></tr>
+<tr><td class="link"><a href="2.2/" title="2.2">2.2/</a></td><td class="size">-</td><td class="date">2009-Jul-14 13:04</td></tr>
+<tr><td class="link"><a href="2.3/" title="2.3">2.3/</a></td><td class="size">-</td><td class="date">2009-Jul-14 13:05</td></tr>
+<tr><td class="link"><a href="2.4/" title="2.4">2.4/</a></td><td class="size">-</td><td class="date">2009-Jul-14 13:05</td></tr>
+<tr><td class="link"><a href="2.5/" title="2.5">2.5/</a></td><td class="size">-</td><td class="date">2009-Jul-14 13:05</td></tr>
+<tr><td class="link"><a href="2.6/" title="2.6">2.6/</a></td><td class="size">-</td><td class="date">2009-Jul-14 13:05</td></tr>
+<tr><td class="link"><a href="2.7/" title="2.7">2.7/</a></td><td class="size">-</td><td class="date">2022-Feb-14 18:24</td></tr>
+<tr><td class="link"><a href="2.8/" title="2.8">2.8/</a></td><td class="size">-</td><td class="date">2022-Feb-14 18:26</td></tr>
+<tr><td class="link"><a href="2.9/" title="2.9">2.9/</a></td><td class="size">-</td><td class="date">2022-May-02 12:04</td></tr>
+<tr><td class="link"><a href="cache.json" title="cache.json">cache.json</a></td><td class="size">22.8 KiB</td><td class="date">2022-Oct-14 12:55</td></tr>
+</tbody></table></body></html>
diff --git a/poky/bitbake/lib/bb/tests/fetch.py b/poky/bitbake/lib/bb/tests/fetch.py
index 0af06e46e5..ad3d4dea7d 100644
--- a/poky/bitbake/lib/bb/tests/fetch.py
+++ b/poky/bitbake/lib/bb/tests/fetch.py
@@ -1401,6 +1401,9 @@ class FetchLatestVersionTest(FetcherTest):
             # http://www.cmake.org/files/v2.8/cmake-2.8.12.1.tar.gz
             ("cmake", "/files/v2.8/cmake-2.8.12.1.tar.gz", "", "")
                 : "2.8.12.1",
+            # https://download.gnome.org/sources/libxml2/2.9/libxml2-2.9.14.tar.xz
+            ("libxml2", "/software/libxml2/2.9/libxml2-2.9.14.tar.xz", "", "")
+                : "2.10.3",
             #
             # packages with versions only in current directory
             #
diff --git a/poky/bitbake/lib/bb/tests/runqueue.py b/poky/bitbake/lib/bb/tests/runqueue.py
index 061a5a1f80..cc87e8d6a8 100644
--- a/poky/bitbake/lib/bb/tests/runqueue.py
+++ b/poky/bitbake/lib/bb/tests/runqueue.py
@@ -288,7 +288,7 @@ class RunQueueTests(unittest.TestCase):
         with tempfile.TemporaryDirectory(prefix="runqueuetest") as tempdir:
             extraenv = {
                 "BBMULTICONFIG" : "mc-1 mc_2",
-                "BB_SIGNATURE_HANDLER" : "TestMulticonfigDepends",
+                "BB_SIGNATURE_HANDLER" : "basichash",
                 "EXTRA_BBFILES": "${COREBASE}/recipes/fails-mc/*.bb",
             }
             tasks = self.run_bitbakecmd(["bitbake", "mc:mc-1:f1"], tempdir, "", extraenv=extraenv, cleanup=True)
diff --git a/poky/bitbake/lib/bb/tinfoil.py b/poky/bitbake/lib/bb/tinfoil.py
index e68a3b879a..91fbf1b13e 100644
--- a/poky/bitbake/lib/bb/tinfoil.py
+++ b/poky/bitbake/lib/bb/tinfoil.py
@@ -10,6 +10,7 @@ import logging
 import os
 import sys
+import time
 import atexit
 import re
 from collections import OrderedDict, defaultdict
@@ -729,6 +730,7 @@ class Tinfoil:
         ret = self.run_command('buildTargets', targets, task)
         if handle_events:
+            lastevent = time.time()
             result = False
             # Borrowed from knotty, instead somewhat hackily we use the helper
             # as the object to store "shutdown" on
@@ -741,6 +743,7 @@ class Tinfoil:
             try:
                 event = self.wait_event(0.25)
                 if event:
+                    lastevent = time.time()
                     if event_callback and event_callback(event):
                         continue
                     if helper.eventHandler(event):
@@ -773,7 +776,7 @@ class Tinfoil:
                     if isinstance(event, bb.command.CommandCompleted):
                         result = True
                         break
-                    if isinstance(event, bb.command.CommandFailed):
+                    if isinstance(event, (bb.command.CommandFailed, bb.command.CommandExit)):
                         self.logger.error(str(event))
                         result = False
                         break
@@ -785,10 +788,13 @@ class Tinfoil:
                         self.logger.error(str(event))
                         result = False
                         break
-
                 elif helper.shutdown > 1:
                     break
                 termfilter.updateFooter()
+                if time.time() > (lastevent + (3*60)):
+                    if not self.run_command('ping', handle_events=False):
+                        print("\nUnable to ping server and no events, closing down...\n")
+                        return False
             except KeyboardInterrupt:
                 termfilter.clearFooter()
                 if helper.shutdown == 1:
diff --git a/poky/bitbake/lib/bb/ui/knotty.py b/poky/bitbake/lib/bb/ui/knotty.py
index 61cf0a37f4..431baa15ef 100644
--- a/poky/bitbake/lib/bb/ui/knotty.py
+++ b/poky/bitbake/lib/bb/ui/knotty.py
@@ -625,25 +625,38 @@ def main(server, eventHandler, params, tf = TerminalFilter):
     printintervaldelta = 10 * 60 # 10 minutes
     printinterval = printintervaldelta
-    lastprint = time.time()
+    pinginterval = 1 * 60 # 1 minute
+    lastevent = lastprint = time.time()

     termfilter = tf(main, helper, console_handlers, params.options.quiet)
     atexit.register(termfilter.finish)

-    while True:
+    # shutdown levels
+    # 0 - normal operation
+    # 1 - no new task execution, let current running tasks finish
+    # 2 - interrupting currently executing tasks
+    # 3 - we're done, exit
+    while main.shutdown < 3:
         try:
             if (lastprint + printinterval) <= time.time():
                 termfilter.keepAlive(printinterval)
                 printinterval += printintervaldelta
             event = eventHandler.waitEvent(0)
             if event is None:
-                if main.shutdown > 1:
-                    break
+                if (lastevent + pinginterval) <= time.time():
+                    ret, error = server.runCommand(["ping"])
+                    if error or not ret:
+                        termfilter.clearFooter()
+                        print("No reply after pinging server (%s, %s), exiting." % (str(error), str(ret)))
+                        return_value = 3
+                        main.shutdown = 3
+                    lastevent = time.time()
                 if not parseprogress:
                     termfilter.updateFooter()
                 event = eventHandler.waitEvent(0.25)
                 if event is None:
                     continue
+            lastevent = time.time()
             helper.eventHandler(event)
             if isinstance(event, bb.runqueue.runQueueExitWait):
                 if not main.shutdown:
@@ -748,15 +761,15 @@ def main(server, eventHandler, params, tf = TerminalFilter):
                 if event.error:
                     errors = errors + 1
                     logger.error(str(event))
-                main.shutdown = 2
+                main.shutdown = 3
                 continue
             if isinstance(event, bb.command.CommandExit):
                 if not return_value:
                     return_value = event.exitcode
-                main.shutdown = 2
+                main.shutdown = 3
                 continue
             if isinstance(event, (bb.command.CommandCompleted, bb.cooker.CookerExit)):
-                main.shutdown = 2
+                main.shutdown = 3
                 continue
             if isinstance(event, bb.event.MultipleProviders):
                 logger.info(str(event))
diff --git a/poky/bitbake/lib/bb/ui/uievent.py b/poky/bitbake/lib/bb/ui/uievent.py
index d595f172ac..adbe698939 100644
--- a/poky/bitbake/lib/bb/ui/uievent.py
+++ b/poky/bitbake/lib/bb/ui/uievent.py
@@ -70,30 +70,22 @@ class BBUIEventQueue:
         self.t.start()

     def getEvent(self):
-
-        self.eventQueueLock.acquire()
-
-        if not self.eventQueue:
-            self.eventQueueLock.release()
-            return None
-
-        item = self.eventQueue.pop(0)
-
-        if not self.eventQueue:
-            self.eventQueueNotify.clear()
-
-        self.eventQueueLock.release()
-        return item
+        with bb.utils.lock_timeout(self.eventQueueLock):
+            if not self.eventQueue:
+                return None
+            item = self.eventQueue.pop(0)
+            if not self.eventQueue:
+                self.eventQueueNotify.clear()
+            return item

     def waitEvent(self, delay):
         self.eventQueueNotify.wait(delay)
         return self.getEvent()

     def queue_event(self, event):
-        self.eventQueueLock.acquire()
-        self.eventQueue.append(event)
-        self.eventQueueNotify.set()
-        self.eventQueueLock.release()
+        with bb.utils.lock_timeout(self.eventQueueLock):
+            self.eventQueue.append(event)
+            self.eventQueueNotify.set()

     def send_event(self, event):
         self.queue_event(pickle.loads(event))
diff --git a/poky/bitbake/lib/bb/utils.py b/poky/bitbake/lib/bb/utils.py
index 64a004d0d8..8c79159573 100644
--- a/poky/bitbake/lib/bb/utils.py
+++ b/poky/bitbake/lib/bb/utils.py
@@ -13,6 +13,7 @@ import errno
 import logging
 import bb
 import bb.msg
+import locale
 import multiprocessing
 import fcntl
 import importlib
@@ -608,6 +609,21 @@ def preserved_envvars():
     ]
     return v + preserved_envvars_exported()

+def check_system_locale():
+    """Make sure the required system locale are available and configured"""
+    default_locale = locale.getlocale(locale.LC_CTYPE)
+
+    try:
+        locale.setlocale(locale.LC_CTYPE, ("en_US", "UTF-8"))
+    except:
+        sys.exit("Please make sure locale 'en_US.UTF-8' is available on your system")
+    else:
+        locale.setlocale(locale.LC_CTYPE, default_locale)
+
+    if sys.getfilesystemencoding() != "utf-8":
+        sys.exit("Please use a locale setting which supports UTF-8 (such as LANG=en_US.UTF-8).\n"
+                 "Python can't change the filesystem locale after loading so we need a UTF-8 when Python starts or things won't work.")
+
 def filter_environment(good_vars):
     """
     Create a pristine environment for bitbake. This will remove variables that
@@ -992,6 +1008,9 @@ def to_boolean(string, default=None):
     if not string:
         return default

+    if isinstance(string, int):
+        return string != 0
+
     normalized = string.lower()
     if normalized in ("y", "yes", "1", "true"):
         return True
@@ -1822,3 +1841,16 @@ def mkstemp(suffix=None, prefix=None, dir=None, text=False):
     else:
         prefix = tempfile.gettempprefix() + entropy
     return tempfile.mkstemp(suffix=suffix, prefix=prefix, dir=dir, text=text)
+
+# If we don't have a timeout of some kind and a process/thread exits badly (for example
+# OOM killed) and held a lock, we'd just hang in the lock futex forever. It is better
+# we exit at some point than hang. 5 minutes with no progress means we're probably deadlocked.
+@contextmanager
+def lock_timeout(lock):
+    held = lock.acquire(timeout=5*60)
+    try:
+        if not held:
+            os._exit(1)
+        yield held
+    finally:
+        lock.release()
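The `lock_timeout` helper added to bb.utils above is the piece the uievent.py hunk builds on. A minimal standalone sketch of the same pattern (names mirror the patch; the `timeout` parameter is added here only so the 5-minute default can be shortened for demonstration):

```python
import os
import threading
from contextlib import contextmanager

@contextmanager
def lock_timeout(lock, timeout=5 * 60):
    """Acquire `lock` with a timeout so a lock orphaned by a badly-exited
    process/thread (e.g. OOM-killed) cannot hang us in the futex forever."""
    held = lock.acquire(timeout=timeout)
    try:
        if not held:
            # os._exit() terminates immediately without unwinding, so the
            # finally block never runs and we never release a lock we
            # failed to acquire. Exiting beats deadlocking.
            os._exit(1)
        yield held
    finally:
        lock.release()

# Usage, as in BBUIEventQueue.queue_event():
queue_lock = threading.Lock()
event_queue = []
with lock_timeout(queue_lock):
    event_queue.append("some-event")
```

The context-manager form also removes the manual acquire/release pairing that the old `getEvent` needed (two separate `release()` call sites for the early-return and normal paths).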
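The `to_boolean` hunk above only shows the new int branch; a sketch of the whole function helps illustrate its effect. The truthy/falsy string tables and the `ValueError` fallback are assumptions based on bitbake's surrounding code, which the hunk does not show:

```python
def to_boolean(string, default=None):
    """Interpret common truthy/falsy spellings; the patch adds ints."""
    if not string:
        # Note this guard runs first, so the int 0 (being falsy) yields
        # `default` rather than False.
        return default

    # New branch from the patch: nonzero ints are True, matching C-style
    # truthiness for values that were never strings to begin with.
    if isinstance(string, int):
        return string != 0

    normalized = string.lower()
    if normalized in ("y", "yes", "1", "true"):
        return True
    if normalized in ("n", "no", "0", "false"):
        return False
    # Fallback assumed from the surrounding function (not in the hunk).
    raise ValueError("Invalid value for to_boolean: %s" % string)
```

Before the patch an int argument would crash on `string.lower()`; afterwards callers can pass values read back from flags that were stored as ints.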