author     Brad Bishop <bradleyb@fuzziesquirrel.com>   2019-11-04 21:55:29 +0300
committer  Brad Bishop <bradleyb@fuzziesquirrel.com>   2019-11-04 21:56:09 +0300
commit     64c979e88e6d0917b6fe45e52e381affec150afd (patch)
tree       a0e35da2075116b2d1d43813cc3f7f57f99d843a /poky/meta/classes
parent     868407c65d79e82e83c37f7c32bef9a2e2bc4cd5 (diff)
download   openbmc-64c979e88e6d0917b6fe45e52e381affec150afd.tar.xz
poky: subtree update:52a625582e..7035b4b21e

Adrian Bunk (9):
      squashfs-tools: Upgrade to 4.4
      screen: Upgrade 4.6.2 -> 4.7.0
      stress-ng: Upgrade 0.10.00 -> 0.10.08
      nspr: Upgrade 4.21 -> 4.23
      gcc: Remove stale gcc 8 patchfile
      gnu-efi: Upgrade 3.0.9 -> 3.0.10
      python3-numpy: Stop shipping manual config files
      coreutils: Move stdbuf into an own package coreutils-stdbuf
      gnu-efi: Upgrade 3.0.10 -> 3.0.11

Alessio Igor Bogani (1):
      systemtap: support usrmerge

Alexander Hirsch (1):
      libksba: Fix license specification

Alexander Kanavin (6):
      gcr: update to 3.34.0
      btrfs-tools: update to 5.3
      libmodulemd-v1: update to 1.8.16
      selftest: skip virgl test on centos 7 entirely
      nfs-utils: do not depend on bash unnecessarily
      selftest: add a test for gpl3-free images

Alistair Francis (4):
      opensbi: Bump from 0.4 to 0.5
      u-boot: Bump from 2019.07 to 2019.10
      qemuriscv64: Build smode U-Boot
      libsdl2: Fix build failure when using mesa 19.2.1

Andreas Müller (4):
      adwaita-icon-theme: upgrade 3.32.0 -> 3.34.0
      gsettings-desktop-schemas: upgrade 3.32.0 -> 3.34.0
      IMAGE_LINGUAS_COMPLEMENTARY: auto-add language packages other than locales
      libical: add PACKAGECONFIG glib and enable it by default

André Draszik (10):
      testimage.bbclass: support hardware-controlled targets
      testimage.bbclass: enable ssh agent forwarding
      oeqa/runtime/df: don't fail on long device names
      oeqa/core/decorator: add skipIfFeature
      oeqa/runtime/opkg: skip install on read-only-rootfs
      oeqa/runtime/systemd: skip unit enable/disable on read-only-rootfs
      ruby: update to v2.6.4
      ruby: some ptest fixes
      oeqa/runtime/context.py: ignore more files when loading controllers
      connman: mark connman-wait-online as SYSTEMD_PACKAGE

Bruce Ashfield (6):
      linux-yocto/4.19: update to v4.19.78
      linux-yocto/5.2: update to v5.2.20
      perf: fix v5.4+ builds
      perf: create directories before copying single files
      perf: add 'cap' PACKAGECONFIG
      perf: drop 'include' copy

Carlos Rafael Giani (12):
      gstreamer1.0: upgrade to version 1.16.1
      gstreamer1.0-plugins-base: upgrade to version 1.16.1
      gstreamer1.0-plugins-good: upgrade to version 1.16.1
      gstreamer1.0-plugins-bad: upgrade to version 1.16.1
      gstreamer1.0-plugins-ugly: upgrade to version 1.16.1
      gstreamer1.0-libav: upgrade to version 1.16.1
      gstreamer1.0-vaapi: upgrade to version 1.16.1
      gstreamer1.0-omx: upgrade to version 1.16.1
      gstreamer1.0-python: upgrade to version 1.16.1
      gstreamer1.0-rtsp-server: upgrade to version 1.16.1
      gst-validate: upgrade to version 1.16.1
      gstreamer: Change SRC_URI to use HTTPS access instead of HTTP

Changqing Li (4):
      qemu: Fix CVE-2019-12068
      python: Fix CVE-2019-10160
      sudo: fix CVE-2019-14287
      mdadm: fix do_package failed when changed local.conf but not cleaned

Chee Yang Lee (2):
      wic/help: change 'wic write' help description
      wic/engine: use 'linux-swap' for swap file system

Chen Qi (3):
      go: fix CVE-2019-16276
      python3: fix CVE-2019-16935
      python: fix CVE-2019-16935

Chris Laplante via bitbake-devel (2):
      bitbake: bitbake: contrib/vim: initial commit, with unmodified code from indent/python.vim
      bitbake: bitbake: contrib/vim: Modify Python indentation to work with 'python do_task {'

Christopher Larson (2):
      bitbake: fetch2/git: fetch shallow revs when needed
      bitbake: tests/fetch: add test for fetching shallow revs

Dan Callaghan (1):
      elfutils: add PACKAGECONFIG for compression algorithms

Douglas Royds via Openembedded-core (1):
      icecc: Export ICECC_CC and friends via wrapper-script

Eduardo Abinader (1):
      devtool: add ssh key option to deploy-target param

Eugene Smirnov (1):
      wic/rawcopy: Support files in sub-directories

Ferry Toth (1):
      sudo: Fix fetching sources

Frazer Leslie Clews (2):
      makedevs: fix format strings in makedevs.c in print statements
      makedevs: fix invalidScanfFormatWidth to prevent overflowing usr_buf

George McCollister (1):
      openssl: make OPENSSL_ENGINES match install path

Haiqing Bai (1):
      unfs3: fixed the issue that unfsd consumes 100% CPU

He Zhe (1):
      ltp: Fix overcommit_memory failure

Hongxu Jia (1):
      openssh: fix CVE-2019-16905

Joe Slater (2):
      libtiff: fix CVE-2019-17546
      libxslt: fix CVE-2019-18197

Kai Kang (1):
      bind: fix CVE-2019-6471 and CVE-2018-5743

Liwei Song (1):
      util-linux: fix PKNAME name is NULL when use lsblk [LIN1019-2963]

Mattias Hansson (1):
      base.bbclass: add dependency on pseudo from do_prepare_recipe_sysroot

Max Tomago (1):
      python-native: Remove debug.patch

Maxime Roussin-Bélanger (2):
      meta: update and add missing homepage/bugtracker links
      meta: add missing description in recipes-gnome

Michael Ho (1):
      cmake.bbclass: add HOSTTOOLS_DIR to CMAKE_FIND_ROOT_PATH

Mike Crowe (2):
      kernel-fitimage: Cope with non-standard kernel deploy subdirectory
      kernel-devicetree: Cope with non-standard kernel deploy subdirectory

Mikko Rapeli (1):
      systemd.bbclass: enable all services specified in ${SYSTEMD_SERVICE}

Nicola Lunghi (1):
      ofono: tidy up the recipe

Ola x Nilsson (10):
      oeqa/selftest/recipetool: Use with to control file handle lifetime
      oe.types.path: Use with to control file handle lifetime
      lib/oe/packagedata: Use with to control file handle lifetime
      lib/oe/package_manager: Use with to control file handle lifetime
      report-error.bbclass: Use with to control file handle lifetime
      package.bbclass: Use with to manage file handle lifetimes
      devtool-source.bbclass: Use with to manage file handle lifetime
      libc-package.bbclass: Use with to manage filehandle in do_spit_gconvs
      bitbake: bitbake: prserv/serv: Use with while reading pidfile
      bitbake: bitbake: ConfHandler: Use with to manage filehandle lifetime

Oleksandr Kravchuk (4):
      ell: update to 0.23
      ell: update to 0.25
      ell: update to 0.26
      ofono: update to 1.31

Ricardo Ribalda Delgado (1):
      i2c-tools: Add missing RDEPEND

Richard Leitner (1):
      kernel-fitimage: introduce FIT_SIGN_ALG

Richard Purdie (4):
      tinderclient: Drop obsolete class
      meson: Backport fix to assist meta-oe breakage
      nfs-utils: Improve handling when no exported fileysystems
      qemu: Avoid potential build configuration contamination

Robert Yang (1):
      bluez5: Fix for --enable-btpclient

Ross Burton (29):
      sanity: check the format of SDK_VENDOR
      file: explicitly disable seccomp
      python3: -dev should depend on distutils
      gawk: add PACKAGECONFIG for readline
      python3: alternative name is python3-config not python-config
      python3: ensure that all forms of python3-config are in python3-dev
      oeqa/selftest: use specialist assert* methods
      bluez5: refresh upstreamed patches
      xorgproto: fix summary
      libx11: upgrade to 1.6.9
      xorgproto: upgrade to 2019.2
      llvm: add missing Upstream-Status tags
      buildhistory-analysis: filter out -src changes by default
      squashfs-tools: remove redundant source checksums
      squashfs-tools: clean up compile/install tasks
      wpa-supplicant: fix CVE-2019-16275
      gcr: remove intltool-native
      elfutils: disable bzip
      cve-check: ensure all known CVEs are in the report
      git: some tools are no longer perl, so move to main recipe
      git: cleanup man install
      qemu-helper-native: add missing option to getopt() call
      qemu-helper-native: showing help shouldn't be an error
      qemu-helper-native: pass compiler flags
      oeqa/selftest: add test for oe-run-native
      cve-check: failure to parse versions should be more visible
      gst-examples: rename so PV is in filename
      sanity: check for more bits of Python
      recipeutils-test: use a small dependency in the dummy recipe

Sai Hari Chandana Kalluri (1):
      devtool: Add --remove-work option for devtool reset command

Scott Rifenbark (9):
      ref-manual: First pass of 2.8 migration changes (WIP)
      poky.ent: Updated the release date to October 2019
      dev-manual: Added info to "Selecting an Initialization Manager"
      ref-manual: 2nd pass 3.0 migration
      documenation: Changed "2.8" to "3.0".
      ref-manual: Removed deprecated link to ref-classes-bluetooth
      ref-manual, dev-manual: Clean up of a commit
      ref-manual: Updated the BUSYBOX_SPLIT_SUID variable.
      ref-manual, dev-manual: Added CMake toolchain files.

Stefan Agner (1):
      uninative: check .done file instead of tarball

Tom Benn (1):
      dbus: update dbus-1.init to reflect new PID file

Trevor Gamblin (5):
      aspell: upgrade from 0.60.7 to 0.60.8
      binutils: fix CVE-2019-17450
      binutils: fix CVE-2019-17451
      ncurses: fix CVE-2019-17594, CVE-2019-17595
      libgcrypt: upgrade 1.8.4 -> 1.8.5

Trevor Woerner (1):
      libcap-ng: undefined reference to `pthread_atfork'

Wenlin Kang (1):
      sysstat: fix CVE-2019-16167

Yann Dirson (1):
      mesa: fix meson configure fix when 'dri' is excluded from PACKAGECONFIG

Yeoh Ee Peng (1):
      scripts/oe-pkgdata-util: Enable list-pkgs to print ordered packages

Yi Zhao (2):
      libsdl2: fix CVE-2019-13616
      libgcrypt: fix CVE-2019-12904

Zang Ruochen (6):
      bison:upgrade 3.4.1 -> 3.4.2
      e2fsprogs:upgrade 1.45.3 -> 1.45.4
      libxvmc:upgrade 1.0.11 -> 1.0.12
      python3-pip:upgrade 19.2.3 -> 19.3.1
      python-setuptools:upgrade 41.2.0 -> 41.4.0
      libcap-ng:upgrade 0.7.9 -> 0.7.10

Signed-off-by: Brad Bishop <bradleyb@fuzziesquirrel.com>
Change-Id: I50bc42f74dffdc406ffc0dea034e41462fe6e06b
Diffstat (limited to 'poky/meta/classes')
-rw-r--r--  poky/meta/classes/base.bbclass                 1
-rw-r--r--  poky/meta/classes/cmake.bbclass                3
-rw-r--r--  poky/meta/classes/cve-check.bbclass           13
-rw-r--r--  poky/meta/classes/devtool-source.bbclass       8
-rw-r--r--  poky/meta/classes/icecc.bbclass               32
-rw-r--r--  poky/meta/classes/image.bbclass                2
-rw-r--r--  poky/meta/classes/kernel-devicetree.bbclass   20
-rw-r--r--  poky/meta/classes/kernel-fitimage.bbclass     24
-rw-r--r--  poky/meta/classes/libc-package.bbclass        15
-rw-r--r--  poky/meta/classes/package.bbclass             76
-rw-r--r--  poky/meta/classes/report-error.bbclass         8
-rw-r--r--  poky/meta/classes/sanity.bbclass              17
-rw-r--r--  poky/meta/classes/systemd.bbclass              6
-rw-r--r--  poky/meta/classes/testimage.bbclass           18
-rw-r--r--  poky/meta/classes/tinderclient.bbclass       368
-rw-r--r--  poky/meta/classes/uninative.bbclass            2
16 files changed, 135 insertions, 478 deletions
diff --git a/poky/meta/classes/base.bbclass b/poky/meta/classes/base.bbclass
index d3184ecf7..1cea3a221 100644
--- a/poky/meta/classes/base.bbclass
+++ b/poky/meta/classes/base.bbclass
@@ -482,6 +482,7 @@ python () {
# If we're building a target package we need to use fakeroot (pseudo)
# in order to capture permissions, owners, groups and special files
if not bb.data.inherits_class('native', d) and not bb.data.inherits_class('cross', d):
+ d.appendVarFlag('do_prepare_recipe_sysroot', 'depends', ' virtual/fakeroot-native:do_populate_sysroot')
d.setVarFlag('do_unpack', 'umask', '022')
d.setVarFlag('do_configure', 'umask', '022')
d.setVarFlag('do_compile', 'umask', '022')
diff --git a/poky/meta/classes/cmake.bbclass b/poky/meta/classes/cmake.bbclass
index 2b317c832..291f1e8d4 100644
--- a/poky/meta/classes/cmake.bbclass
+++ b/poky/meta/classes/cmake.bbclass
@@ -106,11 +106,12 @@ set( CMAKE_CXX_LINK_FLAGS "${OECMAKE_CXX_LINK_FLAGS}" CACHE STRING "LDFLAGS" )
# only search in the paths provided so cmake doesnt pick
# up libraries and tools from the native build machine
-set( CMAKE_FIND_ROOT_PATH ${STAGING_DIR_HOST} ${STAGING_DIR_NATIVE} ${CROSS_DIR} ${OECMAKE_PERLNATIVE_DIR} ${OECMAKE_EXTRA_ROOT_PATH} ${EXTERNAL_TOOLCHAIN})
+set( CMAKE_FIND_ROOT_PATH ${STAGING_DIR_HOST} ${STAGING_DIR_NATIVE} ${CROSS_DIR} ${OECMAKE_PERLNATIVE_DIR} ${OECMAKE_EXTRA_ROOT_PATH} ${EXTERNAL_TOOLCHAIN} ${HOSTTOOLS_DIR})
set( CMAKE_FIND_ROOT_PATH_MODE_PACKAGE ONLY )
set( CMAKE_FIND_ROOT_PATH_MODE_PROGRAM ${OECMAKE_FIND_ROOT_PATH_MODE_PROGRAM} )
set( CMAKE_FIND_ROOT_PATH_MODE_LIBRARY ONLY )
set( CMAKE_FIND_ROOT_PATH_MODE_INCLUDE ONLY )
+set( CMAKE_PROGRAM_PATH "/" )
# Use qt.conf settings
set( ENV{QT_CONF_PATH} ${WORKDIR}/qt.conf )
diff --git a/poky/meta/classes/cve-check.bbclass b/poky/meta/classes/cve-check.bbclass
index c00d2910b..1c8b2223a 100644
--- a/poky/meta/classes/cve-check.bbclass
+++ b/poky/meta/classes/cve-check.bbclass
@@ -208,19 +208,21 @@ def check_cves(d, patched_cves):
if cve in cve_whitelist:
bb.note("%s-%s has been whitelisted for %s" % (product, pv, cve))
+ # TODO: this should be in the report as 'whitelisted'
+ patched_cves.add(cve)
elif cve in patched_cves:
bb.note("%s has been patched" % (cve))
else:
to_append = False
if (operator_start == '=' and pv == version_start):
- cves_unpatched.append(cve)
+ to_append = True
else:
if operator_start:
try:
to_append_start = (operator_start == '>=' and LooseVersion(pv) >= LooseVersion(version_start))
to_append_start |= (operator_start == '>' and LooseVersion(pv) > LooseVersion(version_start))
except:
- bb.note("%s: Failed to compare %s %s %s for %s" %
+ bb.warn("%s: Failed to compare %s %s %s for %s" %
(product, pv, operator_start, version_start, cve))
to_append_start = False
else:
@@ -231,7 +233,7 @@ def check_cves(d, patched_cves):
to_append_end = (operator_end == '<=' and LooseVersion(pv) <= LooseVersion(version_end))
to_append_end |= (operator_end == '<' and LooseVersion(pv) < LooseVersion(version_end))
except:
- bb.note("%s: Failed to compare %s %s %s for %s" %
+ bb.warn("%s: Failed to compare %s %s %s for %s" %
(product, pv, operator_end, version_end, cve))
to_append_end = False
else:
@@ -243,8 +245,11 @@ def check_cves(d, patched_cves):
to_append = to_append_start or to_append_end
if to_append:
+ bb.note("%s-%s is vulnerable to %s" % (product, pv, cve))
cves_unpatched.append(cve)
- bb.debug(2, "%s-%s is not patched for %s" % (product, pv, cve))
+ else:
+ bb.note("%s-%s is not vulnerable to %s" % (product, pv, cve))
+ patched_cves.add(cve)
conn.close()
return (list(patched_cves), cves_unpatched)
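
The range handling patched above reduces to a pair of LooseVersion comparisons combined with a logical OR. A minimal standalone sketch of that test, using hypothetical versions rather than real NVD records:

    # Minimal sketch of the version-range test used by check_cves() above.
    # The versions and bounds below are hypothetical, not real NVD data.
    from distutils.version import LooseVersion

    def version_in_range(pv, operator_start, version_start, operator_end, version_end):
        to_append_start = False
        if operator_start:
            to_append_start = (operator_start == '>=' and LooseVersion(pv) >= LooseVersion(version_start))
            to_append_start |= (operator_start == '>' and LooseVersion(pv) > LooseVersion(version_start))
        to_append_end = False
        if operator_end:
            to_append_end = (operator_end == '<=' and LooseVersion(pv) <= LooseVersion(version_end))
            to_append_end |= (operator_end == '<' and LooseVersion(pv) < LooseVersion(version_end))
        # As in the class, matching either bound flags the version; NVD entries
        # usually populate only one of the two bounds.
        return to_append_start or to_append_end

    print(version_in_range("4.6.2", ">=", "4.0", "", ""))  # True
    print(version_in_range("3.9", ">=", "4.0", "", ""))    # False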
diff --git a/poky/meta/classes/devtool-source.bbclass b/poky/meta/classes/devtool-source.bbclass
index a8110006f..280d6009f 100644
--- a/poky/meta/classes/devtool-source.bbclass
+++ b/poky/meta/classes/devtool-source.bbclass
@@ -97,17 +97,15 @@ python devtool_post_unpack() {
local_files = oe.recipeutils.get_recipe_local_files(d)
if is_kernel_yocto:
- for key in local_files.copy():
- if key.endswith('scc'):
- sccfile = open(local_files[key], 'r')
+ for key in [f for f in local_files if f.endswith('scc')]:
+ with open(local_files[key], 'r') as sccfile:
for l in sccfile:
line = l.split()
if line and line[0] in ('kconf', 'patch'):
cfg = os.path.join(os.path.dirname(local_files[key]), line[-1])
- if not cfg in local_files.values():
+ if cfg not in local_files.values():
local_files[line[-1]] = cfg
shutil.copy2(cfg, workdir)
- sccfile.close()
# Ignore local files with subdir={BP}
srcabspath = os.path.abspath(srcsubdir)
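
The rewritten loop above walks each .scc fragment once and copies any kconf/patch references it finds. A self-contained sketch of that parsing step, with a made-up fragment path:

    # Standalone sketch of the .scc handling in devtool_post_unpack() above.
    # The fragment path and work directory are made-up examples.
    import os
    import shutil

    def copy_scc_references(sccpath, workdir):
        """Copy files named on 'kconf' and 'patch' lines of an .scc fragment."""
        copied = []
        with open(sccpath, 'r') as sccfile:
            for l in sccfile:
                line = l.split()
                if line and line[0] in ('kconf', 'patch'):
                    cfg = os.path.join(os.path.dirname(sccpath), line[-1])
                    if cfg not in copied:
                        copied.append(cfg)
                        shutil.copy2(cfg, workdir)
        return copied

    # copy_scc_references('/tmp/fragments/feature.scc', '/tmp/workdir')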
diff --git a/poky/meta/classes/icecc.bbclass b/poky/meta/classes/icecc.bbclass
index 4376aa37d..bc3d6f4cc 100644
--- a/poky/meta/classes/icecc.bbclass
+++ b/poky/meta/classes/icecc.bbclass
@@ -356,17 +356,6 @@ set_icecc_env() {
return
fi
- # Create symlinks to icecc in the recipe-sysroot directory
- mkdir -p ${ICE_PATH}
- if [ -n "${KERNEL_CC}" ]; then
- compilers="${@get_cross_kernel_cc(bb,d)}"
- else
- compilers="${HOST_PREFIX}gcc ${HOST_PREFIX}g++"
- fi
- for compiler in $compilers; do
- ln -sf ${ICECC_BIN} ${ICE_PATH}/$compiler
- done
-
ICECC_CC="${@icecc_get_and_check_tool(bb, d, "gcc")}"
ICECC_CXX="${@icecc_get_and_check_tool(bb, d, "g++")}"
# cannot use icecc_get_and_check_tool here because it assumes as without target_sys prefix
@@ -385,6 +374,26 @@ set_icecc_env() {
return
fi
+ # Create symlinks to icecc and wrapper-scripts in the recipe-sysroot directory
+ mkdir -p $ICE_PATH/symlinks
+ if [ -n "${KERNEL_CC}" ]; then
+ compilers="${@get_cross_kernel_cc(bb,d)}"
+ else
+ compilers="${HOST_PREFIX}gcc ${HOST_PREFIX}g++"
+ fi
+ for compiler in $compilers; do
+ ln -sf $ICECC_BIN $ICE_PATH/symlinks/$compiler
+ rm -f $ICE_PATH/$compiler
+ cat <<-__EOF__ > $ICE_PATH/$compiler
+ #!/bin/sh -e
+ export ICECC_VERSION=$ICECC_VERSION
+ export ICECC_CC=$ICECC_CC
+ export ICECC_CXX=$ICECC_CXX
+ $ICE_PATH/symlinks/$compiler "\$@"
+ __EOF__
+ chmod 775 $ICE_PATH/$compiler
+ done
+
ICECC_AS="`${ICECC_CC} -print-prog-name=as`"
# for target recipes should return something like:
# /OE/tmp-eglibc/sysroots/x86_64-linux/usr/libexec/arm920tt-oe-linux-gnueabi/gcc/arm-oe-linux-gnueabi/4.8.2/as
@@ -417,7 +426,6 @@ set_icecc_env() {
export CCACHE_PATH="$PATH"
export CCACHE_DISABLE="1"
- export ICECC_VERSION ICECC_CC ICECC_CXX
export PATH="$ICE_PATH:$PATH"
bbnote "Using icecc path: $ICE_PATH"
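
The heredoc added above swaps plain symlinks for small per-compiler wrapper scripts, so ICECC_VERSION, ICECC_CC and ICECC_CXX are pinned into each compiler invocation instead of being exported through the task environment. A rough Python sketch of the same wrapper-generation idea, with invented paths and values:

    # Rough sketch of the per-compiler wrapper generation done by
    # set_icecc_env() above; every path and value here is invented.
    import os
    import stat

    def write_icecc_wrapper(ice_path, compiler, icecc_env):
        os.makedirs(os.path.join(ice_path, "symlinks"), exist_ok=True)
        wrapper = os.path.join(ice_path, compiler)
        exports = "".join("export %s=%s\n" % (k, v) for k, v in icecc_env.items())
        with open(wrapper, "w") as f:
            # The wrapper invokes the icecc symlink kept under $ICE_PATH/symlinks.
            f.write("#!/bin/sh -e\n%s%s/symlinks/%s \"$@\"\n" % (exports, ice_path, compiler))
        os.chmod(wrapper, os.stat(wrapper).st_mode | stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH)

    write_icecc_wrapper("/tmp/ice", "arm-oe-linux-gnueabi-gcc",
                        {"ICECC_VERSION": "/tmp/ice/toolchain.tar.gz",
                         "ICECC_CC": "/usr/bin/gcc",
                         "ICECC_CXX": "/usr/bin/g++"})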
diff --git a/poky/meta/classes/image.bbclass b/poky/meta/classes/image.bbclass
index f4633da3d..c2824395c 100644
--- a/poky/meta/classes/image.bbclass
+++ b/poky/meta/classes/image.bbclass
@@ -124,7 +124,7 @@ python () {
def rootfs_variables(d):
from oe.rootfs import variable_depends
variables = ['IMAGE_DEVICE_TABLE','IMAGE_DEVICE_TABLES','BUILD_IMAGES_FROM_FEEDS','IMAGE_TYPES_MASKED','IMAGE_ROOTFS_ALIGNMENT','IMAGE_OVERHEAD_FACTOR','IMAGE_ROOTFS_SIZE','IMAGE_ROOTFS_EXTRA_SPACE',
- 'IMAGE_ROOTFS_MAXSIZE','IMAGE_NAME','IMAGE_LINK_NAME','IMAGE_MANIFEST','DEPLOY_DIR_IMAGE','IMAGE_FSTYPES','IMAGE_INSTALL_COMPLEMENTARY','IMAGE_LINGUAS',
+ 'IMAGE_ROOTFS_MAXSIZE','IMAGE_NAME','IMAGE_LINK_NAME','IMAGE_MANIFEST','DEPLOY_DIR_IMAGE','IMAGE_FSTYPES','IMAGE_INSTALL_COMPLEMENTARY','IMAGE_LINGUAS', 'IMAGE_LINGUAS_COMPLEMENTARY',
'MULTILIBRE_ALLOW_REP','MULTILIB_TEMP_ROOTFS','MULTILIB_VARIANTS','MULTILIBS','ALL_MULTILIB_PACKAGE_ARCHS','MULTILIB_GLOBAL_VARIANTS','BAD_RECOMMENDATIONS','NO_RECOMMENDATIONS',
'PACKAGE_ARCHS','PACKAGE_CLASSES','TARGET_VENDOR','TARGET_ARCH','TARGET_OS','OVERRIDES','BBEXTENDVARIANT','FEED_DEPLOYDIR_BASE_URI','INTERCEPT_DIR','USE_DEVFS',
'CONVERSIONTYPES', 'IMAGE_GEN_DEBUGFS', 'ROOTFS_RO_UNNEEDED', 'IMGDEPLOYDIR', 'PACKAGE_EXCLUDE_COMPLEMENTARY', 'REPRODUCIBLE_TIMESTAMP_ROOTFS', 'IMAGE_INSTALL_DEBUGFS']
diff --git a/poky/meta/classes/kernel-devicetree.bbclass b/poky/meta/classes/kernel-devicetree.bbclass
index 8a81c850f..522c46575 100644
--- a/poky/meta/classes/kernel-devicetree.bbclass
+++ b/poky/meta/classes/kernel-devicetree.bbclass
@@ -71,23 +71,23 @@ do_deploy_append() {
dtb=`normalize_dtb "$dtbf"`
dtb_ext=${dtb##*.}
dtb_base_name=`basename $dtb .$dtb_ext`
- install -d ${DEPLOYDIR}
- install -m 0644 ${D}/${KERNEL_IMAGEDEST}/$dtb_base_name.$dtb_ext ${DEPLOYDIR}/$dtb_base_name-${KERNEL_DTB_NAME}.$dtb_ext
- ln -sf $dtb_base_name-${KERNEL_DTB_NAME}.$dtb_ext ${DEPLOYDIR}/$dtb_base_name.$dtb_ext
- ln -sf $dtb_base_name-${KERNEL_DTB_NAME}.$dtb_ext ${DEPLOYDIR}/$dtb_base_name-${KERNEL_DTB_LINK_NAME}.$dtb_ext
+ install -d $deployDir
+ install -m 0644 ${D}/${KERNEL_IMAGEDEST}/$dtb_base_name.$dtb_ext $deployDir/$dtb_base_name-${KERNEL_DTB_NAME}.$dtb_ext
+ ln -sf $dtb_base_name-${KERNEL_DTB_NAME}.$dtb_ext $deployDir/$dtb_base_name.$dtb_ext
+ ln -sf $dtb_base_name-${KERNEL_DTB_NAME}.$dtb_ext $deployDir/$dtb_base_name-${KERNEL_DTB_LINK_NAME}.$dtb_ext
for type in ${KERNEL_IMAGETYPE_FOR_MAKE}; do
if [ "$type" = "zImage" ] && [ "${KERNEL_DEVICETREE_BUNDLE}" = "1" ]; then
cat ${D}/${KERNEL_IMAGEDEST}/$type \
- ${DEPLOYDIR}/$dtb_base_name-${KERNEL_DTB_NAME}.$dtb_ext \
- > ${DEPLOYDIR}/$type-$dtb_base_name-${KERNEL_DTB_NAME}.$dtb_ext.bin
+ $deployDir/$dtb_base_name-${KERNEL_DTB_NAME}.$dtb_ext \
+ > $deployDir/$type-$dtb_base_name-${KERNEL_DTB_NAME}.$dtb_ext.bin
ln -sf $type-$dtb_base_name-${KERNEL_DTB_NAME}.$dtb_ext.bin \
- ${DEPLOYDIR}/$type-$dtb_base_name-${KERNEL_DTB_LINK_NAME}.$dtb_ext.bin
+ $deployDir/$type-$dtb_base_name-${KERNEL_DTB_LINK_NAME}.$dtb_ext.bin
if [ -e "${KERNEL_OUTPUT_DIR}/${type}.initramfs" ]; then
cat ${KERNEL_OUTPUT_DIR}/${type}.initramfs \
- ${DEPLOYDIR}/$dtb_base_name-${KERNEL_DTB_NAME}.$dtb_ext \
- > ${DEPLOYDIR}/${type}-${INITRAMFS_NAME}-$dtb_base_name-${KERNEL_DTB_NAME}.$dtb_ext.bin
+ $deployDir/$dtb_base_name-${KERNEL_DTB_NAME}.$dtb_ext \
+ > $deployDir/${type}-${INITRAMFS_NAME}-$dtb_base_name-${KERNEL_DTB_NAME}.$dtb_ext.bin
ln -sf ${type}-${INITRAMFS_NAME}-$dtb_base_name-${KERNEL_DTB_NAME}.$dtb_ext.bin \
- ${DEPLOYDIR}/${type}-${INITRAMFS_NAME}-$dtb_base_name-${KERNEL_DTB_LINK_NAME}.$dtb_ext.bin
+ $deployDir/${type}-${INITRAMFS_NAME}-$dtb_base_name-${KERNEL_DTB_LINK_NAME}.$dtb_ext.bin
fi
fi
done
diff --git a/poky/meta/classes/kernel-fitimage.bbclass b/poky/meta/classes/kernel-fitimage.bbclass
index 1bcb09c59..ec18a3d69 100644
--- a/poky/meta/classes/kernel-fitimage.bbclass
+++ b/poky/meta/classes/kernel-fitimage.bbclass
@@ -53,6 +53,9 @@ UBOOT_MKIMAGE_DTCOPTS ??= ""
# fitImage Hash Algo
FIT_HASH_ALG ?= "sha256"
+# fitImage Signature Algo
+FIT_SIGN_ALG ?= "rsa2048"
+
#
# Emit the fitImage ITS header
#
@@ -246,6 +249,7 @@ EOF
fitimage_emit_section_config() {
conf_csum="${FIT_HASH_ALG}"
+ conf_sign_algo="${FIT_SIGN_ALG}"
if [ -n "${UBOOT_SIGN_ENABLE}" ] ; then
conf_sign_keyname="${UBOOT_SIGN_KEYNAME}"
fi
@@ -327,7 +331,7 @@ EOF
cat << EOF >> ${1}
signature@1 {
- algo = "${conf_csum},rsa2048";
+ algo = "${conf_csum},${conf_sign_algo}";
key-name-hint = "${conf_sign_keyname}";
${sign_line}
};
@@ -500,27 +504,27 @@ kernel_do_deploy_append() {
# Update deploy directory
if echo ${KERNEL_IMAGETYPES} | grep -wq "fitImage"; then
echo "Copying fit-image.its source file..."
- install -m 0644 ${B}/fit-image.its ${DEPLOYDIR}/fitImage-its-${KERNEL_FIT_NAME}.its
- ln -snf fitImage-its-${KERNEL_FIT_NAME}.its ${DEPLOYDIR}/fitImage-its-${KERNEL_FIT_LINK_NAME}
+ install -m 0644 ${B}/fit-image.its "$deployDir/fitImage-its-${KERNEL_FIT_NAME}.its"
+ ln -snf fitImage-its-${KERNEL_FIT_NAME}.its "$deployDir/fitImage-its-${KERNEL_FIT_LINK_NAME}"
echo "Copying linux.bin file..."
- install -m 0644 ${B}/linux.bin ${DEPLOYDIR}/fitImage-linux.bin-${KERNEL_FIT_NAME}.bin
- ln -snf fitImage-linux.bin-${KERNEL_FIT_NAME}.bin ${DEPLOYDIR}/fitImage-linux.bin-${KERNEL_FIT_LINK_NAME}
+ install -m 0644 ${B}/linux.bin $deployDir/fitImage-linux.bin-${KERNEL_FIT_NAME}.bin
+ ln -snf fitImage-linux.bin-${KERNEL_FIT_NAME}.bin "$deployDir/fitImage-linux.bin-${KERNEL_FIT_LINK_NAME}"
if [ -n "${INITRAMFS_IMAGE}" ]; then
echo "Copying fit-image-${INITRAMFS_IMAGE}.its source file..."
- install -m 0644 ${B}/fit-image-${INITRAMFS_IMAGE}.its ${DEPLOYDIR}/fitImage-its-${INITRAMFS_IMAGE_NAME}-${KERNEL_FIT_NAME}.its
- ln -snf fitImage-its-${INITRAMFS_IMAGE_NAME}-${KERNEL_FIT_NAME}.its ${DEPLOYDIR}/fitImage-its-${INITRAMFS_IMAGE_NAME}-${KERNEL_FIT_LINK_NAME}
+ install -m 0644 ${B}/fit-image-${INITRAMFS_IMAGE}.its "$deployDir/fitImage-its-${INITRAMFS_IMAGE_NAME}-${KERNEL_FIT_NAME}.its"
+ ln -snf fitImage-its-${INITRAMFS_IMAGE_NAME}-${KERNEL_FIT_NAME}.its "$deployDir/fitImage-its-${INITRAMFS_IMAGE_NAME}-${KERNEL_FIT_LINK_NAME}"
echo "Copying fitImage-${INITRAMFS_IMAGE} file..."
- install -m 0644 ${B}/arch/${ARCH}/boot/fitImage-${INITRAMFS_IMAGE} ${DEPLOYDIR}/fitImage-${INITRAMFS_IMAGE_NAME}-${KERNEL_FIT_NAME}.bin
- ln -snf fitImage-${INITRAMFS_IMAGE_NAME}-${KERNEL_FIT_NAME}.bin ${DEPLOYDIR}/fitImage-${INITRAMFS_IMAGE_NAME}-${KERNEL_FIT_LINK_NAME}
+ install -m 0644 ${B}/arch/${ARCH}/boot/fitImage-${INITRAMFS_IMAGE} "$deployDir/fitImage-${INITRAMFS_IMAGE_NAME}-${KERNEL_FIT_NAME}.bin"
+ ln -snf fitImage-${INITRAMFS_IMAGE_NAME}-${KERNEL_FIT_NAME}.bin "$deployDir/fitImage-${INITRAMFS_IMAGE_NAME}-${KERNEL_FIT_LINK_NAME}"
fi
if [ "${UBOOT_SIGN_ENABLE}" = "1" -a -n "${UBOOT_DTB_BINARY}" ] ; then
# UBOOT_DTB_IMAGE is a realfile, but we can't use
# ${UBOOT_DTB_IMAGE} since it contains ${PV} which is aimed
# for u-boot, but we are in kernel env now.
- install -m 0644 ${B}/u-boot-${MACHINE}*.dtb ${DEPLOYDIR}/
+ install -m 0644 ${B}/u-boot-${MACHINE}*.dtb "$deployDir/"
fi
fi
}
diff --git a/poky/meta/classes/libc-package.bbclass b/poky/meta/classes/libc-package.bbclass
index a66e54088..de816bcec 100644
--- a/poky/meta/classes/libc-package.bbclass
+++ b/poky/meta/classes/libc-package.bbclass
@@ -346,14 +346,13 @@ python package_do_split_gconvs () {
if use_bin == "compile":
makefile = oe.path.join(d.getVar("WORKDIR"), "locale-tree", "Makefile")
- m = open(makefile, "w")
- m.write("all: %s\n\n" % " ".join(commands.keys()))
- total = len(commands)
- for i, cmd in enumerate(commands):
- m.write(cmd + ":\n")
- m.write("\t@echo 'Progress %d/%d'\n" % (i, total))
- m.write("\t" + commands[cmd] + "\n\n")
- m.close()
+ with open(makefile, "w") as m:
+ m.write("all: %s\n\n" % " ".join(commands.keys()))
+ total = len(commands)
+ for i, (maketarget, makerecipe) in enumerate(commands.items()):
+ m.write(maketarget + ":\n")
+ m.write("\t@echo 'Progress %d/%d'\n" % (i, total))
+ m.write("\t" + makerecipe + "\n\n")
d.setVar("EXTRA_OEMAKE", "-C %s ${PARALLEL_MAKE}" % (os.path.dirname(makefile)))
d.setVarFlag("oe_runmake", "progress", "outof:Progress\s(\d+)/(\d+)")
bb.note("Executing binary locale generation makefile")
diff --git a/poky/meta/classes/package.bbclass b/poky/meta/classes/package.bbclass
index d8bef3afb..f955df111 100644
--- a/poky/meta/classes/package.bbclass
+++ b/poky/meta/classes/package.bbclass
@@ -826,8 +826,9 @@ python fixup_perms () {
# Now we actually load from the configuration files
for conf in get_fs_perms_list(d).split():
- if os.path.exists(conf):
- f = open(conf)
+ if not os.path.exists(conf):
+ continue
+ with open(conf) as f:
for line in f:
if line.startswith('#'):
continue
@@ -848,7 +849,6 @@ python fixup_perms () {
fs_perms_table[entry.path] = entry
if entry.path in fs_link_table:
fs_link_table.pop(entry.path)
- f.close()
# Debug -- list out in-memory table
#for dir in fs_perms_table:
@@ -1424,10 +1424,9 @@ fi
pkgdest = d.getVar('PKGDEST')
pkgdatadir = d.getVar('PKGDESTWORK')
- data_file = pkgdatadir + d.expand("/${PN}" )
- f = open(data_file, 'w')
- f.write("PACKAGES: %s\n" % packages)
- f.close()
+ data_file = pkgdatadir + d.expand("/${PN}")
+ with open(data_file, 'w') as fd:
+ fd.write("PACKAGES: %s\n" % packages)
pn = d.getVar('PN')
global_variants = (d.getVar('MULTILIB_GLOBAL_VARIANTS') or "").split()
@@ -1778,21 +1777,20 @@ python package_do_shlibs() {
bb.note("Renaming %s to %s" % (old, new))
os.rename(old, new)
pkgfiles[pkg].remove(old)
-
+
shlibs_file = os.path.join(shlibswork_dir, pkg + ".list")
if len(sonames):
- fd = open(shlibs_file, 'w')
- for s in sonames:
- if s[0] in shlib_provider and s[1] in shlib_provider[s[0]]:
- (old_pkg, old_pkgver) = shlib_provider[s[0]][s[1]]
- if old_pkg != pkg:
- bb.warn('%s-%s was registered as shlib provider for %s, changing it to %s-%s because it was built later' % (old_pkg, old_pkgver, s[0], pkg, pkgver))
- bb.debug(1, 'registering %s-%s as shlib provider for %s' % (pkg, pkgver, s[0]))
- fd.write(s[0] + ':' + s[1] + ':' + s[2] + '\n')
- if s[0] not in shlib_provider:
- shlib_provider[s[0]] = {}
- shlib_provider[s[0]][s[1]] = (pkg, pkgver)
- fd.close()
+ with open(shlibs_file, 'w') as fd:
+ for s in sonames:
+ if s[0] in shlib_provider and s[1] in shlib_provider[s[0]]:
+ (old_pkg, old_pkgver) = shlib_provider[s[0]][s[1]]
+ if old_pkg != pkg:
+ bb.warn('%s-%s was registered as shlib provider for %s, changing it to %s-%s because it was built later' % (old_pkg, old_pkgver, s[0], pkg, pkgver))
+ bb.debug(1, 'registering %s-%s as shlib provider for %s' % (pkg, pkgver, s[0]))
+ fd.write(s[0] + ':' + s[1] + ':' + s[2] + '\n')
+ if s[0] not in shlib_provider:
+ shlib_provider[s[0]] = {}
+ shlib_provider[s[0]][s[1]] = (pkg, pkgver)
if needs_ldconfig and use_ldconfig:
bb.debug(1, 'adding ldconfig call to postinst for %s' % pkg)
postinst = d.getVar('pkg_postinst_%s' % pkg)
@@ -1864,11 +1862,10 @@ python package_do_shlibs() {
deps_file = os.path.join(pkgdest, pkg + ".shlibdeps")
if os.path.exists(deps_file):
os.remove(deps_file)
- if len(deps):
- fd = open(deps_file, 'w')
- for dep in sorted(deps):
- fd.write(dep + '\n')
- fd.close()
+ if deps:
+ with open(deps_file, 'w') as fd:
+ for dep in sorted(deps):
+ fd.write(dep + '\n')
}
python package_do_pkgconfig () {
@@ -1898,9 +1895,8 @@ python package_do_pkgconfig () {
pkgconfig_provided[pkg].append(name)
if not os.access(file, os.R_OK):
continue
- f = open(file, 'r')
- lines = f.readlines()
- f.close()
+ with open(file, 'r') as f:
+ lines = f.readlines()
for l in lines:
m = var_re.match(l)
if m:
@@ -1918,10 +1914,9 @@ python package_do_pkgconfig () {
for pkg in packages.split():
pkgs_file = os.path.join(shlibswork_dir, pkg + ".pclist")
if pkgconfig_provided[pkg] != []:
- f = open(pkgs_file, 'w')
- for p in pkgconfig_provided[pkg]:
- f.write('%s\n' % p)
- f.close()
+ with open(pkgs_file, 'w') as f:
+ for p in pkgconfig_provided[pkg]:
+ f.write('%s\n' % p)
# Go from least to most specific since the last one found wins
for dir in reversed(shlibs_dirs):
@@ -1931,9 +1926,8 @@ python package_do_pkgconfig () {
m = re.match(r'^(.*)\.pclist$', file)
if m:
pkg = m.group(1)
- fd = open(os.path.join(dir, file))
- lines = fd.readlines()
- fd.close()
+ with open(os.path.join(dir, file)) as fd:
+ lines = fd.readlines()
pkgconfig_provided[pkg] = []
for l in lines:
pkgconfig_provided[pkg].append(l.rstrip())
@@ -1951,10 +1945,9 @@ python package_do_pkgconfig () {
bb.note("couldn't find pkgconfig module '%s' in any package" % n)
deps_file = os.path.join(pkgdest, pkg + ".pcdeps")
if len(deps):
- fd = open(deps_file, 'w')
- for dep in deps:
- fd.write(dep + '\n')
- fd.close()
+ with open(deps_file, 'w') as fd:
+ for dep in deps:
+ fd.write(dep + '\n')
}
def read_libdep_files(d):
@@ -1965,9 +1958,8 @@ def read_libdep_files(d):
for extension in ".shlibdeps", ".pcdeps", ".clilibdeps":
depsfile = d.expand("${PKGDEST}/" + pkg + extension)
if os.access(depsfile, os.R_OK):
- fd = open(depsfile)
- lines = fd.readlines()
- fd.close()
+ with open(depsfile) as fd:
+ lines = fd.readlines()
for l in lines:
l.rstrip()
deps = bb.utils.explode_dep_versions2(l)
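
Behind the file-handling cleanups above, shlib_provider is a two-level dictionary keyed by soname and then by library path, and a later provider of the same soname overwrites the earlier one with a warning. A compressed sketch with invented package names:

    # Compressed sketch of the shlib_provider bookkeeping in
    # package_do_shlibs() above; package names and sonames are invented.
    shlib_provider = {}

    def register_provider(soname, libdir, pkg, pkgver):
        if soname in shlib_provider and libdir in shlib_provider[soname]:
            old_pkg, old_pkgver = shlib_provider[soname][libdir]
            if old_pkg != pkg:
                print("%s-%s was registered as shlib provider for %s, "
                      "changing it to %s-%s" % (old_pkg, old_pkgver, soname, pkg, pkgver))
        shlib_provider.setdefault(soname, {})[libdir] = (pkg, pkgver)

    register_provider("libfoo.so.1", "/usr/lib", "libfoo", "1.0")
    register_provider("libfoo.so.1", "/usr/lib", "libfoo-alt", "1.1")  # prints the replacement warning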
diff --git a/poky/meta/classes/report-error.bbclass b/poky/meta/classes/report-error.bbclass
index ea043b23e..1a12db120 100644
--- a/poky/meta/classes/report-error.bbclass
+++ b/poky/meta/classes/report-error.bbclass
@@ -78,19 +78,15 @@ python errorreport_handler () {
taskdata['task'] = task
if log:
try:
- logFile = codecs.open(log, 'r', 'utf-8')
- logdata = logFile.read()
-
+ with codecs.open(log, encoding='utf-8') as logFile:
+ logdata = logFile.read()
# Replace host-specific paths so the logs are cleaner
for d in ("TOPDIR", "TMPDIR"):
s = e.data.getVar(d)
if s:
logdata = logdata.replace(s, d)
-
- logFile.close()
except:
logdata = "Unable to read log file"
-
else:
logdata = "No Log"
diff --git a/poky/meta/classes/sanity.bbclass b/poky/meta/classes/sanity.bbclass
index 2d3f49eb1..a14bf5388 100644
--- a/poky/meta/classes/sanity.bbclass
+++ b/poky/meta/classes/sanity.bbclass
@@ -622,13 +622,14 @@ def check_sanity_version_change(status, d):
# In other words, these tests run once in a given build directory and then
# never again until the sanity version or host distrubution id/version changes.
- # Check the python install is complete. glib-2.0-natives requries
- # xml.parsers.expat
+ # Check the python install is complete. Examples that are often removed in
+ # minimal installations: glib-2.0-natives requries # xml.parsers.expat and icu
+ # requires distutils.sysconfig.
try:
import xml.parsers.expat
- except ImportError:
- status.addresult('Your python is not a full install. Please install the module xml.parsers.expat (python-xml on openSUSE and SUSE Linux).\n')
- import stat
+ import distutils.sysconfig
+ except ImportError as e:
+ status.addresult('Your Python 3 is not a full install. Please install the module %s (see the Getting Started guide for further information).\n' % e.name)
status.addresult(check_make_version(d))
status.addresult(check_patch_version(d))
@@ -664,6 +665,7 @@ def check_sanity_version_change(status, d):
status.addresult('Please use ASSUME_PROVIDED +=, not ASSUME_PROVIDED = in your local.conf\n')
# Check that TMPDIR isn't on a filesystem with limited filename length (eg. eCryptFS)
+ import stat
tmpdir = d.getVar('TMPDIR')
status.addresult(check_create_long_filename(tmpdir, "TMPDIR"))
tmpdirmode = os.stat(tmpdir).st_mode
@@ -798,6 +800,11 @@ def check_sanity_everybuild(status, d):
elif d.getVar('SDK_ARCH', False) == "${BUILD_ARCH}":
status.addresult('SDKMACHINE is set, but SDK_ARCH has not been changed as a result - SDKMACHINE may have been set too late (e.g. in the distro configuration)\n')
+ # If SDK_VENDOR looks like "-my-sdk" then the triples are badly formed so fail early
+ sdkvendor = d.getVar("SDK_VENDOR")
+ if not (sdkvendor.startswith("-") and sdkvendor.count("-") == 1):
+ status.addresult("SDK_VENDOR should be of the form '-foosdk' with a single dash\n")
+
check_supported_distro(d)
omask = os.umask(0o022)
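
The SDK_VENDOR check added above accepts exactly one dash, and it must be the leading character. Worked examples with made-up vendor strings:

    # The SDK_VENDOR format rule added above, shown on made-up values.
    def sdk_vendor_ok(sdkvendor):
        return sdkvendor.startswith("-") and sdkvendor.count("-") == 1

    print(sdk_vendor_ok("-oesdk"))   # True: well-formed
    print(sdk_vendor_ok("-my-sdk"))  # False: the second dash breaks the SDK triplets
    print(sdk_vendor_ok("oesdk"))    # False: missing the leading dash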
diff --git a/poky/meta/classes/systemd.bbclass b/poky/meta/classes/systemd.bbclass
index 1dca09964..9e8a82c9f 100644
--- a/poky/meta/classes/systemd.bbclass
+++ b/poky/meta/classes/systemd.bbclass
@@ -32,11 +32,7 @@ if type systemctl >/dev/null 2>/dev/null; then
if [ "${SYSTEMD_AUTO_ENABLE}" = "enable" ]; then
for service in ${SYSTEMD_SERVICE_ESCAPED}; do
- case "${service}" in
- *@*)
- systemctl ${OPTS} enable "${service}"
- ;;
- esac
+ systemctl ${OPTS} enable "$service"
done
fi
diff --git a/poky/meta/classes/testimage.bbclass b/poky/meta/classes/testimage.bbclass
index 525c5a617..844ed8794 100644
--- a/poky/meta/classes/testimage.bbclass
+++ b/poky/meta/classes/testimage.bbclass
@@ -262,6 +262,24 @@ def testimage_main(d):
# It would be better to find these modules using instrospection.
target_kwargs['target_modules_path'] = d.getVar('BBPATH')
+ # hardware controlled targets might need further access
+ target_kwargs['powercontrol_cmd'] = d.getVar("TEST_POWERCONTROL_CMD") or None
+ target_kwargs['powercontrol_extra_args'] = d.getVar("TEST_POWERCONTROL_EXTRA_ARGS") or ""
+ target_kwargs['serialcontrol_cmd'] = d.getVar("TEST_SERIALCONTROL_CMD") or None
+ target_kwargs['serialcontrol_extra_args'] = d.getVar("TEST_SERIALCONTROL_EXTRA_ARGS") or ""
+
+ def export_ssh_agent(d):
+ import os
+
+ variables = ['SSH_AGENT_PID', 'SSH_AUTH_SOCK']
+ for v in variables:
+ if v not in os.environ.keys():
+ val = d.getVar(v)
+ if val is not None:
+ os.environ[v] = val
+
+ export_ssh_agent(d)
+
# runtime use network for download projects for build
export_proxies(d)
diff --git a/poky/meta/classes/tinderclient.bbclass b/poky/meta/classes/tinderclient.bbclass
deleted file mode 100644
index 00f453cec..000000000
--- a/poky/meta/classes/tinderclient.bbclass
+++ /dev/null
@@ -1,368 +0,0 @@
-def tinder_http_post(server, selector, content_type, body):
- import httplib
- # now post it
- for i in range(0,5):
- try:
- h = httplib.HTTP(server)
- h.putrequest('POST', selector)
- h.putheader('content-type', content_type)
- h.putheader('content-length', str(len(body)))
- h.endheaders()
- h.send(body)
- errcode, errmsg, headers = h.getreply()
- #print(errcode, errmsg, headers)
- return (errcode,errmsg, headers, h.file)
- except:
- print("Error sending the report!")
- # try again
- pass
-
- # return some garbage
- return (-1, "unknown", "unknown", None)
-
-def tinder_form_data(bound, dict, log):
- output = []
- # for each key in the dictionary
- for name in dict:
- assert dict[name]
- output.append( "--" + bound )
- output.append( 'Content-Disposition: form-data; name="%s"' % name )
- output.append( "" )
- output.append( dict[name] )
- if log:
- output.append( "--" + bound )
- output.append( 'Content-Disposition: form-data; name="log"; filename="log.txt"' )
- output.append( '' )
- output.append( log )
- output.append( '--' + bound + '--' )
- output.append( '' )
-
- return "\r\n".join(output)
-
-def tinder_time_string():
- """
- Return the time as GMT
- """
- return ""
-
-def tinder_format_http_post(d,status,log):
- """
- Format the Tinderbox HTTP post with the data needed
- for the tinderbox to be happy.
- """
-
- import random
-
- # the variables we will need to send on this form post
- variables = {
- "tree" : d.getVar('TINDER_TREE'),
- "machine_name" : d.getVar('TINDER_MACHINE'),
- "os" : os.uname()[0],
- "os_version" : os.uname()[2],
- "compiler" : "gcc",
- "clobber" : d.getVar('TINDER_CLOBBER') or "0",
- "srcdate" : d.getVar('SRCDATE'),
- "PN" : d.getVar('PN'),
- "PV" : d.getVar('PV'),
- "PR" : d.getVar('PR'),
- "FILE" : d.getVar('FILE') or "N/A",
- "TARGETARCH" : d.getVar('TARGET_ARCH'),
- "TARGETFPU" : d.getVar('TARGET_FPU') or "Unknown",
- "TARGETOS" : d.getVar('TARGET_OS') or "Unknown",
- "MACHINE" : d.getVar('MACHINE') or "Unknown",
- "DISTRO" : d.getVar('DISTRO') or "Unknown",
- "zecke-rocks" : "sure",
- }
-
- # optionally add the status
- if status:
- variables["status"] = str(status)
-
- # try to load the machine id
- # we only need on build_status.pl but sending it
- # always does not hurt
- try:
- f = open(d.getVar('TMPDIR')+'/tinder-machine.id', 'r')
- id = f.read()
- variables['machine_id'] = id
- except:
- pass
-
- # the boundary we will need
- boundary = "----------------------------------%d" % int(random.random()*1000000000000)
-
- # now format the body
- body = tinder_form_data( boundary, variables, log )
-
- return ("multipart/form-data; boundary=%s" % boundary),body
-
-
-def tinder_build_start(d):
- """
- Inform the tinderbox that a build is starting. We do this
- by posting our name and tree to the build_start.pl script
- on the server.
- """
-
- # get the body and type
- content_type, body = tinder_format_http_post(d,None,None)
- server = d.getVar('TINDER_HOST')
- url = d.getVar('TINDER_URL')
-
- selector = url + "/xml/build_start.pl"
-
- #print("selector %s and url %s" % (selector, url))
-
- # now post it
- errcode, errmsg, headers, h_file = tinder_http_post(server,selector,content_type, body)
- #print(errcode, errmsg, headers)
- report = h_file.read()
-
- # now let us find the machine id that was assigned to us
- search = "<machine id='"
- report = report[report.find(search)+len(search):]
- report = report[0:report.find("'")]
-
- bb.note("Machine ID assigned by tinderbox: %s" % report )
-
- # now we will need to save the machine number
- # we will override any previous numbers
- f = open(d.getVar('TMPDIR')+"/tinder-machine.id", 'w')
- f.write(report)
-
-
-def tinder_send_http(d, status, _log):
- """
- Send this log as build status
- """
-
- # get the body and type
- server = d.getVar('TINDER_HOST')
- url = d.getVar('TINDER_URL')
-
- selector = url + "/xml/build_status.pl"
-
- # now post it - in chunks of 10.000 characters
- new_log = _log
- while len(new_log) > 0:
- content_type, body = tinder_format_http_post(d,status,new_log[0:18000])
- errcode, errmsg, headers, h_file = tinder_http_post(server,selector,content_type, body)
- #print(errcode, errmsg, headers)
- #print(h.file.read())
- new_log = new_log[18000:]
-
-
-def tinder_print_info(d):
- """
- Print the TinderBox Info
- Including informations of the BaseSystem and the Tree
- we use.
- """
-
- # get the local vars
- time = tinder_time_string()
- ops = os.uname()[0]
- version = os.uname()[2]
- url = d.getVar('TINDER_URL')
- tree = d.getVar('TINDER_TREE')
- branch = d.getVar('TINDER_BRANCH')
- srcdate = d.getVar('SRCDATE')
- machine = d.getVar('MACHINE')
- distro = d.getVar('DISTRO')
- bbfiles = d.getVar('BBFILES')
- tarch = d.getVar('TARGET_ARCH')
- fpu = d.getVar('TARGET_FPU')
- oerev = d.getVar('OE_REVISION') or "unknown"
-
- # there is a bug with tipple quoted strings
- # i will work around but will fix the original
- # bug as well
- output = []
- output.append("== Tinderbox Info" )
- output.append("Time: %(time)s" )
- output.append("OS: %(ops)s" )
- output.append("%(version)s" )
- output.append("Compiler: gcc" )
- output.append("Tinderbox Client: 0.1" )
- output.append("Tinderbox Client Last Modified: yesterday" )
- output.append("Tinderbox Protocol: 0.1" )
- output.append("URL: %(url)s" )
- output.append("Tree: %(tree)s" )
- output.append("Config:" )
- output.append("branch = '%(branch)s'" )
- output.append("TARGET_ARCH = '%(tarch)s'" )
- output.append("TARGET_FPU = '%(fpu)s'" )
- output.append("SRCDATE = '%(srcdate)s'" )
- output.append("MACHINE = '%(machine)s'" )
- output.append("DISTRO = '%(distro)s'" )
- output.append("BBFILES = '%(bbfiles)s'" )
- output.append("OEREV = '%(oerev)s'" )
- output.append("== End Tinderbox Client Info" )
-
- # now create the real output
- return "\n".join(output) % vars()
-
-
-def tinder_print_env():
- """
- Print the environment variables of this build
- """
- time_start = tinder_time_string()
- time_end = tinder_time_string()
-
- # build the environment
- env = ""
- for var in os.environ:
- env += "%s=%s\n" % (var, os.environ[var])
-
- output = []
- output.append( "---> TINDERBOX RUNNING env %(time_start)s" )
- output.append( env )
- output.append( "<--- TINDERBOX FINISHED (SUCCESS) %(time_end)s" )
-
- return "\n".join(output) % vars()
-
-def tinder_tinder_start(d, event):
- """
- PRINT the configuration of this build
- """
-
- time_start = tinder_time_string()
- config = tinder_print_info(d)
- #env = tinder_print_env()
- time_end = tinder_time_string()
- packages = " ".join( event.getPkgs() )
-
- output = []
- output.append( "---> TINDERBOX PRINTING CONFIGURATION %(time_start)s" )
- output.append( config )
- #output.append( env )
- output.append( "<--- TINDERBOX FINISHED PRINTING CONFIGURATION %(time_end)s" )
- output.append( "---> TINDERBOX BUILDING '%(packages)s'" )
- output.append( "<--- TINDERBOX STARTING BUILD NOW" )
-
- output.append( "" )
-
- return "\n".join(output) % vars()
-
-def tinder_do_tinder_report(event):
- """
- Report to the tinderbox:
- On the BuildStart we will inform the box directly
- On the other events we will write to the TINDER_LOG and
- when the Task is finished we will send the report.
-
- The above is not yet fully implemented. Currently we send
- information immediately. The caching/queuing needs to be
- implemented. Also sending more or less information is not
- implemented yet.
-
- We have two temporary files stored in the TMP directory. One file
- contains the assigned machine id for the tinderclient. This id gets
- assigned when we connect the box and start the build process the second
- file is used to workaround an EventHandler limitation. If BitBake is ran
- with the continue option we want the Build to fail even if we get the
- BuildCompleted Event. In this case we have to look up the status and
- send it instead of 100/success.
- """
- import glob
-
- # variables
- name = bb.event.getName(event)
- log = ""
- status = 1
- # Check what we need to do Build* shows we start or are done
- if name == "BuildStarted":
- tinder_build_start(event.data)
- log = tinder_tinder_start(event.data,event)
-
- try:
- # truncate the tinder log file
- f = open(event.data.getVar('TINDER_LOG'), 'w')
- f.write("")
- f.close()
- except:
- pass
-
- try:
- # write a status to the file. This is needed for the -k option
- # of BitBake
- g = open(event.data.getVar('TMPDIR')+"/tinder-status", 'w')
- g.write("")
- g.close()
- except IOError:
- pass
-
- # Append the Task-Log (compile,configure...) to the log file
- # we will send to the server
- if name == "TaskSucceeded" or name == "TaskFailed":
- log_file = glob.glob("%s/log.%s.*" % (event.data.getVar('T'), event.task))
-
- if len(log_file) != 0:
- to_file = event.data.getVar('TINDER_LOG')
- log += "".join(open(log_file[0], 'r').readlines())
-
- # set the right 'HEADER'/Summary for the TinderBox
- if name == "TaskStarted":
- log += "---> TINDERBOX Task %s started\n" % event.task
- elif name == "TaskSucceeded":
- log += "<--- TINDERBOX Task %s done (SUCCESS)\n" % event.task
- elif name == "TaskFailed":
- log += "<--- TINDERBOX Task %s failed (FAILURE)\n" % event.task
- elif name == "PkgStarted":
- log += "---> TINDERBOX Package %s started\n" % event.data.getVar('PF')
- elif name == "PkgSucceeded":
- log += "<--- TINDERBOX Package %s done (SUCCESS)\n" % event.data.getVar('PF')
- elif name == "PkgFailed":
- if not event.data.getVar('TINDER_AUTOBUILD') == "0":
- build.exec_task('do_clean', event.data)
- log += "<--- TINDERBOX Package %s failed (FAILURE)\n" % event.data.getVar('PF')
- status = 200
- # remember the failure for the -k case
- h = open(event.data.getVar('TMPDIR')+"/tinder-status", 'w')
- h.write("200")
- elif name == "BuildCompleted":
- log += "Build Completed\n"
- status = 100
- # Check if we have a old status...
- try:
- h = open(event.data.getVar('TMPDIR')+'/tinder-status', 'r')
- status = int(h.read())
- except:
- pass
-
- elif name == "MultipleProviders":
- log += "---> TINDERBOX Multiple Providers\n"
- log += "multiple providers are available (%s);\n" % ", ".join(event.getCandidates())
- log += "consider defining PREFERRED_PROVIDER_%s\n" % event.getItem()
- log += "is runtime: %d\n" % event.isRuntime()
- log += "<--- TINDERBOX Multiple Providers\n"
- elif name == "NoProvider":
- log += "Error: No Provider for: %s\n" % event.getItem()
- log += "Error:Was Runtime: %d\n" % event.isRuntime()
- status = 200
- # remember the failure for the -k case
- h = open(event.data.getVar('TMPDIR')+"/tinder-status", 'w')
- h.write("200")
-
- # now post the log
- if len(log) == 0:
- return
-
- # for now we will use the http post method as it is the only one
- log_post_method = tinder_send_http
- log_post_method(event.data, status, log)
-
-
-# we want to be an event handler
-addhandler tinderclient_eventhandler
-python tinderclient_eventhandler() {
- if e.data is None or bb.event.getName(e) == "MsgNote":
- return
-
- do_tinder_report = e.data.getVar('TINDER_REPORT')
- if do_tinder_report and do_tinder_report == "1":
- tinder_do_tinder_report(e)
-
- return
-}
diff --git a/poky/meta/classes/uninative.bbclass b/poky/meta/classes/uninative.bbclass
index 3326c0db3..9f8645a36 100644
--- a/poky/meta/classes/uninative.bbclass
+++ b/poky/meta/classes/uninative.bbclass
@@ -45,7 +45,7 @@ python uninative_event_fetchloader() {
tarballdir = os.path.join(d.getVar("UNINATIVE_DLDIR"), chksum)
tarballpath = os.path.join(tarballdir, tarball)
- if not os.path.exists(tarballpath):
+ if not os.path.exists(tarballpath + ".done"):
bb.utils.mkdirhier(tarballdir)
if d.getVar("UNINATIVE_URL") == "unset":
bb.fatal("Uninative selected but not configured, please set UNINATIVE_URL")