Open Source Firmware Remote Test Environment
The following repository contains a set of tests and other features used to conduct Dasharo firmware validation procedures.
Warning
!!! WARNING !!!
This repository is in the process of migration and multiple major reworks. If
you do not know what you are doing, consider not using it until at least
v0.5.0 is released. Once that release is scheduled, a link to the milestone
will appear here.
!!! WARNING !!!
Table of contents
- Lab architecture
- Test environment overview
- Supported platforms
- Getting started
- Checking Robot Framework syntax before committing
- Useful refactoring tools
- OSFV Stability Checks
- Generating documentation
- Additional documents
Lab architecture
This graphic presents a rough overview of how a DUT can be connected in the Dasharo lab.
The following mechanisms may be used for DUT power control:
The following mechanisms may be used for DUT control:
- serial port over telnet, exposed by ser2net
- PiKVM with USB keyboard emulation
- for some platforms, a mixture of both (serial for output, PiKVM keyboard for input)
Current OSFV architecture
repositories were replaced by common libraries such as those contained here.
These, together with Robot Framework libraries (found here),
are attached to the environment itself via requirements.txt, which needs to be
kept up to date to serve its purpose.
Test environment overview
Dasharo OSFV consists of the following modules:
dasharo-compatibility, dasharo-security, dasharo-performance, and dasharo-stability.
Supported platforms
This table presents platform names along with their config names from the
platform-configs directory. The support level (which tests are supported on a
given platform) may vary.
| Manufacturer | Platform | Firmware | $CONFIG |
|---|---|---|---|
| MSI | PRO Z690 A DDR5 | Dasharo | msi-pro-z690-a-ddr5 |
| MSI | PRO Z690 A WIFI DDR4 | Dasharo | msi-pro-z690-a-wifi-ddr4 |
| NovaCustom | NS50MU | Dasharo | novacustom-ns50mu |
| NovaCustom | NS50PU | Dasharo | novacustom-ns50pu |
| NovaCustom | NS70MU | Dasharo | novacustom-ns70mu |
| NovaCustom | NS70PU | Dasharo | novacustom-ns70pu |
| NovaCustom | NV41MB | Dasharo | novacustom-nv41mb |
| NovaCustom | NV41MZ | Dasharo | novacustom-nv41mz |
| NovaCustom | NV41PZ | Dasharo | novacustom-nv41pz |
| NovaCustom | V540TND | Dasharo | novacustom-v540tnd |
| NovaCustom | V540TU | Dasharo | novacustom-v540tu |
| NovaCustom | V560TND | Dasharo | novacustom-v560tnd |
| NovaCustom | V560TNE | Dasharo | novacustom-v560tne |
| NovaCustom | V560TU | Dasharo | novacustom-v560tu |
| PC Engines | apu4 | Dasharo | pcengines-apu4 |
| Protectli | V1210 | Dasharo | protectli-v1210 |
| Protectli | V1410 | Dasharo | protectli-v1410 |
| Protectli | V1610 | Dasharo | protectli-v1610 |
| Protectli | VP2410 | Dasharo | protectli-vp2410 |
| Protectli | VP2420 | Dasharo | protectli-vp2420 |
| Protectli | VP4630 | Dasharo | protectli-vp4630 |
| Protectli | VP4650 | Dasharo | protectli-vp4650 |
| Protectli | VP4670 | Dasharo | protectli-vp4670 |
| QEMU | Q35 | Dasharo (UEFI) | qemu |
| Raptor-CS | TalosII | Dasharo | raptor-cs_talos2 |
| Raspberry Pi | RaspberryPi 3B | Yocto | rpi-3b |
platform-configs has recently been reworked: it now uses a tree topology,
grouping platforms by more generic settings near the top, with more specific
flags and settings placed lower down, closest to the exact platform model.
Example:

DCU
OSFV uses the
Dasharo Configuration Utility (DCU),
which lets you set firmware settings directly in a binary file.

Getting started
Initializing environment
- Clone repository and setup virtualenv:
git clone https://github.com/Dasharo/open-source-firmware-validation
cd open-source-firmware-validation
git checkout develop
git submodule update --init --checkout --recursive --remote
python3 -m virtualenv venv
source venv/bin/activate
- Install modules (in case of Raptor Talos II platform):
pip install -U -r requirements-openbmc.txt
- Install modules (in case of other platforms):
pip install -r requirements.txt
ansible-galaxy install -r requirements.yml
- Follow the initialization instructions in
osfv-test-data/README.md:
cd osfv-test-data
git annex pull
./setup.sh
cd ..
If you return to the environment after the first initialization, you must recreate the virtualenv and reinstall requirements.txt for it to work properly:
python3 -m virtualenv venv
source venv/bin/activate
pip install -r ./requirements.txt
ansible-galaxy install -r requirements.yml
- Or just create an alias:
alias penv="python3 -m virtualenv venv && source venv/bin/activate && \
pip install -r ./requirements.txt && \
ansible-galaxy install -r requirements.yml"
NOTE:
keywords.robot requires osfv_cli to be installed on the host system. Go through these steps to configure the scripts.
- Executing manual steps requires the tkinter module, which cannot be installed via pip:
sudo dnf install python3-tkinter
Running tests
When running tests on Dasharo platforms use the following commands:
- For running a single test case:
robot -L TRACE -v rte_ip:$RTE_IP -v config:$CONFIG -v device_ip:$DEVICE_IP \
-t $TEST_CASE_ID $TEST_MODULE/$TEST_SUITE
- For running a single test suite:
robot -L TRACE -v rte_ip:$RTE_IP -v config:$CONFIG -v device_ip:$DEVICE_IP \
$TEST_MODULE/$TEST_SUITE
- For running a single test module:
robot -L TRACE -v rte_ip:$RTE_IP -v config:$CONFIG -v device_ip:$DEVICE_IP \
$TEST_MODULE
Parameters should be defined as follows:
- $DEVICE_IP - IP address of the DUT. Required only when there is no serial input enabled for the device, or tests are executed over SSH. Currently, this is the case for NovaCustom and MSI devices.
- $RTE_IP - IP address of the RTE. Required only if RTE is used on a given test stand.
- $FW_FILE - path to and name of the coreboot firmware file. This is usually not required when running single tests or suites, where flashing is not necessary.
- $CONFIG - platform config - see the platform-configs directory for available configurations.
- $TEST_MODULE - name of the test module (e.g. dasharo-compatibility),
- $TEST_SUITE - name of the test suite (e.g. uefi-shell.robot),
- $TEST_CASE_ID - ID of the test case to run (e.g. CBP001.001*). Note that an asterisk should be appended after the test case ID if you do not wish to provide the full test name here.
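Putting these parameters together, a complete invocation might look like the sketch below. All values are hypothetical placeholders (addresses, config, and test identifiers are examples only); the command is echoed rather than executed so it can be reviewed first:

```shell
# Hypothetical parameter values, for illustration only; substitute the
# addresses and names from your own test stand.
RTE_IP=192.168.10.15
DEVICE_IP=192.168.10.101
CONFIG=novacustom-nv41pz
TEST_MODULE=dasharo-compatibility
TEST_SUITE=uefi-shell.robot
TEST_CASE_ID="CBP001.001*"

# Echo (rather than execute) the composed command so it can be reviewed;
# drop the leading "echo" to actually run it.
echo robot -L TRACE -v rte_ip:$RTE_IP -v config:$CONFIG \
    -v device_ip:$DEVICE_IP -t "$TEST_CASE_ID" $TEST_MODULE/$TEST_SUITE
```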
You can also run tests with -v snipeit:no in order to skip checking whether
the platform is available on SnipeIT and fetching data from the asset page.
By default, this check is enabled. Note that if you choose to skip it, you
may need to provide the following parameters:
- $SONOFF_IP - IP of the Sonoff device. Required if the DUT uses Sonoff for power control.
- $PIKVM_IP - IP of PiKVM. Required if the DUT's connection method is PiKVM.
The command below is an example of how to run tests without using SnipeIT on a platform that uses both Sonoff and PiKVM:
robot -L TRACE -v snipeit:no -v rte_ip:$RTE_IP -v config:$CONFIG \
-v device_ip:$DEVICE_IP -v sonoff_ip:$SONOFF_IP -v pikvm_ip:$PIKVM_IP \
$TEST_MODULE
Running tests via wrapper
Tests can be run directly via the robot command, but also via the run.sh
wrapper:
DEVICE_IP=$DEVICE_IP RTE_IP=$RTE_IP CONFIG=$CONFIG ./scripts/run.sh $TEST_SUITE
Running tests without snipeit requires additional variables:
DEVICE_IP=$DEVICE_IP RTE_IP=$RTE_IP CONFIG=$CONFIG SNIPEIT_NO="y" \
SONOFF_IP=$SONOFF_IP PIKVM_IP=$PIKVM_IP \
./scripts/run.sh $TEST_SUITE
Mind that SNIPEIT_NO only needs to be set: whatever value it has, it will be
treated as true.
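A minimal sketch of this "set means true" convention, assuming run.sh merely tests whether the variable is set at all (the variable name is real, the check below is illustrative):

```shell
# Even a value of "no" still skips SnipeIT, because only the presence of
# the variable is checked, not its content.
SNIPEIT_NO="no"

# "${SNIPEIT_NO+set}" expands to "set" whenever the variable is defined.
if [ -n "${SNIPEIT_NO+set}" ]; then
    RESULT="SnipeIT skipped"
else
    RESULT="SnipeIT used"
fi
echo "$RESULT"
```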
You may also specify a DIR_PREFIX when executing this wrapper script. The given prefix will be added to the beginning of the test results directory name.
Running tests with additional arguments
Any additional parameters for robot can be passed through the wrapper by
giving them after a '--' separator. The arguments can be anything that robot
accepts. For example, specifying the tests to perform by giving a test case
ID and reducing the output verbosity:
DEVICE_IP=$DEVICE_IP RTE_IP=$RTE_IP CONFIG=$CONFIG ./scripts/run.sh $TEST_SUITE -- -t $TEST_CASE_ID --quiet
Running regression tests
Regression tests involve running all OSFV tests supported by the given platform. The support for certain tests is indicated by the flags in the platform config file.
FW_FILE=$FW_FILE DEVICE_IP=$DEVICE_IP RTE_IP=$RTE_IP CONFIG=$CONFIG ./scripts/regression.sh
Running regression tests without snipeit works the same way as running regular tests.
Running regression tests with additional arguments
Giving additional arguments to robot can be done in the same way as in run.sh.
Example: running only minimal regression tests with given test ID and reduced verbosity:
FW_FILE=$FW_FILE DEVICE_IP=$DEVICE_IP RTE_IP=$RTE_IP CONFIG=$CONFIG ./scripts/regression.sh -- --include "minimal-regression" -t "BMM*" --quiet
Basic Platform Setup
The test suite at util/basic-platform-setup.robot is intended to check, and
where possible ensure, that the platform is prepared for performing tests.
The suite should be run at least once to make sure the platform is good to go,
and it can be rerun later to verify that everything still works fine.
The scripts/regression.sh wrapper script runs this test suite automatically
before the Dasharo test modules. This behavior can be turned off by setting
the NO_SETUP environment variable to any value.
Checking Robot Framework syntax before committing
Before pushing changes to the Open Source Firmware Remote Test Environment repository, it's recommended to run:
pre-commit run --all-files
This command checks the code for syntax and style issues to ensure its integrity before pushing to the remote repository.
Useful refactoring tools
- sherlock
- can detect unused keywords, and much more
- ./scripts/refactoring-state.sh
- dedicated script for this repo
- Renaming keywords
- Renaming Test Cases
- Renaming Variables
git-cliff - Automating changelog generation
The OSFV uses git-cliff to automate the generation of changelogs based on
commit messages, following the Conventional
Commits specification.
git-cliff helps maintain a clear history of changes by categorizing them into
sections.
git-cliff is already included in the project's dependencies, so it will be
installed automatically when setting up the
environment.
The configuration file cliff.toml is located in the repository and defines the
rules for how commit messages are grouped in the changelog.
Generating changelogs
To generate a changelog for a specific tag (e.g., v0.2.0), run the following command:
git cliff --tag v0.2.0 > CHANGELOG_NAME.md
This command generates a changelog for all commits from the previous tag to
v0.2.0 and saves it to a CHANGELOG_NAME.md file.
Example usage
You can also generate a changelog for a specific range of commits between two tags. For example:
git cliff --range v0.1.0..v0.2.0 > CHANGELOG_v0.2.0.md
Customizing change categories
Commit messages are grouped based on the rules defined in the cliff.toml file. To adjust these rules, you can modify the file to add custom commit categories or tweak existing patterns.
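As a sketch, grouping rules in cliff.toml are regular expressions matched against commit messages; a fragment might look like the following (illustrative only, not the repository's actual configuration):

```toml
# Illustrative cliff.toml fragment, not the repository's actual file.
[git]
conventional_commits = true
commit_parsers = [
  { message = "^feat", group = "Features" },
  { message = "^fix", group = "Bug Fixes" },
  { message = "^docs", group = "Documentation" },
]
```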
reuse - Automating license compliance
To ensure the project complies with licensing requirements, OSFV uses the reuse tool. This tool helps automate the process of adding license headers and ensures all files in the repository are compliant with SPDX license standards.
reuse is already included in the project’s dependencies, so it will be automatically installed when setting up the environment.
Adding license headers
After adding new files to the repository, ensure they have proper license headers. You can do this manually or automate the process using reuse.
Some files already have rules inside REUSE.toml; therefore, there is no need
to add a license for them. You can check whether your files need a license to
be specified by running reuse lint.
If your file needs a custom license but is already covered by a rule in
REUSE.toml, you can still add a separate .license file or a license header
for it. Just make sure that the rule your file is subject to has its
precedence field set to closest.
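For illustration, such a rule might look like the following (the path and copyright values are hypothetical, not taken from the repository's actual REUSE.toml):

```toml
# Hypothetical REUSE.toml fragment, for illustration only.
version = 1

[[annotations]]
path = "scripts/**"
# "closest" lets a more specific .license file or license header take
# precedence over this blanket rule:
precedence = "closest"
SPDX-FileCopyrightText = "3mdeb <contact@3mdeb.com>"
SPDX-License-Identifier = "Apache-2.0"
```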
Here are some examples:
- For files that support comments (e.g., .py, .sh):
To add license headers to files like Python scripts, use the following command:
reuse annotate --copyright="3mdeb <contact@3mdeb.com>" --license="Apache-2.0" <file.py>
- For files that don't support comments (e.g., binary files):
For files that cannot contain comment-based license headers (such as .pem or .bin files), add a .license file next to them:
reuse annotate --force-dot-license <file.bin>
- Adding license headers in bulk:
If you want to add license headers to multiple files at once, you can use the find command. For example, to add headers to all .sh files:
find . -type f -name "*.sh" -exec reuse annotate --copyright="3mdeb <contact@3mdeb.com>" --license="Apache-2.0" {} \;
Checking license compliance
After adding or modifying files, before releasing a new version of the project, run the reuse lint tool to check if all files are compliant with the license requirements:
reuse lint
This command will generate a report indicating any files missing proper license headers or other licensing issues. Based on this report, you can make the necessary corrections. Example workflow:
- Add new files or make changes to existing ones.
- Use reuse annotate to add license headers to the new files.
- Run reuse lint to ensure the project complies with licensing requirements.
- Address any issues reported by the tool.
- Commit the changes and prepare the release.
OSFV Stability Checks
OSFV evolves quickly, and many of the changes being added break tests in corner cases. Because of that, OSFV releases were paused for a couple of months. To restore the releases and work towards improving reliability and reducing the fail rate caused by errors in the testing environment, regression tests of most OSFV test suites are run on multiple supported devices to verify how many tests pass out of the box, without any kind of maintenance.
osfv_stability_run.py
scripts/ci/osfv_stability_run.py is used to run the test scope defined in
scripts/ci/regression-scope/configs/release_tests_suite_list_minimal.txt on devices
from scripts/ci/regression-scope/configs/release_tests_devices.txt, more
precisely defined in scripts/ci/regression-scope/devices/.
It reuses the system used for automatic CI runs on PRs.
The osfv_stability_run.py script will try to run the whole test scope on
all supported devices twice, waiting until each device is free for check-out
on Snipe-IT. This can easily take multiple hours, so always make sure to run
it on a stable machine with a permanent connection to the lab network.
Use LOGS_DIR env variable to redirect logs to NFS for future reference:
export LOGS_DIR=/srv/nfs/logs/osfv_stability/ci_logs
Use the MANUAL_TESTS_LIST env var to select the list of tests to run, e.g. the whole OSFV scope:
export MANUAL_TESTS_LIST="scripts/ci/regression-scope/configs/release_tests_suite_list_minimal.txt"
Use the DEVICES env var to configure the list of devices to run on:
export DEVICES="scripts/ci/regression-scope/configs/release_tests_devices.txt"
osfv_stability_reports.py
After the results are created, use scripts/ci/osfv_stability_reports.py to
create a summary of all the runs. It can take a couple of minutes to parse
all the logs.
Use the LOGS_DIR env var to point the parser at the directory with logs,
e.g. the NFS share with all the historic official runs on OSFV releases:
export LOGS_DIR=/srv/nfs/logs/osfv_stability/ci_logs
Use the --json option to format the output as JSON for easy processing:
./scripts/ci/osfv_stability_reports.py --json > reports.json
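The JSON output can then be post-processed with standard tools. A small sketch, using an invented report structure (the real schema produced by osfv_stability_reports.py may differ):

```shell
# The report structure below is invented purely for illustration; the real
# output of osfv_stability_reports.py may differ.
cat > reports.json <<'EOF'
{"runs": [{"device": "qemu", "passed": 40, "failed": 2}]}
EOF

# Summarize the report with python3 (avoids assuming jq is installed):
python3 - <<'EOF'
import json

with open("reports.json") as f:
    report = json.load(f)
for run in report["runs"]:
    print(f"{run['device']}: {run['passed']} passed, {run['failed']} failed")
EOF
```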
Generating documentation
Documentation in the form of auto-generated html documents can be created using
libdoc and testdoc.
Note: you should be in your Python virtual environment. If you haven't created one, please refer to Getting started.
To generate a document for a resource file containing keywords, use these commands:
$(venv) libdoc keywords.robot keywords.html
Or in more general form:
$(venv) libdoc <file-with-keywords> <output filename>
The output file can be opened in any web-browser like so:
$ firefox keywords.html
Or use the provided create-docs.sh script, which automatically concatenates
all of the keyword-containing libraries from lib/ directory with
keywords.robot, and generates one big html file containing all the
keywords within this repo.
$(venv) ./scripts/create-docs.sh
Documentation generated and saved as ./docs/index.html
The resulting file can be opened in any web-browser:
$ firefox docs/index.html
To generate documentation for a specific test suite, testdoc has to be used.
For example, if we want documentation for the
dasharo-compatibility/dasharo-tools-suite.robot suite, these commands would
need to be executed:
$ python3 -m robot.testdoc dasharo-compatibility/dasharo-tools-suite.robot test.html
$ firefox test.html
This website shows
the current state of all keywords from all libraries as they appear right now
on the develop branch. It is updated by a workflow, so remember that local
changes won't show up there until they are pushed to the develop branch.
Additional documents
- Adding new platforms - Instructions for adding support for new platforms
- Adding and naming test cases - Instructions for adding new test cases and (re-)assigning IDs
- Contributing - Instructions for first-time contributors
- Raptor CS Talos II - Documentation specific to the Raptor Computing Systems Talos II mainboard
- QEMU - Documentation for running tests in QEMU
- NovaCustom - Documentation for running tests on NovaCustom laptops
- Config parser - Instructions for the scripts/config-parser.py utility for parsing coreboot config files into .robot platform configs for OSFV
