Commit f51b20d9 authored by Peter Senna Tschudin

Fix typos


Signed-off-by: Peter Senna Tschudin <peter.senna@collabora.com>
parent 02c9a0b5
Merge request !204: Add guide: "Apertis integration testing with LAVA"
Pipeline #240304 passed
@@ -6,14 +6,15 @@ outputs = [ "html", "pdf-in",]
+++
-[LAVA](https://www.lavasoftware.org/) is a testing system allowing to deploy
-operation system to physical and virtual devices, sharing access to devices
-between developers. As a rule tests are started in non-interactive unattended
-mode and LAVA provide logs and results in a human-readable form for analysis.
+[LAVA](https://www.lavasoftware.org/) is a testing system allowing the
+deployment of operating systems to physical and virtual devices, sharing
+access to devices between developers. As a rule tests are started in
+non-interactive unattended mode and LAVA provides logs and results in a
+human-readable form for analysis.
-As a common part of development cycle we need to do some integration testing
-of the application and validate it's behavior on different hardware and
-software platforms.
+As a common part of the development cycle we need to do some integration
+testing of the application and validate it's behavior on different
+hardware and software platforms.
## Integration testing example
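The example script for this section lies outside the changed hunks, so this diff does not show it. Judging from the issues described further down (it prints only a human-readable string and always exits with a success code), its shape is roughly the following; this is an illustrative sketch, not the actual script from the guide:

```
#!/bin/sh
# Naive check (hypothetical reconstruction): reports the systemd state in a
# human-readable way, but always exits 0, so automation cannot tell pass from fail.
state=$(systemctl is-system-running)
if [ "$state" = "running" ]; then
    echo "All systemd units started successfully: $state"
else
    echo "Some systemd units failed or are still activating: $state"
fi
```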
@@ -59,8 +60,8 @@ fi
## Testing in LAVA
-As soon as we done with development, we push all changes to GitLab and CI will
-prepare a new version of package and OS images. But we do not know if the
+As soon as we are done with development, we push all changes to GitLab and CI will
+prepare a new version of the package and OS images. But we do not know if the
updated version of `systemctl` is working well for all supported devices and
OS variants, so we want to have the integration test to be run by LAVA.
@@ -75,7 +76,7 @@ To start the test with LAVA automation we need to:
The script above is not suitable for unattended testing in LAVA due some issues:
-- LAVA relies to exit code to determine if test passed or not. The example above
+- LAVA relies on exit code to determine if test a passed or not. The example above
always return the `success` code, only human-readable string allows to validate
the status of `systemctl`
- if `systemctl is-system-running` call is failed for some reason (segfault for
@@ -132,11 +133,11 @@ indicate the test suite failure.
### Create GIT repository for the test suite
-The test script must be accessible by LAVA for downloading. LAVA have support
-of several methods for downloading but for Apertis the GIT fetch is preferable
-since we are using separate version of test scripts for each release.
+The test script must be accessible by LAVA for downloading. LAVA has support
+for several methods for downloading but for Apertis the GIT fetch is preferable
+since we are using separate versions of test scripts for each release.
-It is strongly recommended to create separate repository with test scripts
+It is strongly recommended to create a separate repository with test scripts
and tools for each single test suite.
As a first step we need a fresh and empty GIT repository anywhere, for
@@ -213,18 +214,18 @@ for our changes:
pattern: "(?P<test_case_id>.*):\\s+(?P<result>(pass|fail))"
```
-This test is aimed to be run for ostree-based minimal Apertis image for
+This test is aimed to be run for an ostree-based minimal Apertis image for
all supported architectures. However the metadata is mostly needed for
documentation purposes.
Action "install" point to GIT repository as a source for the test, so LAVA
Action "install" points to the GIT repository as a source for the test, so LAVA
will fetch and deploy this repository for us.
Action "run" provide the step-by-step instruction how to execute the test.
Action "run" provides the step-by-step instructions on how to execute the test.
Please note that it is recommended to use wrapper for the test for
integration with LAVA.
Action "parse" provides own detection for the status of test results printed
Action "parse" provides its own detection for the status of test results printed
by script.
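A wrapper of the kind recommended above has two jobs: print result lines matching the `parse` pattern, i.e. `<test_case_id>: pass` or `<test_case_id>: fail`, and return a non-zero exit code on failure so that a crash of the checked command is reported rather than silently ignored. A minimal sketch, with the test-case id chosen purely for illustration:

```
#!/bin/sh
# Any abnormal termination of systemctl (including a segfault) yields a
# non-zero status here and is therefore reported as a failure to LAVA.
if systemctl is-system-running > /dev/null 2>&1; then
    echo "systemctl-is-system-running: pass"
else
    echo "systemctl-is-system-running: fail"
    exit 1
fi
```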
2. Push the test case to the GIT repository.
@@ -235,7 +236,7 @@ for our changes:
git commit -s -m "add test case for systemctl" test-cases/test-systemctl.yaml
git push --set-upstream origin wip/example
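The two commands above assume that the `wip/example` branch and the test case file already exist in the checkout. A typical sequence for reaching that point could look like this; the repository URL is a placeholder for your own test-suite repository:

```
git clone git@gitlab.example.com:tests/test-systemctl.git   # placeholder URL
cd test-systemctl
git checkout -b wip/example
mkdir -p test-cases
# create test-cases/test-systemctl.yaml with the content from step 1,
# then commit and push as shown above
```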
-3. Add job template to be run in lava. Job template contains all needed
+3. Add a job template to be run in lava. Job template contains all needed
information for LAVA how to boot the target device and deploy the OS image
onto it.
@@ -261,20 +262,20 @@ for our changes:
path: test-cases/test-systemctl.yaml
name: test-systemctl
```
-Hopefully you don't need to deal with HW-related part, since we already
+Hopefully you don't need to deal with the HW-related part, since we already
have those instructions for all supported boards and Apertis OS images.
Please pay attention to `revision` -- it must point to your development
branch while you are working on your test.
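Put together with the `path` and `name` lines shown above, the test part of such a job template is essentially a LAVA `test` action. A rough sketch follows; the repository URL, timeout and revision value are placeholders, and the real Apertis templates parameterise the board- and image-specific deploy and boot actions that come before it:

```
- test:
    timeout:
      minutes: 15                  # placeholder value
    definitions:
      - from: git
        repository: https://gitlab.example.com/tests/test-systemctl.git   # placeholder
        revision: wip/example      # your development branch, as noted above
        path: test-cases/test-systemctl.yaml
        name: test-systemctl
```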
-Instead of creation of the new template you may want to extend appropriate
-existing template with additional test definition. In this case next step
-could be omitted.
+Instead of creating a new template, you may want to extend the appropriate
+existing template with additional test definition. In this case the next
+step could be omitted.
-6. Add the template into profile.
+6. Add the template into a profile.
Profile file is mapping test jobs to devices under the test. So you need to
-add your job template into proper list. For example we may extend
+add your job template into the proper list. For example we may extend
the templates list named `templates-minimal-ostree` in file
`lava/profiles.yaml`:
```
@@ -282,7 +283,7 @@ for our changes:
- &templates-minimal-ostree
- test-systemctl-tpl.yaml
```
-It is highly recommended to temporary remove or comment out the rest of
+It is highly recommended to temporarily remove or comment out the rest of
templates from the list to avoid unnecessary workload on LAVA while you're
developing the test.
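While the test is under development, the trimmed-down list could look like the fragment below; the commented-out template names are purely illustrative:

```
- &templates-minimal-ostree
  - test-systemctl-tpl.yaml
  # - some-other-existing-tpl.yaml    # illustrative name, disabled for now
  # - yet-another-tpl.yaml            # illustrative name, disabled for now
```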
@@ -295,7 +296,7 @@ for our changes:
in [personal LAVA tests](/qa/personal_lava_tests/) tutorial.
Since the LAVA is a part of Apertis OS CI -- it requires some variables to
-be provided for using of Apertis profiles and templates. Let's define the
+be provided for using Apertis profiles and templates. Let's define the
board we will use for testing, as well as the image release and variant:
release=v2022dev1
@@ -321,11 +322,11 @@ for our changes:
**NB**: it is recommended to set `visibility` variable to "Apertis" group
during development to avoid any credentials/passwords leak by occasion.
Set the additional variable `priority` to `high` allows you to bypass the
-jobs common queue and do not wait your job results for ages.
+jobs common queue and do not wait for your job results for ages.
8. Submit your first job to LAVA.
-Just repeat the `lqa` call above without `-n` option.
+Just repeat the `lqa` call above without the `-n` option.
After the job submission you will see the job ID:
lqa submit -g lava/profiles.yaml -p "${profile_name}" -t visibility:"{'group': ['Apertis']}" -t priority:"high" -t imgpath:${imgpath} -t release:${release} -t image_date:${version} -t image_name:${image_name}
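Assuming `-n` is the switch that keeps `lqa` from actually submitting the job, as the surrounding text implies, the dry run and the real submission differ only in that flag. With the variables from the previous steps already set, the two invocations are:

```
# Dry run: render and validate the job without submitting it
lqa submit -n -g lava/profiles.yaml -p "${profile_name}" \
    -t visibility:"{'group': ['Apertis']}" -t priority:"high" \
    -t imgpath:${imgpath} -t release:${release} \
    -t image_date:${version} -t image_name:${image_name}

# Real submission: the same call without -n prints the LAVA job ID
lqa submit -g lava/profiles.yaml -p "${profile_name}" \
    -t visibility:"{'group': ['Apertis']}" -t priority:"high" \
    -t imgpath:${imgpath} -t release:${release} \
    -t image_date:${version} -t image_name:${image_name}
```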
@@ -344,6 +345,5 @@ for our changes:
git commit -a -m "hello world template added"
git push
-As a last step you need to create merge request in GitLab and as soon as it
-would be accepted your test became a part of Apertis testing CI.
+As a last step you need to create a merge request in GitLab. As soon as it
+gets accepted your test becomes part of Apertis testing CI.