diff --git a/content/concepts/rfw_integration_on_lava.md b/content/concepts/rfw_integration_on_lava.md
index 4ea14c1b0e413ca32b6ce438c3bf8aba8b592d42..2d05be9830b1ac4e9d19113c1f6f4a152eac1f60 100644
--- a/content/concepts/rfw_integration_on_lava.md
+++ b/content/concepts/rfw_integration_on_lava.md
@@ -16,9 +16,11 @@ or executing Robot Framework test suites as it is intended only for continuous i
 Thanks to this integration the coverage test can be extended to cover different test areas by adding
 additional customized libraries and toolchains.
 
-Integrating Robot Framework on LAVA infrastructure adds additional benefits of Robotic Process Automation (RPA),
-[ATDD](https://en.wikipedia.org/wiki/Acceptance_test-driven_development) (Acceptance test–driven development)
-and also allows to use a wide range of open source libraries developed for automation testing.
+[LAVA](https://www.st.com/en/partner-products-and-services/lava-linaro-automated-validation-architecture.html)
+(Linaro Automation and Validation Architecture) is a continuous integration system for deploying operating
+systems onto physical and virtual hardware for running tests. Tests can be simple boot testing, bootloader testing
+and system level testing, although extra hardware may be required for some system tests. Results are tracked over
+time and data can be exported for further analysis.
 
 [Robot Framework](https://www.tutorialspoint.com/robot_framework/robot_framework_overview.htm) is a simple,
 yet powerful and easily extensible tool which utilizes the keyword driven testing approach. It uses a tabular
@@ -26,20 +28,13 @@ syntax which enables creating test cases in a uniform way. All these features en
 be quickly used to automate test cases. The best benefit with Robot Framework for the users is that there
 is no need for using any sort of programming language for implementing and running tests.
 
+Robot Framework is open source software released under the Apache License 2.0.
+Its development is sponsored by the Robot Framework Foundation.
 
-[Listener Mechanism](https://github.com/robotframework/robotframework/blob/master/doc/userguide/src/ExtendingRobotFramework/ListenerInterface.rst)
-Robot Framework has a listener interface that can be used to receive notifications about test execution.
-It provides a versatile framework for test automation, allowing you to create custom listeners to monitor and
-interact with test executions. Listeners are classes or modules with certain special methods.
-Listeners that monitor the whole test execution must be taken into use from the command line.
-
-[LAVA](https://www.st.com/en/partner-products-and-services/lava-linaro-automated-validation-architecture.html)
-(Linaro Automation and Validation Architecture) is a continuous integration system for deploying operating
-systems onto physical and virtual hardware for running tests. Tests can be simple boot testing, bootloader testing
-and system level testing, although extra hardware may be required for some system tests. Results are tracked over
-time and data can be exported for further analysis.
+Integrating Robot Framework on the LAVA infrastructure adds the benefits of Robotic Process Automation (RPA) and
+[ATDD](https://en.wikipedia.org/wiki/Acceptance_test-driven_development) (Acceptance test-driven development),
+and also allows the use of a wide range of open source libraries developed for automation testing.
 
-# Architecture overview
+# Robot Framework architecture overview
 
 
 
@@ -58,6 +53,7 @@ line that is executed as a part of the test case.
 
 **Robot Framework**
 
+Robot Framework is a generic, application- and technology-independent framework.
 The primary advantage of the Robot framework is that it is agnostic of the target under test.
 The interaction with the layers below the framework can be done using the libraries built-in
 or user-created that make use of application interfaces.
@@ -70,6 +66,8 @@ for these new keywords.
 Each Robot Framework library acts as glue between the high level language and low level details of the
 item being tested, or of the environment in which the item to be tested is present.
 
+Robot Framework has a rich set of built-in libraries, including libraries for testing HTTP, FTP, SSH, and XML, as well as libraries for testing user interfaces and databases.
+
 **System Under Test**
 
 This is the actual target on which the testing activity is performed. It could either be a
@@ -79,73 +77,99 @@ system under test. The Robot Framework supports various file formats namely HTML
 (Tab Separated Values), reST (Restructured Text), and Plain text. As per the official documentation
 of Robot framework, the plain text format is recommended.
 
-# Integration of Robot Framework on LAVA
+When Robot Framework is started, it processes the data, executes test cases and generates logs and reports.
+The core framework does not know anything about the target under test, and the interaction with it is
+handled by libraries. Libraries can either use application interfaces directly or use lower level test
+tools as drivers.
 
-Apertis does the complete test automation setup on LAVA infrastructure for all it's reference hardware.
-System integration and boot level testing is done as part of automated on successfully generated image
-from CI/CD image generation pipeline. Test report are displayed on [QA report app](https://qa.apertis.org/)
-and bug task is created for each [failure test cases](https://gitlab.apertis.org/infrastructure/apertis-issues/-/boards/30) on gitlab issue board and tagged it under `test-area:test-failure`.
+# Robot Framework on LAVA
 
 There are two main constraints on automated tests setup on LAVA, the asynchronous way of updating results
-and user not having control over the job once it is submitted. Developers and CI pipeline can both submit
-jobs to LAVA, but they cannot interact with a job while it is running.
-The LAVA workflow define the process
-of submitting a job, wait for the job to be selected for execution, wait for the job to complete it's execution,
-and download test results.
+and the user not having control over the job once it is submitted.
+Developers and the CI pipeline can both submit jobs to LAVA, but they cannot interact with a job while it is running. The LAVA workflow defines the process of submitting a job, waiting for the job to be selected for execution, waiting for the job to complete its execution, and downloading the test results.
 
-Considering the above constraints and covering the wide range of test areas including HMI tests, on Automated test infrastructure setup of LAVA integrating automated test framework of Robot Framework, provide more chances to automate complex tests by make use of open source library under RFW.
+Considering the above constraints, integrating the Robot Framework automated test framework into the LAVA automated test infrastructure covers a wide range of test areas, including HMI tests, and provides more opportunities to automate complex tests by making use of the open source libraries available for Robot Framework.
+
+The Robot Framework can add value to Apertis. Adding Robot Framework to Apertis will involve developing and/or modifying Robot Framework libraries and developing a run-time compatibility layer for LAVA. The run-time compatibility layer for LAVA has two major objectives: to keep testing environments as close as possible to production environments, and to adapt the execution of Robot Framework tests to suit the LAVA constraints.
+
+# Integration approach
 
 A LAVA instance consists of two primary components, **masters** and **workers**, which work like a server and client mechanism.
 The simplest possible configuration is to run the master and worker components on a single machine,
 but for larger instances it can also be configured to support multiple workers controlling a larger
 number of attached devices
-in a [multi node](https://docs.lavasoftware.org/lava/multinode.html) mechanism.
+in a [multi node](https://docs.lavasoftware.org/lava/multinode.html) model.
 
-
-/*
-There are two possible approaches available to integrate Robot Framework on LAVA:
-1. Creating a QEMU emulator which uses Apertis SDK image and starts to execute Robot Framework test suites
-2. Creating a Docker based container and start executing Robot Framework test suites
-
+There are three possible approaches available to integrate Robot Framework on LAVA:
+1. Integrating a standalone development setup inside the dispatcher.
+2. Introducing a different device type to enable a standalone Docker container with a Robot Framework instance.
+3. Introducing a test:docker container to run a Robot Framework instance.
 
-The first approach consists in creating a QEMU emulator with Apertis SDK image and
-installing Robot Framework. By using it all the SDK related test cases can be migrated
-to Robot Framework and run successfully. However, running tests on Fixed Function or HMI
-images is not feasible, making this approach not suitable for testing target relates tests,
-and therefore not meeting all the use cases.
-*/
+The first approach consists of creating a QEMU emulator with the Apertis SDK image and
+installing Robot Framework. In this approach, the user can run all automated tests related to the system and toolchain.
+This approach mainly tests the headless functionality which is part of the development activities.
+However, running target related tests, such as on Fixed Function or HMI images, is not feasible,
+therefore this approach does not meet all the use cases of production readiness.
+
+The second approach consists of creating a separate device type on the LAVA instance which contains a test Docker container with Robot Framework running under the worker context.
+This setup provides the benefits of isolation and security, but it includes the additional effort of maintaining a different device type on LAVA. The test suite must specifically mention the device-type, along with the architecture, to run the tests on this instance.
+An additional advantage is that each test suite execution runs in an independent Docker container, making parallel execution possible for different jobs. This approach increases the isolation of running the test suites and handling reports, but adds memory overhead if too many devices are attached and running simultaneously.
 
-
-
-
-The second approach consists of creating a Docker container which runs under the worker context,
-which provides the benefits of isolation and security. An additional advantage is that each test
-suite execution will be run on independent Docker container making parallel execution possible in
-some scenarios.
+The third approach consists of introducing the test:docker login mechanism on the LAVA instance. This approach is completely developed and open sourced by the Apertis team. Here, the job description should define the Docker part by providing valid credentials to pull the Docker image to run on the dispatcher instance and execute the test steps mentioned in the test suites.
 
 The worker is responsible for running the `lava-worker` daemon to start and monitor test jobs
 running on the dispatcher. Each master has a worker installed by default and additional workers
 can be added on separate machines, known as remote workers. The admin decides how many devices
 are assigned to each worker. In large instances, it is common for all devices to be assigned to
 remote workers to manage the load.
 
-Workflow shows the stages of running jobs starting from triggering the test suite as a job and
-finally updating test report back to the server.
-A test suite consists of multiple test cases
-related which will be run on a specific release, image type and hardware as described in
-[LAVA testing]({{< ref "/guides/lava-apertis-testing.md" >}})
-A Docker instance will be created for each job instance and deleted once the job execution is
-completed. Running each job on dispatcher will create a separate docker instance of Robot Framework
-and monitor the job completion. The test cases execution will be done inside the Robot Framework instance,
-and once the test execution is completed the generated report will be sent back to dispatcher to be parsed
-and send back to server.
-
-**Listener Mechanism**
-
-Custom listeners in Robot Framework offer a range of benefits to enhance your test automation process. These advantages include:
-Custom listeners are valuable when the default logging and reporting mechanisms. For testing processes that demand comprehensive
-insights into test case execution, including individual step outcomes, timing, and custom attributes, a custom listener proves to be a suitable solution.
-Custom listeners play a pivotal role when debugging and troubleshooting are critical aspects of your test automation process.
-When real-time visibility into the progress of test executions is crucial, a custom listener can be specifically designed to provide ongoing updates and insights as the tests are running.
+
+
+Among the above three approaches, the third approach suits all the requirements of the Apertis distribution for easy maintenance of the LAVA instance and adoption by product teams.
+
+# Test execution workflow
+
+Test cases and test suites are developed using the recommended IDE, RIDE. These tests can be run manually on the Apertis SDK and can also be configured to run on LAVA.
+
+The following workflow provides the steps to integrate Robot Framework tests to run on LAVA.
+
+Create a common group for all the Robot Framework tests running on LAVA under apertis-test-cases/lava, e.g. group-robot-tpl.yaml.
+Describe the LAVA job to use the test:docker mechanism, which is part of the Apertis infrastructure, by providing credentials to pull and run the container inside the dispatcher.
+Sample job definition:
+- test:
+    timeout:
+      minutes: 180
+    namespace: rfw-test
+    name: {{group}}-tests
+    docker:
+      image: "docker://registry.gitlab.apertis.org/infrastructure/apertis-docker-images/{{release_version}}-rfw-docker:latest"
+      login:
+        registry: "registry.gitlab.apertis.org"
+        user: "gitlab-ci-token"
+        password: "{{ '{{job.CI_JOB_TOKEN}}' }}"
+    definitions:
+      - repository: https://gitlab-ci-token:{{ '{{job.CI_JOB_TOKEN}}' }}@gitlab.apertis.org/tests/apertis-test-cases.git
+        branch: 'apertis/v2023'
+        history: False
+        from: git
+        name: robot-connman-tests
+        path: test-cases/robot-connman.yaml
+        parameters:
+          DEVICE_IP: "$(lava-target-ip)"
+          ROBOT_FRAMEWORK_CONNMAN_URL: |-
+            https://gitlab-ci-token:{{ '{{job.CI_JOB_TOKEN}}' }}@gitlab.apertis.org/tests/robotframework.git
+
+The below workflow shows the stages of running jobs, starting from triggering the test suite as a job and
+finally updating the test report back to the server. A test suite consists of multiple test cases that validate common functionality and which will be run on a specific release, image type and hardware as described in
+[LAVA testing]({{< ref "/guides/lava-apertis-testing.md" >}}).
+
+In the Docker based approach, a Docker instance will be created for each job and deleted once the job execution is
+completed. Each Docker instance will connect to the DUT based on the job description, considering the architecture and image type.
+Robot Framework runs the tests on the connected DUT over the SSH/SCP protocols, instead of copying the entire test suite onto the DUT and triggering a script to run there.
+Once the test execution is completed, the test report is generated inside the Docker instance and shared with the LAVA server to report the summary.
+
+Robot Framework generates three files in the output directory: output.xml, log.html, and report.html.
+The output.xml file contains the raw data of your test execution, such as test names, statuses, messages, and tags. The log.html file is a detailed log of your test execution, which includes timestamps, keywords, arguments, screenshots, and console output. The report.html file is a summary report of your test execution, which shows the overall statistics, test cases, and errors.
+
+Currently the LAVA server does not process any of these Robot Framework test reports and only tracks the test status, so we plan to add a data parser and provide the complete data to the LAVA server, to host this information on the webserver and provide the link in the report.
+
+Robot Framework generates the status report only after all the test cases have been executed, so to improve the test handling mechanism, Robot Framework provides the listener mechanism to track the status of each test execution and its output artifacts.
+
+Custom listeners in Robot Framework offer a range of benefits to enhance your test automation process.
+These advantages include:
+1. Custom listeners are valuable when the default logging and reporting mechanisms are insufficient. For testing processes that demand comprehensive insights into test case execution, including individual step outcomes, timing, and custom attributes, a custom listener proves to be a suitable solution.
+2. Custom listeners play a pivotal role when debugging and troubleshooting are critical aspects of your test automation process. When real-time visibility into the progress of test executions is crucial, a custom listener can be specifically designed to provide ongoing updates and insights as the tests are running.
+
+These benefits make custom listeners a valuable tool for optimizing and customizing your Robot Framework test automation process.
+
+The custom listener script is written in Python and should be saved with a .py extension. It defines listener functions and handles logging to the console and files.
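The data parser mentioned earlier can be prototyped with Python's standard library, since output.xml is plain XML. The sketch below follows Robot Framework's standard output.xml layout (`<test>` elements with a nested `<status>` element carrying a PASS/FAIL/SKIP verdict); the `summarize_output` function and the trimmed sample document are illustrative, not an existing LAVA component:

```python
import xml.etree.ElementTree as ET

def summarize_output(xml_text):
    """Collect per-test verdicts from a Robot Framework output.xml document."""
    root = ET.fromstring(xml_text)
    results = {}
    # Every executed test appears as a <test> element; its verdict is the
    # 'status' attribute of the nested <status> element.
    for test in root.iter("test"):
        status = test.find("status")
        results[test.get("name")] = status.get("status")
    return results

# Trimmed example of the structure `robot` writes to output.xml:
sample = """
<robot>
  <suite name="Connman">
    <test name="Scan Wifi"><status status="PASS"/></test>
    <test name="Enable Tethering"><status status="FAIL"/></test>
  </suite>
</robot>
"""
print(summarize_output(sample))
# → {'Scan Wifi': 'PASS', 'Enable Tethering': 'FAIL'}
```

Such a summary dictionary is easy to convert into the per-test results that LAVA expects, instead of reporting only a single overall status.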
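As a minimal sketch of such a listener: `ROBOT_LISTENER_API_VERSION = 3` and the `end_test`/`close` methods belong to Robot Framework's listener interface, while the class name, counters and log file below are illustrative. Saving the class in a file named after it (e.g. `ExecutionMonitor.py`) lets Robot Framework pick it up from a file path:

```python
# ExecutionMonitor.py -- a hypothetical real-time progress listener.
class ExecutionMonitor:
    """Listener (API v3) that reports each test verdict as it happens."""
    ROBOT_LISTENER_API_VERSION = 3

    def __init__(self, logfile="listener.log"):
        self.logfile = logfile
        self.passed = 0
        self.failed = 0

    def end_test(self, data, result):
        # Called by Robot Framework after every test case finishes.
        if result.passed:
            self.passed += 1
        else:
            self.failed += 1
        line = f"{result.name}: {result.status}"
        print(line)                       # real-time console feedback
        with open(self.logfile, "a") as f:
            f.write(line + "\n")          # persistent per-test log

    def close(self):
        # Called once after the whole run, with the final tallies.
        print(f"{self.passed} passed, {self.failed} failed")
```

Attached with `robot --listener ExecutionMonitor.py <suite>`, this reports each verdict while the suite is still running, rather than only after output.xml is written.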
@@ -153,9 +177,17 @@
 Run your Robot Framework tests with the listener using the --listener option:
 `robot --listener <listener.py> <Robot_file>`
 
+LAVA provides a plugin mechanism for the [QA dashboard](https://qa.apertis.org/) to host all the status reports of the tests executed on it.
+
+# Project deployment workflow
-# Impact analysis on Apertis distribution
 
 
 
+A project team can host the LAVA server and dispatcher on their own infrastructure and assign roles to LAVA users based on their activities.
+Jobs can be triggered from different tools, e.g. GitLab, Git or Artifactory; the usage of the image repository, deployment toolchain, target setup and publishing of the QA report on the project team dashboard can also be customized.
+
+# Impact analysis on Apertis distribution
+
 ## Infrastructure
 
 Integrating Robot Framework on the existing Apertis infrastructure will require the following changes:
 - Integrating a Docker container on the worker, so the worker setup should be improved to meet the Docker requirements.
@@ -178,39 +210,35 @@ Framework testcases to run on SDK and target devices running Fixed Function or H
 tests defined with the new Robot Framework test suites which will help to improve the test coverage.
 
 # Summary
-• End-End workflow
-- LAVA pipelines can take care of
-All test stages: power on-off, Flashing & booting image.
-loads tests from GIT, run tests, reporting.
-Parallel scheduling to no. of devices to reduce cycle time,
+• End to end workflow
+- LAVA pipelines can take care of all test stages: power on/off, flashing and booting the image, loading tests from Git, running tests and reporting.
+- Parallel scheduling to a number of devices to reduce cycle time.
 
 - Can be configured to lock/reserve devices for specific tests or specific users.
 
 • Internet facing Web Service
 - One front end service (hosted by Central team) for all. Multiple teams can hook their DUT setup to the same service.
-Internet connected enabling collaboration with external partners.
- remotely manage the DUTs- Configure, Power On/Off, boot via web service w/o needing access to DUTs
- Users can login to run tests, view logs, reports (independent of Jenkins CI).
- Role based access permissions (Maintainer, Developer, Guest)
- Mail notifications/alerts
+Internet connected, enabling collaboration with external partners.
+- Remotely manage the DUTs: configure, power on/off and boot via the web service without needing access to the DUTs.
+- Users can log in to run tests and view logs and reports (independent of Jenkins CI).
+- Role based access permissions (Maintainer, Developer, Guest).
+- Mail notifications/alerts.
 
 • ONE DUT TEST setup for all variants
-o One DUT setup can support.
-Linux, Android Tests on device variants and emulator
-All types of tests (with any test f/w) Interface/Functional/System
-Multiple teams can share/reuse the same DUT setups.
+
+- One DUT setup can support Linux and Android tests on device variants and emulators, and all types of tests (with any test framework): interface, functional and system.
+- Multiple teams can share/reuse the same DUT setups.
 
 • Continuous testing
-o Easily hook LAVA to CI/CD setup for continuous testing
-o reuse /trigger same LAVA tests in different CI/CD setups
+- Easily hook LAVA to a CI/CD setup for continuous testing
+- Reuse/trigger the same LAVA tests in different CI/CD setups
 
 • Handles inconsistency.
-o Configured to retry during initial(download) failures/timeout or test run failures.
+
+- Can be configured to retry on initial (download) failures/timeouts or test run failures.
 
 • Inbuilt reporting dashboard
-o Insight full metrics available in the inbuilt dashboard:
- Test result report
- Linkage to Test case definitions.
- Test run timing statistics.
-o Extensive Dashboard - LAVA provides interfaces for integration to Project Dashboards
+- Insightful metrics available in the inbuilt dashboard:
+  - Test result report
+  - Linkage to test case definitions.
+  - Test run timing statistics.
+
+- Extensive Dashboard
+
+LAVA provides interfaces for integration with Project Dashboards
\ No newline at end of file