- Sep 13, 2024
Dylan Aïssi authored
The file was moved when the repository was reorganized. See: apertis-infrastructure@41fed1b1
Signed-off-by: Dylan Aïssi <dylan.aissi@collabora.com>

- Jul 18, 2024
Dylan Aïssi authored
Signed-off-by: Dylan Aïssi <dylan.aissi@collabora.com>

Dylan Aïssi authored
This allows trigger-* jobs to run quickly without waiting for the dashboard to be built (which takes a while).
Signed-off-by: Dylan Aïssi <dylan.aissi@collabora.com>

- Jul 16, 2024
Dylan Aïssi authored
Signed-off-by: Dylan Aïssi <dylan.aissi@collabora.com>

Dylan Aïssi authored
Updates from backports, testing or unstable are made on a case-by-case basis, and should not be triggered automatically.
Signed-off-by: Dylan Aïssi <dylan.aissi@collabora.com>

- Jul 12, 2024
Dylan Aïssi authored
Signed-off-by: Dylan Aïssi <dylan.aissi@collabora.com>

- Jun 28, 2024
- Feb 22, 2024
Dylan Aïssi authored
For some packages, the watch file uses the "git" mode to check a git repository. In that case git is required, otherwise uscan fails with:

uscan issue: Command 'git' not found in /usr/local/sbin, /usr/local/bin, /usr/sbin, /usr/bin, /sbin, /bin at /usr/share/perl5/Devscripts/Utils.pm line 24.

Signed-off-by: Dylan Aïssi <dylan.aissi@collabora.com>

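For reference, a debian/watch file using git mode looks roughly like this; the repository URL and tag pattern are illustrative, not taken from an actual Apertis package:

```
version=4
opts="mode=git" https://example.com/upstream/project.git refs/tags/v([\d.]+)
```

With this mode uscan shells out to git to enumerate the upstream tags, hence the hard dependency on the git binary.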
- Jan 30, 2024
Dylan Aïssi authored
Both the packaging-updates and packaging-check-invariants scripts contain quite long lists of packages to ignore, which makes both the scripts and the lists difficult to read. Instead, move the lists into a dedicated file at data/whitelists.yaml. This should make maintenance easier.
Signed-off-by: Dylan Aïssi <dylan.aissi@collabora.com>

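The commit does not show the file's layout; a hypothetical data/whitelists.yaml could look along these lines (keys and package names invented for illustration):

```yaml
# Hypothetical layout, not the actual schema:
packaging-updates:
  ignore:
    - some-package
    - another-package
packaging-check-invariants:
  ignore:
    - some-package
```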
- Jan 22, 2024
Dylan Aïssi authored
Otherwise uscan fails with a message saying required files are badly formatted. This issue is not observed in Apertis because all our repositories are public, but it affects downstreams since their repositories are not publicly available.
Signed-off-by: Dylan Aïssi <dylan.aissi@collabora.com>

- Nov 04, 2023
Walter Lozano authored
After switching to JSON as the format for the outcome of the different steps in the dashboard, the resources needed dropped dramatically. In this context lightweight runners should be sufficient, and it is preferable to reduce costs.
Signed-off-by: Walter Lozano <walter.lozano@collabora.com>

Walter Lozano authored
YAML is a nice format, very easy to read; unfortunately the Python YAML library is very inefficient, both CPU and memory wise. Loading the same content using JSON takes 10 times less memory and time. Since the dashboard is always struggling with OOM, let's use JSON for the data it produces. As a reference, the results of importing a 10 MB file with YAML and JSON:

yaml-json $ ./test.py yaml
Time 15.507138013839722 seg
Memory (70914086, 394044198) bytes (current, peak)
yaml-json $ ./test.py json
Time 0.6210496425628662 seg
Memory (58913059, 67501787) bytes (current, peak)

Signed-off-by: Walter Lozano <walter.lozano@collabora.com>

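The kind of measurement quoted above can be reproduced with the standard library's `time` and `tracemalloc` modules. This is a minimal sketch with synthetic data, not the original test.py; the YAML half needs the third-party PyYAML package and is skipped when it is not installed.

```python
import json
import time
import tracemalloc

def measure(load, text):
    """Return (seconds, peak_bytes) for parsing `text` with `load`."""
    tracemalloc.start()
    start = time.perf_counter()
    load(text)
    elapsed = time.perf_counter() - start
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return elapsed, peak

# A small synthetic document; the dashboard's real files are ~10 MB.
data = {"packages": [{"name": f"pkg{i}", "version": "1.0"} for i in range(1000)]}
text = json.dumps(data)

json_time, json_peak = measure(json.loads, text)
print(f"json: {json_time:.4f}s, peak {json_peak} bytes")

try:
    import yaml  # JSON is a subset of YAML, so safe_load parses it too
    yaml_time, yaml_peak = measure(yaml.safe_load, text)
    print(f"yaml: {yaml_time:.4f}s, peak {yaml_peak} bytes")
except ImportError:
    pass
```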
Walter Lozano authored
In commit 797862 the logic continues even if the cache file cannot be downloaded. Unfortunately this approach is buggy, since wget can create empty files when the download fails. To avoid passing an invalid cache file, check that the file is not empty.
Signed-off-by: Walter Lozano <walter.lozano@collabora.com>

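A minimal sketch of the check described above: treat a zero-byte download as "no cache" instead of passing it on. The file name is illustrative, not the pipeline's actual layout.

```python
import os

def usable_cache(path: str) -> bool:
    """A cache file is usable only if it exists and is non-empty."""
    return os.path.isfile(path) and os.path.getsize(path) > 0

# A failed wget can leave a 0-byte file behind; simulate that:
open("packaging-cache.json", "w").close()
print(usable_cache("packaging-cache.json"))  # → False
```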
- Nov 03, 2023
In some circumstances there is no valid artifact to download to be used as cache, causing the job to fail. In such a case, instead of aborting the job, just continue without cache. As a reference:

$ CACHE_ARGS=""
$ ARTIFACT_URL=${ARTIFACT_URL:-$CI_API_V4_URL/projects/$CI_PROJECT_ID/jobs/artifacts/$CI_DEFAULT_BRANCH/raw/packaging-cache.yaml?job=pages}
$ if [ "$ARTIFACT_URL" != none ] && [ "$DISABLE_CACHE" == "no" ] && [ "$FILTER_ON_CACHE" == "yes" ] # collapsed multi-line command
--2023-10-24 18:16:17-- https://gitlab.apertis.org/api/v4/projects/6587/jobs/artifacts/master/raw/packaging-cache.yaml?job=pages
Resolving gitlab.apertis.org (gitlab.apertis.org)... 116.203.10.182, 2a01:4f8:1c0c:80ad::1
Connecting to gitlab.apertis.org (gitlab.apertis.org)|116.203.10.182|:443... connected.
HTTP request sent, awaiting response... 404 Not Found
2023-10-24 18:16:18 ERROR 404: Not Found.

Signed-off-by: Walter Lozano <walter.lozano@collabora.com>

- Nov 02, 2023
Emanuele Aina authored
The current naive approach at retrying is not exactly gentle: if the server is overloaded we retry right away, making the problem just worse. See https://encore.dev/blog/retries for a fun explanation of how the naive approach can be catastrophic. Fortunately the `tenacity` library allows us to add a bounded exponential backoff while also making the code easier to understand. This should make the dashboard behave much better toward GitLab and OBS when they hit some issue.
Signed-off-by: Emanuele Aina <emanuele.aina@collabora.com>

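For readers unfamiliar with the technique: bounded exponential backoff waits twice as long after each failure, up to a cap, with random jitter so that many clients do not retry in lockstep. A hand-rolled stdlib sketch of the behaviour `tenacity` provides (all names and limits here are illustrative, not the dashboard's actual configuration):

```python
import random
import time

def retry_with_backoff(func, attempts=5, base=1.0, cap=30.0, sleep=time.sleep):
    """Call func(); on the n-th failure wait ~base*2^n seconds (jittered, capped)."""
    for attempt in range(attempts):
        try:
            return func()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts, propagate the error
            delay = min(cap, base * 2 ** attempt)
            sleep(delay * random.uniform(0.5, 1.0))  # jitter avoids thundering herds

# With tenacity the same policy is declarative, roughly:
#   @retry(wait=wait_exponential(multiplier=1, max=30), stop=stop_after_attempt(5))
```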
- Sep 06, 2023
Emanuele Aina authored
Gathering the YAML files from the different jobs and merging them locally to reproduce dashboard rendering issues is really tedious, so let's capture the merged `packaging.yaml` even on failures.
Signed-off-by: Emanuele Aina <emanuele.aina@collabora.com>

Emanuele Aina authored
GitLab does not currently grant CI_JOB_TOKEN access to the raw files API endpoints in any way. However, it allows CI_JOB_TOKEN to be used to clone via `git`, so let's use that until the situation improves: https://gitlab.com/gitlab-org/gitlab/-/issues/424161
Signed-off-by: Emanuele Aina <emanuele.aina@collabora.com>

- Sep 05, 2023
Dylan Aïssi authored
This will avoid failures due to the unexpected removal of gitlab-rulez from trixie.
Signed-off-by: Dylan Aïssi <dylan.aissi@collabora.com>

- Sep 04, 2023
Emanuele Aina authored
Make downstream customization slightly easier by moving the `apertis` bit to a separate, easier-to-customize variable, which can easily be set at the group level.
Signed-off-by: Emanuele Aina <emanuele.aina@collabora.com>

Emanuele Aina authored
Move the call to `gitlab-rulez` to a separate job to shorten the already extra-long `packaging-data-fetch-downstream` job a bit, and shift it to the `check` stage, as that seems more appropriate. This also makes the pipeline properly honor the `$GITLAB_RULES_URL` variable, as it stops using the hardcoded gitlab.apertis.org/infrastructure/apertis-infrastructure URL in `bin/packaging-data-fetch-downstream`. Reporting is also improved by showing details about the incorrect settings, using the recently introduced `--output json` option in `gitlab-rulez`, currently only available starting from Debian Trixie.
Signed-off-by: Emanuele Aina <emanuele.aina@collabora.com>

Dylan Aïssi authored
Signed-off-by: Dylan Aïssi <dylan.aissi@collabora.com>

- Aug 23, 2023
Signed-off-by: Emanuele Aina <emanuele.aina@collabora.com>

- Aug 17, 2023
Emanuele Aina authored
Our scans are quite heavyweight, so let's not run more than one at a time.
Signed-off-by: Emanuele Aina <emanuele.aina@collabora.com>

- Aug 16, 2023
Downstreams usually require authentication to access their GitLab instances; let's fix the retrieval of the ruleset in that case.
Signed-off-by: Emanuele Aina <emanuele.aina@collabora.com>

Signed-off-by: Emanuele Aina <emanuele.aina@collabora.com>

- Aug 04, 2023
Emanuele Aina authored
Set options for variables with limited choices to give GitLab the chance to render dropdowns when triggering pipelines manually, reducing the chance of user error.
Signed-off-by: Emanuele Aina <emanuele.aina@collabora.com>

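For reference, GitLab renders a dropdown when a CI/CD variable declares an `options` list in `.gitlab-ci.yml`. A sketch using the pipeline's `DISABLE_CACHE` variable, whose exact definition here is assumed rather than taken from the repository:

```yaml
variables:
  DISABLE_CACHE:
    value: "no"
    options:
      - "no"
      - "yes"
    description: "Set to 'yes' to skip the packaging cache"
```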
Emanuele Aina authored
Currently `FILTER` only works for GitLab projects; the OBS and APT jobs always retrieve everything, which is particularly costly for OBS as it scans the content of each package. Drop the `FILTER` variable and replace it with a `PROJECTS_NAMESPACE` variable for GitLab and a `FILTER_PACKAGES` variable used by all jobs to narrow the amount of data retrieved, which is often very useful for testing.
Signed-off-by: Emanuele Aina <emanuele.aina@collabora.com>

- Jul 24, 2023
Emanuele Aina authored
Signed-off-by: Emanuele Aina <emanuele.aina@collabora.com>

Emanuele Aina authored
Signed-off-by: Emanuele Aina <emanuele.aina@collabora.com>

- Jul 23, 2023
Emanuele Aina authored
Commit 85802ab7 "HACK: Skip OBS scanning by default" worked around a performance issue with the initial deployment of OBS 2.10. With ce731f22 "Do not skip OBS scan" it got re-enabled, albeit needing more time than before. Time to drop the hack.
Signed-off-by: Emanuele Aina <emanuele.aina@collabora.com>

Emanuele Aina authored
The final rendering stage is now running out of memory even on larger runners.
Signed-off-by: Emanuele Aina <emanuele.aina@collabora.com>

- Jun 22, 2023
Dylan Aïssi authored
Signed-off-by: Dylan Aïssi <dylan.aissi@collabora.com>

Dylan Aïssi authored
For packages not imported from Debian, run uscan to check whether new upstream releases are available. If a new version is found, report it on the dashboard. Also report missing and broken debian/watch files to ensure we continue to track the correct upstream.
Signed-off-by: Dylan Aïssi <dylan.aissi@collabora.com>

- Feb 16, 2023
Dylan Aïssi authored
Fetching upstream sources data for all Debian packages (~38100) instead of only those in Apertis (~5600) generates big YAML files containing useless data, and processing these files leads to frequent out-of-memory issues. To reduce memory consumption, we can filter out all packages not available in Apertis based on the cache file. This job produces a 42 MB YAML file, and merging it with the other YAML files leads to out-of-memory issues in downstream jobs; filtering the generated YAML file at this step reduces the size to only 8 MB.
Signed-off-by: Dylan Aïssi <dylan.aissi@collabora.com>

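The filtering described above boils down to an intersection between the fetched Debian data and the set of packages known to Apertis. An illustrative sketch; the key names are hypothetical, not the dashboard's actual schema:

```python
def filter_packages(upstream: dict, apertis_names: set) -> dict:
    """Drop every package that is not present in the Apertis cache."""
    return {name: data for name, data in upstream.items() if name in apertis_names}

upstream = {"glibc": {"version": "2.37"}, "obscure-pkg": {"version": "0.1"}}
cache = {"glibc"}
print(filter_packages(upstream, cache))  # → {'glibc': {'version': '2.37'}}
```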
- Jan 11, 2023
Dylan Aïssi authored
No need to retrieve a file which is not used.
Signed-off-by: Dylan Aïssi <dylan.aissi@collabora.com>

Dylan Aïssi authored
Currently the cache file is quite big (~70 MB) and parsing it during the packaging-data-fetch-downstream job gives frequent out-of-memory issues. This file contains a lot of unused data (information for all packages available in Debian). In order to keep only the data actually used (i.e. for packages available in Apertis), a new file packaging-cache.yaml is generated containing only the data needed for the cache. It is a subset of packaging.yaml and is used only during the packaging-data-fetch-downstream job to load the cache without triggering the OOM issue.
Signed-off-by: Dylan Aïssi <dylan.aissi@collabora.com>

- Dec 21, 2022
Dylan Aïssi authored
Thus settings don't have to be stored in several locations, and rulez.yaml is the only authoritative location to define the correct settings for GitLab repositories.
Signed-off-by: Dylan Aïssi <dylan.aissi@collabora.com>

- Dec 08, 2022
Dylan Aïssi authored
Signed-off-by: Dylan Aïssi <dylan.aissi@collabora.com>

Dylan Aïssi authored
No need to run the gitlab-rulez diff because the dashboard already reports issues.
Signed-off-by: Dylan Aïssi <dylan.aissi@collabora.com>

- Dec 05, 2022
Dylan Aïssi authored
Parsing the big cache file can sometimes lead to an OOM issue in packaging-data-fetch-downstream. This option allows disabling the cache at this step to unblock dashboard updates.
Signed-off-by: Dylan Aïssi <dylan.aissi@collabora.com>
