Commit 19c8dc73 authored by Martyn Welch
Improve the contribution checklist


The contribution checklist currently covers only a very narrow subset of
possible submissions, primarily new Apertis specific code and APIs. This is no
longer a major part of what Apertis is providing and many of the requirements
do not reflect current practice.

Remove the existing checklist and provide a new one that covers points to be
covered for code, component and design submissions, breaking down when and
where responsibility lies during the development and submission process.

Signed-off-by: Martyn Welch <martyn.welch@collabora.com>
parent 85b2ee1f
Merge request !117: T7218 Improve Contribution Checklist (to include code, component and design evaluation guidelines)
@@ -14,307 +14,267 @@ aliases = [
outputs = ["html", "pdf-in"]
+++
# Summary
Before submitting a large commit, go through the checklist below to ensure it
meets all the requirements. If submitting a large patchset, submit it in parts,
and ensure that feedback from reviews of the first few parts of the patchset is
applied to subsequent parts.
# Overall principles
In order to break sets of reviews down into chunks, there are a few key
principles we need to stick to:
* Review patches which are as small as possible, but no smaller (see
[here](https://developer.gnome.org/programming-guidelines/stable/version-control.html.en#guidelines-for-making-commits),
[here](https://crealytics.com/blog/2010/07/09/5-reasons-keeping-git-commits-small/),
and [here](http://who-t.blogspot.co.uk/2009/12/on-commit-messages.html))
* Learn from each review so the same review comments do not need to be made
more than once
* Use automated tools to eliminate many of the repetitive and time consuming
parts of patch review and rework
* Do high-level API review first, ideally before implementing those APIs, to
avoid unnecessary rework
# Order of reviewing commits
Before proposing a large patchset, work out what order the patches should be
submitted in. If multiple new classes are being submitted, they should be
submitted depth-first, as the APIs of the root classes affect the implementation
of everything else.
The high-level API of any major new feature should be reviewed first, and only
once that high-level API has been reviewed should its implementation follow.
There is no point in
starting to review the implementation before the high-level API, as the
high-level review could suggest some big changes which invalidate a lot of the
implementation review.
# Revisiting earlier commits
Rather than trying to get everything comprehensive first time, we should aim to
get everything correct and minimal first time. This is especially important for
base classes. The commit which introduces a new base class should be fairly
minimal, and subsequent commits can add functionality to it as that
functionality becomes needed by new class implementations.
The aim here is to reduce the amount of initial review needed on base classes,
and to ensure that the non-core parts of the API are motivated by specific needs
in subclasses, rather than being added speculatively.
# Automated tests
One of the checklist items requires checking the code coverage of the automated
tests for a class. We are explicitly not requiring that the code coverage
reaches some target value, as the appropriateness of this value would vary
wildly between patches and classes. Instead, we require that the code coverage
report (`lcov` output) is checked for each patch, and the developer thinks about
whether it would be easy to add additional automated tests to increase the
coverage for the code in that patch.
# Pre-submission checklist
(A rationale for each of these points is given in the section below to avoid
cluttering this one.)
Before submitting any patch, please make sure that it passes this checklist, to
avoid the review getting hung up on avoidable issues:
1. All new code follows the
[coding guidelines]( {{< ref "coding_conventions.md" >}} ),
especially the
[API design guidelines]( {{< ref "api_design.md" >}} ),
[namespacing guidelines](https://developer.gnome.org/programming-guidelines/unstable/namespacing.html.en),
[memory management guidelines](https://developer.gnome.org/programming-guidelines/unstable/memory-management.html.en),
[pre- and post-condition guidelines](https://developer.gnome.org/programming-guidelines/unstable/preconditions.html.en),
and
[introspection guidelines](https://developer.gnome.org/programming-guidelines/unstable/introspection.html.en)
— some key points from these are pulled out below, but these are not the
only points to pay attention to.
1. All new public API must be
[namespaced correctly](https://developer.gnome.org/programming-guidelines/unstable/namespacing.html.en).
1. All new public API must have a complete and useful
[documentation comment]( {{< ref "api_documentation.md" >}} )
(ignore the build system comments on that page – we use hotdoc now – the
guidelines about the comments themselves are all still relevant).
1. All new public API documentation comments must have
[GObject Introspection annotations](https://wiki.gnome.org/Projects/GObjectIntrospection/Annotations)
where appropriate; `g-ir-scanner` (part of the build process) must emit no
warnings when run with `--warn-all --warn-error` (which should be set by
`$(WARN_SCANNERFLAGS)` from `AX_COMPILER_FLAGS`).
1. All new public methods must have
[pre- and post-conditions](https://developer.gnome.org/programming-guidelines/unstable/preconditions.html.en)
to enforce constraints on the accepted parameter values.
1. The code must compile without warnings, after ensuring that
`AX_COMPILER_FLAGS` is used *and enabled* in `configure.ac` (if it is
correctly enabled, compiling liblightwood should fail if there are any
compiler warnings) — remember to add `$(WARN_CFLAGS)`, `$(WARN_LDFLAGS)`
and `$(WARN_SCANNERFLAGS)` to new `Makefile.am` targets as appropriate.
1. The introduction documentation comment for each new class must give a usage
example for that class in each of the main modes it can be used (for
example, if done for the roller, there would be one example for fixed mode,
one for variable mode, one for linked rollers, one for each animation mode,
etc.).
1. All new code must be formatted as per the
[coding guidelines]( {{< ref "coding_conventions.md#code-formatting" >}} ),
using
[`clang-format`]( {{< ref "coding_conventions.md#reformatting-code" >}} )
not GNU `indent`.
1. There must be an example program for each new class, which can be used to
manually test all of the class’s main modes of operation (for example, if
done for the roller, there would be one example for fixed mode, one for
variable mode, one for linked rollers, one for each animation mode, etc.) —
these examples may be submitted in a separate patch from the class
implementation, but must be submitted at the same time as the implementation
in order to allow review in parallel. Example programs must be usable when
installed or uninstalled, so they can be used during development and on
production machines.
1. There must be automated tests (using the
[`GTest` framework in GLib](https://developer.gnome.org/glib/stable/glib-Testing.html))
for construction of each new class, and for getting and setting each of its
properties.
1. The code coverage of the automated tests must be checked (using
`make check-code-coverage`; see D3673) before submission, and if it’s
possible to add more automated tests (and for them to be reliable) to
improve the coverage, this should be done; the final code coverage figure
for the class should be mentioned in a comment on the diff, and it would be
helpful to have the `lcov` reports for the class saved somewhere for
analysis as part of the review.
1. There must be no definite memory leaks reported by Valgrind when running the
automated tests under it (using `AX_VALGRIND_CHECK` and
`make check-valgrind`; see D3673).
1. All automated tests must be installed as
[installed-tests](https://wiki.gnome.org/Initiatives/GnomeGoals/InstalledTests)
and must be
[run when liblightwood is built into a package](https://git.apertis.org/cgit/rhosydd.git/tree/debian/tests/gnome-desktop-testing)
(we can help with the initial setup of this infrastructure if needed).
1. `build-snapshot -Is $build_machine` must pass before submission of any patch
(where `$build_machine` is a machine running an up-to-date copy of Apertis,
which may be `localhost` — this is a standard usage of `build-snapshot`).
1. All new code has been checked to ensure it doesn’t contradict review
comments from previous reviews of other classes (i.e. we want to avoid
making the same review comments on every submitted class).
1. Commit messages must
[explain *why* they make the changes they do](http://chris.beams.io/posts/git-commit/).
1. The dependency information between Phabricator diffs must be checked to be
in the correct order after submitting diffs.
1. All changes are documented, including any updates needed to the wiki and
appdev portal.
1. Grammar and spelling should be checked with an automated tool for typos and
mistakes (where appropriate).
# Rationales
1. Each coding guideline has its own rationale for why it’s useful, and many of
them significantly affect the structure of a diff, so are important to get
right early on.
1. Namespacing is important for the correct functioning of a lot of the
developer tools (for example, GObject Introspection), and to avoid symbol
collisions between libraries — checking it is a very mechanical process
which it is best to not have to spend review time on.
1. Documentation comments are useful to both the reviewer and to end users of
the API — for the reviewer, they act as an explanation of why a particular
API is necessary, how it is meant to be used, and can provide insight into
implementation choices. These are questions which the reviewer would
otherwise have to ask in the review, so writing them up lucidly in a
documentation comment saves time in the long run.
1. GObject Introspection annotations are a requirement for the platform’s
language bindings (to JavaScript, for example) to work, so must be added at
some point. Fixing the error messages from `g-ir-scanner` is sufficient to
ensure that the API can be introspected.
1. Pre- and post-conditions are a form of assertion in the code, which check
for programmer errors at runtime. If they are used consistently throughout
the code on every API entry point, they can catch programmer errors much
nearer their origin than otherwise, which speeds up debugging both during
development of the library, and when end users are using the public APIs.
They also act as a loose form of documentation of what each API will allow
as its inputs and outputs, which helps review (see the comments about
documentation above).
1. The set of compiler warnings enabled by `AX_COMPILER_FLAGS` has been chosen
to balance
[false positives against false negatives](https://en.wikipedia.org/wiki/Type_I_and_type_II_errors)
in detecting bugs in the code. Each compiler warning typically identifies a
single bug in the code which would otherwise have to be fixed later in the
life of the library — fixing bugs later is always more expensive in terms of
debugging time.
1. Usage examples are another form of documentation (as discussed above), which
specifically make it clearer to a reviewer how a particular class is
intended to be used. In writing usage examples, the author of a patch can
often notice awkwardness in their API design, which can then be fixed before
review — this is faster than them being caught in review and sent back for
modification.
1. Well formatted code is a lot easier to read and review than poorly formatted
code. It allows the reviewer to think about the function of the code they
are reviewing, rather than (for example) which function call a given
argument actually applies to, or which block of code a statement is actually
part of.
1. Example programs are a loose form of testing, and also act as usage examples
and documentation for the class (see above). They provide an easy way for
the reviewer to run the class and (for example, if it is a widget) review
its visual appearance and interactivity, which is very hard to do by simply
looking at the code in a patch. Their biggest benefit will be when the class
is modified in future — the example programs can be used to test changes to
it and ensure that its behavior changes (or does not) as expected.
Availability of example programs which covered each of the modes of using
`LightwoodRoller` would have made it easier to test changes to the roller in
the last two releases, and discover that they broke some modes of operation
(like coupling two rollers).
1. For each unit test for a piece of code, the behavior checked by that unit
test can be guaranteed to be unchanged across modifications to the code in
future. This prevents regressions (especially as the unit tests for Apertis
projects are set up to be run automatically on each commit by
@apertis-qa-bot, which is more frequently than in other projects). The value
of unit tests when initially implementing a class is in the way they guide
API design to be testable in the first place. It is often the case that an
API will be written without unit tests, and later someone will try to add
unit tests and find that the API is untestable; typically because it relies
on internal state which the test harness cannot affect. By that point, the
API is stable and cannot be changed to allow testing.
1. Looking at code coverage reports is a good way to check that unit tests are
actually checking what they are expected to check about the code. Code
coverage provides a simple, coarse-grained metric of code quality — the
quality of untested code is unknown.
1. Every memory leak is a bug, and hence needs to be fixed at some point.
Checking for memory leaks in a code review is a very mechanical,
time-consuming process. If memory leaks can be detected automatically, by
using `valgrind` on the unit tests, this reduces the amount of time needed
to catch them during review. This is an area where higher code coverage
provides immediate benefits. Another way to avoid leaks is to use
[`g_autoptr()`](https://developer.gnome.org/glib/stable/glib-Miscellaneous-Macros.html#g-autoptr)
to automatically free memory when leaving a control block — however, as this
is a completely new technique to learn, we are not mandating its use yet.
You might find it easier though.
1. If all automated tests are run at package build time, they will be run by
@apertis-qa-bot for every patch submission; and can also be run as part of
the system-wide integration tests, to check that liblightwood behavior
doesn’t change when other system libraries (for example, Clutter or
libthornbury) are changed. This is one of the
[motivations behind installed-tests](https://wiki.gnome.org/Initiatives/GnomeGoals/InstalledTests#Issues_with_.22make_check.22).
This is a one-time setup needed for liblightwood, and once it’s set up, does
not need to be done for each commit.
1. `build-snapshot` ensures that a Debian package can be built successfully
from the code, which also entails running all the unit tests, and checking
that examples compile. This is the canonical way to ensure that liblightwood
remains deliverable as a Debian package, which is important, as the
deliverable for Apertis is essentially a bunch of Debian packages.
1. If each patch is updated to learn from the results of previous patch
reviews, the amount of time spent making and explaining repeated patch
review comments should be significantly reduced, which saves everyone’s
time.
1. Commit messages are a form of documentation of the changes being made to a
project. They should explain the motivation behind the changes, and clarify
any design decisions which the author thinks the reviewer might question. If
a commit message is inadequate, the reviewer is going to ask questions in
the review which could have been avoided otherwise.
# Checklist for a design contribution
This checklist contains the main sections that are expected on a proposal of a
new Apertis design. The main difference between a design and a component
contribution is the project-wide impact. If a component contribution has impact
that goes beyond the expected additional maintenance effort, a design document
is likely to be required before the component can be added to Apertis. We use
here the same example we used on the [Contribution Process: Adding designs to Apertis]( {{< ref "contribution_process.md#adding-designs-to-apertis" >}} ).
1. **What is the design proposal goal?** In our example the goal is to provide
tools and workflows for process automation by including the
[Robot Framework](https://robotframework.org/) in the Apertis Universe.
1. **What is the state-of-the-art for addressing the goals of the design proposal?**
In our example the Robot Framework is not the only process automation
framework available. The goal here is to compare the Robot Framework to other
existing solutions, and include a rationale of why Robot Framework was
chosen.
1. **How does the design work?** Following our example, this section should
explain what Robot Framework is, how it works, what its architecture is, and
mention relevant use cases.
1. **What is the potential impact on Apertis?** This is a very important section,
and all known details should be included. For the Robot Framework we would
describe the potential impact on the Apertis development workflow, on the
content of Apertis test images, and on the Apertis testing infrastructure.
1. **What are the benefits for the Apertis Universe?** This section should explain
the benefits to Apertis, such as offering a popular feature that will bring
new users to Apertis.
1. **What is the license of the main components?** Do we plan to ship components on
Apertis target images? Robot Framework is released under [Apache License 2.0](http://www.apache.org/licenses/LICENSE-2.0.html),
and we do not expect to ship Robot Framework components on Apertis target
images.
1. **What is the proposal to integrate the design into Apertis?** In our example we
describe how to address the integration with Apertis taking into account the
constraints of the Apertis development workflow, of testing Apertis images, and
of the Apertis testing infrastructure. Remember to mention deprecating
existing components if needed.
1. **High level description of the estimated work.** Integrating Robot
Framework into Apertis will involve developing and/or modifying Robot
Framework libraries, and developing a run-time compatibility layer for LAVA to
keep testing environments as close as possible to production environments and
to adapt the execution of Robot Framework tests to suit the LAVA constraints.
1. **High level implementation plan.** Describe the main work packages and the
execution order.
{{% notice note %}}
This list contains the general topics, but it may not be complete for all
designs. Regarding the level of detail, the design document should be complete
enough to describe the design and surrounding problems to developers and
project managers, but it is not necessary to describe implementation details.
{{% /notice %}}
This document covers the steps that should be taken at the various stages of
making a contribution to Apertis with the rationale more fully explained in the
[policy]({{< ref "contribution_process.md" >}}). It covers both those steps to
be taken by the contributor as well as the maintainer(s) accepting the
contribution on behalf of Apertis. It is presented in this manner to provide
transparency of the steps that are taken and the considerations that are made
when accepting a contribution into Apertis, be that a modification of an
existing component, addition of a new component or concept design.
The steps required to be taken by a contributor will be marked with
`Contributor`, those to be taken by a maintainer on behalf of the Apertis
project will be labeled `Maintainer`.
{{% notice warning %}}
Apertis is utilized by multiple parties, all of whom have a stake in Apertis
continuing to meet their own set of requirements. Whilst a proposed change may
provide the optimal solution for your use case, the Apertis maintainers will
need to consider the impact the change will have on the other users of Apertis
too, and thus may request changes to the solution to ensure that Apertis
continues to serve all its users well.
{{% /notice %}}
{{% notice info %}}
Depending on the scope and content of the requested change, options may be
available to provide a
[dedicated project area]( {{< ref "contribution_process.md#dedicated-project-areas" >}} )
to host party/project specific packages, enabling such changes to be made to a
project specific version, thereby avoiding any impact on the core Apertis
offering.
{{% /notice %}}
This checklist is broken down into the following stages, with some items broken
out per contribution type:
- [Pre-development]({{< ref "#pre-development" >}}): These are topics
that should be addressed prior to any significant work being carried out to
avoid pitfalls that may cause a contribution to be rejected.
- [Development]({{< ref "#development" >}}): Certain factors and
considerations should be made during the development of proposed changes to
ensure they conform to the project's policies.
- [Pre-submission]({{< ref "#pre-submission" >}}): Final checks that should be
made prior to a submission being made.
- [Review]({{< ref "#review" >}}): Points that should be covered during
the review of the contribution.
- [Post-acceptance]({{< ref "#post-acceptance" >}}): These are ongoing
responsibilities after a change has been accepted.
# Pre-development
## General
- `Contributor` **Understand licensing requirements**: Ensure that the
contribution will be able to be
[licensed in a manner acceptable to the Apertis project]({{< ref "license-applying.md" >}}).
- `Contributor` **Determine if ongoing support is to be provided**: Apertis is
supported with resources and effort by its core backers, and the requirements
of those who support the project ultimately control its direction. Whilst
simple, non-intrusive changes are very welcome, the viability of a substantial
proposed change that provides no benefit to the existing maintainers may
depend on the ability to offer firm commitments to support it.
- `Contributor` **Identify the value that the proposed changes bring to
Apertis**: During review, the Apertis maintainers will consider the
[value brought to Apertis]({{< ref "contribution_process.md#extending-apertis" >}})
by any proposed changes. Ensure that such value can be expected before
starting development.
## Concept Design
- `Contributor` **Survey the state of the art**: The project strives to adapt
and expand to new use cases utilizing the approach that provides the best fit
for Apertis. Any proposed change to the project should show that alternatives
have been researched and evaluated.
# Development
## General
- `Contributor` **Explain what the contribution brings to Apertis**:
- Code: What does the change do?
- Component: What is the component, what does it do?
- Concept Design: What is the goal, how is it expected to work?
- `Contributor` **Any impacted documentation is updated**: This may form part
of the same merge request or may need to be a separate merge request,
depending on the repository to which changes are being made. Either way, such
changes should be available for review at the same time.
## Code
- `Contributor` **Coding conventions**: Ensure that any code conforms with the
[Apertis coding conventions]({{< ref "coding_conventions.md" >}})
- `Contributor` **Changes don't break any supported architecture**: Apertis
provides support for a number of
[reference platforms]({{< ref "reference_hardware/_index.md" >}}); the changes
must either work on all supported architectures and platforms, or be clearly
documented as not applicable to some of them.
## Component
- `Contributor` **Components should follow the packaging workflow**:
Repositories for newly added components should be structured according to the
[packaging workflow]({{< ref "gitlab-based_packaging_workflow.md" >}}),
including providing Apertis' CI pipelines. The pipeline should succeed for all
supported architectures. Where a component's applicability is limited to
specific platforms or architectures, this should be well documented and the
component's repository configured to reflect this.
## Concept Design
- `Contributor` **Document expected approach to meeting goals**: An outline
should be given, providing a high-level overview of the steps needed to take
Apertis from its current state to the end goal of the concept design.
{{% notice note %}}
The breadth of topics that may need to be covered here will be highly
dependent on the goal of the concept document. The document should be
detailed enough to clearly describe the design and surrounding problems to
developers and project managers, but it is not necessary to describe
implementation details.
Topics that may need to be addressed include changes or impact on:
- The development workflow
- CI/CD and testing approach
- Infrastructure configuration
- Existing components
- Support of releases over their lifetimes
- Long-term maintainability of the project
Such topics may require a high degree of familiarity with the project to
answer. The Apertis maintainers are open to discussing goals and approaches
prior to a concept design being submitted. Discussing and collaborating with
the maintainers at an early stage is likely to prove beneficial to the
contributor, increasing the likelihood that the submitted design concept will
ultimately be accepted.
{{% /notice %}}
- `Contributor` **Website integration**: Design concepts and other equivalent
documentation changes are submitted as a change to the documentation on the
Apertis website and should also be generated by the website CI/CD as a PDF to
aid with review. Documents should be formatted in Markdown and follow the
[relevant guidance](https://gitlab.apertis.org/docs/apertis-website/-/blob/master/README.md).
# Pre-submission
## General
- `Contributor` **The proposed changes should be broken down into 1 or more
[atomic commits]({{< ref "version_control.md#guidelines-for-making-commits" >}})**:
The addition of a new concept design may be presented as a single commit;
however, it is likely that many code changes should be broken down into
multiple well-described logical commits.
- `Contributor` **Document impact of changes on Apertis**: What is the expected
outcome in Apertis? What is it adding? What needs to change? Is anything
being removed? Is it expected to cause any regressions?
## Code
- `Contributor` **Evaluate whether Apertis is the correct place for
the contribution**: Is Apertis the most suitable place to submit the proposed
changes in line with Apertis'
[upstreaming policy]({{< ref "contribution_process.md#extending-existing-components" >}})?
# Review
## General
- `Contributor` **Address review comments promptly and fully**: It is likely
that most submissions will result in feedback, be that requests or questions.
It is expected that a resolution should be reached for any feedback prior to a
submission being accepted.
- `Maintainer` **License suitability**: Does the contribution meet the
[license expectations]({{< ref "license-expectations.md" >}}) and
[guidelines]({{< ref "license-applying.md" >}})?
- `Maintainer` **Evaluate the benefits of accepting the contribution**:
- Does the benefit of accepting the contribution outweigh the cost of
maintaining the changes long-term?
- Does the change fit well with the long-term goals and objectives of the
Apertis project?
- `Maintainer` **The goal of the change is adequately explained**: For small
code changes a well-written commit message will suffice. Larger changes should
be accompanied by a merge request description covering the entire merge
request. For concept documents, the goal should be adequately explained in the
document itself.
- `Maintainer` **Evaluate the impact on Apertis**:
- What is the expected outcome in Apertis?
- What is it adding?
- What needs to change?
- Is anything being removed?
- Is it expected to cause any regressions?
- `Maintainer` **Are changes broken down into atomic commits**: Good practice
should be followed with regards to
[commit history]({{< ref "version_control.md#guidelines-for-making-commits" >}}).
- `Maintainer` **Changes pass on all applicable CI/CD pipelines**: The changes
do not cause build regressions on any supported architecture.
- `Maintainer` **Evaluate impact across all supported platforms**: Changes
should either be applicable to all supported platforms or care should be
taken to evaluate that platform specific changes are made in a way that other
platforms are not negatively impacted.
## Code
- `Maintainer` **Check coding conventions**:
The contribution should conform to the
[coding conventions]({{< ref "coding_conventions.md" >}}).
- `Maintainer` **Evaluate whether Apertis is the correct place for
contribution**: Is Apertis the most suitable place to submit the proposed
changes in line with Apertis'
[upstreaming policy]({{< ref "contribution_process.md#extending-existing-components" >}})?
## Component
- `Maintainer` **Ensure that the component doesn't duplicate existing core
functionality**: Where possible, adding multiple components that implement the
same functionality should be avoided, as this increases maintenance for little
appreciable gain. An exception to this policy exists for developer tools,
especially editors.
- `Maintainer` **Component implements the package workflow**: The package
implements the
[packaging workflow]({{< ref "gitlab-based_packaging_workflow.md" >}}), is
correctly configured and all applicable CI pipelines succeed.
## Concept Design
- `Maintainer` **Ensure an evaluation of the state of the art has been performed**:
- Has a comprehensive review of alternative solutions been performed?
- Does the proposed solution seem the best fit for Apertis? (This should take
the rationale for inclusion into account.)
- `Maintainer` **Is the approach to meeting the goals sufficiently clear?**:
Has the impact of the proposed changes been fully considered? Is it understood
how it will work?
# Post-acceptance
- `Contributor` **Continue to support the changes that have been made**: Where commitments have been made regarding support of changes to the project, these should be honored.
- `Maintainer` **Maintain changes as long as possible**: Changes added to Apertis should be maintained where possible for as long as they are meaningful to the project.
{{% notice tip %}}
If submitting a large patch set, consider whether it can be broken down into
several stages, ensuring that any feedback from reviews of earlier stages is
applied to subsequent ones.
As a rule of thumb start with a lean design/change and submit it for review
as early as possible.
You can send a new design for review to the same channels
used for a [component contribution]( {{< ref "development_process.md" >}} ).
{{% /notice %}}
@@ -19,7 +19,9 @@ outputs = ["html", "pdf-in"]
This guide covers the expectations and processes for Apertis developers wishing
to make contributions to the Apertis project and the wider open source
ecosystem. These policies should be followed by all developers, including core
and third party contributors.
and third party contributors. A
[checklist]({{< ref "contribution_checklist.md" >}}) is provided in conjunction
with these policies to aid contributors.
# Suitability of contributions
@@ -187,6 +189,8 @@ information:
the points made above where relevant
- Whether any resources are to be made available to help maintain the component.
### Dedicated Project Areas
An alternative to adding packages to the main Apertis project is to apply to
have a dedicated project area, where code specific to a given project can be
stored. Such an area can be useful for providing components that are highly
......