Code contributions are done using the Merge Requests feature on GitLab.
As the Author of the contribution, please read the following steps and apply them in your day-to-day work:
We conduct code reviews using the Merge Requests feature on GitLab, and discussions should happen in the open, either on the Issue, the Merge Request, or the team-wide communication channel.
Reviewers are selected by the Head of Engineering or the Developer who created the Merge Request. In general, a reviewer reads the code, reviews the related issue, and then examines the files modified by the Author.
A reviewer must have the following mindset when performing a review:
- Transferring knowledge to the author.
This can range from a small code suggestion on how to make the code more maintainable or faster, to suggesting a library, reminding them of the guidelines, suggesting a way to organize the code, or signaling fundamental architecture/bugs/security problems that should be considered with the current approach the author is taking.
- The author probably knows more than the reviewer.
The author is the one in the field, touching the code and seeing the problem first-hand. Always give the benefit of the doubt and start the discussion with a question, rather than an affirmation that things are wrong. There is a chance the reviewer is not seeing the full picture.
- Neither the reviewer nor the author has more authority.
When proposing something, make it sound like a proposal and not like an order. If what a reviewer says has value, the author will probably accept it and apply it right away. If a discussion arises, keep it healthy, constructive, and argument-based. Either the author is seeing something the reviewer doesn’t see yet, or the reviewer is seeing something the author doesn’t see yet. This “aha” moment unlocks learning, and a safe environment to argue is key to good decision-making.
- Minor improvements or fixes can come later.
If merging a Merge Request adds more value than closing it, go ahead and merge it. Just take note somewhere so that the author remembers to amend it later. Also, don’t be too picky, especially about things that are subjective, like style or formatting, or things too minor to even pay attention to (like a typo in a comment).
- Avoid reviewing or approving Merge Requests that fall outside your domain knowledge or expertise.
It’s better to defer to someone with direct experience in the area to ensure a high-quality and informed review.
A reviewer must check:
- That the contributing steps have been followed, not only in the Merge Request, but also in the associated Issue.
- That the Merge Request adds more value than what it takes. This is subjective, but the 8 quality characteristics of good software are a good starting point.
A reviewer should accept a contribution if it’s been made according to this document.
Code Coverage
Codecov is an invaluable tool for evaluating the quality and effectiveness of unit tests in our products. Here’s a detailed guide on how to interpret and utilize the reports it provides:
- Accessing Reports: Codecov coverage reports are available for each Merge Request and can be accessed directly from the GitLab user interface.
- Coverage Interpretation: The coverage report provides an overview of the code being tested by our unit tests. It is represented as a percentage, where 100% means all lines of code are being executed by our tests.
- Identifying Untested Areas: Review the coverage report to identify code areas that have not been adequately tested. These areas may be critical points requiring further attention in terms of unit testing.
- Coverage Trends: Codecov also offers insights into coverage trends over time. Use this information to evaluate whether our product’s coverage is improving or declining.
- Corrective Actions: If areas of code with insufficient coverage are identified, collaborate with the team to implement additional unit tests and improve coverage in those areas.
By effectively using this tool, we can ensure the quality and stability of our code over time. This practice also fosters a test-focused development culture throughout our team.
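The interpretation steps above can be sketched in code. This is a minimal illustration of how a line-coverage percentage is derived and how files below a coverage bar could be flagged as untested areas; the file names, numbers, and 80% threshold are hypothetical, not taken from our tooling.

```python
# Hypothetical coverage summary: file -> (covered_lines, total_lines).
def coverage_percent(covered: int, total: int) -> float:
    """Return line coverage as a percentage (100.0 means every line ran)."""
    return 100.0 * covered / total if total else 100.0

def untested_areas(summary: dict[str, tuple[int, int]],
                   threshold: float = 80.0) -> list[str]:
    """List files whose coverage falls below the threshold."""
    return [f for f, (c, t) in summary.items()
            if coverage_percent(c, t) < threshold]

summary = {
    "integrates/login.py": (90, 100),    # 90% covered
    "integrates/billing.py": (40, 100),  # 40% covered -> needs tests
}
print(untested_areas(summary))  # -> ['integrates/billing.py']
```

A real report comes from Codecov, of course; the point is only that the percentage is executed lines over total lines, and that files under the bar are the candidates for corrective unit tests.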
Integration of Coverage Results in Codecov
In our development workflow, each time tests are run in our continuous integration (CI) environment, artifacts containing coverage results are accumulated. These results are generated by tools such as pytest, jest, and cypress, providing a detailed view of both unit and integration test coverage.
At the end of this process, a dedicated job for each component uploads these coverage results to Codecov. Each of our products has such a job. For example, in the case of Integrates Back, the job is located at integrates/nix/pkgs/integrates-back-coverage.
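As a rough sketch, such an upload job could look like the following GitLab CI fragment. The job name, stage, flags, and paths here are illustrative assumptions, not the actual definition, which lives in the Nix packages mentioned above.

```yaml
# Hypothetical sketch of a coverage-upload job; names and flags are
# illustrative only.
integrates-back-coverage:
  stage: coverage
  script:
    - pytest --cov=integrates --cov-report=xml  # assumes the pytest-cov plugin
    - codecov --file coverage.xml               # assumes the Codecov uploader
```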
Additionally, Codecov provides information about coverage deltas, showing changes in code coverage between two consecutive test runs. This allows us to quickly identify which code areas have been affected by recent changes and ensure they are adequately tested.
You can access the coverage reports and corresponding graphs on the Codecov page.
This integration ensures that we maintain high standards of test coverage in all our products, contributing to the overall stability and quality of our code.
Progress report
This is a quick guide on what information is expected in a daily progress report:
What did I do today:
- Current milestone(s), including the corresponding URLs.
- Brief summary of tasks completed (e.g., “Fixed bug in login module”).
- Reference issues or tickets worked on (e.g., “Related to https://gitlab.com/fluidattacks/universe/-/issues/15624”).
- Any blockers encountered and how they were resolved.
- Collaboration: meetings, chats, asking for help, or supporting others.
- If applicable, use the ETA model when the task can be broken down into multiple, similar subtasks that allow for measurable daily progress toward a clear objective. The ETA should be calculated based on Colombian business days. Use this calculator as a reference.
- Include reviewed and approved MRs.
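The ETA model above can be sketched as a small calculation. This assumes work is split into similar subtasks completed at a steady daily rate; weekends are excluded, and the holiday set is a placeholder that would need Colombia’s actual public holidays (for example from a holidays library) to match the calculator mentioned above.

```python
from datetime import date, timedelta

# Assumption: fill in with real Colombian public holidays.
HOLIDAYS: set[date] = set()

def eta(start: date, subtasks: int, per_day: int) -> date:
    """Return the date after finishing `subtasks` at `per_day` each,
    counting only Colombian business days (Mon-Fri, minus HOLIDAYS)."""
    remaining = -(-subtasks // per_day)  # ceiling division: days of work
    day = start
    while remaining > 0:
        day += timedelta(days=1)
        if day.weekday() < 5 and day not in HOLIDAYS:
            remaining -= 1
    return day

# e.g. 10 similar subtasks at 2 per day, starting Monday 2024-01-01
print(eta(date(2024, 1, 1), 10, 2))  # -> 2024-01-08
```

The useful property for a daily report is that progress is measurable: completing the planned number of subtasks each day keeps the ETA fixed, and any slip moves it by a visible number of business days.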
What will I do tomorrow:
- Tasks you plan to work on or continue (e.g., “Start implementing new resolvers”).
- Related issue or ticket ID (e.g., “Will focus on https://gitlab.com/fluidattacks/universe/-/issues/15624”).
- Objective for tomorrow — be specific (e.g., “Objective: Complete initial testing of the new resolver.”).
- Dependencies (e.g., “Waiting for implementation of new database table”).
I would need help with:
- Clarification on a task, requirement, or issue (e.g., “Need clarity on validation rules for XYZ input”).
- Unresolved blockers.
- Peer review or a second opinion on a solution you’re not confident about.
- Coordination needs (e.g., “Need to sync with the design team on the new visuals”).