Merge Requests and Code Review Standards
Jun 19, 2021
In this post, I want to share my thoughts on publishing effective merge requests and running a productive code review process. The mindset around this process should be positive: it is a necessary and important part of building quality software, even though it can feel like a bottleneck and slow us down at times.
Merge Request Policy and Best Practices
- Min Reviewers Policy: require a minimum of 2 reviewers; in my experience, and in published research, this threshold is the most effective
- Ownership Practice: Code Owners should be assigned to relevant projects, or even to specific sections within them. This practice enables moderation by each repository’s SMEs for better code quality: https://docs.gitlab.com/ee/user/project/merge_requests/merge_request_approvals.html#code-owners-as-eligible-approvers
- Build Pipeline Practices: Integrate static code analysis tools as a prerequisite to human code review — tools covering coding standards/linting, security, test coverage, etc.
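As a sketch of the pipeline practice above, a minimal `.gitlab-ci.yml` could gate human review behind lint and test stages. The tool choices below (ESLint, an npm test script with coverage) are illustrative assumptions, not a prescription:

```yaml
# Hypothetical GitLab CI sketch: static analysis and tests run before human review.
stages:
  - lint
  - test

lint:
  stage: lint
  script:
    - npx eslint .            # coding standards / linting

test:
  stage: test
  script:
    - npm test -- --coverage  # unit tests with a coverage report
```

With merge checks requiring a successful pipeline, an MR cannot be merged until these jobs are green, so reviewers spend their time only on changes that already pass the automated bar.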
Issuer Guidelines
Commits:
- Size: Keep commits as small as possible by handling one concern per commit, and add a descriptive message about what you did — that way, your changes will be much easier to track, review, and understand
- Convention: Make sure to prefix your commit message with the relevant ticket ID from your ticketing system (e.g., Jira), e.g., “ESTIDEN-555-Add logs to entry function” — this is critical for preserving historical product context when reviewing the source code
MR Description/Overview: When posting your MR, elaborate on the changes you’ve made and the motivation behind them, and add any vital context, to give the reviewer the best possible guidance for a high-quality review
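For example, a lightweight description template can prompt the issuer for that context (the section names and file names below are purely illustrative):

```
## What
Add retry logic to the payment webhook handler (ESTIDEN-555)

## Why
Transient network failures caused dropped events in PROD

## How to review
Start with handler.js; the new retry policy lives in retry.js
```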
Reviewer Guidelines
What to look for?
- Coverage: Test coverage for new or modified logic (e.g., if-else control statements, loops, etc.), to avoid future bugs and PROD issues in your current and future mission-critical services
- Build: Successful pipeline build (“green”)
- Observability: Make sure that required info and error logs were added accordingly
- Automated Tools: Review the reports from static code analysis and linting tools to pinpoint critical issues
- Packages: Use of new modules/packages: Are they already known and in use within your organization? How popular are they, and do they have known security issues? Are there known alternatives?
- Code Duplication: Can we avoid or reduce it?
- Time and Space Complexity: Possible code performance issues (e.g., blocking code, sorting the same array twice, etc.) or memory usage issues
- Over-engineering: Can we build our solution more simply, or use existing packages to help us?
- Feedback: Don’t be shy; provide your professional feedback for the sake of a productive & educational mutual process
- Feedback Loop: Keep it as quick and effective as possible to help the team deliver rapidly and accelerate the process (i.e., respond as quickly as possible to the MR’s questions and comments)
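To make the Coverage point above concrete, here is a toy sketch of what “coverage for new or modified logic” means in practice: every branch of a new if-else should be exercised by at least one test. The function and its business rule are invented purely for illustration:

```python
# Toy example: new logic with an if-else branch (hypothetical business rule).
def shipping_fee(order_total: float) -> float:
    """Free shipping at or above 100; otherwise a flat fee of 5."""
    if order_total >= 100:
        return 0.0
    else:
        return 5.0

# A reviewer should expect tests that exercise *both* branches,
# not just the happy path:
def test_free_shipping_at_threshold():
    assert shipping_fee(150.0) == 0.0

def test_flat_fee_below_threshold():
    assert shipping_fee(20.0) == 5.0

test_free_shipping_at_threshold()
test_flat_fee_below_threshold()
```

A change that adds such a branch but tests only one side would leave the other path unverified — exactly the kind of gap this review checklist item is meant to catch.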