Definition of done

At Delft Solutions we aim to have a shared understanding of what it takes to release an incremental update to one of our projects. Having a shared understanding of what it means to call something '''done''' also means we don't have to ask "But is it ''really'' done?" or "Hey, you said it was done but I don't see it, where is it?".


== Functional Requirements ==
== Quality ==


In order to keep the code maintainable and relatively bug-free, as well as broaden the number of people who know about a certain feature's implementation, we expect PRs to be peer-reviewed by one other person. Ideally, this would be someone senior who knows more about the project than you do; or, if you know the most, a more experienced person in the company. If you've written this feature using pair programming, no code review after the fact is required.


Any automated analysis that runs on a project should also be without errors. When there are warnings, it should be explained ''why'' the warning isn't resolved. Yes, some of this tooling sometimes gets it wrong, but overall they make the code more consistent in style, and it often prevents a lot of common issues.
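As a minimal sketch of what "explained, not just silenced" could look like in practice (assuming a Python project with a flake8/ruff-style linter; the rule code <code>ARG001</code> and the function itself are illustrative, not from any specific project):

```python
# Hypothetical example: the linter flags 'context' as an unused argument.
# Instead of deleting the parameter or suppressing silently, the suppression
# comment carries the reason, so a reviewer can accept it.

def handler(event, context):  # noqa: ARG001  # 'context' unused: the signature is fixed by the platform's callback contract
    """Respond to an incoming event; 'context' must stay for compatibility."""
    return {"status": "ok"}
```

The point is that the justification lives next to the suppression, so both the warning and its acceptance are visible in code review.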
== Non-Functional Requirements ==


* We make things better, not worse
* Build errors and warnings must be solved or explained (and accepted)
* No errors in coding standards (automated analysis)
* Existing tests passed
* Peer code review passed
* Deployed (if applicable)