What is the Definition of Done (DoD)?
“The Definition of Done (DoD) is a formal description of the state of the Increment when it meets the quality measures required for the product” - [Schwaber 2020].
While Scrum was created in 1995, this concept was only introduced into Scrum in 2009 [Verheyen 2020], apparently deriving from a pattern proposed in 2002 by Dan Rawsthorne [Rawsthorne 2015]. The DoD creates transparency for everyone on delivered artifacts. It is based on a shared understanding of what must be done for those artifacts to be considered done at product level [Madan 2019]; therefore, if several Teams are working on the same product, they may have different DoDs, but their combined work must comply with a product-level DoD that makes the product releasable.
Even if this approach tends to result in a staggered SDLC, it will always be preferable to releasing mediocre code because, as SAFe likes to say, “You can't scale crappy code” [SAFe 2021-42], and the DoD helps detect the smell ahead of time. Without it, code is sent to production without knowing the impact it will have on operations. However, staggering is not a curse: moving towards a Feature-Team-based organization reduces this bottleneck, and there are plenty of agile practices that help do everything at the same time.
Impact on the testing maturity
The DoD concept is often confused with “Acceptance Criteria” (AC); actually, while ACs vary from one User Story (US) to another, the DoD stands at a higher level [Purushotham 2020]. It also applies to any Product Backlog Item (PBI) as well as to any event such as Sprint Refinement and Planning, Demos or Retrospectives, to make sure each ceremony is complete [Dalton 2018].
The DoD consists of 3 main components [Madan 2019]:
- Business or Functional requirements: the ACs
- Subjective or measured Quality Matters such as test coverage, open bugs, etc.
- Non-Functional Requirements
These components can be perceived through a “Done Thinking Grid” [Gupta 2008] to go into more detail and pinpoint checklist items more specifically.
In fact, a systematic review of publications on the DoD identified 62 criteria covering 9 categories, mostly focused on regulatory compliance [Silva 2017].
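In practice, many teams make the checklist dimension of the DoD executable by wiring some of these criteria into their CI pipeline. The following is a minimal sketch of that idea in Python; the item names, thresholds, metric values and the `check_dod` helper are purely illustrative assumptions, not taken from any of the cited sources.

```python
# Minimal sketch of a DoD encoded as a machine-checkable checklist.
# All item names, thresholds and metric values below are illustrative assumptions.
from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class DodItem:
    description: str
    check: Callable[[Dict], bool]  # returns True when the criterion is met


# Metrics would typically be collected by the CI pipeline (test reports, bug tracker...).
metrics = {
    "acceptance_tests_passed": True,
    "coverage": 0.83,
    "open_blocking_bugs": 0,
    "p95_response_ms": 180,
}

definition_of_done = [
    # Business / functional requirements: the Acceptance Criteria
    DodItem("All acceptance criteria are verified", lambda m: m["acceptance_tests_passed"]),
    # Measured quality matters
    DodItem("Unit test coverage >= 80%", lambda m: m["coverage"] >= 0.80),
    DodItem("No open blocking bugs", lambda m: m["open_blocking_bugs"] == 0),
    # Non-functional requirements
    DodItem("p95 response time <= 200 ms", lambda m: m["p95_response_ms"] <= 200),
]


def check_dod(items, m):
    """Print each criterion and return True only if every item is met."""
    all_done = True
    for item in items:
        met = item.check(m)
        all_done &= met
        print(f"[{'x' if met else ' '}] {item.description}")
    return all_done


if __name__ == "__main__":
    print("PBI is Done" if check_dod(definition_of_done, metrics) else "PBI is NOT Done")
```

Run as a quality gate, such a script prevents a PBI from being declared “Done” while any criterion of the shared checklist remains unmet.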
Indeed, the DoD takes different values depending on who expresses the expectations; thus, different levels of “Done” can be defined [Rawsthorne 2015]:
- US must be “Done/Done”: they are “Done” and verified against the DoD.
- Features must be “Done/Done/Done” when all their composing US are “Done/Done” and the PO validates that the Feature is good enough.
- Products must be “Done/Done/Done/Done” when there are enough “Done/Done/Done” Features to be useful to users and the release is ready from the Ops team’s point of view.
These feedback loops on the “Done” statuses work much like Argyris’ double-loop learning process [Argyris 1977] [Moustier 2020]: doing things right at level N implies aiming at the next level up. They reflect the maturity of the Team, which updates the DoD based on both internal and external constraints. The more the Team practices Gemba Walks or acts as an X-Team, the more accurate and relevant the DoD becomes [Moustier 2020]. However, setting a DoD with very high standards from the beginning, with Teams that are not ready for it, is not a good idea. Just like in games, you need a couple of wins and good first-pass yields to get into it. You should therefore measure happiness within the Team, notably with a simple Niko-Niko Calendar.
Whatever the content of your DoD, the thing to keep in mind is the “shared understanding” dimension. It should therefore never be pushed by a single person; rather, it should emerge from the people who will be responsible for applying those criteria. To facilitate this approach, a set of cards named “DoD Kards” has been created to let the whole team, together with the Product Owner, decide on the DoD to apply [Velasquez 2017].
Agilitest’s standpoint on this practice
From an economic and DevOps perspective, automation is a must-have and automating test scripts is inevitable; this should therefore be part of any DoD at US level. Moreover, when it comes to test script automation, the scripts should be treated as production code, which implies extra DoD criteria to keep the scripts maintainable and sustainable, such as the “FIRST” principles [Tekguard 2018] [Moustier 2019-1] or the use of test harnesses to enable testability and fast test execution.
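To make this concrete, here is a minimal sketch of what a test written with the FIRST principles (Fast, Independent, Repeatable, Self-validating, Timely) in mind could look like; the `PriceCalculator` class, its VAT rule and the test names are hypothetical and only serve to illustrate the shape of such a script, with pytest assumed as the test runner.

```python
# Hypothetical example of tests shaped by the "FIRST" principles:
# Fast, Independent, Repeatable, Self-validating, Timely.
import pytest


class PriceCalculator:
    """Toy production-like code used only to illustrate the tests below."""

    def __init__(self, vat_rate: float):
        self.vat_rate = vat_rate

    def price_incl_vat(self, price_excl_vat: float) -> float:
        return round(price_excl_vat * (1 + self.vat_rate), 2)


@pytest.fixture
def calculator() -> PriceCalculator:
    # Independent & Repeatable: each test builds its own fixture,
    # with no shared state and no network or database access (which also keeps it Fast).
    return PriceCalculator(vat_rate=0.20)


def test_price_includes_vat(calculator):
    # Self-validating: the assertion gives a pass/fail verdict without human inspection.
    assert calculator.price_incl_vat(100.0) == 120.0


def test_zero_price_stays_zero(calculator):
    assert calculator.price_incl_vat(0.0) == 0.0
```

Treating such scripts as production code means they go through the same reviews, refactoring and DoD criteria as the feature code they verify, which is what keeps them fast and maintainable over time.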
To discover the whole set of practices, click here.
Related cards
To go further
- [Argyris 1977] : Chris Argyris - SEP 1977 - “Double Loop Learning in Organizations” - Harvard Business Review - https://hbr.org/1977/09/double-loop-learning-in-organizations
- [Dalton 2018] : Jeff Dalton - DEC 2018 - “Great Big Agile: An OS for Agile Leaders” - ISBN 9781484242056
- [Gupta 2008] : Mayank Gupta - SEP 2008 - “Definition of Done: A Reference” - http://athena.ecs.csus.edu/~buckley/CSc190/Definition%20of%20Done.pdf
- [Madan 2019] : Sumeet Madan - DEC 2019 - “DONE Understanding Of The Definition Of ‘Done’” - https://www.scrum.org/resources/blog/done-understanding-definition-done
- [Moustier 2019-1] : Christophe Moustier - JUN 2019 - “Le test en mode agile” - ISBN 978-2-409-01943-2
- [Moustier 2020] : Christophe Moustier - OCT 2020 - “Conduite de tests agiles pour SAFe et LeSS” - ISBN 978-2-409-02727-7
- [Purushotham 2020] : Sowmya Purushotham & Amith Pulla - AUG 2013 - “Bridging the Gap Between Acceptance Criteria and Definition of Done” - https://pdfs.semanticscholar.org/296b/3dc358ef9ec621f77c891371a5e73a91e119.pdf
- [Rawsthorne 2015] : Dan Rawsthorne - c2015 - “Done Done Done Done in Scrum” - http://blog.3back.com/scrum-industryterms/done-done-done-done-in-scrum/
- [SAFe 2021-42] : SAFe - FEB 2021 - “Agile Software Engineering Landing Page” - https://www.scaledagileframework.com/agile-software-engineering-landing-page/
- [Schwaber 2020] : Ken Schwaber & Jeff Sutherland - NOV 2020 - “The Scrum Guide: The Definitive Guide to Scrum: The Rules of the Game” - https://scrumguides.org/docs/scrumguide/v2020/2020-Scrum-Guide-French.pdf or https://scrumguides.org/docs/scrumguide/v2020/2020-Scrum-Guide-US.pdf
- [Silva 2017] : Ana Silva & Thalles Araújo & João Nunes & Mirko Perkusich & Ednaldo Dilorenzo & Hyggo Almeida & Angelo Perkusich - JUN 2017 - “A systematic review on the use of Definition of Done on agile software development projects” - https://www.researchgate.net/publication/316441150_A_systematic_review_on_the_use_of_Definition_of_Done_on_agile_software_development_projects
- [Tekguard 2018] : Tekguard - NOV 2018 - “F.I.R.S.T Principles of Unit Testing” - https://github.com/tekguard/Principles-of-Unit-Testing
- [Velasquez 2017] : Camilo Velasquez & Thomas Wallet & Kleer - NOV 2017 - “DoD Kards” - FR: https://coach-agile.com/2017/11/decouvrez-dod-kards-definition-of-done/ - ES: https://www.elproximopaso.net/2017/07/dod-kards.html - EN: https://medium.com/@twallet/dod-kards-a-game-to-co-create-the-team-definition-of-done-7eda68f6dbec
- [Verheyen 2020] : Gunther Verheyen - DEC 2020 - “Scrum - A Brief History of a Long-Lived Hype” - https://guntherverheyen.com/wp-content/uploads/2020/12/Scrum-A-Brief-History-of-a-Long-Lived-Hype-Paper.pdf