PanTesting
State of mind
A combination of Double Loop Learning, Panarchy, Theory of Constraints and testability practices for agile testing
What is PanTesting?
“PanTesting” is a model for agile testing at scale. It combines [Moustier 2020]:
- Testability, which provides technical and social means to enable testing [Bach 2015a] [Meaney 2018a] [Meaney 2018b] [MoT 2019], since it helps Hunting transparency
- Theory of Constraints (ToC), which enables flow at the system level [Goldratt 1984] [Goldratt 1990] [Stein 1997] [Cox III 2010]
- Panarchy, an interaction model between subsystems [Gunderson 2002] that leads to an evolutionary maturity standard towards an organization without silos
- Double loop learning, which proposes intertwined feedback loops [Argyris 1977] [Smith 2001] that lead to a “good product” (the so-called “governing variables” described in Argyris’ theory) instead of a merely “compliant product”, since it fosters an understanding of what is to be tested.
The combination of those four components can be used at any scale of the organization thanks to the understanding brought by both Panarchy and ToC, while testability and double loop learning help adapt testing to virtually any part of the organization, should it be:
- the business domain
- the technical side of the solution
- the development process
- the collaboration between both internal and external teams
- the budget management
- the tooling
- the quality management system
- the knowledge management
- the culture inside the company
Understanding Panarchy will help break silos between subsystems (the term used in Panarchy is “ecocycles”) by exploiting the ecocycle phases α, r, K and Ω and by progressively integrating the change rates of linked ecocycles into your own ecocycle until they are synchronized.
When it comes to change rates, applying ToC helps integrate the changes by managing synchronization discrepancies at the appropriate moment, be it when those discrepancies would become an unbearable bottleneck or in a just-in-time approach, notably thanks to cadencing and synchronization [SAFe 2021-7].
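As a minimal illustration of the ToC idea above (the stage names and weekly capacities are invented for the example), the throughput of a chain of linked ecocycles is capped by its slowest link, which is where cadencing and synchronization effort should go first:

```python
# Toy ToC illustration: stage capacities (items per week) are invented.
def throughput(stages_per_week: dict) -> int:
    """A chain delivers only as fast as its slowest stage (the constraint)."""
    return min(stages_per_week.values())

def bottleneck(stages_per_week: dict) -> str:
    """The stage on which cadencing and synchronization should focus first."""
    return min(stages_per_week, key=stages_per_week.get)

stages = {"business analysis": 12, "development": 8, "testing": 3, "deployment": 10}
print(bottleneck(stages), throughput(stages))  # testing 3
```

Raising capacity anywhere except at the bottleneck leaves the overall throughput unchanged, which is why ToC directs improvement effort at the constraint first.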
Impact of PanTesting on the testing maturity
Introducing PanTesting becomes inevitable since it intrinsically enables delivering a good product built by several teams, with quality at scale coming from every subsystem of the organization that influences product quality.
Actually, once an organization is able to test the product, possibly in an agile way, it will surely face a glass ceiling that limits testing efficiency: the whole system must be improved by applying systems thinking [SAFe 2021-2].
To enable this systemic view, it is fundamental to focus on the whole solution, notably by adopting X-Teams to infuse a ubiquitous language and by aiming at a customer-centric organization. However, this entails many dependencies (i.e. linked ecocycles) such as organizing around value, unlocking collaborators’ intrinsic motivation, etc.
PanTesting provides both a maturity model towards merged ecocycles and a path for managers to spread the model as gardeners. The diagram below shows the existing managed connections between ecocycles (plain arrows) and the next connections to start managing in order to improve quality within the organization. For both new and older connections, the path to merging ecocycles shown above can then be used to monitor progression in terms of maturity.
Managers may use a prioritized backlog to handle improvements of existing connections and the setup of new ones, notably thanks to the Weighted Shortest Job First (WSJF) prioritization method [SAFe 2021-35] and WIP limits.
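As a reminder of how WSJF ranking works in SAFe (the backlog items and figures below are invented for the example), each job's Cost of Delay is the sum of three relative estimates, and jobs are ranked by Cost of Delay divided by job size:

```python
# SAFe WSJF sketch: Cost of Delay / job size; all backlog figures are invented.
def wsjf(user_business_value, time_criticality, risk_reduction, job_size):
    cost_of_delay = user_business_value + time_criticality + risk_reduction
    return cost_of_delay / job_size

backlog = [
    {"name": "merge dev/test ecocycles", "ubv": 8, "tc": 5, "rr": 8, "size": 13},
    {"name": "connect tooling ecocycle", "ubv": 5, "tc": 3, "rr": 2, "size": 2},
]
ranked = sorted(backlog,
                key=lambda j: wsjf(j["ubv"], j["tc"], j["rr"], j["size"]),
                reverse=True)
print(ranked[0]["name"])  # connect tooling ecocycle
```

Note how the smaller job wins despite a lower Cost of Delay: WSJF favors quick, valuable connections over large, slow ones.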
To facilitate this progression, three roles are involved [Moustier 2020a]:
- the Teams, who use and anticipate third-party ecocycle changes thanks to practices such as X-Teams or Yokoten, and boost the change cycle speed of their own ecocycles whenever possible [SAFe 2021-4] with strong testability of their deliverables
- the Managers, who foster the learning organization by leveraging the path to merging ecocycles and double loop learning opportunities, in accordance with ToC laws
- the Communities who will act as connection hubs and therefore ease connections and Yokoten
Agilitest’s standpoint on PanTesting
Automating test scripts faces many challenges, such as keeping pace with product development.
When script development comes after product development, ToC clearly applies, for scripting will inevitably look like a bottleneck to software delivery. This leads to progressively synchronizing the ecocycles of product development and script development. This is precisely what happens in a US Task Force.
This simple example illustrates the use of two components of PanTesting. Whatever your organization, the #nocode approach to scripting [Forsyth 2021] with Agilitest shortens both scripting and debugging times; the bottleneck effect is thus reduced, and scripting could even leave the delivery critical path if both activities were done in parallel.
Testability is also part of PanTesting and is actually vital to scripting. Without any testability at all, a script would not be able to check any requirement. Say a feature is implemented but no one can see any feedback on its success or failure: how would a script do any better? However, a solution exists provided some intrinsic testability has been added to an inner part of the product (say a REST API, hidden from the end user but reachable from the script). This is why Agilitest provides some REST API features to enable such checks.
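A minimal sketch of such an inner-API check, assuming a hypothetical endpoint that reports a feature's status (the URL and JSON fields are invented, and Agilitest's actual REST features are not detailed here):

```python
import io
import json

def feature_activated(fetch, url):
    """Probe an inner REST endpoint (hidden from the end user) to observe a
    feature's outcome when the UI itself gives no feedback.
    `fetch` is any callable returning a file-like response;
    urllib.request.urlopen would play that role in real use."""
    with fetch(url) as resp:
        payload = json.load(resp)
    return payload.get("status") == "active"

# Stand-in for a real HTTP call; endpoint and field names are hypothetical.
fake_fetch = lambda url: io.BytesIO(b'{"status": "active"}')
print(feature_activated(fake_fetch, "http://app/internal/feature-status"))
```

Injecting the `fetch` callable keeps the check testable on its own, which is the same intrinsic-testability argument applied to the script itself.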
Finally, double loop learning is also inevitable in scripting. It would be pointless to script a test without knowing:
- the objective of the test the script should check
- the components on which the script relies
The scripter is naturally compelled to clarify those items to achieve the script. However, to provide some fault tolerance on the most volatile part of scripting (i.e. the graphical components, also nicknamed “widgets”), Agilitest embeds heuristics that maximize the probability of capturing widgets from the criteria provided at script design time.
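Agilitest's actual heuristics are not documented here, but the general idea can be sketched as a criteria-matching score with a tolerance threshold (all names and attributes below are hypothetical):

```python
def match_score(widget: dict, criteria: dict) -> float:
    """Fraction of the design-time criteria a runtime widget still satisfies."""
    if not criteria:
        return 0.0
    hits = sum(1 for key, value in criteria.items() if widget.get(key) == value)
    return hits / len(criteria)

def capture(widgets, criteria, threshold=0.5):
    """Pick the best-matching widget, tolerating some drifted attributes."""
    best = max(widgets, key=lambda w: match_score(w, criteria), default=None)
    if best is not None and match_score(best, criteria) >= threshold:
        return best
    return None

criteria = {"tag": "button", "text": "OK", "id": "btn-ok"}
widgets = [
    {"tag": "div", "text": "Cancel", "id": "x"},
    {"tag": "button", "text": "OK", "id": "btn-ok-2"},  # id drifted since design
]
print(capture(widgets, criteria)["id"])  # btn-ok-2
```

Even though the widget's id changed after script design, two of the three criteria still match, so the capture succeeds instead of failing the whole test.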
To discover the whole set of practices, click here.
To go further
- [Argyris 1977] : Chris Argyris - « Double Loop Learning in Organizations » - Harvard Business Review - SEP/1977 - https://hbr.org/1977/09/double-loop-learning-in-organizations
- [Bach 2015a] : James Bach - « Heuristics of Software Testability » - Version 2.3, 2015 - https://www.satisfice.com/download/heuristics-of-software-testability?wpdmdl=1137&refresh=5d499ab18a3271565104817&open=1
- [Cox III 2010] : James F. Cox III and John G. Schleier Jr. - « Theory of Constraints Handbook » - McGraw-Hill - 2010 - ISBN: 978-0-07-166555-1
- [Forsyth 2021] : Alexander Forsyth – JAN 2021 - « Low-Code and No-Code: What’s the Difference and When to Use What? » - https://www.outsystems.com/blog/posts/low-code-vs-no-code/
- [Goldratt 1984] : Eliyahu M. Goldratt and Jeff Cox - « The Goal - A Process of Ongoing Improvement » - North River Press - 2004 (1st ed. 1984) - ISBN: 0-88427-178-1
- [Goldratt 1990] : Eliyahu M. Goldratt - « What is this thing called Theory Of Constraints and how should it be implemented? » - North River Press - 1990 - ISBN 9780884270850
- [Gunderson 2002] : Lance H. Gunderson & C. S. Holling - « Panarchy - Understanding Transformations in Human and Natural Systems » - Island Press - ISBN 1-55963-857-5
- [Meaney 2018a] : Rob Meaney - « Let’s test testability » - Agile Testing Days 2018 - 2018 - https://www.katjasays.com/agile-testing-days-2018-sketchnotes-and-summary/
- [Meaney 2018b] : Robert Meaney - « Please explain testability to me » - 28/NOV/2018 - https://club.ministryoftesting.com/t/please-explain-testability-to-me/20586
- [MoT 2019] : Discussion thread on Ministry of Testing - « Power Hour - Testability » - 2019 - https://club.ministryoftesting.com/t/power-hour-testability/27022
- [Moustier 2020] : Christophe Moustier – OCT 2020 – « Conduite de tests agiles pour SAFe et LeSS » - ISBN : 978-2-409-02727-7
- [Moustier 2020a] : Christophe Moustier – JUN 2020 - “PANTESTING Un modèle de test (agile) à l'échelle” - https://www.researchgate.net/publication/341914421_PANTESTING_Un_modele_de_test_agile_a_l'echelle and https://www.youtube.com/watch?v=3HdAsGi9Wqo
- [SAFe 2021-2] : SAFe - FEB 2021 - “Principle #2 – Apply systems thinking” - https://www.scaledagileframework.com/apply-systems-thinking/
- [SAFe 2021-35] : SAFe - FEB 2021 - “Weighted Shortest Job First (WSJF)” - https://www.scaledagileframework.com/wsjf/
- [SAFe 2021-4] : SAFe - FEB 2021 - “Principle #4 – Build incrementally with fast, integrated learning cycles” - https://www.scaledagileframework.com/build-incrementally-with-fast-integrated-learning-cycles/
- [SAFe 2021-7] : SAFe - FEB 2021 - “Principle #7 – Apply cadence, synchronize with cross-domain planning” - https://www.scaledagileframework.com/apply-cadence-synchronize-with-cross-domain-planning/
- [Smith 2001] : Mark K. Smith - 2001 (updated in 2005) - “Chris Argyris: theories of action, double-loop learning and organizational learning” - www.infed.org/thinkers/argyris.htm
- [Stein 1997] : Robert E. Stein - « The Theory of Constraints : Applications in Quality and Manufacturing » - CRC Press - ISBN: 9780824700645