End-to-end testing (E2E) is a technique used to verify whether an application (website, mobile application, etc.) behaves as expected from an end-user perspective. This technique makes it possible to validate the functioning of the front end, but also to verify its integration with the many related components and back-office services.
E2E Testing is an umbrella term that covers multiple aspects:
- Functional testing: First imagined by William Howden in 1978, who describes an approach to functional testing in which the design of a program is viewed as an integrated collection of functions [Meerts 2012].
- System Testing: A test level that focuses on verifying that a system as a whole meets specified requirements. Those requirements should include both functional and Non-Functional Requirements (NFR) [ISO 25010 2011].
- Acceptance Testing: A test level that focuses on determining whether to accept the system. Those requirements also include both functional requirements and NFRs [ISO 25010 2011] and may be defined in a 3 Amigos session and automated the ATDD way.
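Acceptance criteria agreed in a 3 Amigos session are often phrased in a Given/When/Then form and then automated the ATDD way. A minimal sketch in Python of what such an automated criterion could look like (the discount rule and function names are hypothetical, for illustration only):

```python
# Hypothetical rule agreed in a 3 Amigos session:
#   "Given a cart over 100 EUR, when checking out, then a 10% discount applies."

def checkout_total(cart_total: float) -> float:
    """Apply the agreed discount rule (illustrative business logic)."""
    if cart_total > 100:
        return round(cart_total * 0.9, 2)
    return cart_total

def test_discount_applies_over_threshold():
    # Given a cart over 100 EUR
    cart_total = 120.0
    # When checking out
    total = checkout_total(cart_total)
    # Then a 10% discount applies
    assert total == 108.0

def test_no_discount_at_or_below_threshold():
    assert checkout_total(100.0) == 100.0
```

Keeping the Given/When/Then structure visible in the test makes the automated check readable by all three Amigos, not just developers.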
That’s because E2E Testing happens in all three of these types of testing. It implies both Verification and Validation (V&V), which addresses the two questions “are we building the product right?” and “are we building the right product?” [Boehm 1984].
E2E Testing thus supports V&V at both the System and Acceptance test levels. It also performs some integration testing in “big bang” mode [Parmar 2014] [Moustier 2019-1], and provides some perspectives on different NFRs, such as pentesting or load testing across multiple environments.
E2E Testing can be viewed from a horizontal perspective, notably with:
- multiple applications (aka “touchpoints”) found in a whole Information System (IS)
- multiple human interfaces (browsers, mobiles, heavyweight applications)
- multiple usages with different personae [Hendrickson 2013][Moustier 2019-1]
- multiple business processes
- multiple environments and data managements
E2E can also be viewed vertically, from a technical perspective, where the stack of different technologies involved also represents a form of E2E Testing.
To add some more confusion about what this E2E notion covers, we should also pay attention to what “System” refers to. When the touchpoint is the System Under Test (SUT), it is viewed as a component of a whole IS; therefore, testing this part is literally component/unit testing. Yet, from a Team’s point of view, testing that whole component is also system testing…
V&V vs E2E Testing
This multitude is covered by many testing techniques, notably those taught as ISTQB standard techniques (equivalence partitioning, pairwise, decision tables, …) [Beizer 1990] [Otsuki 2012] [ISTQB 2015]; but you must keep in mind that this is just one “school of testing”, based on analyses and heuristics [Moustier 2021-2]. The others are [Pettichord 2007]:
- Standard School: testing as a way to measure progress notably on cost and repeatable standards
- Quality School: emphasizes processes and controls
- Context-Driven School: emphasizes people skills, context and seeking bugs that stakeholders care about
- Agile School: emphasizes automated testing and ensuring that development is complete
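To make one of the ISTQB techniques mentioned above concrete, equivalence partitioning splits the input domain into partitions that the SUT should treat the same way, so one representative value per partition is enough. A minimal sketch in Python (the age-validation rule is hypothetical):

```python
# Hypothetical SUT rule: an age field accepts integers from 18 to 65.
def is_valid_age(age: int) -> bool:
    return 18 <= age <= 65

# Equivalence partitions of the input domain: below range, in range, above range.
# One representative value per partition covers the whole partition.
partitions = {
    "below_range": (17, False),
    "in_range": (40, True),
    "above_range": (66, False),
}

for name, (representative, expected) in partitions.items():
    assert is_valid_age(representative) == expected, name
```

Boundary value analysis would then refine this by also testing the edges of each partition (17, 18, 65, 66).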
In a Shuhari state of mind, it appears that modern testing is not just checking that the product complies with requirements [Bach 2019], and thinking out of the box is a major way to improve E2E Testing, for instance with the Modern Testing approach, which also implies the continuous improvement of both the product and the system that builds it, the Kaizen way.
Impact on the testing maturity
In Waterfall-like organizations, E2E Testing is done:
- at part level - possibly with delivery-based services performed by contractors
- after the integration level - with Business Owners (BO) who know the IS extensively and perform E2E Testing across multiple touchpoints
This configuration is prone to communication issues due to silos. Another visible issue arises when analyzing bugs found at the BO level, where systemic issues emerge [Appelo 2010] due to the complexity of the environment [Snowden 2007], which invalidates the causality predicate: an effect does not have a single cause, and a formal Root Cause Analysis (RCA) may loop back on issues already encountered in a 5 Whys analysis.
To avoid the latter factor, it is essential to adopt iterative approaches with small increments [SAFe 2021-04]. At Google, 800,000 builds and 150 million tests are executed every day across 13,000 projects [Soto-Sánchez 2021]. To cope with such an output, the agile mindset, supported by automation, cadence [SAFe 2021-07] and practices such as Poka-Yoke rather than attempting to do everything at once, is inevitable. SAFe provides some answers through its principles and demos at every level of the organization (Team, Program, Solution). SAFe also introduces an interesting item: the Program Board.
The Program Board is a tool generated at Program Increment (PI) planning time [SAFe 2021-32] which shows the dependencies between Teams and the expected deliveries.
Program Board [Moustier 2020]
We already noticed that E2E is relative to the SUT being tested. It is then easy to see that very local E2E Testing will be done on the first delivered part, and E2E Testing will progressively grow with the accumulated deliveries [CFTL 2019].
To facilitate cadence and short increments, DevOps practices such as Dark Launch are most useful to keep a high delivery flow along with E2E Testing. Such a delivery context implies the coexistence of many versions of the same component; in this context, the Open-Closed principle is also valuable: for instance, when it comes to APIs, an existing API should not be altered (closed for modification) but new APIs should be added (open for extension). This way, compatibility is preserved and the delivery is more resilient to synchronization issues. This example illustrates a specific E2E Testing environment addressed by Shift Right Testing.
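The Open-Closed principle applied to APIs can be sketched as follows in Python (the endpoints, payloads and function names are hypothetical): the v1 handler is left untouched while a v2 handler is added alongside it, so older clients keep working while both versions coexist during a Dark Launch.

```python
# Hypothetical user API: v1 is frozen (closed for modification),
# v2 is added alongside it (open for extension).

def get_user_v1(user_id: int) -> dict:
    # Existing contract, untouched so older clients keep working.
    return {"id": user_id, "name": "Alice"}

def get_user_v2(user_id: int) -> dict:
    # New contract extends v1 with an extra field instead of altering it.
    user = get_user_v1(user_id)
    user["email"] = "alice@example.com"
    return user

# Routing table: both versions coexist in the same deployment.
ROUTES = {
    "/v1/users": get_user_v1,
    "/v2/users": get_user_v2,
}

# An old client still gets exactly the v1 shape.
assert ROUTES["/v1/users"](1) == {"id": 1, "name": "Alice"}
# A new client gets the extended shape.
assert ROUTES["/v2/users"](1)["email"] == "alice@example.com"
```

E2E tests can then exercise both routes against the same deployment, which is precisely the multi-version situation a Dark Launch creates.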
It is also possible to apply Poka-Yoke to E2E Testing in order to reduce the systemic issues mentioned above. It appears possible to use Conway’s law [Conway 1968] in reverse to design Teams so that the architecture will emerge [Skelton 2019]. Those Teams, and the way they communicate, should then be designed to ease E2E Testing, notably with long-term testing teams composed of functional/business experts, and possibly with test harnesses as part of the communication between Teams. Dependency management can be optimized by caring about the relations between teams with a Context Map from Domain-Driven Design [Evans 2004], along with PanTesting. This enables Managers to participate in testing at their own level.
Agilitest’s standpoint on this practice
Agilitest provides a #nocode E2E Testing automation tool that works with web, mobile and MS Windows applications, making it possible to interact with different types of touchpoints. The tool also embraces performance testing thanks to the Octoperf plugin.
However, E2E Testers should not rely solely on this vision of testing. Testers should also promote a holistic approach, at least to ease E2E Testing and push the system to avoid the Ice Cream Cone antipattern.
To go further
- [Appelo 2010] : Jurgen Appelo - 2010 - “Management 3.0: Leading Agile Developers, Developing Agile Leaders” - Addison Wesley - ISBN: 978-0321712479 - see also https://fr.slideshare.net/jurgenappelo/agile-management-leading-teams-with-a-complex-mind/
- [Bach 2019] : James Bach & Aaron Hodder - APR 2014 - “Test Cases Are Not Testing: Towards a Culture of Test Performance” - https://www.satisfice.com/download/test-cases-are-not-testing#
- [Beizer 1990] : Boris Beizer - 1990 - “Software Testing Techniques” - ISBN: 9781850328803
- [Boehm 1984] : Barry W. Boehm - JAN 1984 - “Verifying and Validating Software Requirements and Design Specifications” - http://www.pauldee.org/se-must-have/boehm-v-and-v.pdf
- [CFTL 2019] : Collective work - 2019 - “Les tests logiciels en Agile” - Comité Français du Test Logiciel - ISBN: 978-2956749004
- [Conway 1968] : Melvin Conway - 1968 - “How Do Committees Invent?” - Datamation magazine - http://www.melconway.com/Home/Committees_Paper.html
- [Hendrickson 2013] : Elisabeth Hendrickson - 2013 - “Explore It!: Reduce Risk and Increase Confidence With Exploratory Testing” - ISBN: 9781937785024
- [ISO 25010 2011] : BSI Standards Publication - “Systems and software engineering — Systems and software Quality Requirements and Evaluation (SQuaRE) — System and software quality models” - BS ISO/IEC 25010:2011
- [ISTQB 2015] : ISTQB - 2015 - “Syllabus Niveau Avancé Analyste de Test” - https://www.cftl.fr/wp-content/uploads/2015/03/Advanced-Syllabus-2012-TA-GA-Release-191012_FR.pdf
- [Meerts 2012] : Joris Meerts & Dorothy Graham - 2012 - “The History of Software Testing” - http://www.testingreferences.com/testinghistory.php
- [Moustier 2020] : Christophe Moustier - OCT 2020 - “Conduite de tests agiles pour SAFe et LeSS” - ISBN: 978-2-409-02727-7
- [Moustier 2021-2] : Christophe Moustier - JAN 2021 - “Heuristiques de test” - https://fan-de-test.fandom.com/fr/wiki/Heuristiques_de_test
- [Otsuki 2012] : Tomohiro Otsuki - AUG 2012 - “Software testing” - https://www.academia.edu/15448246/Software_testing
- [Parmar 2014] : Kratika Parmar - OCT 2014 - “Integration Testing Techniques” - https://archive.sap.com/kmuuid2/c03b0790-6d3b-3210-dcb4-848320a3d9e4/Integration%20Testing%20Techniques.pdf
- [Pettichord 2007] : Bret Pettichord - MAR 2007 - “Schools of Software Testing” - https://www.prismnet.com/~wazmo/papers/four_schools.pdf
- [SAFe 2021-04] : SAFe - FEB 2021 - “Principle #4 – Build incrementally with fast, integrated learning cycles” - https://www.scaledagileframework.com/build-incrementally-with-fast-integrated-learning-cycles/
- [SAFe 2021-07] : SAFe - FEB 2021 - “Principle #7 – Apply cadence, synchronize with cross-domain planning” - https://www.scaledagileframework.com/apply-cadence-synchronize-with-cross-domain-planning/
- [SAFe 2021-32] : SAFe - FEB 2021 - “PI Planning” - https://www.scaledagileframework.com/pi-planning/
- [Skelton 2019] : Matthew Skelton & Manuel Pais - 2019 - “Team Topologies: Organizing Business and Technology Teams for Fast Flow” - ISBN: 9781942788829
- [Snowden 2007] : David J. Snowden & Mary E. Boone - NOV 2007 - “A Leader’s Framework for Decision Making” - https://hbr.org/2007/11/a-leaders-framework-for-decision-making
- [Soto‑Sánchez 2021] : Óscar Soto‑Sánchez & Michel Maes‑Bermejo & Micael Gallego & Francisco Gortázar - JUN 2021 - “A dataset of regressions in web applications detected by end‑to‑end tests” - https://doi.org/10.1007/s11219-021-09566-x