December 10, 2020

The robustness of automated software tests

Christophe Cressend

What is a robust system?

The robustness of a system can be defined as "stability of performance": the ability to reproduce an expected behavior even when the external conditions the system is subjected to vary, sometimes unexpectedly.

Robustness can be contrasted with mere "performance", which can be defined as the ability to reproduce an expected behavior only within a predefined, reduced operating range.

There are also systems that are neither performant nor robust, but they are not the subject of this article.

What is software robustness?

In the field of software testing, the notion of "operating range" is essential: it distinguishes nominal, efficient operation from operation where the robustness of the software is needed for it to keep behaving as expected.

There is, moreover, a field of software testing that addresses robustness and allows you to define your operating range more precisely: for example, the number of simultaneous users on the same server, or the use of data volumes outside the recommended conditions of use. The objective of these tests is to push the system to its limits in order to determine the boundaries of its operating range.
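
To make this concrete, here is a minimal sketch of such a robustness test: ramp up the number of simultaneous requests until the error rate crosses a threshold, which roughly locates the upper bound of the operating range. The endpoint URL, the 5% threshold, and the user counts are placeholders, not values from the article:

# Sketch: probe the operating range limit of a server under concurrent load.
# ENDPOINT, ERROR_RATE_LIMIT and the user counts are illustrative placeholders.
import concurrent.futures
import urllib.request

ENDPOINT = "https://example.com/api/health"  # hypothetical system under test
ERROR_RATE_LIMIT = 0.05                      # 5% failures marks the limit here

def call_once(timeout: float = 5.0) -> bool:
    """Return True if one request succeeds within the timeout."""
    try:
        with urllib.request.urlopen(ENDPOINT, timeout=timeout) as resp:
            return resp.status == 200
    except Exception:
        return False

def error_rate(concurrent_users: int) -> float:
    """Fire `concurrent_users` simultaneous requests and measure the failure rate."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        results = list(pool.map(lambda _: call_once(), range(concurrent_users)))
    return 1 - sum(results) / len(results)

if __name__ == "__main__":
    for users in (10, 50, 100, 200, 500):
        rate = error_rate(users)
        print(f"{users} users -> {rate:.0%} errors")
        if rate > ERROR_RATE_LIMIT:
            print(f"Operating range limit reached around {users} simultaneous users")
            break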

Once the operating range is defined, recommendations can be made in terms of hardware and software prerequisites (OS) that will ensure nominal operation.

How to define the robustness of automated software tests

In the field of automated functional software testing, if you want to keep things simple, you should not have to rely on "robust" test execution: everything must run nominally in order to avoid flaky tests, false negatives, or tests that fail for no apparent reason.

If you replay your tests in continuous integration, you will have to analyze these flaky tests regularly in order to confirm that the failures are not caused by bugs in the software under test, but by the conditions in which the tests were run.

How do you ensure the robustness of automated software tests?

To ensure that your tests perform well, you need to act at several levels:

  • Make sure that your tests are deterministic: their execution should follow a predefined, expected scenario, so that they do not fail at the first unexpected behavior (see the sketch after this list)
  • Make sure that the data they act on is clearly identified: either each test creates its own data, or a common data repository lets each test find the data it operates on without disturbing the other tests. This is all the more important if you plan to parallelize your tests
  • Make sure that the hardware conditions for replaying the tests are identical: server response times, for example, can generate flaky tests, so server load and time schedules matter. If your tests rely on graphical recognition, the hardware environments matter too.
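
As a sketch of the first two points, here is what a deterministic, data-isolated test can look like with pytest: a fixed seed keeps the scenario reproducible, and a fixture gives each test its own uniquely named data. The create_user and delete_user helpers are hypothetical stand-ins for whatever your application or API actually exposes:

# Sketch: deterministic test with its own data, using pytest.
# `create_user` / `delete_user` are placeholders for real calls to the system under test.
import random
import uuid
import pytest

def create_user(name: str) -> dict:
    # Placeholder for a real call that creates data in the system under test.
    return {"id": str(uuid.uuid4()), "name": name}

def delete_user(user: dict) -> None:
    # Placeholder for the matching clean-up call.
    pass

@pytest.fixture
def own_user():
    """Each test gets a user it owns, named uniquely so parallel runs do not collide."""
    user = create_user(f"test-user-{uuid.uuid4().hex[:8]}")
    yield user
    delete_user(user)  # clean up so the next run starts from a known state

def test_user_rename_is_deterministic(own_user):
    rng = random.Random(42)          # fixed seed: the same "random" input on every run
    new_name = f"renamed-{rng.randint(0, 999)}"
    own_user["name"] = new_name      # stands in for the real rename operation
    assert own_user["name"] == new_name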

The robustness of automated software tests according to Agilitest

Agilitest's initial focus is to limit test maintenance activities, and therefore unnecessary flaky test analysis. It has been our priority from the beginning to ensure that tests run optimally, even when conditions are out of range.

We also have a whole set of functions that allow you to create randomized data for your tests.

You will see that Agilitest is fast, but it takes its time before declaring a failure. For example, it will:

  • Check several times that a value different from the expected one is stable in the application before reporting a failure.
  • Make sure that the server has returned the expected data.
  • Make sure that an element on which an operation must be performed is found, retrying until it appears or a given time limit is reached (see the sketch after this list)
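
The general pattern behind this "wait before failing" behavior is simple polling with a time limit. The sketch below is a generic illustration of that pattern, not Agilitest's internal implementation; wait_until, the interval and timeout values, and find_element are all assumed names:

# Sketch: re-check a condition at a fixed interval and only fail once a time limit is reached.
import time
from typing import Callable, Optional, TypeVar

T = TypeVar("T")

def wait_until(probe: Callable[[], Optional[T]],
               timeout: float = 10.0,
               interval: float = 0.5) -> T:
    """Call `probe` repeatedly until it returns a non-None value or the timeout expires."""
    deadline = time.monotonic() + timeout
    while True:
        result = probe()
        if result is not None:
            return result
        if time.monotonic() >= deadline:
            raise TimeoutError(f"Condition not met within {timeout} seconds")
        time.sleep(interval)

# Hypothetical usage: keep looking for a UI element instead of failing on the first attempt.
# `find_element` stands in for whatever your test tool provides.
# element = wait_until(lambda: find_element("#submit-button"), timeout=15)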

Reaching zero flaky tests is the grail we are aiming for. The stakes are high: at the end of the road lies the ability to deliver a qualified version of your software at any time, with the certainty that you have covered the majority, if not all, of the important functions.
