Testing Talks is a series of interviews highlighting well-known personalities from the world of software quality and testing. In this new episode, we had the pleasure of talking with John Ferguson Smart.
You can watch the interview or directly head to the transcript below.
Can you introduce yourself and your background?
I'm John Smart. I've been working in Agile for several decades. I created a test automation framework and I've written a few books on Agile development. One of them is called ‘BDD in Action’, and the second edition is coming out shortly. I have been doing software development since the mid 90s in different forms, and I got involved in Agile development in the late 90s and early 2000s. I started getting involved with BDD more particularly when I started working in London. In fact, I was hanging out with the London XP practitioners and worked with a lot of very smart people there who were floating around ideas like TDD and BDD, and practices like that. I picked up a few things there and started applying them. I've been coaching teams in XP and Agile in general ever since; I think the first one was in Egypt around 2001-2002.
I've watched the evolution of test automation over the years, from automated test scripts to big, clunky commercial tools to Selenium and JavaScript-based tools. I've been following this evolution with great interest, seeing things like low-code and no-code tools coming out.
What I do in my day job is partly helping teams, but also bringing testers up to speed through a program I run called the Serenity Dojo. It takes manual testers and testers who are new to test automation and brings them up to what I'd consider a senior test automation level. I think too many testers are held back by not knowing the right way to go about test automation, by not knowing the right techniques. These are not necessarily complicated techniques, but nobody shows them. And a lot of the people who teach, and a lot of the courses that get taught, aren't necessarily very helpful. So I am trying to bring testers up to speed that way.
What motivated you to get into this industry?
In testing? That's a really good question. I drifted into testing from the BDD space. I've always been interested in the testing world. When I was a technical project manager, I was involved in testing activities, in measuring defect rates, in studying models... I'd been involved with the testing side of things for quite a while, but I think it all started in Sydney in 2011. I was helping teams do test automation there, and a manager I knew at the time proposed this idea: what if you could use tests to document applications? What if you used tests to document features? We talked about that, I thought it was a pretty cool idea, and I came up with some prototypes. I had been teaching test automation before that, probably since 2005-2006, but actually writing the tools, creating a test automation tool that would not just automate tests but also document features, was new.
What do you love the most about your job?
I like seeing the evolution people make in their work and in the way they work, when they learn good test automation techniques or how to use really effective test automation frameworks.
I find it satisfying to see testers move from writing very crude, ad hoc test scripts to writing well-structured tests where new scripts take less time to add. What I enjoy most about my work is seeing that impact: when you teach people a technique and they pick it up, apply it and run with it, and it actually has an impact on their career. It is not just about the theoretical, aesthetic idea of writing nice code, but the concrete impact it has on their projects and on their own career paths.
Do you have an anecdote to share?
I have lots of anecdotes… There was one project I was working on where we were running Three Amigos requirements sessions. The Product Owner came up with the requirements, saying: ‘Right. Well. What needs to happen is we need to upload an Excel spreadsheet into the database, and then it needs to send out a message into our workflow system where all the system users have to be notified. And then they need to be able to update the workflow state in order to change the spreadsheet.’ It was basically all about locking down data. They could upload a spreadsheet, but it couldn't be changed. The only way to change it was to submit a workflow process where they could go into the workflow and modify the data, subject to their permissions and access rights. It was a very complicated system, and one of the testers on the team said: ‘Yes, but why do we need to do that? What's the reasoning behind it?’ We were following the BDD approach of having Three Amigos sessions and being very analytical and critical about requirements as they came through.
I'm always asking: why do you need this? What's it for? The PO said, ‘Well, we need to do it this way because that's the way it's always been done; that's the way it was done with the previous application.’ So the tester asked why it was done like that in the previous application, and it turned out it dated back to the application before that, two applications back. A file got uploaded, a reporting file from vendors for a bus timetable system: did the buses run on time? The vendors had to upload statistics, and past a certain date they weren't allowed to modify those statistics. So this whole workflow was in place because of the original system. One of the developers said, “Well, why don't we just send an email notifying everyone that this data has been uploaded?” And it turned out that was actually all that was needed.
So this requirement went from about three months of work to about a day or less, because the tester was asking these questions. I quite like that story: it saved the government department quite a lot of money just by asking the right question. I think it shows that when testers get involved early on, not just waiting to test whatever comes out the end of the pipeline but actually taking part in requirements discovery, the critical mindset you have as a tester can help prevent waste much earlier than people often think.
What, for you, are the keys to succeeding in an agile transformation?
So the main success criteria of an agile transformation… I guess, at the end of the day, are you delivering value sooner, and can you adapt to change? That's essentially what agile is about: can you adapt to change more readily? But generally speaking, that's quite hard to measure, and you don't actually measure it until a fair way down the line. So I tend to keep an eye on little things, what I call proxy metrics. How often do you deliver working software? How early do development team members, including testers, get involved in requirements discovery and in conversations about requirements? Do you have a separate testing team? If you do, it's usually not a very good sign on the agile front. Do you have a UAT phase? Do you allocate time to UAT testing at the end of each phase?
These things tell me whether we're really working in an agile way. When I see teams that plan three months of work, followed by another two months to actually do the testing, or integration testing, or UAT testing, or whatever they want to call it, that usually tells me their agile transformation is not as effective as it could be. You're not actually using an agile approach if nothing is delivered until everything, including the testing, is ready.
How should companies organize testing teams for better efficiency?
That's a good question. Like I said, if you have a testing team, you're probably not being agile, because having a testing team implies you're treating testing as a separate activity: the development is done, then it's handed over to the testers. I saw a post about the different types of testing that testers should do, and one of them was unit tests. Someone said developers shouldn't do any testing and testers should do everything, including unit testing. That's an odd idea, since unit tests are often ineffective when written by testers. Unit tests are the sort of test you want developers to own, writing them before they actually do the work.
Typically, testing is treated as an "after the fact" activity. You deliver a feature, then you test it, just like you deliver a release, then you test it. That's probably the most inefficient way of testing I know; it also puts a lot of waste and a lot of uncertainty into the process. So how do you organize testing to be more efficient? Put the testers right into the teams. Instead of having a test team, you have communities of practice to share, promote and cultivate good practices. Instead of a testing team, you have more experienced testers or test automation engineers who train the testers and help them bootstrap their automation or their BDD approach within a project or a module. You can also have a governance body; often we call them councils, or communities of practice. They act as a technical counsel, overseeing the practices to make sure everybody is aligned and good practices are being shared, but without being a separate testing team. So that would be my first piece of advice: disband the separate testing team.
What do you think about the future of agile methods?
Agile methods are being industrialized, which is not a good thing. What we see today is often waterfall-type techniques labeled as agile, because they are much easier to sell to management and large organizations. It's a reassuring, structured approach, but it moves away from the core values of Agile.
XP has not been widely adopted, which is a pity, since as far as software delivery is concerned it is a superior technique to Scrum. It is, unfortunately, more difficult to implement.
When you have teams that master continuous delivery, continuous integration, test-driven development… you will get a true Agile process.
X-Scale is an organization I watch closely for Agile methods. Their approach is that you need to shrink organizations by empowering teams to work together more effectively. Most frameworks like SAFe are not about empowering teams to work together more effectively; they are about building a complex hierarchy of command and control. It may not look like that on paper, but in practice that's what happens. And command and control is never very conducive to good agile practices.
You wrote a few articles about BDD. How does it help with manual testing?
That's an interesting question, because BDD is not really about testing. BDD is about requirements discovery and collaboration; testing is an entirely different thing. But how does BDD help a manual tester? A manual tester in a team practicing BDD gets involved in requirements discovery, and in conversations about requirements and testing strategies, much earlier than in a conventional approach. They will be clearer about the requirements and able to give feedback on what it will take to effectively test those requirements before implementation. This gives testers more influence than in a traditional testing approach. With BDD, we have the Three Amigos sessions, or even earlier discovery sessions, where we take a user story or a feature and pick it apart: we cross-examine it and look for the examples, counterexamples and business rules. Testers are very good at that; they can provide huge value at that level. So for a manual tester, it does change the role quite significantly.
It gives them a lot more leverage early on. It can also reduce the amount of manual testing that needs to be done, at least the boring manual testing, and leave time for more exploratory testing. Because if you do BDD well, using an approach with executable specifications that drive the development process, there is very little automation left to do afterwards. Basically, a product owner updates a set of Cucumber scenarios with the team to describe the business rules they want to implement. Those scenarios fail at first and drive the development; once the feature is implemented, they run and pass, and that's it. That works really well, but it takes a lot of effort and a lot of skill to set up. Once you have set it up, though, you've got a really high-performing team, and the manual tester in that role is very involved in articulating the requirements and making sure the software works as expected.
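As a concrete illustration, an executable specification in this style might look like the following Cucumber scenario. This is a purely hypothetical sketch, loosely inspired by the bus-timetable anecdote above; the feature and step wording are invented, not taken from any real project.

```gherkin
Feature: Vendor punctuality reporting
  Uploaded statistics are locked once submitted;
  users are notified of new uploads instead of editing them.

  Scenario: Notify users when a punctuality report is uploaded
    Given a vendor has uploaded the monthly punctuality spreadsheet
    When the upload is confirmed
    Then all system users should receive a notification email
    And the uploaded statistics should be read-only
```

In a BDD flow, a scenario like this is written collaboratively before implementation. It fails until the feature is built, then runs green and serves as living documentation of the business rule.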
It can also be a path for manual testers to get involved in automation, because the automation done in a BDD process is much more tightly coupled with the whole development flow. It's not something that happens in isolation, so manual testers can work with developers to make it happen. It can be a gentler path into automation than having to learn automation from scratch.
What would you recommend to transition from manual to automated tests?
First of all, I would recommend not starting with the Udemy courses on Selenium or Cypress or whatever. Don't start with the frameworks, and don't start with automating manual test scripts, because that will very quickly lead to a dead end. Test automation is coding, even with the low-code tools: at the end of the day you're going to be thinking with a coding mindset, so you need to understand development. If you want to get into automation, you want to start with coding; you can't really get away from that. The trick is that most coding books and most coding courses on Udemy are written from a developer's perspective. They're a bit broader and cover things you don't necessarily need as a tester. As a tester doing test automation, you need to understand the development approach, but you don't need all the language features; there's a lot of stuff you don't need. So it can be helpful to find training courses that are more focused on testing. The problem is that a lot of testing courses written by testers don't have a development mindset behind them. They don't have the reflexes for writing high-quality automation code, and there are a lot of bad habits in there, so sometimes it's a bit tricky to find the right mix. If you do want to start with the coding side of things, get a good understanding of coding. Some people start with Python because it's easier; I'm not a huge fan of that unless you're going into, or are already in, a company that actually uses Python. I'd recommend just starting with Java, because it's the most widely used language in the automation world. JavaScript is useful as well. The second bit of advice I'd give is: don't try to be like some of the CVs you see on LinkedIn with 100 testing tools listed. You only need one.
You want to start with one and master it really well. Once you know a tool really well, whether it's Selenium or Playwright or whatever, it's fairly easy to transfer the skills; automation skills are very transferable from one tool to another.
The most important aspect of automation is not the tools; the tools are relatively easy. The most important thing is the mindset: how to craft automation, knowing which scenarios to automate, how to automate them, and which scenarios not to automate. That's what most testers struggle with. You need to understand how to start with the requirements and turn them into meaningful automated executable specifications. If you're automating requirements, you can transform how the whole team works; if you're automating test scripts, maybe you can save yourself a bit of time. It's a very different approach, and if you want real leverage, you need to understand how to automate requirements and how to think in those terms. The last thing is that you can learn a language pretty quickly if you put your mind to it.
But don't expect to master everything overnight. The training program I run for manual testers is six months long. Some testers on the program have got up to speed within a month or two and have been getting good new roles in test automation, but you're continually learning throughout those six months and building up your skills, because it's not an overnight journey. If people tell you that you'll learn automation in a 40-hour course, that's not true. You will learn it at a superficial level, but you won't have the depth of experience to be able to apply it. So don't expect a miracle solution overnight. It does take time, and you have to stick with it, because there is quite a lot to learn.
Do you have a final word, maybe a key takeaway from this interview?
I'd say that if you're a tester wanting to get into test automation, it is a struggle to start with, coming from manual testing. It takes discipline, and you need to make a conscious decision that you are going to learn automation. But it is definitely worth it. I have worked with so many testers who went from knowing no coding at all to being very good. One of the students I've worked with went from working on a help desk to now leading a BDD transformation in a bank. The career options you get when you go down this path are really satisfying. But you do have to have the discipline and stick with it.