Quality Assurance (QA) is essential when it comes to delivering solutions that satisfy your client’s requirements, provide great user experience, and run smoothly in the long-term. In fact, nowadays it’s not uncommon for the QA team to be as big as or larger than the development team.
Even so, largely due to shrinking budgets and the ongoing pressure of deadlines, QA teams that take a comprehensive, strategic approach are becoming increasingly rare. But cutting corners at the QA stage is never a good idea, and often only means that software faults show up at a later date, when fixing them is more difficult and, as a result, more expensive.
In order to make sure your QA team doesn’t give you any unnecessary grief, we’ve listed three common bad practices so that you can make sure you avoid them.
Inadequate Regression Testing
Every new release will bring new features and functionalities. However, their introduction should not compromise the software’s existing capabilities. Making sure that this is the case involves a process called regression testing. Regression testing consists of re-running previously completed tests to see whether the program’s behaviour has changed as a result of the update. If bugs are present, these can then be fixed with enhancements, patches, or configuration changes.
Since new releases are built on top of the existing system – including all previous releases and updates – each time a new update is added, the amount of regression testing required also increases. This means that, as your software ages, it demands more and more regression testing to ensure that it’s operating smoothly. This often leads to QA teams working on older software running only a selection of important tests or, in some cases, even a random sample. But this is never good practice, even if it saves time in the short term. To avoid unnecessary mistakes, it’s crucial that you always complete thorough regression testing.
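In practice, a regression suite is simply the collection of tests that pin down the software’s existing behaviour, re-run unchanged after every release. Here is a minimal sketch in Python; the function `apply_discount` and its rules are hypothetical examples, not taken from any particular system:

```python
def apply_discount(price: float, percent: float) -> float:
    """Return the price reduced by the given percentage, rounded to cents."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

# Regression suite: these assertions capture the behaviour of the
# current release. Re-run them, unchanged, after every new release --
# a failure means an update has altered existing behaviour.
assert apply_discount(100.0, 10) == 90.0
assert apply_discount(49.99, 0) == 49.99   # zero discount leaves price intact
assert apply_discount(100.0, 100) == 0.0   # full discount means free
```

Each new feature adds its own assertions to the suite, which is exactly why the cost of thorough regression testing grows as the software ages.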
Insufficient Integrated System Testing
It’s not uncommon for one software system to depend on or work in tandem with another. Such integrated systems require special testing. Integration testing is the process of combining individual software modules before assessing how they function as a group. Integration testing verifies that software aggregates are working correctly, and is crucial for uncovering problems to do with how one system interacts with another.
Often, however, one system may still not be finished when its counterpart is due to undergo integration testing. QA teams faced with this scenario may try to work around the problem by mocking the missing system’s interface so that their tests appear comprehensive. But such testing can only ever evaluate a system’s performance in isolation. The simple (though perhaps at times uncomfortable) fact is that you cannot complete thorough integration testing without using both systems.
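The limitation is easy to see in code. In the sketch below, `checkout` and the payment gateway are hypothetical names; the mocked test only confirms that our side handles the response shape we *assume* the other system returns, never that the real system actually returns it:

```python
from unittest.mock import Mock

def checkout(gateway, amount: float) -> bool:
    """Charge the customer via an external payment system."""
    response = gateway.charge(amount)
    return response["status"] == "ok"

# Isolated test: the other system is replaced by a mock, so this
# verifies only our assumptions about its interface -- not the
# interface itself.
mock_gateway = Mock()
mock_gateway.charge.return_value = {"status": "ok"}
assert checkout(mock_gateway, 25.00) is True
```

If the real gateway returns, say, `{"state": "approved"}` instead, every mocked test still passes while the integrated system fails, which is why true integration testing requires both systems running together.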
Following the Happy Path
The term “happy path” refers to the scenario in which nothing exceptional or problematic occurs, and everything goes as expected. It’s not surprising, then, that your QA team may wish to follow the happy path when testing your software – it makes their job quicker and easier.
But the only way to be absolutely sure that your system is operating properly is to test situations in which everything goes wrong. This means breaking all possible rules, exploiting all back doors, and consciously attempting to fool the system. Such testing is time-consuming and difficult, but is always absolutely worth it.
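Concretely, this means that for every happy-path assertion, the suite should also feed the system deliberately broken input and confirm it fails safely. A minimal sketch, using a hypothetical `parse_age` function:

```python
def parse_age(value: str) -> int:
    """Parse a user-supplied age, rejecting anything implausible."""
    age = int(value)  # raises ValueError for non-numeric input
    if not 0 <= age <= 130:
        raise ValueError(f"implausible age: {age}")
    return age

# Happy path -- the quick and easy part:
assert parse_age("42") == 42

# Unhappy paths -- break the rules on purpose and make sure
# the system rejects the input rather than accepting it:
for bad in ["-1", "200", "forty-two", ""]:
    try:
        parse_age(bad)
    except ValueError:
        pass  # rejected, as it should be
    else:
        raise AssertionError(f"accepted invalid input: {bad!r}")
```

Writing the unhappy-path cases takes several times the effort of the single happy-path assertion, which is precisely where under-pressure QA teams are tempted to stop.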
All of the problems highlighted above can be addressed with automated testing, including unit tests, integration testing, and load testing. However, it’s worth remembering that automated testing is only ever as good as the quality of the tests themselves. So make sure your QA team performs thorough, extensive testing.
Moreover, there is no replacement for human testing, which remains essential for running business scenarios and uncovering issues that automated tests cannot detect (including colour, font, and alignment issues, as well as language and spelling).