As testers, we have a responsibility to always strive for the best, to keep an eye on quality together with the whole team, and to adapt. We should think twice before testing, before choosing a tool, even before analyzing. This applies to everyone involved in the software testing process, from the very moment a job is posted. So, here are some of the many software testing misconceptions we should consider and think twice about.
If your test automation framework is more complex than the app itself, think twice.
This refers to the test automation frameworks testers build for business-specific needs. Of course, there are legitimately complex frameworks out there, such as Django, React, Ruby on Rails, etc.
For a test automation framework, we first need some analysis, perhaps a mind map, to properly design the architecture and make it maintainable, understandable, and scalable.
Using BDD in a framework when the people involved in the project do not even know what Gherkin is, is not a good idea. Neither is adding more than one reporting solution just to have different presentations of the same results…
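For context, Gherkin is the plain-language syntax that BDD tools such as Cucumber use to describe behavior. A minimal, hypothetical scenario (the feature, steps, and names here are invented for illustration) looks like this:

```gherkin
Feature: User login
  Scenario: Successful login with valid credentials
    Given a registered user "alice"
    When she logs in with a valid password
    Then she should see her dashboard
```

If nobody on the team reads or writes specifications in this style, the extra layer adds maintenance cost without adding shared understanding.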
Remember, there is a difference between complexity and complication; definitely keep your framework from becoming complicated, and keep in mind that complexity is not always needed either.
If you want to test your whole application using automation in test, think twice.
Probably this comes from a sense of inferiority about being a tester who does not know automation. Job posts looking for testers with only traditional techniques are decreasing. I have shared a few thoughts around this in a previous article here.
Testing is more than that. Testing is a discipline, with many more duties than just creating automated tests, and there are many software testing approaches and techniques to apply before considering automating some checks.
For instance, a new feature in your application may first need some exploratory functional testing to validate results before you create automated checks for it.
There are shift left techniques that help in very early stages of the software delivery process. Many of those techniques do not use code as a testing solution.
If you are learning a lot of automation testing tools just because they are trendy, think twice.
Testers are constantly bombarded with tools that claim to be the silver bullet for test automation. The fact that there are so many tools does not mean we have to learn them all. Every business has its own needs, and every tester has particular abilities and tastes; the key is to balance both.
There are also great free tools, such as TestProject, that help us integrate one or more testing tools in a single place, avoiding many installation and configuration issues while giving us test results and reports in a really nice graphical way.
If you are hiring testers and adding “100% automation test role” in the description, think twice.
Everything begins with a job post, and the false idea that a tester will be hired for a “100% automation test role” should start changing. There is so much more in a tester's day-to-day role: attending meetings, gathering requirements, creating test plans and test cases, exploratory testing, functional testing, performance and security testing when applicable, setting up CI pipelines, and much more.
Before posting tester roles, remember that test automation is a technique; it is not what testers are.
If your test code is not as high quality as application code, think twice.
The fact that a tester creates a test automation framework for an application does not mean that our code should be inferior or, worse, lack quality altogether. There are many ways to measure the quality of our code: readability, reliability, scalability, reusability, portability, and of course check coverage in sync with requirements. These are the factors we usually look for in test framework code.
Plus the good practices you already know: commenting your code, using patterns when needed (such as the page object model), encapsulation, version control, etc.
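To illustrate the page object model mentioned above, here is a minimal sketch in Python. The `LoginPage` class, its locators, and the stubbed `FakeDriver` are hypothetical names invented for this example; in a real framework the driver would be a Selenium WebDriver or similar.

```python
class FakeDriver:
    """Stub driver standing in for a real browser driver,
    so this sketch is self-contained and runnable."""
    def __init__(self):
        self.fields = {}       # records text typed into named fields
        self.submitted = False

    def type_into(self, locator, text):
        self.fields[locator] = text

    def click(self, locator):
        if locator == "login-button":
            self.submitted = True


class LoginPage:
    """Page object: encapsulates the login page's locators and actions.

    Tests talk to this intent-level API, never to raw locators, so a
    UI change only requires updating the page object, not every test."""
    USERNAME = "username-input"
    PASSWORD = "password-input"
    LOGIN_BUTTON = "login-button"

    def __init__(self, driver):
        self.driver = driver

    def login(self, username, password):
        self.driver.type_into(self.USERNAME, username)
        self.driver.type_into(self.PASSWORD, password)
        self.driver.click(self.LOGIN_BUTTON)


# Usage: the test reads as intent, not as a sequence of raw UI clicks.
driver = FakeDriver()
page = LoginPage(driver)
page.login("alice", "s3cret")
print(driver.submitted)  # True once the login button was clicked
```

Encapsulating locators this way is one of the simplest steps toward test code that is as readable and maintainable as application code.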
Also, make sure to keep improving your development skills, both to strengthen your test code and to help more junior team members ramp up and achieve greater results.
If you skip software testing just because of urgent requirements, think twice.
Skipping tests is not alien to testers who have faced urgent requirements. It happens often with hotfixes, when something needs to be delivered to production very urgently. The fact that something is urgent does not mean it can skip verification and validation.
It does not stop there; it also happens with features released to production, for instance in an agile environment. Everyone involved should recognize this and plan testing time accordingly, since we know an issue found in production usually costs much more.
So, what else do you think we should think twice about? I would be happy to read your ideas around this!