
Identify the Automation Gap in Testing

This post aims to encourage discussion about aligning the desired business outcome, the automation build, and the reality of maintenance. The maintenance of automation is often overlooked by organizations. Is there an unspoken automation gap?

Unspoken Automation Facts

Test automation has many benefits, which we as a community have experienced and learned about to varying degrees over the last decade. My own interest in testing started as the person maintaining an automated test suite as part of a project deliverable and handover. Since then, I have always managed to keep a hand in the maintenance side of automation, which has let me experience some of these unspoken automation facts.

Late last year, I started thinking about writing up my observations on a gap in automation, seen through a testing lens and based on my limited years of experience and knowledge in the field.

Automation Gap in Testing, What’s that?

What do I mean by “automation gap”? A mismatch between the business dream of automation and automation in reality.

Business Dream of Automation

Business owner Jack says: “Automated testing covers all critical business areas and scenarios as part of the new system project deliverables. The automation will promptly show any regressions or issues when they happen. It is also a big cost-saver for the organization because it replaces manual testing and manual testers. It runs faster. It runs consistently.”

Automation in Reality 

Automated tests run happily, checking whether the expected outcome is returned on each run. They consistently compare the current outcome to the expected outcome, with no judgment and without overlooking minor details. There are hundreds, if not thousands, of tests to run every day. In general, automated execution is faster than the same checks executed by testers (humans). Some tests fail and some pass: Green generally means everything is happy as far as the automation is concerned, while Red is an alert that something went wrong. A Red result normally requires further investigation by a tester (human).
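To make the “check” part concrete, here is a minimal sketch of what such an automated check often boils down to. The function, the amounts, and the pytest-style test are my own illustrative assumptions, not something from a specific project:

```python
# Minimal sketch of an automated check (hypothetical example, pytest style).
# The check compares an actual outcome to a pre-defined expected outcome and
# applies no judgment beyond that comparison.

def calculate_order_total_cents(prices_cents, discount_percent):
    """Stand-in for a piece of system behaviour under check (amounts in cents)."""
    subtotal = sum(prices_cents)
    return subtotal * (100 - discount_percent) // 100


def test_order_total_applies_discount():
    # Green if the actual outcome equals the expected outcome, Red otherwise.
    actual = calculate_order_total_cents([1000, 550], discount_percent=10)
    assert actual == 1395
```

The check is fast and consistent, but it only ever answers the question it was written to ask.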

The Emergence of the Automation Gap 

Birth of the automation gap

The automation gap starts to emerge as soon as the business requirements (the dream) are created and passed on to the automation team. This is the birth of the gap, and it grows as time goes on. Jack’s expectations of automation in testing seed the gap with some questionable items, because those expectations are ambiguous and abstract in nature. The gap is born when the automation in testing is built without further clarification and agreement on a common understanding of what the automation can do.

“Automated testing covers all critical business areas and scenarios…” It seems reasonable and achievable at first glance. The reality is that this expectation is so ambiguous that it will eventually lead to the official birth of the gap:

  1. What are ALL of these critical business areas and scenarios? Does the automation builder have the same understanding as Jack (the business owner)?
  2. What is the understanding and expectation of “automated testing”? Is it “automated testing”, or is “automated checking” more appropriate? Can automation pick up unexpected scenarios that may well be critical to some users at some point?
  3. Any regressions, really?
  4. Time-saving and cost-saving, really?

The gap is born the moment the team builds the automation without clarifying and agreeing on these business expectations as part of the implementation.

The natural growth of the automation gap

For the sake of argument, let’s imagine a perfect match between business expectations and the automated checks when the system goes live in production. Both the system and the automated checks happily enjoy their v1.0 go-live moment. There is no gap, because they are a perfect match thanks to great team effort.

On day 6, some users report an issue in production which, on investigation, turns out to be a critical scenario missed during the build. The team addresses the issue quickly, and the system is up and running happily again in no time. Both users and the business are happy, and so are the automated checks (look at that big Green smile).

Wait… did someone say something? Yes, there is now a gap in this perfect-match relationship. The system is on v1.01, but the automated checks remain at v1.0 in terms of their understanding of the critical areas. You would imagine the automated checks were updated to v1.01 at the same time, and I would agree with you 100% if the automated checks had known about this critical update. In reality, not many people thought the automated checks needed to know this piece of information, so nobody told them about the change. The gap starts to grow naturally as time goes on, while neither the system nor its automated checks is aware of it until there is a bigger problem later down the line.
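Here is a small, hypothetical sketch of how that silent growth can look in a check suite. The shipping rule and the function names are invented for illustration: the v1.01 hotfix adds a new critical rule, but the v1.0 checks never exercise it, so everything stays Green.

```python
# Hypothetical system behaviour after the v1.01 hotfix: orders to remote
# regions now attract a surcharge (the critical scenario users reported).

def shipping_fee_cents(region):
    if region == "remote":  # added in v1.01, unknown to the check suite
        return 2500
    return 1000


# The v1.0 check suite: it only knows the scenarios agreed at go-live,
# so it keeps passing (Green) and never notices the new critical rule.

def test_standard_region_shipping_fee():
    assert shipping_fee_cents("metro") == 1000
```

Nothing is “broken” in the suite, which is exactly why the gap goes unnoticed.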

The inevitable automation gap

Some may say: we have the best team in the world, and it takes care of the ongoing system changes related to the automated checks. We communicate and collaborate continuously and very effectively. The perfect match between the system and its automated checks will last their whole lifetime. Really? Then do show the rest of the world how you have succeeded at this.

At this stage, I see an inevitable gap, based on my very limited knowledge and experience of automation in testing. The automated check suite is normally a black box to the business in most cases, despite the attempts of many frameworks and tools. Should the business be trained to understand what is under the hood of the automation in order to avoid the gap? Or should the automated checks remain a black box to the business as long as they are a white box to someone, as long as someone can decode them into business language on request at a given point in the future?
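One common middle ground, sketched below under my own assumptions rather than anything from a particular framework, is to keep the checks white-box to engineers while giving each one a business-readable name and description that someone can read back to the business on request:

```python
# Hypothetical sketch: the check stays technical inside, but its name and
# docstring carry the business-language description that someone could
# decode for the business owner on request.

def free_shipping_applies(order_total_cents, tier):
    """Stand-in for the system rule under check."""
    return tier == "gold" and order_total_cents > 5000


def test_gold_tier_customers_get_free_shipping_over_50_dollars():
    """Critical scenario: gold-tier customers never pay shipping above $50."""
    assert free_shipping_applies(order_total_cents=5500, tier="gold")
```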

Will you continuously invest in your automation maintenance?
Does this gap concern you?
What’s your plan to prevent or minimize the gap?
How often should someone double-check the existence or emergence of the automation gap?
How do you measure the ROI on automation?

Automation is not something you build and forget. It is like a child: you will need to spend time looking after it to help it “be the best it can be”.

 

About the author

Luke Liu

Inspired to become a Kung Fu Master in testing, while having fun on my learning journey.

Committed to delivering better quality solutions via a lean approach, continuous improvement, and continuous learning. Powered by #TesterMindset.
