Before introducing new automation frameworks into their QA functions, organisations must be sure they're doing it for the right reasons. Here, Martin Sutcliffe, Iridium's QA practice lead, busts some major misconceptions around the QA automation approach - so businesses can embrace the benefits with their eyes wide open. 
Myth - Automation testing replaces manual testing 
This oversimplifies a complex reality. While automation has its merits, it doesn't outright replace manual testing. Instead, they complement each other. Automation is excellent for repetitive, time-consuming tasks and regression testing. Manual testing brings the human touch, critical thinking, and exploration. The smart move is integrating both into a robust testing strategy that enhances test coverage and overall software quality. 
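To illustrate where automation shines, a repetitive regression check can be written once and re-run unchanged on every build. A minimal sketch in pytest style; the `calculate_discount` business rule here is an invented example, not from the article:

```python
# Hypothetical business rule under test: 10% discount on totals over 100,
# capped at 50. Invented purely for illustration.
def calculate_discount(total):
    discount = total * 0.10 if total > 100 else 0.0
    return min(discount, 50.0)

# A runner such as pytest collects functions named test_* and executes them
# on every build - the same checks repeat unchanged, which is exactly the
# repetitive regression work automation handles well.
def test_no_discount_below_threshold():
    assert calculate_discount(50) == 0.0

def test_discount_applies():
    assert calculate_discount(200) == 20.0

def test_discount_is_capped():
    assert calculate_discount(1000) == 50.0
```

What this sketch cannot do is judge whether the discount rule *feels* right to a user - that remains manual, exploratory territory.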
Myth - All testing should be automated 
There has been a rapid move towards automation, but it is a misleading notion that you should automate ALL testing. It is far from practical and doesn’t offer the benefits that a well-balanced testing strategy will. Automation is powerful, but it is not a silver bullet, and manual testing offers its own unique benefits, such as: 
· Adaptability to change: 
Manual testing is more adaptable to changes in the application's user interface or functionality. Testers can quickly adjust their approach based on visual cues and intuitive understanding. 
· Exploratory testing: 
Human testers excel in exploratory testing, where creativity, intuition, and adaptability are crucial. They can uncover unexpected issues that automated scripts might miss. 
· Usability testing: 
Manual testing is effective for evaluating the user experience and usability aspects of an application. Testers can provide subjective feedback on the user interface and overall user interaction. 
· Early testing in development: 
In the early stages of development, when the application undergoes frequent changes, manual testing is often more practical and allows for quick feedback on evolving features. 
· Cost-effective for short-term or small-scale projects: 
In some cases, particularly for small-scale projects or short-term efforts, the cost of setting up and maintaining automated tests may outweigh the benefits. Manual testing can be more cost-effective in such scenarios. 
· Complex test scenarios: 
Manual testing is often better suited for complex test scenarios that require a deep understanding of business logic, industry regulations, or specific user workflows. 
· Human judgment and intuition: 
Human testers bring subjective judgment, intuition, and real-world context to testing. They can identify issues that might be challenging to express in terms of automated test scripts. 
· Effective for one-time testing: 
When a particular test case is executed only once or infrequently, the time and effort to automate may not be justified. 
· Test case diversity: 
Manual testing allows for diverse test case execution, especially when dealing with multiple configurations, platforms, or environments. Testers can adapt to various situations during the testing process. 
· Human interaction and feedback: 
Manual testing allows testers to simulate real user interactions, providing valuable insights into the user experience. Testers can also provide immediate feedback on the application's behaviour. 
Automation suits repetitive, stable scenarios. The smart move? Automate wisely – focus on high-impact, repetitive tests. Don't force it where human judgment shines, and appreciate the value that a balanced approach brings. 
Myth - Automation testing finds more defects than manual testing 
It's not about finding more but finding efficiently. Automation excels in repetitive tasks and catching specific issues. However, it might miss nuances that a human tester would catch, especially in exploratory testing. The truth? A balanced approach leveraging both automation and manual testing finds defects most effectively. Don't fall for the numbers game; it's about the right tool for the job. 
Myth - Automation testing is more expensive than manual testing 
Initially, setting up automation may seem pricey as it often requires a significant initial investment in terms of time, resources, and training. According to a report by Gartner, the initial investment for QA automation tools and infrastructure can range from tens of thousands to hundreds of thousands of dollars, depending on the size and complexity of the project*. 
There are also ongoing costs that need to be considered. Forrester Research indicates that maintenance costs can be substantial, sometimes exceeding 50% of total automation costs. This includes the effort needed to update scripts as the application under test changes**. However, these costs do not outweigh the significant long-term gains that kick in: faster execution, cheap repetition of tests, and reduced human error. Indeed, where manual testing's cost escalates with project growth, automated testing scales easily. 
Balance is key. Strategic automation pays off over time, saving costs and boosting efficiency, but it is vital that you are realistic about the benefits. Overestimating the capabilities of the chosen framework can lead to unattainable expectations and disappointment. It is important to clearly define the capabilities and limitations of the automation framework and set realistic goals and expectations to avoid project setbacks and maintain team morale. By knowing what you can - and will - achieve, the initial outlay becomes an investment, not an expense. 
Myth - Test Automation is just test execution 
Test automation is not a mere playback of predefined steps; it involves robust planning, scripting, maintenance, and analysis. It demands a full-circle, strategic approach – from selecting the right tools and continuous integration through to results interpretation and feedback integration. In short, smart, comprehensive automation adds real value to the testing process. 
Myth - Automation testing will solve all your QA problems 
While automation is a potent QA tool, it is not a magic fix. It won't address communication gaps, unclear requirements, or a lack of skilled testers. It's a tool, not a cure-all, and there are several considerations to keep in mind: 
· Initial investment and costs: 
Implementing a new automation framework often requires a significant initial investment in terms of time, resources, and training. Always start with a thorough cost-benefit analysis to justify the investment. Hidden costs, such as additional infrastructure requirements or unforeseen training needs, should be considered. 
· Learning curve for teams: 
Team members may face challenges adapting to the new framework, leading to productivity issues and delays. Provide comprehensive training programs to bridge the skill gap. Consider hiring external experts or allocating extra time for hands-on learning during the transition phase. 
· Compatibility issues: 
The new automation framework may not be fully compatible with existing tools, technologies, or third-party integrations. Conduct thorough compatibility tests and ensure seamless integration with other tools, version control systems, and CI/CD pipelines to avoid disruptions. 
· Unrealistic expectations: 
Overestimating the capabilities of the chosen framework can lead to unrealistic expectations and disappointment. Clearly define the capabilities and limitations of the automation framework. Set realistic goals and expectations to avoid project setbacks and maintain team morale. 
· Scalability challenges: 
The chosen framework may struggle to scale efficiently as the number of test cases and scenarios increases. Conduct scalability tests to ensure the framework can handle the anticipated growth in the number of test cases. Implement parallel execution and optimised resource utilisation for scalability. 
· Maintenance complexity: 
Inadequate design and planning may result in a complex maintenance process, making it challenging to update or modify automated tests. Emphasise modular and maintainable code practices. Regularly review and refactor automation scripts to accommodate changes in the application without major overhauls. 
· Dependency on key personnel: 
Over-reliance on specific individuals for framework implementation and maintenance may pose a risk if these key personnel leave the organisation. Promote knowledge sharing and documentation. Establish best practices to ensure that multiple team members are proficient in the automation framework, reducing dependency on specific individuals. 
· False positives and negatives: 
Inaccurate test results, such as false positives or negatives, can erode confidence in the automation framework. Implement robust validation mechanisms and regularly review and update test scripts to account for changes in the application. Invest in effective debugging tools to quickly identify and address false results. 
· Security and compliance risks: 
Introducing a new automation framework may pose security vulnerabilities or non-compliance with industry standards. Conduct thorough security assessments and ensure the framework aligns with industry-specific compliance requirements. Implement security testing within the automation framework to identify and mitigate risks. 
· Tool abandonment: 
Tools and frameworks can become obsolete or face discontinuation, leading to the need for a sudden switch. Regularly monitor the development and support status of the chosen framework. Have contingency plans in place, including the ability to migrate to a different tool if necessary. 
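The parallel execution mentioned under scalability can be sketched in plain Python: independent test cases run concurrently, so wall-clock time tracks the slowest test rather than the sum of all of them. The test names below are invented for illustration; a real suite would typically lean on a runner feature or plugin instead:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical independent test cases - in practice these would be real
# entries in the suite, each safe to run in isolation.
def check_login():
    return ("login", "pass")

def check_search():
    return ("search", "pass")

def check_checkout():
    return ("checkout", "pass")

def run_suite_in_parallel(tests, workers=4):
    """Run independent tests concurrently; total time scales with the
    slowest individual test, not the sum of all tests."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(lambda test: test(), tests))

results = run_suite_in_parallel([check_login, check_search, check_checkout])
```

The design constraint this exposes is important: tests must be genuinely independent (no shared state or fixed ordering) before parallel execution is safe.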
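One widely used way to achieve the modular, maintainable code practices mentioned under maintenance complexity is the Page Object pattern: each screen's locators and actions live in a single class, so a UI change is fixed in one place rather than across every script. A minimal sketch; the `FakeDriver` is a stand-in for a real browser driver so the example stays self-contained:

```python
class FakeDriver:
    """Stand-in for a real browser driver, purely for illustration."""
    def __init__(self):
        self.fields = {}

    def type(self, locator, text):
        self.fields[locator] = text

    def click(self, locator):
        self.fields["last_click"] = locator


class LoginPage:
    # Locators live in one place: if the UI changes, only this class changes,
    # not every test that logs in.
    USERNAME = "#username"
    PASSWORD = "#password"
    SUBMIT = "#submit"

    def __init__(self, driver):
        self.driver = driver

    def log_in(self, user, password):
        self.driver.type(self.USERNAME, user)
        self.driver.type(self.PASSWORD, password)
        self.driver.click(self.SUBMIT)


driver = FakeDriver()
LoginPage(driver).log_in("alice", "s3cret")
```

Tests then call `log_in()` and never touch locators directly, which is what keeps updates from turning into major overhauls.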
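One common validation mechanism against the false positives and negatives described above is to retry an unstable check a bounded number of times before reporting a defect, so a transient glitch (a slow page load, a momentary network blip) doesn't masquerade as a failure. A minimal sketch with a simulated flaky check:

```python
import time

def retry(check, attempts=3, delay=0.0):
    """Re-run a flaky check up to `attempts` times; report failure only
    if every attempt fails, filtering out transient false failures."""
    last_error = None
    for _ in range(attempts):
        try:
            return check()
        except AssertionError as err:
            last_error = err
            time.sleep(delay)
    raise last_error

# Simulated flaky check: fails on the first two calls, then passes -
# mimicking, say, an element that appears after a slow render.
calls = {"n": 0}

def flaky_check():
    calls["n"] += 1
    assert calls["n"] >= 3, "transient failure"
    return "pass"

result = retry(flaky_check)
```

The retry budget should stay small: if a check needs many retries to pass, that is itself a signal worth investigating, not masking.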
By proactively addressing these risks, organisations can enhance the success of implementing new automation frameworks and streamline their QA functions effectively. Before expecting miracles, ensure a solid QA foundation is in place. Smart automation complements a well-structured QA process but won't replace the need for strategic planning, collaboration, and a skilled QA team. Don't fall for the hype. Automation is part of the puzzle, not the whole picture. 
*Gartner, "Magic Quadrant for Software Test Automation," 2021. 
**Forrester Research, "The Total Economic Impact™ of Tricentis Tosca," 2020. 