Why Test Automation Fails
In Theory and In Practice
Jim Trentadue – Software Quality Consulting Director
May 20th, 2016
Agenda
1. Test Automation Industry recap & current trends
2. Adoption challenges with Test Automation
3. Why are Test Automation adoptions failing
4. Why are Test Automation implementations failing
5. Correcting the Test Automation approach, concepts and applications
6. Session recap
Test Automation Industry:
Recap & Current Trends
Test Automation Industry – recap
Framework evolution:
• Record / Playback
• Structured Testing (invokes more conditions)
• Data-Driven
• Keyword-Driven
• Model / Object-Based
• Actions-Based
• Hybrid (combines two or more of the previous frameworks)

The first Record / Playback tools appeared about 20-25 years ago. Their primary objective was to test desktop applications; some tools could check for broken URL links, but could not dig into the objects. The tester was required to have scripting or programming knowledge to make the tests run effectively, even in Record / Playback. (A minimal sketch of two of these framework styles follows.)
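To make the data-driven and keyword-driven styles concrete, here is a minimal pytest sketch. It is illustrative only, not from the presentation: the `login` function stands in for an application under test.

```python
# Minimal sketch contrasting two framework styles. `login` is a
# stand-in for the application under test.
import pytest

def login(user: str, password: str) -> bool:
    """Fake AUT call: succeeds only for one known account."""
    return user == "alice" and password == "s3cret"

# Data-driven: one scripted flow, many rows of data.
@pytest.mark.parametrize("user,password,expected", [
    ("alice", "s3cret", True),    # valid credentials
    ("alice", "wrong", False),    # bad password
    ("", "", False),              # empty input
])
def test_login_data_driven(user, password, expected):
    assert login(user, password) == expected

# Keyword-driven: test steps are data; a small engine maps keywords to
# actions, so non-programmers can write tests as tables.
def run_keywords(rows):
    last = None
    for keyword, *args in rows:
        if keyword == "login":
            last = login(*args)
        elif keyword == "expect":
            assert last == args[0]
        else:
            raise ValueError(f"unknown keyword: {keyword}")

def test_login_keyword_driven():
    run_keywords([
        ("login", "alice", "s3cret"),
        ("expect", True),
    ])
```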
Test Automation Industry – current trends
• The number of test automation tools has grown to more than 25-30, focused on taking scripting / programming out of the mix for manual testers
• Technical complexities have increased considerably across the platforms and technologies that must be covered
• Manual testers are slow to delve into test automation because almost every commercial and open-source solution requires some degree of coding
• Products are becoming single-niche tools, sometimes not offering a fully integrated quality solution for the SQA organization
Adoption of Test Automation:
Common myths and perceptions
Adoption challenges
The new challenges below have slowed productivity:
• Rapid deployment times to production
• Shifts from a Waterfall SDLC to an Agile methodology
• How Test-Driven Development (TDD) impacts when to automate tests
• How component testing (Database, Data Warehouse, Web-Service or messaging testing) impacts what tests to automate (a minimal component-test sketch follows)
To ensure success, many organizations have employed a Proof of Concept.
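To make "component testing" concrete, here is a minimal sketch of a test that exercises the database layer directly rather than going through the GUI. It uses SQLite from the Python standard library so it is self-contained; a real suite would target the application's own schema. Web-service and messaging tests follow the same pattern: call the interface directly and assert on the response.

```python
# Minimal sketch of a component-level test that exercises the database
# layer directly, below the GUI. The schema here is hypothetical.
import sqlite3

def test_order_total_is_computed_in_the_database():
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)",
                     [(1, 19.99), (1, 5.01)])
    (total,) = conn.execute(
        "SELECT SUM(amount) FROM orders WHERE id = 1").fetchone()
    # The component behaves correctly with no GUI involved at all.
    assert total == 25.0
    conn.close()
```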
Common myths & perceptions
• Test Automation can't be accomplished!
• Significant increase in time and people
• Existing testers will not be needed
• Will serve all testing needs
• Associates' training will have no impact
• Implement with a click of a button
Why is the Test Automation
adoption failing in organizations?
Case Study 1: Management Struggles
Excerpts from a test engineer's interactions with management
(Experiences of Test Automation: Case Studies of Software Test Automation – Graham & Fewster)

"It Must Be Good, I've Already Advertised It"
Management's intention was to reduce testing time for the system test. There was no calculation of our investment; we had to go with it because manual testing was no longer needed. The goal was to automate 100% of all test cases.

"Testers Aren't Programmers"
My manager hired testers not to be programmers, so he did not have more developers in the department. Once told that some tools require heavy scripting / coding, the message was relayed that testers need to do "advanced scripting" rather than "programming".

"Automate Bugs"
An idea by one manager was to automate bugs we received from our customer care center. We were told to read each bug and automate that exact user action. Not knowing the exact function, we hard-coded the user data into our automation. We were automating bugs for versions that were no longer in the field.

"Impress Customers (the Wrong Way)"
My boss had a habit of installing untested beta versions for presentations of the software in front of the customer. He would install unstable versions and then call developers at 5:30am to fix them immediately. We introduced automated smoke tests to ensure good builds (a minimal sketch follows).
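A build-gating smoke test of the kind the engineer describes can be very small. The sketch below is illustrative, not the case study's actual suite; the base URL and paths are hypothetical stand-ins for the application's own health endpoints.

```python
# Minimal smoke tests run against every new build: if these fail, the
# build is rejected before any demo or deeper testing happens.
import urllib.request

BASE_URL = "http://localhost:8080"  # assumed deployment under test

def fetch(path: str) -> int:
    """Return the HTTP status code for a GET request against the build."""
    with urllib.request.urlopen(BASE_URL + path, timeout=5) as resp:
        return resp.status

def test_application_is_up():
    assert fetch("/health") == 200

def test_login_page_renders():
    assert fetch("/login") == 200
```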
Adoption questions
Questions that were not asked, or not answered, grouped by category

Who
• are the right people to create and maintain test cases and object repository information?
• may execute automated tests instead of creating them?
• will be the dedicated go-to person(s) for the tool?
What
• is our organization's definition of automated testing: what it is and what it is not?
• are the test automation objectives?
• is the expected testing coverage sought with test automation?
When
• are testing resources available for this initiative?
• can training happen in relation to the procurement?
• are there project deadlines that compete against test automation?
Where
• will the test automation take place; a dedicated test environment?
• are the associates located? This decides whether training should be remote or onsite
• will a central repository reside?
Why
• start a test automation initiative – what are the specific issues?
• run a vendor POC program; what do we want from this assessment?
• start with a top-down management approach?
How
• do our test automation objectives differ from our overall testing objectives?
• does manual testing fit into a test automation framework?
• will a fully integrated quality suite function?
Adoption Failures
The not-so-best practices & assumptions for adoption

Automates manual test cases
• Manual test cases not written for automation
• Error handling is not considered as part of manual testing
• Modularity isn't a key element of manual test case planning

Must cover 100% of the application
• Testers should just automate the entire regression test library
• Application changes instantly added to the automated library
• Full overnight run with discrepancies logged as application defects

Testers can automate in a release
• Replace manual testing time with automated testing time
• Manual testers can automate their own tests
• A separate initiative is not required; ample time within the project
Why is the Test Automation
implementation failing in organizations?
Case Study 2: Choosing the Wrong Tool

Background for the Case Study
• The goal was to automate testing of major functionality on web development tools for an internet software development company

Pre-existing Automation
• Most functional tests were executed manually. Some dated automation scripts could potentially prove useful if restored

New Tool or Major Maintenance Effort?
• The GUI had gone through major changes for the application. The tool had to be flexible with application changes and easy to maintain

Moving Forward with the Existing Tool?
• Consensus was that the tool required coding, which was difficult for manual testers. Also, the native object recognition provided did not work

What Was Done After the Tool Was Replaced?
• Understand how automated tools use objects, and realize that identification by image was not adequate (see the sketch below)

Conclusion
• Examine solutions that work best with your applications and controls. Maintainability & error handling should be critical

(Experiences of Test Automation: Case Studies of Software Test Automation – Graham & Fewster)
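The object-versus-image distinction in this case study can be shown in a few lines. The sketch below uses Selenium WebDriver as a stand-in for the replacement tool; the URL and element IDs are hypothetical.

```python
# Minimal sketch of object-based identification with Selenium WebDriver.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
driver.get("https://example.test/login")  # assumed application under test

# Object-based: locate controls by stable properties. This keeps working
# when a control moves, is re-skinned, or changes size.
driver.find_element(By.ID, "username").send_keys("alice")
driver.find_element(By.ID, "password").send_keys("s3cret")
driver.find_element(By.ID, "submit").click()

# Image/coordinate-based identification would instead click a screen
# position or match a bitmap, e.g. click(x=412, y=305), which breaks on
# any GUI change -- exactly what this case study ran into.

driver.quit()
```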
Implementation Failures
What could go wrong when trying to use the procured solution

Won't work on my application
• The AUT uses technologies that don't work with the tool
• An attempt to play back my recording doesn't play back at all
• Multiple tools are needed for varied technologies (Desktop, Web, Mobile)

Automate my AUT if it changes?
• Heavy maintenance in re-recording tests for GUI changes
• Only works for regression testing of legacy apps, not new apps
• Can't cope when the app is deployed incrementally; key for Agile

Complex for manual testers
• No time, given project schedules and over-allocation
• Marketed to testers with a development skillset
• Assumes it doesn't require cross-department support
Correcting the Test Automation
approach
Adoption corrections
Changing the mindset concerning test automation

Automates manual test cases → the test plan includes both automated and manual
• Automated test cases differ from manual ones in entry, data & exit criteria
• Error handling is an essential part of automated test cases (sketched below)
• Modularity is how automated test cases are developed

Must cover 100% of the application → objectives aim for the highest ROI
• Start with the regression library, but focus on tests with a high ROI
• Application changes are planned and automated in a maintenance window
• Failure analysis: is the defect in the automated test case or in the AUT?

Testers can automate in a release → separate goals & timelines
• The initial cycle is longer than manual; cycle runtime is reduced afterwards
• Requires training on the tool, approach and best practices
• A separate initiative, with unique objectives defined apart from projects
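A minimal sketch of what "modularity and error handling designed in" can look like, assuming a hypothetical AppClient driver for the AUT. pytest is shown for concreteness; since the presentation advocates code-free tools, read this as the underlying idea rather than a recommended implementation.

```python
# Minimal sketch of a modular automated test case with explicit entry
# criteria, error handling, and exit criteria. `AppClient` and its
# methods are hypothetical stand-ins for the AUT.
import pytest

class AppClient:
    """Stand-in for a driver/API client of the application under test."""
    def is_available(self) -> bool:
        return True
    def create_order(self, amount: float) -> str:
        if amount <= 0:
            raise ValueError("amount must be positive")
        return "ORD-1"
    def cancel_order(self, order_id: str) -> None:
        pass

@pytest.fixture
def app():
    client = AppClient()
    # Entry criteria: if the environment isn't ready, that is a setup
    # failure, not an AUT defect -- skip rather than log a false defect.
    if not client.is_available():
        pytest.skip("AUT not reachable: environment problem, not a defect")
    yield client
    # Exit criteria: leave the system in a known state for the next module.

def test_create_order(app):
    order_id = app.create_order(25.0)
    assert order_id.startswith("ORD-")
    app.cancel_order(order_id)  # modular, reusable step -- not a recording

def test_rejects_invalid_amount(app):
    # Error handling is part of the test design, not an afterthought.
    with pytest.raises(ValueError):
        app.create_order(-1.0)
```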
Case Study 3: Multi-Solution Strategy

Background for the Case Study
• A leading electronic design automation company had a product comprised of over 120 programs. It was GUI-based and ported to all leading industry workstation hardware platforms

Current Testing Activities
• Three distinct test phases: Evaluation (done by the end user), Integration and System
• Integration & System testing were done manually and consumed 38 person-weeks of effort for every software release on every platform

Proposed Solution
• Purchase a solution or build internally
• Hire an outside consultant to validate the tool selection prior to purchase, then train the testing staff
• DECISION: build an in-house tool

Integration test automation (completed in under a year)
• Without automation: 10 person-weeks of manual effort on every platform
• First trial: 2 person-weeks manual plus 1 person-week automated; not everything could be automated

System test automation (automated in two phases: breadth and depth)
• Without automation: 8 person-weeks of manual effort per platform
• First trial: 4 person-weeks automated in total; approximately 79% of tests were automated

Key Benefits
• Test effort per platform cut in half (see the arithmetic below)
• Consistent regression testing
• Less dependence on system SMEs
• Better use of testing resources

(Software Test Automation: Effective Use of Test Execution Tools – Fewster & Graham)
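Using the figures above (per this reconstruction of the slide's columns, so treat the numbers as illustrative rather than the book's exact accounting), the per-platform arithmetic behind "test effort cut in half" works out roughly as follows.

```python
# Rough per-platform effort arithmetic from the figures above.
manual_integration = 10     # person-weeks, without automation
manual_system = 8           # person-weeks, without automation
trial_integration = 2 + 1   # first trial: manual + automated person-weeks
trial_system = 4            # first trial: automated person-weeks

before = manual_integration + manual_system   # 18 person-weeks
after = trial_integration + trial_system      # 7 person-weeks
print(f"{before} -> {after} person-weeks per platform "
      f"({(before - after) / before:.0%} reduction)")
# Prints a ~61% reduction: at least consistent with "cut in half".
```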
Implementation corrections
Changing the approach to put automation into practice
Won't work on my application → a POC will prove the solution
• Develop criteria for vendors to prove their solution works
• Standardize the object naming structure from development; very important (see the locator sketch below)
• Research fully integrated suites; vendor management will thank you!

Automate my AUT if it changes? → understand the object landscape
• Find a solution that does not require re-recording when new controls are added
• Consider beginning your testing effort with automated tests
• Test automation should be a key element in sprint planning

Complex for manual testers → develop a resource plan for all
• Schedule just-in-time training shortly after procurement
• Lay out the resource plan for how every tester contributes
• Testing management can build the IT cross-functional support plan
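A small sketch of why a standardized object naming structure from development matters, again using Selenium WebDriver; the markup convention, URL and IDs are hypothetical.

```python
# Development standard (hypothetical): every interactive control gets a
# stable test ID, e.g. <button data-testid="checkout-submit">Buy</button>.
# Labels, layout and styling may all change; the test ID stays, so the
# automated tests do not need re-recording.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
driver.get("https://example.test/checkout")  # assumed AUT
driver.find_element(
    By.CSS_SELECTOR, "[data-testid='checkout-submit']").click()
driver.quit()
```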
Case Study 4: Automating a Government System
(Experiences of Test Automation: Case Studies of Software Test Automation – Graham & Fewster)

Background for the Case Study
• Speed up Department of Defense (DoD) software delivery. Leading-edge technology was in use, but lengthy testing & certification processes had to complete before software could deploy

Tool requirements:

Can't Be Intrusive to the System Under Test (SUT)
• The tool had to be independent of the SUT; no modules should be added

Must Be OS Independent
• The solution(s) had to work across Windows, Linux, Solaris, Mac, etc.

Must Be Independent of the GUI
• The tool needed to support all languages that the applications were written in

Must Automate Tests for Both Display & Non-Display Interfaces
• The system should be able to handle operations through both the GUI and backend processes

Must Work in a Networked Multicomputer Environment
• The tool needed to test a system of systems: multiple servers, monitors and displays interconnected to form one SUT

Non-Developers Should Be Able to Use the Tool
• One of the main requirements from the DoD. Testers were SMEs in the application, but not software developers writing scripts; a code-free solution was needed

Must Support an Automated Requirements Traceability Matrix
• Needed a framework that allowed test management activities, including documenting a test case in the Automated Test and Re-Test (ATRT) solution, along with a hierarchical breakdown of test cases into test steps and sub-steps (a minimal data-structure sketch follows)

Test Automation ROI
• Solutions selected spanned across technologies
• Reduced test cycles from 5 testers to 1
• The testing department worked on automated test cases, both creation and execution
• Additional processes for Test Case & Test Data Management were now enabled
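A minimal sketch (not the ATRT product's actual model) of the data shape behind an automated requirements traceability matrix with hierarchical test steps; all identifiers are hypothetical.

```python
# Requirements map to test cases; test cases break down hierarchically
# into steps and sub-steps; the matrix inverts that mapping.
from dataclasses import dataclass, field

@dataclass
class TestStep:
    description: str
    substeps: list["TestStep"] = field(default_factory=list)

@dataclass
class TestCase:
    case_id: str
    requirement_ids: list[str]  # which requirements this case verifies
    steps: list[TestStep]

def traceability_matrix(cases: list[TestCase]) -> dict[str, list[str]]:
    """Requirement ID -> the test cases that cover it."""
    matrix: dict[str, list[str]] = {}
    for case in cases:
        for req in case.requirement_ids:
            matrix.setdefault(req, []).append(case.case_id)
    return matrix

cases = [
    TestCase("TC-001", ["REQ-12"], [
        TestStep("Open display console", [TestStep("Verify map loads")]),
        TestStep("Send track update via backend interface"),
    ]),
    TestCase("TC-002", ["REQ-12", "REQ-30"], [TestStep("Replay message log")]),
]
print(traceability_matrix(cases))
# {'REQ-12': ['TC-001', 'TC-002'], 'REQ-30': ['TC-002']}
```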
Presentation Recap
Reviewing the main points of the presentation
• The automation industry focused on desktop applications initially, but must now cover multi-technology stacks across many platforms
• The primary skillset for automation used to be programming, but solutions are now available that offer code-free test automation
• Common perceptions and myths will exist; review the list of questions to ask in order to drive your strategy
• Present the real-life case studies to management and ensure the same mistakes do not occur during your research and implementation
• Review the common failures in theory and in implementation; check whether any are problematic in your organization
• Take the corrective action for each failure accordingly; incorporate test automation as part of your overall test strategy
Jim Trentadue – Software Quality Consulting Director