Architecture of automated test cases for legacy applications
MYKHAYLO VASYLCHENKO
Agenda
● Projects
● Problems
● Test cases architecture
● Outcome
● Questions
Projects
● Java
● Linux remote servers
● JBOSS
● Oracle Weblogic
● Oracle Coherence
● DB
● Web UI
● SOAP/http
● REST/http
Technologies
• Legacy applications in diverse ecosystem
• Complex pipelines
Legacy Apps
Testing tool - CA LISA
● Started with 6 people in early 2014
● 11 people by the end of 2014
The Team
21 projects during 2014
Around 3 test cases/man/day
Our Commitment
Covered only 3 projects (from 21)
4 Months Later...
Problems
Duplicates
Appending
Dependence
3 Major Problems
Big automation team
● Code duplication
● Support issues
Duplicates
Extending for reuse...
● Huge methods
● Hard to maintain
Appending
Beginning vs. appending (diagram)
The team is dependent on:
- quality of manual test cases
- environment
- developers
Manual QA – USA, Vietnam, India
Environment – USA
Developers – USA
Deployment and Automation – Ukraine
● Hard troubleshooting
● Communication takes ~80% of time
Dependence
Test case: Search for “User Name” and “all of these”
1. Set valid “user name” to “find” field
2. Select “all of these” condition
3. Select “User Name” field
4. Click “Search” button
Expected result:
Grid with correct data appears
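The four steps and the expected check can be sketched as an automated script. Here `Ui` is a hypothetical in-memory stand-in for the real web UI driver; the team actually used CA LISA, and none of these names are its API.

```java
// Minimal sketch of the manual test case above as an automated check.
// "Ui" is a hypothetical stand-in for the real UI driver, not CA LISA's API.
public class SearchUserNameTest {
    static class Ui {
        String find, condition, field;
        void setFind(String value)         { find = value; }
        void selectCondition(String value) { condition = value; }
        void selectField(String value)     { field = value; }
        // In this toy model, the grid "appears" once all inputs were provided.
        boolean clickSearch() { return find != null && condition != null && field != null; }
    }

    public static boolean run() {
        Ui ui = new Ui();
        ui.setFind("john");                   // 1. Set valid "user name" to "find" field
        ui.selectCondition("all of these");   // 2. Select "all of these" condition
        ui.selectField("User Name");          // 3. Select "User Name" field
        return ui.clickSearch();              // 4. Click "Search"; expect grid to appear
    }

    public static void main(String[] args) {
        System.out.println(run() ? "PASSED" : "FAILED");
    }
}
```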
Dependence
The test failed. Why?
Dependence
The test failed because the actual SQL query differed from the expected one.
Dependence
Legacy Apps
● Missing relevant documentation
● Missing key developers
● Missing momentum
Test cases architecture
False Start
Starting from test cases was the false start; start from analysis instead.
Analysis
Test ID | Test Name
001     | Verify search for “User Name” field and “all of these” condition
002     | Verify search for “User Name” field and “starts from” condition
003     | Verify search for “User Name” field and “ends with” condition
004     | Verify search for “City” field and “all of these” condition
005     | Verify search for “City” field and “starts from” condition
…       | …
n       | Verify search for “State” field and “equals” condition
• We know functionality
• We know different scenarios
• We know common actions
For each future method, we know what parameters it should take and what actions it should perform.
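The analysis boils down to a cross-product of searchable fields and match conditions. A sketch that derives the test list from that matrix (the field and condition names come from the table above; the IDs and method name are illustrative):

```java
import java.util.ArrayList;
import java.util.List;

public class SearchTestMatrix {
    // Build one test name per (field, condition) pair, numbered like the table.
    public static List<String> generate(String[] fields, String[] conditions) {
        List<String> names = new ArrayList<>();
        int id = 1;
        for (String field : fields)
            for (String condition : conditions)
                names.add(String.format("%03d Verify search for \"%s\" field and \"%s\" condition",
                        id++, field, condition));
        return names;
    }

    public static void main(String[] args) {
        String[] fields = {"User Name", "City", "State"};
        String[] conditions = {"all of these", "starts from", "ends with", "equals"};
        generate(fields, conditions).forEach(System.out::println);
    }
}
```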
Encapsulation
Create mock-ups
Encapsulation
● Get input and call method
● Actions are "hidden" in methods
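The encapsulation idea: one shared method hides the common actions, so each test case only supplies input data. A minimal sketch over an in-memory list (the condition names come from the slides; the method and its matching rules are assumptions, not the team's actual assets):

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class SearchActions {
    // One parameterized method per feature; the actions are "hidden" inside it.
    public static List<String> search(List<String> rows, String condition, String term) {
        switch (condition) {
            case "starts from":
                return rows.stream().filter(r -> r.startsWith(term)).collect(Collectors.toList());
            case "ends with":
                return rows.stream().filter(r -> r.endsWith(term)).collect(Collectors.toList());
            case "equals":
                return rows.stream().filter(r -> r.equals(term)).collect(Collectors.toList());
            case "all of these": // every word of the term must appear in the row
                return rows.stream()
                        .filter(r -> Arrays.stream(term.split("\\s+")).allMatch(r::contains))
                        .collect(Collectors.toList());
            default:
                throw new IllegalArgumentException("unknown condition: " + condition);
        }
    }

    public static void main(String[] args) {
        List<String> users = List.of("john smith", "jane smith", "john doe");
        System.out.println(search(users, "starts from", "john"));
    }
}
```

Each test case then reduces to one call with its own data, which is what removes the duplication.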
Test Cases
(diagram: Test 1 … Test N each call into shared methods)
Real test cases + mocked methods = 80% done
TDD Style
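In this TDD-style workflow, test cases are written first against mocked methods; every mock fails until it is filled with real code. A sketch, with illustrative names:

```java
// "Real test cases + mocked methods = 80% done": the test exists and runs,
// but the mock throws until real code replaces it, so the test starts FAILED.
public class MockedMethods {
    // Mock-up: signature agreed during analysis, body not implemented yet.
    static java.util.List<String> searchByUserName(String term, String condition) {
        throw new UnsupportedOperationException("mock-up: not implemented yet");
    }

    public static String runSearchTest() {
        try {
            searchByUserName("john", "all of these");
            return "PASSED";
        } catch (UnsupportedOperationException e) {
            return "FAILED"; // expected until the mock is filled with real code
        }
    }

    public static void main(String[] args) {
        System.out.println(runSearchTest());
    }
}
```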
Results
PASSED
FAILED
Fill mock-ups with real code
Coding
Mock up → Real automation (diagram)
• Run tests
• Review results
Ready
Outcome
No duplicates when applying the architectural approach
Duplicates — before vs. after
● Know your features
● Implement optimal methods from scratch
Appending
Appending — before vs. after
Dependence
• Test case fails but is automated
• The reason is documented
• Resolved when feedback is provided
Postponed Troubleshooting
Never stop, and never investigate a problem closely while automating; troubleshooting is postponed.
Aggressive Automation
21 projects are covered by automation
5300 test cases automated
65% of test cases passed
35% failed for reasons outside the automation team's control
Outcome
Questions?
Mykhaylo Vasylchenko
[email protected]