BIO PRESENTATION
International Conference On Software Testing Analysis & Review
October 27-31, 2003 San Jose, CA USA
T9
Thursday, October 30, 2003 11:15 AM
THE PROCESS OF EFFECTING
LASTING CHANGE: MOVING TO A
HIGHLY STRUCTURED TESTING
MODEL
Jon Harader & Vivek Bhatia Wells Fargo Internet Services Group QA
Jon Harader has over 19 years of experience designing, developing, integrating, delivering, and testing solutions within the Fortune 50 Process, Mass Transportation, and Financial Services industries. He is also highly skilled in sound people management techniques, project management, process change leadership, and continuous quality improvement techniques. In addition, he has held a wide range of positions that have yielded his experience base: Programmer/Analyst, Systems Analyst, Supervisor, Project/Implementation Manager, and IT Management Staff Consultant at Chevron; Senior/Lead Design Engineer and Product Line Director at a small startup company, Spear Technologies; and most recently, Manager of QA Technical Services at Wells Fargo.
Vivek Bhatia has 13 years of experience in implementing and maintaining enterprise technology solutions. He has spent the last 4 years as a “hands-on” VP/Executive Director, actively involved in daily team management, as well as working with executive committees to set future strategy and goals. He has a balance of technical and managerial experience, with 8 years in project and program management, and 5 years as a DBA and application development team lead. He also has well-rounded and brand-name industry experience, with 4.5 years in financial services, including Wells Fargo, Fannie Mae, ABN AMRO, and NASD; 3 years in retail, including Nike and Estée Lauder; 3.5 years in government, including the FAA, DOJ, and Navy; and 2 years in internal professional services consulting. He is well versed in formal project management, having co-written Deloitte Consulting’s Project Management methodology based on the internationally accepted PMBOK model published by the Project Management Institute, and then having solely written the Project Management methodology and tools for Fort Point Partners Consulting, which generated an immediate 50 percent decrease in costs for future projects while providing consistent on-time delivery.
10/30/03 StarWest 2003 1
The Process of Effecting Lasting Change
Moving to a Highly Structured Testing Model
Presented By: Vivek Bhatia & Jon Harader
Wells Fargo Internet Services Group QA
What’s on Tap for Today
Context
Starting Benchmarks
Vision
Planning
Execution
Mid Project Recycle
Parting Wisdom
Context: Business
Internet Financial Service Applications @ www.wellsfargo.com
Online Banking
Online Application Process
Bill Payment
Brokerage
Etc.
Context: Internet Services Group
.COM arm of Wells Fargo that supports the Internet Channel
Started out as an alternative customer channel
In the process of moving to a more mature organization
Context: Business Scope
Vast array of functions from 7 lines of business
Over 46 unique products
Internet customer base (approx):
4.2M Online Banking customers
1.3M Bill Manager customers
Internet channel is the Bank’s largest “store”
Context: Technical Scope
Several thousand web pages
Supported by:
Large number of web servers
Large number of application servers
Multiple data centers
Several internal SORs
Several 3rd-party end points
Context: ISG/QA Group
Functional Testing
Manual
Automated
Performance Testing
QA-Technical Services
[Org chart: the QA functional testing groups cover Banking, Brokerage, Bill Manager, and the Public Site]
Starting Benchmark: Mgmt Challenge
Unstructured Test Case Management System
Dept was person-centric, not process-centric
Testing not approached from a functional perspective – hard to quantify and prove coverage
Starting Benchmark: Mgmt Challenge
Each QA functional testing area (specific application) had its own standards and processes
No comparison criteria across testing groups
Difficult to determine size or pattern of data issues
Starting Benchmark: Staff Challenge
Staff assigned to test specific applications
App-specific standards and processes resulted in a loss of efficiency and made it difficult to predict schedules when a non-seasoned person was assigned
Data issues resulted in considerable manual effort
Starting Benchmark: Tech Challenge
Multiple undocumented automation methodologies
No automated linkage between test case specification and automated test case code
Non-database-driven automation framework
What do we do now?
Vision: Management
Plot entire QA dept on CMM model as a single entity, and upgrade as needed
Level 2: Define and document new core processes for the entire dept
Level 3: Standardize across groups, ensure consistency
Level 4: Define metrics and measure inter- and intra-group progress
Level 5: Optimize metrics
Extract and manage knowledge into TCMS
Contain and reduce overall costs
Vision: How to Improve our CMM level
Problem: Unstructured Test Case Management System.
Solution: Buy [not build] and deploy a highly structured, database-driven Test Case Management System.

Problem: Dept was person-centric, not process-centric.
Solution: Define and document a standardized, best-of-breed process as the core, based on the best of each individual process, and enforce where appropriate in the test management tool.

Problem: Testing not approached from a functional perspective – hard to quantify and prove coverage.
Solution: Document test cases in the test management tool and deploy using functional test case categorization.
Vision: How to Improve our CMM level
Problem: Each QA functional testing area has its own set of standards and processes.
Solution: Create a core set of standards and processes to deploy across all groups.

Problem: No comparison criteria across testing groups.
Solution: Common, core standards and processes allow metrics to be generated, managed, and optimized.

Problem: No sense of size or pattern of data issues.
Solution: Integrate the automation framework with the test management tool to track and trend test results.
Vision: Staff
Consistent guidance and approach
Unified management team with common goals and objectives
Predictable work load
Vision: How to Help the Staff
Problem: App-specific standards and processes result in a loss of efficiency and make it difficult to predict schedules.
Solution: Common standards and processes across groups allow the automation and regression staff to work efficiently, and allow everyone to smooth out the work week.

Problem: Staff are assigned to specific applications.
Solution: Consistent guidance and common standards and processes across groups allow individuals to move laterally and vertically.

Problem: Data issues result in excessive manual effort.
Solution: A database-driven automation framework, integrated with a test management tool and data SOR, significantly reduces the number of data issues and the time required to solve any problems.
Vision: Technical
[Architecture diagram: the TCMS and SORs feed a central Data Repository via a data transfer mechanism; the Automation Framework draws on the Automation Code, a Test Data db, and the Applications Under Test, and writes to a Regression Results db]
Vision: Technical Solution
Problem: No automated linkage between test case specification and automated test case.
Solution: Store test case specs in attributes in TCMS, then create a db-driven framework that retrieves specs at runtime from TCMS.

Problem: Multiple undocumented automation methodologies.
Solution: Use a single, well-documented, and tightly controlled methodology and framework across all groups.

Problem: Non-database-driven automation framework.
Solution: Integrate all systems via a back-end database, and make sure the framework can link into the systems.
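Retrieving specs at runtime from the TCMS is the heart of a data-driven framework: one generic script, with the test-specific details held as database attributes. The sketch below is a minimal illustration of that idea, not the actual Wells Fargo implementation; the `test_cases` table, its columns, and the toy application under test are all hypothetical, with an in-memory SQLite database standing in for the TCMS back end.

```python
import sqlite3

# Stand-in for the TCMS back-end database: test case specs live in
# attributes (columns), not hard-coded inside the automation scripts.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE test_cases (
        case_id   TEXT PRIMARY KEY,
        func_area TEXT,   -- position in the functional tree
        input_val TEXT,   -- data the generic script should drive in
        expected  TEXT    -- expected-result attribute
    )""")
conn.execute(
    "INSERT INTO test_cases VALUES ('TC-001', 'Banking/Login', 'jdoe', 'Welcome')")
conn.commit()

def run_case(case_id, app):
    """Retrieve the spec at runtime and drive a generic script with it."""
    input_val, expected = conn.execute(
        "SELECT input_val, expected FROM test_cases WHERE case_id = ?",
        (case_id,)).fetchone()
    actual = app(input_val)          # one generic, data-driven script
    return "PASS" if actual == expected else "FAIL"

# Toy application under test: always greets with "Welcome".
result = run_case("TC-001", lambda user: "Welcome")
print(result)  # PASS
```

Because the spec is fetched at execution time, editing the attribute in the TCMS changes the automated test with no code change, which is the linkage the slide calls for.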
Planning: Main & Peripheral projects
Implement structured TCMS
Db-driven automation framework:
1 db for automated test case data
1 db for base state for test case data
1 db for regression results
Deploy a functional tree
Create core cross-dept test case specification standards and processes
Re-engineer cross-dept workflow to match core standards and processes
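The three-database split in the plan can be made concrete with a toy sketch. This is illustrative only: in-memory SQLite connections stand in for the real databases, and the `run_results` schema and sample rows are invented for the example.

```python
import sqlite3

# One connection per database in the planned framework; all names here
# are illustrative, not the production schema.
test_data  = sqlite3.connect(":memory:")   # 1. automated test case data
base_state = sqlite3.connect(":memory:")   # 2. base state for test case data
results    = sqlite3.connect(":memory:")   # 3. regression results

results.execute("""
    CREATE TABLE run_results (
        case_id  TEXT,
        run_date TEXT,
        outcome  TEXT   -- PASS / FAIL
    )""")
results.executemany(
    "INSERT INTO run_results VALUES (?, ?, ?)",
    [("TC-001", "2003-10-01", "PASS"),
     ("TC-001", "2003-10-08", "PASS"),
     ("TC-002", "2003-10-08", "FAIL")])

# With results in a database, tracking and trending becomes a query
# instead of a manual spreadsheet exercise.
(passed, total), = results.execute(
    "SELECT SUM(outcome = 'PASS'), COUNT(*) FROM run_results")
print(f"pass rate: {passed}/{total}")  # pass rate: 2/3
```

Keeping base state separate from live test case data is what lets the framework restore known-good data between runs instead of hand-repairing it.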
Planning: Anticipated Paradigm Shifts
Standardization of processes and test case specification templates within and across departments
Functional test categorization and management
Test case ownership
Test case maintenance
Highly structured and controlled input of data
Planning: Executive and Line Mgmt Buy-In
For executives, we focused on:
The # of people and $$ needed to support the current writing and coding paradigm
The # of people and time needed to run the test cases
The # of hours/week needed to solve problems that shouldn’t have occurred in the first place [i.e., data & process]
For line management, we focused on:
The # of hours/week their people spent communicating test-case-specific information to other groups
Their ability to quickly determine coverage and analyze results by functional area
The reduced # of hours required to code and run test cases for their group
The # of hours/week needed to solve problems that shouldn’t have occurred in the first place [i.e., data & process]
Execution: Implementation Process
Proof of concept (Q1):
- Migrate 2-3 projects from Word into TCMS to assist with dept-specific attribute definition
- Define functional tree
- Produce 1st draft of core processes, test case specification standards, and attributes
Pilot stage (Q2):
- Pilot 2-3 projects, directly entering specs into TCMS; modify as needed to meet usability and functional requirements
- Iterate core & dept-specific processes, standards, and attributes
Rollout stage (Q3):
- Create training materials; roll out to department using internal champions
[Timeline graphic: QA-Tech Svcs runs the proof of concept, pilots with Testing Group 1, then rolls out to Testing Group 2 and beyond]
Execution: Proof of Concept Stage
Review existing test case organization & prepare strawman functional tree
Set up TCMS & Import/load sample test cases into tree & load all attributes
Review existing test case docs and dept template & prepare strawman DCF & reports
Dept SMEs and Mgr review strawman DCF, tree, & reports and approve for pilot
Dept Mgr picks one project to serve as the pilot, in which an analyst will enter info
(The steps above span Weeks 1-4.)
Challenges:
- Getting consensus on the functional tree definition and extension
- Dept-specific custom field identification [DCF]
- Determining correct level of data and test case detail
Advice:
- Keep this as short as possible, but do enough due diligence to make sure you do your homework and that the pilot is successful.
Goal: Make sure TCMS works before putting it in the critical path where it could impact departmental project deadlines and budget.
Execution: Pilot Stage
Analyst uses TCMS to enter & maintain test case info for the life of 1 project
Analyst finds issues with defined DCFs or tree and reports to implementor
Compare to cross-project/cross-dept rqmts and determine whether to fix or escalate to dept SME or TCMS steering committee as appropriate
Either the fix is incorporated or the Analyst is told to accept it.
1 full project lifecycle or length of time to purge issues list
Post-mortem review by SMEs, Dept Mgr, and QATS to approve for full rollout to dept.
Challenges:
- Creation of single standard, workflow, process, and reports to fit entire testing group
- Conflicting goals of adhering to new tree structure vs completing project
- Miscommunication/misunderstanding of what’s a given and what’s possible to change during pilot
Advice:
- Be patient, go slowly, and focus on the usability of the end result. If you build a system that is more usable, better integrated, and more robust than the current state, people will naturally come around.
2 weeks
Goal: Start slow and do one project, and use lessons learned to improve the process, standards, and usability
Execution: Rollout Stage
Challenges:
- Staff need to digest changes and extrapolate what this means to them and how they work
Advice:
- Create customized training materials for each different application, so people can review and understand what this means to them
- Do NOT have a single overview session for the department; any concerns that don’t come up because the session has too many people will result in unresolved angst
Create a core training guide for department, with specific customized examples for each functional or application testing group that will use system
Conduct individualized 2 hour training session for each functional or application group, led by internal champion, with all pilot participants present to act as teacher’s assistants
After 1-2 days, active and regular follow-up with each person to make sure they know how to use the system
Mid Project Recycle
Top management buy-in
Standards, workflow, and process vary more than expected
Line management bought into the theory but didn’t fully realize the magnitude of the impact
Usability is king
Technology is a foreign language
Parting Wisdom: Managing the Change While Getting The Wash Out
Pick a mix of ground troops & management for the pilot that can accurately speak for the group
Let the pilot take time if need be, but make sure participants feel like they’re building something that is both much more usable and meets their needs
Be flexible but stand by the guiding principles
Parting Wisdom: Top 10 Challenges
1. People:
   1. Acceptance of the constraints inherent in a highly structured testing model
   2. Acceptance of functional tree and extensions
   3. Acceptance of structured test definition and execution
   4. Conflicting goals of adhering to new structure vs. completing projects on time (getting wash out)
2. Process:
   1. Creating a common, core process and standards across groups
   2. Building consensus on dept-specific customized fields
   3. Level of required test case and data detail documentation
   4. Miscommunication/misunderstanding of what pilot participants can take as a given, and what they can change
3. Technology:
   1. Out-of-box reports didn’t match needs
   2. TCMS UI is vastly different from Word or Excel, and not really customizable
Parting Wisdom: Advice
Get buy-in from key stakeholders before and throughout the project
Change is hard
Communicate, Communicate, Communicate
Use an internal person for the heavy lifting and main work