Posted on 13-Sep-2014
SOFTWARE TESTING PROCESS, TECHNOLOGY & TREND
April 2013
KMS Technology - http://kms-technology.com
QA Symphony - http://www.qasymphony.com
AGENDA
• KMS Technology Overview 10’
• Software Testing Process & Trends 50’
• Software Testing Estimation 20’
• Break 15’
• Automation Testing & Tools 60’
• Future of Software Testing 20’
• Q&A 20’
3
© 2012 KMS Technology
KMS TECHNOLOGY OVERVIEW
Vu Pham
KMS TECHNOLOGY OVERVIEW
5
US Company / Passionate Global Workforce
• 400 Resources & Growing in Vietnam and the US
• 160 Testers ~ 50% of Workforce
• Proven Leadership Team
World-Class Infrastructure
• Built for ISO 27001, Planned Certification in 2013
Best-in-Class SDLC Practices
• CMMI and Agile focus
• QASymphony – Commercial Agile Testing Solutions recognized by Forrester, with over 4,500 users
Best Clients – Raving Fans
• 100% Referenceable and Ecstatic
• 100% in long-term dedicated teams
KMS SOFTWARE TESTING SERVICES
KMS Testing Services:
• Testing Consulting Services
• Life-cycle Testing Services
• Automation Testing Services
• Performance & Load Testing Services
• Mobile and Specialty Testing Services

Testing Tools: Proprietary Tools • Commercial Tools • Open-source Tools • Automation & Performance Testing Frameworks
Test Processes: Process Assessment • Best Practice Implementation • Continuous Process Improvement • Quality and Project Management Metrics
Flexible Staffing Options • Streamlined Processes & Frameworks • Tools & Automation • Strategic Solutions & Best Practices
Testing lifecycle: Test Planning & Estimation → Test Design & Implementation → Test Execution → QA-Metrics-Driven Monitoring → QA-Metrics-Driven Process Improvements
6
OTHER KMS SERVICES
7
APPLICATION DEVELOPMENT
• J2EE and .NET expertise
• Full lifecycle Product Development
• Application modification and customization

APPLICATION SUPPORT
• Defect resolution and on-going maintenance of existing applications

APPLICATION REENGINEERING
• Re-engineer and migrate to a different technology or platform, such as SaaS or Mobile

DATA WAREHOUSE / BUSINESS INTELLIGENCE
• Develop and deploy Data Warehouse solutions
• Data migration services
• Report writing services

MOBILE DEVELOPMENT
• Apple iOS, Android SDK, and Windows 8
• Mobile gaming
• Enterprise mobile apps
SOFTWARE TESTING PROCESS & TRENDS
Vu Pham
AGENDA
• Testing Process Evolution
• Fundamental Testing Process
• Components of Testing Process Framework
• Best Practices in Testing
9
DEVELOPMENT PROCESS EVOLUTION
60s: Waterfall → 70s: V-Model → 80s: RUP → 00s: Agile
10
DEVELOPMENT PROCESS EVOLUTION (CONT.)
11
Waterfall
• Advantages: simple model and easy to manage; applicable for small software
• Disadvantages: "Big Design Up Front"; defects detected in late phases; high amounts of risk and uncertainty

V-Model
• Advantages: early testing involvement; clear relationship between test phases and development phases
• Disadvantages: still possesses the limitations of a sequential model; requires a high amount of documentation; duplication of testing effort

RUP
• Advantages: risk and uncertainty are managed; testing activities and process are managed
• Disadvantages: heavy documentation; late customer involvement – only at UAT

Agile
• Advantages: adaptable to changes; early client involvement avoids unrealistic requirements; avoids spending time on useless activities
• Disadvantages: requires highly capable people; needs a representative from the client; problems scaling up the architecture
SO HOW HAS TESTING CHANGED?

60s–80s: Nice To Have
• Black-box testing • System testing • Functional testing • Part-time tester

90s: Should Have
• Grey-box testing • System/Integration testing • Functional testing • Full-time tester

00s: Must Have
• White-box testing • System-to-system testing • Non-functional testing • Fit-for-Use • Professional tester

12
AGENDA
• Testing Process Evolution
• Fundamental Testing Process
• Components of Testing Process Framework
• Best Practices in Testing
13
PHASES IN TESTING PROCESS
14
COMPONENTS OF TESTING PROCESS

Templates
• TM - Test Plan Template
• TM - Test Strategy Template
• TM - Test Case Template
• TM - Test Estimation Template
• TM - Test Metrics Dashboard Template
• TM - Defect Tracking Report Template
• TM - Requirement to Test TM Template
• TM - Test Daily / Weekly / Summary Report Template

Checklists
• CK - Test Readiness Checklist
• CK - Test Plan Review Checklist
• CK - Test Case Review Checklist
• CK - User Acceptance Test Checklist …

Guidelines
• GD - Defect Tracking Guidelines
• GD - Test Metrics Guidelines
• GD - KPI Metrics Guidelines
• GD - Test Estimation Guidelines
• GD - User Acceptance Test Guidelines

Process
• PR - Testing Process (Detail)
• PR - Testing Process Diagram (Xmind)

15
RELATIONSHIP WITH OTHER PROCESSES
The testing process relates to: Requirement, Design, Implementation, Deployment, Software Quality Assurance, Risk Management, Project Management, and Configuration Management (CM).
16
AGENDA
• Testing Process Evolution
• Fundamental Testing Process
• Components of Testing Process Framework
• Best Practices in Testing
17
WHAT ELSE DO WE NEED FOR THE PROCESS?
Plan Test
Design Test
Execute Test
Close Test
18
Actual testing needs more than just the fundamental process:
• Solutions
• Best Practices
• Standards
• Tools
And more, to become a “Test Center of Excellence”
TESTING CENTER OF EXCELLENCE
Test Solutions
• Automation Testing
• Performance Testing
• Mobile Testing
• Specialty Testing

Best Practices
• Process Assessment
• Testing Estimation
• Continuous Process Improvement
• Exploratory/Risk-based Testing

• Quality Policy
• Guidelines & Templates
• Fundamental Testing Process (Plan Test → Design Test → Execute Test → Close Test)
• Quality Metrics & Standards
19
TCoE = Processes + Practices + Solutions
WHY TEST SOLUTIONS?
20
About the Client
Clearleap was the first data-streaming company to offer a complete platform that makes “TV everywhere” possible.

Business Challenges
• Simulate a high volume of concurrent users (100,000+)
• Complete within a tight schedule
• Limited budget for tools
KMS’s Solutions
• Tool Evaluation: Execute a proof of concept to evaluate both commercial and open source tools
• Planning: Determine a test strategy and approaches
• Test Design and Development: Design and develop a scalable load-testing architecture
• Execution and Reporting: Perform load testing and analyze/report test results
Achievements
• Developed a scalable solution based on JMeter
• Significantly reduced the cost of testing and increased ROI
• Found critical performance issues
WHY TEST SOLUTIONS? (CONT.)
• It takes months to build up a solution from scratch
• Cost of commercial tools vs. open-source tools
• Effective solutions differentiate us from other vendors
Typical Testing Solutions:
– Automation testing (web, desktop, mobile)
– Performance/Load Testing
– Security Testing
– Database/ETL Testing …
21
WHY BEST PRACTICES?
22
About the Client
A global company supporting clinical trials in 67 countries. The Client offers services that include behavioral science, information technology, and clinical research.
Business Challenges
• 100% on-time delivery with zero critical bugs
• Complicated paper process following FDA regulations
• Various testing platforms for both mobile devices and desktop

KMS’s Solution
• Process Establishment: Identify gaps in the current process; leverage state-of-the-art practices
• Process Improvement: Define and measure performance/quality metrics
• Lifecycle Testing: Perform all lifecycle testing activities
• Test Automation: Develop an automation framework to shorten the test cycle

Achievements
• New process reduced testing effort by 60%
• No critical defects identified during one year of engagement
• Moved the paper-based process to a test management system, opening a new trend in the clinical-trial industry
WHY BEST PRACTICES? (CONT.)
23
• Best practices improve the outcome of activities
• A best practice has proven its effectiveness
• The more practices we use, the higher our maturity
Typical Testing Best Practices:
– Reviews and Lessons Learned
– Root Cause Analysis
– Risk-based/Exploratory Testing
– Estimation Method, ROI Model
– Quality Metric Dashboard
AGENDA
• Testing Process Evolution
• Fundamental Testing Process
• Components of Testing Process Framework
• Best Practices in Testing
24
Definition: CPI is an ongoing effort to improve the quality of products, services, or processes.
In software testing, CPI seeks improvement in:
• Quality
• Productivity
• Cost of Quality
• Time to Market …
CONTINUOUS PROCESS IMPROVEMENT
25
Assess
Plan Implement
Evaluate
Metrics are “standards of measurement by which efficiency, performance, progress, or quality of a plan, process, project or product can be assessed” (Wikipedia), with the aim of supporting continuous improvement.

• Three metric categories in practice:
– Product Quality Metrics – how good the overall quality of the product is
– Process Effectiveness Metrics – how well the delivery processes perform
– Testing and Test Automation Metrics – detailed status of testing activities and test outcomes
QUALITY METRICS
26
Product Quality Metrics
• Defects by Status
• Open Defects by Severity
• Open Defects by Severity & Functional Area
• Open Defects by Severity & Release
• Open Defect Aging …

Process Effectiveness Metrics
• Defect Identification in Pre-Prod / Prod
• Weekly Defect Rates per Environment
• Defect Escape Ratio
• Defects by Phase Found / Functional Area
• Defects by Origin / Functional Area …

Testing Metrics
• Test Coverage Planning
• Execution Status / Execution Rate by Functional Area/Cycle
• Defect Rejection Ratio
• Test Productivity …

Test Automation Metrics
• Percent Automatable
• Automation Progress
• Percent of Automated Testing Coverage …
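As an illustration only (not part of the original deck), a minimal sketch of how two of these metrics could be computed; the record fields and numbers are hypothetical:

```python
# Hypothetical defect records; the field names are assumptions for illustration.
defects = [
    {"id": 1, "found_in": "pre-prod", "severity": "critical"},
    {"id": 2, "found_in": "prod", "severity": "major"},
    {"id": 3, "found_in": "pre-prod", "severity": "minor"},
    {"id": 4, "found_in": "pre-prod", "severity": "major"},
]

def defect_escape_ratio(defects):
    """Share of all defects that escaped to production."""
    escaped = sum(1 for d in defects if d["found_in"] == "prod")
    return escaped / len(defects)

def execution_rate(executed, planned):
    """Share of planned test cases actually executed."""
    return executed / planned

print(defect_escape_ratio(defects))  # 0.25
print(execution_rate(180, 200))      # 0.9
```

In practice these numbers would come from the defect tracking and test management systems rather than hard-coded lists.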
QUALITY METRICS (CONT.)
27
Definition: Risk-based testing is a testing method that relies on identified risks to
– determine the “right level” of quality
– prioritize the tests and testing effort
– focus on the most important testing areas first
with the aim of being clear about the current quality status and getting the best return by the time testing completes.
RISK-BASED TESTING
28
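A minimal sketch of risk-based prioritization, under the common assumption of a likelihood × impact risk score; the area names and scores are made up:

```python
# Illustrative only: order test areas by a simple risk score
# (likelihood x impact) so the riskiest areas are tested first.
areas = [
    {"name": "payments",  "likelihood": 4, "impact": 5},
    {"name": "reporting", "likelihood": 2, "impact": 3},
    {"name": "login",     "likelihood": 3, "impact": 4},
]

def risk_score(area):
    return area["likelihood"] * area["impact"]

prioritized = sorted(areas, key=risk_score, reverse=True)
print([a["name"] for a in prioritized])  # ['payments', 'login', 'reporting']
```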
EXPLORATORY TESTING
29
“A style of testing in which you explore the software while simultaneously designing and executing tests, using feedback from the last test to inform the next.” – Elisabeth Hendrickson

This type of testing helps:
• Discover unknown and undetected bugs
• Testers learn new methods and test strategies, and think outside the box
WHAT IS GOOD TESTING PROCESS?
30
WHAT IS GOOD TESTING PROCESS? (CONT.)
31
1. Quality Gate/Check Points
2. Peer Review
3. Metrics-driven Management
4. Root Cause Analysis
5. Defect Prevention
CHALLENGES IN ADOPTING NEW PROCESS
32
1. Fear of change
2. Lack of management support
3. Lack of supporting tools/solutions
4. Not a long-term solution
5. Takes time to bring value
SOFTWARE TESTING ESTIMATION
Vu Pham
AGENDA
• Importance of Software Estimation
• qEstimate - Test Case Point Analysis
• Effort Estimation Methods using qEstimate
34
IMPORTANCE OF SOFTWARE ESTIMATION
• Software estimation
– the process of determining the size, cost, and time of software projects, often before work is performed
• Estimation is critical to the success or failure of software projects. It provides input for:
– Making investment decisions
– Budget and staff allocation
– Stakeholder/Client negotiation …
35
WHY IS TESTING ESTIMATION IMPORTANT?
• Testing may consume up to 50% of project effort
– ~70% of effort in mission-critical systems
• Current problem
– No estimation for testing
– Estimation is done for the whole project rather than testing
36
POPULAR SOFTWARE ESTIMATION METHODS
• Sizing Methods
– Source Lines of Code (SLOC)
– Function Points Analysis …
• Effort Estimation Methods
– Expert Judgment/Experience
– Productivity Index …
• “Guesstimate” Estimation Method
– Using a test distribution percentage (e.g., testing is 30% of total effort)
37
AGENDA
• Importance of Software Estimation
• qEstimate - Test Case Point Analysis
• Effort Estimation Methods using qEstimate
38
QESTIMATE – TESTING ESTIMATION
• qEstimate - TCPA estimates the size of testing using test cases as input
• Test case complexity is based on 4 elements:
• Checkpoints
• Precondition
• Test Data
• Type of Test
39
qEstimate: http://www.qasymphony.com/media/2012/01/Test-Case-Point-Analysis.pdf
QESTIMATE – TESTING ESTIMATION (CONT.)
Test Cases → Count Checkpoints → Determine Precondition Complexity → Determine Test Data Complexity → Unadjusted TCP → Adjust with Test Type → TCP
40
AGENDA
• Importance of Software Estimation
• qEstimate - Test Case Point Analysis
• Effort Estimation Methods using qEstimate
41
ESTIMATE TESTING EFFORT (CONT.)
Typically, testing effort is distributed into phases as below:
42
PRODUCTIVITY INDEX
• Effort is computed using Productivity Index of similar completed projects
• Productivity Index is measured as TCP per person-hour
PI = Average (TCP/Actual Effort)
Effort (hrs) = TCP/Productivity Index
Simple method
43
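A small sketch of the Productivity Index method with illustrative numbers (none of them come from the deck):

```python
# Productivity Index (PI) from similar completed projects:
# PI = average of (TCP / actual effort in hours), i.e. TCP per person-hour.
completed = [
    {"tcp": 500, "effort_hours": 250},  # contributes 2.0 TCP/hour
    {"tcp": 300, "effort_hours": 200},  # contributes 1.5 TCP/hour
]

pi = sum(p["tcp"] / p["effort_hours"] for p in completed) / len(completed)

# Estimated effort for a new project sized at 420 TCP:
new_tcp = 420
effort_hours = new_tcp / pi

print(round(pi, 2))            # 1.75
print(round(effort_hours, 1))  # 240.0
```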
REGRESSION ANALYSIS
• Estimate effort of new projects using size and effort of completed projects
A and B are calculated from historical data:
y = Ax + B
where y is effort (PM) and x is Adjusted TCP.
[Chart: Effort (PM) vs. Adjusted TCP]
44
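For illustration, an ordinary least-squares fit of y = Ax + B in pure Python; the historical (TCP, effort) pairs are invented:

```python
# Fit y = A*x + B (effort vs. adjusted TCP) by least squares.
# The historical (tcp, effort) pairs below are made up for illustration.
history = [(100, 12.0), (200, 22.0), (400, 41.0), (800, 83.0)]

n = len(history)
sx = sum(x for x, _ in history)
sy = sum(y for _, y in history)
sxx = sum(x * x for x, _ in history)
sxy = sum(x * y for x, y in history)

A = (n * sxy - sx * sy) / (n * sxx - sx * sx)
B = (sy - A * sx) / n

def estimate_effort(tcp):
    """Predicted effort for a new project of the given adjusted TCP."""
    return A * tcp + B

print(round(estimate_effort(600), 1))
```

With real data, the fit quality (e.g. R²) should be checked before trusting the estimate.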
AUTOMATION TESTING & TOOLS
Thao Vo
AGENDA
• Software Test Automation
• Software Performance Testing
• Tool Support for Testing
46
THINKING OF AUTOMATION
Test Automation is…
• The use of software and tools to perform the testing
• Code-Driven – testing at the source-code level with a variety of input arguments
• GUI-Driven – testing at the GUI level via keystrokes, mouse clicks, etc.

Business values of Automation
• Greater Coverage – more time for QA to do manual exploratory/risk-based testing
• Improved Testing Productivity – test suites can be run earlier and nightly
• Reduced Testing Cycle – helps shorten time-to-market
• Doing what manual testing cannot – load testing
• Using Testing Effectively – automation reduces tedious work and improves team morale
• Increased Reusability – tests can be run across different platforms and environments
47
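As a tiny illustration of the code-driven style described above (not from the deck), a unit test written with Python's standard `unittest` framework; `apply_discount` is a hypothetical function under test:

```python
import unittest

def apply_discount(price, percent):
    """Function under test: apply a percentage discount to a price."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class ApplyDiscountTest(unittest.TestCase):
    # Code-driven testing: exercise the function with a variety of inputs.
    def test_typical_discount(self):
        self.assertEqual(apply_discount(200.0, 25), 150.0)

    def test_zero_discount(self):
        self.assertEqual(apply_discount(99.99, 0), 99.99)

    def test_invalid_percent_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)

if __name__ == "__main__":
    unittest.main(exit=False)
```

GUI-driven tools (e.g. Selenium, QTP) follow the same idea but drive the application through its user interface instead of its code.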
THINKING OF RETURN ON INVESTMENT
Costs: tool, implementation, maintenance, training, etc.
Benefits: time saved, early response, reliability, repeatability, etc.

ROI: the most important measurement for test automation
• ROI (effort): planning, development, maintenance, training, etc.
• ROI (cost): tool licenses, environment, management, automation resources, etc.
• ROI (quality): defects found, test coverage, etc.
48
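An illustrative back-of-the-envelope ROI sketch, with made-up person-hour figures; this is one simple way to frame the cost/benefit trade-off, not the deck's own model:

```python
# ROI = (manual cost avoided - automation cost) / automation cost,
# all in person-hours. Every number below is hypothetical.
def automation_roi(manual_hours_per_cycle, cycles, build_hours,
                   maintain_hours_per_cycle):
    manual_cost = manual_hours_per_cycle * cycles
    automation_cost = build_hours + maintain_hours_per_cycle * cycles
    return (manual_cost - automation_cost) / automation_cost

# 40 h of manual regression per cycle, 20 cycles,
# 300 h to build the suite, 5 h/cycle of upkeep:
roi = automation_roi(40, 20, 300, 5)
print(round(roi, 2))  # 1.0
```

An ROI of 1.0 means the automation pays back its own cost once over; more cycles raise it, heavy maintenance lowers it.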
END-TO-END TEST AUTOMATION PROCESS
1. Assessment & Evaluation
2. Pilot & Planning
3. Design & Implementation
4. Execution & Report
5. Maintenance
49
Plan Test
Design Test
Execute Test
Close Test
ASSESSMENT & EVALUATION
• Assessment
– Understand organization vision, priorities, process & methodology
– Understand application & technology
– Identify the test requirements
• Evaluation
– Vendor discussion (optional)
– Tool evaluation
– Recommendations
– Finalize testing tools
50
PILOT & PLANNING
• Pilot
– Do a Proof of Concept
– Define the test process
– Finalize test approach & methodology
– Define entry & exit criteria
• Planning
– Identify test requirements and test cases for automation
– Set up the test environment
– Define the automation framework
– Finalize resources and test schedule
51
DESIGN & IMPLEMENTATION
• Design
– Define standards, guidelines, pre- & post-test procedures
– Design input and output data
– Monitoring tools and report metrics
– Design the automation framework
• Implementation
– Build driver scripts, actions, keywords, data-driven components
– Build scripts
– Validate and run against the application under test
52
EXECUTION & MAINTENANCE
• Execution & Report
– Set up environment
– Run and schedule tests
– Provide detailed and summary reports
– Provide automation handbook & training
• Maintenance
– Implement new change requests
– Define new enhancements
– Keep up to date with new functions of the application under test
53
AUTOMATION TOOLS LANDSCAPE
Commercial tools
• Tools: Quick Test Professional (HP), Functional Tester (IBM), SilkTest (Micro Focus), TestComplete (SmartBear), eggPlant (TestPlant), etc.
• Advantages: easy to use; support multiple technologies
• Disadvantages: costly (> $2K/license); lack of customization or limited integration with other tools

Open-source tools
• Tools: Selenium, Watir, Robotium, Cucumber, JMeter, SoapUI, etc.
• Advantages: free; can be integrated with other open-source tools
• Disadvantages: some tools have limited community support; need customization to suit the product under test
54
AUTOMATION CHALLENGES
• High up-front investment cost
• Demand for skilled resources
• Selection of the best testing tools and approach
• Ineffective collaboration process
• Persuading stakeholders to say “Yes”
55
AND SOLUTIONS
56
The above challenges can be resolved by investing in an effective automation solution:
• Flexible enough to leverage the open-source landscape
• Uses a high-level description language so any tester can use it
• Generates useful reports and metrics
• Automated tests can be run with any tool
A SAMPLE BEST PRACTICE
57
About the Client
Smart-pens revolutionize the act of writing by recording and linking speech to handwriting, fundamentally advancing the way people capture, access and share written and spoken information in the paper and digital worlds.

Business Challenges
• Various testing methods & techniques: web service, performance and API testing
• Multiple iterations; early and frequent need for regression testing with limited resources

KMS’s Solution
• Automation Planning: Define a test strategy & approach for load, performance & API testing using open-source tools
• Automation Design and Development: Design and develop an effective test automation framework

Achievements
• 70% of all testing has been automated using open-source solutions
• Framework and API/web-service testing have significantly reduced cost and increased ROI
• A performance automation solution has been implemented and runs regularly to identify/isolate potential bottlenecks, improving system up-time for client businesses
AGENDA
• Software Test Automation
• Software Performance Testing
• Tool Support for Testing
58
PERFORMANCE TESTING
Determines…
• Speed
• Scalability
• Stability
Focuses on…
• User expectations
• System constraints
• Costs
To answer…
• How many…? How much…? What happens if…?
59
CROWD – SPEED – AVAILABILITY
• How many users before crashing?
• Do we have enough hardware?
• Where are the bottlenecks in the system?
• Is the system fast enough to make customers happy?
• Will it slow down or will it crash?
• Did I purchase enough bandwidth from my ISP?
• How reliable is our system?
• Will our system cope with the unexpected?
• What will happen if our business grows?

• The failure of an application can be costly
• Locate potential problems before our customers do
• Assess performance and functionality under real-world conditions
• Reduce infrastructure cost
60
PERFORMANCE TESTING OVERVIEW
PERFORMANCE TESTING PROCESS
61
Planning → Preparation → Baseline → Execution → Report
• Planning: assessment, plan, strategy, objectives
• Preparation: setup, test data, develop, validate
• Execution: execute, analyze, optimize
• Report: final report
PERFORMANCE TESTING CHALLENGES
Right Test Environment
• How to replicate the production environment as closely as possible
• Misleading data
• Different hardware configurations

Testing Tool Selection
• Commercial tools: HP LoadRunner, IBM Rational Performance Tester, Segue SilkPerformer, RadView WebLoad, NeoLoad, etc.
• Open-source tools: JMeter, OpenSTA, The Grinder, LoadUI, etc.

Non-functional Requirement Exploration
• Ambiguous requirements
• Unclear, unknown requirements
• Best practice: start with a minimum load and consult the developers

Incorrect Implementation
• Ineffective framework
• Selecting the wrong scenarios
• Wrong script implementation

Monitoring & Analysis
• Monitor the performance of each server in distributed testing
• Collect huge output data and analyze bottlenecks, slow spots, etc.
62
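To illustrate the analysis step (not from the deck), a small sketch that reduces raw response-time samples to the percentiles typically used to spot slow spots; the sample values are made up:

```python
# Reduce raw response times (ms) to nearest-rank percentiles, the usual
# way to separate typical latency from the slow tail.
def percentile(samples, pct):
    """Nearest-rank percentile of `samples` for pct in (0, 100]."""
    ordered = sorted(samples)
    rank = max(1, round(pct / 100 * len(ordered)))
    return ordered[rank - 1]

response_ms = [120, 95, 110, 2300, 130, 105, 98, 115, 125, 1800]

print(percentile(response_ms, 50))  # median response time
print(percentile(response_ms, 90))  # tail latency: the slow spots live here
```

A healthy median with a bad 90th/95th percentile is a classic sign of a bottleneck that only bites under load.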
THE FUTURE CHALLENGES OF AUTOMATION
63
AGENDA
• Software Test Automation
• Software Performance Testing
• Tool Support for Testing
64
TESTING TOOLS LANDSCAPE
65
ALM – Application Life-cycle Management
• Purpose: communication across multiple project teams
• Typical tools: Rally, VersionOne, HP ALM

TMS – Test Management System
• Purpose: manages the requirement–test matrix
• Typical tools: HP QC, TestLink, QAComplete, qTest

DTS – Defect Tracking System
• Purpose: manages defects
• Typical tools: Bugzilla, Jira, Mantis

ATT – Automation Testing Tools
• Purpose: regression and specific tests
• Typical tools: QTP, TestComplete, Selenium, Watir, JMeter, LoadRunner
NEW TREND IN TESTING TOOLS
66
Save Time & Less Work
• Auto-sync requirements, test cases & defects
• Import/export and integrate with other systems
• Capture tools integrated into the defect tracking tool

Faster & Easier to Use
• View, mark results, and update test cases and defects without leaving the target test application
• Create defects quickly

Customization & Integration
• Easy to customize new features
• Integrates with many specialized tools

More Control & Visibility
• Control and keep track of changes and assignments
• Track status across lifecycles
• View real-time status, statistical data, and associated trends

Cloud Deployment
• Flexible and low cost
TO BE A GOOD AUTOMATION QA…
Technical Skills:
• Software development
• Testing mindset, testing methodologies and types (both functional and non-functional test)
• Testing and development tools
• Operating systems, networking, database
• Technical writing
Others:
• Domain knowledge
• Soft Skills
67
FUTURE OF SOFTWARE TESTING
Vu Pham
WHERE ARE WE?
• Ho Chi Minh City and Hanoi have continuously been in the top 10 emerging IT outsourcing cities (’07–today) http://www.tholons.com/Top50_article.pdf
69 Confidential
• What is the typical ratio of testers in a VN IT company?
WHERE ARE WE? (CONT.)
70
Ho Chi Minh City is a destination for global testing outsourcing
WHAT ARE OUR OPPORTUNITIES?
Facts:
• The testing outsourcing market value triples every four years
• Many VN outsourcing companies are testing-focused: LogiGear, TMA, Global CyberSoft, KMS …
71
FUTURE OF SOFTWARE TESTING
1. Faster – Higher – Stronger
Faster release
– Need value from every hour spent on testing
Higher quality
– Greater test coverage of specified and implied requirements
Stronger capability
– Not only functionality but also performance, security, usability …
– Ability to develop test solutions
2. Complicated technology/application platforms – Cloud Computing, Mobile, Enterprise Systems …
72
FUTURE OF SOFTWARE TESTING (CONT.)
3. Global testing team – global competition
– Communication, crowd-sourced testing …
4. Automation testing is a must
– More effective solutions are needed
5. Less on processes, more on practices – Agile, exploratory, rapid testing
73
SUMMARY
1. Testing is crucial for today’s business
2. It is becoming a profession of choice
3. Vietnam is a destination for testing outsourcing
4. Automation testing is a must in the future
5. It requires an intellectual, analytical and creative mindset
6. It takes years to become good
7. You can’t become good just by learning from daily work
8. It offers fast-paced career advancement
74
Q & A
APPENDIX
QESTIMATE – TESTING ESTIMATION
• qEstimate - TCPA estimates the size of testing using test cases as input
• Test case complexity is based on 4 elements:
• Checkpoints
• Precondition
• Test Data
• Type of Test
77
QESTIMATE – TESTING ESTIMATION (CONT.)
Test Cases → Count Checkpoints → Determine Precondition Complexity → Determine Test Data Complexity → Unadjusted TCP → Adjust with Test Type → TCP
78
CHECKPOINT
• Checkpoint:
– Is the condition in which the tester verifies whether the result produced by the target function matches the expected criteria
– One test case consists of one or many checkpoints
• Counting rule:
One checkpoint is counted as one Test Case Point
79
PRECONDITION
• Counting rule:
Each complexity level of precondition is assigned a number of Test Case Points
80
TEST DATA
• Counting rule:
Each complexity level of Test Data is assigned a number of Test Case Points
81
ADJUSTED TEST CASE POINT
• The Test Case Points counted up to this point are the Unadjusted Test Case Points (UTCP):

UTCP = Σ (i = 1..n) TCP_i

• UTCP is adjusted by considering the type of each test case – each type is assigned a weight
– Adjusted Test Case Point (ATCP):

ATCP = Σ (i = 1..n) UTCP_i × W_i

• UTCP_i – the number of UTCPs counted for the i-th test case
• W_i – the weight of the i-th test case, taking into account its test type
82
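A minimal sketch of the TCPA computation described above; the point values for precondition/test-data complexity and the per-type weights are illustrative assumptions, not qEstimate's published tables:

```python
# Illustrative TCPA sketch. The complexity point values and test-type
# weights below are assumptions for demonstration, not qEstimate's tables.
PRECONDITION_POINTS = {"none": 0, "low": 1, "medium": 2, "high": 3}
TEST_DATA_POINTS = {"none": 0, "low": 1, "medium": 3, "high": 5}
TYPE_WEIGHT = {"functional": 1.0, "api": 1.2, "performance": 1.5}

def utcp(test_case):
    """Unadjusted TCP: checkpoints (1 TCP each) + precondition + test-data points."""
    return (test_case["checkpoints"]
            + PRECONDITION_POINTS[test_case["precondition"]]
            + TEST_DATA_POINTS[test_case["test_data"]])

def atcp(test_cases):
    """Adjusted TCP: sum of UTCP_i * W_i over all test cases."""
    return sum(utcp(tc) * TYPE_WEIGHT[tc["type"]] for tc in test_cases)

suite = [
    {"checkpoints": 3, "precondition": "low", "test_data": "medium", "type": "functional"},
    {"checkpoints": 5, "precondition": "none", "test_data": "low", "type": "api"},
]
print(round(atcp(suite), 1))  # (3+1+3)*1.0 + (5+0+1)*1.2 = 14.2
```

The ATCP figure then feeds the Productivity Index or Regression Analysis methods below to produce an effort estimate.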
WEIGHT BY TYPE OF TEST
83
WEIGHT BY TYPE OF TEST (CONT.)
84
SUMMARY OF THE PROCESS
Test Cases → Count Checkpoints → Determine Precondition Complexity → Determine Test Data Complexity → Unadjusted TCP → Adjust with Test Type → TCP
85
AGENDA
• Importance of Software Estimation
• qEstimate - Test Case Point Analysis
• Effort Estimation Methods using qEstimate
86
ESTIMATE TESTING EFFORT
• Estimate testing effort using TCP
• Test effort distribution into four phases – Test Planning (TP)
– Test Analysis and Design (TAD)
– Test Execution (TE)
– Defect Tracking and Reporting (DTR)
• Effort estimation methods – Productivity Index (PI)
– Regression Analysis (RA)
Each of these phases may be performed multiple times in a project
87
ESTIMATE TESTING EFFORT (CONT.)
• The challenge is how to estimate the effort needed for each phase of the testing process
• Typically, testing effort is distributed into phases as below:
88
PRODUCTIVITY INDEX
• Effort is computed using Productivity Index of similar completed projects
• Productivity Index is measured as TCP per person-hour
PI = Average (TCP/Actual Effort)
Effort (hrs) = TCP/Productivity Index
Simple method
89
REGRESSION ANALYSIS
• Estimate effort of new projects using size and effort of completed projects
A and B are calculated from historical data:
y = Ax + B
where y is effort (PM) and x is Adjusted TCP.
[Chart: Effort (PM) vs. Adjusted TCP]
90
SUMMARY OF QESTIMATE PROCESS
91
qEstimate: http://www.qasymphony.com/media/2012/01/Test-Case-Point-Analysis.pdf