Test Management for Large, Multi-Project Programs


Transcript of Test Management for Large, Multi-Project Programs

Page 1: Test Management for Large, Multi-Project Programs

TI AM Tutorial

10/14/2014 8:30:00 AM

"Test Management for Large,

Multi-Project Programs"

Presented by:

Geoff Horne

NZTester / OZTester / USTester Magazines

Brought to you by:

340 Corporate Way, Suite 300, Orange Park, FL 32073

888-268-8770 ∙ 904-278-0524 ∙ [email protected] ∙ www.sqe.com

Page 2: Test Management for Large, Multi-Project Programs

Geoff Horne

NZTester Magazine

Geoff Horne has an extensive background in test program/project directorship and management, architecture, and general consulting. In New Zealand, Geoff established and ran ISQA as a testing consultancy which enjoys a local and international clientele in Australia, the United States, and the United Kingdom. He has held senior test management roles across a number of diverse industry sectors and is editor and publisher of the recently launched NZTester magazine. Geoff has authored a variety of white papers on software testing and is a regular speaker at the STAR conferences. Married with four children, he enjoys writing and recording contemporary Christian music.

Page 3: Test Management for Large, Multi-Project Programs


Programme-Level Test Management
Geoff Horne, NZTester Magazine
[email protected]
October 2014

About Me:

• 39 years in IT in various roles including development, sales, consulting, IT management, and testing.
• The last 20 years have been exclusively in test/QA management & consulting.
• Extensive background in programme/project test management, advisory services, governance, architecture, and general consulting.
• Established & ran ISQA as a testing consultancy and practice 2000-2007 (it now runs as a vehicle for my contracting services).
• Founder & publisher of NZTester, OZTester and USTester Magazines, for which I also undertake writing, editing & analysis duties. As this is my first foray into publishing & journalism, I'm on a steep learning curve; however, I am thoroughly enjoying myself.
• Recently taken on my first assignment as a software testing industry analyst with a large American IT technology company, speaking at conferences and delivering white papers and webinars.

Page 4: Test Management for Large, Multi-Project Programs


Challenge | company

• Large distribution company
• Based in Los Angeles with distribution centres in San Francisco and San Diego
• Retail outlets in 26 US regional centres
• Overseas distribution operations in Australia, EMEA, South America and Southeast Asia


Challenge | existing IT systems

• Legacy centralised ERP system operated from Los Angeles
• Los Angeles, San Francisco and San Diego distribution centres online, running Sales, Ordering, Inventory and Warehousing modules
• Retail outlets in US regional centres equipped with online POS/Inventory
• Overseas operations in Australia, EMEA, South America and Southeast Asia running similar implementations of the same or similar systems

Page 5: Test Management for Large, Multi-Project Programs


Challenge | new IT systems

• Smaller de-centralised ERP systems operating in each of the Los Angeles, San Francisco and San Diego distribution centres
• Retail outlets upgraded with web-based POS/Inventory
• New web-based online Ordering system to be developed and implemented
• Overseas operations in Australia, EMEA, South America and Southeast Asia to roll out the same solution once the US distribution and larger regional centres are implemented


Challenge | proposed solution

• JDEdwards web-based ERP systems operating in each of the Los Angeles, San Francisco and San Diego distribution centres, with custom configurations and industry-specific modifications
• Retail outlets upgraded to a web-based POS system provided by a JDEdwards business partner
• New web-based online Ordering system to be developed by a specialist web development company

Page 6: Test Management for Large, Multi-Project Programs


Challenge | activities

• JDEdwards ERP system configuration, modification and implementation
• Retail outlets POS system implementation
• Online Ordering system software development (agile, web-based)
• Configuration and implementation of middleware for integration
• Migration of legacy ERP system data to JDEdwards
• Development of data warehouse, BI and reporting
• Appropriate level of security deployed across all systems
• Optimising all systems for peak performance
• Ongoing rollout of further modifications as required


Challenge | you are the test architect!

• JDEdwards ERP system configuration, modification and implementation
• Retail outlets POS system implementation
• Online Ordering system software development (agile, web-based)
• Configuration and implementation of middleware for integration
• Migration of legacy ERP system data to JDEdwards
• Development of data warehouse, BI and reporting
• Appropriate level of security deployed across all systems
• Optimising all systems for peak performance
• Ongoing rollout of further modifications as required

Page 7: Test Management for Large, Multi-Project Programs


Challenge | test approaches

• V-Model

• Scripted Testing

• Risk-Based Testing

• Exploratory Testing

• Test Automation

Key Software Testing Strategies

White Box Testing:
• Based on the program code
• Explores internal structure of code
• Verifies the integrity of the code
• Performed by developers

Black Box Testing:
• Based on specified requirements
• Explores software functions & processes
• Ignores internal code construction
• Performed by testers

Page 8: Test Management for Large, Multi-Project Programs


Key Software Testing Strategies

Gray Box Testing:
• Based on functional understanding
• Explores specific software functions
• Verifies software components
• Performed by more technical testers
• Close collaboration between testers & developers
• Lends itself to test automation
• Better suited to an investigative/exploratory testing approach
• Combines benefits of white & black box testing wherever possible
• Not a complete substitute for either, however

The Testing V-Model

[Diagram: the V-Model pairs each development phase with a test phase: User/Business Requirements with Acceptance Testing, Software Specification with System Testing, Software Architecture with Integration Testing, and Technical Specification with Unit Testing, with Code at the base of the V.]

Page 9: Test Management for Large, Multi-Project Programs


Approaches | v-model process components

• Unit Testing: testing a single program or subsidiary component of a program for compliance to program/component specifications when executed in isolation
• Integration Testing: testing of pre-tested programs/components, integrated together to create sub-systems
• System Testing: testing of the entire system for compliance to the software's functional specification
• Acceptance Testing: testing of the system for compliance to the business requirements specification

The Development/Testing phases answer: did the product get built right?
The Business (acceptance) phase answers: did the right product get built?


Approaches | scripted testing

Test ID: STK003b
Description: Move stock where neither location is a Hold location

Step 1: Select Stock Control Menu. Expected: menu displayed. Result: Pass.
Step 2: Select Stock Movement option. Expected: Stock Movement screen displayed. Result: Pass.
Step 3: Select stock item NGS002. Expected: NGS002 details displayed, check details are correct. Result: Pass.
Step 4: Select stock transfer. Expected: drop down displayed. Result: Pass.
Step 5: Enter From location Loc010. Expected: location details displayed including Hold denotation. Actual: Hold denotation not present. Result: Fail, Defect No. 234.
Step 6: Enter To location Loc023. Expected: location details displayed including Hold denotation. Actual: Hold denotation present but incorrect. Result: Fail, Defect No. 235.
Step 7: Click on OK. Expected: stock should be moved.
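As an aside, a scripted test like STK003b can also be held as structured data so that pass counts and defect links fall out automatically. A minimal Python sketch, where the Step dataclass and its field names are my own illustration rather than any particular test management tool:

```python
# Minimal sketch: the scripted test above held as data (dataclass and field
# names are illustrative only, not from any particular test management tool).
from dataclasses import dataclass
from typing import Optional

@dataclass
class Step:
    number: int
    action: str
    expected: str
    actual: Optional[str] = None    # recorded only when it differs from expected
    passed: Optional[bool] = None   # None = not yet executed
    defect_no: Optional[int] = None

stk003b = [
    Step(1, "Select Stock Control Menu", "Menu displayed", passed=True),
    Step(2, "Select Stock Movement option", "Stock Movement screen displayed", passed=True),
    Step(3, "Select stock item NGS002", "NGS002 details displayed, details correct", passed=True),
    Step(4, "Select stock transfer", "Drop down displayed", passed=True),
    Step(5, "Enter From location Loc010",
         "Location details displayed including Hold denotation",
         actual="Hold denotation not present", passed=False, defect_no=234),
    Step(6, "Enter To location Loc023",
         "Location details displayed including Hold denotation",
         actual="Hold denotation present but incorrect", passed=False, defect_no=235),
    Step(7, "Click on OK", "Stock should be moved"),
]

executed = [s for s in stk003b if s.passed is not None]
failed = [s for s in executed if not s.passed]
print(f"{sum(s.passed for s in executed)}/{len(executed)} executed steps passed; "
      f"defects raised: {[s.defect_no for s in failed]}")
```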

Page 10: Test Management for Large, Multi-Project Programs


Approaches | risk-based testing

[Diagram: balancing resources, deadlines and cost ($) against test coverage & defects.]


Approaches | risk-based testing

[Diagram: risk-based testing balances time & cost against risk.]

Page 11: Test Management for Large, Multi-Project Programs


Approaches | risk-based testing

Where to draw the line of "good enough"?

[Graph: risk plotted against time & cost.]


Approaches | exploratory testing

Page 12: Test Management for Large, Multi-Project Programs


Approaches | Scripted vs Exploratory testing

[Diagram comparing the two feedback loops:]
• Validatory (scripted): compare "What do we expect it to do?" with "What did it do?"; if they are the same, move on to the next test script.
• Investigatory (exploratory): observe "What did it do?" and ask "Is this what we want it to do?" (yes/no); the next test depends on the current test.


Approaches | test automation

1. The user records an activity in an application; the tool captures the keystrokes and develops a programmatic script as it goes.
2. Information about the state of the application is captured by inserting test "cases" during recording, which are stored either in the script or in separate files. The suite of automated scripts becomes the baseline.
3. The automated scripts are played back against a new version of the application, and the tool reports on the differences between what was played back and what was originally captured.
4. The user then analyses each difference to determine whether it is an expected difference (e.g. a legitimate application change) or unexpected (e.g. a defect). The baseline can be updated with legitimate changes and reports logged for defects.
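Conceptually, steps 3 and 4 reduce to diffing a played-back run against the recorded baseline. A minimal sketch of that comparison, assuming the captured state is just a set of named checkpoint values; real record/playback tools store far richer state such as screenshots and object properties:

```python
# Conceptual sketch of the baseline comparison in steps 3 and 4 above.
# Captured state is modelled as checkpoint name -> observed value.
import json

def compare_to_baseline(baseline: dict, playback: dict) -> dict:
    """Return the checkpoints whose played-back value differs from the baseline."""
    diffs = {}
    for checkpoint, expected in baseline.items():
        actual = playback.get(checkpoint, "<missing>")
        if actual != expected:
            diffs[checkpoint] = {"expected": expected, "actual": actual}
    return diffs

baseline = {"login_title": "Welcome", "order_total": "120.00"}   # captured on version 1
playback = {"login_title": "Welcome", "order_total": "125.00"}   # replayed on version 2

print(json.dumps(compare_to_baseline(baseline, playback), indent=2))
# Each difference is then triaged: legitimate changes update the baseline,
# unexpected ones are logged as defects.
```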

Page 13: Test Management for Large, Multi-Project Programs


Approaches | test automation

Testing types:

• Regression

• Load

• Performance

• Volume

• Stress

• Installation

• Configuration


Approaches | testing within agile

Traditional Waterfall Methodology:
Requirements → Design → Build → Test → Implementation

Page 14: Test Management for Large, Multi-Project Programs


Approaches | testing within agile

Agile Methodology
[Diagram: user stories feed a series of Design → Build → Test iterations, followed by Implementation.]


Approaches | testing within agile

Agile Methodology
[Diagram: user stories feed time-boxed, mini-waterfall-like iterations (sprints), each running Design → Build → Test, followed by Implementation.]

Page 15: Test Management for Large, Multi-Project Programs


Page 16: Test Management for Large, Multi-Project Programs


Challenge | you are the test architect!

• Core - JDEdwards ERP system configuration etc.
• Retail - Retail outlets POS system implementation
• Online Ordering - Online Ordering system software development (agile, web-based)
• Integration - Middleware for integration
• Data Migration - Migration of legacy ERP system data
• Reporting - Data warehouse, BI and reporting
• Security - Appropriate level of security deployed
• Performance - Optimising all systems for peak performance
• Mods - Ongoing rollout of further modifications as any gaps in requirements are identified


Challenge | simple schematic

[Schematic: the Core JDEdwards system and Middleware linking the Online Ordering system, the Data Warehouse, and multiple Retail sites.]

Page 17: Test Management for Large, Multi-Project Programs


Challenge | existing resources

• Single test environment for ERP and Retail testing before release into Production
• Maintenance releases for all applications provided on a quarterly basis by the respective vendors, with one major version release per annum
• Internal IT team includes a small team of permanent test analysts who know the existing applications well and perform mostly ad hoc testing on new releases
• International sites receive tested releases and check for localisations before release into Production
• No automation of any sort


Challenge | test scope

• Testing has to encompass the entire project cycle, from beginning to end of the new system implementation, including additional modifications
• Running under project conditions
• Six months duration expected
• The test solution will determine the level of resourcing
• Budget is available for additional test resources
• There is budget for test tools, albeit restricted
• There are high-level business requirements available; however, the detail level will be worked out in regular design workshops
• There is an expectation that business SMEs and users will be available to assist where needed

Page 18: Test Management for Large, Multi-Project Programs


Challenge | test solution design

• Which test approaches would you apply against which stream of activities, and why?
• What would be the test phases that make up the test project?
• What tools would you recommend?
• How would you resource testing?
• How would you structure the test team?
• Which of the other project/IT teams would you be interacting with the most, and why?
• What sort of reporting would you put in place?


Challenge | test solution design

• What testing processes would you put in place to support testing?
• What and how many test environments will you need?
• How will you obtain test data and databases?
• Where will you deploy the business SMEs and users?

Page 19: Test Management for Large, Multi-Project Programs


Sticky Testing Questions:

• How much testing have we done?

• Is testing progressing as it should be?

• What has to be done to finish testing on time?

• How much testing is able to be progressed?

• How fast is testing going?

• How fast does testing need to go to finish on time?


Sticky Testing Questions (cont):

• Are defects being closed off fast enough to finish on time?

• Are we over the hump yet?

• Are we getting better with defects?

• How fast are we finding defects?

• How fast are we closing off defects?

• How fast do we need to close off defects to finish on time?


Page 20: Test Management for Large, Multi-Project Programs


Asked By:

• Sponsors

• Steering Committees

• Business Owners

• Programme Managers

• Project Managers

• Peers & their teams

• Test Teams

• Business Partners

• CUSTOMERS!!


Sticky Testing Questions:

• Easy to provide rough, finger-in-the-wind answers

• Not so easy to provide answers based on real information


Page 21: Test Management for Large, Multi-Project Programs


So What Information Do We Need To Answer These?


Test Execution:
• The no. of test cases already passed
• The total no. of test cases to be executed in the cycle
• The length of time the cycle has been active

So how do we use these?


Page 22: Test Management for Large, Multi-Project Programs


Test Execution:
• The no. of test cases already passed
• The total no. of test cases to be executed in the cycle
• The length of time the cycle has been active

Example:
• Say 66 test cases have passed
• The cycle has been active for 8 working days
• = a pass rate of ~8.25 test cases per day


Test Execution:
• The no. of test cases already passed
• The total no. of test cases to be executed in the cycle
• The length of time the cycle has been active

So:
• Say there are 178 test cases to execute in total
• = 112 still to be passed (178 - 66)
• @ ~8.25 test cases passing per day up to that point
• Remaining 112 = a further ~13.5 working days to complete
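The same arithmetic as a small Python sketch, using the example's figures (the variable names are mine):

```python
# Pass rate and projected days to complete, using the example's figures.
passed = 66        # test cases already passed
total = 178        # total test cases in the cycle
days_active = 8    # working days the cycle has been running

pass_rate = passed / days_active          # ~8.25 test cases passed per day
remaining = total - passed                # 112 still to pass
days_to_complete = remaining / pass_rate  # ~13.5 working days at the current rate

print(f"pass rate {pass_rate:.2f}/day, {remaining} remaining, "
      f"~{days_to_complete:.2f} working days to complete")
```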


Page 23: Test Management for Large, Multi-Project Programs


Defects:

• The no. of defects found

• The no. of defects closed off

• The length of time the cycle has been active


Defects:
• The no. of defects found
• The no. of defects closed off
• The length of time the cycle has been active

Example:
• Say 9 defects have been found
• The cycle has been active for 8 working days
• = a log rate of ~1.1 defects per day
• Say 5 defects have been fixed, retested & closed
• = a close rate of ~0.6 defects per day
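And the defect rates, as a matching sketch:

```python
# Defect log and close rates from the example's figures.
defects_found = 9
defects_closed = 5
days_active = 8

log_rate = defects_found / days_active     # ~1.1 defects logged per day
close_rate = defects_closed / days_active  # ~0.6 defects closed per day

print(f"log rate {log_rate:.2f}/day, close rate {close_rate:.2f}/day")
```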


Page 24: Test Management for Large, Multi-Project Programs


We now have 4 key pieces of information:
• Test case pass rate of 8.25/day
• Defect log rate of 1.1/day
• Defect close rate of 0.6/day
• Estimated completion is ~13.5 days away (if we started on 20 Jul then we'll finish ~21 Aug)

Which can be plotted over time to see trends.
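A sketch of such a trend plot. The daily counts below are hypothetical, chosen only so the totals match the example (66 test cases passed, 9 defects logged, 5 defects closed over 8 days):

```python
# Hypothetical daily counts for illustration only; totals match the example.
import matplotlib.pyplot as plt

days = range(1, 9)
passed_per_day = [4, 5, 6, 8, 9, 10, 11, 13]   # sums to 66
logged_per_day = [3, 2, 2, 1, 1, 0, 0, 0]      # sums to 9
closed_per_day = [0, 0, 1, 1, 1, 1, 0, 1]      # sums to 5

plt.plot(days, passed_per_day, marker="o", label="test cases passed")
plt.plot(days, logged_per_day, marker="s", label="defects logged")
plt.plot(days, closed_per_day, marker="^", label="defects closed")
plt.xlabel("working day of test cycle")
plt.ylabel("count per day")
plt.title("Daily test execution and defect trends")
plt.legend()
plt.show()
```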


Plot the trends over time


Page 25: Test Management for Large, Multi-Project Programs


Page 26: Test Management for Large, Multi-Project Programs


We now have 4 key pieces of information:

• Test case pass rate of 8.25/day

• Defect log rate of 1.1/day

• Defect close rate of 0.6/day

• Expected completion is ~13.5 days away

However:


Page 27: Test Management for Large, Multi-Project Programs


What if we only have another 7 days of testing available before the deadline?

In theory, we'll only get to pass another ~58 test cases = 124 total passed, out of 178 = ~70%. Not good.
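The projection behind that ~70% figure, as a sketch:

```python
# Projection if only 7 working days of testing remain before the deadline,
# at the current pass rate of ~8.25 test cases per day.
pass_rate = 66 / 8    # ~8.25/day from the earlier example
days_left = 7
total = 178

projected_extra = pass_rate * days_left   # ~58 more test cases passed
projected_total = 66 + projected_extra    # ~124 of 178
coverage = projected_total / total        # ~70%: not good

print(f"projected {projected_total:.0f} of {total} passed ({coverage:.0%})")
```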


And typically…
• More defects are found at the start of testing
• Fewer test cases are passed at the start of testing
• More defects are closed towards the end of testing
• No guarantee conditions will stay the same:
  • Scope changes
  • Lose test team members
  • Leave requirements, etc.
• Test cases are variable in length, e.g. 66 short & quick vs 112 long & slow
• Snapshot in time only


Page 28: Test Management for Large, Multi-Project Programs


What’s the solution?


Need to improve productivity & get the rates up

• More testers?

• More developers to fix defects?

• Defer lower priority test cases?

• Fix fewer defects?

• Work longer hours/weekend?

• Introduce testing shifts if viable?

• Lobby to extend the deadline?

• Whatever your project management arsenal of skills & talents can muster!

And….


Page 29: Test Management for Large, Multi-Project Programs


And need to work out what the rates need to be to meet the deadline on a DAILY basis!

We now have 8 key pieces of information:
• Test case pass rate of 8.25/day, need 16/day (112 test cases / 7 days)
• Defect log rate of 1.1/day, will log another ~8 defects (1.1 x 7 days)
• Defect close rate of 0.6/day:
  • 5 already closed
  • close another ~4 defects (0.6 x 7 days)
  • need to close the additional 8 = 12 total to close
  • = required close rate of ~2/day
• Estimated completion is ~13.5 days away, needs to be done in 7 days

Which again can be plotted over time to see trends.
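The required-rate arithmetic, as a sketch using the same figures:

```python
# Required daily rates to hit the 7-day deadline, from the example's figures.
days_left = 7

tests_remaining = 178 - 66                         # 112 test cases still to pass
required_pass_rate = tests_remaining / days_left   # ~16 per day

log_rate = 9 / 8                                   # ~1.1 defects logged per day so far
expected_new_defects = log_rate * days_left        # ~8 more defects expected
open_defects = 9 - 5                               # 4 currently open
to_close = open_defects + expected_new_defects     # ~12 defects to close in total
required_close_rate = to_close / days_left         # ~2 per day

print(f"need {required_pass_rate:.0f} passes/day and "
      f"{required_close_rate:.1f} defect closures/day to hit the deadline")
```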


Page 30: Test Management for Large, Multi-Project Programs


Manage & track daily….


Page 31: Test Management for Large, Multi-Project Programs


Page 32: Test Management for Large, Multi-Project Programs


Not…..


Rather…..


Page 33: Test Management for Large, Multi-Project Programs


Key…..


Page 34: Test Management for Large, Multi-Project Programs


Test Solution Design
Geoff Horne, NZTester Magazine
[email protected]
August 2014