
Tracking the Technical Critical Path: Systems

Engineering Meets Project Management

James R. Armstrong

Stevens Institute of Technology

Castle Point on Hudson

Hoboken, NJ 07030 Copyright © 2009 by James Armstrong. Published and used by APCOSE with permission.

Abstract. The critical path in the project schedule, the path with the least slack, has been a

mainstay of project management for many years. Along with earned value techniques, the project

manager uses this method to track schedule and cost progress. The term Technical Critical Path

appears sporadically as the sequence of activities needed to develop and incorporate critical

technologies that will make the program successful. However, the connection between these

concepts is not clearly defined. At the same time, much has been written about Technical

Performance Measures (TPMs) but it has been confined to the technical management side of

systems engineering. This paper will address how the two worlds of systems engineering and

project management should be integrated to combine the technical, cost, and schedule

performance. Also addressed are the correct implementation of TPMs and several problems

encountered in current usage.

Project Management Focus

The Iron Triangle. The three attributes of cost, schedule, and technical performance have long

been the basis of project management. However, the emphasis in project management has been on

methods to track cost and schedule. As shown in figure 1, the WBS, schedules, and earned value

are structured to address these management concerns.

Figure 1. The Iron Triangle with Management Emphasis

[Figure 1: a WBS outline; an earned value chart plotting budget baseline, management reserve, EV, actuals, EAC, CV, and SV against time; and an event schedule with AWD, PDR, CDR, TRR, and FCA/PCA milestones spanning HW design, HW fab, HW test, SW design, SW test/int, HW/SW integration, and system test; mapped to the schedule, cost, and technical corners of the triangle.]


In theory, the WBS tasks should have technical objectives. In practice, they are often not well defined. In some cases, the specification requirements are referenced, but may not actually be what is expected at the point in the program where an early design is analyzed or a prototype is tested.

The use of TPMs can provide a better linkage as will be described in the following paragraphs.

Critical Path

Basic. The path in a network schedule with the least (or most negative) schedule slack is the

critical path. This concept has long been used for managing the highest risk schedule events on a

project. A project may have more than one equally critical path or management may decide to

track more than just the most critical path. Tracking the three most critical paths addresses the risk

that a problem may appear on a path that is close to critical and overtake the original path without

warning. Figure 2 is from a project where the critical path analysis identified multiple parallel

critical paths leading to fixed events. After identification of this situation, action was taken to

address this significant project risk. In this instance, multiple paths had to be monitored

throughout the program.

Figure 2. Critical Path Analysis Example
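A minimal sketch, in Python with hypothetical task names and durations, of the forward and backward pass that produces the slack values behind such an analysis; tasks with zero (or the least) slack form the critical path, and the next-lowest slack values identify the near-critical paths worth tracking as well:

```python
# Critical path method (CPM) sketch on a hypothetical activity network.
# Forward pass gives earliest start/finish; backward pass gives latest
# start/finish; total slack = latest start - earliest start.

tasks = {  # task: (duration in days, list of predecessor tasks)
    "HW Design":   (30, []),
    "HW Fab":      (40, ["HW Design"]),
    "HW Test":     (20, ["HW Fab"]),
    "SW Design":   (35, []),
    "SW Test/Int": (45, ["SW Design"]),
    "HW/SW Integ": (25, ["HW Test", "SW Test/Int"]),
    "System Test": (30, ["HW/SW Integ"]),
}

def total_slack(tasks):
    order, seen = [], set()
    def visit(t):                       # simple topological sort (assumes no cycles)
        if t not in seen:
            seen.add(t)
            for p in tasks[t][1]:
                visit(p)
            order.append(t)
    for t in tasks:
        visit(t)

    es, ef = {}, {}
    for t in order:                     # forward pass
        es[t] = max((ef[p] for p in tasks[t][1]), default=0)
        ef[t] = es[t] + tasks[t][0]

    end = max(ef.values())
    ls, lf = {}, {}
    for t in reversed(order):           # backward pass
        succs = [s for s in tasks if t in tasks[s][1]]
        lf[t] = min((ls[s] for s in succs), default=end)
        ls[t] = lf[t] - tasks[t][0]

    return {t: ls[t] - es[t] for t in order}

for task, slack in sorted(total_slack(tasks).items(), key=lambda kv: kv[1]):
    print(f"{task:12s} slack = {slack:3d}")   # the zero-slack chain is the critical path
```

Tracking the two or three lowest-slack chains, as suggested above, then follows directly from the same slack table.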

Technical Critical Path

Limited References. The term was introduced to the author by Dr. Yeo of Nanyang Technological University during discussions about TPMs on February 19, 2009. It can be found in

various references where it addresses the series of technology developments that are necessary for

a development to be successful. One example is a briefing on planning for advancement in solar

energy by the U.S. Department of Energy (2006), Figure 3.

[Figure 2: an integrated program schedule, Q2 1998 through Q3 2002, covering acquisition, design and production, test and evaluation, implementation, human factors, software deliveries, airline, avionics, flight deck integration, FAA flight standards, and FAA certification task groups; several paths at or near critical (Host/DLAP, RRI/ASE, 767, Host Cutoff, Miami, Petal II) converge on key and fixed events.]


Figure 3. Technical Critical Path Reference

The term can also be found in a Defense Acquisition University briefing on Systems Engineering

Plans. Measuring this overall technology development would appear to be on the scale of

Technology Readiness Levels.

Technology Readiness Levels. NASA identified a nine-level scale to address the maturity of

technologies for application to programs. As can be seen in figure 4 (Mankins, 1995), the levels

can be broadly interpreted. The differences between levels are primarily driven by factors other

than specified performance. For instance, a demonstration can be of a component or a subsystem.

It can be performed in a laboratory environment or the appropriate space or ground environment.

This is appropriate for addressing general progress in a longer term program over several phases.

However, there is an opportunity to tie the levels of technical progress in a specific program to a

more detailed definition of the progress.

Figure 4. Technology Readiness Levels

TRL 9: Actual system “flight proven” through successful mission operations
TRL 8: Actual system completed and “flight qualified” through test and demonstration (ground or flight)
TRL 7: System prototype demonstration in a space environment
TRL 6: System/subsystem model or prototype demonstration in a relevant environment (ground or space)
TRL 5: Component and/or breadboard validation in a relevant environment
TRL 4: Component and/or breadboard validation in a laboratory environment
TRL 3: Analytical and experimental critical function and/or characteristic proof-of-concept
TRL 2: Technology concept and/or application formulated
TRL 1: Basic principles observed and reported
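The nine levels in figure 4 can be captured as a simple lookup table; the sketch below, with invented technology names, merely records an assessed maturity level against that scale:

```python
# NASA Technology Readiness Levels (Mankins 1995), 9 (highest) to 1 (lowest).
TRL = {
    9: "Actual system 'flight proven' through successful mission operations",
    8: "Actual system completed and 'flight qualified' through test and demonstration",
    7: "System prototype demonstration in a space environment",
    6: "System/subsystem model or prototype demonstration in a relevant environment",
    5: "Component and/or breadboard validation in a relevant environment",
    4: "Component and/or breadboard validation in a laboratory environment",
    3: "Analytical and experimental critical function and/or characteristic proof-of-concept",
    2: "Technology concept and/or application formulated",
    1: "Basic principles observed and reported",
}

# Hypothetical critical technologies and their assessed maturity.
assessed = {"thin-film concentrator cell": 4, "high-efficiency inverter": 6}
for tech, level in assessed.items():
    print(f"{tech}: TRL {level} - {TRL[level]}")
```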


Technical Performance Measures

Need for technical input. The primary emphasis of management systems is on cost and schedule.

The Earned Value Measurement System (EVMS) is intended to connect the tracking of dollars

with the actual progress being made. While the descriptions of tasks in a Work Breakdown

Structure (WBS) call for a definition of technical objectives, specific guidance on how to establish

these objectives is not readily available. This leaves the integration of the EV management

techniques and the technical effort to the reader. In practice, the WBS tasks often refer to the

accomplishment of a task or completion of a product without a qualitative description of what the

task or product is intended to achieve. As a result, the EVMS and technical progress tracking are

generally addressed in a disconnected fashion. Properly constructed and implemented TPMs can

provide the technical progress information to address the technical critical path and help the

integration of the technical and management disciplines. However, several common mistakes in

TPM use must be avoided to allow this integration to take effect.
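For reference, the quantities that the EVMS reports reduce to a few lines of arithmetic; the status figures below are purely hypothetical, and the estimate at completion shown is only one of several common formulas:

```python
# Basic earned value arithmetic on hypothetical status-date figures ($K).
BCWS = 1200.0   # budgeted cost of work scheduled (planned value)
BCWP = 1050.0   # budgeted cost of work performed (earned value)
ACWP = 1180.0   # actual cost of work performed
BAC  = 5000.0   # budget at completion

CV  = BCWP - ACWP          # cost variance (negative = overrun)
SV  = BCWP - BCWS          # schedule variance (negative = behind)
CPI = BCWP / ACWP          # cost performance index
SPI = BCWP / BCWS          # schedule performance index
EAC = BAC / CPI            # one common estimate at completion (assumes current CPI holds)

print(f"CV={CV:.0f}  SV={SV:.0f}  CPI={CPI:.2f}  SPI={SPI:.2f}  EAC={EAC:.0f}")
```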

Why more on TPMs. Descriptions of TPMs and

how to use them abound. For instance, the

INCOSE Systems Engineering Handbook

(INCOSE, 2006) has a good description of what

they are and an example, figure 5. They have been

used in programs effectively for decades.

Therefore, it was with surprise that they were

observed in systems engineering training classes as

the one concept that was most frequently

problematic in exercises. Later, as students used

their internal process assets in class to develop

TPMs, it was discovered that the processes did not

address the key features of TPMs in their directions

or templates. In another instance, the same was

found to be the case in an appraisal.

What to measure. The first concern is to determine the parameters that should be measured. All

measurement decisions of this kind should begin with determining the question to be answered by

the data. In the case of TPMs, the status of the progress on the most critical design parameters is

what the data should reveal. The determination of which parameters are the most critical is

accomplished through a combination of requirements decomposition and risk management.

Requirements decomposition. There are many terms for the highest level of requirements for a

system. Measure of Effectiveness (MOE) will be used here but other terms with similar meanings

include Key Performance Parameter, Most Important Requirement, and Key System Attributes.

For instance, you might be concerned about cost of ownership for a vehicle. Gas mileage is one

contributor. However, if that is achieved through a very complex control system that is expensive

to build and maintain, the total cost, the MOE, may not be achieved. If, on the other hand, the mileage is

determined to be the critical component of the MOE and at risk, it would be a very likely

candidate. Further, we would look for a detail parameter in engine design that is both the design

challenge and a driving factor in achieving the desired performance. This trace of MOE to TPM is

shown in figure 6 as described in a PSM and INCOSE report (PSM/INCOSE 2005).

Figure 5. SE Handbook TPM Example
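A sketch of how the vehicle example above can be recorded as an explicit trace from MOE through contributing measures to candidate TPMs; the parameters and the risk list are invented for illustration:

```python
# Hypothetical trace from an MOE down to candidate TPMs (vehicle example).
moe = {
    "name": "Cost of ownership",
    "contributors": [                       # measures of performance feeding the MOE
        {"mop": "Fuel mileage (mpg)",
         "tpms": ["Engine thermal efficiency", "Vehicle curb weight"]},
        {"mop": "Maintenance cost ($/yr)",
         "tpms": ["Control system parts count"]},
    ],
}

# Only parameters that are both MOE drivers and on the technical risk list
# are selected as TPMs to track.
risk_list = {"Engine thermal efficiency", "Control system parts count"}

selected = [t for c in moe["contributors"] for t in c["tpms"] if t in risk_list]
print("TPMs to track:", selected)
```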


Figure 6. Relationship between MOEs and TPMs

Risk. Of course, the parameter in question

should also show up on the list of technical

risks as one of the higher risks in combined

impact and likelihood of occurring, as shown in

figure 7. TPM tracking then becomes one of

the mitigations as a monitoring technique. It

also provides the trigger for future decisions to

add or cancel specific mitigation activities.

Should the progress not be satisfactory,

additional mitigations may be triggered or

decisions for alternate designs made. Should

progress exceed plans, efforts on back-up

designs might be canceled.

Plan for changes. The next step is to define the

projected change in the monitored parameter

over the life of the development, figure 8. As

we have been discussing it, this would likely be an improvement to meet the desired value.

However, there may be a positive margin at the start of design, e.g., the product looks to be well

under a weight limit, and we want to make sure that the margin does not disappear in the first few

weeks of serious design. In figure 5, that case is represented by the term “limit” rather than

“target” for a performance objective we are trying to achieve. In the case where the product starts

out overweight, the initial prediction would be above the target and the prediction curve would

show the plan for reduction.

[Figure 6: mission needs or operating issues related to Measures of Effectiveness (MOEs), Measures of Performance (MOPs), Key Performance Parameters (KPPs), and Technical Performance Measures (TPMs), with increasing scope of the technical solution in one direction and increasing technical resolution and periodic insight (progress and risk) in the other; the technical measures are interdependent. Figure 7: a probability-versus-severity risk matrix with Risk #1 related to the selected TPM.]

Figure 7. TPMs and Risk


Figure 8. Prediction of Change

The shape used in this example is a simple exponential form where more rapid progress is

expected early in the development with progress becoming more difficult as the program proceeds.

This is a notional example. Planned progress for a specific program may look considerably

different. For instance, figure 5 shows a TPM where a design margin is monitored and the plan is

to minimize degradation early and expect some larger changes later in the program. Some

examples to avoid are presented later in this paper.
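A notional plan curve of the kind shown in figure 8 can be written as an exponential approach from the initial estimate to the target, with most of the planned improvement occurring early; the values and rate constant below are illustrative only:

```python
import math

# Notional TPM plan curve: exponential approach from initial estimate to target.
initial, target = 62.0, 50.0        # e.g., hypothetical weight in kg
months, k = 24, 0.15                # program length and planned improvement rate

def planned(month):
    """Planned TPM value at a given month (fast early progress, slower later)."""
    return target + (initial - target) * math.exp(-k * month)

for m in (0, 6, 12, 18, 24):
    print(f"month {m:2d}: planned value = {planned(m):.1f}")
```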

Adding ties to events. The next step is the one most commonly omitted, and it determines whether the technical and management processes end up integrated or separated. In this step, progress on the

plan curve is explicitly tied to specific events in the WBS and schedule, particularly the events on

the Technical Critical Path, as shown in figure 9. One description of tasks says to “identify

physical products, milestones, technical performance goals, or other indicators that will be used to

measure progress.” (NASA) In practice, this means referring an early design task or prototype

effort to the final target specification. The results are misleading if the expectation is that the final

number will not be met at that time or that a positive margin will be mostly retained. It is more

realistic to tie the technical performance goal for that early event to the predictions in the TPM

plan. If the top highlighted task in the WBS in figure 9 is a prototype development to address the

technical risk relating to the TPM parameter, the technical goal for that task should be the same as

the value on the TPM plan curve at that time in the program. In this case, it appears to be about

half way between the initial estimate and the final target.
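One way to make the tie explicit is to record, for each event on the technical critical path, the planned value read off the plan curve at that point in the schedule together with the agreed measurement method; the events, dates, and values below are hypothetical:

```python
# Hypothetical mapping of WBS/schedule events to planned TPM values and the
# agreed measurement method at each point (analysis, prototype, test, ...).
plan_points = [
    # (event,                   month, planned value, measurement method)
    ("Initial design analysis",     3,  59.0, "analysis"),
    ("Prototype build/test",       10,  55.0, "prototype measurement"),
    ("CDR",                        16,  52.5, "prototype + analysis"),
    ("Qualification test",         22,  50.5, "test"),
]

def goal_for(event):
    """Technical goal for an event is the plan-curve value, not the final spec."""
    for name, month, value, method in plan_points:
        if name == event:
            return value, method
    raise KeyError(event)

value, method = goal_for("Prototype build/test")
print(f"Prototype goal: {value} (verified by {method})")
```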

Figure 9. Integration of TPM and WBS

[Figure 8: predicted change in the TPM over time, from the initial estimate toward the target or limit (reserve margin not shown), annotated “How will we close (or maintain) this gap?” Figure 9: the predicted progress curve aligned with the project schedule and WBS, with results estimated at each task completion and mapped back to the graph.]


Measurement method. In conjunction with tying to the WBS, the measurement method for each critical point in the schedule should be defined, figure 10. The method should be integrated into the WBS. It may be

planned that the estimate of weight at the completion of initial design drawings will be by analysis

rather than actual measurement of a prototype. If so, that point needs to be clearly defined at the

outset, not debated at the completion of the design task.

Figure 10. Identifying Measurement Methods

Additional integrations. In some cases, particularly later in the development, the TPM plan

should be tied to the Test and Evaluation Master Plan and other verification plans. A mismatch

between the pass/fail criteria and expected progress at the time of test will certainly be

problematic. Another relationship to maintain is with the entrance and exit criteria for design

reviews. If the plan was to show 90% performance on a challenging parameter at CDR as

demonstrated by a prototype, that is what should be expected at the review. The customer, internal

or external, should not be complaining that it isn’t 95%. Likewise, if the results are only supported

by analysis because the prototype is not completed, the designer should expect a strong response.

In either case, neither side should be caught off guard and have to argue adequacy of method or

results at this point since the plan was agreed to at the start.

Thresholds. There are several approaches to Thresholds. Figure 11 depicts an upper and lower

bound. This is the range in which results are acceptable without further action. If performance

deviates beyond these limits, management action is to be taken. If the performance deviates in the

negative direction, additional resources may be applied or additional mitigation actions may be

triggered. As long as the performance is within the bounds, internal and external customers and

management should not consider it a problem. In another approach, double boundaries are applied

with green, yellow, and red bands around the target. In this method, green is not a problem, yellow

is handled within the program, and red is addressed by higher management outside the program.

[Figure 10: the plan curve from the initial estimate to the target or limit over time, with defined update points (analysis, prototype, test, etc.) that should connect to the TEMP, other verification plans, and design reviews.]


Figure 11. Addition of Thresholds

Some organizations do not use a “good” side limit and some use the plan as the boundary. Figure

5 is an example of this approach. While the argument is made that there is no value in

investigating good news, there are two reasons why this can be appropriate. One is to verify that

the good news is real. The other is to evaluate the possibility of reallocating resources to more

needy areas. While the good side is not as critical as a negative threshold breach, it is worth more

than casual attention.
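A sketch of the double-boundary classification described above; the band widths are arbitrary and would in practice be set per parameter:

```python
# Classify an actual TPM value against its planned value using two symmetric
# variance bands (illustrative widths): green, yellow (handled in the program),
# red (escalated above the program).
def classify(actual, planned, yellow=0.05, red=0.10):
    deviation = abs(actual - planned) / abs(planned)
    if deviation <= yellow:
        return "green"
    if deviation <= red:
        return "yellow"
    return "red"

print(classify(actual=57.8, planned=55.0))   # 5.1% deviation -> "yellow"
```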

Tracking Progress. Of course, the efforts so far are without value if the actual results are not

tracked as the program progresses, figure 12, and corrective action taken as necessary. The

tracking frequency is established in the original plan along with the collection and analysis

methods. Relationships to other risk mitigations are maintained in the risk plans.

Figure 12. Tracking Actuals
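Tracking then amounts to comparing each reported actual with the plan-curve value for that reporting period and flagging threshold breaches for corrective action; the history below is invented:

```python
# Hypothetical tracking of actuals against the plan-curve values at each
# reporting point, flagging points where the variance threshold is breached.
history = [
    # (month, planned, actual)
    (6,  58.5, 59.5),
    (12, 55.5, 57.4),
    (18, 52.9, 56.1),
]
threshold = 0.05   # illustrative allowable fractional variance from plan

for month, planned, actual in history:
    variance = (actual - planned) / planned
    status = "breach - corrective action" if abs(variance) > threshold else "ok"
    print(f"month {month:2d}: plan {planned:.1f}, actual {actual:.1f}, "
          f"variance {variance:+.1%} ({status})")
```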

[Figure 11: variance thresholds around the plan approaching the target or limit over time. Figure 12: actual performance plotted against the plan from program start to delivery.]


Integration

When the complete process is followed, TPMs and risk management combine to add the technical

planning and progress to the overall management process as shown in Figure 13 and real meaning

to tracking the Technical Critical Path.

Figure 13. TPMs integrating SE and PM

EVMS impacts. There are some complications in using this integrated approach. When a task is

completed but the performance is not within the planned value limits, the earned value should not

be taken. As with any incomplete task, the impact on downstream tasks must be specifically

addressed. The complication comes in the case of a prototype, for instance, where the course of

action is not to continue to redo the prototype. Even though the original task, building and testing

the prototype, will end, care must be taken to properly handle whatever additional work results

from the failure and not just take full earned value and move on.
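One illustrative way (not a prescribed EVMS rule) to avoid taking full earned value when the planned technical value is missed is to credit value in proportion to the technical progress actually achieved, leaving the shortfall visible as remaining work:

```python
# Illustrative (not prescriptive) rule: withhold part of a task's earned value
# when the planned TPM value for that task was not achieved, so the remaining
# technical work stays visible in the baseline.
def earned_value(budget, planned_tpm, achieved_tpm, initial_tpm):
    """Credit EV in proportion to technical progress toward the planned value."""
    planned_gain = planned_tpm - initial_tpm
    achieved_gain = achieved_tpm - initial_tpm
    if planned_gain == 0:
        return budget
    fraction = max(0.0, min(1.0, achieved_gain / planned_gain))
    return budget * fraction

# Hypothetical prototype task: budget $200K, weight was to fall from 62 to 55 kg
# but only reached 58 kg, so only part of the value is taken.
print(earned_value(budget=200.0, planned_tpm=55.0, achieved_tpm=58.0, initial_tpm=62.0))
```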

An easier to accept but still less than straightforward situation exists if the activity exceeds the

limits on the positive side. The concern here is to assess reductions in future tasks that may be

available.

The most complex situation would be where several requirements are in play and some go to the

good side while others go to the bad. In this case, a decision based on overall risk exposure from

the combined factors is a more reasonable way to address the question. A net risk reduction goal

could be identified as the objective rather than individual parameter changes.
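A sketch of the roll-up suggested here, scoring each TPM-related risk as probability times impact and comparing the combined exposure with a net reduction goal; all numbers are hypothetical:

```python
# Hypothetical roll-up of risk exposure across several TPM-related risks,
# scored as probability x impact, compared against a net reduction goal.
risks_at_baseline = {"weight": (0.6, 4), "throughput": (0.5, 3), "power": (0.4, 2)}
risks_now         = {"weight": (0.3, 4), "throughput": (0.7, 3), "power": (0.4, 2)}

def exposure(risks):
    return sum(p * impact for p, impact in risks.values())

baseline, current = exposure(risks_at_baseline), exposure(risks_now)
goal = 0.8 * baseline    # e.g., a net 20% reduction goal rather than per-parameter goals
print(f"baseline {baseline:.1f}, current {current:.1f}, goal <= {goal:.1f}")
```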

TPM Uses

The initial value of TPMs is to proactively forecast the technical progress to be achieved instead

of only reactively responding to technical events. TPMs identify differences between actual and planned performance to support informed management. This enables the technical and

program management to assess and predict progress towards achieving the performance values,

determine the impact of these differences on system effectiveness, and react effectively. TPMs

provide an early indicator of risks and problems requiring management attention.

[Figure 13: a TPM plan (propellant versus time against a target) and a risk matrix linked to the WBS, the earned value chart (budget baseline, management reserve, EV, actuals, EAC, CV, SV), and the milestone schedule (AWD, PDR, CDR, TRR, FCA/PCA) with the hardware/software design, test, and integration tasks, tying the technical, schedule, and cost views together.]


Figure 14 shows the use of TPMs in several modes. On the left, the TPMs are monitoring

performance gaps as they are closed. On the right, the TPMs monitor positive margins to assure

that they do not disappear too rapidly. In the center is an alternative that may not be used often, if

ever, but should be considered. If a key parameter is initially on target but has a history of varying,

making it a risk, it would be a candidate to monitor.

Figure 14. TPM Use Examples

In addition to tracking progress, TPMs also allow the manager to determine where

opportunities exist to make design trades to reduce overall risks. One example of this was

described by Hamman and Mackertich (INCOSE 2007) where the understanding of positive

technical margins enabled a trade of technical performance to reduce cost with significant savings.

TPMs can also be used to monitor performance improvement after delivery as shown in figure 15.

However, this is an additional use and not the main focus of predicting technical progress during

development.

Figure 15. TPM Use After Delivery

[Figure 14: example TPM charts (probability of detection, bandwidth use, MTBF, throughput, and processing time versus time) grouped under “Manage Improvement,” “Maintain Level,” and “Manage Margin.” Figure 15: reliability (MTBF) versus time continuing from an initial target to an extended target after delivery.]


Things to Avoid

Driving through the rear view mirror. Some programs and organizational processes have fallen

into the trap of defining TPMs as reporting past results only as in figure 16. In addition to the

obvious issue that the future expectations are not defined, the current state has no target to define

its acceptability. In this case, there is no decision point to determine when corrective action is

needed.

Figure 16. Looking in the Rear View Mirror

Straight line predictions. This mistake, figure 17, is not acceptable in any progress prediction

with extremely rare exceptions. The last few improvements are almost always a bit slower.

However, the practice continues to occur.

Figure 17. Straight Line Prediction

The miracle. One guide that had some good guidance on TPMs seems, unfortunately, to have

fallen victim to the elimination of military standards. While the text was valuable, a different kind

of lesson can be gleaned from the example at the end of the guide, figure 18. In this example,

[Figure 16: weight versus program schedule time plotted only up to “today,” with the target shown but no prediction beyond it. Figure 17: weight versus program schedule time with a straight-line prediction to the target.]


progress on throughput time in information processing software was to be tracked. The time per

message needed to be reduced to meet the requirements. Modest progress was predicted through

the development process. Some degradation was predicted at system test. However, the miracle

predicted was the sudden improvement when the system is to be fielded and live data added. In

reality, it should be expected that things get worse when the real world is encountered. One possible explanation is that the chart was meant to track throughput and to reflect a positive margin with realistic

expectations of the margin disappearing when fielded.

Figure 18. Predicting a Miracle

Only used after delivery. The use of TPMs after delivery is acceptable, as described above.

However, one error that has occurred frequently in classroom exercises is to believe that TPMs

only start at delivery, figure 19. The key is to predict and track performance throughout

development.

Figure 19. TPMs after delivery

[Figure 18 (the USAF TPM Guide example): processing time versus program schedule time through unit test, system test, and live data test, relative to the target. Figure 19: reliability (MTBF) versus time with initial and extended targets, tracked only from delivery (“now”) onward.]


Summary

The concept of a Technical Critical Path is evident in some form on most projects that depend on

technological advances. In order to make it work, there needs to be a measurement approach that

will yield actionable information with enough granularity to allow for management response.

Technical Performance Measures are a basic part of technical management that can be applied to

provide that level of measurement to the Technical Critical Path. While the concept of TPMs is

quite simple, the execution has been sporadic in many programs and organizations. Following a

few simple guidelines can produce effective measurement of technical progress and aid the

management processes such as earned value measurement. If done correctly, the TPMs can be a

missing link between the technical and management sides of a development project, provide

tracking of the Technical Critical Path, and increase the likelihood of project success.

References

INCOSE 2006, Systems Engineering Handbook - A Guide for System Life Cycle Processes and

Activities, Version 3, Edited by Cecilia Haskins.

Hamman and Mackertich, Get Smart – Enabling Enterprise Systems Intelligence and Decision

Making Through Critical Parameter Management, INCOSE, 2007

Mankins, J. C. 1995. Technology Readiness Levels. White paper, NASA Office of Space Access

and Technology. http://www.hq.nasa.gov/office/codeq/trl/trl.pdf.

Office of the Deputy Under Secretary of Defense for Acquisition and Technology, Systems and

Software Engineering, Enterprise Development. Systems Engineering Plan Preparation Guide,

Version 2.01. Washington, DC: ODUSD(A&T)SSE/ED, 2008.

Roedler and Jones, Technical Measurement, PSM/INCOSE 2005

Solar America Initiative: Technology Pathway Partnerships (TPP’s), US Department of Energy

presentation for April 18-19, 2006 meeting in Chicago

Systems Engineering Principles, Defense Acquisition University presentation, 22 May, 2008

BIOGRAPHY

Jim Armstrong has practiced systems engineering for 40 years, performing various roles including

configuration management, test, deployment, chief engineer, program manager, and program

element monitor. For the last 20 years, he has taught, consulted on, and appraised systems engineering in industry and government. He has also served on the author teams for most of the systems engineering standards and models.