


International Journal of Project Management 32 (2014) 1246–1259

Earned readiness management for scheduling, monitoring and evaluating the development of complex product systems

Romulo Magnaye a, Brian Sauser b, Peerasit Patanakul c, David Nowicki b, Wesley Randall b

a Ramapo College of New Jersey, Anisfield School of Business, 505 Ramapo Valley RD, Mahwah, NJ 07430, United States
b University of North Texas, College of Business, Complex Logistics Systems, 1155 Union Circle, #311396, Denton, TX 76203, United States

c Penn State University at Erie, The Behrend College, Sam and Irene Black School of Business, 4701 College Drive, Erie, PA 16563, United States

Received 18 April 2013; received in revised form 13 January 2014; accepted 21 January 2014. Available online 10 February 2014.

Abstract

How should the development of a complex product system (CPS) be managed in a manner that focuses on process milestones; is responsive to changes in technology and requirements; is based on maturity measures; is applied in an interactive manner; and facilitates timely feedback? This is an important question in project management. Project management tools and techniques have been inadequate for monitoring technology development in a CPS: if the technologies are not properly matured by a specific point in time, the progress of the project suffers. To address this important gap, the objective of this study is to develop a new maturity-focused methodology for scheduling, monitoring and evaluating the development of a system. We present Earned Readiness Management (ERM) for system scheduling, monitoring and evaluation, which was developed and validated using a case study. Future research on ERM is also discussed.
© 2014 Elsevier Ltd. APM and IPMA. All rights reserved.

Keywords: Control; Earned readiness management; Integration readiness level; IRL; Monitoring; Scheduling; System readiness level; SRL; Technology readiness level; TRL

1. Introduction

Proper planning and control through scheduling, monitoring and evaluation have been found to be among the necessary elements that contribute to the success of new products and research and development projects (Dvir and Lechler, 2004; Pinto and Slevin, 1987). Likewise, product development is a significant competitive advantage for firms (Browning and Ramasesh, 2005). Unfortunately, for complex product systems (CPS) (Hobday, 1998; Hobday et al., 2005) or high technology projects (Archibald, 2003), control cannot yet be effectively achieved because process metrics or performance measures for systems development have not been fully developed (Hertenstein and Platt, 2000; Suomala, 2004). CPS are characterized by customized, interconnected subsystems and high cost; they are produced in low volume, require a breadth and depth of knowledge and skills, involve multiple collaborators, and entail continuous integration with clients and suppliers (Sauser, 2008). Herroelen (2005) would classify CPS as high variability, high dependency products, a class for which research and practice in project scheduling have lagged. In the absence of performance measures, project results cannot be measured and compared against pre-specified benchmarks, making it difficult to control outcomes (Choudhury and Sabherwal, 2003; Kirsch, 1996). In addition, currently available tools and techniques for planning and control are fragmented and not used consistently throughout the process (Patanakul et al., 2010; Pawar and Driva, 1999). This absence of process metrics, along with high variability, high dependency and lack of consistency, makes it difficult to use control mechanisms in a way that positively influences the performance of CPS development (Tiwana and Keil, 2009).

More than an unwillingness to bother with such measures, perhaps project or engineering managers do not use them consistently because they find them irrelevant or unable to address the need to control the development of CPS. In particular, these planning and control tools focus on measuring specific performance aspects of the system, such as task completion, cost and schedule, which may be important to some stakeholders but are unable to show whether the system is maturing adequately over the development lifecycle. Concentrating on the measurement of these variables can mislead a project's focus in terms of which activities have been accomplished. In a review of process models for product development, Browning and Ramasesh (2005) showed that such models have overemphasized activities rather than interactions or deliverables: most models focus on optimizing the relationship between elements and do not consider the project or system-level view, which sub-optimizes the system. Humans tend to apply more attention to activities that are being measured and rewarded (Blackburn and Valerdi, 2009; Chiesa et al., 2008; Shuman et al., 1995). Therefore, it follows that when traditional project management tools are applied to measure cost and schedule variances, project managers will concentrate on meeting cost and schedule targets as they complete the prescribed work packages. Unfortunately, in the case of CPS, achieving favorable variances does not guarantee that the system is maturing as planned. Rather, by focusing on the work packages and ensuring that they are achieved on time and within cost, the project manager is unable to view the system as a whole and assess its overall readiness. For example, when novel CPS development tasks are identified, there is no certainty that completing those tasks will lead to the desired maturity (see footnote 6) of the system. Those work packages are educated guesses based on expertise, data, and standards. As more information regarding the new technology becomes available, some tasks may have to be revised. However, by focusing on the project tasks and ensuring that they are achieved on time and within cost, the project manager may not realize this until much later. Maintaining a system-wide view is most important during the earlier phases of the development lifecycle, when uncertainty is still high but corrective actions are still manageable (Tang and Otto, 2009). To encourage a system-wide perspective of the development process, the project manager must use system-wide process measures to control the development of systems through proper planning, scheduling and monitoring.

The importance of system scheduling, monitoring and evaluation, and the absence of an accepted effective approach, motivated us to examine current practices through a review of the literature in product and systems management and engineering and related fields, followed by verification through our research contracts and roundtable discussions with industry and government. We used our findings to develop a conceptual framework for CPS planning, scheduling, monitoring and evaluation which can help project and engineering managers control the development process and contribute towards its success. We applied the model to a CPS (a space system undergoing development) to show how it works and establish theoretical validity.

6 In this paper, "maturity" or "developmental maturity" is the characterization of the development status of a technology/integration/system that can be quantified to determine the corresponding readiness (Tan et al., 2011).

Our study makes two novel contributions to research. First, it provides managerial context to the use of a system-focused development scale (the System Readiness Level [SRL]), which has been previously proposed as a CPS measure of developmental maturity (Sauser et al., 2008a,b). Second, this research illustrates how a system development plan can be translated into a readiness-oriented scheduling, monitoring and evaluation approach which is simple and familiar to project and engineering managers. The approach is also interactive and absorbs changes in technologies, architecture and cost estimates.

2. Methodology

In order to understand how the development of a complex product system can be effectively controlled, we reviewed the literature, analyzed the concepts that previous researchers have suggested and talked to professionals involved in the field.

Our review of the literature focused on the field of Systems Engineering Management, which is currently the main arena where the development of complex products is discussed. We also included the fields of New Product Development (NPD) projects and the management of Research and Development (R&D). The search was done electronically using the key phrases project control, scheduling, monitoring, evaluation, system maturity and system readiness. We specifically looked for research output on which factors were identified as important to the success of projects. More specifically, we wanted to know if control (scheduling, monitoring and evaluation) was an important factor. It was. We then narrowed the search down to the characteristics of control mechanisms that would be effective for complex product systems. Those which were relevant are the ones cited in this paper. In cases where multiple papers had been published by the same author(s) on the same topic, we only used the most complete versions, not necessarily the earliest or the latest. The goals were to establish the role which control mechanisms play and to identify what characteristics such mechanisms should have in order to be effective.

In the event that practice in the field was ahead of the literature, we discussed our initial findings with a small group of practitioners from the Department of the Army, Northrop Grumman Corporation and Lockheed Martin Corporation. This group added their own thoughts to our efforts.

These early research activities allowed us to identify the characteristics of a management approach which can be effectively applied towards the successful development of complex products. We then formulated a new approach which was discussed with an expanded group of practitioners and researchers. This group included participants from Analytic Services Inc., JB Consulting, Infologic Inc., Institute for Defense Analyses, Johns Hopkins University Applied Physics Laboratory, Lockheed Martin Corporation, National Aeronautics and Space Administration, National Reconnaissance Office, Northrop Grumman Corporation, Science Applications International Corporation, the U.S. Departments of Defense, Army and Navy, and Verizon Corporation. The approach was also presented at conferences on defense acquisition.

When the response from this larger group turned out to be favorable, we applied the approach to a space system to establish its conceptual validity, albeit with disguised primary information combined with secondary data. This methodology is illustrated by the diagram in Fig. 1.

Fig. 1. Research approach for developing ERM. [Figure: exploring current practice (a literature review of systems engineering management, new product development projects and R&D projects, plus workshops, roundtable discussions and research contracts with industry and government) establishes the role and validated characteristics of effective control; model development and validation (a conceptual model comprising a readiness-oriented system breakdown structure, a system development schedule with readiness levels, time and costs, monitoring progress, evaluating accomplishments and managerial action, plus a written test case, HRSDS, since access to an actual system was not feasible) yields a theoretical framework for scheduling, monitoring and evaluating systems under development.]

3. Results

3.1. Effective control of systems development

Complex product systems (Hobday, 1998; Hobday et al., 2005) or high technology projects (Archibald, 2003) are typically conceptualized from a set of very demanding requirements and then designed with a high level of uncertainty with respect to their architectures and constituent components. This uncertainty results from the anticipated novelty and high technological content of the proposed system (Shenhar, 2001; Shenhar and Dvir, 2007). Unfortunately, these are often used as the excuse for proceeding with the development of a system without sufficient system-wide development plans, scheduling, monitoring and evaluation mechanisms, which can be used as the basis for evaluating its maturity. A plan and a monitoring process are necessary for the success of development efforts. Cooper and Kleinschmidt (2007) concluded that project plans and schedules, along with monitoring and feedback, are among the ten factors critical to successful project implementation. More specifically, Pinto and Mantel (1990) found that for R&D projects, ineffective scheduling is strongly related to failure of the implementation process and that monitoring and feedback can impact client satisfaction. Similarly, Dvir and Lechler (2004) found that the quality of planning has a positive effect on efficiency (schedule, budget and scope), perceived value and client satisfaction. The lack of an appropriate plan makes it difficult to control the development process, which can lead to cost overruns, delays or total failure, as has happened in some government projects (Dodaro, 2009, 2010), research and development efforts (Slevin and Pinto, 1987) and new product development (Bart, 1991, 1999).



In order to be successful, managerial control (which begins with setting out the development plan, the schedule and how a project will be monitored and evaluated) is applied in the earlier stages of new product development (Bonner et al., 2002). The plan must be executed in such a way that it holds performance to plan, should not be overly restrictive (Bart, 1991, 1999) and should be based on milestones (Andersen, 1996; Anderson et al., 1995). The ability of managerial control to positively influence the success of the development is also reinforced when it is applied in an interactive, flexible and responsive manner (Bisbe and Otley, 2004; Iansiti, 1995). Davila (2000) and Davila et al. (2009) identified two roles that managerial control plays: 1) promoting goal congruence among team members and 2) reducing uncertainty (resulting from novelty and high technological content) by enhancing coordination and learning through the creation of an information infrastructure. This is most important during the design and planning stages of the development of a new CPS, when control through coordination and learning, using timely gathering and fast dissemination of information, becomes critical to success.

The information needed includes changing market or user requirements, technological improvements (from within the development organization and from competing efforts elsewhere) and the realities of systems integration. To exercise control, knowledge that is generated during systems development has to be reckoned against the original design and development plans.

Hertenstein and Platt (2000) observed that mechanisms to control the development process can be classified into three categories: 1) the position of product development in the organization, 2) the product development process and 3) the performance measures applied during the process. To be successful, it has been observed that the product development team must be positioned such that it is permanently headed by a heavyweight manager who reports directly to a senior executive (Clark and Fujimoto, 1991; Clark et al., 1987; Cooper, 2005). Furthermore, the product development process must be well planned and articulated and, along with the performance measures, must be linked to the strategy of the organization (Brown and Eisenhardt, 1995; Cooper and Kleinschmidt, 1987, 1993, 1995). Finally, when the system under development is very complex and has high levels of uncertainty due to its novelty and high technological content (e.g. CPS), the development process must be at the optimizing level, the highest level of process maturity as defined by Karandikar et al. (1992). At this level there has to be a high degree of control over the process, and the major focus is on the application of process metrics and lessons learned in order to quickly identify problem areas and respond promptly.

3.2. Monitoring and control methodologies for assessing system maturity

Techniques from project management have been recommended for monitoring development efforts (Archibald, 2003; Buys, 2008). Besides being used for scheduling purposes, Gantt charts, PERT, CPM and Critical Chain Scheduling can also be used for monitoring and control. Earned Value Management (EVM) (Barr, 1996; Jaafari, 1996; Kim et al., 2003; Moselhi et al., 2004; Raby, 2000) is another widely used tool in CPS for project monitoring and control, since its application has, to a certain extent, proven to be effective. During the 1990s, in response to escalating costs, schedule delays and project failures, an attempt was made to introduce some structure to the development of CPS in the US Department of Defense (DoD). DoD applied EVM successfully, to the extent that cost overruns for its major weapons systems were reduced by 95% during the couple of years following its application (Abba, 1997, 2001). However, several years later, the same problems returned and the Government Accountability Office (GAO) began raising the issue. It estimated that by 2008, the major DoD systems under development were over budget by up to $296 billion (GAO, 2008). Our conversations with practitioners indicated that EVM turned out to be ineffective because, for the approach to work, a precise breakdown of all the tasks is needed, something that is not possible because a very novel CPS can only freeze its requirements late in the development cycle. This can pose a significant challenge in developing a meaningful project cost baseline.

As a possible solution, DoD began to advocate the need for technologies to be sufficiently matured before they are incorporated into systems that are under development. It has also strongly recommended the use of metrics to assess the readiness of these technologies and of the systems into which they are incorporated. With regard to measuring the maturity of new technologies and systems, process metrics or performance measures for systems development are not yet fully developed (Suomala, 2004). Currently available tools and techniques, such as budgets, Quality Function Deployment (Hauser and Clausing, 1988), the Concept Selection Process (Pugh, 1991), First Requirements Elucidator Demonstration (FRED) (Kasser, 2004), the Integrated Design Model (Vollerthun, 2002), the Subsystem Tradeoff Functional Equation (Shell, 2003), Design for Manufacturability (Whitney, 1988), the Design-Build-Test Cycle (Clark and Fujimoto, 1991), Periodic Prototyping (Wheelwright and Clark, 1992), Cost as an Independent Variable or CAIV (Brady, 2001) and Lean Product Development Flow (Oppenheim, 2004), are fragmented and not used consistently throughout the process (Pawar and Driva, 1999).

To be able to measure the maturity of technologies, US government organizations like the National Aeronautics and Space Administration (NASA) and DoD adopted the Technology Readiness Level (TRL) scale (Mankins, 1995, 2002; Sadin et al., 1989). This scale has also been accepted by foreign defense establishments and coalitions, the US Department of Energy (DoE) and other institutions that have to manage the development of new high technology elements. A summary of TRL is presented in Table 1.

Table 1
Technology Readiness Levels (DoD, 2005).

TRL  Definition
9    Actual system proven through successful mission operations
8    Actual system completed and qualified through test and demonstration
7    System prototype demonstration in operational environment
6    System/subsystem model or prototype demonstration in relevant environment
5    Component and/or breadboard validation in relevant environment
4    Component and/or breadboard validation in laboratory environment
3    Analytical and experimental critical function and/or characteristic proof-of-concept
2    Technology concept and/or application formulated
1    Basic principles observed and reported

Table 3
System Readiness Levels (Sauser et al., 2010).

SRL range       Name         Definition
Not applicable  Phase F      Closeout
0.90 to 1.00    Phase E      Operation and sustainment
0.85 to 0.89    Phase D      System assembly, integration and test launch
0.80 to 0.84    Phase C      Final design and fabrication
0.50 to 0.79    Phase B      Preliminary design and technology completion
0.20 to 0.49    Phase A      Concept and technology development
0.10 to 0.19    Pre-Phase A  Concept studies

NOTE: These ranges have been derived conceptually.

However, systems involve more than just their component technologies. More importantly, these components have to be linked together, which indicates that integration elements must also be evaluated and managed accordingly. The addition of a relevant metric, separate from TRL, can be effective because integration elements tend to be different in nature from technology elements, and system integration has become a core capability in organizations developing CPS (Henderson and Clark, 1990). For example, integration elements tend to be multi-dimensional and multi-faceted. To address this, an Integration Readiness Level (IRL) scale (see Table 2) was formulated by Sauser et al. (2006) and refined by Sauser et al. (2007) and Sauser et al. (2009).

Eventually, both metrics were used to develop a system-wide maturity scale called the System Readiness Level (SRL). It is calculated as a function of TRL and IRL (Sauser et al., 2008b, 2010) for assessing the developmental maturity of systems or CPS.

A summary for space systems is presented in Table 3. Note that the values in Table 3 are conceptually derived, not calibrated against real systems. That is, if TRL and IRL values are still within a certain early range, say 1 to 3, the calculated SRL for the system may range from 0.20 to 0.49, indicating that it is still in the concept and technology development phase. SRL has been well documented as an assessment of system development (Azizian et al., 2010; Cilli and Parnell, 2010; Erhardt et al., 2010; Forbes et al., 2009; Garret et al., 2009; Magnaye et al., 2009, 2010; Majumdar, 2007; Ramirez-Marquez and Sauser, 2009; Scearce, 2007).
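To make the calculation concrete, the following is a minimal sketch of the normalized matrix formulation reported in Sauser et al. (2008b, 2010); the function name and data layout are our own:

```python
import numpy as np

def system_readiness_level(trl, irl):
    """Composite SRL: TRL and IRL values (1-9) are normalized to [0, 1],
    each technology's readiness is combined with that of its integrations,
    and the system SRL is the mean over all technologies.
    trl: length-n vector of TRL values.
    irl: n-by-n symmetric matrix of IRL values, zero where no integration
    exists; the diagonal is treated as a perfect self-integration."""
    trl = np.asarray(trl, dtype=float) / 9.0
    irl = np.asarray(irl, dtype=float) / 9.0
    np.fill_diagonal(irl, 1.0)              # self-integration
    m = np.count_nonzero(irl, axis=1)       # integrations per technology
    srl_i = (irl @ trl) / m                 # per-technology readiness
    return srl_i.mean()

# Two technologies at TRL 5 joined by one IRL 3 integration give an SRL
# of 0.37, consistent with the two-technology example in Section 4.
print(round(system_readiness_level([5, 5], [[0, 3], [3, 0]]), 2))  # 0.37
```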

Table 2
Integration Readiness Levels (Sauser et al., 2007).

IRL  Definition
9    Integration is Mission Proven through successful mission operations.
8    Actual integration completed and Mission Qualified through test and demonstration in the system environment.
7    The integration of technologies has been Verified and Validated with sufficient detail to be actionable.
6    The integrating technologies can Accept, Translate, and Structure Information for its intended application.
5    There is sufficient Control between technologies necessary to establish, manage, and terminate the integration.
4    There is sufficient detail in the Quality and Assurance of the integration between technologies.
3    There is Compatibility (i.e. common language) between technologies to orderly and efficiently integrate and interact.
2    There is some level of specificity to characterize the Interaction (i.e. ability to influence) between technologies through their interface.
1    An Interface between technologies has been identified with sufficient detail to allow characterization of the relationship.

A recent development in systems engineering is the application of resource allocation in the formulation of maturity-oriented development plans. This aspect was addressed by Ramirez-Marquez and Sauser (2009) through a maturity-based model called SRLmax, but, as the name implies, it is designed to maximize the readiness of a system given a pre-determined availability of resources. When it comes to containing costs, at best this model can be used indirectly to set cost targets, provided the temptation to develop the system faster than necessary is avoided. This can be accomplished by limiting the availability of resources allocated for each development period. Unfortunately, owners of systems under development have a tendency to want their products completed sooner than necessary, particularly when the projects fall behind schedule. To avoid a tendency to overspend in order to catch up, there is a need for a methodology that is more directly targeted at formulating a system development plan that minimizes the cost of developing a system. To address this, Magnaye et al. (2010) suggested a similar model called SCODmin. This model was used to formulate a system development plan which identifies which components of the system should be matured to what readiness, and when, such that development costs are minimized while a pre-determined level of system maturity is reached at the end of each year.
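The published SCODmin formulation is not reproduced here, but the following toy sketch illustrates the kind of search such a model performs, finding the least-cost combination of readiness targets that attains a required SRL for a two-technology, one-integration system. The flat per-level costs and helper names are hypothetical, and system_readiness_level is the sketch given above:

```python
from itertools import product

# Hypothetical incremental cost ($M) of each readiness-level transition.
COST = {
    "tech1": {lvl: 1.0 for lvl in range(1, 10)},
    "tech2": {lvl: 1.2 for lvl in range(1, 10)},
    "int12": {lvl: 0.8 for lvl in range(1, 10)},
}

def cost_to_reach(element, level):
    """Cumulative cost of maturing an element from scratch to `level`."""
    return sum(COST[element][l] for l in range(1, level + 1))

def cheapest_plan(target_srl):
    """Brute-force the least-cost (TRL1, TRL2, IRL) triple meeting the
    target SRL -- a toy stand-in for the SCODmin optimization."""
    best = None
    for t1, t2, i12 in product(range(1, 10), repeat=3):
        srl = system_readiness_level([t1, t2], [[0, i12], [i12, 0]])
        if srl >= target_srl:
            cost = (cost_to_reach("tech1", t1) + cost_to_reach("tech2", t2)
                    + cost_to_reach("int12", i12))
            if best is None or cost < best[0]:
                best = (cost, t1, t2, i12, round(srl, 2))
    return best

print(cheapest_plan(0.37))  # cheapest readiness targets reaching SRL 0.37
```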

3.3. Summary of current practices

The review of the literature and discussions with practitioners indicate that a properly constituted managerial monitoring and control methodology, focused on the maturity of technologies, integration elements and the system as a whole, is important to the success of new systems development. To exercise proper control there must be a development plan to serve as the foundation for a scheduling, monitoring and evaluation method. It serves two purposes: promoting goal congruence and reducing uncertainty.

For development to succeed, the methodology for control must be applied early, must not be overly restrictive, must use maturity levels as milestones, and must be interactive and responsive to changes in technology and requirements. This can be facilitated through an interactive managerial control system that promotes goal congruence and enhances learning through the creation of an information infrastructure with process performance metrics that are linked to the strategy of the enterprise. Metrics which can measure the maturity of technologies, integration elements and the system itself have been formulated. These have been used in resource optimization models to determine optimal development plans.


4. A maturity model for control (scheduling, monitoring and evaluation)

Based on the review of the literature and discussions with practitioners, the conceptual model that we formulated has the following steps and is represented in Fig. 2 by the shaded areas:

1. Define a readiness-oriented system breakdown structure
2. Assign costs to achieve every readiness level of each element
3. Identify the optimal development plan
4. Translate the plan into a system development schedule
5. Establish a readiness measurement baseline
6. Track progress
7. Evaluate performance
8. Apply corrective measures (as required)
9. Identify, disseminate and apply lessons learned.

The system can be broken down into the individual readiness levels and their readiness-oriented work packages, and then arranged into a system readiness breakdown structure (SRBS). A simple example for a system with two technologies and one integration element, undergoing an 8-year development, is discussed below.
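As a sketch of how the four levels of such a structure nest (a hypothetical fragment; element names and work-package labels are illustrative only):

```python
# Level 1: the whole system; Level 2: critical elements; Level 3: the
# readiness levels to be reached; Level 4: work packages per level.
srbs = {
    "System Under Development": {
        "Critical Technology Element 1": {
            "TRL 1": ["observe and report basic principles"],
            "TRL 2": ["formulate technology concept"],
            # ... TRL 3 through TRL 9, each with its work packages
        },
        "Critical Technology Element 2": {
            "TRL 1": ["observe and report basic principles"],
            # ...
        },
        "Critical Integration Element 1,2": {
            "IRL 1": ["identify and characterize the interface"],
            # ... IRL 2 through IRL 9
        },
    }
}
```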

Level 1 is the whole system, Level 2 indicates its critical elements, Level 3 represents the readiness levels which have to be reached (see the descriptions of TRL and IRL in Tables 1 and 2), and Level 4 contains the work packages that must be completed in order to achieve the readiness levels. The cost of maturing each and every element of the system throughout the development process must be estimated. The estimated costs for each readiness level are obtained through engineering analysis, analogy, expert estimates or combinations of these. For very novel technologies and integration elements, it may not be possible to accurately determine the cost of maturing from one readiness level to the next. In such cases, the total cost of a technology may be estimated using analogy or expert opinion and then allocated to each readiness level based on a probability distribution. Prior research has shown that, at least for a defense system, cost allocation can be fitted to a Rayleigh distribution (Abernethey, 1984; Blumenson and Miller, 1963; Lee et al., 1993).

Fig. 2. Conceptual model of ERM. [Figure: the system architecture and its critical technology and integration elements feed a readiness-oriented system breakdown structure; the cost per readiness level (technical estimates or a Rayleigh distribution) and the total cost of each critical element feed the SCODmin model (PSDA), which produces the optimal development plan; this becomes the system development schedule, followed by tracking progress, evaluating accomplishments and managerial action, with feedback and lessons learned throughout.]
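As an illustration of such an allocation (a toy sketch; the paper cites Rayleigh fits to contract cost data but does not prescribe an algorithm, and the scale parameter here is a hypothetical choice):

```python
import math

def rayleigh_cost_allocation(total_cost, levels, sigma=3.0):
    """Spread an element's total maturation cost across readiness levels
    in proportion to a Rayleigh density evaluated at each level index.
    In practice `sigma` would be fitted to historical spending profiles
    (cf. Abernethey, 1984; Lee et al., 1993)."""
    density = [l / sigma**2 * math.exp(-l**2 / (2 * sigma**2)) for l in levels]
    scale = total_cost / sum(density)
    return {l: d * scale for l, d in zip(levels, density)}

# Allocate a hypothetical $21.5M technology budget across TRL 1-9.
for trl, cost in rayleigh_cost_allocation(21.5, range(1, 10)).items():
    print(f"TRL {trl}: ${cost:.2f}M")
```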

The data on cost per readiness level for each element of the system is the main input to the optimization methodologies for identifying an optimal development plan which reflects the development objective (maximize maturity or minimize development cost). These have been presented previously by Ramirez-Marquez and Sauser (2009) and Magnaye et al. (2010). The description of SRL was presented earlier in Table 3 (for the calculation of SRL, see Ramirez-Marquez and Sauser (2009) and Sauser et al. (2008b, 2010)). For example, after the cost data have been estimated for the hypothetical 2-technology, 1-integration model and fed into the cost minimization model, a sample output may look like the one illustrated by Table 4. It shows that to reach a system readiness level of 0.37 at the end of year four, Technology 1 must be at TRL 5, Technology 2 at TRL 5 and integration element 1,2 must be at IRL 3.

The SRBS can then be combined with the optimal development plan to set a system development schedule. This is illustrated by Table 5. It shows when to achieve each and every readiness level and how much each would cost (shown with randomly generated cost figures). A work package is awarded an earned readiness value if the targeted readiness level has been achieved, as determined by an independent assessment process which the organization prescribes.

The readiness measurement baseline is the Budgeted Cost of Readiness Scheduled (BCRS), which is the cumulative sum of the value of all the readiness levels over time. Actual performance is represented by the Budgeted Cost of Readiness Achieved (BCRA) and its cost by the Actual Cost of Readiness Achieved (ACRA). These ERM concepts, their relationship to each other and how they compare to similar Earned Value Management measures are illustrated in Fig. 3.
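Formally (in our notation, not the paper's), with $b_{e,r}$ and $a_{e,r}$ the budgeted and actual cost of bringing element $e$ to readiness level $r$, the three measures at review time $t$ are:

$$\mathrm{BCRS}(t) = \sum_{(e,r)\ \text{scheduled by}\ t} b_{e,r}, \qquad \mathrm{BCRA}(t) = \sum_{(e,r)\ \text{achieved by}\ t} b_{e,r}, \qquad \mathrm{ACRA}(t) = \sum_{(e,r)\ \text{achieved by}\ t} a_{e,r}.$$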

Table 4
Example of an optimal development plan.

Fiscal year  Target SRL  TRL Technology 1  TRL Technology 2  IRL Integration 1,2
8            1.00        9                 8,9               8,9
7            0.74        7,8               6,7               6,7
6            0.48        6                 5                 5
5            0.44        6                 5                 4
4            0.37        5                 5                 2,3
3            0.25        3,4               4                 1
2            0.00        2                 3                 0
1            0.00        1                 1,2               0

The earned readiness is compared to the readiness measurement baseline at any point in time to evaluate progress. A negative cost variance results when the Actual Cost of Readiness Achieved (ACRA) is greater than the Budgeted Cost of Readiness Achieved (BCRA). The Readiness Cost Performance Index (RCPI) is the ratio of BCRA to ACRA (that is, RCPI = BCRA/ACRA). The RCPI is unfavorable when it is less than 1. By the same token, an unfavorable readiness or schedule variance is obtained when BCRA is less than the Budgeted Cost of Readiness Scheduled (BCRS). The Readiness Performance Index (RPI) is the ratio of BCRA to BCRS (that is, RPI = BCRA/BCRS). It is unfavorable when it is less than 1.
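These comparisons are simple enough to script; a minimal sketch (function and variable names are ours):

```python
def erm_indices(bcrs, bcra, acra):
    """ERM variances and performance indices from the three cumulative
    measures, all expressed in the same currency unit."""
    cost_variance = bcra - acra       # negative: readiness costs more than budgeted
    readiness_variance = bcra - bcrs  # negative: maturing slower than scheduled
    rcpi = bcra / acra                # Readiness Cost Performance Index
    rpi = bcra / bcrs                 # Readiness Performance Index
    return cost_variance, readiness_variance, rcpi, rpi

# Hypothetical status: $10M of readiness earned against a $12M baseline,
# at an actual cost of $11M -- both indices unfavorable (below 1).
cv, rv, rcpi, rpi = erm_indices(bcrs=12.0, bcra=10.0, acra=11.0)
print(f"CV={cv:+.1f}  RV={rv:+.1f}  RCPI={rcpi:.2f}  RPI={rpi:.2f}")
```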

Table 5
Example of a system development schedule (with hypothetical figures). [Table: the system readiness breakdown structure forms the rows (1. System Under Development; 1.1 Critical Technology Element 1, TRL 1-9; 1.2 Critical Technology Element 2, TRL 1-9; 1.3 Critical Integration Element 1,2, IRL 1-9), fiscal years Yr 1 through Yr 8 form the columns, and the cells hold the budgeted cost of each readiness level. Bottom rows: total annual costs of 8, 9, 27, 32, 29, 29, 118 and 118, and target SRLs of 0.00, 0.00, 0.25, 0.37, 0.44, 0.48, 0.74 and 1.00.]

Finally, the last two steps of the conceptual model, applying corrective measures and managing organizational learning, are contingent on the development of each specific system.

5. Illustrative example

To show how the conceptual model of ERM can be used, we applied it to a system under development. We developed a case around a space robotic system that was developed by NASA but was later aborted in favor of an alternative manned approach. We assumed that this aborted system was revived, and the development scenarios were created using publicly available information and disguised cost data. The test case is based upon an unpublished case, Robotic Servicing Mission for the Hubble Space Telescope (written by the same authors), which presented the actual development of the robotic servicing system. The relevant highlights of the case are presented in italics below.

After many years of exceptional service, NASA had been considering a technical solution to repairing the Hubble Space Telescope (HST) as it had far surpassed its expected lifetime. Its gyroscopes were approaching the end of their life cycle, its batteries were nearing their lower levels of acceptable performance and the fine-guidance sensors had already begun failing. If HST was not serviced soon and the batteries ran out, or navigational and attitude control was lost, certain instruments aboard would be permanently damaged either due to low temperatures or direct exposure to sunlight. In order to repair the HST, NASA had to perform a servicing mission (SM-4) to keep HST operating into the future. The problem was that HST had not been serviced since before the Columbia disaster, and the Columbia Accident Investigation Board (CAIB) requirements for shuttle operations would impact an HST SM-4 since time and fuel considerations would not allow the crew to safely dock to the International Space Station (ISS) if damage was found on the heat shield. To combat this problem a Robotic Servicing Mission (RSM) had been suggested, thereby reducing cost and the risk to human life. As an independent committee was established to determine the feasibility of this mission, in January 2010 the manager of HST Servicing Mission Operations submitted an updated plan to proceed with the development of HRSDM for use in Servicing Mission 5, scheduled for May 2014. Based on consultations and collaboration with the original project participants or their successor entities, the plan retained the original technologies and architecture of the system (see Fig. 4) but also incorporated the latest engineering advances. The plan estimated that incremental costs to mature the technologies from 2010 to 2014 will amount to $77.17 million, while integrating them into the system will cost another $188.57 million, for a total cost of $265.74 million (see Table 6).

Fig. 3. Earned readiness management concepts compared to earned value management.

Fig. 4. System concept diagram. Tech 1: Remote Manipulator System (RMS); Tech 2: Special Purpose Dexterous Manipulator (SPDM); Tech 3: Electronic Control Unit (ECU); Tech 4: Autonomous Grappling (AG); Tech 5: Autonomous Proximity Operations (APO); Tech 6: Laser Image Detection and Radar (LIDAR). [Figure: the six technologies and their integration links with current readiness: Tech 1 and Tech 2 at TRL 8, Tech 3 at TRL 7, Tech 4, Tech 5 and Tech 6 at TRL 6, connected by integrations at IRL 2 to IRL 6.]

The conceptual model for the Earned Readiness Management approach to scheduling, monitoring and evaluation of the development process was applied to this case. The first step was to break down the cost estimates from Table 6 to determine the incremental cost of maturing every element through each readiness level (obtained from Magnaye et al. (2010)). These are shown in Tables 7 and 8.

Table 6
HRSDS incremental (2010–2014) development costs (with disguised numbers).

Component                              Cost ($ millions)  General specs and comments
Remote manipulator system              9.00
Special purpose dexterous manipulator  7.65               7 degrees of freedom
Electronic control unit                14.23
Autonomous grappling                   21.50              7 degrees of freedom
Autonomous proximity operations        11.87              6 degrees of freedom, multi-mode, passive proximity sensing
Laser image detection and radar        12.92
System integration                     188.57
Total                                  265.74

Table 7
Incremental cost to mature technologies, in millions of dollars.

TRL    Technology 1  Technology 2  Technology 3  Technology 4  Technology 5  Technology 6
7                                                8.76          4.67          7.80
8                                  6.89          4.21          5.31          1.23
9      9.00          7.65          7.34          8.53          1.89          3.89
Total  9.00          7.65          14.23         21.50         11.87         12.92

Table 8
Incremental cost to mature integrations, in millions of dollars.

IRL    1,2    1,3    2,3    2,4    3,5    4,5    5,6
3                                         4.53   1.23
4                                         5.81   2.19
5                                         7.21   5.95
6      1.00                 2.75          9.00   7.00
7      1.75   2.00   0.50   5.40   3.45   12.00  8.08
8      4.00   4.00   4.50   6.32   4.57   14.32  10.03
9      6.00   6.50   5.50   7.45   6.78   17.65  11.10
Total  12.75  12.50  10.50  21.92  14.80  70.52  45.58

The numbers in the cells are the resources needed to accomplish the readiness level for that particular component of the system. For example, in Table 7, we would need $9.00 million to mature Technology 1 from TRL 8 to TRL 9. Similarly, from Table 8, we would need $1.00 million to mature Integration (1,2) from IRL 5 to IRL 6.
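Data in this shape is also the input the optimization step expects; a brief sketch that encodes Table 7 as nested dictionaries and checks each technology's total against Table 6 (the layout is ours):

```python
# Incremental cost ($M) per TRL transition, transcribed from Table 7;
# levels with no remaining work are simply absent.
tech_costs = {
    1: {9: 9.00},
    2: {9: 7.65},
    3: {8: 6.89, 9: 7.34},
    4: {7: 8.76, 8: 4.21, 9: 8.53},
    5: {7: 4.67, 8: 5.31, 9: 1.89},
    6: {7: 7.80, 8: 1.23, 9: 3.89},
}

# Column sums reproduce the component costs in Table 6, e.g. $21.50M
# for autonomous grappling (Technology 4).
for tech, levels in tech_costs.items():
    print(f"Technology {tech}: ${sum(levels.values()):.2f}M")
```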

Next, we applied the SCODmin optimization model (Magnaye et al., 2010) to the data and obtained the solution, or development plan, shown in Table 9. According to this plan, to mature the whole system from the concept and technology development phase (an SRL of 0.48) to the end of the preliminary design and technology completion phase (an SRL of 0.69) by 2011, we must allocate resources such that the TRL and IRL values shown in the 2011 row of the table are achieved. That is, Technology 1 must be at TRL 8, Technology 2 at TRL 8, Technology 3 at TRL 9 and so on.

From the optimal development plan, we prepared the system development schedule by showing the system readiness breakdown structure along the vertical, the timetable along the horizontal and the costs in the cells formed by their intersections. The schedule from 2004 to 2014 is shown as Table 10. According to this, work on the Electronic Control Unit (Technology 3), the Autonomous Proximity Operations module (Technology 5) and the integration links between elements 1&2, 1&3, 2&3 and 5&6 is to resume in Fiscal Year 2010, requiring a budgetary allocation of $20.230 million. The elements and integration links that should be worked on in the succeeding years can also be identified from Table 10, along with their costs. Altogether, the $700 million already spent for development from 2004 to 2005, followed by mothballing until 2009, plus the estimated incremental cost of $265.74 million from 2010 to 2014 (see Table 6) lead to a total system cost of $965.74 million.

There are situations where the optimization algorithm suggested that work on an element of the system be put on hold during some years and resumed later. For example, the integration link between elements 2&3 is advanced to an IRL of 7 by 2010, but no work on it is called for during 2011; it resumes only in 2012. Similar situations exist for the integration links between elements 3&5 and 5&6. This could mean that workers and equipment are idle for a year. That is not a big issue if they can be diverted to other tasks. Otherwise, it represents opportunity costs (due to foregone revenues) and administrative costs associated with laying off workers and re-hiring them later. This should be avoided unless there are genuine technical justifications for it. To avoid this problem, the development team may be allowed to start such affected activities early to maintain continuity. For example, instead of waiting until 2012 as prescribed by the algorithm, work on reaching an IRL of 8 for the integration link between elements 2&3 can be moved forward to 2011. The project managers and the owners of the system must weigh the trade-off between incurring some expenditures earlier than originally planned and the administrative and technical costs of putting the affected elements of the system on hold until work can be re-started at the later date. The adjusted system development schedule is shown in Table 11. The actual progress of the development process can be regularly measured and compared to the original schedule, and the readiness and cost performance measures of ERM can be calculated to identify problem areas and formulate remedial measures.

To illustrate how to report and evaluate the progress of the development process, consider the scenario described below.

Towards the end of Fiscal Year 2013, the system had been planned to be completing the system assembly and test launch phase with an SRL value of 0.89. Of the total budget of $54.02 million, $44.26 million has been spent, a favorable cost variance of $9.76 million. However, an assessment of the system under development leads to the uneasy conclusion that while tasks have been completed and most of the components have been produced and tested individually in space, assembly has been problematic. The support projects have been delayed because some system components, when integrated into the system, were not performing according to the specifications. This was particularly true for the Autonomous Grappling unit (Technology 4). It was unable to pass qualification through test and demonstration (TRL 8) as planned when its software failed to operate as intended. This, in turn, affected the integration elements that connect it to the Special Purpose Dexterous Manipulator (Technology 2) and the Autonomous Proximity Operations module (Technology 5).

Table 9
Optimal development plan.

Fiscal year  Target SRL  Phase                                       TRL (1 2 3 4 5 6)  IRL (1,2  1,3  2,3  2,4  3,5  4,5  5,6)
2014         1.00        Operations & sustainment                    9 9 9 9 9 9        9  9  9  9  9  9  9
2013         0.89        System assembly & test launch               9 9 9 8 9 9        9  9  9  8  8  5  7
2012         0.80        Final design & fabrication                  8 9 9 6 9 9        9  9  9  5  8  4  6
2011         0.69        Preliminary design & technology completion  8 8 9 6 9 9        8  8  7  5  7  2  4
2010         0.58                                                    8 8 8 6 7 6        7  7  7  5  6  2  4
2009         0.48        Concept & technology development            8 8 7 6 6 6        5  6  6  5  6  2  2

The summary report prepared for this notional example by the project manager (Table 12) shows that the project did not reach the projected SRL of 0.89 and was unable to deploy. Based on these numbers, the Readiness Cost Performance Index is BCRA/ACRA = 22.12/44.26 = 0.50, which is very unfavorable. The Readiness Performance Index is BCRA/BCRS = 22.12/54.02 = 0.41, which is also unfavorable. Thus, by failing to mature Technology 4 to TRL 8 in 2013, its integrations with Technologies 2 and 5 were also negatively impacted, leading to cost overruns and delays. To resolve these problems, it was determined that the software that controls Technology 4 and its integration links must be replaced.
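These system-level figures drop straight into the erm_indices sketch from Section 4 (a usage illustration with the Table 12 totals):

```python
cv, rv, rcpi, rpi = erm_indices(bcrs=54.02, bcra=22.12, acra=44.26)
print(f"RCPI={rcpi:.2f}  RPI={rpi:.2f}")  # RCPI=0.50  RPI=0.41
```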

6. Conclusions and future research

This paper presented an approach which can operationalize the implementation of a system development plan such as the one obtained from the cost minimization optimization model (SCODmin) proposed by Magnaye et al. (2010). The approach, Earned Readiness Management (ERM), provides project managers or systems engineering managers with the tools to schedule, monitor and evaluate the completion of tasks aimed at achieving the planned maturity of the system as measured by SRL. This is the primary contribution of this paper. With ERM, it is now possible to exercise more effective maturity-focused managerial control over the process of developing new systems, as has been suggested by the Government Accountability Office and by practitioners themselves. ERM also reinforces the ability of project managers to define and analyze the development of the individual technologies that go into a system not as isolated projects but as critical parts of an integrated unit.

ERM can be used in an interactive manner and can be integrated with the learning processes of the development organization. This is important when the system involves a high level of novelty and technological content. Such a system will undergo multiple designs, technology choices and cost estimates, generating valuable insights and lessons. These can be captured by an iterative application of ERM as the novel technologies, architecture and functionalities within the system become clarified and better understood. For example, new, more accurate cost data can be entered into the SCODmin optimization model to generate a revised development plan, which can then be translated into the system breakdown structure, schedule and so on. The same could be done with changes in technology choices, system architecture or capabilities.

In accordance with the wishes of the practitioners that we consulted throughout the research process, ERM is both simple and familiar. The system development schedule is in the form of a Gantt chart, which is used routinely in project management. The system readiness breakdown structure is very similar to a project work breakdown structure (WBS), while determining readiness and cost variances and performance indices using ERM is almost the same as in project earned value management. Like EVM, ERM is only as good as the data that comes into it. Project management of systems is done in a dynamic environment with changing system requirements over the system's life cycle. Therefore, the associated project management tasks and activities that are determined by the ERM calculations have to change as the system evolves.

As a new approach to controlling the development of complex systems, ERM opens up new research opportunities. The most pressing one is to verify and validate the metrics which serve as its foundation: the TRL, IRL and SRL scales. They must be applied to a wide cross-section of technologies and systems across all the relevant domains, which include, but are not limited to, strategic national defense, aerospace, software, energy, transport, environment and economic systems. The primary goal would be to determine which range of SRL values corresponds to which phases of the development life cycle for each domain. It should be expected that values calibrated for space systems will be very different from those which will work for naval systems. Furthermore, within each domain, values for one classification of products will differ from those for another, very different group. For example, in naval systems, the SRL calibration for aircraft carriers will be different from those for destroyers and escort ships.

When the SRL scale for a particular classification of system has been accepted by researchers and practitioners, ERM itself must be validated by examining its practicality in managing the development of a system. This would involve a longitudinal research study of that particular class of systems from beginning to deployment and disposal. Earned Value Management for projects did not become a mature concept until it was experimented with by the students and faculty of the US Air Force Institute of Technology. Perhaps ERM for systems must also be subjected to the same amount of scrutiny by the academic institutions and contractors associated with the Departments of Defense and Energy as well as NASA.

Table 10
System development schedule. [Table: rows are the Hubble Robotic Servicing and De-Orbit System readiness breakdown structure (1.1 Critical Technology Elements: Remote Manipulator (Element 1), Special Purpose Dexterous Manipulator (Element 2), Electronic Control Unit (Element 3), Autonomous Grappling (Element 4), Autonomous Proximity Operations (Element 5) and Laser Image Detector and Radar (Element 6), each with its TRL milestones; 1.2 Critical Integration Elements (1,2), (1,3), (2,3), (2,4), (3,5), (4,5) and (5,6), each with its IRL milestones); columns are fiscal years 2004-05, 2006-09 and 2010 through 2014; cells hold the budgeted cost ($ millions) of each readiness level. Bottom rows give the total annual cost ($700.000 million in 2004-05; $20.230 million in 2010, rising to $54.020 million in 2013 and $96.860 million in 2014, for a $265.740 million incremental total over 2010-2014) and the system readiness level reached each year. Legend: completed work; program mothballed; work to be completed after program resumed at the end of FY2009, with cost.]


Table 11. Adjusted system development schedule (fiscal years) and cost ($ millions).

[Gantt-style table. Columns: fiscal years 2004-05, 2006-09, 2010, 2011, 2012, 2013, 2014, and TOTAL; each shaded cell records the budgeted cost ($ millions) of the readiness increment scheduled in that period, and the bottom rows track Total Annual Cost ($ Millions) and System Readiness Level. Rows:]

1. HUBBLE ROBOTIC SERVICING AND DE-ORBIT SYSTEM

1.1 CRITICAL TECHNOLOGY ELEMENTS
1.1.1 Remote Manipulator (Element 1): TRL=1 through TRL=9
1.1.2 Special Purpose Dexterous Manipulator (Element 2): TRL=9
1.1.3 Electronic Control Unit (Element 3): TRL=8, TRL=9
1.1.4 Autonomous Grappling (Element 4): TRL=7 through TRL=9
1.1.5 Autonomous Proximity Operations (Element 5): TRL=7 through TRL=9
1.1.6 Laser Image Detector and Radar (Element 6): TRL=7 through TRL=9

1.2 CRITICAL INTEGRATION ELEMENTS
1.2.1 Elements 1 and 2: IRL=1 through IRL=9
1.2.2 Elements 1 and 3: IRL=7 through IRL=9
1.2.3 Elements 2 and 3: IRL=7 through IRL=9
1.2.4 Elements 2 and 4: IRL=6 through IRL=9
1.2.5 Elements 3 and 5: IRL=7 through IRL=9
1.2.6 Elements 4 and 5: IRL=3 through IRL=9
1.2.7 Elements 5 and 6: IRL=3 through IRL=9

LEGEND:
- Completed Work
- Program Mothballed
- Work to be Completed after Program Resumed at the End of FY2009 and Cost
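The System Readiness Level row of the schedules above is derived from the element TRLs and the pairwise IRLs. The sketch below follows the normalized TRL/IRL matrix formulation of Sauser et al. (2008a), populated with the accomplished readiness values reported in Table 12; the case's exact normalization may differ in detail, so the result should be read as indicative rather than as a reproduction of the paper's calculation:

```python
import numpy as np

# Accomplished readiness from Table 12, normalized to (0, 1] by dividing by 9.
trl = np.array([9, 9, 9, 6, 9, 9]) / 9.0   # Elements 1..6 (CTE 4 is at TRL 6)

# Accomplished IRLs for the seven planned integrations (Table 12).
pairs = {(1, 2): 9, (1, 3): 9, (2, 3): 9, (2, 4): 6, (3, 5): 8, (4, 5): 4, (5, 6): 7}

n = len(trl)
irl = np.eye(n)                             # each element fully integrated with itself
for (i, j), level in pairs.items():
    irl[i - 1, j - 1] = irl[j - 1, i - 1] = level / 9.0

# Element-level SRL: integration-weighted readiness averaged over the
# integrations present in each row; composite SRL: mean over all elements.
integrations_per_row = (irl > 0).sum(axis=1)
srl_by_element = (irl @ trl) / integrations_per_row
print(round(srl_by_element.mean(), 2))      # ~0.84, near the 0.83 reported in Table 12
```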


Table 12. Sample ERM accomplishment report, as of 09/30/2013.

Element | Expected readiness (TRL/IRL/SRL) | Accomplished readiness (TRL/IRL/SRL) | BCRS ($MM) | BCRA ($MM) | ACRA ($MM) | Remarks
CTE 1   | 9    | 9    | 9.00   | 9.00  | 9.00  |
CTE 2   | 9    | 9    | 0      | 0     | 0     | TRL accomplished in previous year
CTE 3   | 9    | 9    | 0      | 0     | 0     | TRL accomplished in previous year
CTE 4   | 8    | 6    | 12.97  | 0     | 9.00  | Software needs replacement
CTE 5   | 9    | 9    | 0      | 0     | 0     | TRL accomplished in previous year
CTE 6   | 9    | 9    | 0      | 0     | 0     | TRL accomplished in previous year
CIE 1,2 | 9    | 9    | 0      | 0     | 0     | IRL accomplished in previous year
CIE 1,3 | 9    | 9    | 0      | 0     | 0     | IRL accomplished in previous year
CIE 2,3 | 9    | 9    | 0      | 0     | 0     | IRL accomplished in previous year
CIE 2,4 | 8    | 6    | 14.47  | 2.75  | 8.89  | Software needs replacement
CIE 3,5 | 8    | 8    | 2.29   | 2.29  | 2.29  |
CIE 4,5 | 5    | 4    | 7.21   | 0     | 7.00  | Software needs replacement
CIE 5,6 | 7    | 7    | 8.08   | 8.08  | 8.08  |
System  | 0.89 | 0.83 | 54.020 | 22.12 | 44.26 | Unable to test launch the system

Column key: BCRS = budgeted cost of readiness scheduled; BCRA = budgeted cost of readiness achieved; ACRA = actual cost of readiness achieved.
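The BCRS, BCRA, and ACRA columns of Table 12 are the readiness counterparts of the BCWS, BCWP, and ACWP measures of earned value management. Assuming ERM carries over the EVM-style variance and index definitions (an illustrative assumption, not a quotation of the method), each report row can be scored as follows:

```python
# EVM-style readiness variances and indices (assumed definitions):
#   schedule variance SV = BCRA - BCRS (negative: readiness behind schedule)
#   cost variance     CV = BCRA - ACRA (negative: readiness over cost)
def readiness_performance(bcrs, bcra, acra):
    return {
        "SV": bcra - bcrs,
        "CV": bcra - acra,
        "SPI": bcra / bcrs if bcrs else None,   # schedule performance index
        "CPI": bcra / acra if acra else None,   # cost performance index
    }

# CTE 4 row of Table 12: BCRS = 12.97, BCRA = 0, ACRA = 9.00 ($MM).
print(readiness_performance(12.97, 0.0, 9.00))
# {'SV': -12.97, 'CV': -9.0, 'SPI': 0.0, 'CPI': 0.0}
# $9.00MM was spent without earning any of the scheduled readiness,
# consistent with the "Software needs replacement" remark.
```

Applied to the System row (BCRS = 54.020, BCRA = 22.12, ACRA = 44.26), the same calculation indicates that roughly 41% of the scheduled readiness has been earned, at about $0.50 of earned readiness per dollar spent.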


References

Abba, W.F., 1997. Earned value management — reconciling government and commercial practices. Program Manager 26, 58–63.
Abba, W.F., 2001. How Earned Value Got to Prime Time: A Short Look Back and Glance Ahead. The Measurable News.
Abernethey, T., 1984. An Application of the Rayleigh Distribution to Contract Cost Data. Naval Postgraduate School, Monterey, CA.
Andersen, E.S., 1996. Warning: activity planning is hazardous to your project's health! Int. J. Proj. Manag. 14, 89–94.
Andersen, E.S., Grude, K.V., Haug, T., 1995. The Goal Directed Project Management, Second edition. Kogan Page, London.
Archibald, R.D., 2003. Managing High Technology Programs and Projects, Third edition. John Wiley & Sons, Hoboken, NJ.
Azizian, N., Rico, D., Sarkani, S., Mazzuchi, T., 2010. The current state of DOD's technology readiness assessment (TRA) practice and its impact on system quality and program outcome. Conference on Systems Engineering Research, Hoboken, NJ.
Barr, Z., 1996. Earned Value Analysis: A Case Study. Project Management Network.
Bart, C.K., 1991. Controlling new products in large diversified firms: a presidential perspective. J. Prod. Innov. Manag. 8, 4–17.
Bart, C.K., 1999. Controlling new products: a contingency approach. Int. J. Technol. Manag. 18, 395–413.
Bisbe, J., Otley, D., 2004. The effects of the interactive use of management control systems on product innovation. Account. Organ. Soc. 29, 709–737.
Blackburn, C., Valerdi, R., 2009. Navigating the metrics landscape: an introductory literature guide to metric selection, implementation, & decision making. 7th Annual Conference on Systems Engineering Research. Loughborough University, England.
Blumenson, L.E., Miller, K.S., 1963. Properties of generalized Rayleigh distributions. Ann. Math. Stat. 34, 903–910.
Bonner, J.M., Ruekert, R.W., Walker, O.C., 2002. Upper management control of new product development projects and project performance. J. Prod. Innov. Manag. 19, 233–245.
Brady, J., 2001. Systems engineering cost as an independent variable. Syst. Eng. 4, 233–241.
Brown, S.L., Eisenhardt, K.M., 1995. Product development: past research, present findings, and future directions. Acad. Manag. Rev. 20, 343–378.
Browning, T., Ramasesh, R., 2005. A survey of activity network-based process models for managing product development projects. Prod. Oper. Manag. 16, 217–240.
Buys, R., 2008. Project planning: planning for action. In: Sage, A.P., Rouse, W.B. (Eds.), Handbook of Systems Engineering and Management. Wiley, Hoboken, NJ, pp. 1251–1282.
Chiesa, V., Frattini, F., Lazzarotti, V., Manzini, R., 2008. Designing a performance measurement system for the research activities: a reference framework and an empirical study. J. Eng. Technol. Manag. 25, 213–226.
Choudhury, V., Sabherwal, R., 2003. Portfolios of control in outsourced software development projects. Inf. Syst. Res. 14, 291–314.
Cilli, M., Parnell, G., 2010. Vision for a multiple objectives decision support tool for assessing initial business cases of military technology investments. 8th Conference on Systems Engineering Research, Hoboken, NJ.
Clark, K.B., Fujimoto, T., 1991. Product Development Performance: Strategy, Organization, and Management in the World Auto Industry. Harvard Business School Press, Cambridge, MA.
Clark, K.B., Chew, W.B., Fujimoto, T., 1987. Product development in the world auto industry. Brookings Papers on Economic Activity 729.
Cooper, R.G., 2005. Product Leadership: Pathways to Profitable Innovation, Second edition. Basic Books, New York.
Cooper, R.G., Kleinschmidt, E.J., 1987. New products: what separates winners from losers? J. Prod. Innov. Manag. 4, 169–184.
Cooper, R.G., Kleinschmidt, E.J., 1993. Screening new products for potential winners. Long Range Plan. 26, 74–81.
Cooper, R.G., Kleinschmidt, E.J., 1995. Benchmarking the firm's critical success factors in new product development. J. Prod. Innov. Manag. 12, 374–391.
Cooper, R.G., Kleinschmidt, E.J., 2007. Winning businesses in product development: the critical success factors. Res. Technol. Manag. 50, 52–66.
Davila, A., 2000. An empirical study on the drivers of management control systems' design in new product development. Account. Organ. Soc. 25, 383–409.
Davila, A., Foster, G., Li, M., 2009. Reasons for management control systems adoption: insights from product development systems choice by early-stage entrepreneurial companies. Account. Organ. Soc. 34, 322–347.
DoD, 2005. DoD Directive 5000.2. Department of Defense, Washington, DC.
Dodaro, G.L., 2009. Defense Acquisitions: Assessment of Selected Weapons Programs. Government Accountability Office, Washington, DC.
Dodaro, G.L., 2010. NASA: Assessment of Selected Large-Scale Projects. Government Accountability Office, Washington, DC.
Dvir, D., Lechler, T., 2004. Plans are nothing, changing plans is everything: the impact of changes on project success. Res. Policy 33, 1–15.
Erhardt, W., Flanigan, D., Herdlick, B., 2010. Defining and measuring federated model fidelity to support system-of-systems design and development. 8th Conference on Systems Engineering Research, Hoboken, NJ.
Forbes, E., Volkert, R., Gentile, P., Michaud, K., 2009. Implementation of a methodology supporting a comprehensive system-of-systems maturity analysis for use by the Littoral Combat Ship mission module program. Sixth Annual Acquisition Research Symposium. Naval Postgraduate School, Monterey, California, pp. 131–144.
GAO, 2008. Better weapon program outcomes require discipline, accountability, and fundamental changes in the acquisition environment. In: GAO (Ed.), U.S. Government Accountability Office, Washington, DC.
Garret, R.J., Anderson, S., Baron, N., Moreland, J.J., 2009. Managing the interstitials — a systems of systems framework suited for the ballistic missile defense system. National Fire Control Symposium, Nellis Air Force Base.
Hauser, J.R., Clausing, D., 1988. The house of quality. Harv. Bus. Rev. 63–73.
Henderson, R.M., Clark, K., 1990. Architectural innovation: the reconfiguration of existing product technologies and the failure of established firms. Adm. Sci. Q. 35, 9–30.
Herroelen, W., 2005. Project scheduling — theory and practice. Prod. Oper. Manag. 14, 413–432.
Hertenstein, J.H., Platt, M.B., 2000. Performance measures and management control in new product development. Account. Horiz. 14, 303–323.
Hobday, M., 1998. Product complexity, innovation and industrial organization. Res. Policy 26, 689–710.
Hobday, M., Davies, A., Prencipe, A., 2005. Systems integration: a core capability of the modern corporation. Ind. Corp. Chang. 14, 1109–1143.
Iansiti, M., 1995. Shooting the rapids: managing product development in turbulent environments. Calif. Manag. Rev. 38, 37–58.
Jaafari, A., 1996. Time and priority allocation scheduling technique for projects. Int. J. Proj. Manag. 14, 289–299.
Karandikar, H.M., Wood, R.T., Byrd, J., 1992. Process and technology readiness assessment for implementing concurrent engineering. Second Annual International Symposium of the National Council on Systems Engineering (NCOSE), Seattle, Washington.
Kasser, J.E., 2004. The first requirements elucidator demonstration (FRED) tool. Syst. Eng. 7, 243–256.
Kim, E., Wells, W.G.J., Duffey, M., 2003. A model for effective implementation of earned value management methodology. Int. J. Proj. Manag. 21, 375–382.
Kirsch, L.J., 1996. The management of complex tasks in organizations: controlling the systems development process. Organ. Sci. 7, 1–21.
Lee, D.A., Hogue, M.R., Gallagher, M., 1993. Determining a budget profile from a development cost estimate. Program Analysis and Evaluation, Department of Defense, Washington, DC.
Magnaye, R.B., Sauser, B.J., Ramirez-Marquez, J.E., 2009. Using a system maturity scale to monitor and evaluate the development of systems. Sixth Annual Acquisition Research Symposium. Naval Postgraduate School, Monterey, California, pp. 95–112.
Magnaye, R.B., Sauser, B.J., Ramirez-Marquez, J.E., 2010. Systems development planning using readiness levels in a cost of development minimization model. Syst. Eng. 13, 311–323.
Majumdar, W., 2007. System of Systems Technology Readiness Assessment. Naval Postgraduate School, Monterey, CA.
Mankins, J.C., 1995. Technology Readiness Levels. NASA.
Mankins, J.C., 2002. Approaches to strategic research and technology (R&T) analysis and road mapping. Acta Astronaut. 51, 3–21.
Moselhi, O., Li, J., Alkass, S., 2004. Web-based integrated project control system. Constr. Manag. Econ. 22, 35–46.
Oppenheim, B., 2004. Lean product development flow. Syst. Eng. 7, 352–376.
Patanakul, P., Iewwongcharoen, B., Milosevic, D., 2010. An empirical study on the use of project management tools and techniques across project life-cycle and their impact on success. J. Gen. Manag. 35, 41–65.
Pawar, K.S., Driva, H., 1999. Performance measurement for product design and development in a manufacturing environment. Int. J. Prod. Econ. 60–61, 61–68.
Pinto, J.K., Slevin, D.P., 1987. Critical factors in successful project implementation. IEEE Trans. Eng. Manag. EM-34, 22–27.
Pinto, J.K., Mantel, S.J., 1990. The causes of project failure. IEEE Trans. Eng. Manag. 37, 269–276.
Pugh, S., 1991. Total Design: Integrated Methods for Successful Product Engineering. Addison-Wesley, MA.
Raby, M., 2000. Project management via earned value. Work Study 49, 6–10.
Ramirez-Marquez, J.E., Sauser, B.J., 2009. System development planning via system maturity optimization. IEEE Trans. Eng. Manag. 56, 533–548.
Sadin, S.R., Povinelli, F.P., Rosen, R., 1989. The NASA technology push towards future space mission systems. Acta Astronaut. 20, 73–77.
Sauser, B., 2008. NASA Strategic Project Leadership in an Era of Better, Faster, Cheaper: Striving for Systems Innovation. VDM Verlag Dr. Müller, Saarbrücken, Germany.
Sauser, B., Verma, D., Ramirez-Marquez, J., 2006. From TRL to SRL: the concept of systems readiness levels. Conference on Systems Engineering Research, Los Angeles, CA.
Sauser, B., Ramirez-Marquez, J., Henry, D., DiMarzio, D., Gorod, A., Gove, R., Rosen, J., 2007. Methods for estimating system readiness levels. School of Systems and Enterprises White Paper.
Sauser, B., Ramirez-Marquez, J., Henry, D., DiMarzio, D., 2008a. A system maturity index for the systems engineering life cycle. Int. J. Ind. Syst. Eng. 3, 673–691.
Sauser, B., Ramirez-Marquez, J., Magnaye, R., Tan, W., 2008b. A systems approach to expanding the technology readiness level within defense acquisition. Int. J. Def. Acquis. Manag. 1, 39–58.
Sauser, B., Forbes, E., Long, M., McGrory, S., 2009. Defining an integration readiness level for defense acquisition. International Conference of the International Council on Systems Engineering. INCOSE, Singapore (July).
Sauser, B.J., Magnaye, R.B., Tan, W., Ramirez-Marquez, J.E., 2010. Optimization of system maturity and equivalent system mass for exploration systems development. 8th Conference on Systems Engineering Research, Hoboken, NJ.
Scearce, P., 2007. A Study of US Government's Satellite Incumbents and Follow-On Competitions. Massachusetts Institute of Technology, Cambridge, MA.
Shell, T., 2003. The synthesis of optimal systems design solutions. Syst. Eng. 6, 92–105.
Shenhar, A.J., 2001. One size does not fit all projects: exploring classical contingency domains. Manag. Sci. 47, 394–414.
Shenhar, A.J., Dvir, D., 2007. Reinventing Project Management: The Diamond Approach to Successful Growth and Innovation. Harvard Business School Press, Cambridge, MA.
Shuman, P.A., Ramsey, D.L., Prestwood, D.C.L., 1995. Measuring R&D performance. Res. Technol. Manag. 38, 45–55.
Slevin, D.P., Pinto, J.K., 1987. Balancing strategy and tactics in project implementation. Sloan Manag. Rev. 29, 33–41 (Fall).
Suomala, P., 2004. The life cycle dimension of new product development performance measurement. Int. J. Innov. Manag. 8, 193–221.
Tan, W., Sauser, B.J., Ramirez-Marquez, J.E., 2011. Analyzing component importance in multifunction multicapability systems development maturity assessment. IEEE Trans. Eng. Manag. 58, 275–294.
Tang, V., Otto, K.N., 2009. Multifunctional enterprise readiness: beyond the policy of break-test-fix cyclic rework. ASME 2009 International Design Engineering Technical Conferences & Computers and Information Engineering Conference, San Diego, CA.
Tiwana, A., Keil, M., 2009. Control in internal and outsourced software projects. J. Manag. Inf. Syst. 26, 9–44.
Vollerthun, A., 2002. Design-to-market: integrating conceptual design and marketing. Syst. Eng. 5, 315–326.
Wheelwright, S.C., Clark, K.B., 1992. Revolutionizing Product Development. The Free Press, New York.
Whitney, D.E., 1988. Manufacturing by design. Harv. Bus. Rev. 66, 83–90.