
Understanding Differences Between System of Systems Engineering and Traditional Systems Engineering

Ph.D. Qualifying Exam Proposal
Jo Ann Lane
February 2007

Submitted to:
Barry W. Boehm, Ph.D. – TRW Professor of Software Engineering
F. Stan Settles, Ph.D. – Professor & IBM Chair in Engineering Management
George Friedman, Ph.D. – Adjunct Professor
Azad Madni, Ph.D. – Adjunct Professor
Paul Adler, Ph.D. – Professor, Marshall School of Business

Daniel J. Epstein Department of Industrial & Systems Engineering
Andrew & Erna Viterbi School of Engineering
University of Southern California


TABLE OF CONTENTS

Abstract

Chapter I: Introduction
   1. Research Overview
   2. Motivation
   3. Proposition and Hypotheses
   4. Intended Research Contribution

Chapter II: Background and Related Work
   1. Overview of Literature Review
   2. What is a “System of Systems”?
   3. Traditional Systems Engineering, SoSE, and Related Industry Standards
   4. Engineering Cost/Schedule: What Do Engineering Cost Models Look At?
   5. Related Organizational Theory Concepts
   6. Ways to Understand Potential Differences Between SoSE and TSE: System Dynamics Models

Chapter III: Proposed Methodology
   1. Overview of Research Design
   2. Data Collection Instrument
   3. Process Models
   4. Boundary Objects
   5. Data Collection and Analysis
   6. Potential Threats to Validity and Limitations

Chapter IV: Research Plan
   1. Schedule Summary
   2. Target SoSE and TSE Programs

References

Appendix A
   A.1 SoSE-TSE Comparison Survey Form
   A.2 Summary of ANSI/EIA 632 System Engineering Processes

Appendix B – Analysis Demonstration Using Sample SoSE and TSE Projects
   B.1 Sample SoSE Survey Response: Jail Information Management System
   B.2 Sample SoSE Process Model: Jail Information Management System
   B.3 Sample TSE Survey Response: Provisioning System
   B.4 Sample TSE Process Model: Provisioning System
   B.5 Sample Comparative Analysis

List of Tables
   1. COSOSIMO Sub-Model Parameters

List of Figures
   1. COSOSIMO Overview


ABSTRACT

Today’s need for more complex, more capable systems in a short timeframe is leading more organizations towards the integration of existing systems, Commercial-Off-the-Shelf (COTS) products, and new systems into network-centric, knowledge-based systems of systems (SoS). With this development approach, the system development processes used to define the new architecture, identify sources to either supply or develop the required components, and eventually integrate and test these high-level components are evolving and are being referred to as SoS Engineering (SoSE). Recent reports indicate that SoSE activities are considerably different from Traditional Systems Engineering (TSE) activities. Other systems engineering experts believe that there is nothing really different with respect to systems engineering activities or component-based engineering in the SoS environment—that there are only differences in scale and complexity. However, most of these beliefs are opinions based on ad hoc observations, albeit from experts working in the SoSE arena, and are not substantiated by case study analyses or data. The goal of this research is to investigate SoSE through the study of several large-scale SoSE programs to determine if there are significant differences between SoSE and TSE processes and, if so, to describe them in terms of key cost drivers and impacts on the associated effort.

This research effort surveys both SoSE projects and relatively large-scale, complex TSE projects to identify key SoS characteristics and SoSE processes, then develops process models for a set of SoSE and TSE projects in order to compare the SoSE activities and associated effort of these projects with TSE activities and effort. The resulting analysis is designed to answer the question “Are SoSE processes different from TSE processes and, if so, how?” This research effort will provide valuable insights into SoSE as well as data to support the ongoing development of SoSE cost models at the University of Southern California (USC) Center for Systems and Software Engineering (CSSE).

This proposal presents 1) a statement of the research topic and the intended research contribution, 2) a review of relevant literature, 3) the proposed methodology for addressing the SoSE research illustrated using a sample SoSE and TSE project, and 4) a plan for the completion of this research dissertation.


CHAPTER I: INTRODUCTION

1. Research Overview

Today’s need for more complex, more capable systems in a short timeframe is leading more organizations towards the integration of existing systems, Commercial-Off-the-Shelf (COTS) products, and new systems into network-centric, knowledge-based systems of systems (SoS). With this development approach, the system development processes used to define the new architecture, identify sources to either supply or develop the required components, and eventually integrate and test these high-level components are evolving and are being referred to as SoS Engineering (SoSE). With the advent of this new engineering “specialty”, many are raising questions about how different SoSE is from Traditional Systems Engineering (TSE) and whether any of the differences are significant.

This research effort will survey both SoSE projects and relatively large-scale, complex TSE projects to identify key SoS characteristics and SoSE processes, then develop process models for a set of SoSE and TSE projects in order to compare the SoSE activities and associated effort of these projects with TSE activities and effort. The resulting analysis is designed to answer the question “Are SoSE processes different from TSE processes and, if so, how?” This research effort will provide valuable insights into SoSE as well as data to support the ongoing development of SoSE cost models at the University of Southern California (USC) Center for Systems and Software Engineering (CSSE).

This proposal presents 1) a statement of the research topic and the intended research contribution, 2) a review of relevant literature, 3) the proposed methodology for addressing the SoSE research illustrated using a sample SoSE and TSE project, and 4) a plan for the completion of this research dissertation.

2. Motivation

Recent reports [DoD, 2006b; Northrop et al., 2006] indicate that SoSE activities are considerably different from TSE activities. Other systems engineering experts believe that there is nothing really different with respect to systems engineering activities or component-based engineering in the SoS environment—that there are only differences in scale and complexity [DoD, 2006b; Lane, 2005a]. However, most of these beliefs are opinions based on ad hoc observations, albeit from experts working in the SoSE arena, and are not substantiated by case study analyses or data. The goal of this research is to investigate SoSE through the study of several large-scale, net-centric, software-intensive SoSE programs to determine if there are significant differences between SoSE and TSE processes and, if so, to identify the types of differences.

3. Proposition and Hypotheses

The principal research question addressed by this research is:

Is SoSE significantly different from TSE? And if there are significant differences, do these differences impact system engineering cost model parameters?

The proposed central hypothesis is:


There exists a set of activities in SoSE projects that are distinctly different from those performed on TSE projects.

To conduct the proposed research, the following null hypothesis is evaluated:

SoSE processes/activities are the same as TSE processes and there is no significant difference in the way TSE processes are applied to SoSE projects.

4. Intended Research Contribution

This research is intended to provide the following contributions in the areas of systems engineering and cost modeling:

1. Understanding of SoSE architectures and engineering processes and how they are similar/dissimilar to other engineering approaches/specializations.

2. Understanding of SoSE processes to support SoSE cost modeling, cost estimation, risk management, and process tradeoff evaluations.


CHAPTER II: BACKGROUND AND RELATED WORK

1. Overview of Literature Review

SoSE is considered by many to be a multi-disciplinary area and is, in fact, a popular topic at many multi-disciplinary conferences such as those on Integrated Design and Process Technology [SDPS, 2006] and Complex Systems [The Aerospace Corporation et al., 2007]. These conferences reach out to researchers in the areas of biology, sociology, psychology, business process engineering, mathematics, computer science, and engineering, to name a few, to share cross-cutting information and concepts that may help to understand complex topics of interest.

SoSE includes management and organizational aspects as well as technical aspects. On the technical side, it includes systems engineering specialties as well as software and information management specialties. As these systems become larger and larger, people are looking at the unique aspects of complex systems and working on the edge of chaos to develop new and innovative approaches for SoSE [Northrop et al., 2006; Kreitman, 1996; Sheard, 2006; Berryman et al., 2006; Prokopenko et al., 2006; Highsmith, 2000].

The following sections summarize the literature in the areas key to this proposed research, namely, system of systems, systems of systems engineering, traditional systems engineering, engineering cost modeling, organizational theory, and system dynamics modeling.

2. What is a “System of Systems”?

The earliest references in the literature to “systems within systems” or “system of systems” can be found in [Berry, 1964] and [Ackoff, 1971]. These 1960–1970 era SoS concepts are early insights into the evolution of the systems of today. Even though the term “system of systems” was not commonly used at that time, systems of systems were being developed and deployed. These SoSs are represented by undersea surveillance and weapons systems such as the Integrated Undersea Surveillance System (IUSS) [FAS, 2006; IUSSCAA, 2006], the Sound Surveillance System (SOSUS) [GlobalSecurity.ORG, 2005], and the Anti-Submarine Warfare (ASW) system [Smithsonian Institution, 2000] used during the Cold War era to track and evade Russian submarines; the Global Positioning System (GPS) [NAVSTAR, 2006], which is today considered both an SoS and a component system for other SoSs; and military command and control centers. As these types of integrated systems became more common, systems engineering experts and researchers began to define and study them as a special class of systems. And, as the term has become a popular way to represent a strategic and economic approach to enhancing existing system capabilities, we now have an abundance of definitions.

A review of recent publications [Lane and Valerdi, 2005] shows that the term “system-of-systems” means many things to many different people and organizations. In the business domain, an SoS is the enterprise-wide or multiple-enterprise integration and sharing of core business information across functional and geographical areas. In the military domain, an SoS is a dynamic communications infrastructure and a configurable set of component systems to support operations in a constantly changing, sometimes adversarial, environment. For some, an SoS may be a multi-system architecture that is planned up-front by a prime contractor or lead system integrator. For others, an SoS is an architecture that evolves over time, often driven by organization needs, new technologies appearing on the horizon, and available budget and schedule. The evolutionary SoS architecture is more of a network architecture that is reconfigured and grows with needs and available resources.

Some SoS definitions refer to “emergent behaviors” or a “common purpose” of SoSs where an SoS can perform functions that cannot be provided by any of the constituent systems [Cocks, 2006; DoD, 2006a; Eisner, 1993; Kriegel, 1999; Maier, 1998; Sage and Cuppan, 2001; Shenhar, 1994; USAF SAB, 2005]. However, if one reviews definitions of a system [ANSI/EIA, 1999; Blanchard and Fabrycky, 1998; INCOSE, 2006; ISO/IEC, 2002], one sees that many of these definitions indicate that a system is a set of components working together for a common objective or purpose. So, “emergent behavior” does not appear to be a system characteristic unique to SoSs. What is controversial in the SoS arena is whether emergent behaviors should be planned and managed. There are those that would like to see the development of convergent protocols that could be implemented in a variety of systems to “SoS-enable” them, allowing these systems to easily come and go in SoSs with little additional effort [USAF SAB, 2005]. There are others that are concerned that if one is not careful, there may be undesirable emergent behaviors (e.g., safety or security problems), and to avoid these problems, emergent behaviors must be planned, tested, and managed [DoD, 2006b].

In any case, users and nodes in the SoS network may be either fixed or mobile. Communications between component systems in the SoS are often some combination of common and custom-defined protocols. Networks may tie together other networks as well as nodes and users. SoS component systems typically come and go over time. These component systems can operate both within the SoS framework and independent of this framework. In a general sense, it is challenging to define clear boundaries of a specific SoS because of its dynamic nature.

What is unique to SoSs by many definitions [Maier, 1998; Sage and Cuppan, 2001; USAF SAB, 2005; DoD, 2006a; DoD, 2006b; Cocks, 2006] is that they are composed of component systems that possess the following characteristics:

a. Operationally independent, meaning that each of the components can perform useful functions both within the SoS and outside of the SoS

b. Managerially independent, meaning that each of the components is managed and maintained for its own purpose.

The research described in this proposal assumes that the SoSs under consideration are those composed of component systems that possess the characteristics of operational and managerial independence.

3. Traditional Systems Engineering, SoSE, and Related Industry Standards

The International Council on Systems Engineering (INCOSE) Systems Engineering Handbook [INCOSE, 2006] provides an overview of the origins of systems engineering, starting in 1829 with the development of the Rocket locomotive and then developing some rigor in the 1930s with the engineering of the British air defense systems and the work at Bell Labs. Today, there are several standards for systems engineering processes:

ANSI/EIA Standard 632 [ANSI/EIA, 1999]: This standard defines a set of five systems engineering task areas and 33 activities associated with the various task areas. The purpose of this standard is to support the engineering (or re-engineering) of a system. The phases addressed are system conceptualization, development, and transition to operation. A description of the five tasks and 33 activities is provided in Appendix A.2.

ISO/IEC Standard 15288 [ISO/IEC, 2002]: This ISO standard provides a framework for describing system lifecycle processes in the areas of hardware, software and human interfaces. The scope of the processes is from system conception, through the initial development, operation, maintenance, and evolution of the system, all the way through to system retirement.

Defense Acquisition Guidebook (DAG) [DoD, 2006a]: The DAG is designed to provide acquisition personnel and their industry partners with a comprehensive reference to best business practices and supporting policies, statutes, and lessons learned applicable to the development of DoD systems. As part of this guidance, the DAG defines and describes eight technical management processes (technical planning, requirements management, interface management, risk management, configuration management, technical data management, technical assessment, and decision analysis) and eight technical processes (requirements development, logical analysis, design solution, implementation, integration, verification, validation, and transition).

Software Engineering Institute’s Capability Maturity Model Integration (CMMI) [SEI, 2001]: The CMMI provides a framework to help an organization define its standard systems engineering processes. It focuses on process areas applicable to both systems and software engineering (process management, project management, engineering, and support) and provides guidance for defining standard organizational processes in these areas in terms of goals, practices, and typical work products. The framework is organized in a manner that also provides guidance for continual process improvement and a process for assessing the maturity of a given organization’s systems and software engineering processes.

While these standards have various phases of interest and describe different process areas and activities, there is considerable overlap between the phases, process areas, and activities of each. For example, ANSI/EIA 632 is viewed as a more detailed description of the ISO/IEC 15288 phases of conceptualization, development, and transition to operation. The DAG references ANSI/EIA 632, ISO/IEC 15288, and the SEI CMMI as examples of best practice models and standards. However, in an attempt to standardize terminology across these various models and standards, the DAG defines its own set of processes.

Most of these standards address SoS development to some extent and at least imply that the processes, standards, and models apply to SoSE. The DoD Defense Acquisition group has gone a step further and developed an SoSE Guidebook that extends the DAG:

SoSE Draft Guidebook [DoD, 2006b]: The SoSE Guidebook is designed to be used in conjunction with the DAG. It expands on chapter 4 of the DAG that describes the 16 systems engineering technical and technical management processes and addresses SoS and SoSE key considerations from the perspective of the SoS sponsors, the program manager, and the chief engineer. These processes and considerations are described in the context of the systems engineering “Vee” process model.

The following sections describe aspects of SoSE in an attempt to understand how it might be different from TSE.


Key Activities for SoSE Organizations: While SoSs are conceptually simple, they get very complex when dealing with security, safety, and information management on the technical side and with multiple stakeholders and vendors/suppliers on the management side. To better understand SoSE, SoS projects have been observed and SoS engineers surveyed with respect to the types of issues they typically face [Lane, 2005a; Lane, 2005b; Lane, 2005c]. The following describes these findings.

Once an SoSE team is under contract to develop an SoS, they quickly begin to concurrently define the scope of the SoS, plan the activities to be performed, analyze the requirements, and start developing the SoS architecture/framework. As the scope, requirements, and architecture start to firm up, the SoSE organization begins source selection activities to identify the desired component system suppliers. Then, as the suppliers start coming on board, the SoSE organization must focus on teambuilding, re-architecting, and feasibility assurance with the selected suppliers. Teambuilding is critical since the SoSE organization and the selected suppliers have often been competitors in the past and now must work together as an efficient, integrated team. Re-architecting is often necessary to make adjustments for the selected system components that may not be compatible with the initial SoS architecture or other selected components. And feasibility assurance is conducted to better evaluate technical options and their associated risks. Many of the technical risks in an SoS framework are due to incompatibilities between different system components or limitations of older system components with today’s technology.

As the SoS development teams begin to coalesce, the SoSE organization focuses on incremental acquisition activities for the development and integration/test of the required component systems for the SoS. During this process, there are often continuous changes and new risks that must be managed. In addition, the SoSE organization is continuously looking for opportunities to simplify the SoS architecture and reduce effort, schedule, and risks. Key management issues include:

• Number of stakeholders – The stakeholders in an SoS development effort are numerous. They come from sponsoring and funding organizations as well as the various user communities that have high expectations for the planned SoS.

• Number of development organizations – Because the component systems are often “owned” by an organization other than the sponsoring or SoSE organizations, there is often a separate development organization associated with each component system in the SoS. In addition, some of the component systems can be systems of systems in their own right. This means that there may be lower level suppliers associated with each component system, adding to the number of development organizations.

• Number of decision “approvers” – Studies [Blanchette, 2005; Pressman and Wildavsky, 1973] have shown that as the number of people involved in the decision-making process increases, the probability of getting a timely decision (or any decision at all) often decreases. In the SoS development arena, the stakeholders, the system component “owners”, as well as the SoSE organization are often all involved in making key decisions.

• Cross-cutting risks – These are risks that cut across organizational boundaries and/or component systems (as opposed to component system risks that can be managed by the component system supplier). Key to a successful SoS is negotiating solutions that are optimal for the SoS, and not necessarily optimal for some of the component systems. This requires component system stakeholders or suppliers to sometimes implement changes for the SoS which are to their detriment.

• Schedules – A key feature of SoSs is that the component systems within an SoS are typically independently owned and managed by another organization. This means that SoS timelines are often controlled by other “outside” goals and timelines. SoS-enabling features are often incorporated into SoS components along with the other enhancements and features planned by the component “owner”. There may be long-lead enhancements that are not required by the SoS architecture/system, but are more important to the component owner or user organization and will delay implementation of the SoS features. Also, these other on-going changes (not required for the SoS) may impact the stability of the component (including its architecture). A current example of this is the limited resources and specialists available to develop new features needed to support today’s Iraq operations vs. features needed to support the Army’s Future Combat System (FCS) SoS in the future [Dalrymple, 2006]. While this may be perceived as more of a schedule issue, it can also impact effort since component delivery delays can result in inefficient integration activities and significant rework.

As SoSE organizations try to scale up their TSE management processes, they find that there are often new and unexpected issues. Typical management issues include [Lane, 2005c]:

• Traditional planning and scheduling may lead to unacceptably long schedules, requiring the SoSE organization to be more creative in both their technical and implementation approaches.

• Planning and tracking activities must integrate inputs from a variety of different organizations, each with its own (and probably different) process.

• Traditional oversight and coordination can spread key SoSE personnel too thin.

• More emphasis is required for contracting and supplier management. Incentives are often needed to better align priorities and focus of the system component supplier organizations. In addition, contracts must provide mechanisms to allow suppliers to participate more in the change management process to help assess impacts and to develop efficient approaches to proposed changes.

• Standardization of all processes may be overwhelming. The SoSE organization needs to decide what to standardize and what to let the suppliers control.

• The decision making process involves considerably more organizations. As mentioned above, this can make the decision making process much more complex and time-consuming and it may have significant impacts on the overall schedule and effort.

• Risk management for cross-cutting risks needs to cross organizational boundaries. It is important that risk management activities for cross-cutting risks don’t select strategies that are optimal for one area of the SoS, but are to the detriment of other areas. The focus must be on the overall SoS.

Since SoS development efforts usually span many years and include many incremental or evolutionary developments, there are opportunities for the SoSE organization to adapt and mature their processes to the SoSE environment. One of the key observations is how SoSE organizations are attempting to blend traditional processes with more agile processes [Madachy et al., 2006]. They are more agile when dealing with risk, change, and opportunity management for future increments, but plan for stabilized evolutionary increments in the near term. Key to this approach is knowing when to plan, control, and stabilize and when to be more flexible, agile, and streamlined. The agile teams are responsible for performing acquisition intelligence, surveillance, and reconnaissance functions, and then rebaselining future increment solutions as necessary.

Reported Differences Between SoSE and TSE: Many have reported on differences between TSE and SoSE in recent reports and conferences. Most of these differences are in the areas of architecting; prototyping, experimentation, and tradeoffs; and SoS scope and performance. Several [Meilich, 2006; USAF SAB, 2005] have stated that SoS architecting must focus more on composability than traditional design by decomposition and that architectures are net-centric as opposed to hierarchical. It has also been noted that in order to successfully develop an SoS, there must be intense concept-phase analysis followed by continuous anticipation of change and supported by ongoing experimentation [USAF SAB, 2005]. Extensive modeling and simulation are also required to better understand emergent behaviors [Finley, 2006] and to support early, first-order tradeoffs at the SoS level and evaluations of alternatives [Garber, 2006; Finley, 2006]. Over the long term, [USAF SAB, 2005] reports that it will be important to discover and utilize standard convergence protocols that will “SoS-enable” candidate component systems and support their incorporation into multiple SoSs. SoSs also seem to extend the concepts of system flexibility and adaptability [USAF SAB, 2005], and it has become clear to some that the human must be considered as part of the SoS [Siel, 2006; Meilich, 2006; USAF SAB, 2005]. Finally, SoSs are designed to be dynamically reconfigured as needs change [USAF SAB, 2005] and, therefore, the organizational scope of the SoS is defined at runtime instead of during system development [Meilich, 2006].

In the United States Department of Defense (DoD) arena, many key challenges for SoSE have been observed. It can be difficult to get the necessary commitment and cooperation between multiple government organizations, the SoS proponents, and the associated suppliers/vendors. Therefore, new business models and incentives are needed to encourage working together at the SoS level [Garber, 2006]. This also requires accountability at the SoS enterprise level and the removal of multiple decision-making layers [Pair, 2006]. Often in the early stages of a large program, there is an urgency and a temptation to take shortcuts. However, experience has shown that in the case of SoSs, it is important to take the time to do the necessary analyses and tradeoffs [Garber, 2006] and to focus on commonality of data, architecture compatibility, and business strategies at the SoS level [Pair, 2006] as well as human-system integration [Siel, 2006; Meilich, 2006], technology maturity [Finley, 2006], and the necessary evolutionary management of the SoS [Boehm, 2006; Meilich, 2006].

The question remains, however, as to how much, if at all, these identified differences impact the associated systems engineering effort and schedule. Are these differences significant enough to cause us to plan development efforts differently, perform additional or different activities, or allocate resources differently? Or is this just a natural evolution of the traditional systems engineering process that still falls within the typical systems engineering effort profiles that are the basis for current TSE cost estimation [Valerdi, 2005]?


4. Engineering Cost/Schedule: What Do Engineering Cost Models Look At?

As mentioned in the introduction, one of the goals of this research is to better understand SoSE and support the development of cost models that may be used to estimate cost and schedule and to support tradeoff analyses. Much work has been done in the areas of software development cost modeling, systems engineering cost modeling, and COTS integration cost modeling, as well as some work in the area of SoSE cost modeling [Boehm et al., 2005]. These cost models focus on the engineering product, the processes used to develop the product, and the skills and experience levels of the technical staff responsible for the development of the product. These parametric cost models require a set of one or more size drivers to describe the size of the product to be developed. The size drivers are used to compute a nominal effort (labor hours) for the project. In addition, there is a set of cost drivers to adjust the nominal effort either up or down, depending on the characteristics of the product to be developed, the processes used to develop the product, and the experience and capabilities of the people developing the product. Key to these cost models is understanding how much influence each cost driver should have on the estimated effort.
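As a concrete illustration of this parametric form, the sketch below derives an estimate from a single size driver and a set of multiplicative cost drivers. It assumes a COCOMO-style effort equation; the constants, driver names, and values are illustrative placeholders, not calibrated parameters from COSOSIMO or any other model.

```python
# Minimal sketch of a parametric cost model: a size driver produces a
# nominal effort, and multiplicative cost drivers adjust it up or down.
# All constants and driver values below are invented for illustration.

def estimate_effort(size, a=2.94, b=1.10, cost_drivers=None):
    """Return estimated effort (e.g., person-months).

    size         -- size driver value (e.g., weighted requirements count)
    a, b         -- calibration constants (placeholder defaults)
    cost_drivers -- mapping of driver name -> effort multiplier
    """
    nominal = a * size ** b              # nominal effort from the size driver
    multiplier = 1.0
    for value in (cost_drivers or {}).values():
        multiplier *= value              # each driver scales effort up or down
    return nominal * multiplier

effort = estimate_effort(
    100,
    cost_drivers={
        "requirements_understanding": 0.9,  # well understood: less effort
        "stakeholder_team_cohesion": 1.2,   # low cohesion: more effort
    },
)
print(f"Estimated effort: {effort:.1f} person-months")
```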

[Figure 1. COSOSIMO Overview: size drivers and cost drivers feed three sub-models (Planning, Requirements Management, and Architecting (PRA); Source Selection and Supplier Oversight (SO); and SoS Integration and Testing (I&T)) to produce the SoS definition and integration effort estimate.]

Initial efforts in the SoSE cost model arena have resulted in the Constructive SoS Integration Cost Model (COSOSIMO), shown in Figure 1 [Lane and Boehm, 2007]. As a result of inputs from affiliates and SoSE analysis, this cost model is being developed as three sub-models: Planning, Requirements, and Architecture (PRA); Source Selection and Oversight (SO); and Integration and Test (I&T). Table 1 shows the size drivers and cost drivers that have been developed through workshops with USC CSSE industry affiliates for each of the SoSE cost sub-models in COSOSIMO.

Table 1. COSOSIMO Sub-Model Parameters [Lane, 2006]

PRA Parameters

Size Drivers:

• Number of SoS-Related Requirements – Represents the number of requirements for the SoS of interest at the SoS level. Requirements may be functional, performance, feature, or service-oriented in nature depending on the methodology used for specification. They may be defined by the customer or contractor. SoS requirements can typically be quantified by counting the number of applicable shalls, wills, shoulds, and mays in the SoS or marketing specification. Note: some work may be required to decompose requirements to a consistent level so that they may be counted accurately for the appropriate SoS-of-interest.

• Number of SoS Interface Protocols – Number of distinct net-centric application protocols to be provided/supported by the SoS framework. Note: this does NOT include protocols internal to a given SoS component system, but it does include protocols external to the SoS and between the SoS component systems. Also note that this is not a count of total protocols used between the various component systems, but rather a count of distinct protocols at the SoS level.

Cost Drivers:

• Requirements Understanding – Rates the level of understanding of the SoS requirements by all of the SoS stakeholders, including the SoS customers and sponsors, SoS PRA team members, component system owners, users, etc. Primary sources of added systems engineering effort are unprecedented capabilities, unfamiliar domains, or capabilities whose requirements are emergent with use.

• Level of Service Requirements – Rates the difficulty and criticality of satisfying the ensemble of level of service requirements or Key Performance Parameters (KPPs), such as security, safety, transaction speed, communication latency, interoperability, flexibility/adaptability, and reliability.

• SoS Stakeholder Team Cohesion – Represents a multi-attribute parameter which includes leadership, shared vision, diversity of stakeholders, approval cycles, group dynamics, Integrated Product Team (IPT) framework, team dynamics, trust, and amount of change in responsibilities. It further represents the heterogeneity in the stakeholder community of the end users, customers, implementers, and development team.

• PRA Team Cohesion – Represents the anticipated level of PRA team cooperation and cohesion, personnel capability and continuity, as well as PRA personnel experience with the relevant domains, applications, language, and tools for SoSE personnel working on the PRA activities.

• PRA Process Maturity – Rates the maturity level and completeness of the SoSE team’s PRA processes and plans.

• PRA Tool Support – Rates the coverage, integration, and maturity of the PRA tools in the SoS engineering and management environments.

• PRA Cost/Schedule Compatibility – Rates the extent of business or political pressures to reduce the cost and schedule associated with the PRA activities and processes.

• SoS PRA Risk Resolution – A multi-attribute parameter that represents the number of major SoS PRA risk items, the maturity of the associated risk management and mitigation plans, compatibility of schedules and budgets, expert availability, tool support, and level of uncertainty in SoS PRA risk areas.

SO Parameters

Size Drivers:

• Number of Independent Component System Organizations – Number of organizations managed by the SoSE team that are providing SoS component systems.

• Number of Unique Component Systems – Number of types of component systems that are planned to operate within the SoS framework. If there are multiple versions of a given type that have different interfaces, then the different versions should also be included in the count of component systems.

Cost Drivers:

• Requirements Understanding – Rates the level of understanding of the SoS requirements between the SoSE team and the component system suppliers/vendors. Primary sources of added systems engineering effort are unprecedented capabilities, unfamiliar domains, or capabilities whose requirements are emergent with use.

• Architecture Maturity – Represents the level of maturity of the SoS architecture. It includes the level of detail of the interface protocols and the level of understanding of the performance of the protocols in the SoS framework.

• Level of Service Requirements – Rates the difficulty and criticality of satisfying the ensemble of level of service requirements or KPPs, such as security, safety, transaction speed, communication latency, interoperability, flexibility/adaptability, and reliability.

• SoSE/Supplier Team Cohesion – Represents a multi-attribute parameter which includes leadership, shared vision, diversity of stakeholders, approval cycles, group dynamics, IPT framework, team dynamics, trust, and amount of change in responsibilities. It further represents the heterogeneity in the stakeholder community of the end users, customers, implementers, and development team.

• SO Team Capability – Represents the anticipated level of SO team cooperation and cohesion, personnel capability and continuity, as well as SO personnel experience with the relevant domains, applications, language, integration tools, and integration platform(s) used by the various suppliers/vendors.

• SO Process Maturity – Rates the maturity level and completeness of the SoSE SO processes and plans.

• SO Tool Support – Rates the coverage, integration, and maturity of SO tools in the SoS engineering and management environment.

• SO Process Cost/Schedule Compatibility – Rates the extent of business or political pressures to reduce cost and schedule.

• SoS SO Risk Resolution – A multi-attribute parameter that represents the number of major SoS SO risk items, the maturity of the associated risk management and mitigation plans, compatibility of schedules and budgets, expert availability, tool support, and level of uncertainty in SoS SO risk areas.

I&T Parameters

Size Drivers:

• Number of SoS Interface Protocols – Number of distinct net-centric application protocols to be provided/supported by the SoS framework.

• Number of Operational Scenarios – Represents the number of operational scenarios that an SoS must satisfy. Such scenarios include both the nominal stimulus-response thread plus all of the off-nominal threads resulting from bad or missing data, unavailable processes, network connections, or other exception-handling cases. The number of scenarios can typically be quantified by counting the number of SoS states, modes, and configurations defined in the SoS concept of operations or by counting the number of “sea-level” use cases [Cockburn, 2001], including off-nominal extensions, developed as part of the operational architecture.

• Number of Unique Component Systems – Number of types of component systems that are planned to operate within the SoS framework. If there are multiple versions of a given type that have different interfaces, then the different versions should also be included in the count of component systems.

Cost Drivers:

• Requirements Understanding – Rates the level of understanding of the SoS requirements by all of the SoS stakeholders, including the SoS customers and sponsors, SoS I&T team members, component system owners, users, etc. Primary sources of added systems engineering effort are unprecedented capabilities, unfamiliar domains, or capabilities whose requirements are emergent with use.

• Architecture Maturity – Represents the level of maturity of the SoS architecture with respect to I&T activities. It includes the level of detail of the interface protocols and the level of understanding of the performance of the protocols in the SoS framework.

• Level of Service Requirements – Rates the difficulty and criticality of satisfying the ensemble of level of service requirements or KPPs, such as security, safety, transaction speed, communication latency, interoperability, flexibility/adaptability, and reliability.

• I&T Team Cohesion – Represents a multi-attribute parameter which includes leadership, shared vision, diversity of stakeholders, approval cycles, group dynamics, IPT framework, team dynamics, trust, and amount of change in responsibilities. It further represents the heterogeneity in the stakeholder community of the end users, customers, implementers, and development team.

• SoS I&T Team Capability – Represents the anticipated level of SoS I&T team cooperation and cohesion, personnel capability and continuity, as well as I&T personnel experience with the relevant domains, applications, language, integration tools, and integration platform(s) needed to integrate the SoS system components and test the SoS.

• I&T Process Maturity – Rates the maturity level and completeness of the SoSE processes and plans, and in particular, those associated with I&T activities and the SoS integration lab.

• I&T Tool Support – Rates the coverage, integration, and maturity of the tools in the SoS I&T environment.

• I&T Process Cost/Schedule Compatibility – Rates the extent of business or political pressures to reduce the cost and schedule associated with the I&T processes and activities.

• SoS I&T Risk Resolution – Represents the number of major SoS I&T risk items, the maturity of the associated risk management and mitigation plans, compatibility of schedules and budgets, expert availability, tool support, and level of uncertainty in SoS I&T risk areas.

• Component System Maturity and Stability – Indicates the maturity level of the component systems (number of new component systems versus number of component systems currently operational in other environments), overall compatibility of the component systems with each other and the SoS interface protocols, the number of major component system changes being implemented in parallel with the SoS framework changes, and the anticipated change in the component systems during SoS integration activities.

• Component System Readiness – Indicates the readiness of component systems for integration. The user evaluates the level of Verification and Validation (V&V) that has been or will be performed prior to integration and the level of subsystem integration activities that will be performed prior to integration into the SoS integration lab.


The next steps in the development of the COSOSIMO sub-models are to define an appropriate set of complexity weights for the size drivers and an appropriate range of values for each of the cost drivers.
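Since defining the complexity weights is identified above as a next step, the following sketch only illustrates the general form such a weighted size aggregation could take for one sub-model; the weight values and driver counts are hypothetical placeholders, not calibrated COSOSIMO values.

```python
# Hypothetical aggregation of weighted size drivers for one COSOSIMO
# sub-model. The complexity weights are placeholders; defining real
# weights is the calibration work described in the text.

EASY, NOMINAL, DIFFICULT = 0.5, 1.0, 2.0   # illustrative complexity weights

def aggregate_size(driver_counts):
    """Sum (count * complexity weight) across all size drivers."""
    return sum(count * weight for count, weight in driver_counts)

pra_size = aggregate_size([
    (120, NOMINAL),    # SoS-related requirements of average difficulty
    (15, DIFFICULT),   # distinct SoS interface protocols, mostly custom
])
print(pra_size)        # weighted size that would feed the PRA effort equation
```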

5. Related Organizational Theory Concepts

Many of the systems engineering and SoSE practices and processes (both management and technical) can be traced to those that began to develop in the early to mid-1900s as project-based, multi-disciplinary work became the norm in construction, city planning, civil engineering, aircraft design and development, automobile design and development, and oil industry systems [Pinney, 2001]. In fact, in many of these industries today, much of the design and production activity is based on systems that are often integrated into what might be considered by some as SoSs.

While it is important to understand the more general theory and the history of today’s engineering processes, this is not the focus of this research. Rather, this research is trying to identify discriminators between two similar engineering specialties: SoSE and TSE. Of interest here is organizational theory focusing on management and technical processes and communication dynamics for very large, distributed, multi-disciplinary engineering projects with performing organizations often crossing company, political, and national boundaries.

Processes and Process Modeling: SoSs typically fall into the realm of complex systems [Sheard, 2006]. As complex systems become more of the norm, many are writing about developing complex systems on the edge of chaos [Highsmith, 2000; Markus et al., 2002; Sheard, 2006] and the associated need for flexible, adaptable, agile development processes to deal with rapid change and to facilitate innovative solutions in ultra-large solution spaces [Boehm and Lane, 2006; Kreitman, 1996]. Teams must be able to experiment and change processes when progress is not as expected [Kreitman, 1996]. In order to recognize when progress is not as expected, it is important to have management processes that track cost, schedule, and completed work so that deviations can be identified early. Some are quick to point out that many system development projects are overly ambitious; management often fears that if this is known early on, the program will be cancelled, so true progress is hidden and the team is encouraged to work harder in the hope that it will come up with a miracle to get the program back on track [Brooks, 1995; Kreitman, 1996; Carlock and Fenton, 2001; Carlock and Lane, 2006]. By combining accurate management tracking with flexible technical processes, many of these problems can be avoided.

Often, engineering processes and associated process models are defined to be tailorable to accommodate differences between projects [SEI, 2001]. Therefore, through the analysis of project process models, one can often discern key differences between different types of projects. For example, [Lane, 1999] discusses how traditional software development processes are tailored to support the development of software functionality through the integration of Commercial Off-the-Shelf (COTS) products. By analyzing process models and resource allocations captured in project Work Breakdown Structures (WBSs), it was found that there were some key differences. For COTS integration, key activities are performed in a different order and effort is distributed very differently across the key activities than in traditional software development projects. For the COTS integration project, there are considerably more requirements analyses, trade studies, and product evaluations (testing) prior to software design and development, and a considerable focus on the associated business processes to be supported by the software system. In addition, the COTS integration implementation activities consist more of user interface customization, insertion of business rules and initialization data into the COTS products, and the development of glue code to support integration with other COTS products, rather than the “develop new code” approach of traditional software development. Because the implementation activities are considerably different, COTS integration projects require some different skill sets than are typically applied to new software development projects: more software product evaluators, domain experts to work with the user community in deciding on key business rules and initialization data, and system programmers to implement COTS product integrations with database systems, legacy systems, and other COTS products. [Lane, 1999] also shows how these process differences translated into significant differences with respect to software system development cost and schedule.

Technical Communication—Boundary Objects: Engineering new systems or SoSs (systems-of-interest) requires that a diverse group of specialists come together. As part of this effort, these specialists must use their expertise to innovate, create, and transfer knowledge to other members of the team and to communicate with stakeholders and end users. The term “boundary object” is sometimes used to describe the set of knowledge artifacts that are used to convey information about the desired system-of-interest capabilities and associated priorities, the overall structure of the system-of-interest architecture, and the mechanisms and protocols that will be used to integrate the components, as well as to explain system-of-interest concepts and capabilities to stakeholders and eventual users of the system. These artifacts can consist of documents, databases, models, and prototypes. The artifacts can be physical entities or mental models. That an artifact exists does not make it a boundary object—to be a boundary object, the artifact must be used to transfer, translate, or transform information from one group to another [Star and Griesemer, 1989; Carlile, 2002].

A survey of boundary object literature shows that boundary objects have been used in various domains such as social science, design engineering, product development, software development, and service industries, and that the artifacts used as boundary objects in these settings vary significantly depending on the needs of the interacting groups [Fong et al., 2007]. Therefore, in the engineering domain, as one changes focus from hierarchical decomposition in TSE to composition in SoSE, one might expect to find differences in the types of boundary objects used by the engineering teams in each discipline.

Other Coordination, Communication, Collaboration, and Decision Making Dynamics: As engineering teams get larger, coordination, communication, collaboration, and the resulting decision-making activities become more time-consuming and difficult. This is especially so when one views the various specialties that participate in the engineering of a system or SoS. For example, human factors, security, safety, and functional performance must all be addressed in a manner that produces a system or SoS that meets the customer’s needs. Technical decisions need to be made in the context of all of these areas—decisions that meet the overall needs, yet are not to the detriment of any of these system-of-interest aspects. Based on these and other observations, [Lu, 2003] suggests that engineering is evolving to more of a collaborative negotiation process. Others [Pressman and Wildavsky, 1973] provide a case study showing how the decision-making process erodes as the number of decision makers increases over time. [Dorner, 1996] has identified and analyzed the roots of catastrophes in strategic planning and associated strategic decision making and provides techniques for recognizing and avoiding errors at the strategic level in complex situations. Related to this, [Kreitman, 1996] talks about the need to make sure ALL affected organizations are involved in cross-cutting decisions to ensure proper tradeoffs in situations similar to those that can impact both the SoS level and the component system level. [Friedman, 2005] also looks at many of these issues on a very global level, focusing on the importance of building strategic alliances across geographical and political boundaries and interacting in a very international culture to develop the larger and larger resource bases needed to keep up with the rapidly increasing pace of business. Finally, Eberhardt Rechtin made some interesting observations on communication and coordination issues in his system architecture heuristics [Rechtin, 1991]:

• The time in days, T, to obtain an approval requiring N signatures is approximately T = 2N - 2. And associated with this is the Pareto principle: perhaps only 20% of the signatures really count.

• The probability of implementing new ideas depends on the number of persons in the chain leading to their implementation and on the probability that each person understands and retransmits that idea.

• A team producing at the fastest rate humanly possible spends half its time coordinating and interfacing.

Taking all of these observations, heuristics, and findings together, one can conclude that as systems get larger and more complex, resulting in larger and more diverse engineering teams, coordination and collaboration need to be both broad and deep. In addition, negotiation and decision making need to happen at the appropriate levels and be broad enough to include the right considerations (effective), but not extend unnecessarily to multiple levels in the organization (efficient).

6. Ways to Understand Potential Differences Between SoSE and TSE: System Dynamics Models

System dynamics tools are visual modeling tools that allow one to conceptualize, simulate, and analyze models of dynamic systems and processes. Simulation models are built from causal loop or stock-and-flow diagrams. Relationships among system variables (or influences) are entered as causal connections. The model is analyzed throughout the building process by looking at the causes and uses of each variable and at the loops involving the variable. System dynamics models are also executable, allowing the user to explore the behavior of the model and conduct “what if” analyses.
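As a minimal illustration of the stock-and-flow style of model described above, the sketch below simulates a backlog of engineering tasks (a stock) drained by a completion rate (a flow) that degrades as coordination overhead grows with team size. All parameters are invented for demonstration and are not drawn from any of the cited models.

```python
# Toy stock-and-flow simulation: backlog is the stock, completion is the
# outflow, and coordination overhead is a simple influence on the flow.

def simulate(weeks=52, backlog=1000.0, team=20, productivity=2.0):
    """Return the backlog trajectory; productivity is tasks/person/week."""
    history = []
    for _ in range(weeks):
        overhead = min(0.5, 0.01 * team)                   # grows with team size
        completion = team * productivity * (1 - overhead)  # outflow this week
        backlog = max(0.0, backlog - completion)           # update the stock
        history.append(backlog)
        if backlog == 0:
            break
    return history

trajectory = simulate()
print(f"Backlog cleared after {len(trajectory)} weeks")
```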

Researchers in the areas of TSE and SoSE are using these tools to identify and better understand key influences in various engineering activities. For example, [Madachy et al., 2006] showed how both agile and plan-driven processes can be used to develop and deploy large systems in a constantly changing environment. [Cresswell et al., 2002] describes how a model might be used to better understand collaboration, trust building, and knowledge sharing in a complex, intergovernmental information system project, while [Greer et al., 2005] investigates “disconnects” in baselines across multiple organizations in a large software-intensive space system development program. Through another model, [Black and Repenning, 2001] investigated the under-allocation of resources in the early phases of a project and the impacts that it had on increasing error rates, overworked engineers, and declining performance. [Ferreira, 2002] developed a model to investigate the impact of requirements volatility on software development projects. [Ford and Sterman, 2003] use a system dynamics model to analyze concurrent development projects and interactions between the technical and behavioral dimensions of the projects.

These sample system dynamics models have been very insightful in understanding influences and dynamics in situations often common to systems engineering and development. Using these techniques, it is possible to analyze high level observations or indicators and better understand how processes can be changed to take advantage of certain good dynamics and to minimize the impact of undesirable dynamics.


CHAPTER III: PROPOSED METHODOLOGY

1. Overview of Research Design

The proposed approach for this research is to evaluate SoSE processes and their associated characteristics and to compare them to TSE processes. The framework for this research is based upon guidelines provided in [Cresswell, 2003; Johnson and Onwuegbuzie, 2004; Brace, 2004]. Because little formal research has been done in this emerging area, a mixed-methods approach (both quantitative and qualitative) has been selected. This approach seems especially well-suited to the area of SoSE since there are no clearly accepted definitions of what this engineering discipline is and whether it is, in fact, significantly different from TSE.

More specifically, this research will be accomplished through the following steps:

1. Conduct surveys to determine key SoSE activities in SoSE projects, lessons learned from SoSE programs, and the percentage of total effort spent in key SoSE activities.

2. Develop a process model for each SoSE and TSE project included in the survey.

3. Classify each project in the survey as SoS or non-SoS.

4. Determine the commonality of activities by program category (SoSE and TSE).

5. Evaluate the process models to determine potential new activities and to validate the percentage of effort assigned to key activities in the survey.

6. For each common activity, determine if there is any significant difference between the two populations (SoSE and TSE); see the statistics sketch after this list:
   a. Compute the mean, σ, σ², and correlation coefficient for the percentage of effort for the activity.
   b. Conduct regression analysis.

7. Determine the relative significance of SoSE activities not common to TSE activities by analyzing the percentage of overall effort spent on the SoSE-only activities and how important each activity was to the success of the project.

8. Analyze the data to see if differences still exist when the data are viewed within domain or technology categories.

9. Summarize the findings of the comparisons using various graphical techniques (e.g., histograms, Pareto charts, Kiviat diagrams).

2. Data Collection Instrument

To collect data for this research effort, a survey and a set of interview questions have been developed. The intent of the survey form and interview questions is to capture characteristics of the system itself, the processes used to develop the system, and the personnel/organizations supporting the development effort. The survey form and interview questions were initially developed using key SoS and SoSE definitions and characteristics identified through the literature review for this research, then refined through workshops with the USC CSSE industry and government affiliates. The resulting data collection form is provided in Appendix A.


Most of the data to be collected is either quantitative or categorical. For the quantitative information, the respondent is asked to provide a specific value or to evaluate a feature or characteristic using a ratio scale (e.g., percentage of total) or an interval scale (usually from 1 (low) to 5 (high)).

A few of the questions ask the respondents to provide textual information, often to elaborate on previous quantitative or categorical responses or to ensure that key aspects or categories have not been omitted from the data collection instrument.

3. Process Models

As part of this research effort, process models will be developed to capture the key engineering activities performed, the relationships between the activities, the relative durations of the key activities, and the influences that might affect the effort associated with the key activities. The goal of the process model data collection is to capture the process activities at a level that will potentially highlight differences between SoSE and TSE (similar to the analysis done in [Lane, 1999]), but not to delve down into the lower level procedures that capture more organizational and cultural aspects of the processes.

To determine an appropriate set of tools to capture and evaluate the key engineering activities, a trade study was conducted [Lane et al., 2007]. Initial findings indicated that there are a variety of process modeling tools available in the marketplace that support a range of purposes: project planning and analysis, development of standard business processes, business process re-engineering, and analysis of activities for lean-Six Sigma improvements, to name a few. Because the focus of the evaluation was to identify a set of tools to compare SoSE activities with TSE activities, the evaluation looked primarily at tools oriented towards a) project planning and analysis, b) process documentation and analysis to support lean-Six Sigma initiatives or business process re-engineering, and c) analysis of influences on activities or tasks. Criteria for evaluating the process modeling tools for the SoSE-TSE comparison were extracted from SoSE conferences, workshops, and guidelines [SoSECE, 2006; Purdue, 2006; USAF SAB, 2005] that identified potential areas where SoSE and TSE differ.

Tool comparisons against the desired process model capabilities showed that the combined use of Microsoft Project [Microsoft, 2003] and a system dynamics modeling tool such as Vensim [Ventana, 2006] or iThink [isee, 2007] can capture and help analyze much of the SoSE and TSE process information needed to perform the desired SoSE-TSE comparison. The process model information will be captured as a high-level, resource-loaded WBS. Any system dynamics models developed to further analyze process observations will be based on the data provided in the resource-loaded WBS and on quantitative information from the data collection instrument.
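For illustration, a resource-loaded WBS entry exported for analysis might be represented as follows; the field names are invented for this sketch rather than taken from Microsoft Project's schema:

```python
# Hypothetical sketch of a resource-loaded WBS record as it might be exported
# from the planning tool for analysis; field names are invented for this
# illustration and are not taken from Microsoft Project's schema.
from dataclasses import dataclass

@dataclass
class WBSTask:
    wbs_id: str            # e.g., "1.2"
    name: str              # activity, mapped to an ANSI/EIA 632 process area
    duration_weeks: float  # task duration
    staff_count: int       # resource loading: people assigned to the task

    @property
    def effort_person_weeks(self) -> float:
        return self.duration_weeks * self.staff_count

tasks = [
    WBSTask("1.1", "System Technical Requirements", 8.0, 3),
    WBSTask("1.2", "Logical Solution Representations", 6.0, 2),
]
total = sum(t.effort_person_weeks for t in tasks)
for t in tasks:
    share = 100.0 * t.effort_person_weeks / total
    print(f"{t.wbs_id} {t.name}: {share:.1f}% of total effort")
```

Computing each task's share of total effort in this way supports the percentage-of-effort comparisons described above without requiring precise labor-hour data.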

4. Boundary Objects

Key to successful SoSE is the communication of information between the various organizations developing the different parts of the SoS. The lead organization is responsible for identifying and specifying requirements as well as developing the overall architecture. The component system suppliers and vendors are responsible for providing component systems that are interoperable in the SoS framework. The types of information and the mechanisms used to convey information between these various organizations may shed light on differences between TSE and SoSE. Therefore, during the interviews to capture process model descriptions, efforts will also be made to capture descriptions of the boundary objects used to convey information between the various organizations. These include the boundary objects used between the SoS stakeholders and the lead organization, between the various lead organization teams, between the lead organization and the component system suppliers/vendors, and in direct communications between vendors and suppliers. An initial set of typical systems engineering boundary objects is provided to help the interviewee understand the types of information desired and to standardize the terminology used for potentially common boundary objects. The interview process will capture brief descriptions of each boundary object, how the boundary object is used, and a categorization of the boundary object as to its usefulness and persistence.

5. Data Collection and Analysis

The proposed approach for SoSE data collection is face-to-face interviews. This technique allows the researcher to request a comprehensive set of data, encourages consistent interpretation of the questions, and allows immediate follow-up on unexpected or unanticipated responses to capture additional information that helps with the analysis. Data will be collected from project representatives familiar with the project at a high technical or management level. Data for each project/program may be supplied by more than one project representative, depending on the areas of expertise of each representative. There is no requirement for the programs to have completed a single increment or iteration. Where actual data is not available, estimates from the planning process will be used.

The data collection process for a given program will start with the survey form. After that is completed, the interviewer will solicit inputs from the project representative to generate a project process model and capture descriptions of boundary objects used in the processes to transfer, translate, or transform information across project organizational boundaries. The final set of data collected will be based on a set of open-ended questions designed to capture relevant information not covered by the survey, process model, or boundary object descriptions.

The goal is to capture data from at least 10 completed or in-progress programs in each category: SoSE and TSE. The primary areas of analysis will be: how easily SoSE and TSE projects can be discriminated; the types of key processes and activities performed on projects and the distribution of effort across those process areas and activities; a comparison of common boundary objects for each set of projects (SoSE and TSE); a comparison of technical focus areas between SoSE and TSE projects; a comparison of requirements change rates for each set of projects; and a comparison of potential SoSE influences.

SoSE/TSE Project Characteristics: To evaluate the projects in this area, mean responses will be calculated and compared for the SoSE characteristics and the System Profiler table.

Process Areas and Activities: Effort allocations to the ANSI/EIA 632 process areas will be compared for both sets of projects. For each common process area/activity, the mean, σ, σ², and correlation coefficient will be computed for the percentage of effort, and then regression analysis will be performed. In addition, the process model WBSs will be analyzed to determine consistency with the allocation of effort across the ANSI/EIA 632 process areas and to identify any additional key activities for comparison using this same technique.

Boundary Objects: Common boundary objects (those used by more than two projects in one of the sets of projects) will be captured, and the percentage of projects using each of the common boundary objects will be compared across the two sets. Analysis will also be conducted to assess the mean "usefulness" and "persistence" of these common boundary objects.

Technical Focus Areas: For each technical focus area, the mean, σ, σ², and correlation coefficient will be computed for the percentage of effort in each set of projects, then compared across the two sets of projects.

Requirements Changes: Using the initial number of requirements and the change history provided by phase, the average requirements volatility for each project set in each phase will be computed. Requirements volatility will be calculated by adding the number of requirements added, modified, and deleted in a given phase and dividing that total by the total number of initial requirements for each project, then computing the mean, σ, σ², and correlation coefficient for each set of projects.
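A sketch of this volatility calculation follows (the phase names match the survey form; the initial requirements count echoes the Appendix B sample, and the per-phase change counts are hypothetical):

```python
# Sketch of the requirements-volatility calculation described above. Phase
# names follow the survey form; the initial count echoes the Appendix B
# sample, and the per-phase change counts are hypothetical.
def volatility(added: int, modified: int, deleted: int, initial: int) -> float:
    """Changed requirements in a phase as a fraction of the initial baseline."""
    return (added + modified + deleted) / initial

initial_reqs = 4500  # e.g., number of "shalls" at project start
changes_by_phase = {
    "Conceptualize": (200, 350, 50),
    "Develop": (120, 600, 80),
    "Test and Evaluation": (30, 150, 20),
    "Transition to Operation": (5, 40, 10),
}
for phase, (added, modified, deleted) in changes_by_phase.items():
    print(f"{phase}: {volatility(added, modified, deleted, initial_reqs):.1%}")
```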

Potential SoSE Differences: Areas to investigate were identified from the reported SoSE differences discussed earlier:

• Percentage of innovative or immature technologies used
• Number of parallel engineering activities
• Order in which engineering activities/processes are performed
• Percentage of activities focused on SoS interfaces (internal and external)
• Percentage of activities/effort focused on net-centric technology/issues
• Percentage of activities/effort focused on information/knowledge management/issues
• Impact of standards/lack of standards
• Impact of protocols/lack of standard or convergent protocols
• Organizational management issues related to multiple partners/suppliers
• Organizational management issues related to distributed teams
• New engineering specialties required to develop net-centric software-intensive SoSs
• New contracting approaches needed to support collaborative development and rapid/continual change
• Competing business goals for suppliers/vendors (e.g., engineering for the good of the SoS as opposed to engineering to optimize SoS component systems)
• Lifecycle model differences.

For each of these areas (listed in the Influence table in the survey form), the responses will be converted to a numeric value (positive=3, negative=-3, and none=0). Then, the mean, σ, σ², and correlation coefficient will be computed for the corresponding numeric responses in each set of projects, and these values will be compared across the two sets of projects.
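A sketch of this scoring step (the 3/-3/0 mapping is from the text; the sample responses are hypothetical):

```python
# Sketch of the influence-response scoring: the positive=3 / negative=-3 /
# none=0 mapping is from the text; the sample responses are hypothetical.
from statistics import mean, stdev, variance

SCORES = {"positive": 3, "negative": -3, "none": 0}

sose_responses = ["positive", "negative", "negative", "none", "negative"]
tse_responses = ["none", "positive", "none", "none", "negative"]

for label, responses in (("SoSE", sose_responses), ("TSE", tse_responses)):
    values = [SCORES[r] for r in responses]
    print(f"{label}: mean={mean(values):.2f}  sigma={stdev(values):.2f}  "
          f"sigma^2={variance(values):.2f}")
```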

Textual Responses: Coding techniques will be used to identify common themes in the textual responses used to explain or amplify other numeric or category responses. Counts of common themes for specific areas will be computed for the projects in each set. Then, the mean, σ, σ², and correlation coefficient will be computed for the counts in each set of projects, and these values will be compared across the two sets of projects.


System Dynamics Model(s): During the analysis of the survey, process model WBS, and interview information, system dynamics models may be developed to better understand areas of significant difference and the resulting impacts to cost and schedule.

Appendix B provides sample data for both an SoSE and a TSE project as well as sample analysis charts to highlight discernible differences between the two.

6. Potential Threats to Validity and Limitations

Several threats to the validity of this research, as well as research limitations, have been identified in the areas of data precision, sample sizes, and general engineering variance. The following describes these threats and limitations, their potential impacts on this research, and the steps that have been taken to mitigate them.

Effort Precision: One of the potential key indicators of differences between SoSE and TSE activities is the amount, or relative amount, of effort spent on an activity. The ways effort data are captured and reported on systems engineering and SoS projects can both limit this proposed research and pose a threat to its validity. Effort is typically captured as labor hours. However, this data is not always precise on engineering projects. Companies working on large programs attempt to capture effort using a set of pre-defined categories. Sometimes the categories are related to the skills and level of experience of the people working on the project. Other times the categories are related to the engineering phases or activities of the project. And there are cases where both types of categories are used. When effort is captured by engineering phase or activity, it is not always easy to determine when one phase/activity ends and the next one starts. In addition, not all engineering hours are recorded on projects. Engineering staff and management are often only required to record how they spent the first eight hours of the work day, but are not required to record any hours worked after the first eight. This means that these "extra" hours, often referred to as "uncompensated overtime", are often not captured or tracked on projects. This can result in data that is not broken out in a way that supports the research, inaccurate values for the specified labor categories, or skewed labor profiles.

Because the detailed effort data is often not precise and the goal of this research is only to show significant differences between SoSE and TSE, effort will be viewed from a much higher level: resource loading (number of people assigned to a task) at the WBS level and percentages of total effort. The assumption is that any under-reporting of labor hours is somewhat consistent across activities and that the reported effort is a close-enough approximation to identify any significant differences between the two sample sets of projects.

Sample Size: TSE and SoSE projects tend to span multiple years. In addition, there are relatively few large-scale projects. Therefore, it may be difficult to obtain enough projects to conduct a reasonable comparison.

To perform a good analysis of potential differences between SoSE and TSE with a very limited sample of projects in each category, it is important to minimize variance in other variables that may account for the differences. For example, if SoSE projects are compared to relatively small TSE projects, one might not be able to discern whether any observed differences were due to SoSE/TSE differences or to differences in the size and complexity of the projects. Therefore, the projects selected for this research will be those that have relatively similar size and complexity and are in similar domains. The domain selected for this research effort is government systems (as opposed to commercial systems), and preferably projects sponsored by the DoD.

General Engineering Variance: In the systems engineering arena, processes are typically not standardized, even within a single organization. While there may be some attempts at standardization at a high level, many aspects of systems development (for both traditional systems and SoSs) vary due to the nature of the system under development. Therefore, when comparing SoSE to TSE activities, one needs to be careful not to attribute system-specific differences to the more general engineering disciplines. Also, over time, techniques and methodologies evolve or are replaced. Therefore, the larger the timeframe spanned by the sample projects, the more variation there may be in the processes, techniques, and methodologies across the sample sets of projects.

To make sure that normal variation between systems engineering projects is not mistaken for SoSE/TSE differences, the projects in each category will be evaluated to determine the range of common variation. Then, any potential differences between SoSE and TSE will only be classified as significant if they exceed the common variation among projects. Likewise, if the candidate projects span a large timeframe, the projects will be categorized with respect to timeframes and the projects in the different timeframes will be analyzed to determine the significance of any differences due to evolving techniques and methodologies.


CHAPTER IV: RESEARCH PLAN

1. Schedule Summary

The following summarizes the schedule for the key activities associated with this research proposal:

Completed – Identify candidate SoSE size and cost drivers
Completed – Investigate potential areas of difference between SoSE and TSE
Completed – Conduct initial literature review
Completed – Develop survey and interview questions
Completed – Identify target SoSE and TSE programs for survey
By 05/07 – Complete initial SoSE and TSE surveys using face-to-face and telephone interviews
By 09/07 – Complete analysis of collected data and develop findings
By 11/07 – Document initial effort and accuracy of results
By 01/08 – Document effort results; final defense.

2. Target SoSE and TSE Programs

The focus for this research will be on government programs, preferably limited to DoD projects if there are a sufficient number of them. The following SoSE and TSE candidates have been identified to date:

• Air Force Institute of Technology (AFIT) case study projects
  – Hubble telescope
  – Theater Battle Management Core System (TBMCS)
  – F-111
  – C5-A
• Boeing 777
• [Carlock and Fenton, 2001] projects (IRS, FAA, Navy, DOJ)
• Columbia space shuttle
• Cubic Defense Systems project(s)
• Future Combat Systems (FCS) component systems
• Global Positioning System (GPS)
• Krygiel projects
  – Defense Mapping Agency's Digital Production System (DPS)
  – US Army's Task Force XXI
• National Aeronautics and Space Administration (NASA) Goddard project
• Northrop Grumman project(s) (initial data received)
• Theater Medical Information Program
• Additional selected projects from the Constructive Systems Engineering Cost Model (COSYSMO) development effort.

Points of contact/sources of information have been identified for all of the projects listed above. The final selection of which projects to include in the analysis will depend on the completeness of the information provided, with preference given to projects that have completed at least a single increment of functionality and are operational.


References

Ackoff, R. (1971); "Towards a System of Systems Concepts", Management Science, Vol. 17, No. 11, Theory Series, pp. 661-671.
ANSI/EIA (1999); ANSI/EIA-632-1998 Processes for Engineering a System.
Berry, B. (1964); "Cities as Systems within Systems of Cities", The Regional Science Association Papers, Volume 13.
Berryman, M., Allison, A., and Abbott, D. (2006); "Optimizing Genetic Algorithm Strategies for Evolving Networks", Symposium on Complex Systems Engineering, http://cs.calstatela.edu/wiki/index.php/Symposium_on_Complex_Systems_Engineering, accessed on 1/11/2007.
Black, L. and Repenning, N. (2001); "Why firefighting is never enough: preserving high-quality product development", System Dynamics Review, Vol. 17, No. 1, pp. 33-62.
Blanchard, B. and Fabrycky, W. (1998); Systems Engineering and Analysis, Prentice Hall.
Blanchette, S. (2005); U.S. Army Acquisition – The Program Executive Officer Perspective, Special Report CMU/SEI-2005-SR-002.
Boehm, B. (2006); "Some Future Trends and Implications for Systems and Software Engineering Processes", Systems Engineering, Vol. 9, No. 1, pp. 1-19.
Boehm, B., Abts, C., Brown, A. W., Chulani, S., Clark, B., Horowitz, E., Madachy, R., Reifer, D. J., and Steece, B. (2000); Software Cost Estimation With COCOMO II, Prentice Hall.
Boehm, B. and Lane, J. (2006); "21st Century Processes for Acquiring 21st Century Software-Intensive Systems of Systems", CrossTalk, Vol. 19, No. 5, pp. 4-9.
Boehm, B., Valerdi, R., Lane, J., and Brown, A. (2005); "COCOMO Suite Methodology and Evolution", CrossTalk - The Journal of Defense Software Engineering, Vol. 18, No. 4, pp. 20-25.
Brace, I. (2004); Questionnaire Design: How to Plan, Structure, and Write Survey Material for Effective Market Research, Kogan Page Limited.
Brooks, F. (1995); The Mythical Man-Month: Essays on Software Engineering, Addison-Wesley.
Carlile, P. (2002); "A Pragmatic View of Knowledge and Boundaries: Boundary Objects in New Product Development", Organization Science, Vol. 13, pp. 442-455.
Carlock, P. and Fenton, R. (2001); "System of Systems (SoS) Enterprise Systems for Information-Intensive Organizations", Systems Engineering, Vol. 4, No. 4, pp. 242-261.
Carlock, P. and Lane, J. (2006); "System of Systems Enterprise Systems Engineering, the Enterprise Architecture Management Framework, and System of Systems Cost Estimation", 21st International Forum on COCOMO and Systems/Software Cost Modeling.
Cockburn, A. (2001); Writing Effective Use Cases, Addison-Wesley.
Cocks, D. (2006); "How Should We Use the Term 'System of Systems' and Why Should We Care?", Proceedings of the 16th Annual INCOSE International Symposium.
Cresswell, A. et al. (2002); "Modeling Intergovernmental Collaboration: A System Dynamics Approach", Proceedings of the 35th Annual Hawaii International Conference on System Sciences.
Cresswell, J. (2003); Research Design: Qualitative, Quantitative, and Mixed Methods Approaches, Second Edition, Sage Publications.
Dalrymple, E. (2006); "Future Combat Systems SoS Characteristics and Critical Success Factors", Proceedings of the USC CSSE Convocation.
Department of Defense (DoD) (2006a); Defense Acquisition Guidebook, Version 1.6, accessed at http://akss.dau.mil/dag/ on 2/2/2007.
Department of Defense (DoD) (2006b); System of Systems Engineering Guide: Considerations for Systems Engineering in a System of Systems Environment, draft version 0.9.
DiMario, M. (2006); "System of Systems Characteristics and Interoperability in Joint Command Control", Proceedings of the 2nd Annual System of Systems Engineering Conference.
Dorner, D. (1996); The Logic of Failure, Metropolitan Books.
Eisner, H. (1993); "RCASSE: Rapid Computer-Aided Systems of Systems Engineering", Proceedings of the 3rd International Symposium of the National Council on Systems Engineering, NCOSE, Vol. 1, pp. 267-273.
Federation of American Scientists (FAS); "Integrated Undersea Surveillance System (IUSS)", accessed at http://www.fas.org/irp/program/collect/iuss.htm on 12/27/2006.
Ferreira, S. (2002); Measuring the Effects of Requirements Volatility on Software Development Projects, Ph.D. Dissertation, Arizona State University.
Finley, J. (2006); "Keynote Address", Proceedings of the 2nd Annual System of Systems Engineering Conference.
Fong, A., Srinivasan, J., and Valerdi, R. (2007); "Boundary Objects as a Framework to Understand the Role of Systems Integrators", Proceedings of the Fifth Annual Conference on Systems Engineering Research.
Ford, D. and Sterman, J. (2003); "Iteration Management for Reduced Cycle Time in Concurrent Development Projects", Concurrent Engineering Research and Application (CERA) Journal.
Friedman, T. (2005); The World is Flat: A Brief History of the Twenty-First Century, Farrar, Straus and Giroux, New York.
Garber, V. (2006); "Keynote Presentation", Proceedings of the 2nd Annual System of Systems Engineering Conference.
GlobalSecurity.ORG (2005); Sound Surveillance System (SOSUS), http://www.globalsecurity.org/intell/systems/sosus.htm, accessed on 1/20/2007.
Greer, D., Black, L., and Adams, R. (2005); "Improving Inter-Organizational Baseline Alignment in Large Space System Development Programs", IEEE Aerospace Conference.
Highsmith, J. (2000); Adaptive Software Development: A Collaborative Approach to Managing Complex Systems, Dorset House Publishing.
INCOSE (2006); Systems Engineering Handbook, Version 3, INCOSE-TP-2003-002-03.
isee Systems (2007); "iThink", http://www.iseesystems.com/Softwares/Business/ithinkSoftware.aspx, accessed on 2/10/2007.
ISO/IEC (2002); ISO/IEC 15288:2002(E) Systems Engineering - System Life Cycle Processes.
IUSS-Caesar Alumni Association (IUSSCAA); IUSS History, http://www.iusscaa.org/history.htm, accessed on 12/27/2006.
Johnson, R. and Onwuegbuzie, A. (2004); "Mixed Methods Research: A Research Paradigm Whose Time Has Come", Educational Researcher, Vol. 33, No. 7, pp. 14-26.
Kauffman, S. (1995); At Home in the Universe: The Search for the Laws of Self-Organization and Complexity, Oxford University Press.
Kreitman, K. (1996); "From 'The Magic Gig' to Reliable Organizations: A New Paradigm for the Control of Complex Systems", Symposium on Complex Systems Engineering, http://cs.calstatela.edu/wiki/index.php/Symposium_on_Complex_Systems_Engineering, accessed on 1/11/2007.
Krygiel, A. (1999); Behind the Wizard's Curtain, CCRP Publication Series, July 1999, p. 33.
Lane, J. (1999); "Quantitative Assessment of Rapid System Development Using COTS Integration", Proceedings of the Eleventh Annual Software Technology Conference.
Lane, J. (2005a); "COSOSIMO October 2005 Workshop", Proceedings of the 20th Forum on COCOMO and Software Cost Modeling, USC CSE.
Lane, J. (2005b); "System of Systems Lead System Integrators: Where do They Spend Their Time and What Makes Them More/Less Efficient: Background for COSOSIMO", University of Southern California Center for Systems and Software Engineering, USC-CSE-2005-508.
Lane, J. (2005c); "System of Systems (SoS) Processes", Proceedings of USC CSE Annual Research Review, March 2005.
Lane, J. (2006); "COSOSIMO Parameter Definitions", USC-CSE-TR-2006-606, Los Angeles, CA: University of Southern California Center for Systems and Software Engineering.
Lane, J. and Boehm, B. (2007); "System of Systems Cost Estimation: Analysis of Lead System Integrator Engineering Activities", Information Resources Management Journal, Vol. 20, No. 2, pp. 23-32.
Lane, J., Settles, S., and Boehm, B. (2007); "Assessment of Static and Dynamic Process Model Tools to Support the Analysis of System of System Engineering Activities", Proceedings of the Fifth Annual Conference on Systems Engineering Research.
Lane, J. and Valerdi, R. (2005); "Synthesizing SoS Concepts for Use in Cost Estimation", Proceedings of IEEE Systems, Man, and Cybernetics Conference.
Lu, S. (2003); Engineering as Collaborative Negotiation: A New Paradigm for Collaborative Engineering, http://wisdom.usc.edu/ecn/about_ECN_what_is_ECN.htm, accessed on 2/14/2007.
Madachy, R., Boehm, B., and Lane, J. (2006); "Assessing Hybrid Incremental Processes for SISOS Development", USC CSSE Technical Report USC-CSSE-2006-623.
Maier, M. (1998); "Architecting Principles for Systems-of-Systems", Systems Engineering, Vol. 1, No. 4, pp. 267-284.
Markus, M., Majchrzak, A., and Gasser, L. (2002); "A Design Theory for Systems That Support Emergent Knowledge Processes", MIS Quarterly, Vol. 26, No. 3.
Meilich, A. (2006); "System of Systems Engineering (SoSE) and Architecture Challenges in a Net Centric Environment", Proceedings of the 2nd Annual System of Systems Engineering Conference.
Microsoft (2003); Project Standard 2003 Overview, accessed at http://www.microsoft.com/office/project/prodinfo/standard/overview.mspx on 11/1/2006.
NAVSTAR Global Positioning System Joint Program Office; http://gps.losangeles.af.mil/, accessed on 12/6/2006.
Northrop, L., et al. (2006); Ultra-Large-Scale Systems: The Software Challenge of the Future, Software Engineering Institute.
Pair, C. (2006); "Keynote Presentation", Proceedings of the 2nd Annual System of Systems Engineering Conference.
Pinney, B. (2001); Projects, Management, and Protean Times: Engineering Enterprise in the United States, 1870-1960, Ph.D. Dissertation, Massachusetts Institute of Technology.
Pressman, J. and Wildavsky, A. (1973); Implementation: How Great Expectations in Washington are Dashed in Oakland; Or, Why It's Amazing that Federal Programs Work at All, This Being a Saga of the Economic Development Administration as Told by Two Sympathetic Observers Who Seek to Build Morals on a Foundation of Ruined Hopes, University of California Press.
Prokopenko, M., Bochetti, F., and Ryan, A. (2006); "An Information-Theoretic Primer on Complexity, Self-Organisation and Emergence", Symposium on Complex Systems Engineering, http://cs.calstatela.edu/wiki/index.php/Symposium_on_Complex_Systems_Engineering, accessed on 1/11/2007.
Purdue University (2006); Proceedings of AFOSR SoSE Workshop, 17-18 May 2006.
Rechtin, E. (1991); Systems Architecting: Creating & Building Complex Systems, Prentice Hall.
Sage, A. and Cuppan, C. (2001); "On the Systems Engineering and Management of Systems of Systems and Federations of Systems", Information, Knowledge, and Systems Management, Vol. 2, pp. 325-345.
SEI (2001); Capability Maturity Model Integration (CMMI), CMU/SEI-2002-TR-001.
Sheard, S. (2005); Practical Applications of Complexity Theory for Systems Engineers, Systems and Software Consortium, Incorporated.
Sheard, S. (2006); "Foundations of Complexity Theory for Systems Engineering of Systems of Systems", Proceedings of the IEEE Conference on System of Systems Engineering.
Shenhar, A. (1994); "A New Systems Engineering Taxonomy", Proceedings of the 4th International Symposium of the National Council on Systems Engineering, Vol. 2, pp. 261-276.
Siel, C. (2006); "Keynote Presentation", Proceedings of the 2nd Annual System of Systems Engineering Conference.
Smithsonian Institution, National Museum of American History (2000); Submarine Missions: Anti-Submarine Warfare, http://americanhistory.si.edu/subs/work/missions/warfare/index.html, accessed on 1/20/2007.
Society for Design and Process Science (SDPS) (2006); Proceedings of the Ninth World Conference on Integrated Design and Process Technology, Volume 1.
Star, S. and Griesemer, J. (1989); "Institutional Ecology, 'Translations' and Boundary Objects: Amateurs and Professionals in Berkeley's Museum of Vertebrate Zoology, 1907-39", Social Studies of Science, Vol. 19, pp. 387-420.
System of Systems Engineering Center for Excellence (SoSECE) (2006); Second Annual SoS Engineering Conference, 25-26 July 2006.
The Aerospace Corporation, MITRE, RAND, and Third Millennium Systems (2007); Symposium on Complex Systems Engineering, http://cs.calstatela.edu/wiki/index.php/Symposium_on_Complex_Systems_Engineering, accessed on 1/11/2007.
United States Air Force Scientific Advisory Board (2005); Report on System-of-Systems Engineering for Air Force Capability Development, Public Release SAB-TR-05-04.
Valerdi, R. (2005); Constructive Systems Engineering Cost Model, Ph.D. Dissertation, University of Southern California.
Ventana Systems Incorporated (2006); Vensim® Version 5 User's Guide.
Wheatley, M. (1992); Leadership and the New Science: Learning about Organization from an Orderly Universe, Berrett-Koehler Publishers.


Appendix A

A.1 SoSE-TSE Comparison Survey Form
A.2 Summary of ANSI/EIA 632 System Engineering Processes


A.1 SoSE-TSE Comparison Survey

Center for Systems & Software Engineering
941 W. 37th Pl., Salvatori 328, Los Angeles, CA 90089-0781
Phone: (213) 740-5703, FAX: (213) 740-4927

Purpose of Survey

The University of Southern California (USC) Center for Systems and Software Engineering (CSSE) is conducting System of Systems (SoS) and SoS Engineering (SoSE) research to better understand cost and schedule implications of approaches being used to develop and evolve SoSs and how these differ from Traditional Systems Engineering (TSE). The purpose of this survey is to obtain a better understanding of:

• What people are referring to as "systems of systems"
• How SoSE activities might differ from TSE activities
• Whether the SoSE-TSE differences are due more to technologies/architectures or engineering processes/activities.

Survey Participant Information

Name:
Organization: Title:
Street Address:
City: State: Zip:
Phone: Email:

Role on Program (check all that apply):

Procuring agency/organization

Lead System Integrator (LSI)

Systems Engineer

Component supplier/vendor

Architect

Supplier/vendor oversight

Integrator/Tester

Verification and Validation

Other:

# years experience in this role:

# programs supported in this role:


Program Characteristics (complete a separate survey form for each Program)

Name of Program/System:

Period of Performance:

Goal of system:

Iteration/Build that data pertains to

Business Domain

Government (Federal, State, Municipal, etc.)

Commercial

Other (please describe):

System Domain (check all that apply)

Aerospace and Defense

Other Government Services

Commercial services

Commercial products

Business Enterprise support

Brief description of domain/type:

Funding strategy:

“Cost plus” or time and materials at system-of-interest level

Fixed-price at system-of-interest level

Level of effort at system-of-interest level

By component system/platform

Cost sharing

Strategic alliance

Other (please describe):

Organizational interdependencies:

Number of suppliers:

Number of vendors:

Number of partners:

Number of subcontractors:

Number of layers of associated entities:


System Characteristics and Properties

If you think your system-of-interest is an SoS, indicate why you refer to your system as an SoS:

Indicate which of the following characteristics apply to your system-of-interest:

Component systems can operate/function independent of the system-of-interest

% independent:

% dependent upon system-of-interest for operation:

Component systems are independently managed

Evolutionary development process used to develop system-of-interest

Possesses emergent behavior(s)

Geographically distributed

Number of unique component systems:

Number of system-of-interest unique/distinct application/communications protocols:

Number of system-of-interest capabilities or requirements:

Number of system-of-interest user scenarios (cloud-level):

System-of-interest architecture style (indicate all that apply)

Service oriented architecture (e.g., publish/subscribe)

“Plug and play” using standard interface protocols

Hierarchical

Distributed

Federation

Point to point integration

Other (please describe):

Don’t know

List the technologies used to enable any system-of-interest net-centricity (e.g., types of networks, convergent (standard) protocols, run-time protocol converters, data standards, run-time data converters, common operating environment/tools, security).

Describe the role of reuse/COTS in architecture decisions.

Briefly describe the types of emergent behavior anticipated or identified as a result of creating the system-of-interest.

Briefly describe the system-of-interest integration strategy.


System Profiler*

For the system-of-interest, indicate the appropriate System Category (TSE, SoSE, or SoSE Challenges) for each Characteristic listed below.

Context | Characteristic | TSE | SoSE | SoSE Challenges

System Context | System Behavior | Known system behavior | System behavior fairly predictable | System behavior will evolve
Strategic Context | Desired Outcome | Improve existing capability | Change existing capability | Build fundamentally new capability
Strategic Context | Mission Environment | Mission stable | Mission evolves slowly | Mission very fluid, ad hoc
Strategic Context | Scope of Effort | Single function | Single enterprise | Extended enterprise
Stakeholder Context | Stakeholder Relationships | Relationships stable | New relationships | Resistance to changing relationships
Stakeholder Context | Stakeholder Involvement | Stakeholders concur | Agree in principle; some not involved | Multiple equities; distrust
Implementation Context | Acquisition Environment | Single program, single system | Single program, multiple systems | Multiple programs, multiple systems
Implementation Context | Scale of Effort | Single user class | Similar users | Many different users

* Based on the System Profiler in [DoD, 2006b]


System-of-Interest Characteristics

Development Effort Profile: Indicate in the following table the approximate percentage of project effort expended for each of the listed process areas (for a more detailed description of the listed example tasks/activities, see Appendix A.2). If key activities from your project appear to be missing, please add them at the end. The sum of the percentages of effort should total 100%.

Process | Example EIA/ANSI 632 Tasks | % of Total Effort

Acquisition and Supply
1. Product Supply
2. Product Acquisition
3. Supplier Performance

Technical Management
4. Process Implementation Strategy
5. Technical Effort Definition
6. Schedule and Organization
7. Technical Plans
8. Work Directives
9. Progress Against Plans and Schedules
10. Progress Against Requirements
11. Technical Reviews
12. Outcomes Management
13. Information Dissemination

System Design
14. Acquirer Requirements
15. Other Stakeholder Requirements
16. System Technical Requirements
17. Logical Solution Representations
18. Physical Solution Representations
19. Specified Requirements

Product Realization
20. Implementation
21. Transition to Use

Technical Evaluation
22. Effectiveness Analysis
23. Tradeoff Analysis
24. Risk Analysis
25. Requirements Statements Validation
26. Acquirer Requirements Validation
27. Other Stakeholder Requirements Validation
28. System Technical Requirements Validation
29. Logical Solution Representations Validation
30. Design Solution Verification
31. End Product Verification
32. Enabling Product Readiness
33. End Products Validation

Program Management
Business Process Re-engineering
Other Processes (please describe)

System-of-Interest Technical Focus: Independent of the categories above, indicate the percentage of total effort spent in the following areas.

Technical Focus | % of Total Effort
System-of-interest interfaces (internal and external) |
Net-centric technology/issues |
Information/knowledge management/issues |

System-of-Interest Life Cycle Model: Indicate which of the following process models most accurately describes the processes used to develop the system-of-interest.

System "V" | Incremental | Evolutionary | Agile | Spiral | Other:


System-of-Interest Processes:

Identify key processes and relationships between processes using a 2-3 level Microsoft Project description.

Indicate any applicable maturity levels/certifications for project process (e.g., Software Engineering Institute (SEI) Capability Maturity Model Integrated (CMMI) level or ISO certification).

Lead organization:

Vendors/suppliers/partners/subcontractors (use multiple entries as needed):

Indicate flexibility of processes and amount of experimentation used on project:

System-of-Interest Boundary Objects: Identify, briefly describe and categorize system-of-interest boundary objects used to transfer, translate, or transform knowledge between the stakeholders, the engineering teams, and any component system suppliers/vendors/partners/subcontractors. Some candidate boundary objects have been listed. Strike through those that were not used on your program and add others from your program that are not listed below. (Add as many rows as necessary to capture all relevant boundary objects.)

Boundary Object | Brief Description | Usefulness¹ | Persistence²

Concept of operations

System requirements specification

Requirements repository

Interface specifications

Interface design

Software requirements specification

Database design

UML/SYSML diagrams (specify which ones and provide an entry for each one used)

DoDAF diagrams (specify which ones and provide an entry for each one used)

Prototypes (specify an entry for each one utilized)

Exceptional Designer/Engineer (Keeper of the Vision)

¹ Usefulness: Rating from 1 to 5 on the usefulness of the boundary object for communicating between groups (5 is Very High, 1 is Very Low or not useful at all).
² Persistence: Initial phase where the boundary object was created and typical duration of the boundary object (phases in which it is used/evolved).



Overall synchronicity of boundary objects (rate from 1 (low) to 5 (high)):

Requirements Volatility: For completed projects, indicate the requirements volatility experienced on the program by phase.

Initial number of requirements (e.g., number of "shalls"):

Phase | Conceptualize | Develop | Test and Evaluation | Transition to Operation
Requirements Added | | | |
Requirements Modified | | | |
Requirements Deleted | | | |


Engineering Influences: Indicate whether each of the following had a positive influence, a negative influence, or no influence on your engineering activities.

Influence Positive Negative None

Innovative technology

Immature technology

Number of parallel engineering activities

Impact of standards

Impact of lack of standards

Impact of protocols

Impact of lack of standard or convergent protocols

Organizational management issues related to multiple vendors/suppliers/partners/subcontractors

Organizational management issues related to distributed teams

New engineering specialties required to develop system-of-interest

New contracting approaches needed to support collaborative development

New contracting approaches needed to support rapid/continual change

Competing business goals for suppliers/vendors (e.g., engineering for the good of the system-of-interest as opposed to engineering to optimize system-of-interest components)

Any other key influences?


Success of Program and Overall Assessment of SoSE/TSE Differences

Rate the overall success of the system-of-interest program to date for each of the perspectives below using a high, medium, or low rating.

Aspect Rating Comments: Key Reasons for Success or Not So Successful

Achieving/achieved business goals

Achieving/achieved desired emergent behaviors

User acceptance

Meeting/met original budgets, schedules

Evolving/evolved system in a timely manner as needed changes are/were identified

Flexible enough to support long term evolution

What else is important?

How different is SoSE from TSE? Rate using a scale from 1 (little or no difference) to 5 (extremely different):


Appendix A.2 Summary of ANSI/EIA 632 System Engineering Processes³

Process | ANSI/EIA 632 Task | Definition

Acquisition and Supply

1. Product Supply: Assess acquisition request, offer, or directive; negotiate agreement; deliver products.
2. Product Acquisition: Prepare acquisition requests, evaluate supplier response, make offer, negotiate agreement, accept delivered products.
3. Supplier Performance: Define supplier relationships; participate in product teams; monitor product metrics and assess products (invoke tasks 9-11 as applicable); flow down Concept of Operations (CONOPS) requirement changes; control requirement changes; assess progress against requirements; validate products received (invoke task 33a).

Technical Management

4. Process Implementation Strategy: Identify stakeholders, applicable documents, associated process approaches, and applicable life cycles; identify and define technical process and project integration, and progress assessment; document all of the above in a Process Implementation Strategy.
5. Technical Effort Definition: Identify project requirements, establish information database, define risk management strategy, define product and process metrics, establish trade-off cost goals, identify Technical Performance Measurements (TPMs), identify applicable project tasks, identify methods and tools, establish technology insertion approaches.
6. Schedule and Organization: Develop event-based and calendar-based schedules, identify resource requirements, define staffing/discipline needs, define team and organizational structure.
7. Technical Plans: Develop Engineering Plan, Risk Plan, Technical Review Plan, Validation Plans, Verification Plans, and other applicable plans (e.g., Human Factors, Security Plans).
8. Work Directives: Develop work packages and generate work authorizations.
9. Progress Against Plans and Schedules: Identify events, tasks, and process metrics for monitoring; collect and analyze metrics data; compare process metrics against plans and schedules; implement required changes.
10. Progress Against Requirements: Identify product metrics to be monitored, collect and analyze product metrics data, record rationale for decisions/assumptions, compare results against requirements, identify and implement required changes.
11. Technical Reviews: Identify technical review objectives and requirements; determine progress against event-based plan; establish technical review board, agenda, and speakers; prepare technical review package and presentation material; conduct technical review; close out review.
12. Outcomes Management: Capture process outcomes, perform configuration management, perform change management, perform interface management, perform risk management, perform data and document management, manage information database, manage and track requirements.
13. Information Dissemination: Provide progress status, provide planning information, disseminate approved and controlled requirements, provide information for and from reviews, make available design data and schema, make available lessons learned, report variances, disseminate data deliverables, disseminate approved changes, disseminate directives.

System Design

14. Acquirer Requirements: Identify, collect, and prioritize acquirer's system requirements; ensure completeness and consistency of the set of collected acquirer requirements (invoke task 26); record set of acquirer requirements.
15. Other Stakeholder Requirements: Identify and collect other stakeholders' end product requirements, identify and collect other stakeholders' enabling product requirements, identify and collect other stakeholders' external constraints, ensure completeness and consistency of the set of other stakeholders' requirements (invoke task 27), record set of other stakeholder requirements.
16. System Technical Requirements: Establish required transformation rules, priorities, inputs, outputs, states, modes, and configurations; define operational requirements; define performance requirements; analyze acquirer and other stakeholder requirements (e.g., human factor effects, capacities and timing, technology constraints, product design constraints); challenge questionable requirements; resolve identified requirement conflicts; prepare a set of acceptable system technical requirement statements; ensure completeness and consistency of the set of system technical requirements (invoke task 28); record the set of system technical requirements.
17. Logical Solution Representations: Select and implement one or more of these four approaches (Functional Analysis, Object Oriented Analysis, Structured Analysis, Information Modeling), or another approach designated by enterprise policies, guides, or standards; establish a set of logical solution representations (see list); assign system technical requirements, including performance requirements and constraints; identify, define, and validate derived technical requirement statements (invoke task 25); ensure completeness and consistency of the logical solution representations (invoke task 29); record logical solution representations and derived technical requirements.
18. Physical Solution Representations: Analyze logical solution representation sets and assigned system and derived technical requirements; assign representations, derived technical requirements, and unassigned system technical requirements to appropriate physical entities (see list).
19. Specified Requirements: Fully characterize design solution, ensure design solution consistency (invoke task 30), specify requirements, record design solution and related specified requirements, establish projects for development of enabling products.

Product Realization

20. Implementation: Acquire products (goods or services), validate acquired products (invoke task 33), assemble/integrate validated end products, verify integrated end products (invoke task 31), verify enabling products for each associated process (invoke task 32), validate the verified end product (invoke task 33b).
21. Transition to Use: Acquire and put in place enabling products, prepare end products for shipping or storage, prepare the operational sites, install products, perform commissioning, provide ghosting, train users and maintenance personnel, provide in-service support.

Technical Evaluation

22. Effectiveness Analysis: Plan effectiveness analyses, analyze system cost effectiveness, analyze total ownership cost, analyze environmental impacts, analyze system effectiveness, record outcomes of effectiveness analysis.
23. Tradeoff Analysis: Plan tradeoff analysis, perform tradeoff analysis, record outcomes of tradeoff analysis.
24. Risk Analysis: Identify risks, characterize risks, prioritize risks, evaluate ways to avert risks, define and implement a plan or approach for averting each significant risk, capture and communicate risk analysis outcomes.
25. Requirements Statements Validation: Invoked by task 17. Analyze and ensure each technical requirement statement complies with (list of criteria); analyze and ensure technical requirement statements, in pairs and as a set, are stated with (list of criteria).
26. Acquirer Requirements Validation: Invoked by task 14. Select methods and define procedures, establish downward traceability, establish upward traceability, identify and resolve variances, record validation results.
27. Other Stakeholder Requirements Validation: Invoked by task 15. Select methods and define procedures, establish downward traceability, establish upward traceability, identify and resolve variances, record validation results.
28. System Technical Requirements Validation: Invoked by task 16. Select methods and define procedures, establish downward traceability, establish upward traceability, analyze assumptions, analyze other system technical requirements, identify and resolve variances, perform revalidation, record validation results.
29. Logical Solution Representations Validation: Invoked by task 17. Select methods and define procedures, establish downward traceability, establish upward traceability, analyze assumptions, identify and resolve variances, perform revalidation, record validation results.
30. Design Solution Verification: Invoked by task 19. Plan the design solution verification in accordance with the Verification Plan, the agreement, the applicable enterprise-based life cycle phase, and the level in the system structure; perform the planned design solution verification using selected methods and procedures within the established verification environment; perform reverification; record verification results.
31. End Product Verification: Invoked by task 20. Plan the end product verification in accordance with the Verification Plan, the agreement, the applicable enterprise-based life cycle phase, and the level in the system structure; perform the planned end product verification using selected methods and procedures within the established verification environment; perform reverification; record verification results.
32. Enabling Product Readiness: Invoked by task 20. Plan enabling product readiness determination in accordance with the agreement, the applicable enterprise-based life cycle phase, and the level in the system structure; perform planned enabling product readiness determination using selected methods and procedures; reaccomplish readiness determination; record readiness determination results.
33. End Products Validation: Invoked by task 3. Confirmation by examination and provision of objective evidence that the specific intended use of an end product, or an aggregation of end products, is accomplished in an intended usage environment. Representative tasks include: determine validation exit criteria, acquire appropriate test article, conduct validation, perform revalidation, record validation results.

³ Electronic Industries Alliance, EIA Standard 632: Processes for Engineering a System, January 1999.


Appendix B Analysis Demonstration Using Sample SoSE and TSE Projects

B.1 Sample SoSE Survey Response: Jail Information Management System

B.2 Sample SoSE Process Model: Jail Information Management System

B.3 Sample TSE Survey Response: Provisioning System

B.4 Sample TSE Process Model: Provisioning System

B.5 Sample Comparative Analysis


B.1 Sample SoSE Survey Response

Center for Systems & Software Engineering
941 W. 37th Pl., Salvatori 328, Los Angeles, CA 90089-0781
Phone: (213) 740-5703, FAX: (213) 740-4927

Purpose of Survey

The University of Southern California (USC) Center for Systems and Software Engineering (CSSE) is conducting System of Systems (SoS) and SoS Engineering (SoSE) research to better understand cost and schedule implications of approaches being used to develop and evolve SoSs and how these differ from Traditional Systems Engineering (TSE). The purpose of this survey is to obtain a better understanding of:

• What people are referring to as "systems of systems"
• How SoSE activities might differ from TSE activities
• Whether the SoSE-TSE differences are due more to technologies/architectures or engineering processes/activities.

Survey Participant Information Survey Participant Information Name: EXAMPLE 1 Name: EXAMPLE 1

Organization: Title Organization: Title

Street Address: Street Address:

City: State: Zip: City: State: Zip:

Phone: Email: Phone: Email:

Role on Program (check all that apply): Role on Program (check all that apply):

Procuring agency/organization

X Lead System Integrator (LSI)

Systems Engineer

Component supplier/vendor

Architect

Supplier/vendor oversight

Integrator/Tester

Verification and Validation

Other:

# years experience in this role: 10

# programs supported in this role: 3


Program Characteristics (complete a separate survey form for each Program)

Name of Program/System: Jail Information Management System

Period of Performance: 1993-2002

Goal of system: Replace legacy jail management system and add new capabilities in a net-centric environment

Iteration/Build that data pertains to: 1

Business Domain

X Government (Federal, State, Municipal, etc.)

Commercial

Other (please describe):

System Domain (check all that apply)

Aerospace and Defense X Other Government Services

Commercial services Commercial products

X Business Enterprise support

Brief description of domain/type:

Funding strategy:

“Cost plus” or time and materials at system-of-interest level

X Fixed-price at system-of-interest level

Level of effort at system-of-interest level

By component system/platform

Cost sharing

Strategic alliance

Other (please describe):

Organizational interdependencies:

Number of suppliers:

Number of vendors: 5

Number of partners:

Number of subcontractors: 1

Number of layers of associated entities:


System Characteristics and Properties

If you think your system-of-interest is an SoS, indicate why you refer to your system as an SoS: Integration of multiple COTS products in a net-centric environment with a configurable number of jail nodes.

Indicate which of the following characteristics apply to your system-of-interest:

X Component systems can operate/function independent of the system-of-interest

% independent: 75% (COTS applications)

% dependent upon system-of-interest for operation: 25% (network components/node replication)

X Component systems are independently managed

Evolutionary development process used to develop system-of-interest

X Possesses emergent behavior(s)

X Geographically distributed – across multiple jail sites in the county

Number of unique component systems: 6

Number of system-of-interest unique/distinct application/communications protocols: 1

Number of system-of-interest capabilities or requirements: 4500

Number of system-of-interest user scenarios (cloud-level): 500

System-of-interest architecture style (indicate all that apply)

Service oriented architecture (e.g., publish/subscribe)

“Plug and play” using standard interface protocols

Hierarchical

Distributed

Federation

X Point to point integration (APIs)

X Other (please describe): Oracle multi-master replication

Don’t know

List the technologies used to enable any system-of-interest net-centricity (e.g., types of networks, convergent (standard) protocols, run-time protocol converters, data standards, run-time data converters, common operating environment/tools, security). Oracle multi-master replication, First Data Bank data standards, security groups defined by core COTS product.

Describe the role of reuse/COTS in architecture decisions. 100% COTS products, with significant extension to core COTS product (33% additional functionality)

Briefly describe the types of emergent behavior anticipated or identified as a result of creating the system-of-interest. Better care and tracking of jail inmates. Better jail staff safety. Fewer lawsuits related to errors.

Briefly describe the system-of-interest integration strategy. Oracle 7-way Multi-Master Replication in a distributed node environment.


System Profiler*

For the system-of-interest, indicate the appropriate System Category (TSE, SoSE, or SoSE Challenges) for each Characteristic listed below.

System Context
  System Behavior: Known system behavior (TSE); System behavior fairly predictable (SoSE); System behavior will evolve (SoSE Challenges)
  Desired Outcome: Improve existing capability (TSE); Change existing capability (SoSE); Build fundamentally new capability (SoSE Challenges)

Strategic Context
  Mission Environment: Mission stable (TSE); Mission evolves slowly (SoSE); Mission very fluid, ad hoc (SoSE Challenges)
  Scope of Effort: Single function (TSE); Single enterprise (SoSE); Extended enterprise (SoSE Challenges)

Stakeholder Context
  Stakeholder Relationships: Relationships stable (TSE); New relationships (SoSE); Resistance to changing relationships (SoSE Challenges)
  Stakeholder Involvement: Stakeholders concur (TSE); Agree in principle, some not involved (SoSE); Multiple equities, distrust (SoSE Challenges)

Implementation Context
  Acquisition Environment: Single program, single system (TSE); Single program, multiple systems (SoSE); Multiple programs, multiple systems (SoSE Challenges)
  Scale of Effort: Single user class (TSE); Similar users (SoSE); Many different users (SoSE Challenges)

* Based on System Profiler in [DoD, 2006b]


System-of-interest Characteristics

Development Effort Profile: Indicate in the following table the approximate percentage of project effort expended for each of the listed process areas (for a more detailed description of the listed example tasks/activities, see Appendix A.2). If key activities from your project appear to be missing, please add them at the end. The sum of the percentages of effort should total 100%.

Process (Example EIA/ANSI 632 Tasks): % of Total Effort

Acquisition and Supply (1. Product Supply, 2. Product Acquisition, 3. Supplier Performance): 25%

Technical Management (4. Process Implementation Strategy, 5. Technical Effort Definition, 6. Schedule and Organization, 7. Technical Plans, 8. Work Directives, 9. Progress Against Plans and Schedules, 10. Progress Against Requirements, 11. Technical Reviews, 12. Outcomes Management, 13. Information Dissemination): 30%

System Design (14. Acquirer Requirements, 15. Other Stakeholder Requirements, 16. System Technical Requirements, 17. Logical Solution Representations, 18. Physical Solution Representations, 19. Specified Requirements): 5%

Product Realization (20. Implementation, 21. Transition to Use): 20%

Technical Evaluation (22. Effectiveness Analysis, 23. Tradeoff Analysis, 24. Risk Analysis, 25. Requirements Statements Validation, 26. Acquirer Requirements Validation, 27. Other Stakeholder Requirements Validation, 28. System Technical Requirements Validation, 29. Logical Solution Representations Validation, 30. Design Solution Verification, 31. End Product Verification, 32. Enabling Product Readiness, 33. End Products Validation): 10%

Program Management:
Business Process Re-engineering: 5%
Develop system business rules and associated security groups: 5%
Other Processes (Please describe):
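Because the survey asks for percentages that total 100%, each returned profile can be checked mechanically before analysis. Below is a minimal Python sketch of such a consistency check using the SoSE profile above; the dictionary name, the abbreviated "Rules and Security" label, and the rounding tolerance are illustrative assumptions, not part of the survey instrument.

    # Effort profile transcribed from the sample SoSE survey above.
    sose_profile = {
        "Acquisition and Supply": 25,
        "Technical Management": 30,
        "System Design": 5,
        "Product Realization": 20,
        "Technical Evaluation": 10,
        "Business Process Re-engineering": 5,
        "Rules and Security": 5,  # business rules and security groups
    }

    total = sum(sose_profile.values())
    # Allow a small tolerance for rounding in self-reported percentages.
    if abs(total - 100) > 1:
        print(f"Profile needs follow-up: effort sums to {total}%")
    else:
        print(f"Profile is consistent: effort sums to {total}%")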

System-of-Interest Technical Focus: Independent of the categories above, indicate the percentage of total effort spent in the following areas.

Technical Focus: % of Total Effort

System-of-interest interfaces (internal and external): 5%
Net-centric technology/issues: 10%
Information/knowledge management/issues: 5%

System-of-Interest Life Cycle Model: Indicate which of the following process models most accurately describe the processes used to develop the system-of-interest.

System "V"
X Incremental
Evolutionary
Agile
Spiral
Other:


System-of-Interest Processes:

Identify key processes and relationships between processes using a 2-3 level Microsoft Project description. See attached.

Indicate any applicable maturity levels/certifications for project process (e.g., Software Engineering Institute (SEI) Capability Maturity Model Integrated (CMMI) level or ISO certification).

Lead organization: SEI CMMI Level 3

Vendors/suppliers/partners/subcontractors (use multiple entries as needed): none

Indicate flexibility of processes and amount of experimentation used on project: Little to none in early stages. Considerable multi-master experimentation needed during integration.

System-of-Interest Boundary Objects: Identify, briefly describe and categorize system-of-interest boundary objects used to transfer, translate, or transform knowledge between the stakeholders, the engineering teams, and any component system suppliers/vendors/partners/subcontractors. Some candidate boundary objects have been listed. Strike through those that were not used on your program and add others from your program that are not listed below. (Add as many rows as necessary to capture all relevant boundary objects.)

Boundary Object: Brief Description; Usefulness (note 4); Persistence (note 5)

Concept of operations
System requirements specification: Requirements document developed by support contractor prior to contract award; Usefulness 2; Persistence: RFP to contract award
Requirements repository: Access database installed on project server; Usefulness 5; Persistence: Start of contract to end of contract
Interface specifications
Interface design
Software requirements specification
Database design: Oracle database schema critical for replication analysis and integration of COTS; Usefulness 4; Persistence: Detailed design to end of project
UML/SYSML diagrams: Use cases, "as is" and "to be" versions; Usefulness 4; Persistence: Project start through training
DoDAF diagrams: Top Level Systems View (SV-1); Usefulness 1; Persistence: Early phases
Prototypes (specify an entry for each one utilized)
Exceptional Designer/Engineer (Keeper of the Vision): Early multi-master expert from Oracle who left the project in early stages and was replaced by another multi-master expert during integration; Usefulness 5; Persistence: Reqs Analysis through Integration/Test
Legacy data mapping: Showed the relationship between the existing legacy data and the new database structure; Usefulness 5; Persistence: Detailed design through cutover to the new system
Test procedures: Detailed test procedures developed by implementation team and used by jail staff to conduct acceptance testing; Usefulness 4; Persistence: Implementation through acceptance
Problem report database: Describes system problems detected during acceptance testing, training, and cut-over; Usefulness 5; Persistence: Acceptance testing through cutover

Note 4 (Usefulness): rating from 1 to 5 of how useful the boundary object was for communicating between groups (5 is Very High, 1 is Very Low or not useful at all).
Note 5 (Persistence): initial phase in which the boundary object was created and its typical duration (phases in which it was used/evolved).

Overall synchronicity of boundary objects (rate from 1 (low) to 5 (high)): 5

Requirements Volatility: For completed projects, indicate the requirements volatility experienced on the program by phase.

Initial number of requirements (e.g., number of “shalls”): 4500

Phase: Conceptualize / Develop / Test and Evaluation / Transition to Operation

Requirements Added:
Requirements Modified: 30
Requirements Deleted:
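One common way to reduce a volatility table like this to a single figure is the ratio of churned requirements (added, modified, and deleted, summed across phases) to the initial baseline, similar in spirit to the requirements-volatility drivers used in parametric cost models. The function below is an illustrative sketch under that assumption, not a metric defined by the survey instrument.

    def requirements_volatility(initial, added=0, modified=0, deleted=0):
        """Fraction of the initial baseline that churned across all phases."""
        return (added + modified + deleted) / initial

    # Jail Information Management System: 4500 initial requirements, 30 modified.
    ratio = requirements_volatility(initial=4500, modified=30)
    print(f"Requirements volatility: {ratio:.1%}")  # roughly 0.7%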


Engineering Influences: Indicate whether the following had any influences (positive, negative, or none) on your engineering activities.

Influence (Positive, Negative, or None): Rating

Innovative technology: +++
Immature technology: +++
Number of parallel engineering activities: +
Impact of standards: X
Impact of lack of standards: X
Impact of protocols: ++
Impact of lack of standard or convergent protocols: X
Organizational management issues related to multiple vendors/suppliers/partners/subcontractors: +
Organizational management issues related to distributed teams: ++
New engineering specialties required to develop system-of-interest: +++
New contracting approaches needed to support collaborative development: X
New contracting approaches needed to support rapid/continual change: X
Competing business goals for suppliers/vendors (e.g., engineering for the good of the system-of-interest as opposed to engineering to optimize system-of-interest components): +

Any other key influences? none


Success of Program and Overall Assessment of SoSE/TSE Differences

Rate the overall success of the system-of-interest program to date for each of the perspectives below using a high, medium, or low rating.

Aspect: Rating; Comments (Key Reasons for Success or Not So Successful)

Achieving/achieved business goals: H; Strong stakeholder commitment
Achieving/achieved desired emergent behaviors: H; Strong stakeholder commitment
User acceptance: H; Extensive business process re-engineering and training planned from the beginning
Meeting/met original budgets, schedules: M; Immature technology caused more work than estimated
Evolving/evolved system in a timely manner as needed changes are/were identified: M; Few changes, but system accommodation of new components planned for early on
Flexible enough to support long term evolution: L; Support for multi-master replication not part of long term plans

What else is important?

How different is SoSE from TSE? Rate using a scale from 1 (little or no difference) to 5 (extremely different): 4


B.2 Sample SoSE Process Model: Jail Information Management System

[Two-page Microsoft Project process model; graphic not reproduced in this transcript.]

B.3 Sample TSE Survey Response

Center for Systems & Software Engineering
941 W. 37th Pl., Salvatori 328, Los Angeles, CA 90089-0781
Phone: (213) 740-5703, FAX: (213) 740-4927

Purpose of Survey

The University of Southern California (USC) Center for Systems and Software Engineering (CSSE) is conducting System of Systems (SoS) and SoS Engineering (SoSE) research to better understand cost and schedule implications of approaches being used to develop and evolve SoSs and how these differ from Traditional Systems Engineering (TSE). The purpose of this survey is to obtain a better understanding of:

• What people are referring to as "systems of systems"
• How SoSE activities might differ from TSE activities
• Whether the SoSE-TSE differences are due more to technologies/architectures or engineering processes/activities.

Survey Participant Information

Name: Example 2
Organization:    Title:
Street Address:
City: State: Zip:
Phone: Email:

Role on Program (check all that apply):

Procuring agency/organization

X Lead System Integrator (LSI)/ Prime Contractor

Systems Engineer

Component supplier/vendor

Architect

Supplier/vendor oversight

Integrator/Tester

Verification and Validation

Other:

# years experience in this role: 10

# programs supported in this role: 3


Program Characteristics (complete a separate survey form for each Program)

Name of Program/System: Logistics Data System

Period of Performance: 1990-1992

Goal of system: Automate provisioning activities for new government equipment

Iteration/Build that data pertains to:

Business Domain

X Government (Federal, State, Municipal, etc.)

Commercial

Other (please describe):

System Domain (check all that apply)

X Aerospace and Defense Other Government Services

Commercial services Commercial products

X Business Enterprise support

Brief description of domain/type:

Funding strategy:

“Cost plus” or time and materials at system-of-interest level

X Fixed-price at system-of-interest level

Level of effort at system-of-interest level

By component system/platform

Cost sharing

Strategic alliance

Other (please describe):

Organizational interdependencies:

Number of suppliers:

Number of vendors: 1

Number of partners: 2

Number of subcontractors:

Number of layers of associated entities: 1


System Characteristics and Properties

If you think your system-of-interest is an SoS, indicate why you refer to your system as an SoS:

Indicate which of the following characteristics apply to your system-of-interest:

Component systems can operate/function independent of the system-of-interest

% independent:

% dependent upon system-of-interest for operation:

Component systems are independently managed

Evolutionary development process used to develop system-of-interest

X Possesses emergent behavior(s)

Geographically distributed

Number of unique component systems: 1

Number of system-of-interest unique/distinct application/communications protocols: 2

Number of system-of-interest capabilities or requirements: 895

Number of system-of-interest user scenarios (cloud-level):

System-of-interest architecture style (indicate all that apply)

Service oriented architecture (e.g., publish/subscribe)

“Plug and play” using standard interface protocols

Hierarchical

Distributed

Federation

Point to point integration

X Other (please describe): Client-server

Don’t know

List the technologies used to enable any system-of-interest net-centricity (e.g., types of networks, convergent (standard) protocols, run-time protocol converters, data standards, run-time data converters, common operating environment/tools, security). 2 standard data formats to communicate with external systems.

Describe the role of reuse/COTS in architecture decisions. “System upgrade” funding forced the selection of the hardware and database management system vendor.

Briefly describe the types of emergent behavior anticipated or identified as a result of creating the system-of-interest. Automation of provisioning activities, generation of standard reports.

Briefly describe the system-of-interest integration strategy. None.


System Profiler*

For the system-of-interest, indicate the appropriate System Category (TSE, SoSE, or SoSE Challenges) for each Characteristic listed below.

System Context
  System Behavior: Known system behavior (TSE); System behavior fairly predictable (SoSE); System behavior will evolve (SoSE Challenges)
  Desired Outcome: Improve existing capability (TSE); Change existing capability (SoSE); Build fundamentally new capability (SoSE Challenges)

Strategic Context
  Mission Environment: Mission stable (TSE); Mission evolves slowly (SoSE); Mission very fluid, ad hoc (SoSE Challenges)
  Scope of Effort: Single function (TSE); Single enterprise (SoSE); Extended enterprise (SoSE Challenges)

Stakeholder Context
  Stakeholder Relationships: Relationships stable (TSE); New relationships (SoSE); Resistance to changing relationships (SoSE Challenges)
  Stakeholder Involvement: Stakeholders concur (TSE); Agree in principle, some not involved (SoSE); Multiple equities, distrust (SoSE Challenges)

Implementation Context
  Acquisition Environment: Single program, single system (TSE); Single program, multiple systems (SoSE); Multiple programs, multiple systems (SoSE Challenges)
  Scale of Effort: Single user class (TSE); Similar users (SoSE); Many different users (SoSE Challenges)

* Based on System Profiler in [DoD, 2006b]


System-of-interest Characteristics

Development Effort Profile: Indicate in the following table the approximate percentage of project effort expended for each of the listed process areas (for a more detailed description of the listed example tasks/activities, see Appendix A.2). If key activities from your project appear to be missing, please add them at the end. The sum of the percentages of effort should total 100%.

Process (Example EIA/ANSI 632 Tasks): % of Total Effort

Acquisition and Supply (1. Product Supply, 2. Product Acquisition, 3. Supplier Performance): 2%

Technical Management (4. Process Implementation Strategy, 5. Technical Effort Definition, 6. Schedule and Organization, 7. Technical Plans, 8. Work Directives, 9. Progress Against Plans and Schedules, 10. Progress Against Requirements, 11. Technical Reviews, 12. Outcomes Management, 13. Information Dissemination): 8%

System Design (14. Acquirer Requirements, 15. Other Stakeholder Requirements, 16. System Technical Requirements, 17. Logical Solution Representations, 18. Physical Solution Representations, 19. Specified Requirements): 30%

Product Realization (20. Implementation, 21. Transition to Use): 40%

Technical Evaluation (22. Effectiveness Analysis, 23. Tradeoff Analysis, 24. Risk Analysis, 25. Requirements Statements Validation, 26. Acquirer Requirements Validation, 27. Other Stakeholder Requirements Validation, 28. System Technical Requirements Validation, 29. Logical Solution Representations Validation, 30. Design Solution Verification, 31. End Product Verification, 32. Enabling Product Readiness, 33. End Products Validation): 20%

Program Management:
Business Process Re-engineering:
Other Processes (Please describe):

System-of-Interest Technical Focus: Independent of the categories above, indicate the percentage of total effort spent in the following areas.

Technical Focus: % of Total Effort

System-of-interest interfaces (internal and external): 5%
Net-centric technology/issues: ---
Information/knowledge management/issues: 10%

System-of-Interest Life Cycle Model: Indicate which of the following process models most accurately describe the processes used to develop the system-of-interest.

X System "V"
Incremental
Evolutionary
Agile
Spiral
Other:


System-of-Interest Processes:

Identify key processes and relationships between processes using a 2-3 level Microsoft Project description. See attached.

Indicate any applicable maturity levels/certifications for project process (e.g., Software Engineering Institute (SEI) Capability Maturity Model Integrated (CMMI) level or ISO certification).

Lead organization: SEI CMM Level 2

Vendors/suppliers/partners/subcontractors (use multiple entries as needed): None

Indicate flexibility of processes and amount of experimentation used on project: Little

System-of-Interest Boundary Objects: Identify, briefly describe and categorize system-of-interest boundary objects used to transfer, translate, or transform knowledge between the stakeholders, the engineering teams, and any component system suppliers/vendors/partners/subcontractors. Some candidate boundary objects have been listed. Strike through those that were not used on your program and add others from your program that are not listed below. (Add as many rows as necessary to capture all relevant boundary objects.)

Boundary Object: Brief Description; Usefulness (note 6); Persistence (note 7)

Concept of operations
System requirements specification: DoD 2167A format; Usefulness 2; Persistence: System requirements analysis and software allocation
Requirements repository
Interface specifications
Interface design
Software requirements specification: DoD 2167A format; Usefulness 3; Persistence: Reqs phase through testing
Database design: Entity-relationship diagram; Usefulness 4; Persistence: Design through testing
Data flow diagram: DFD based on requirements and user inputs using Joint Application Development process; Usefulness 4; Persistence: Requirements analysis through prelim design
DoDAF diagrams
Prototypes: User interface prototype that evolved into the actual interface; Usefulness 5; Persistence: Prelim design through implementation
Exceptional Designer/Engineer (Keeper of the Vision): Lead architect communicated with users, development team, and test team; Usefulness 3; Persistence: From project start to completion
Test Procedures: DoD 2167A format; Usefulness 4; Persistence: Detailed design through system acceptance

Note 6 (Usefulness): rating from 1 to 5 of how useful the boundary object was for communicating between groups (5 is Very High, 1 is Very Low or not useful at all).
Note 7 (Persistence): initial phase in which the boundary object was created and its typical duration (phases in which it was used/evolved).

Overall synchronicity of boundary objects (rate from 1 (low) to 5 (high)): 5

Requirements Volatility: For completed projects, indicate the requirements volatility experienced on the program by phase.

Initial number of requirements (e.g., number of “shalls”): 895

Phase: Conceptualize / Develop / Test and Evaluation / Transition to Operation

Requirements Added: 2
Requirements Modified:
Requirements Deleted: 1


Engineering Influences: Indicate whether the following had any influences (positive, negative, or none) on your engineering activities.

Influence (Positive, Negative, or None): Rating

Innovative technology (4GL instead of Ada): ++
Immature technology: n/a
Number of parallel engineering activities: n/a
Impact of standards: ++
Impact of lack of standards: n/a
Impact of protocols: n/a
Impact of lack of standard or convergent protocols: n/a
Organizational management issues related to multiple vendors/suppliers/partners/subcontractors: n/a
Organizational management issues related to distributed teams: +
New engineering specialties required to develop system-of-interest: n/a
New contracting approaches needed to support collaborative development: n/a
New contracting approaches needed to support rapid/continual change: n/a
Competing business goals for suppliers/vendors (e.g., engineering for the good of the system-of-interest as opposed to engineering to optimize system-of-interest components): n/a

Any other key influences?


Success of Program and Overall Assessment of SoSE/TSE Differences

Rate the overall success of the system-of-interest program to date for each of the perspectives below using a high, medium, or low rating.

Aspect: Rating; Comments (Key Reasons for Success or Not So Successful)

Achieving/achieved business goals: H; Stakeholder commitment
Achieving/achieved desired emergent behaviors: H; Stakeholder commitment
User acceptance: H & L; Users were initially excited as they watched the system being developed, but were concerned about having a job if the system were deployed
Meeting/met original budgets, schedules: H; Change from the Ada requirement to a 4GL saved a seriously underbid fixed-price project
Evolving/evolved system in a timely manner as needed changes are/were identified: L; Few changes
Flexible enough to support long term evolution: L; Selected vendor dropped the database management system product since it was not competitive

What else is important?

How different is SoSE from TSE? Rate using a scale from 1 (little or no difference) to 5 (extremely different): 4


B.4 Sample TSE Process Model: Provisioning System

[Two-page Microsoft Project process model; graphic not reproduced in this transcript.]

B.5 Sample Comparative Analysis

Comparison of ANSI/EIA 632 Process Areas

Figure B-1 shows the project allocation of effort across the ANSI/EIA 632 process areas using a Kiviat diagram. As can be seen from this chart, there are significant differences in how effort was expended on these two projects. During the actual research effort, single projects will not be compared; rather, the means of each population (SoSE and TSE) will be compared.

[Kiviat diagram: percentage of total effort (radial scale 0% to 40%) for the SoSE and TSE sample projects across the process areas Acquisition and Supply, Technical Management, System Design, Product Realization, Technical Evaluation, Business Process Re-engineering, and Rules and Security.]

Figure B-1. ANSI/EIA 632 Process Comparison
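A chart like Figure B-1 can be regenerated directly from the two effort profiles reported in the sample surveys. The following Python sketch (using matplotlib; the abbreviated area labels and styling choices are illustrative assumptions) plots both profiles on a closed polar grid, which is all a Kiviat diagram is.

    import numpy as np
    import matplotlib.pyplot as plt

    areas = ["Acquisition and Supply", "Technical Management", "System Design",
             "Product Realization", "Technical Evaluation",
             "Business Process Re-engineering", "Rules and Security"]
    sose = [25, 30, 5, 20, 10, 5, 5]   # sample SoSE profile (percent)
    tse = [2, 8, 30, 40, 20, 0, 0]     # sample TSE profile (percent)

    # One spoke per process area; repeat the first point to close each polygon.
    angles = np.linspace(0, 2 * np.pi, len(areas), endpoint=False).tolist()
    angles += angles[:1]

    fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
    for label, profile in (("SoSE", sose), ("TSE", tse)):
        values = profile + profile[:1]
        ax.plot(angles, values, label=label)
        ax.fill(angles, values, alpha=0.1)
    ax.set_xticks(angles[:-1])
    ax.set_xticklabels(areas, fontsize=7)
    ax.set_title("ANSI/EIA 632 Process Area Comparison")
    ax.legend()
    plt.show()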

Figures B-2 and B-3 provide a Pareto view of these same differences.


[Pareto chart: SoSE percentage of total effort (0% to 35%) by process area, in descending order: Technical Management, Acquisition and Supply, Product Realization, Technical Evaluation, System Design, Business Process Re-engineering, Rules and Security.]

Figure B-2. Pareto View of SoSE Activities.

[Pareto chart: TSE percentage of total effort (0% to 50%) by process area, in descending order: Product Realization, Technical Evaluation, System Design, Technical Management, Acquisition and Supply.]

Figure B-3. Pareto View of TSE Activities.
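Both Pareto views follow the same recipe: sort one project's profile in descending order of effort and plot it as bars. A minimal sketch for the SoSE profile of Figure B-2, under the same illustrative assumptions as the Kiviat sketch above:

    import matplotlib.pyplot as plt

    areas = ["Acquisition and Supply", "Technical Management", "System Design",
             "Product Realization", "Technical Evaluation",
             "Business Process Re-engineering", "Rules and Security"]
    sose = [25, 30, 5, 20, 10, 5, 5]

    # Pair each value with its label and sort by descending effort.
    values, labels = zip(*sorted(zip(sose, areas), reverse=True))

    plt.bar(range(len(labels)), values)
    plt.xticks(range(len(labels)), labels, rotation=45, ha="right", fontsize=7)
    plt.ylabel("% Effort")
    plt.title("Pareto Analysis of SoSE Process Areas")
    plt.tight_layout()
    plt.show()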

These few diagrams highlight some of the analysis charts that can be generated with just two projects to compare. However, when evaluating sets of SoSE and TSE projects, additional statistical analysis will be performed for each population to ensure that the significance of the data is understood. Additional data analysis samples will be provided as part of the oral presentation of this proposal.
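As an illustration of the population-level checks this implies, the sketch below compares effort percentages for one process area across two small samples of projects. The numeric values are invented placeholders, and the specific tests (Mann-Whitney U for small samples of unknown distribution, plus Welch's t-test for comparison) are assumptions about how the analysis might be done, not commitments of the proposal.

    from scipy import stats

    # Hypothetical Technical Management effort (% of total) from several
    # SoSE and TSE projects; real values would come from completed surveys.
    sose_tm = [30, 27, 35, 29]
    tse_tm = [8, 12, 10, 15]

    # Nonparametric test: makes no normality assumption, suits small samples.
    u_stat, p_u = stats.mannwhitneyu(sose_tm, tse_tm, alternative="two-sided")
    print(f"Mann-Whitney U = {u_stat}, p = {p_u:.3f}")

    # Welch's t-test (unequal variances) for comparison.
    t_stat, p_t = stats.ttest_ind(sose_tm, tse_tm, equal_var=False)
    print(f"Welch's t = {t_stat:.2f}, p = {p_t:.3f}")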
